Publications
Tales from a Graph: a Pipeline for Mathematical Problem Generation
MATH-AI Workshop at NeurIPS 2025 - Download paper
We present a framework for generating paired mathematical and word problems with controllable complexity, in which the shared underlying mathematical steps are guaranteed to be correct. Each problem pair is derived from an underlying graph whose steps are solved symbolically, so the framework enables controlled studies that reveal systematic performance gaps between the mathematical and word-problem variants, while also providing verifiable training signals for reasoning models.
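The core idea of the pipeline can be illustrated with a minimal sketch (not the paper's actual code; all names and the toy two-step graph are illustrative assumptions): a problem is a small graph of arithmetic steps, each step is evaluated exactly, and the same graph is rendered either as a bare math problem or as a word problem, so both variants share one verified answer.

```python
from fractions import Fraction
import random

# Illustrative sketch, not the authors' implementation: a "problem" is a
# tiny two-step computation graph z = (a + b) * c. Because every step is
# executed with exact arithmetic, the final answer is correct by
# construction, and the same graph backs both problem variants.

def build_graph(seed):
    """Sample operands and a two-step graph: z = (a + b) * c."""
    rng = random.Random(seed)
    a, b, c = (Fraction(rng.randint(1, 9)) for _ in range(3))
    steps = [("add", a, b), ("mul", None, c)]  # None = previous result
    return a, b, c, steps

def solve(steps):
    """Execute the graph step by step with exact arithmetic."""
    acc = None
    for op, x, y in steps:
        x = acc if x is None else x
        acc = x + y if op == "add" else x * y
    return acc

def render_math(a, b, c):
    """The mathematical variant of the problem."""
    return f"Compute ({a} + {b}) * {c}."

def render_word(a, b, c):
    """A word-problem variant built from the same graph."""
    return (f"Ana has {a} apples and receives {b} more. Each apple is "
            f"worth {c} coins. How many coins are her apples worth?")
```

In the actual framework the graphs are larger and the steps are solved symbolically, but the same principle applies: both renderings of a pair share one underlying graph, so the ground-truth answer needs no separate verification.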
Learning Mathematical Rules with Large Language Models
Outstanding paper award at MATH-AI Workshop at NeurIPS 2024 - Download paper
In this paper, we study the ability of large language models to learn specific mathematical rules such as distributivity or equation simplification. We present an empirical analysis of their ability to generalize these rules and to reuse them in the context of word problems. To this end, we provide a rigorous methodology for building synthetic data that incorporates such rules, and we fine-tune large language models on this data. Our experiments show that our model can learn and generalize these rules to some extent, as well as suitably reuse them in word problems.
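The kind of synthetic data the paper describes can be sketched as follows (a hedged toy example, not the authors' methodology; the function name, integer ranges, and output format are assumptions): instances of a rule such as distributivity are generated programmatically, so every training target is correct by construction.

```python
import random

# Illustrative sketch only: generate synthetic training pairs for the
# distributivity rule a*(b + c) -> a*b + a*c. The target is computed
# programmatically, so the supervision signal is guaranteed correct.

def distributivity_example(rng):
    """Return (prompt, target, ground-truth value) for one rule instance."""
    a, b, c = (rng.randint(2, 9) for _ in range(3))
    prompt = f"Expand: {a}*({b} + {c})"
    target = f"{a}*{b} + {a}*{c} = {a * b} + {a * c} = {a * (b + c)}"
    return prompt, target, a * (b + c)

# A small batch of examples, e.g. for fine-tuning data.
rng = random.Random(0)
pairs = [distributivity_example(rng) for _ in range(3)]
```

A real dataset would cover more rules, symbolic operands, and composition with word-problem contexts, but the pattern is the same: sample a rule instance, derive the correct rewriting mechanically, and emit the pair.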
