A Meaning-Based Statistical English Math Word Problem Solver
Related papers
A Tag-based English Math Word Problem Solver with Understanding, Reasoning and Explanation
Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Demonstrations
This paper presents a meaning-based statistical math word problem (MWP) solver with understanding, reasoning and explanation. It comprises a web user interface and pipelined modules for analysing the text, transforming both body and question parts into their logic forms, and then performing inference on them. The associated context of each quantity is represented with proposed role-tags (e.g., nsubj, verb, etc.), which provides the flexibility for annotating the extracted math quantity with its associated syntactic and semantic information (which specifies the physical meaning of that quantity). Those role-tags are then used to identify the desired operands and filter out irrelevant quantities (so that the answer can be obtained precisely). Since the physical meaning of each quantity is explicitly represented with those role-tags and used in the inference process, the proposed approach could explain how the answer is obtained in a human comprehensible way.
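As a rough illustration of the role-tag idea (a hypothetical sketch, not the authors' implementation), the following Python snippet attaches invented nsubj/verb/object tags to extracted quantities and uses the question's tags to filter out irrelevant ones:

```python
# Hypothetical sketch: each extracted quantity carries syntactic/semantic
# role-tags, and the question's tags select the relevant operands.
from dataclasses import dataclass

@dataclass
class TaggedQuantity:
    value: float
    nsubj: str   # who owns / performs
    verb: str    # the action associated with the quantity
    obj: str     # what is being counted

body = [
    TaggedQuantity(3, nsubj="Mary", verb="have", obj="apple"),
    TaggedQuantity(5, nsubj="Tom", verb="have", obj="apple"),
    TaggedQuantity(2, nsubj="Tom", verb="have", obj="pen"),  # irrelevant to the question
]

def answer(quantities, nsubj=None, obj=None):
    """Sum the quantities whose role-tags match the question's constraints."""
    selected = [q.value for q in quantities
                if (nsubj is None or q.nsubj == nsubj)
                and (obj is None or q.obj == obj)]
    return sum(selected)

# "How many apples do Mary and Tom have?" -> filter by obj="apple", ignore pens.
print(answer(body, obj="apple"))  # 8
```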
Designing a Tag-Based Statistical Math Word Problem Solver with Reasoning and Explanation
2015
This paper proposes a tag-based statistical framework to solve math word problems with understanding and reasoning. It analyzes the body and question texts into their associated tag-based logic forms, and then performs inference on them. Compared to rule-based approaches, the proposed statistical approach alleviates the rule-coverage and ambiguity-resolution problems, and our tag-based approach also provides the flexibility to handle various kinds of related questions with the same body logic form. On the other hand, compared to purely statistical approaches, the proposed approach is more robust to irrelevant information and can provide the answer more accurately. The major contributions of our work are: (1) proposing a tag-based logic representation such that the system is less sensitive to irrelevant information and can provide the answer more precisely; (2) proposing a unified statistical framework for performing reasoning from the given text.
Reasoning about Quantities in Natural Language
2015
Little work from the Natural Language Processing community has targeted the role of quantities in Natural Language Understanding. This paper takes some key steps towards facilitating reasoning about quantities expressed in natural language. We investigate two different tasks of numerical reasoning. First, we consider Quantity Entailment, a new task formulated to understand the role of quantities in general textual inference tasks. Second, we consider the problem of automatically understanding and solving elementary school math word problems. In order to address these quantitative reasoning problems we first develop a computational approach which we show to successfully recognize and normalize textual expressions of quantities. We then use these capabilities to further develop algorithms to assist reasoning in the context of the aforementioned tasks.
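The sketch below illustrates, with invented examples, the kind of quantity recognition and normalization step this line of work relies on; it is not the authors' system and the covered patterns are deliberately minimal:

```python
# Illustrative sketch: map surface number expressions ("three", "2,500")
# to normalized (value, unit) pairs.
import re

NUMBER_WORDS = {"one": 1, "two": 2, "three": 3, "four": 4, "five": 5,
                "six": 6, "seven": 7, "eight": 8, "nine": 9, "ten": 10}

def normalize_quantities(text):
    """Return a list of (value, unit) pairs found in the text."""
    quantities = []
    pattern = r"\b(\d[\d,]*(?:\.\d+)?|" + "|".join(NUMBER_WORDS) + r")\s+(\w+)"
    for raw, unit in re.findall(pattern, text.lower()):
        value = NUMBER_WORDS.get(raw)
        if value is None:
            value = float(raw.replace(",", ""))
        quantities.append((float(value), unit))
    return quantities

print(normalize_quantities("She bought three apples and 2,500 grams of flour."))
# [(3.0, 'apples'), (2500.0, 'grams')]
```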
A Novel Framework for Math Word Problem Solving
International Journal of Information and Education Technology, 2013
Mathematical word problems represent many real-world scenarios, yet they are often found difficult to solve. The difficulty stems from miscomprehension and the inability to formulate a mathematical representation of the text. In this paper we present a framework based on a fuzzy-logic ontology model that interprets a mathematical word problem in natural language and computes a solution. This framework uses an ontology as a working memory and a search engine to compute the solution. Fuzzy logic is used to determine the confidence of the returned result, which can eliminate the confusion for a user trying to determine whether the provided solution is correct. The ability to interpret a mathematical word problem and return a detailed solution will help educate users by providing them with the detailed steps of a solution.
Arithmetic Word Problem Solver using Unit Dependency Graph and Verb Categorization
Nowadays, arithmetic questions expressed in natural language such as English are attracting considerable interest from researchers. Although several useful approaches have been proposed to solve word problems, there are still gaps in implementing a robust arithmetic word problem solver, as the answers to word problems cannot be easily extracted with keyword or pattern matching. Motivated by this, this research focuses on generating the correct equation from the word problem and deriving the solution. The aim of this proposed work is to implement an arithmetic word problem solver that can understand elementary math word problems, derive the symbolic equation, and generate the result from the equation. The system is implemented with a combination of verb semantics and the unit dependency graph. Elementary students can obtain many benefits since the system produces the equation along with the answer. Keywords: Mathematical Word Problem (MWP), Natural Language, Arithmetic Word Problem, Word Problem Solver.
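A minimal sketch of the verb-categorization idea mentioned above, with an invented verb lexicon and problem steps (not taken from the paper): verbs are mapped to state changes, and the symbolic equation is assembled from those changes.

```python
# Illustrative verb categorization: map verbs to + / - state changes and
# build the equation from the parsed (verb, quantity) pairs.
VERB_CATEGORY = {"has": "+", "buys": "+", "gets": "+", "finds": "+",
                 "loses": "-", "gives": "-", "eats": "-"}

steps = [("has", 7), ("buys", 4), ("gives", 3)]  # parsed (verb, quantity) pairs

terms, total = [], 0
for verb, qty in steps:
    sign = VERB_CATEGORY[verb]
    terms.append(f"{sign} {qty}")
    total = total + qty if sign == "+" else total - qty

equation = "x = " + " ".join(terms).lstrip("+ ")
print(equation)        # x = 7 + 4 - 3
print("answer:", total)  # answer: 8
```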
Mining Commonsense and Domain Knowledge from Math Word Problems
2021
Current neural math solvers learn to incorporate commonsense or domain knowledge by utilizing pre-specified constants or formulas. However, as these constants and formulas are mainly human-specified, the generalizability of the solvers is limited. In this paper, we propose to explicitly retrieve the required knowledge from math problem datasets. In this way, we can precisely characterize the required knowledge and improve the explainability of solvers. Our two algorithms take the problem text and the solution equations as input. They then try to deduce the required commonsense and domain knowledge by integrating information from both parts. We construct two math datasets and show that our algorithms can effectively retrieve the knowledge required for problem solving.
IRJET- NLP based Mathematical Problem Solving
IRJET, 2020
A computer program can easily compute solutions to an equation, but it cannot understand the same problem given as a word problem, where the computer needs to identify which equation applies and what information is provided in the problem text. Here, we describe an NLP-based approach with which a computer can be trained to identify the topic, the subtopic, and ultimately the equation that is applicable to the solution of the given problem, and also to identify the known values in the problem. Only a handful of utility software providing this functionality is available today, and the online tools provide solutions only for template-based problems. This study goes beyond currently used template-based models: it uses a parsing technique to identify the given information in the paragraph and relies on the TF-IDF measure with an SVM classifier for topic classification. The current implementation of the system considers only high-school-level physics problems.
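A minimal sketch of the TF-IDF plus SVM topic-classification step described above, using scikit-learn; the tiny training set and topic labels are invented for illustration, and a real system would train on a labelled corpus of physics word problems.

```python
# TF-IDF features with a linear SVM for topic classification (toy data).
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

problems = [
    "A car accelerates from rest to 20 m/s in 5 seconds.",
    "A ball is dropped from a height of 45 m. Find the time to hit the ground.",
    "A 2 kg block is pushed with a force of 10 N. Find its acceleration.",
    "A resistor of 5 ohms carries a current of 2 A. Find the voltage.",
]
topics = ["kinematics", "kinematics", "dynamics", "electricity"]

classifier = make_pipeline(TfidfVectorizer(), LinearSVC())
classifier.fit(problems, topics)

query = "A train slows from 30 m/s to rest in 10 seconds. Find the deceleration."
print(classifier.predict([query])[0])  # likely "kinematics" given the tiny training set
```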
Arabic Arithmetic Word Problems Solver
Procedia Computer Science, 2017
The recent evolution in Natural Language Processing (NLP) and machine learning has played a crucial role in the development of systems that solve word problems written in human language. This paper presents, to the best of our knowledge, the first attempt at automatically solving Arabic arithmetic word problems. In addition, as part of this work, we prepared an Arabic annotated dataset by translating a standard English arithmetic word problem dataset (the AddSub dataset), which has been used by several researchers to evaluate their models for English arithmetic word problems. The proposed algorithm relies on our automatic verb-learning approach based on the training dataset. Moreover, the algorithm utilizes various NLP tools, such as the Stanford Parser, Named Entity Recognition (NER), and the cosine distance metric, to assign objects to problem states until the goal state is reached. Our approach overcomes various issues, such as tracking both entities and their related results during the transfer process, as well as dealing with different forms of the same verb. The performance evaluation showed promising results, solving 80.78% of the problems. On the other hand, there are still several areas that can be extended and improved, such as the lack of commonsense knowledge, the presence of irrelevant information, and quantity conversions.
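An illustrative sketch of the state-tracking idea described above: each entity holds a quantity of an object, and transfer verbs move quantities between entities until the goal state is reached. The verb list and problem steps are invented for the example; the paper itself works on Arabic text and uses learned verb categories, NER, and cosine similarity.

```python
# Toy state tracking: apply parsed transfer steps to a world state.
from collections import defaultdict

TRANSFER_VERBS = {"gives", "hands", "sells"}

state = defaultdict(float)
state[("Sara", "apple")] = 10
state[("Omar", "apple")] = 4

def apply_step(state, verb, giver, receiver, obj, qty):
    """Update the world state for one parsed (verb, giver, receiver, object, quantity) tuple."""
    if verb in TRANSFER_VERBS:
        state[(giver, obj)] -= qty
        state[(receiver, obj)] += qty

apply_step(state, "gives", "Sara", "Omar", "apple", 3)
print(state[("Sara", "apple")], state[("Omar", "apple")])  # 7 7
```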
MathBot – A Deep Learning based Elementary School Math Word Problem Solver
2019
We built a deep-neural-network-based framework to make MathBot learn to convert English math word problems into equations involving a few unknowns and arithmetic quantities, and to solve the equations thus generated. There have been many semantic-parser and rule-based math word problem solvers, but applying a learning algorithm to reduce natural-language math problems to equations is a topic of recent research. In this work, we show that deep-learning-based natural language processing techniques, such as Recurrent Neural Networks and Transformers, can help build such a learning system. Our work primarily focused on the use of transformers to predict the equation. We also added an equation solver to obtain the final result from the equation. In addition to the traditional BLEU score, we used a solution-accuracy metric to evaluate our models. To improve solution accuracy, we introduced number mapping for word embedding as a nov...
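A minimal sketch of the kind of number-mapping preprocessing mentioned above: numbers are replaced with placeholder tokens before the text is fed to the sequence model, and the placeholders in the predicted equation are mapped back to the original values. The token names and examples are illustrative, not taken from the paper.

```python
# Replace numbers with placeholders (n0, n1, ...) and restore them later.
import re

def map_numbers(problem_text):
    """Replace each number with a placeholder; return the mapped text and the lookup table."""
    mapping = {}
    def repl(match):
        token = f"n{len(mapping)}"
        mapping[token] = match.group(0)
        return token
    return re.sub(r"\d+(?:\.\d+)?", repl, problem_text), mapping

def unmap_equation(equation, mapping):
    """Substitute the original numbers back into a predicted equation."""
    for token, value in mapping.items():
        equation = equation.replace(token, value)
    return equation

text, mapping = map_numbers("John had 12 marbles and lost 5 of them.")
print(text)     # John had n0 marbles and lost n1 of them.
print(mapping)  # {'n0': '12', 'n1': '5'}
print(unmap_equation("x = n0 - n1", mapping))  # x = 12 - 5
```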
A Multi-language Platform for Generating Algebraic Mathematical Word Problems
2019 14th Conference on Industrial and Information Systems (ICIIS)
Existing approaches for automatically generating mathematical word problems lack customizability and creativity due to the inherent nature of the template-based mechanisms they employ. We present a solution to this problem using deep neural language generation. Our approach uses a character-level Long Short-Term Memory (LSTM) network to generate word problems, and uses POS (part-of-speech) tags to resolve the constraints found in the generated problems. Our approach is capable of generating mathematics word problems in both English and Sinhala with an accuracy of over 90%.
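A brief sketch of a character-level LSTM generator of the kind described above, using Keras; the toy corpus, sequence length, and layer sizes are illustrative placeholders, and the published system's POS-tag post-processing is not shown.

```python
# Toy character-level LSTM: predict the next character from a fixed-length window.
import numpy as np
from tensorflow.keras import layers, models

text = "sam has 3 apples. he buys 2 more apples. how many apples does sam have now? "
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}
seq_len = 20

# Build (input sequence, next character) training pairs from the corpus.
X = np.zeros((len(text) - seq_len, seq_len, len(chars)), dtype=np.float32)
y = np.zeros((len(text) - seq_len, len(chars)), dtype=np.float32)
for i in range(len(text) - seq_len):
    for t, c in enumerate(text[i:i + seq_len]):
        X[i, t, char_to_idx[c]] = 1.0
    y[i, char_to_idx[text[i + seq_len]]] = 1.0

model = models.Sequential([
    layers.Input(shape=(seq_len, len(chars))),
    layers.LSTM(128),
    layers.Dense(len(chars), activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(X, y, epochs=5, verbose=0)  # a real system trains on a large problem corpus
```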