Abstract
---
Answering complex questions that require multi-step, multi-type reasoning over raw text is challenging, especially when numerical reasoning is involved. Neural Module Networks (NMNs) follow the programmer-interpreter framework and design trainable modules to learn different reasoning skills. However, NMNs have only limited reasoning abilities and lack numerical reasoning capability. We upgrade NMNs by (a) bridging the gap between their interpreter and complex questions, and (b) introducing addition and subtraction modules that perform numerical reasoning over numbers. On a subset of DROP, experimental results show that our proposed methods enhance NMNs’ numerical reasoning skills, improving F1 score by 17.7% and significantly outperforming previous state-of-the-art models.
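The abstract mentions addition and subtraction modules that operate over numbers extracted from the passage. A minimal sketch of how such a module might combine two softly selected operands, assuming the module receives attention logits over the passage numbers (the function name, inputs, and selection scheme here are illustrative assumptions, not the paper's actual implementation):

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a vector of logits.
    e = np.exp(np.asarray(x, dtype=float) - np.max(x))
    return e / e.sum()

def arithmetic_module(numbers, logits_a, logits_b, op="add"):
    """Softly select two operands from passage numbers and add/subtract them.

    numbers:  numbers extracted from the passage (hypothetical input)
    logits_a, logits_b: module scores over those numbers (hypothetical input)
    """
    a = float(np.dot(softmax(logits_a), numbers))  # expected first operand
    b = float(np.dot(softmax(logits_b), numbers))  # expected second operand
    return a + b if op == "add" else a - b

# Usage: sharply peaked logits select 30 and 10; subtraction yields ~20.
numbers = np.array([30.0, 10.0, 7.0])
print(round(arithmetic_module(numbers, [10, 0, 0], [0, 10, 0], "sub")))
```

Soft (expectation-based) selection keeps the operation differentiable, so the module can be trained end-to-end with the rest of the network.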
Year | Venue | DocType
---|---|---
2022 | International Conference on Computational Linguistics | Conference

Volume | Citations | PageRank
---|---|---
Proceedings of the 29th International Conference on Computational Linguistics | 0 | 0.34

References | Authors
---|---
0 | 4
Name | Order | Citations | PageRank |
---|---|---|---|
Jiayi Chen | 1 | 0 | 0.34 |
Xiao-Yu Guo | 2 | 0 | 0.68 |
Yuan-Fang Li | 3 | 245 | 39.15 |
Gholamreza Haffari | 4 | 381 | 59.13 |