
Question Directed Graph Attention Network for Numerical Reasoning over Text (QDGAT)

The paper introduces QDGAT, a question‑directed graph attention network that enhances numerical reasoning in reading comprehension by explicitly modeling relationships among numbers, entities, and questions, and demonstrates its effectiveness through extensive experiments on the DROP dataset.


At the 2021 Global FinTech Trends conference in Shanghai, Ant Group announced that its paper "Question Directed Graph Attention Network for Numerical Reasoning over Text" (QDGAT) had been accepted at EMNLP 2020 and had ranked first on the DROP leaderboard.

QDGAT builds on pretrained language models (e.g., BERT, RoBERTa, ALBERT) and constructs a heterogeneous graph linking numbers of the same type and connecting numbers with entities appearing in the same sentence. A question‑directed graph attention mechanism guides multi‑step reasoning, improving both inference capability and interpretability.
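The graph construction described above can be sketched in plain Python. This is an illustrative simplification, not the paper's implementation: the node and relation names are hypothetical, and a real system would extract number types and entity mentions with an NER pipeline.

```python
from collections import defaultdict

def build_hetero_graph(numbers, entities):
    """Sketch of a QDGAT-style heterogeneous graph (names are illustrative).

    numbers:  list of (node_id, num_type, sentence_idx)
    entities: list of (node_id, sentence_idx)
    Returns a set of relation-labelled edges.
    """
    edges = set()

    # Link numbers that share the same type (e.g. DATE with DATE).
    by_type = defaultdict(list)
    for nid, ntype, _ in numbers:
        by_type[ntype].append(nid)
    for same_type in by_type.values():
        for i in range(len(same_type)):
            for j in range(i + 1, len(same_type)):
                edges.add(("num-num", same_type[i], same_type[j]))

    # Link a number and an entity that appear in the same sentence.
    ent_by_sent = defaultdict(list)
    for eid, sent in entities:
        ent_by_sent[sent].append(eid)
    for nid, _, sent in numbers:
        for eid in ent_by_sent[sent]:
            edges.add(("num-ent", nid, eid))

    return edges
```

For example, two plain numbers plus a date, with one entity in the first sentence, yield one number-number edge and two number-entity edges.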

The model makes two core contributions: (1) explicit connections among same‑type numbers and between numbers and entities, which constrain valid calculations and narrow the reasoning space; (2) a specialized reasoning module that incorporates the question vector into the graph network to direct the inference path.
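To make the second contribution concrete, here is a minimal NumPy sketch of one message-passing step in which the question vector modulates attention over graph neighbours. The parameterisation (adding the question vector to each node state before projecting) is an assumption for illustration, not the paper's exact formulation.

```python
import numpy as np

def question_directed_attention(h, q, adj, Wq, Wk, Wv):
    """One attention step over a graph, conditioned on a question vector.

    h:   (n, d) node states       q:  (d,) question vector
    adj: (n, n) adjacency mask    Wq, Wk, Wv: (d, d) projections
    Illustrative sketch, not the paper's exact parameterisation.
    """
    # Question-conditioned queries: mix each node state with the question.
    query = (h + q) @ Wq
    key = h @ Wk
    value = h @ Wv

    # Scaled dot-product scores, restricted to graph edges.
    scores = query @ key.T / np.sqrt(key.shape[1])
    scores = np.where(adj > 0, scores, -1e9)  # mask non-edges

    # Row-wise softmax over neighbours, then aggregate neighbour values.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights = weights / weights.sum(axis=1, keepdims=True)
    return weights @ value
```

Because the question vector enters the query projection, the same graph can yield different attention patterns for different questions, which is what steers the multi-step inference path.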

Engineering optimizations include mixed-precision and distributed training on 8×16 GB V100 GPUs, and model parallelism via APipe to avoid out-of-memory failures with ALBERT-XXLarge, achieving a 10% speedup over GPipe.

Experiments on the DROP dataset using RoBERTa as the base model show that QDGAT outperforms strong baselines such as NumNet and NumNet+. Ablation studies confirm the importance of the heterogeneous graph and the question‑driven reasoning module.

Qualitative visualizations illustrate QDGAT’s ability to handle three reasoning scenarios: entity‑numeric relations, numeric‑type relations, and question‑guided inference.

In conclusion, QDGAT is an effective semantic understanding model with strong numerical reasoning abilities, advancing reading comprehension performance and offering valuable insights for building knowledge-graph-driven financial event analysis systems.

Tags: NLP, Knowledge Graph, pretrained models, numerical reasoning, DROP, QDGAT, Reading Comprehension
Written by

AntTech

Technology is the core driver of Ant's future creation.
