Zhi Zhang, Yan Liu, Zhejing Hu, Gong Chen, Shenghua Zhong
Compounding errors pose a significant challenge in automatic literature review generation, as inaccuracies can cascade across multi-stage retrieval and generation workflows. Existing self-correction strategies often lack mechanisms to effectively track and consolidate verified information throughout the process, making it difficult to prevent error accumulation and propagation. In this paper, we propose Structure-Guided Memory Consolidation (SGMC), a novel framework that incrementally consolidates and verifies information using structured representations at each stage of the literature review pipeline. SGMC consists of three key modules: Tree-Guided Memory for hierarchical literature retrieval and outline generation, Hub-Guided Memory for evidence extraction and iterative content refinement, and Self-Loop Memory for proactive error correction via historical feedback. Extensive experiments on public benchmarks and a newly constructed large-scale dataset demonstrate that SGMC achieves state-of-the-art performance in citation accuracy and content quality, significantly mitigating compounding errors in long-form literature review generation.