Towards Reasoning Era: A Survey of Long Chain-of-Thought for Reasoning Large Language Models
by Qiguang Chen and 9 other authors
Abstract: Recent advancements in reasoning large language models (RLLMs), such as OpenAI-O1 and DeepSeek-R1, have demonstrated impressive capabilities in complex domains like mathematics and coding. A central factor in their success is the use of long chain-of-thought (Long CoT) reasoning, which enhances reasoning abilities and enables the solution of intricate problems. Despite these developments, a comprehensive survey on Long CoT is still lacking, limiting our understanding of how it differs from traditional short chain-of-thought (Short CoT) and complicating ongoing debates on issues such as “overthinking” and “inference-time scaling.” This survey seeks to fill this gap by offering a unified perspective on Long CoT. (1) We first distinguish Long CoT from Short CoT and introduce a novel taxonomy to categorize current reasoning paradigms. (2) Next, we explore the key characteristics of Long CoT: deep reasoning, extensive exploration, and feasible reflection, which enable models to handle more complex tasks and produce more efficient, coherent outcomes than the shallower Short CoT. (3) We then investigate key phenomena associated with Long CoT, including its emergence, overthinking, and inference-time scaling, offering insights into how these processes manifest in practice. (4) Finally, we identify significant research gaps and highlight promising future directions, including the integration of multi-modal reasoning, efficiency improvements, and enhanced knowledge frameworks. By providing a structured overview, this survey aims to inspire future research and further the development of logical reasoning in artificial intelligence.
Submission history
From: Qiguang Chen
[v1] Wed, 12 Mar 2025 17:35:03 UTC (3,320 KB)
[v2] Thu, 13 Mar 2025 04:34:15 UTC (3,320 KB)
[v3] Wed, 9 Apr 2025 11:20:18 UTC (4,590 KB)
[v4] Wed, 9 Jul 2025 15:13:24 UTC (5,137 KB)
[v5] Fri, 18 Jul 2025 15:57:54 UTC (4,993 KB)