
BREAKING: Expensive new evidence that scaling is not all you need

📅 2026-03-15 02:23 · Gary Marcus · Artificial Intelligence · 9 min read · Score: 79

Tags: AI Scaling, AGI Strategy, Meta, xAI, Neurosymbolic AI

BREAKING: Expensive new evidence that scaling is not all you need
=================================================================

Marcus on AI · Gary Marcus

One Sentence Summary

The article argues that recent setbacks at Meta and xAI weaken the pure-scaling narrative and renew the case for world models and neurosymbolic AI.

Summary

This short opinion piece claims that recent events at Meta and xAI provide costly evidence against the idea that simply scaling compute and data is sufficient for AGI progress. It points to reports that Meta's latest model underperformed internal expectations and to Elon Musk's admission that xAI was not built correctly at first, framing both as failures of a scale-first strategy. Based on this, the author reiterates a long-standing position: the field should shift attention toward cognitive world models and neurosymbolic approaches rather than continuing to chase larger models alone. The article is timely and provocative, but it is primarily argument-driven commentary with limited empirical analysis.

Main Points

* 1. Recent setbacks at Meta and xAI are presented as evidence that pure scaling is insufficient. The author interprets reported model delays and organizational rebuilding as signs that massive spending on larger models did not deliver expected outcomes.
* 2. The piece argues that the industry overcommitted to a compute-and-data-only roadmap. It characterizes the scaling consensus as hype-driven and expensive, with high opportunity costs in time, money, and research focus.
* 3. A shift toward world models and neurosymbolic AI is framed as the next necessary direction. The author revisits earlier recommendations and suggests the current moment creates room for alternative paradigms that emphasize reasoning structure.

Metadata

AI Score

79

Website garymarcus.substack.com

Published At 2026-03-15

Length 257 words (about 2 min)


Remember the good old days a few years ago when almost everybody thought that the royal road to AGI was just spending more money on compute and data?

That hypothesis continues to go badly, as expensive experiments from two of the world’s wealthiest men have just shown.

On the one hand, it has become clear that Mark Zuckerberg’s latest model at Meta is good but not great, and not what he was hoping for.

And in the very same week, Elon Musk has conceded that for all the gigantic models xAI has built, xAI was “not built right first time around”. Instead, most of the founders are gone, and Musk has said that the company “is being rebuilt from the foundations up”.[1]

So much for pure scaling.[2]

Two of the most expensive scientific experiments in history. Sooo much money down the drain.

All based on the strange and dubious-from-the-start religion of scaling-über-alles.

In my 2020 article, The Next Decade in AI, I foresaw all this, and urged the field to start focusing on world (cognitive) models and neurosymbolic AI. Now, maybe, we can finally move on to those projects?

So much time, money, and energy was lost chasing hype.

[1] Elon’s concession also fits well with the theory offered here on Feb 3 that SpaceX’s acquisition of xAI at $250 billion was a thinly disguised bailout. You don’t pay $250 billion for a company that was built wrong from the foundations.

[2] Musk could literally have saved tens of billions of dollars, if he had asked me….


Key Quotes

* That hypothesis continues to go badly, as expensive experiments from two of the world's wealthiest men have just shown.
* So much for pure scaling
* Now, maybe, we can finally move on to those projects?


Tags

AI Scaling

AGI Strategy

Meta

xAI

Neurosymbolic AI



View original → Published: 2026-03-15 02:23:18 · Indexed: 2026-03-15 10:00:56
