BREAKING: Expensive new evidence that scaling is not all you need
=================================================================
Marcus on AI, by Gary Marcus
One Sentence Summary
The article argues that recent setbacks at Meta and xAI weaken the pure-scaling narrative and renew the case for world models and neurosymbolic AI.
Summary
This short opinion piece claims that recent events at Meta and xAI provide costly evidence against the idea that simply scaling compute and data is sufficient for AGI progress. It points to reports that Meta's latest model underperformed internal expectations and to Elon Musk's admission that xAI was not built correctly at first, framing both as failures of a scale-first strategy. Based on this, the author reiterates a long-standing position: the field should shift attention toward cognitive world models and neurosymbolic approaches rather than continuing to chase larger models alone. The article is timely and provocative, but it is primarily argument-driven commentary with limited empirical analysis.
Main Points
* 1. Recent setbacks at Meta and xAI are presented as evidence that pure scaling is insufficient.
The author interprets reported model delays and organizational rebuilding as signs that massive spending on larger models did not deliver expected outcomes.
* 2. The piece argues that the industry overcommitted to a compute-and-data-only roadmap.
It characterizes the scaling consensus as hype-driven and expensive, with high opportunity costs in time, money, and research focus.
* 3. A shift toward world models and neurosymbolic AI is framed as the next necessary direction.
The author revisits earlier recommendations and suggests the current moment creates room for alternative paradigms that emphasize reasoning structure.
Metadata
Website garymarcus.substack.com
Published At Yesterday
Length 257 words (about 2 min)
Remember the good old days a few years ago when almost everybody thought that the royal road to AGI was just spending more money on compute and data?
That hypothesis continues to go badly, as expensive experiments from two of the world’s wealthiest men have just shown.
On the one hand, it’s become clear that Mark Zuckerberg’s latest model at Meta is good but not great, and not what he was hoping for.
And in the very same week, Elon Musk has conceded that for all the gigantic models xAI has built, xAI was “not built right first time around”. Instead, most of the founders are gone, and Musk has said that the company “is being rebuilt from the foundations up”.1
So much for pure scaling.2
Two of the most expensive scientific experiments in history. Sooo much money down the drain.
All based on the strange and dubious-from-the-start religion of scaling-über-alles.
In my 2020 article, The Next Decade in AI, I foresaw all this, and urged the field to start focusing on world (cognitive) models and neurosymbolic AI. Now, maybe, we can finally move on to those projects?
So much time, money, and energy was lost chasing hype.

1. Elon’s concession also fits well with the theory offered here on Feb 3 that SpaceX’s acquisition of xAI at $250 billion was a thinly disguised bailout. You don’t pay $250 billion for a company that was built wrong from the foundations.

2. Musk could —literally— have saved tens of billions of dollars, if he had asked me….