Large models seem to get dumber the longer you use them, and the core reason is simple: the context is overstuffed, so the model's memory goes fuzzy. The "Lobster Three Steps" can sidestep this limit entirely:
- Store key information in files it can retrieve at any time, instead of straining the context window
- For complex tasks, have it state its steps first, and confirm it hasn't gone off course before executing
- Have it actively summarize important points into memory, instead of you repeating them over and over
One Sentence Summary
Fu Sheng proposes a three-step method involving file storage, step verification, and active summarization to solve LLM performance degradation caused by excessive context.
Summary
To address the 'fuzzy memory' or performance drop LLMs experience when handling long contexts, Fu Sheng introduced the 'Lobster Three Steps' optimization strategy. This includes: 1. Filing key information for on-demand retrieval to reduce reliance on the context window (similar to the RAG approach); 2. Adopting a 'plan-then-execute' workflow to ensure complex tasks stay on target (a Prompt Engineering technique); and 3. Guiding the model to actively summarize key points to reinforce memory. This tweet aims to help users refine their AI interactions to make tools more precise and intelligent in practice.
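The second step, the plan-then-execute workflow, amounts to a two-turn prompt pattern: first ask for a numbered plan, then release one step at a time after review. The prompt wording below is a hypothetical sketch; the tweet names the pattern but not these templates.

```python
# Sketch of "plan then execute" as two prompt templates.
# The exact wording is an assumption, not from the source tweet.
PLAN_PROMPT = (
    "Before doing anything, list the steps you will take to complete "
    "this task, numbered, with no code yet:\n\n{task}"
)
EXECUTE_PROMPT = "The plan is approved. Execute step {n} only, then stop for review."

def build_plan_request(task: str) -> str:
    """Turn a raw task into a request for a numbered plan."""
    return PLAN_PROMPT.format(task=task)

def build_execute_request(n: int) -> str:
    """Release exactly one approved step for execution."""
    return EXECUTE_PROMPT.format(n=n)

print(build_plan_request("Migrate the database schema"))
print(build_execute_request(1))
```

Gating execution on an approved plan is what keeps long, complex tasks from drifting off target mid-way.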
AI Score
82
Influence Score 7
Published At Today
Language
Chinese
Tags
LLM Tips
Prompt Engineering
Context Optimization
Fu Sheng
AI Applications