fix: remove the condition that uses max_token to reduce the context #5269
base: main
Conversation
@QAbot-zh is attempting to deploy a commit to the NextChat Team on Vercel. A member of the Team first needs to authorize it.
Walkthrough
The recent changes remove the max_tokens-based condition from context management in app/store/chat.ts, so the conversation context sent to the model is limited only by the configured message count.
Actionable comments posted: 0
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files selected for processing (1)
- app/store/chat.ts (1 hunks)
Additional comments not posted (2)
app/store/chat.ts (2)
497-497: Commenting out maxTokenThreshold is appropriate. The removal of maxTokenThreshold aligns with the PR objective to stop using max_tokens for context management.
503-503: Removing the token limit from the loop condition is consistent with the PR objectives. The loop now includes more messages without the token limit, aligning with the PR's goals. However, verify the performance implications of the potential increase in message volume.
Your build has completed!
💻 变更类型 | Change Type
🔀 变更说明 | Description of Change
max_tokens is no longer used as a context-limiting variable, so it should return to its original purpose: the maximum length of a single reply. This PR removes that condition and manages context solely by message count, which makes the related settings easier for users to understand and avoids the impression that the model has "lost" its memory in long conversations.
📝 补充信息 | Additional Information
Summary by CodeRabbit
New Features
- Conversation context is now managed only by message count; the max_tokens-based truncation has been removed, so more prior messages can be included in long chats.
Potential Impacts
- Requests in long conversations may include more messages, increasing request size; performance implications should be verified.