fix: upgrade openai and vercel ai packages to fix o1 errors #3146
Conversation
Have we tested that this doesn't introduce regressions for non-o1 models?
mcp97
left a comment
LGTM outside of comment :)
Works with default models.
fix: upgrade openai and vercel ai packages to fix o1 errors
Relates to
OpenAI requests for o1 were not succeeding because `max_tokens` has been deprecated in favor of `max_completion_tokens` in the API parameter list. This PR upgrades the SDKs to fix the issue.
API: https://platform.openai.com/docs/api-reference/chat/object
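To illustrate the parameter change described above, here is a minimal sketch. The `buildChatParams` helper is hypothetical (not part of this PR or either SDK); it only shows that o1-family models take `max_completion_tokens` where older chat models took `max_tokens`.

```javascript
// Hypothetical helper illustrating the parameter rename: o1-family
// models reject `max_tokens` and require `max_completion_tokens`.
function buildChatParams(model, limit) {
  const base = { model, messages: [{ role: "user", content: "Hello" }] };
  // o1 models (o1, o1-mini, o1-preview) accept only max_completion_tokens
  if (model.startsWith("o1")) {
    return { ...base, max_completion_tokens: limit };
  }
  // Older chat models still use the now-deprecated max_tokens field
  return { ...base, max_tokens: limit };
}

console.log(buildChatParams("o1-preview", 256)); // has max_completion_tokens
console.log(buildChatParams("gpt-4o", 256));     // has max_tokens
```

The upgraded SDKs handle this mapping internally, which is why bumping the packages is enough to fix the error without application-side changes.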
Risks
Low
Background
Fix with screenshots


What does this PR do?
What kind of change is this?
Bug fix
Documentation changes needed?
Testing
To reproduce, run the existing develop branch, use OpenAI, and choose the model `o1-preview`. When you make a call to the agent, an error is thrown saying that `max_tokens` is not supported for the model and that `max_completion_tokens` should be used instead.
Where should a reviewer start?
Detailed testing steps