fix(llma): extract model from response for OpenAI stored prompts #2789
Merged · +63 −31
Conversation
Size Change: +3.4 kB (+0.06%) Total Size: 5.33 MB
When using OpenAI stored prompts, the model is defined in the OpenAI dashboard rather than passed in the API request. This change adds a fallback to extract the model from the response object when not provided in kwargs.

Fixes PostHog/posthog#42861

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Force-pushed from 4d2e9d6 to 3de5a98
Radu-Raicea reviewed on Dec 19, 2025
packages/ai/src/openai/azure.ts (Outdated)

```diff
  ...posthogParams,
- //@ts-expect-error
- model: openAIParams.model,
+ model: openAIParams.model ?? '',
```
Could we not send an empty string instead? Didn't check if it makes sense though.
Radu-Raicea approved these changes on Dec 19, 2025
When using OpenAI stored prompts and an error occurs, we don't have access to the response to extract the model. Instead of sending an empty string, make the model field optional and send undefined. This is more semantically correct: PostHog handles undefined values gracefully by omitting the property.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
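The undefined-over-empty-string choice in the commit above can be sketched as follows. This is an illustrative assumption about serialization behavior, not PostHog internals: `JSON.stringify` drops properties whose value is `undefined`, so the captured event simply has no `model` field rather than recording `model: ""`.

```typescript
// Hypothetical event shape; the real PostHog payload differs.
interface AIEvent {
  provider: string;
  model?: string; // optional: omitted entirely when the model is unknown
}

const withEmptyString: AIEvent = { provider: 'openai', model: '' };
const withUndefined: AIEvent = { provider: 'openai', model: undefined };

// An empty string is serialized as a real (but meaningless) value...
console.log(JSON.stringify(withEmptyString)); // {"provider":"openai","model":""}
// ...while undefined is dropped from the serialized object.
console.log(JSON.stringify(withUndefined)); // {"provider":"openai"}
```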
Problem
When using OpenAI stored prompts, the model is defined in the OpenAI dashboard rather than passed in the API request. The PostHog AI wrapper only extracts the model from request parameters, causing generations to show up without a model and preventing cost calculations.
Fixes PostHog/posthog#42861
Changes
Updated the `WrappedResponses` and `WrappedCompletions` classes to extract the model from the response when not in params

Libraries affected
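The fallback described above might look like the following sketch. The names `extractModel`, `RequestParams`, and the response shape are illustrative assumptions, not the actual SDK internals; only the `params.model ?? response?.model` precedence reflects the change described in this PR.

```typescript
// Hypothetical request/response shapes for illustration.
interface RequestParams {
  model?: string;
}

interface OpenAIResponse {
  model?: string;
}

function extractModel(
  params: RequestParams,
  response?: OpenAIResponse
): string | undefined {
  // Stored prompts define the model in the OpenAI dashboard, so it is
  // missing from the request params; fall back to the response object.
  return params.model ?? response?.model;
}

// With a stored prompt the request omits the model, but the response has it:
console.log(extractModel({}, { model: 'gpt-4o' })); // 'gpt-4o'
// On error there is no response, so we return undefined (not ''),
// and PostHog omits the property from the captured event.
console.log(extractModel({}, undefined)); // undefined
```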
Checklist