Add support for OPENAI_API_BASE environment variable #769
base: next
Conversation
OPENAI_API_BASE Implementation Report

Executive Summary

This report details the implementation of support for the OPENAI_API_BASE environment variable in task-master.

1. Initial State Overview

Prior to this implementation, the task-master codebase had limited flexibility for OpenAI API endpoints:
the base URL could only be overridden through .taskmaster/config.json, not through an environment variable. This created limitations for users who needed to point task-master at OpenAI-compatible proxies, gateways, or self-hosted endpoints.
2. Implementation Changes

The following changes were made to support custom OpenAI API endpoints, centered on the base-URL resolution logic described in the next section.
3. Technical Approach

The implementation follows a clear precedence logic to determine which base URL to use for OpenAI API calls.

Environment Variable Resolution

// Special handling for OpenAI base URL from environment
if (providerName?.toLowerCase() === 'openai' && !baseURL) {
	const envBaseURL = resolveEnvVariable('OPENAI_API_BASE', session, effectiveProjectRoot);
	if (envBaseURL) {
		baseURL = envBaseURL;
		log('debug', `Using OpenAI base URL from environment variable: ${baseURL}`);
	}
}

Configuration Precedence

When determining which base URL to use for OpenAI API calls, the system follows this order:

1. An explicit baseURL on the role's model configuration
2. The global openaiBaseURL setting in .taskmaster/config.json
3. The OPENAI_API_BASE environment variable
4. The provider's default endpoint
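Putting that precedence into code, the overall resolution could be sketched roughly as follows. Note that roleConfig and globalConfig here stand in for whatever the codebase actually loads from .taskmaster/config.json; only resolveEnvVariable appears in the change itself, and the real control flow may differ.

// Illustrative sketch of the base-URL precedence for the OpenAI provider.
// roleConfig / globalConfig are hypothetical inputs used only to show
// the lookup order described above.
function resolveOpenAIBaseURL({ roleConfig, globalConfig, session, projectRoot }) {
	// 1. Role-specific baseURL from the models section of config.json
	if (roleConfig?.baseURL) return roleConfig.baseURL;

	// 2. Global openaiBaseURL from config.json
	if (globalConfig?.openaiBaseURL) return globalConfig.openaiBaseURL;

	// 3. OPENAI_API_BASE environment variable (this PR's addition)
	const envBaseURL = resolveEnvVariable('OPENAI_API_BASE', session, projectRoot);
	if (envBaseURL) return envBaseURL;

	// 4. Fall back to the provider's default endpoint
	return 'https://api.openai.com/v1';
}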
This approach is similar to how Azure OpenAI endpoints are handled, but generalized for any OpenAI-compatible service.

Integration with Existing Code

The implementation leverages existing utility functions, such as resolveEnvVariable for reading environment values and the log helper for debug output.
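The report does not reproduce resolveEnvVariable itself. As a rough sketch, and assuming it checks the MCP session's environment first, then process.env, then a .env file in the project root, it might look something like this; the actual implementation in task-master may differ.

// Hypothetical sketch of resolveEnvVariable's resolution order.
import fs from 'node:fs';
import path from 'node:path';

function resolveEnvVariable(key, session, projectRoot) {
	// 1. Environment supplied by the MCP session, if any
	if (session?.env?.[key]) return session.env[key];

	// 2. The process environment
	if (process.env[key]) return process.env[key];

	// 3. A .env file in the project root, if present
	if (projectRoot) {
		const envPath = path.join(projectRoot, '.env');
		if (fs.existsSync(envPath)) {
			const match = fs
				.readFileSync(envPath, 'utf8')
				.split('\n')
				.find((line) => line.startsWith(`${key}=`));
			if (match) return match.slice(key.length + 1).trim();
		}
	}
	return undefined;
}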
4. Implementation Benefits

This implementation provides several key advantages, chief among them the ability to point task-master at any OpenAI-compatible endpoint without code changes or provider-specific workarounds.
5. Developer Usage Guide

Setting a Custom OpenAI API Endpoint

There are multiple ways to configure a custom OpenAI API endpoint:

Option 1: Environment Variable (Preferred for Local Development)

Add to your .env file:

OPENAI_API_BASE=https://your-custom-endpoint.example.com

Option 2: Global Configuration

In .taskmaster/config.json:

{
"global": {
"openaiBaseURL": "https://your-custom-endpoint.example.com",
// other global settings...
},
// remaining configuration...
}

Option 3: Role-Specific Configuration

For more advanced scenarios, configure different endpoints per role:

{
"models": {
"main": {
"provider": "openai",
"modelId": "gpt-4o",
"baseURL": "https://main-endpoint.example.com",
// other settings...
},
"research": {
"provider": "openai",
"modelId": "gpt-4-turbo",
"baseURL": "https://research-endpoint.example.com",
// other settings...
}
}
}

Comparison with Azure Configuration

Unlike the Azure configuration, which requires both an API key and an endpoint, the OpenAI provider needs only its API key; the OPENAI_API_BASE override is entirely optional, and the standard OpenAI endpoint is used when it is unset.
Best Practices
Conclusion

The implementation of OPENAI_API_BASE support gives task-master users a straightforward way to target any OpenAI-compatible endpoint while leaving the existing configuration options untouched.
Isn't baseURL inside config.json enough?
For anyone finding this in the future who was struggling to get this to work, this is what you have to do.
According to Claude Opus, each provider has its own API structure, so changing the base URL is not enough. Changing the provider to openai will force it to use the OpenAI-compatible structure.
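For example, pointing task-master at a local OpenAI-compatible server might look roughly like this in .taskmaster/config.json; the endpoint and model name below are placeholders, not values taken from this PR:

{
  "models": {
    "main": {
      "provider": "openai",
      "modelId": "your-local-model",
      "baseURL": "http://localhost:1234/v1"
    }
  }
}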
Crunchyman-ralph
left a comment
lgtm, does this imply changes in the .taskmaster/config.json? if so, we should probably update it in the readme, and eventually inside apps/docs (our documentation)
No description provided.