A flexible and secure Go SDK for OpenAI and OpenAI-compatible APIs, including local models like Ollama.

- Multi-Provider Support: OpenAI, Azure OpenAI, DeepSeek, ZhiPu, Ollama, and any OpenAI-compatible API
- Security First: Comprehensive validation, TLS warnings, safe error handling
- Well-Tested: 71.2% test coverage with comprehensive unit tests
- Developer-Friendly: Clean functional options pattern, extensive documentation
- Performant: Optimized HTTP client, connection pooling, context-aware requests
- Flexible: Support for custom BaseURL, proxies, timeouts, and all OpenAI parameters
Install:

```bash
go get github.com/ysicing/openai
```

Quick start:

```go
package main

import (
	"context"
	"log"
	"os"

	"github.com/ysicing/openai/openai"
)

func main() {
	client, err := openai.New(
		openai.WithToken(os.Getenv("OPENAI_API_KEY")),
		openai.WithModel(openai.GPT4oMini),
	)
	if err != nil {
		log.Fatal(err)
	}

	resp, err := client.Completion(
		context.Background(),
		"You are a helpful assistant.",
		"Explain quantum computing in simple terms",
	)
	if err != nil {
		log.Fatal(err)
	}

	log.Println(resp.Content)
}
```

Using another provider such as DeepSeek:

```go
client, err := openai.New(
	openai.WithToken(os.Getenv("DEEPSEEK_API_KEY")),
	openai.WithBaseURL("https://api.deepseek.com/v1"),
	openai.WithModel(openai.DeepseekChat),
)
```

| Option | Description | Example |
|---|---|---|
| `WithToken` | API authentication token | `"sk-..."` |
| `WithModel` | Model name | `"gpt-4o-mini"` |
| `WithProvider` | Service provider | `openai.Ollama` |
| `WithBaseURL` | Custom API endpoint | `"http://localhost:11434/v1"` |
| `WithTemperature` | Response creativity (0-2) | `0.7` |
| `WithTopP` | Nucleus sampling | `0.9` |
| `WithTimeout` | Request timeout | `60 * time.Second` |
| `WithProxyURL` | HTTP proxy | `"http://proxy:8080"` |
| `WithSocksURL` | SOCKS5 proxy | `"socks5://proxy:1080"` |
| `WithSkipVerify` | Skip TLS verification | `true` |
Prefer `WithBaseURL` over `WithProvider`: `WithProvider` is only supported for a small set of built-in providers.
| Provider | Official Support | Default URL | Notes |
|---|---|---|---|
| OpenAI | ✅ | https://api.openai.com/v1 | Default provider |
| Azure OpenAI | ✅ | Azure endpoint | Enterprise features |
| Ollama | ✅ | http://localhost:11434/v1 | Local models |

✅ = Built-in provider with special configuration
All other providers use the default OpenAI-compatible mode. Just specify the endpoint:

```go
// DeepSeek
openai.WithBaseURL("https://api.deepseek.com/v1")

// ZhiPu (Zhipu AI)
openai.WithBaseURL("https://open.bigmodel.cn/api/paas/v4/")

// LM Studio, LocalAI, vLLM, etc.
openai.WithBaseURL("http://localhost:1234/v1")
```

Compatible with any OpenAI-compatible service:

- DeepSeek, ZhiPu, Moonshot, Kimi, Qwen, etc. (Chinese providers)
- LM Studio, LocalAI, vLLM (self-hosted)
- Any other service with an OpenAI-compatible API
We use a simplified approach:

- ✅ Only OpenAI, Azure, and Ollama have special configurations
- ✅ All other providers use the default OpenAI-compatible mode
- ✅ Just use `WithBaseURL` to specify the endpoint

This design keeps the code simpler and supports more services.
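The simplified approach can be pictured as a tiny lookup: a few built-in providers map to a default endpoint, and everything else falls through to `WithBaseURL`. The `defaultBaseURL` helper below is purely illustrative, not the SDK's actual internals:

```go
package main

import "fmt"

// defaultBaseURL returns a built-in endpoint for the few specially
// configured providers; all other providers report ok == false and
// are expected to supply their endpoint via WithBaseURL.
// (Illustrative sketch only; Azure's endpoint is user-supplied,
// so it has no fixed default here.)
func defaultBaseURL(provider string) (baseURL string, ok bool) {
	switch provider {
	case "openai":
		return "https://api.openai.com/v1", true
	case "ollama":
		return "http://localhost:11434/v1", true
	default:
		return "", false
	}
}

func main() {
	if u, ok := defaultBaseURL("deepseek"); !ok {
		fmt.Println("no built-in default; use WithBaseURL")
	} else {
		fmt.Println(u)
	}
}
```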
Multi-turn conversation:

```go
messages := []openai.ChatCompletionMessage{
	{Role: openai.ChatMessageRoleUser, Content: "Hello!"},
}

resp, err := client.CreateChatCompletionWithMessage(context.Background(), messages)
if err != nil {
	log.Fatal(err)
}

// Append the assistant's reply, then the next user turn, to keep the history.
messages = append(messages, resp.Choices[0].Message)
messages = append(messages, openai.ChatCompletionMessage{
	Role:    openai.ChatMessageRoleUser,
	Content: "What is your name?",
})

resp2, err := client.CreateChatCompletionWithMessage(context.Background(), messages)
```

Image understanding:

```go
resp, err := client.ImageCompletion(
	context.Background(),
	"https://example.com/image.jpg",
	"You are a helpful assistant.",
	"Describe this image in detail",
)
```

Custom headers:

```go
client, err := openai.New(
	openai.WithToken("api-key"),
	openai.WithHeaders([]string{
		"X-Custom-Header=custom-value",
		"Authorization=Bearer token",
	}),
)
```

- ✅ Response Validation: Prevents panics on malformed API responses
- ✅ TLS Warnings: Clear documentation about the security implications of `WithSkipVerify`
- ✅ Safe Error Handling: No credential leakage in error messages
- ✅ Input Validation: Robust header parsing with `SplitN`
- ✅ Proxy Security: Support for both HTTP and SOCKS5 proxies
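The `Key=Value` header format can be parsed safely with `strings.SplitN`, so that values containing `=` (for example `Authorization=Bearer a=b`) are not truncated. A minimal sketch of that parsing logic (the `parseHeader` helper is illustrative, not the SDK's actual function):

```go
package main

import (
	"fmt"
	"strings"
)

// parseHeader splits a "Key=Value" pair on the first '=' only,
// so values that themselves contain '=' survive intact.
// (Illustrative helper; not the SDK's internal name.)
func parseHeader(h string) (key, value string, ok bool) {
	parts := strings.SplitN(h, "=", 2)
	if len(parts) != 2 || parts[0] == "" {
		return "", "", false
	}
	return parts[0], parts[1], true
}

func main() {
	k, v, _ := parseHeader("Authorization=Bearer a=b")
	fmt.Println(k) // Authorization
	fmt.Println(v) // Bearer a=b
}
```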
Run all tests:

```bash
go test ./... -v
```

With coverage report:

```bash
go test ./... -cover
```

Test results:

```
PASS
coverage: 71.2% of statements
ok      github.com/ysicing/openai/openai        0.496s
```
See the `example/` directory for complete working examples:

- `example/deepseek/` - DeepSeek API usage
- `example/zhipu/` - ZhiPu API usage
- `example/ollama/` - Local Ollama usage
```bash
# Run DeepSeek example
cd example/deepseek
DEEPSEEK_API_KEY=your_key go run main.go

# Run Ollama example (requires Ollama running)
cd example/ollama
ollama run llama3.1:latest
go run main.go
```

Contributions are welcome! Please feel free to submit a Pull Request.
MIT License - see LICENSE file for details.
- sashabaranov/go-openai - OpenAI Go library
- Ollama - Local AI models
- All contributors and users of this SDK