
OpenAI Go SDK


A flexible, security-conscious Go SDK for OpenAI and OpenAI-compatible APIs, including local models served by Ollama.

✨ Features

  • 🔌 Multi-Provider Support: OpenAI, Azure OpenAI, DeepSeek, ZhiPu, Ollama, and any OpenAI-compatible API
  • 🛡️ Security First: Comprehensive validation, TLS warnings, safe error handling
  • 🧪 Well-Tested: 71.2% test coverage with comprehensive unit tests
  • 🚀 Developer-Friendly: Clean functional options pattern, extensive documentation
  • ⚡ Performant: Optimized HTTP client, connection pooling, context-aware
  • 🔧 Flexible: Support for custom BaseURL, proxies, timeouts, and all OpenAI parameters

📦 Installation

go get github.com/ysicing/openai

🚀 Quick Start

OpenAI

package main

import (
    "context"
    "log"
    "os"

    "github.com/ysicing/openai/openai"
)

func main() {
    client, err := openai.New(
        openai.WithToken(os.Getenv("OPENAI_API_KEY")),
        openai.WithModel(openai.GPT4oMini),
    )
    if err != nil {
        log.Fatal(err)
    }

    resp, err := client.Completion(
        context.Background(),
        "You are a helpful assistant.",
        "Explain quantum computing in simple terms",
    )
    if err != nil {
        log.Fatal(err)
    }

    log.Println(resp.Content)
}

DeepSeek (OpenAI-Compatible)

client, err := openai.New(
    openai.WithToken(os.Getenv("DEEPSEEK_API_KEY")),
    openai.WithBaseURL("https://api.deepseek.com/v1"),
    openai.WithModel(openai.DeepseekChat),
)

🔧 Configuration Options

| Option          | Description               | Example                      |
| --------------- | ------------------------- | ---------------------------- |
| WithToken       | API authentication token  | "sk-..."                     |
| WithModel       | Model name                | "gpt-4o-mini"                |
| WithProvider    | Service provider          | openai.Ollama                |
| WithBaseURL     | Custom API endpoint       | "http://localhost:11434/v1"  |
| WithTemperature | Response creativity (0-2) | 0.7                          |
| WithTopP        | Nucleus sampling          | 0.9                          |
| WithTimeout     | Request timeout           | 60 * time.Second             |
| WithProxyURL    | HTTP proxy                | "http://proxy:8080"          |
| WithSocksURL    | SOCKS5 proxy              | "socks5://proxy:1080"        |
| WithSkipVerify  | Skip TLS verification     | true ⚠️                      |

Prefer WithBaseURL over WithProvider: WithProvider covers only a small set of built-in providers, while WithBaseURL works with any OpenAI-compatible endpoint.

📚 Supported Providers

Built-in Providers (Special Configuration)

| Provider     | Official Support | Default URL                | Notes               |
| ------------ | ---------------- | -------------------------- | ------------------- |
| OpenAI       | ✅               | https://api.openai.com/v1  | Default provider    |
| Azure OpenAI | ✅               | Azure endpoint             | Enterprise features |
| Ollama       | ✅               | http://localhost:11434/v1  | Local models        |

✅ = Built-in provider with special configuration

OpenAI-Compatible Providers (Via WithBaseURL)

All other providers use the default OpenAI-compatible mode. Just specify the endpoint:

// DeepSeek
openai.WithBaseURL("https://api.deepseek.com/v1")

// ZhiPu (ζ™Ίθ°±)
openai.WithBaseURL("https://open.bigmodel.cn/api/paas/v4/")

// LM Studio, LocalAI, vLLM, etc.
openai.WithBaseURL("http://localhost:1234/v1")

Compatible with any OpenAI-compatible service:

  • DeepSeek, ZhiPu, Moonshot, Kimi, Qwen, etc. (Chinese providers)
  • LM Studio, LocalAI, vLLM (Self-hosted)
  • Any other service with OpenAI-compatible API

💡 Design Philosophy

We use a simplified approach:

  • ✅ Only OpenAI, Azure, and Ollama have special configurations
  • ✅ All other providers use the default OpenAI-compatible mode
  • ✅ Just use WithBaseURL to specify the endpoint

This design makes the code simpler and supports more services!

🎯 Advanced Usage

Multi-turn Conversations

messages := []openai.ChatCompletionMessage{
    {Role: openai.ChatMessageRoleUser, Content: "Hello!"},
}

resp, err := client.CreateChatCompletionWithMessage(context.Background(), messages)
if err != nil {
    log.Fatal(err)
}

messages = append(messages, resp.Choices[0].Message)
messages = append(messages, openai.ChatCompletionMessage{
    Role: openai.ChatMessageRoleUser,
    Content: "What is your name?",
})

resp2, err := client.CreateChatCompletionWithMessage(context.Background(), messages)
if err != nil {
    log.Fatal(err)
}
log.Println(resp2.Choices[0].Message.Content)

Image Understanding (GPT-4V)

resp, err := client.ImageCompletion(
    context.Background(),
    "https://example.com/image.jpg",
    "You are a helpful assistant.",
    "Describe this image in detail",
)

Custom Headers

client, err := openai.New(
    openai.WithToken("api-key"),
    openai.WithHeaders([]string{
        "X-Custom-Header=custom-value",
        "Authorization=Bearer token",
    }),
)

🛡️ Security Features

  • ✅ Response Validation: Prevents panics on malformed API responses
  • ✅ TLS Warnings: Clear documentation about security implications of WithSkipVerify
  • ✅ Safe Error Handling: No credential leakage in error messages
  • ✅ Input Validation: Robust header parsing with SplitN
  • ✅ Proxy Security: Support for both HTTP and SOCKS5 proxies
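The "robust header parsing with SplitN" point refers to splitting a Key=Value header string on the first = only, so values that themselves contain = (Bearer tokens, base64 padding) survive intact. An illustrative, standard-library-only sketch of that approach (parseHeader is a hypothetical name, not the SDK's actual function):

```go
package main

import (
	"fmt"
	"strings"
)

// parseHeader splits "Key=Value" on the FIRST '=' only, so values that
// themselves contain '=' are kept whole rather than truncated.
func parseHeader(h string) (key, value string, ok bool) {
	parts := strings.SplitN(h, "=", 2)
	if len(parts) != 2 || parts[0] == "" {
		return "", "", false // malformed: no '=' or empty key
	}
	return parts[0], parts[1], true
}

func main() {
	k, v, ok := parseHeader("Authorization=Bearer abc==")
	fmt.Println(k, v, ok) // the trailing '==' is preserved in the value
}
```

A naive strings.Split on every = would shred such values into multiple fragments, which is why SplitN with a limit of 2 is the safer choice.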

🧪 Testing

Run all tests:

go test ./... -v

With coverage report:

go test ./... -cover

Test results:

PASS
coverage: 71.2% of statements
ok      github.com/ysicing/openai/openai    0.496s

📖 Examples

See the example/ directory for complete working examples:

  • example/deepseek/ - DeepSeek API usage
  • example/zhipu/ - ZhiPu API usage
  • example/ollama/ - Local Ollama usage

# Run DeepSeek example
cd example/deepseek
DEEPSEEK_API_KEY=your_key go run main.go

# Run Ollama example (requires Ollama running)
cd example/ollama
ollama run llama3.1:latest
go run main.go

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

📄 License

MIT License - see LICENSE file for details.

πŸ™ Acknowledgments
