This package contains:
- `cli/py2go_converter.go` - standalone CLI tool to convert Python files to Go using OpenAI or Ollama.
- `backend/main.go` - HTTP server that manages jobs, uses the CLI if present, and serves the dashboard.
- `frontend/index.html` and `frontend/static/App.jsx` - React + Tailwind web UI.
- Build the CLI (optional; the server can run a fallback without it):

```shell
cd cli
go build -o ../py2go_converter py2go_converter.go
cd ..
```

- Run the backend server:

```shell
go run backend/main.go -addr :8080 -out ./server_out -workers 2 -ollama-model qwen2.5-coder:1.5b -openai-key sk-proj-......
```

Notes:
- Ensure `go` and `gofmt` are installed and on PATH for validation.
- Configure the OpenAI API key via the `OPENAI_API_KEY` env variable if using the OpenAI engine and not providing it as a CLI argument.
- For Ollama, provide the `-ollama-url` and `-ollama-model` flags to the CLI, or set them in the backend options.
- The `test` directory contains a Python script that can be used for testing.
- Ensure `py2go_converter` is in the root of the backend server, or wherever you start the backend from.
- Local LLMs:
  - `qwen2.5-coder:1.5b` (results sometimes need a small amount of fixing, but results aren't bad)
- OpenAI:
  - `gpt-4o-mini`
  - `gpt-5-nano`
- `backend.exe`
- `dist/` (containing the built frontend)
- `py2go_converter` (the CLI tool)
- `configs/` (containing the `agents.yaml` config file) [if using the agent backend]
- Self-healing: conversion iterations include the errors, so the agent has better context for the next attempt.
- Dynamically creates agent tasks based on the files in the provided workspace.
- Memory injection of the current file, conversion errors, and other workspace files.
- Code usually comes back correct, but it is not always extracted and formatted from the response correctly.
- TODO: add automatic Go format and build validation.
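The missing format-and-build validation step could reuse Go's own formatter. This is a minimal sketch, assuming the self-healing loop feeds the error text back as context for the next attempt; the `validate` helper is illustrative, not the actual backend code:

```go
package main

import (
	"fmt"
	"go/format"
)

// validate runs generated source through go/format (the same formatter
// behind gofmt). A parse error here is exactly the kind of feedback a
// self-healing loop would inject into the next conversion attempt.
func validate(src string) (string, error) {
	out, err := format.Source([]byte(src))
	if err != nil {
		return "", err
	}
	return string(out), nil
}

func main() {
	broken := "package main\nfunc main() { fmt.Println(\"hi\" }" // unbalanced parens
	if _, err := validate(broken); err != nil {
		fmt.Println("feed back to agent:", err) // error text becomes next-attempt context
	}
	fixed := "package main\n\nimport \"fmt\"\n\nfunc main() { fmt.Println(\"hi\") }\n"
	if formatted, err := validate(fixed); err == nil {
		fmt.Print(formatted) // already gofmt-clean, passes unchanged
	}
}
```

A full version would also run `go build` on the formatted output to catch type and import errors, not just syntax.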
A new backend variant using the `github.com/Ingenimax/agent-sdk-go` agent has been added under `backend-agent/`.

Run it with:

```shell
go run backend-agent/main.go -addr :8081 -out ./server_out_agent -workers 1 -ollama-model localModelAvailable -openai-key sk-proj-.....
```

The agent supports OpenAI (via `OPENAI_API_KEY`) and Ollama (provide `ollama_url` and `ollama_model` in the options payload of job creation).