LLM plugin for OpenAI models.
This plugin is a preview. LLM currently ships with OpenAI models as part of its default collection, implemented using the Chat Completions API.
This plugin implements those same models using the new Responses API.
Currently the only reason to use this plugin over the LLM defaults is to access o1-pro, which can only be used via the Responses API.
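For background, here is a minimal sketch of how the two APIs differ, using the official `openai` Python package directly (this is illustrative and not part of the plugin itself):

```python
from openai import OpenAI

client = OpenAI()

# Chat Completions API - how LLM's default OpenAI models are implemented
chat = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(chat.choices[0].message.content)

# Responses API - what this plugin uses; o1-pro is only available this way
response = client.responses.create(
    model="gpt-4o-mini",
    input="Say hello",
)
print(response.output_text)
```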
Install this plugin in the same environment as LLM:
```bash
llm install llm-openai-plugin
```
To run a prompt against o1-pro, do this:
```bash
llm -m openai/o1-pro "Convince me that pelicans are the most noble of birds"
```
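You can also call the plugin's models from Python through LLM's Python API. A short sketch, assuming you have already configured an OpenAI key with `llm keys set openai`:

```python
import llm

# Models registered by this plugin use the openai/ prefix
model = llm.get_model("openai/o1-pro")
response = model.prompt("Convince me that pelicans are the most noble of birds")
print(response.text())
```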
Run this to see a full list of models - they start with the `openai/` prefix:
```bash
llm models -q openai/
```
Here's the output of that command:
```
OpenAI: openai/gpt-4o
OpenAI: openai/gpt-4o-mini
OpenAI: openai/gpt-4.5-preview
OpenAI: openai/gpt-4.5-preview-2025-02-27
OpenAI: openai/o3-mini
OpenAI: openai/o3-mini-high
OpenAI: openai/o3-mini-low
OpenAI: openai/o1-mini
OpenAI: openai/o1-mini-high
OpenAI: openai/o1-mini-low
OpenAI: openai/o1
OpenAI: openai/o1-high
OpenAI: openai/o1-low
OpenAI: openai/o1-pro
OpenAI: openai/o1-pro-high
OpenAI: openai/o1-pro-low
OpenAI: openai/gpt-4.1
OpenAI: openai/gpt-4.1-2025-04-14
OpenAI: openai/gpt-4.1-mini
OpenAI: openai/gpt-4.1-mini-2025-04-14
OpenAI: openai/gpt-4.1-nano
OpenAI: openai/gpt-4.1-nano-2025-04-14
OpenAI: openai/o3
OpenAI: openai/o3-high
OpenAI: openai/o3-low
OpenAI: openai/o3-2025-04-16
OpenAI: openai/o3-2025-04-16-high
OpenAI: openai/o3-2025-04-16-low
OpenAI: openai/o3-streaming
OpenAI: openai/o3-streaming-high
OpenAI: openai/o3-streaming-low
OpenAI: openai/o3-2025-04-16-streaming
OpenAI: openai/o3-2025-04-16-streaming-high
OpenAI: openai/o3-2025-04-16-streaming-low
OpenAI: openai/o4-mini
OpenAI: openai/o4-mini-high
OpenAI: openai/o4-mini-low
OpenAI: openai/o4-mini-2025-04-16
OpenAI: openai/o4-mini-2025-04-16-high
OpenAI: openai/o4-mini-2025-04-16-low
OpenAI: openai/codex-mini-latest
OpenAI: openai/codex-mini-latest-high
OpenAI: openai/codex-mini-latest-low
```
Add `--options` to see a full list of options that can be provided to each model.
The `o3-streaming` model ID exists because `o3` currently requires a verified organization in order to support streaming. If you have a verified organization you can use `o3-streaming` - everyone else should use `o3`.
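If your organization is verified, the streaming variant can be consumed a chunk at a time from Python. A small sketch using LLM's Python API (the prompt is illustrative):

```python
import llm

model = llm.get_model("openai/o3-streaming")
# Iterating over the response yields text chunks as they arrive
for chunk in model.prompt("Write a haiku about pelicans"):
    print(chunk, end="", flush=True)
```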
To set up this plugin locally, first check out the code. Then create a new virtual environment:
```bash
cd llm-openai-plugin
python -m venv venv
source venv/bin/activate
```
Now install the dependencies and test dependencies:
```bash
llm install -e '.[test]'
```
To run the tests:
```bash
python -m pytest
```
This project uses pytest-recording to record OpenAI API responses for the tests, and syrupy to capture snapshots of their results.
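A test that exercises both tools might look something like this hypothetical example - the `@pytest.mark.vcr` marker comes from pytest-recording and the `snapshot` fixture from syrupy (the test name, model, and prompt are illustrative, not taken from this repo):

```python
import llm
import pytest


@pytest.mark.vcr  # pytest-recording: record, then replay, the OpenAI API call
def test_simple_prompt(snapshot):  # syrupy injects the snapshot fixture
    model = llm.get_model("openai/gpt-4o-mini")
    response = model.prompt("Say hello")
    # With --snapshot-update this writes tests/__snapshots__/;
    # subsequent runs compare against the stored snapshot
    assert response.text() == snapshot
```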
If you add a new test that calls the API you can capture the API response and snapshot like this:
```bash
PYTEST_OPENAI_API_KEY="$(llm keys get openai)" pytest --record-mode once --snapshot-update
```
Then review the new snapshots in `tests/__snapshots__/` to make sure they look correct.