Releases: jackmpcollins/magentic
v0.14.1
What's Changed
- Add dependabot config by @jackmpcollins in #85
- Run poetry update by @jackmpcollins in #98
- Add ruff to pyproject.toml. Update ruff. by @jackmpcollins in #107
- Handle functions with *args and **kwargs by @jackmpcollins in #106
Dependabot
- Bump jinja2 from 3.1.2 to 3.1.3 by @dependabot in #81
- Bump jupyter-lsp from 2.2.0 to 2.2.2 by @dependabot in #82
- Bump notebook from 7.0.0 to 7.0.7 by @dependabot in #83
- Bump jupyterlab from 4.0.3 to 4.0.11 by @dependabot in #84
- Bump actions/checkout from 3 to 4 by @dependabot in #86
- Bump actions/setup-python from 4 to 5 by @dependabot in #87
- Bump pytest from 7.4.0 to 7.4.4 by @dependabot in #89
- Bump mypy from 1.4.1 to 1.8.0 by @dependabot in #92
- Bump openai from 1.1.1 to 1.9.0 by @dependabot in #88
- Bump litellm from 1.0.0 to 1.18.9 by @dependabot in #90
- Bump pytest from 7.4.4 to 8.0.0 by @dependabot in #93
- Bump pytest-asyncio from 0.21.1 to 0.23.4 by @dependabot in #94
- Bump openai from 1.10.0 to 1.12.0 by @dependabot in #105
- Bump pytest-asyncio from 0.23.3 to 0.23.5 by @dependabot in #103
- Bump litellm from 1.20.6 to 1.23.14 by @dependabot in #108
Full Changelog: v0.14.0...v0.14.1
v0.14.0
What's Changed
- Add stop param by @jackmpcollins and @mnicstruwig in #80
Full Changelog: v0.13.0...v0.14.0
v0.13.0
What's Changed
- Bump jupyter-server from 2.7.2 to 2.11.2 by @dependabot in #75
- Allow setting api_key in OpenaiChatModel by @jackmpcollins in #76
Full Changelog: v0.12.0...v0.13.0
v0.12.0
What's Changed
- Bump aiohttp from 3.8.6 to 3.9.0 by @dependabot in #70
- Add OpenAI seed param for deterministic sampling by @jackmpcollins in #71
Full Changelog: v0.11.1...v0.12.0
v0.11.1
v0.11.0
What's Changed
- Add support for Azure via OpenaiChatModel by @jackmpcollins in #65
Full Changelog: v0.10.0...v0.11.0
v0.10.0
v0.9.1
Full Changelog: v0.9.0...v0.9.1
v0.9.0
What's Changed
- Add LiteLLM backend by @jackmpcollins in #54
Full Changelog: v0.8.0...v0.9.0
Example of LiteLLM backend
from magentic import prompt
from magentic.chat_model.litellm_chat_model import LitellmChatModel

@prompt(
    "Talk to me! ",
    model=LitellmChatModel("ollama/llama2"),
)
def say_hello() -> str:
    ...

say_hello()

See the Backend/LLM Configuration section of the README for how to set the LiteLLM backend as the default.
v0.8.0
What's Changed
- Make backend configurable by @jackmpcollins in #46
- Bump urllib3 from 2.0.6 to 2.0.7 by @dependabot in #47
- Replace black with ruff formatter by @jackmpcollins in #48
- Handle pydantic generic BaseModel in name_type and function schema by @jackmpcollins in #52
- Allow ChatModel to be set with context manager by @jackmpcollins in #53
Full Changelog: v0.7.2...v0.8.0
Allow the chat_model/LLM to be set using a context manager. This makes it easy to reuse the same prompt-function with different models, and keeps dynamically setting the model tidy.
from magentic import OpenaiChatModel, prompt

@prompt("Say hello")
def say_hello() -> str:
    ...

@prompt(
    "Say hello",
    model=OpenaiChatModel("gpt-4", temperature=1),
)
def say_hello_gpt4() -> str:
    ...

say_hello()  # Uses env vars or default settings

with OpenaiChatModel("gpt-3.5-turbo"):
    say_hello()  # Uses gpt-3.5-turbo due to context manager
    say_hello_gpt4()  # Uses gpt-4 with temperature=1 because explicitly configured