Releases: jackmpcollins/magentic
v0.41.0
What's Changed
- Add support for verbosity and max_completion_tokens request parameters to OpenAI chat model by @piiq in #457
- Bump astral-sh/setup-uv from 6 to 7 by @dependabot[bot] in #458
- Bump actions/checkout from 4 to 5 by @dependabot[bot] in #454
- Bump actions/setup-python from 5 to 6 by @dependabot[bot] in #455
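The two new parameters correspond to fields in the OpenAI chat completions request body. A minimal sketch of how they slot into a request payload, as a plain dict (field names from the OpenAI API; this is an illustration, not magentic's internal code):

```python
# Sketch of an OpenAI chat completions request body using the two new
# parameters. Field names come from the OpenAI API; how magentic forwards
# them from OpenaiChatModel is not shown here.
def build_request(messages, model="gpt-5", verbosity=None, max_completion_tokens=None):
    body = {"model": model, "messages": messages}
    if verbosity is not None:
        # Controls output length/detail on supported models: "low" | "medium" | "high"
        body["verbosity"] = verbosity
    if max_completion_tokens is not None:
        # Upper bound on generated tokens, including any reasoning tokens
        body["max_completion_tokens"] = max_completion_tokens
    return body

request = build_request(
    [{"role": "user", "content": "Summarize this release."}],
    verbosity="low",
    max_completion_tokens=256,
)
```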
Full Changelog: v0.40.0...v0.41.0
v0.40.0
What's Changed
- Bump astral-sh/setup-uv from 5 to 6 by @dependabot in #443
- Add OpenRouter chat model by @piiq in #448
- Add reasoning_effort param to OpenaiChatModel by @jackmpcollins in #451
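OpenRouter exposes an OpenAI-compatible chat completions endpoint, and reasoning_effort is another pass-through request field. A sketch combining both (base URL from OpenRouter's public docs, field name from the OpenAI API; the builder function is a stand-in, not magentic's constructor):

```python
# Stand-in sketch, not magentic internals: OpenRouter speaks the OpenAI
# chat completions protocol, so routing to it is essentially a base-URL
# change, and reasoning_effort is an ordinary request field.
OPENROUTER_BASE_URL = "https://openrouter.ai/api/v1"  # from OpenRouter docs

def build_request(messages, model, reasoning_effort=None):
    body = {"model": model, "messages": messages}
    if reasoning_effort is not None:
        # Typical values per the OpenAI API: "low" | "medium" | "high"
        body["reasoning_effort"] = reasoning_effort
    return body

request = build_request(
    [{"role": "user", "content": "hi"}],
    model="openai/gpt-4o-mini",  # OpenRouter-style provider/model id
    reasoning_effort="low",
)
```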
Full Changelog: v0.39.3...v0.40.0
v0.39.3
What's Changed
- Fix: function call parsing positional args ignoring arg defaults by @jackmpcollins in #439
Full Changelog: v0.39.2...v0.39.3
v0.39.2
What's Changed
- Add tests for Gemini via openai package by @jackmpcollins in #382
Full Changelog: v0.39.1...v0.39.2
v0.39.1
What's Changed
- Add tests and docs for xAI / Grok via OpenaiChatModel by @jackmpcollins in #433
Full Changelog: v0.39.0...v0.39.1
v0.39.0
What's Changed
- Use TypeVar default to remove overloads by @jackmpcollins in #411
- Add missing Field import in docs by @jackmpcollins in #428
- feat: support for passing extra_headers to LitellmChatModel by @ashwin153 in #426
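Passing extra_headers typically means merging caller-supplied headers over the client's defaults on each request. A minimal sketch of that merge (the helper name is hypothetical; litellm's actual merge logic may differ):

```python
# Hypothetical helper illustrating the usual extra_headers semantics:
# caller-supplied headers are layered over the client's defaults.
def merge_headers(default_headers, extra_headers=None):
    headers = dict(default_headers)  # copy so defaults are not mutated
    if extra_headers:
        headers.update(extra_headers)  # caller-supplied headers win on conflict
    return headers

headers = merge_headers(
    {"User-Agent": "litellm"},
    {"Authorization": "Bearer sk-...", "X-Request-Id": "abc123"},
)
```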
New Contributors
- @ashwin153 made their first contribution in #426
Full Changelog: v0.38.1...v0.39.0
v0.38.1
What's Changed
Full Changelog: v0.38.0...v0.38.1
v0.38.0
What's Changed
- Async streamed response to API message conversion by @ananis25 in #405
- Support AsyncParallelFunctionCall in message_to_X_message by @jackmpcollins in #406
Full Changelog: v0.37.1...v0.38.0
v0.37.1
What's Changed
Anthropic model message serialization now supports StreamedResponse in AssistantMessage. Thanks to @ananis25 🎉
Full Changelog: v0.37.0...v0.37.1
v0.37.0
What's Changed
The @prompt_chain decorator can now accept a sequence of Message as input, like @chatprompt.
```python
from magentic import prompt_chain, UserMessage

def get_current_weather(location, unit="fahrenheit"):
    """Get the current weather in a given location"""
    return {"temperature": "72", "forecast": ["sunny", "windy"]}

@prompt_chain(
    template=[UserMessage("What's the weather like in {city}?")],
    functions=[get_current_weather],
)
def describe_weather(city: str) -> str: ...

describe_weather("Boston")
# 'The weather in Boston is currently 72°F with sunny and windy conditions.'
```

PRs
- Allow Messages as input to prompt_chain by @jackmpcollins in #403
Full Changelog: v0.36.0...v0.37.0