Conversation

@uezo (Owner) commented Aug 11, 2025

  • This commit unifies and refactors tool call execution and context management across the ChatGPT, Claude, and Gemini services. Tool execution is now handled directly in the streaming loop: tool responses are injected into the context, and the streaming call recurses when the model requests tool use (a sketch of this flow follows this list).
  • The obsolete CreateMessageAfterFunction method is removed, and the context update logic is improved to ensure valid tool call/response sequences.
  • The ITool interface and ToolBase are updated to support direct tool execution (an illustrative interface sketch also follows this list).
  • The default historyTurns in LLMServiceBase is increased from 10 to 100.
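Below is a minimal sketch of the unified flow described above, assuming hypothetical names (StreamWithToolsAsync, StreamChatAsync, StreamChunk, ILLMMessage, and so on) rather than the actual ChatdollKit APIs:

```csharp
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

// Illustrative sketch only: every identifier below is hypothetical,
// standing in for the provider-specific members in the real services.
public abstract class StreamingToolCallSketch
{
    // Provider-specific stream (ChatGPT / Claude / Gemini)
    protected abstract IAsyncEnumerable<StreamChunk> StreamChatAsync(
        List<ILLMMessage> context, CancellationToken token);

    // Runs the named tool with the model-provided arguments
    protected abstract Task<string> ExecuteToolAsync(
        string name, string arguments, CancellationToken token);

    protected abstract void HandleTextChunk(string text);
    protected abstract ILLMMessage CreateToolCallMessage(ToolCall call);
    protected abstract ILLMMessage CreateToolResponseMessage(string callId, string result);

    public async Task StreamWithToolsAsync(List<ILLMMessage> context, CancellationToken token)
    {
        await foreach (var chunk in StreamChatAsync(context, token))
        {
            if (chunk.ToolCall == null)
            {
                HandleTextChunk(chunk.Text);   // ordinary text delta
                continue;
            }

            // Execute the tool directly inside the streaming loop
            var result = await ExecuteToolAsync(chunk.ToolCall.Name, chunk.ToolCall.Arguments, token);

            // Inject the call and its response as an adjacent pair so the
            // context always holds a valid tool call/response sequence
            context.Add(CreateToolCallMessage(chunk.ToolCall));
            context.Add(CreateToolResponseMessage(chunk.ToolCall.Id, result));

            // Recurse: open a new stream so the model can use the tool result
            await StreamWithToolsAsync(context, token);
            return;
        }
    }
}

public interface ILLMMessage { }
public class ToolCall { public string Id; public string Name; public string Arguments; }
public class StreamChunk { public string Text; public ToolCall ToolCall; }
```

Handling the call/response pair at the point of execution is also what makes the separate CreateMessageAfterFunction step unnecessary.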
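The direct-execution contract on ITool and ToolBase could look roughly like the following; all member names and signatures here are assumptions for illustration, not the repository's actual definitions:

```csharp
using System.Threading;
using System.Threading.Tasks;

// Hypothetical shape of the direct-execution contract.
public interface ITool
{
    string Name { get; }

    // Called directly from the streaming loop with the model-provided arguments
    Task<string> ExecuteAsync(string argumentsJson, CancellationToken token = default);
}

public abstract class ToolBase : ITool
{
    public abstract string Name { get; }

    // Subclasses implement the actual tool logic; shared concerns such as
    // argument parsing or logging can live in this base class.
    public abstract Task<string> ExecuteAsync(string argumentsJson, CancellationToken token = default);
}

// Example tool under these assumptions: simply echoes its arguments back
public class EchoTool : ToolBase
{
    public override string Name => "echo";

    public override Task<string> ExecuteAsync(string argumentsJson, CancellationToken token = default)
    {
        return Task.FromResult(argumentsJson);
    }
}
```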

@uezo merged commit b49e099 into master Aug 11, 2025