feat: support injecting custom LLM clients into Gemini and AnthropicLlm #5035
brucearctor wants to merge 4 commits into google:main
Conversation
Response from ADK Triaging Agent: Hello @brucearctor, thank you for your contribution! To help reviewers better understand and verify your changes, could you please add a … You can find more details in our contribution guidelines. Thanks!
Hi @brucearctor, thank you for your contribution! We appreciate you taking the time to submit this pull request.
Looking into them.
- Move `Client` import to `TYPE_CHECKING`, remove duplicate top-level import
- Exclude `client` field from Pydantic serialization using `Field(exclude=True)`
- Apply custom client guard to `_live_api_client` for consistency
- Add `client` parameter to `ApigeeLlm.__init__` and forward to `super()`
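The serialization point in the commit above can be illustrated with a minimal Pydantic v2 sketch. The class and field names mirror the PR, but the body is hypothetical, not the actual ADK implementation:

```python
from typing import Any, Optional

from pydantic import BaseModel, Field


class Gemini(BaseModel):
    """Hypothetical stand-in for the ADK model class; names are illustrative."""

    model: str = "gemini-2.0-flash"
    # exclude=True keeps the live client object out of model_dump()/JSON output.
    client: Optional[Any] = Field(default=None, exclude=True)

    @property
    def api_client(self) -> Any:
        # Guard: prefer the injected client, otherwise fall back to a default.
        if self.client is not None:
            return self.client
        return {"default": True}  # placeholder for a lazily built default client


injected = Gemini(client="my-preconfigured-client")
assert "client" not in injected.model_dump()
assert injected.api_client == "my-preconfigured-client"
```

Typing the field as `Any` (as discussed below) sidesteps Pydantic validation of arbitrary SDK client objects, at the cost of losing static type checks on the injected value.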
I think I've got these comments addressed. While I dislike using the `Any` type, I couldn't find a workaround that still addressed your comments/suggestions.
Enable injecting pre-configured LLM clients into Gemini and AnthropicLlm models to support multi-agent systems with distinct configurations. Addresses #5027.
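As a rough sketch of the intended usage, the pattern looks roughly like the following. The class names (`Gemini`, `ApigeeLlm`) come from the PR, but the bodies and the dict "clients" are hypothetical placeholders, not the real ADK or genai APIs:

```python
from typing import Any, Optional


class Gemini:
    """Hypothetical sketch of the injection pattern described in the PR."""

    def __init__(self, model: str, client: Optional[Any] = None) -> None:
        self.model = model
        self._custom_client = client

    @property
    def api_client(self) -> Any:
        # Use the injected client when present; otherwise build a default one.
        if self._custom_client is not None:
            return self._custom_client
        return {"default": True}  # placeholder for a default SDK client


class ApigeeLlm(Gemini):
    """Subclass accepts a client and forwards it to super(), as in the PR."""

    def __init__(self, model: str, client: Optional[Any] = None) -> None:
        super().__init__(model=model, client=client)


# Two agents in one process, each with a distinct pre-configured client.
agent_a = Gemini("gemini-2.0-flash", client={"project": "proj-a"})
agent_b = ApigeeLlm("gemini-2.0-pro", client={"project": "proj-b"})
assert agent_a.api_client == {"project": "proj-a"}
assert agent_b.api_client == {"project": "proj-b"}
```

This is what makes the multi-agent case work: each model instance resolves to its own injected client instead of a single process-wide default.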
Testing Plan
I have verified these changes with the following:
- Automated Tests
- Manual Verification
- Verification Command