Commit f9ae6c52de by Devin AI: Fix A2A LLM parameter forwarding in InternalInstructor
- Forward LLM configuration parameters (api_key, api_base, temperature, etc.) to the instructor client for LiteLLM instances (see the sketch below)
- Only forward parameters for LiteLLM instances (is_litellm=True) to avoid breaking non-LiteLLM code paths
- Filter out None values to prevent errors
- Prefer max_tokens over max_completion_tokens when both are present
- Fixes issue #3927, where A2A delegation lost the LLM configuration when checking whether remote agents are relevant

Co-Authored-By: João <joao@crewai.com>
2025-11-16 10:38:50 +00:00
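
A minimal sketch of the forwarding behavior the commit message describes, assuming the LLM object exposes the listed parameters as attributes and an is_litellm flag. The helper name and exact attributes are illustrative, not the actual InternalInstructor implementation.

```python
# Hypothetical helper illustrating the forwarding rules above; the real
# crewAI code may structure this differently.
def collect_llm_kwargs(llm) -> dict:
    """Gather LLM configuration to pass through to the instructor client."""
    # Only forward for LiteLLM instances; leave other providers' paths untouched.
    if not getattr(llm, "is_litellm", False):
        return {}

    candidates = {
        "api_key": getattr(llm, "api_key", None),
        "api_base": getattr(llm, "api_base", None),
        "temperature": getattr(llm, "temperature", None),
        "max_tokens": getattr(llm, "max_tokens", None),
        "max_completion_tokens": getattr(llm, "max_completion_tokens", None),
    }

    # Filter out None values so unset parameters never override defaults.
    kwargs = {k: v for k, v in candidates.items() if v is not None}

    # Prefer max_tokens over max_completion_tokens when both are present.
    if "max_tokens" in kwargs and "max_completion_tokens" in kwargs:
        kwargs.pop("max_completion_tokens")

    return kwargs
```

In this sketch, the returned kwargs would be passed along with the model, messages, and response model when the instructor client issues its completion call, so A2A delegation keeps the same credentials and sampling settings as the originating agent.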