mirror of
https://github.com/crewAIInc/crewAI.git
synced 2026-04-09 12:38:14 +00:00
When `LLM(base_url=...)` is used without `api_base`, litellm does not receive the custom endpoint because it reads `api_base` (not `base_url`). Requests then fall back to api.openai.com, breaking multi-provider setups (e.g. Scaleway + Nebius).

The fix syncs `base_url` and `api_base` in `LLM.__init__`:

- If only `base_url` is provided, `api_base` is set to match
- If only `api_base` is provided, `base_url` is set to match
- If both are provided, both keep their explicit values

Closes #5139

Co-Authored-By: João <joao@crewai.com>