When using an OpenAI-compatible provider with a custom `base_url` (e.g. vLLM, Ollama, or any self-hosted endpoint), `InternalInstructor` silently discarded the `base_url` and sent structured-output requests to api.openai.com instead of the configured endpoint.

The fix adds a `_get_llm_extra_kwargs()` method that extracts `base_url` and `api_key` from the LLM object and forwards them as kwargs to `instructor.from_provider()`, which already supports these parameters.

Fixes #5204

Co-Authored-By: João <joao@crewai.com>
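A minimal sketch of the extraction logic described above. The class and attribute names (`InternalInstructorSketch`, `llm.base_url`, `llm.api_key`) are illustrative assumptions based on this description, not the exact crewAI source; the key point is that only settings actually present on the LLM object are forwarded, so provider defaults still apply when they are unset.

```python
from typing import Any, Dict


class InternalInstructorSketch:
    """Hypothetical stand-in for crewAI's InternalInstructor, showing the
    _get_llm_extra_kwargs() idea from the fix description."""

    def __init__(self, llm: Any) -> None:
        self.llm = llm

    def _get_llm_extra_kwargs(self) -> Dict[str, Any]:
        # Forward base_url/api_key from the LLM object only when set,
        # so requests go to the configured endpoint instead of the
        # hard-coded default (api.openai.com).
        extra: Dict[str, Any] = {}
        for attr in ("base_url", "api_key"):
            value = getattr(self.llm, attr, None)
            if value is not None:
                extra[attr] = value
        return extra


if __name__ == "__main__":
    class _DummyLLM:
        base_url = "http://localhost:8000/v1"  # e.g. a local vLLM server
        api_key = "sk-test"

    sketch = InternalInstructorSketch(_DummyLLM())
    print(sketch._get_llm_extra_kwargs())
```

The resulting dict would then be splatted into the `instructor.from_provider()` call (e.g. `instructor.from_provider(provider, **extra_kwargs)`), which accepts these parameters and passes them through to the underlying client.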