commit 2ed6fa1b7d
Author: Devin AI

fix: forward base_url and api_key from LLM to instructor client
When using an OpenAI-compatible provider with a custom base_url (e.g.
vLLM, Ollama, or any self-hosted endpoint), the InternalInstructor was
silently discarding the base_url and sending structured output requests
to api.openai.com instead of the configured endpoint.

The fix adds a _get_llm_extra_kwargs() method that extracts base_url
and api_key from the LLM object and forwards them as kwargs to
instructor.from_provider(), which already supports these parameters.
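The forwarding logic can be sketched as follows. This is a minimal illustration, not the actual crewAI source: the `LLM` stand-in class and the free function `get_llm_extra_kwargs` are hypothetical names used here to show the idea of extracting only the attributes that are set.

```python
class LLM:
    """Hypothetical stand-in for crewAI's LLM object."""

    def __init__(self, model, base_url=None, api_key=None):
        self.model = model
        self.base_url = base_url
        self.api_key = api_key


def get_llm_extra_kwargs(llm):
    """Collect base_url and api_key from the LLM if they are set,
    so a custom OpenAI-compatible endpoint is not silently dropped."""
    extra = {}
    for attr in ("base_url", "api_key"):
        value = getattr(llm, attr, None)
        if value is not None:
            extra[attr] = value
    return extra


# An LLM pointed at a self-hosted endpoint (e.g. vLLM or Ollama):
llm = LLM("openai/gpt-4o",
          base_url="http://localhost:8000/v1",
          api_key="sk-local")
kwargs = get_llm_extra_kwargs(llm)
# kwargs would then be spread into the instructor client, e.g.:
#     instructor.from_provider(llm.model, **kwargs)
# so requests go to llm.base_url instead of api.openai.com.
```

When neither attribute is set, the returned dict is empty and the instructor client falls back to its defaults, preserving the previous behavior for plain OpenAI usage.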

Fixes #5204

Co-Authored-By: João <joao@crewai.com>
2026-04-01 10:53:07 +00:00