Devin AI 2ed6fa1b7d fix: forward base_url and api_key from LLM to instructor client
When using an OpenAI-compatible provider with a custom base_url (e.g.
vLLM, Ollama, or any self-hosted endpoint), the InternalInstructor was
silently discarding the base_url and sending structured output requests
to api.openai.com instead of the configured endpoint.

The fix adds a _get_llm_extra_kwargs() method that extracts base_url
and api_key from the LLM object and forwards them as kwargs to
instructor.from_provider(), which already supports these parameters.

Fixes #5204

Co-Authored-By: João <joao@crewai.com>
2026-04-01 10:53:07 +00:00