Devin AI 9dabb3e81c fix: support remote Ollama server for function calling and pydantic output
Bug 1: InternalInstructor.to_pydantic() now forwards api_base/base_url/api_key
to litellm so that remote Ollama servers are reachable during structured output
parsing.
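A minimal sketch of the forwarding described in Bug 1, assuming the LLM object carries optional `api_base`/`base_url`/`api_key` attributes; the helper name `to_pydantic_kwargs` and the attribute-collection pattern are illustrative assumptions, not crewAI's actual implementation.

```python
def to_pydantic_kwargs(llm):
    """Collect optional connection kwargs from an LLM object so they
    can be forwarded to litellm, making a remote Ollama server
    reachable during structured-output parsing.

    Hypothetical helper: attribute names mirror the parameters named
    in the commit message (api_base, base_url, api_key).
    """
    kwargs = {}
    for attr in ("api_base", "base_url", "api_key"):
        value = getattr(llm, attr, None)
        if value is not None:
            kwargs[attr] = value
    return kwargs
```

Only the parameters that are actually set get forwarded, so local defaults remain untouched.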

Bug 2: LLM.supports_function_calling() now falls back to querying the remote
Ollama /api/show endpoint when litellm returns False for Ollama models with a
non-localhost base URL.

Fixes #4694

Co-Authored-By: João <joao@crewai.com>
2026-03-04 12:15:16 +00:00