Devin AI 9dabb3e81c fix: support remote Ollama server for function calling and pydantic output
Bug 1: InternalInstructor.to_pydantic() now forwards api_base/base_url/api_key
to litellm so that remote Ollama servers are reachable during structured output
parsing.

Bug 2: LLM.supports_function_calling() now falls back to querying the remote
Ollama /api/show endpoint when litellm returns False for Ollama models with a
non-localhost base URL.

Fixes #4694

Co-Authored-By: João <joao@crewai.com>
2026-03-04 12:15:16 +00:00