Devin AI aa82ca5273 Fix ValueError for Ollama models when response_format is used
- Add _is_ollama_model method to detect Ollama models consistently
- Skip response_format validation for Ollama models in _validate_call_params
- Filter out response_format parameter for Ollama models in _prepare_completion_params
- Add comprehensive tests for Ollama response_format handling
- Maintain backward compatibility for other LLM providers

Fixes #3082

Co-Authored-By: João <joao@crewai.com>
2025-06-28 21:32:30 +00:00