This fix addresses issue #3986, where the Azure AI Inference SDK doesn't
support the json_schema response_format required for structured outputs.
Changes:
- Add a supports_response_model() method to BaseLLM (default: False)
- Override supports_response_model() (to return True) in the OpenAI,
  Anthropic, and Gemini providers
- The Azure provider returns False for supports_response_model(), since the
  native SDK doesn't support the json_schema response_format
- Add a _llm_supports_response_model() helper in converter.py that checks
  supports_response_model() first and falls back to
  supports_function_calling() for backwards compatibility with custom LLMs
  that don't implement the new method (see the sketch after this list)
- Update Converter.to_pydantic(), to_json(), and get_conversion_instructions()
to use the new helper function
- Add comprehensive tests for the fix
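
A minimal sketch of the capability split and the helper's fallback logic,
with simplified class and method bodies (illustrative assumptions, not the
exact crewai code):

```python
class BaseLLM:
    def supports_response_model(self) -> bool:
        # Conservative default: providers must opt in to native
        # json_schema structured outputs.
        return False

    def supports_function_calling(self) -> bool:
        # Assumed pre-existing capability flag; the real crewai
        # implementation is provider-specific.
        return True


class AzureLLM(BaseLLM):
    def supports_response_model(self) -> bool:
        # The native Azure AI Inference SDK does not accept the
        # json_schema response_format, so report no native support.
        return False


def _llm_supports_response_model(llm) -> bool:
    # Prefer the explicit capability check; fall back to
    # supports_function_calling() so custom LLMs that predate the
    # new method keep their previous behavior.
    if hasattr(llm, "supports_response_model"):
        return llm.supports_response_model()
    return llm.supports_function_calling()
```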
The fix separates the 'supports function calling' capability from 'supports
structured outputs', allowing Azure to keep using function/tool calling while
falling back to text-based JSON extraction for structured outputs, as
illustrated below.
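
For illustration only, the converter's decision then reduces to roughly the
following; to_pydantic_sketch, llm.call(), and the prompt suffix are
hypothetical stand-ins for the real converter.py internals, and
_llm_supports_response_model() is the helper sketched above:

```python
from pydantic import BaseModel

class Score(BaseModel):
    value: int

def to_pydantic_sketch(llm, prompt: str, model: type[BaseModel]) -> BaseModel:
    if _llm_supports_response_model(llm):
        # Native path: the provider enforces the json_schema itself.
        return llm.call(prompt, response_format=model)
    # Fallback path (e.g. Azure): request plain-text JSON, then
    # validate it against the Pydantic model.
    text = llm.call(prompt + "\nReply only with JSON matching the schema.")
    return model.model_validate_json(text)
```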
Co-Authored-By: João <joao@crewai.com>