Devin AI 9c5b68f1c5 feat: Add OpenAI Responses API integration
Implements native support for OpenAI's Responses API (/v1/responses) as a new
LLM provider in CrewAI. This addresses feature request #4152.

Key features:
- New OpenAIResponsesCompletion class extending BaseLLM
- Support for both explicit provider parameter and model prefix routing
- Message conversion from CrewAI format to Responses API format
- Tool/function calling support
- Streaming support (sync and async)
- Structured output via Pydantic models
- Token usage tracking
- Support for o-series reasoning models with reasoning_effort parameter
- Support for stateful conversations via previous_response_id
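The message-conversion feature above could look roughly like the following sketch. This is illustrative only, not the actual CrewAI implementation: the function name and the assumption that a leading system message maps to the Responses API's separate `instructions` field (with the remaining messages passed as `input` items) are mine.

```python
def convert_messages(messages: list[dict]) -> dict:
    """Hypothetical sketch: chat-style messages -> Responses API shape.

    Assumes the first system message becomes the `instructions` string
    and all other messages become role/content `input` items.
    """
    instructions = None
    input_items = []
    for msg in messages:
        if msg["role"] == "system" and instructions is None:
            instructions = msg["content"]
        else:
            input_items.append({"role": msg["role"], "content": msg["content"]})
    return {"instructions": instructions, "input": input_items}
```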

Usage:
  # Option 1: Using provider parameter
  llm = LLM(model='gpt-4o', provider='openai_responses')

  # Option 2: Using model prefix
  llm = LLM(model='openai_responses/gpt-4o')
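The two routing options above suggest resolution logic along these lines. This is a minimal sketch of the idea, not the real code: the function name, the tuple return shape, and the fallback default provider are all assumptions.

```python
def resolve_provider(model: str, provider: str = None) -> tuple:
    """Hypothetical sketch of provider routing.

    An explicit `provider` argument wins; otherwise a "prefix/model"
    string is split on the first slash; otherwise an assumed default
    provider is used.
    """
    if provider:
        return provider, model
    if "/" in model:
        prefix, _, rest = model.partition("/")
        return prefix, rest
    return "openai", model  # assumed default, for illustration only
```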

Includes comprehensive test coverage for:
- Provider routing
- Message conversion
- Tool conversion
- API calls
- Parameter preparation
- Context window sizes
- Feature support methods
- Token usage extraction
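For the token-usage extraction item, a plausible sketch is shown below. It assumes the Responses API reports usage under `input_tokens`/`output_tokens`/`total_tokens` keys and normalizes them to the prompt/completion naming used elsewhere; the function name and exact mapping are assumptions, not the tested CrewAI code.

```python
def extract_usage(response: dict) -> dict:
    """Hypothetical sketch: normalize Responses API usage counts."""
    usage = response.get("usage") or {}
    return {
        "prompt_tokens": usage.get("input_tokens", 0),
        "completion_tokens": usage.get("output_tokens", 0),
        "total_tokens": usage.get("total_tokens", 0),
    }
```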

Co-Authored-By: João <joao@crewai.com>
2025-12-27 06:19:28 +00:00