Commit Graph

1 Commit

Author: Devin AI
SHA1: ae59abb052

feat: implement Google Batch Mode support for LLM calls

- Add google-generativeai dependency to pyproject.toml
- Extend LLM class with batch mode parameters (batch_mode, batch_size, batch_timeout)
- Implement batch request management methods for Gemini models
- Add batch-specific event types (BatchJobStartedEvent, BatchJobCompletedEvent, BatchJobFailedEvent)
- Create comprehensive test suite for batch mode functionality
- Add example demonstrating batch mode usage with cost savings
- Support inline batch requests for up to 50% cost reduction on Gemini models

Resolves issue #3116

Co-Authored-By: João <joao@crewai.com>
Date: 2025-07-07 22:01:56 +00:00
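
For orientation, below is a minimal sketch of how the batch parameters named in the commit message might be used from CrewAI. The `Agent`, `Task`, `Crew`, and `LLM` imports are standard CrewAI; the `batch_mode`, `batch_size`, and `batch_timeout` arguments and the Gemini model string follow the commit message, but the chosen values, units, and overall wiring are assumptions rather than the confirmed API of the merged change. The batch lifecycle events (BatchJobStartedEvent, BatchJobCompletedEvent, BatchJobFailedEvent) are presumably emitted through CrewAI's event system during execution; their import path is not confirmed here, so the sketch omits them.

```python
# Sketch only: batch_mode, batch_size, and batch_timeout are the parameter
# names listed in the commit message above; their exact semantics and the
# final merged API may differ.
from crewai import Agent, Crew, Task, LLM

# Configure a Gemini-backed LLM with batch mode enabled. Per the commit,
# inline batch requests are what allow up to ~50% cost reduction.
llm = LLM(
    model="gemini/gemini-1.5-flash",
    batch_mode=True,     # opt in to Google Batch Mode
    batch_size=20,       # requests grouped per batch job (assumed value)
    batch_timeout=600,   # time to wait for batch completion (assumed to be seconds)
)

# Ordinary CrewAI usage; only the LLM configuration above is batch-specific.
researcher = Agent(
    role="Research analyst",
    goal="Summarize the provided documents",
    backstory="A careful analyst who writes concise summaries.",
    llm=llm,
)

summary_task = Task(
    description="Summarize the quarterly report in one paragraph.",
    expected_output="A single-paragraph summary.",
    agent=researcher,
)

crew = Crew(agents=[researcher], tasks=[summary_task])
result = crew.kickoff()
print(result)
```

Since batching trades latency for cost, a larger `batch_size` and a generous `batch_timeout` would suit offline or bulk workloads, while interactive crews would likely leave `batch_mode` off.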