Commit Graph

2 Commits

Author: Devin AI
SHA1: 8a2ed4320e
Date: 2025-09-18 20:52:18 +00:00

fix: Resolve remaining lint issues (W291, W293, B904)

- Remove trailing whitespace from examples/prompt_caching_example.py
- Fix exception handling to use 'from e' for proper error chaining
- All lint checks now pass locally

Co-Authored-By: João <joao@crewai.com>
Author: Devin AI
SHA1: a395a5cde1
Date: 2025-09-18 20:21:50 +00:00

feat: Add prompt caching support for AWS Bedrock and Anthropic models

- Add enable_prompt_caching and cache_control parameters to the LLM class
- Implement cache_control formatting for Anthropic models via LiteLLM
- Add a helper method to detect prompt caching support for different providers
- Create comprehensive tests covering all prompt caching functionality
- Add an example demonstrating usage with kickoff_for_each and kickoff_async
- Support OpenAI, Anthropic, Bedrock, and Deepseek providers
- Enable cost optimization for workflows with repetitive context

Addresses issue #3535 for prompt caching support in CrewAI

Co-Authored-By: João <joao@crewai.com>