feat: enhance LLM routing and support for native implementations

- Implemented provider prefix routing for LLMs, allowing for native implementations of OpenAI, Anthropic, and Google Gemini.
- Added fallback to LiteLLM if native implementations are unavailable.
- Updated create_llm function to support provider prefixes and native preference settings.
- Introduced new LLM classes for Claude and Gemini, ensuring compatibility with existing LLM interfaces.
- Enhanced documentation for LLM utility functions to clarify usage and examples.
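The routing described above can be sketched roughly as follows. This is a hypothetical simplification, not the actual crewAI code: the stub classes and the `NATIVE_PROVIDERS` set stand in for the real native implementations and LiteLLM wrapper, and only the `create_llm` name and the provider-prefix idea come from the commit message.

```python
# Hedged sketch of provider-prefix routing with LiteLLM fallback.
# NativeLLM / LiteLLMWrapper are illustrative stand-ins, not crewAI classes.

NATIVE_PROVIDERS = {"openai", "anthropic", "gemini"}  # assumed provider names

class NativeLLM:
    """Stand-in for a native implementation (OpenAILLM, ClaudeLLM, GeminiLLM)."""
    def __init__(self, provider: str, model: str):
        self.provider = provider
        self.model = model

class LiteLLMWrapper:
    """Stand-in for the LiteLLM-backed fallback path."""
    def __init__(self, model: str):
        self.model = model

def create_llm(model: str, prefer_native: bool = True):
    """Route a 'provider/model' string to a native class when one exists
    and native implementations are preferred; otherwise fall back to LiteLLM."""
    provider, _, name = model.partition("/")
    if prefer_native and name and provider in NATIVE_PROVIDERS:
        return NativeLLM(provider, name)
    return LiteLLMWrapper(model)
```

For example, `create_llm("openai/gpt-4o")` would take the native path, while a model string with an unrecognized prefix would fall through to the LiteLLM wrapper.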
Author:
lorenzejay
2025-08-22 13:39:14 -07:00
parent 842bed4e9c
commit 79f716162b
9 changed files with 2136 additions and 9 deletions


@@ -1 +1,11 @@
-"""LLM implementations for crewAI."""
+"""CrewAI LLM implementations."""
+from .base_llm import BaseLLM
+from .openai import OpenAILLM
+from .anthropic import ClaudeLLM
+from .google import GeminiLLM
+# Import the main LLM class for backward compatibility
+__all__ = ["BaseLLM", "OpenAILLM", "ClaudeLLM", "GeminiLLM"]