Fix: Remove models/ prefix causing LiteLLM provider recognition failure

- Adds validation to prevent models/ prefix in model names
- Adds tests for model name validation
- Ensures correct model name format for LiteLLM provider recognition
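The check the commit adds can be sketched as a standalone helper (a minimal sketch; the helper name `validate_model_name` is hypothetical, the actual check lives inline in `LLM.__init__`):

```python
def validate_model_name(model: str) -> str:
    """Reject a bare "models/" prefix, which keeps LiteLLM from resolving
    the provider; a provider prefix such as "gemini/" is expected instead.
    (Illustrative helper; not part of the actual patch.)"""
    if isinstance(model, str) and model.startswith("models/"):
        raise ValueError(
            'Model name should not start with "models/". '
            'Use the provider prefix instead (e.g., "gemini/model-name").'
        )
    return model
```

With this check, `validate_model_name("gemini/gemini-1.5-pro")` passes the name through unchanged, while `validate_model_name("models/gemini-1.5-pro")` raises a `ValueError` explaining the expected format.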

Co-Authored-By: Joe Moura <joao@crewai.com>
Author: Devin AI
Date: 2025-02-12 11:08:00 +00:00
parent 47818f4f41
commit 34faf609f4
2 changed files with 18 additions and 0 deletions
@@ -142,6 +142,13 @@ class LLM:
         reasoning_effort: Optional[Literal["none", "low", "medium", "high"]] = None,
         **kwargs,
     ):
+        # Validate model name
+        if isinstance(model, str) and model.startswith('models/'):
+            raise ValueError(
+                'Model name should not start with "models/". '
+                'Use the provider prefix instead (e.g., "gemini/model-name").'
+            )
         self.model = model
         self.timeout = timeout
         self.temperature = temperature