Files changed:
- crewAI/tests/cli/test_watsonx_model_support.py

Devin AI · 048f05c755
Fix issue #2984: Add support for watsonx/meta-llama/llama-4-maverick-17b-128e-instruct-fp8 model
- Added watsonx/meta-llama/llama-4-maverick-17b-128e-instruct-fp8 to the watsonx models list in constants.py
- Created comprehensive tests to verify CLI model selection and LLM instantiation
- All existing tests continue to pass with no regressions
- Fixes the CLI validation error raised when users try to select this model for the watsonx provider
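
The change described above amounts to registering one more model string in a provider-to-models mapping and checking it during CLI validation. The sketch below illustrates that pattern; `PROVIDER_MODELS` and `validate_model` are simplified stand-ins for crewAI's actual constants and validation code, and the llama-3 entry is purely illustrative.

```python
# Hedged sketch of the fix's shape; names here are hypothetical,
# not crewAI's real API.

PROVIDER_MODELS = {
    "watsonx": [
        "watsonx/meta-llama/llama-3-70b-instruct",  # illustrative pre-existing entry
        # The model added by this commit:
        "watsonx/meta-llama/llama-4-maverick-17b-128e-instruct-fp8",
    ],
}

def validate_model(provider: str, model: str) -> None:
    """Raise ValueError if the model is not registered for the provider."""
    if model not in PROVIDER_MODELS.get(provider, []):
        raise ValueError(f"{model!r} is not a supported {provider} model")

# Before the fix, selecting this model failed CLI validation; with the
# constant added, validation passes silently.
validate_model(
    "watsonx",
    "watsonx/meta-llama/llama-4-maverick-17b-128e-instruct-fp8",
)
```

A test for this kind of fix typically asserts both that the new model validates cleanly and that unknown models still raise.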

Resolves #2984

Co-Authored-By: João <joao@crewai.com>
2025-06-10 10:13:05 +00:00
