Fixes #3702
When using langchain-google-genai with CrewAI, the model name can include
a 'models/' prefix (e.g., 'models/gemini/gemini-pro'), which Google's API
adds internally. LiteLLM does not recognize this format and fails with a
'LLM Provider NOT provided' error.
This fix strips the 'models/' prefix from model names when extracting them
from langchain model objects, ensuring compatibility with LiteLLM while
maintaining backward compatibility with models that don't have this prefix.
Changes:
- Modified create_llm() in llm_utils.py to strip the 'models/' prefix
- Added comprehensive tests covering various scenarios:
  - Model with 'models/' prefix in the model attribute
  - Model with 'models/' prefix in the model_name attribute
  - Model without the prefix (no change)
  - Case-sensitive prefix handling (only lowercase 'models/' is stripped)
Co-Authored-By: João <joao@crewai.com>
* worked on the foundation for new conversational crews. Now going to work on chatting.
* core loop should be working and ready for testing.
* high level chat working
* it's alive!!
* Added João's feedback to steer crew chats back toward the purpose of the crew
* properly return tool call result
* accessing crew directly instead of through uv commands
* everything is working for conversation now
* Fix linting
* fix llm_utils.py and other type errors
* fix more type errors
* fixing type error
* More fixing of types
* fix failing tests
* Fix more failing tests
* adding tests, cleaning up PR
* improve
* drop old functions
* improve type hints