mirror of
https://github.com/crewAIInc/crewAI.git
synced 2026-04-14 15:02:37 +00:00
Add native support for IBM Granite models through the watsonx.ai Model Gateway OpenAI-compatible API.

Implementation:
- New watsonx provider at llms/providers/watsonx/ extending OpenAICompletion
- IBM Cloud IAM token exchange with thread-safe caching and auto-refresh
- Support for WATSONX_API_KEY, WATSONX_PROJECT_ID, WATSONX_REGION env vars
- 12 Granite models in constants (3.x, 4.x, code, guardian families)
- Full LLM routing: watsonx/ibm/granite-4-h-small or provider='watsonx'
- No new dependencies required (uses existing openai + httpx)

Usage:

    llm = LLM(model='watsonx/ibm/granite-4-h-small')
    llm = LLM(model='ibm/granite-4-h-small', provider='watsonx')

Closes OSS-35
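The "thread-safe caching and auto-refresh" behavior for the IAM token exchange can be sketched roughly as below. This is a minimal illustration, not the actual provider code: the class name `IAMTokenManager`, the injected `fetch_token` callable, and the `refresh_margin` parameter are all hypothetical. In the real provider, the fetch step would presumably POST the API key to the IBM Cloud IAM endpoint via httpx; here it is injected so the sketch stays self-contained.

```python
import threading
import time


class IAMTokenManager:
    """Hypothetical sketch: cache an IBM Cloud IAM access token across
    threads and refresh it shortly before it expires.

    `fetch_token(api_key)` stands in for the real httpx call to the IAM
    token endpoint; it must return (access_token, expires_in_seconds).
    """

    def __init__(self, api_key, fetch_token, refresh_margin=60.0):
        self._api_key = api_key
        self._fetch_token = fetch_token
        self._refresh_margin = refresh_margin  # refresh this many seconds early
        self._lock = threading.Lock()
        self._token = None
        self._expires_at = 0.0

    def get_token(self):
        # Serialize refresh so only one thread performs the token exchange;
        # other threads block briefly and then reuse the cached token.
        with self._lock:
            expired = time.time() >= self._expires_at - self._refresh_margin
            if self._token is None or expired:
                token, expires_in = self._fetch_token(self._api_key)
                self._token = token
                self._expires_at = time.time() + expires_in
            return self._token
```

Injecting the fetch function also makes the refresh logic easy to unit-test with a stub instead of a live IAM endpoint.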