From e676c83d7f94e92111c07bc9f240737d677bf5d8 Mon Sep 17 00:00:00 2001
From: Mark McDonald
Date: Thu, 29 May 2025 21:52:32 +0800
Subject: [PATCH] docs: Adds Gemini example to OpenAI-compat section (#2915)

---
 docs/learn/llm-connections.mdx | 27 +++++++++++++++++++++++----
 1 file changed, 23 insertions(+), 4 deletions(-)

diff --git a/docs/learn/llm-connections.mdx b/docs/learn/llm-connections.mdx
index 33be323b7..fcc264f09 100644
--- a/docs/learn/llm-connections.mdx
+++ b/docs/learn/llm-connections.mdx
@@ -9,7 +9,7 @@ icon: brain-circuit
 CrewAI uses LiteLLM to connect to a wide variety of Language Models (LLMs). This integration provides extensive versatility, allowing you to use models from numerous providers with a simple, unified interface.
-    By default, CrewAI uses the `gpt-4o-mini` model. This is determined by the `OPENAI_MODEL_NAME` environment variable, which defaults to "gpt-4o-mini" if not set.
+    By default, CrewAI uses the `gpt-4o-mini` model. This is determined by the `OPENAI_MODEL_NAME` environment variable, which defaults to "gpt-4o-mini" if not set. You can easily configure your agents to use a different model or provider as described in this guide.
@@ -117,18 +117,27 @@ You can connect to OpenAI-compatible LLMs using either environment variables or
-    ```python Code
+    ```python Generic
     import os

     os.environ["OPENAI_API_KEY"] = "your-api-key"
     os.environ["OPENAI_API_BASE"] = "https://api.your-provider.com/v1"
     os.environ["OPENAI_MODEL_NAME"] = "your-model-name"
     ```
+
+    ```python Google
+    import os
+
+    # Example using Gemini's OpenAI-compatible API.
+    os.environ["OPENAI_API_KEY"] = "your-gemini-key" # Should start with AIza...
+    os.environ["OPENAI_API_BASE"] = "https://generativelanguage.googleapis.com/v1beta/openai/"
+    os.environ["OPENAI_MODEL_NAME"] = "openai/gemini-2.0-flash" # Add your Gemini model here, under openai/
+    ```
-    ```python Code
+    ```python Generic
     llm = LLM(
         model="custom-model-name",
         api_key="your-api-key",
@@ -136,6 +145,16 @@ You can connect to OpenAI-compatible LLMs using either environment variables or
     )
     agent = Agent(llm=llm, ...)
     ```
+
+    ```python Google
+    # Example using Gemini's OpenAI-compatible API
+    llm = LLM(
+        model="openai/gemini-2.0-flash",
+        base_url="https://generativelanguage.googleapis.com/v1beta/openai/",
+        api_key="your-gemini-key", # Should start with AIza...
+    )
+    agent = Agent(llm=llm, ...)
+    ```
@@ -169,7 +188,7 @@ For local models like those provided by Ollama:
 You can change the base API URL for any LLM provider by setting the `base_url` parameter:
-```python Code
+```python Code
 llm = LLM(
     model="custom-model-name",
     base_url="https://api.your-provider.com/v1",
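
For reviewers who want to try the documented change locally, here is a minimal, self-contained sketch of how the "LLM Class Attributes" example added in this patch might be exercised end to end. It assumes the `crewai` package's `LLM`, `Agent`, `Task`, and `Crew` classes (only `LLM` and `Agent` appear in the diff itself), and it reads the key from a hypothetical `GEMINI_API_KEY` environment variable rather than hard-coding it; the model name and endpoint are taken verbatim from the patch.

```python
# Minimal sketch, not part of the patch: exercises the Gemini
# OpenAI-compatible configuration documented in the diff above.
import os

from crewai import Agent, Crew, LLM, Task

# Point CrewAI's LLM at Gemini's OpenAI-compatible endpoint,
# mirroring the "LLM Class Attributes" example in the patch.
llm = LLM(
    model="openai/gemini-2.0-flash",
    base_url="https://generativelanguage.googleapis.com/v1beta/openai/",
    api_key=os.environ["GEMINI_API_KEY"],  # hypothetical env var; key should start with AIza...
)

# A single agent and task, just enough to verify the connection works.
researcher = Agent(
    role="Researcher",
    goal="Summarize a topic in one paragraph",
    backstory="A concise technical writer.",
    llm=llm,
)

task = Task(
    description="Summarize what an OpenAI-compatible API endpoint is.",
    expected_output="One short paragraph.",
    agent=researcher,
)

crew = Crew(agents=[researcher], tasks=[task])
print(crew.kickoff())
```

The same crew could instead rely on the environment-variable route shown in the patch (`OPENAI_API_KEY`, `OPENAI_API_BASE`, `OPENAI_MODEL_NAME`), in which case the explicit `llm=` argument can be dropped.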