diff --git a/docs/concepts/llms.mdx b/docs/concepts/llms.mdx
index 90e86adaf..835c2491f 100644
--- a/docs/concepts/llms.mdx
+++ b/docs/concepts/llms.mdx
@@ -25,52 +25,55 @@ By default, CrewAI uses the `gpt-4o-mini` model. It uses environment variables i
- `OPENAI_API_BASE`
- `OPENAI_API_KEY`
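+
+As a minimal sketch of this default behavior (the values below are placeholders), setting only these variables lets an agent fall back to the default model:
+
+```python Code
+import os
+from crewai import Agent
+
+# Placeholder values - substitute your own key and, optionally, endpoint
+os.environ["OPENAI_API_KEY"] = "your-api-key"
+os.environ["OPENAI_API_BASE"] = "https://api.openai.com/v1"
+
+# With no explicit llm argument, the agent uses the default gpt-4o-mini model
+agent = Agent(...)
+```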
-### 2. String Identifier
-
-```python Code
-agent = Agent(llm="gpt-4o", ...)
-```
-
-### 3. LLM Instance
-
-List of [more providers](https://docs.litellm.ai/docs/providers).
-
-```python Code
-from crewai import LLM
-
-llm = LLM(model="gpt-4", temperature=0.7)
-agent = Agent(llm=llm, ...)
-```
-
-### 4. Custom LLM Objects
+### 2. Custom LLM Objects
Pass a custom LLM implementation or object from another library.
+See below for examples.
+
+<Tabs>
+    <Tab title="String Identifier">
+ ```python Code
+ agent = Agent(llm="gpt-4o", ...)
+ ```
+    </Tab>
+
+    <Tab title="LLM Instance">
+ ```python Code
+ from crewai import LLM
+
+ llm = LLM(model="gpt-4", temperature=0.7)
+ agent = Agent(llm=llm, ...)
+ ```
+    </Tab>
+</Tabs>
+
## Connecting to OpenAI-Compatible LLMs
You can connect to OpenAI-compatible LLMs using either environment variables or by setting specific attributes on the LLM class:
-1. Using environment variables:
+<Tabs>
+    <Tab title="Using Environment Variables">
+ ```python Code
+ import os
-```python Code
-import os
+ os.environ["OPENAI_API_KEY"] = "your-api-key"
+ os.environ["OPENAI_API_BASE"] = "https://api.your-provider.com/v1"
+ ```
+    </Tab>
+    <Tab title="Using LLM Class Attributes">
+ ```python Code
+ from crewai import LLM
-os.environ["OPENAI_API_KEY"] = "your-api-key"
-os.environ["OPENAI_API_BASE"] = "https://api.your-provider.com/v1"
-```
-
-2. Using LLM class attributes:
-
-```python Code
-from crewai import LLM
-
-llm = LLM(
- model="custom-model-name",
- api_key="your-api-key",
- base_url="https://api.your-provider.com/v1"
-)
-agent = Agent(llm=llm, ...)
-```
+ llm = LLM(
+ model="custom-model-name",
+ api_key="your-api-key",
+ base_url="https://api.your-provider.com/v1"
+ )
+ agent = Agent(llm=llm, ...)
+ ```
+    </Tab>
+</Tabs>
## LLM Configuration Options
@@ -97,55 +100,149 @@ When configuring an LLM for your agent, you have access to a wide range of param
| **api_key** | `str` | Your API key for authentication. |
-## OpenAI Example Configuration
+The following examples show how to configure LLMs for different providers.
-```python Code
-from crewai import LLM
+<Tabs>
+    <Tab title="OpenAI">
-llm = LLM(
- model="gpt-4",
- temperature=0.8,
- max_tokens=150,
- top_p=0.9,
- frequency_penalty=0.1,
- presence_penalty=0.1,
- stop=["END"],
- seed=42,
- base_url="https://api.openai.com/v1",
- api_key="your-api-key-here"
-)
-agent = Agent(llm=llm, ...)
-```
+ ```python Code
+ from crewai import LLM
-## Cerebras Example Configuration
+ llm = LLM(
+ model="gpt-4",
+ temperature=0.8,
+ max_tokens=150,
+ top_p=0.9,
+ frequency_penalty=0.1,
+ presence_penalty=0.1,
+ stop=["END"],
+ seed=42,
+ base_url="https://api.openai.com/v1",
+ api_key="your-api-key-here"
+ )
+ agent = Agent(llm=llm, ...)
+ ```
+    </Tab>
-```python Code
-from crewai import LLM
+    <Tab title="Cerebras">
-llm = LLM(
- model="cerebras/llama-3.1-70b",
- base_url="https://api.cerebras.ai/v1",
- api_key="your-api-key-here"
-)
-agent = Agent(llm=llm, ...)
-```
+ ```python Code
+ from crewai import LLM
-## Using Ollama (Local LLMs)
+ llm = LLM(
+ model="cerebras/llama-3.1-70b",
+ base_url="https://api.cerebras.ai/v1",
+ api_key="your-api-key-here"
+ )
+ agent = Agent(llm=llm, ...)
+ ```
+    </Tab>
+
+    <Tab title="Ollama (Local LLMs)">
-CrewAI supports using Ollama for running open-source models locally:
+ CrewAI supports using Ollama for running open-source models locally:
-1. Install Ollama: [ollama.ai](https://ollama.ai/)
-2. Run a model: `ollama run llama2`
-3. Configure agent:
+ 1. Install Ollama: [ollama.ai](https://ollama.ai/)
+    2. Run a model: `ollama run llama3.1`
+ 3. Configure agent:
-```python Code
-from crewai import LLM
+ ```python Code
+ from crewai import LLM
-agent = Agent(
- llm=LLM(model="ollama/llama3.1", base_url="http://localhost:11434"),
- ...
-)
-```
+ agent = Agent(
+ llm=LLM(
+ model="ollama/llama3.1",
+ base_url="http://localhost:11434"
+ ),
+ ...
+ )
+ ```
+    </Tab>
+
+    <Tab title="Groq">
+
+ ```python Code
+ from crewai import LLM
+
+ llm = LLM(
+ model="groq/llama3-8b-8192",
+ base_url="https://api.groq.com/openai/v1",
+ api_key="your-api-key-here"
+ )
+ agent = Agent(llm=llm, ...)
+ ```
+    </Tab>
+
+    <Tab title="Anthropic">
+
+ ```python Code
+ from crewai import LLM
+
+ llm = LLM(
+ model="anthropic/claude-3-5-sonnet-20241022",
+ base_url="https://api.anthropic.com/v1",
+ api_key="your-api-key-here"
+ )
+ agent = Agent(llm=llm, ...)
+ ```
+    </Tab>
+
+    <Tab title="Fireworks AI">
+
+ ```python Code
+ from crewai import LLM
+
+ llm = LLM(
+ model="fireworks/meta-llama-3.1-8b-instruct",
+ base_url="https://api.fireworks.ai/inference/v1",
+ api_key="your-api-key-here"
+ )
+ agent = Agent(llm=llm, ...)
+ ```
+    </Tab>
+
+    <Tab title="Gemini">
+
+ ```python Code
+ from crewai import LLM
+
+ llm = LLM(
+ model="gemini/gemini-1.5-flash",
+        # No base_url needed: LiteLLM routes gemini/ models to Google's API
+ api_key="your-api-key-here"
+ )
+ agent = Agent(llm=llm, ...)
+ ```
+    </Tab>
+
+    <Tab title="Perplexity AI">
+
+ ```python Code
+ from crewai import LLM
+
+ llm = LLM(
+ model="perplexity/mistral-7b-instruct",
+        base_url="https://api.perplexity.ai",
+ api_key="your-api-key-here"
+ )
+ agent = Agent(llm=llm, ...)
+ ```
+    </Tab>
+
+    <Tab title="IBM watsonx.ai">
+
+ ```python Code
+ from crewai import LLM
+
+ llm = LLM(
+ model="watsonx/ibm/granite-13b-chat-v2",
+ base_url="https://api.watsonx.ai/v1",
+ api_key="your-api-key-here"
+ )
+ agent = Agent(llm=llm, ...)
+ ```
+    </Tab>
+</Tabs>
## Changing the Base API URL