Add llm providers accordion group (#1534)

* add llm providers accordion group

* fix numbering
Author: Tony Kipkemboi
Date: 2024-10-30 21:56:13 -04:00
Committed by: GitHub
Parent: 4ae07468f3
Commit: ec2967c362


@@ -25,42 +25,43 @@ By default, CrewAI uses the `gpt-4o-mini` model. It uses environment variables i
- `OPENAI_API_BASE`
- `OPENAI_API_KEY`
### 2. Custom LLM Objects
Pass a custom LLM implementation or object from another library.
See below for examples.
<Tabs>
<Tab title="String Identifier">
```python Code
agent = Agent(llm="gpt-4o", ...)
```
</Tab>
<Tab title="LLM Instance">
List of [more providers](https://docs.litellm.ai/docs/providers).
```python Code
from crewai import LLM

llm = LLM(model="gpt-4", temperature=0.7)
agent = Agent(llm=llm, ...)
```
</Tab>
</Tabs>
## Connecting to OpenAI-Compatible LLMs

You can connect to OpenAI-compatible LLMs using either environment variables or by setting specific attributes on the LLM class:
<Tabs>
<Tab title="Using Environment Variables">
```python Code
import os

os.environ["OPENAI_API_KEY"] = "your-api-key"
os.environ["OPENAI_API_BASE"] = "https://api.your-provider.com/v1"
```
</Tab>
<Tab title="Using LLM Class Attributes">
```python Code
from crewai import LLM
@@ -71,6 +72,8 @@ llm = LLM(
)
agent = Agent(llm=llm, ...)
```
</Tab>
</Tabs>
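Both tabs above set the same values: an API key and a base URL. As a sketch of how to validate them up front, a small helper can read the environment and fail fast before any agent is constructed; `openai_compat_env` is a hypothetical helper written for this page, not a CrewAI API:

```python Code
import os

def openai_compat_env(default_base="https://api.openai.com/v1"):
    """Gather the OpenAI-compatible settings that would otherwise be
    read from the environment, raising early if the key is missing.

    Hypothetical helper for illustration; not part of CrewAI."""
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError("OPENAI_API_KEY is not set")
    return {
        "api_key": key,
        "base_url": os.environ.get("OPENAI_API_BASE", default_base),
    }

os.environ["OPENAI_API_KEY"] = "your-api-key"
os.environ["OPENAI_API_BASE"] = "https://api.your-provider.com/v1"
cfg = openai_compat_env()
# cfg can then be unpacked into LLM(model=..., **cfg)
```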
## LLM Configuration Options
@@ -97,7 +100,10 @@ When configuring an LLM for your agent, you have access to a wide range of param
| **api_key** | `str` | Your API key for authentication. |
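Several of these parameters are often set together. As a minimal sketch, they can be collected once in a dictionary and unpacked into the `LLM` constructor; only `model`, `temperature`, `base_url`, and `api_key` from the examples on this page are used here, and `llm_config` is just an illustrative variable name:

```python Code
# Build the configuration once, then unpack it with LLM(**llm_config).
llm_config = {
    "model": "gpt-4",                # provider/model identifier, as in the examples below
    "temperature": 0.7,              # sampling temperature from the table above
    "api_key": "your-api-key-here",
    "base_url": "https://api.your-provider.com/v1",
}
# from crewai import LLM
# llm = LLM(**llm_config)
```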
These are examples of how to configure LLMs for your agent.
<AccordionGroup>
<Accordion title="OpenAI">
```python Code
from crewai import LLM
@@ -116,8 +122,9 @@ llm = LLM(
)
agent = Agent(llm=llm, ...)
```
</Accordion>
<Accordion title="Cerebras">
```python Code
from crewai import LLM
@@ -129,8 +136,9 @@ llm = LLM(
)
agent = Agent(llm=llm, ...)
```
</Accordion>
<Accordion title="Ollama (Local LLMs)">
CrewAI supports using Ollama for running open-source models locally:
@@ -142,10 +150,99 @@ CrewAI supports using Ollama for running open-source models locally:
from crewai import LLM

agent = Agent(
    llm=LLM(
        model="ollama/llama3.1",
        base_url="http://localhost:11434"
    ),
    ...
)
```
</Accordion>
<Accordion title="Groq">
```python Code
from crewai import LLM
llm = LLM(
model="groq/llama3-8b-8192",
base_url="https://api.groq.com/openai/v1",
api_key="your-api-key-here"
)
agent = Agent(llm=llm, ...)
```
</Accordion>
<Accordion title="Anthropic">
```python Code
from crewai import LLM
llm = LLM(
model="anthropic/claude-3-5-sonnet-20241022",
base_url="https://api.anthropic.com/v1",
api_key="your-api-key-here"
)
agent = Agent(llm=llm, ...)
```
</Accordion>
<Accordion title="Fireworks">
```python Code
from crewai import LLM
llm = LLM(
model="fireworks/meta-llama-3.1-8b-instruct",
base_url="https://api.fireworks.ai/inference/v1",
api_key="your-api-key-here"
)
agent = Agent(llm=llm, ...)
```
</Accordion>
<Accordion title="Gemini">
```python Code
from crewai import LLM
llm = LLM(
model="gemini/gemini-1.5-flash",
base_url="https://api.gemini.google.com/v1",
api_key="your-api-key-here"
)
agent = Agent(llm=llm, ...)
```
</Accordion>
<Accordion title="Perplexity AI (pplx-api)">
```python Code
from crewai import LLM
llm = LLM(
model="perplexity/mistral-7b-instruct",
base_url="https://api.perplexity.ai/v1",
api_key="your-api-key-here"
)
agent = Agent(llm=llm, ...)
```
</Accordion>
<Accordion title="IBM watsonx.ai">
```python Code
from crewai import LLM
llm = LLM(
model="watsonx/ibm/granite-13b-chat-v2",
base_url="https://api.watsonx.ai/v1",
api_key="your-api-key-here"
)
agent = Agent(llm=llm, ...)
```
</Accordion>
</AccordionGroup>
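The accordion entries above differ only in `model`, `base_url`, and `api_key`. As a sketch (using the provider settings shown above; `PROVIDERS` and `llm_kwargs` are hypothetical names, not CrewAI APIs), these can be centralized so that switching providers is a one-line change:

```python Code
# Per-provider settings copied from the examples above (hypothetical registry).
PROVIDERS = {
    "groq": ("groq/llama3-8b-8192", "https://api.groq.com/openai/v1"),
    "anthropic": ("anthropic/claude-3-5-sonnet-20241022", "https://api.anthropic.com/v1"),
    "fireworks": ("fireworks/meta-llama-3.1-8b-instruct", "https://api.fireworks.ai/inference/v1"),
}

def llm_kwargs(provider: str, api_key: str) -> dict:
    """Return keyword arguments suitable for LLM(**kwargs)."""
    try:
        model, base_url = PROVIDERS[provider]
    except KeyError:
        raise ValueError(f"unknown provider: {provider}") from None
    return {"model": model, "base_url": base_url, "api_key": api_key}

# from crewai import LLM
# llm = LLM(**llm_kwargs("groq", "your-api-key-here"))
```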
## Changing the Base API URL ## Changing the Base API URL