mirror of
https://github.com/crewAIInc/crewAI.git
synced 2026-01-10 08:38:30 +00:00
updating LLM docs
@@ -25,7 +25,100 @@ By default, CrewAI uses the `gpt-4o-mini` model. It uses environment variables i
- `OPENAI_API_BASE`
- `OPENAI_API_KEY`
### 2. Updating YAML files

You can update the `agents.yml` file to refer to the LLM you want to use:

```yaml Code
researcher:
  role: Research Specialist
  goal: Conduct comprehensive research and analysis to gather relevant information,
    synthesize findings, and produce well-documented insights.
  backstory: A dedicated research professional with years of experience in academic
    investigation, literature review, and data analysis, known for thorough and
    methodical approaches to complex research questions.
  verbose: true
  llm: openai/gpt-4o
  # llm: azure/gpt-4o-mini
  # llm: gemini/gemini-pro
  # llm: anthropic/claude-3-5-sonnet-20240620
  # llm: bedrock/anthropic.claude-3-sonnet-20240229-v1:0
  # llm: mistral/mistral-large-latest
  # llm: ollama/llama3:70b
  # llm: groq/llama-3.2-90b-vision-preview
  # llm: watsonx/meta-llama/llama-3-1-70b-instruct
  # ...
```
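The `llm` value above follows a `provider/model` naming convention: the first path segment names the provider and the remainder names the model. As a small illustration of that convention (the helper below is not part of CrewAI, just a sketch):

```python
# Illustrative helper (not part of the CrewAI API): split a "provider/model"
# llm string, as used in agents.yml, into its provider prefix and model id.
def split_llm_id(llm: str) -> tuple[str, str]:
    # partition() splits at the FIRST "/", so model ids that themselves
    # contain slashes (e.g. watsonx paths) stay intact.
    provider, _, model = llm.partition("/")
    return provider, model

print(split_llm_id("openai/gpt-4o"))
# -> ('openai', 'gpt-4o')
print(split_llm_id("watsonx/meta-llama/llama-3-1-70b-instruct"))
# -> ('watsonx', 'meta-llama/llama-3-1-70b-instruct')
```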

Keep in mind that you will need to set certain ENV vars, depending on the model you are using, to provide the credentials, or set a custom LLM object as described below. Here are some of the required ENV vars for some of the LLM integrations:

<AccordionGroup>
<Accordion title="OpenAI">

```python Code
OPENAI_API_KEY=<your-api-key>
OPENAI_API_BASE=<optional-custom-base-url> # OPTIONAL
OPENAI_MODEL_NAME=<openai-model-name>
OPENAI_ORGANIZATION=<your-org-id> # OPTIONAL
```

</Accordion>

<Accordion title="Anthropic">

```python Code
ANTHROPIC_API_KEY=<your-api-key>
```

</Accordion>

<Accordion title="Google">

```python Code
GEMINI_API_KEY=<your-api-key>
```

</Accordion>

<Accordion title="Azure">

```python Code
AZURE_API_KEY=<your-api-key> # "my-azure-api-key"
AZURE_API_BASE=<your-resource-url> # "https://example-endpoint.openai.azure.com"
AZURE_API_VERSION=<api-version> # "2023-05-15"
AZURE_AD_TOKEN=<your-azure-ad-token> # Optional
AZURE_API_TYPE=<your-azure-api-type> # Optional
```

</Accordion>

<Accordion title="AWS Bedrock">

```python Code
AWS_ACCESS_KEY_ID=<your-access-key>
AWS_SECRET_ACCESS_KEY=<your-secret-key>
AWS_DEFAULT_REGION=<your-region>
```

</Accordion>

<Accordion title="Mistral">

```python Code
MISTRAL_API_KEY=<your-api-key>
```

</Accordion>

<Accordion title="Groq">

```python Code
GROQ_API_KEY=<your-api-key>
```

</Accordion>

<Accordion title="IBM watsonx.ai">

```python Code
WATSONX_URL=<your-url> # (required) Base URL of your WatsonX instance
WATSONX_APIKEY=<your-apikey> # IBM Cloud API key (either APIKEY or TOKEN is required)
WATSONX_TOKEN=<your-token> # IAM auth token (alternative to APIKEY)
WATSONX_PROJECT_ID=<your-project-id> # (optional) Project ID of your WatsonX instance
WATSONX_DEPLOYMENT_SPACE_ID=<your-space-id> # (optional) ID of deployment space for deployed models
```

</Accordion>
</AccordionGroup>

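For example, with OpenAI these variables are typically exported in the shell (or placed in a `.env` file) before running the crew; the values below are placeholders, not real credentials:

```shell
# Placeholder credentials for OpenAI -- replace with your real values.
export OPENAI_API_KEY="sk-your-api-key"
export OPENAI_MODEL_NAME="gpt-4o-mini"

# Sanity-check: confirm the variables are set in the current environment.
[ -n "$OPENAI_API_KEY" ] && echo "OPENAI_API_KEY is set"
```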
### 3. Custom LLM Objects

Pass a custom LLM implementation or object from another library.

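As a rough sketch of what such an object can look like (the class and `call` method below are illustrative assumptions, not CrewAI's exact interface), a custom LLM is essentially anything that maps a list of messages to a completion string:

```python
# Hypothetical sketch of a minimal custom LLM wrapper. The class name and
# `call` method are assumptions for illustration; consult the CrewAI docs
# for the exact interface expected of custom LLM objects.
class EchoLLM:
    """A stub 'LLM' that returns a canned completion -- useful for tests."""

    def __init__(self, model: str = "stub/echo-1"):
        self.model = model

    def call(self, messages: list[dict]) -> str:
        # A real implementation would forward `messages` to a provider API
        # and return the completion text.
        last = messages[-1]["content"]
        return f"[{self.model}] You said: {last}"


llm = EchoLLM()
reply = llm.call([{"role": "user", "content": "hello"}])
print(reply)
# -> [stub/echo-1] You said: hello
```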
@@ -224,6 +317,29 @@ These are examples of how to configure LLMs for your agent.
</Accordion>

<Accordion title="IBM watsonx.ai">

You can use IBM watsonx.ai by setting the following ENV vars:

```python Code
WATSONX_URL=<your-url>
WATSONX_APIKEY=<your-apikey>
WATSONX_PROJECT_ID=<your-project-id>
```

You can then define your agents' LLMs by updating the `agents.yml`:

```yaml Code
researcher:
  role: Research Specialist
  goal: Conduct comprehensive research and analysis to gather relevant information,
    synthesize findings, and produce well-documented insights.
  backstory: A dedicated research professional with years of experience in academic
    investigation, literature review, and data analysis, known for thorough and
    methodical approaches to complex research questions.
  verbose: true
  llm: watsonx/meta-llama/llama-3-1-70b-instruct
```

You can also set up agents more dynamically with a base-level LLM instance, like below:

```python Code
from crewai import LLM
```
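The diff is truncated at the import. A minimal sketch of how such a base-level `LLM` instance might be created follows; the constructor parameters are assumptions drawn from the provider strings earlier on this page, and the fallback stub exists only so the sketch runs without `crewai` installed:

```python
try:
    from crewai import LLM
except ImportError:
    # Fallback stub mirroring the assumed constructor shape, so this
    # sketch is runnable even when crewai is not installed.
    class LLM:
        def __init__(self, model: str, temperature: float = 0.7):
            self.model = model
            self.temperature = temperature

# Assumed parameters: a provider/model string as used in agents.yml,
# plus an optional sampling temperature.
llm = LLM(model="watsonx/meta-llama/llama-3-1-70b-instruct", temperature=0.7)
```

An `Agent` can then be given this object via its `llm` argument instead of a YAML string.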