Update docs (#1550)

* add llm providers accordion group

* fix numbering

* Fix directory tree & add llms to accordion
Tony Kipkemboi
2024-11-01 15:58:36 -04:00
committed by GitHub
parent ae33fdd05f
commit ad41065b03
2 changed files with 31 additions and 21 deletions

@@ -577,18 +577,20 @@ This command will create a new directory for your crew within the `crews` folder
After adding a new crew, your folder structure will look like this:
```
name_of_flow/
├── crews/
│ ├── poem_crew/
│ │ ├── config/
│ │ │ ├── agents.yaml
│ │ │ └── tasks.yaml
│ │ └── poem_crew.py
│ └── name_of_crew/
│   ├── config/
│   │ ├── agents.yaml
│   │ └── tasks.yaml
│   └── name_of_crew.py
```
| Directory/File            | Description                                                         |
| :------------------------ | :------------------------------------------------------------------ |
| `name_of_flow/`           | Root directory for the flow.                                         |
| ├── `crews/`              | Contains directories for specific crews.                             |
| │ ├── `poem_crew/`        | Directory for the "poem_crew" with its configurations and scripts.   |
| │ │ ├── `config/`         | Configuration files directory for the "poem_crew".                   |
| │ │ │ ├── `agents.yaml`   | YAML file defining the agents for "poem_crew".                       |
| │ │ │ └── `tasks.yaml`    | YAML file defining the tasks for "poem_crew".                        |
| │ │ └── `poem_crew.py`    | Script for "poem_crew" functionality.                                |
| │ └── `name_of_crew/`     | Directory for the new crew.                                          |
| │   ├── `config/`         | Configuration files directory for the new crew.                      |
| │   │ ├── `agents.yaml`   | YAML file defining the agents for the new crew.                      |
| │   │ └── `tasks.yaml`    | YAML file defining the tasks for the new crew.                       |
| │   └── `name_of_crew.py` | Script for the new crew functionality.                               |
You can then customize the `agents.yaml` and `tasks.yaml` files to define the agents and tasks for your new crew. The `name_of_crew.py` file will contain the crew's logic, which you can modify to suit your needs.
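To make that customization concrete, here is a minimal sketch of what `name_of_crew.py` could contain, assuming the standard `CrewBase` scaffold that `crewai create` generates. The `researcher` and `research_task` keys are hypothetical names that would have to match entries you define in `agents.yaml` and `tasks.yaml`.

```python Code
from crewai import Agent, Crew, Process, Task
from crewai.project import CrewBase, agent, crew, task


@CrewBase
class NameOfCrew:
    """Sketch of a crew wired to the generated config files."""

    agents_config = "config/agents.yaml"
    tasks_config = "config/tasks.yaml"

    @agent
    def researcher(self) -> Agent:
        # "researcher" is a hypothetical key defined in agents.yaml
        return Agent(config=self.agents_config["researcher"], verbose=True)

    @task
    def research_task(self) -> Task:
        # "research_task" is a hypothetical key defined in tasks.yaml
        return Task(config=self.tasks_config["research_task"])

    @crew
    def crew(self) -> Crew:
        # The @agent/@task decorators collect the pieces into self.agents/self.tasks
        return Crew(
            agents=self.agents,
            tasks=self.tasks,
            process=Process.sequential,
        )
```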

@@ -131,7 +131,6 @@ These are examples of how to configure LLMs for your agent.
llm = LLM(
model="cerebras/llama-3.1-70b",
base_url="https://api.cerebras.ai/v1",
api_key="your-api-key-here"
)
agent = Agent(llm=llm, ...)
@@ -166,7 +165,6 @@ These are examples of how to configure LLMs for your agent.
llm = LLM(
model="groq/llama3-8b-8192",
base_url="https://api.groq.com/openai/v1",
api_key="your-api-key-here"
)
agent = Agent(llm=llm, ...)
@@ -180,21 +178,18 @@ These are examples of how to configure LLMs for your agent.
llm = LLM(
model="anthropic/claude-3-5-sonnet-20241022",
base_url="https://api.anthropic.com/v1",
api_key="your-api-key-here"
)
agent = Agent(llm=llm, ...)
```
</Accordion>
<Accordion title="Fireworks">
<Accordion title="Fireworks AI">
```python Code
from crewai import LLM
llm = LLM(
model="fireworks/meta-llama-3.1-8b-instruct",
base_url="https://api.fireworks.ai/inference/v1",
model="fireworks_ai/accounts/fireworks/models/llama-v3-70b-instruct",
api_key="your-api-key-here"
)
agent = Agent(llm=llm, ...)
@@ -207,8 +202,7 @@ These are examples of how to configure LLMs for your agent.
from crewai import LLM
llm = LLM(
model="gemini/gemini-1.5-flash",
base_url="https://api.gemini.google.com/v1",
model="gemini/gemini-1.5-pro-002",
api_key="your-api-key-here"
)
agent = Agent(llm=llm, ...)
@@ -242,6 +236,20 @@ These are examples of how to configure LLMs for your agent.
agent = Agent(llm=llm, ...)
```
</Accordion>
<Accordion title="Hugging Face">
```python Code
from crewai import LLM
llm = LLM(
model="huggingface/meta-llama/Meta-Llama-3.1-8B-Instruct",
api_key="your-api-key-here",
base_url="your_api_endpoint"
)
agent = Agent(llm=llm, ...)
```
</Accordion>
</AccordionGroup>
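Each accordion above ends with `agent = Agent(llm=llm, ...)`, where the `...` stands for the agent's own fields. As a rough illustration (the role, goal, and backstory values below are made up, and any provider from the accordions can be swapped in), a fully specified agent might look like this:

```python Code
from crewai import Agent, LLM

# Reusing the Gemini example from the accordion above; any provider works the same way.
llm = LLM(
    model="gemini/gemini-1.5-pro-002",
    api_key="your-api-key-here",
)

agent = Agent(
    role="Research Analyst",                     # hypothetical role
    goal="Summarize newly published AI papers",  # hypothetical goal
    backstory="A meticulous analyst with an ML research background.",
    llm=llm,
)
```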
## Changing the Base API URL