From af03042852a5d206174756b5b24b868716b30f29 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Jo=C3=A3o=20Moura?=
Date: Mon, 19 Feb 2024 22:01:09 -0300
Subject: [PATCH] Updating docs

---
 docs/core-concepts/Tasks.md    | 3 +++
 docs/how-to/LLM-Connections.md | 2 +-
 2 files changed, 4 insertions(+), 1 deletion(-)

diff --git a/docs/core-concepts/Tasks.md b/docs/core-concepts/Tasks.md
index 6b73c1a92..cec0eb4ff 100644
--- a/docs/core-concepts/Tasks.md
+++ b/docs/core-concepts/Tasks.md
@@ -19,6 +19,9 @@ Tasks in CrewAI can be designed to require collaboration between agents. For exa
 | **Tools** *(optional)* | These are the functions or capabilities the agent can utilize to perform the task. They can be anything from simple actions like 'search' to more complex interactions with other agents or APIs. |
 | **Async Execution** *(optional)* | If the task should be executed asynchronously. |
 | **Context** *(optional)* | Other tasks that will have their output used as context for this task, if one is an asynchronous task it will wait for that to finish |
+| **Output JSON** *(optional)* | Takes a pydantic model and returns the output as a JSON object. **The agent's LLM must use the OpenAI client; it can be Ollama, for example, but through the OpenAI-compatible wrapper.** |
+| **Output Pydantic** *(optional)* | Takes a pydantic model and returns the output as a pydantic object. **The agent's LLM must use the OpenAI client; it can be Ollama, for example, but through the OpenAI-compatible wrapper.** |
+| **Output File** *(optional)* | Takes a file path and saves the output of the task to it. |
 | **Callback** *(optional)* | A function to be executed after the task is completed. |

 ## Creating a Task

diff --git a/docs/how-to/LLM-Connections.md b/docs/how-to/LLM-Connections.md
index e9bcd7b28..fbf774425 100644
--- a/docs/how-to/LLM-Connections.md
+++ b/docs/how-to/LLM-Connections.md
@@ -15,7 +15,7 @@ Ollama is preferred for local LLM integration, offering customization and privac

 ### Setting Up Ollama
 - **Installation**: Follow Ollama's guide for setup.
-- **Configuration**: [Adjust your local model with a Modelfile](https://github.com/jmorganca/ollama/blob/main/docs/modelfile.md), considering adding `Observation` as a stop word and playing with parameters like `top_p` and `temperature`.
+- **Configuration**: [Adjust your local model with a Modelfile](https://github.com/jmorganca/ollama/blob/main/docs/modelfile.md), considering adding `Result` as a stop word and playing with parameters like `top_p` and `temperature`.

 ### Integrating Ollama with CrewAI
 Instantiate Ollama and pass it to your agents within CrewAI, enhancing them with the local model's capabilities.
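The Modelfile configuration the patch now recommends (stop word `Result`, tuning `top_p` and `temperature`) could look like the sketch below; the base model `openhermes` and the parameter values are placeholders, not part of the patch.

```
FROM openhermes
PARAMETER stop "Result"
PARAMETER top_p 0.9
PARAMETER temperature 0.7
```

A model built from such a file with `ollama create my-crewai-model -f Modelfile` (the model name here is arbitrary) can then be instantiated and passed to an agent, as the "Integrating Ollama with CrewAI" section describes.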
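The new output attributes the patch documents might be used roughly as follows. This is a sketch, not the library's canonical example: the `BlogPost` model and its fields are made up for illustration, and the exact `Task` constructor arguments around `output_pydantic` / `output_json` / `output_file` are assumptions about the CrewAI API.

```python
from pydantic import BaseModel

# Hypothetical schema for the structured task output; the name and fields
# are invented for this sketch.
class BlogPost(BaseModel):
    title: str
    body: str

# With CrewAI installed, the model would be handed to the task roughly like
# this (remember: the agent's LLM must go through the OpenAI client, e.g.
# Ollama via the OpenAI-compatible wrapper):
#
#   from crewai import Task
#   task = Task(
#       description="Write a short blog post about CrewAI.",
#       agent=writer,              # a previously defined Agent
#       output_pydantic=BlogPost,  # or output_json=BlogPost for a JSON object
#       output_file="blog_post.md" # optionally persist the result to disk
#   )

# Either way, the structured result behaves like any pydantic object:
post = BlogPost(title="Hello CrewAI", body="A two-line example post.")
print(post.title)
```

The practical difference between the two options is only the return type: `output_json` yields a plain JSON object matching the model's schema, while `output_pydantic` yields a validated instance of the model itself.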