Mirror of https://github.com/crewAIInc/crewAI.git, synced 2026-01-24 23:58:15 +00:00
docs: improve tool documentation and examples
- Update SerperDevTool documentation with accurate parameters and JSON response format
- Enhance XMLSearchTool and MDXSearchTool docs with RAG capabilities and required parameters
- Fix code block formatting across multiple tool documentation files
- Add clarification about environment variables and configuration
- Validate all examples against actual implementations
- Successfully tested with mkdocs build

Co-Authored-By: Joe Moura <joao@crewai.com>
@@ -31,7 +31,7 @@ Remember that when using this tool, the code must be generated by the Agent itse
The code must be Python3 code, and the first run will take some time because it needs to build the Docker image.

```python Code
```python
from crewai import Agent
from crewai_tools import CodeInterpreterTool

@@ -43,7 +43,7 @@ Agent(

We also provide a simple way to use it directly from the Agent.

```python Code
```python
from crewai import Agent

agent = Agent(

@@ -27,7 +27,7 @@ The following example demonstrates how to initialize the tool and execute a gith

1. Initialize Composio tools

```python Code
```python
from composio import App
from crewai_tools import ComposioTool
from crewai import Agent, Task
@@ -38,19 +38,19 @@ tools = [ComposioTool.from_action(action=Action.GITHUB_ACTIVITY_STAR_REPO_FOR_AU

If you don't know what action you want to use, use `from_app` and `tags` filter to get relevant actions

```python Code
```python
tools = ComposioTool.from_app(App.GITHUB, tags=["important"])
```

or use `use_case` to search relevant actions

```python Code
```python
tools = ComposioTool.from_app(App.GITHUB, use_case="Star a github repository")
```

2. Define agent

```python Code
```python
crewai_agent = Agent(
    role="Github Agent",
    goal="You take action on Github using Github APIs",
@@ -65,7 +65,7 @@ crewai_agent = Agent(

3. Execute task

```python Code
```python
task = Task(
    description="Star a repo ComposioHQ/composio on GitHub",
    agent=crewai_agent,
@@ -75,4 +75,4 @@ task = Task(
task.execute()
```

* More detailed list of tools can be found [here](https://app.composio.dev)

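Putting the three steps together, a minimal end-to-end sketch might look like the following. It assumes your Composio account is already authenticated with GitHub; the `backstory` and `expected_output` values are illustrative rather than taken from the documentation above.

```python
from composio import App
from crewai import Agent, Task
from crewai_tools import ComposioTool

# Step 1: look up a suitable GitHub action by use case
tools = ComposioTool.from_app(App.GITHUB, use_case="Star a github repository")

# Step 2: define the agent that will use the tool
crewai_agent = Agent(
    role="Github Agent",
    goal="You take action on Github using Github APIs",
    backstory="An AI agent that manages GitHub on the user's behalf",  # illustrative value
    verbose=True,
    tools=tools,
)

# Step 3: define and execute the task
task = Task(
    description="Star a repo ComposioHQ/composio on GitHub",
    agent=crewai_agent,
    expected_output="Confirmation that the repository was starred",  # illustrative value
)

task.execute()
```
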
@@ -22,7 +22,7 @@ pip install 'crewai[tools]'

Remember that when using this tool, the text must be generated by the Agent itself. The text must be a description of the image you want to generate.

```python Code
```python
from crewai_tools import DallETool

Agent(
@@ -31,9 +31,16 @@ Agent(
)
```

If needed you can also tweak the parameters of the DALL-E model by passing them as arguments to the `DallETool` class. For example:
## Arguments

```python Code
- `model`: The DALL-E model to use (e.g., "dall-e-3")
- `size`: Image size (e.g., "1024x1024")
- `quality`: Image quality ("standard" or "hd")
- `n`: Number of images to generate

## Configuration Example

```python
from crewai_tools import DallETool

dalle_tool = DallETool(model="dall-e-3",
@@ -48,4 +55,4 @@ Agent(
```

The parameters are based on the `client.images.generate` method from the OpenAI API. For more information on the parameters,
please refer to the [OpenAI API documentation](https://platform.openai.com/docs/guides/images/introduction?lang=python).

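As a quick illustration of the arguments listed above, a fully spelled-out configuration might look like the sketch below; the specific values are illustrative, and the agent fields are ordinary role/goal/backstory placeholders.

```python
from crewai import Agent
from crewai_tools import DallETool

# Illustrative values for the arguments listed above
dalle_tool = DallETool(
    model="dall-e-3",
    size="1024x1024",
    quality="standard",
    n=1,
)

Agent(
    role="Illustrator",  # placeholder
    goal="Generate images from textual descriptions",  # placeholder
    backstory="An agent that turns prompts into images",  # placeholder
    tools=[dalle_tool],
)
```
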
@@ -20,11 +20,11 @@ Install the crewai_tools package to use the `FileWriterTool` in your projects:
pip install 'crewai[tools]'
```

## Example
## Usage

To get started with the `FileWriterTool`:
Here's how to use the `FileWriterTool`:

```python Code
```python
from crewai_tools import FileWriterTool

# Initialize the tool
@@ -45,4 +45,4 @@ print(result)

By integrating the `FileWriterTool` into your crews, the agents can execute the process of writing content to files and creating directories.
This tool is essential for tasks that require saving output data, creating structured file systems, and more. By adhering to the setup and usage guidelines provided,
incorporating this tool into projects is straightforward and efficient.

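A minimal sketch of direct use might look like the following; the keyword names (`filename`, `content`, `directory`) are assumptions based on the behaviour described above rather than a verbatim copy of the tool's signature, so check the tool's argument schema before relying on them.

```python
from crewai_tools import FileWriterTool

# Initialize the tool
file_writer_tool = FileWriterTool()

# Write content to a file inside a (possibly newly created) directory.
# Argument names are assumed from the described behaviour; verify against the tool's schema.
result = file_writer_tool._run(
    filename="ai.txt",
    content="AI is transforming how crews of agents collaborate.",
    directory="data",
)
print(result)
```
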
@@ -23,11 +23,11 @@ To install the JSONSearchTool, use the following pip command:
pip install 'crewai[tools]'
```

## Usage Examples
## Example

Here are updated examples of how to utilize the JSONSearchTool effectively for searching within JSON files. These examples take into account the current implementation and usage patterns identified in the codebase.

```python Code
```python
from crewai.json_tools import JSONSearchTool  # Updated import path

# General JSON content search
@@ -47,7 +47,7 @@ tool = JSONSearchTool(json_path='./path/to/your/file.json')

The JSONSearchTool supports extensive customization through a configuration dictionary. This allows users to select different models for embeddings and summarization based on their requirements.

```python Code
```python
tool = JSONSearchTool(
    config={
        "llm": {
@@ -70,4 +70,4 @@ tool = JSONSearchTool(
        },
    }
)
```

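For illustration, here is one way the configuration dictionary above might be filled in; the provider names and model identifiers are illustrative, and any provider supported by the underlying RAG configuration can be substituted.

```python
from crewai.json_tools import JSONSearchTool  # Updated import path

tool = JSONSearchTool(
    config={
        "llm": {
            "provider": "ollama",  # illustrative; e.g. openai, google, anthropic, ...
            "config": {
                "model": "llama2",
                "temperature": 0.5,
            },
        },
        "embedder": {
            "provider": "google",  # illustrative
            "config": {
                "model": "models/embedding-001",
                "task_type": "retrieval_document",
            },
        },
    }
)
```
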
@@ -22,11 +22,13 @@ Before using the MDX Search Tool, ensure the `crewai_tools` package is installed
pip install 'crewai[tools]'
```

## Usage Example
## Example

To use the MDX Search Tool, you must first set up the necessary environment variables. Then, integrate the tool into your crewAI project to begin your market research. Below is a basic example of how to do this:
To use the MDX Search Tool, you must first set up the necessary environment variables for your chosen LLM and embeddings providers (e.g., OpenAI API key if using the default configuration). Then, integrate the tool into your crewAI project as shown in the examples below.

```python Code
The tool will perform semantic search using RAG technology to find and extract relevant content from your MDX files based on the search query:

```python
from crewai_tools import MDXSearchTool

# Initialize the tool to search any MDX content it learns about during execution
@@ -40,13 +42,16 @@ tool = MDXSearchTool(mdx='path/to/your/document.mdx')

## Parameters

- mdx: **Optional**. Specifies the MDX file path for the search. It can be provided during initialization.
- `mdx`: **Optional**. Specifies the MDX file path for the search. It can be provided during initialization or when running the search.
- `search_query`: **Required**. The query string to search for within the MDX content.

The tool inherits from `RagTool` which provides advanced RAG (Retrieval-Augmented Generation) capabilities for semantic search within MDX content.

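As a small sketch of the two parameters above, the MDX file can be fixed at construction time and the query supplied when the tool runs; the keyword names follow the parameter list, and the query text is illustrative.

```python
from crewai_tools import MDXSearchTool

# Fix the MDX file at initialization
tool = MDXSearchTool(mdx="path/to/your/document.mdx")

# Supply the required search query at run time
print(tool.run(search_query="key findings of the market research"))
```
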
## Customization of Model and Embeddings

The tool defaults to using OpenAI for embeddings and summarization. For customization, utilize a configuration dictionary as shown below:

```python Code
```python
tool = MDXSearchTool(
    config=dict(
        llm=dict(
@@ -70,4 +75,4 @@ tool = MDXSearchTool(
        ),
    )
)
```

@@ -27,7 +27,7 @@ pip install 'crewai[tools]'

The following example demonstrates how to initialize the tool and execute a search with a given query:

```python Code
```python
from crewai_tools import SerperDevTool

# Initialize the tool for internet searching capabilities
@@ -44,22 +44,27 @@ To effectively use the `SerperDevTool`, follow these steps:

## Parameters

The `SerperDevTool` comes with several parameters that will be passed to the API :
The `SerperDevTool` comes with several parameters that can be configured:

- **search_url**: The URL endpoint for the search API. (Default is `https://google.serper.dev/search`)
- **base_url**: The base URL for the Serper API. Default is `https://google.serper.dev`.
- **n_results**: Number of search results to return. Default is `10`.
- **save_file**: Boolean flag to save search results to a file. Default is `False`.
- **search_type**: Type of search to perform. Can be either `search` (default) or `news`.

Additional parameters that can be passed during search:
- **country**: Optional. Specify the country for the search results.
- **location**: Optional. Specify the location for the search results.
- **locale**: Optional. Specify the locale for the search results.
- **n_results**: Number of search results to return. Default is `10`.

The values for `country`, `location`, `locale` and `search_url` can be found on the [Serper Playground](https://serper.dev/playground).
The values for `country`, `location`, and `locale` can be found on the [Serper Playground](https://serper.dev/playground).

Note: The tool requires the `SERPER_API_KEY` environment variable to be set with your Serper API key.

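For instance, a news-focused configuration built from the constructor parameters above might look like this sketch; the query text and result count are illustrative.

```python
from crewai_tools import SerperDevTool

# Constructor parameters from the list above
tool = SerperDevTool(
    n_results=5,
    save_file=False,
    search_type="news",
)

# SERPER_API_KEY must be set in the environment before running
print(tool.run(search_query="latest developments in large language models"))
```
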
## Example with Parameters

Here is an example demonstrating how to use the tool with additional parameters:

```python Code
```python
from crewai_tools import SerperDevTool

tool = SerperDevTool(
@@ -71,18 +76,29 @@ print(tool.run(search_query="ChatGPT"))

# Using Tool: Search the internet

# Search results: Title: Role of chat gpt in public health
# Link: https://link.springer.com/article/10.1007/s10439-023-03172-7
# Snippet: … ChatGPT in public health. In this overview, we will examine the potential uses of ChatGPT in
# ---
# Title: Potential use of chat gpt in global warming
# Link: https://link.springer.com/article/10.1007/s10439-023-03171-8
# Snippet: … as ChatGPT, have the potential to play a critical role in advancing our understanding of climate
# ---
# Search results:
{
  "searchParameters": {
    "q": "ChatGPT",
    "type": "search"
  },
  "organic": [
    {
      "title": "Role of chat gpt in public health",
      "link": "https://link.springer.com/article/10.1007/s10439-023-03172-7",
      "snippet": "ChatGPT in public health. In this overview, we will examine the potential uses of ChatGPT in"
    },
    {
      "title": "Potential use of chat gpt in global warming",
      "link": "https://link.springer.com/article/10.1007/s10439-023-03171-8",
      "snippet": "as ChatGPT, have the potential to play a critical role in advancing our understanding of climate"
    }
  ]
}

```

```python Code
```python
from crewai_tools import SerperDevTool

tool = SerperDevTool(

@@ -26,7 +26,7 @@ pip install spider-client 'crewai[tools]'
This example shows you how you can use the `SpiderTool` to enable your agent to scrape and crawl websites.
The data returned from the Spider API is already LLM-ready, so no need to do any cleaning there.

```python Code
```python
from crewai_tools import SpiderTool

def main():
@@ -89,4 +89,4 @@ if __name__ == "__main__":
| **query_selector** | `string` | CSS query selector for content extraction from markup. |
| **full_resources** | `bool` | Downloads all resources linked to the website. |
| **request_timeout** | `int` | Timeout in seconds for requests (5-60). Default is `30`. |
| **run_in_background** | `bool` | Runs the request in the background, useful for data storage and triggering dashboard crawls. No effect if `storageless` is set. |

@@ -19,11 +19,11 @@ Install the crewai_tools package
pip install 'crewai[tools]'
```

## Usage
## Example

In order to use the VisionTool, the OpenAI API key should be set in the environment variable `OPENAI_API_KEY`.
To use the VisionTool, first ensure the OpenAI API key is set in the environment variable `OPENAI_API_KEY`. Here's an example:

```python Code
```python
from crewai_tools import VisionTool

vision_tool = VisionTool()

@@ -27,11 +27,11 @@ pip install 'crewai[tools]'

This command installs the necessary dependencies to ensure that once the tool is fully integrated, users can start using it immediately.

## Example Usage
## Example

Below are examples of how the WebsiteSearchTool could be utilized in different scenarios. Please note, these examples are illustrative and represent planned functionality:

```python Code
```python
from crewai_tools import WebsiteSearchTool

# Example of initiating tool that agents can use
@@ -52,7 +52,7 @@ tool = WebsiteSearchTool(website='https://example.com')
By default, the tool uses OpenAI for both embeddings and summarization. To customize the model, you can use a config dictionary as follows:

```python Code
```python
tool = WebsiteSearchTool(
    config=dict(
        llm=dict(
@@ -74,4 +74,4 @@ tool = WebsiteSearchTool(
        ),
    )
)
```

@@ -29,7 +29,9 @@ pip install 'crewai[tools]'
Here are two examples demonstrating how to use the XMLSearchTool.
The first example shows searching within a specific XML file, while the second example illustrates initiating a search without predefining an XML path, providing flexibility in search scope.

```python Code
Note: The tool uses RAG (Retrieval-Augmented Generation) to perform semantic search within XML content, so results will include relevant context from the XML file based on the search query.

```python
from crewai_tools import XMLSearchTool

# Allow agents to search within any XML file's content
@@ -47,12 +49,15 @@ tool = XMLSearchTool(xml='path/to/your/xmlfile.xml')

- `xml`: This is the path to the XML file you wish to search.
It is an optional parameter during the tool's initialization but must be provided either at initialization or as part of the `run` method's arguments to execute a search.
- `search_query`: The query string to search for within the XML content. This is a required parameter when running the search.

The tool inherits from `RagTool` which provides advanced RAG (Retrieval-Augmented Generation) capabilities for semantic search within XML content.

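Following the parameters above, here is a short sketch of supplying the XML path at run time rather than at initialization; the keyword names follow the parameter list, and the query text is illustrative.

```python
from crewai_tools import XMLSearchTool

# No XML file fixed at initialization
tool = XMLSearchTool()

# Provide both the file path and the required query when running the search
print(tool.run(xml="path/to/your/xmlfile.xml", search_query="error codes defined in the file"))
```
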
## Custom model and embeddings

By default, the tool uses OpenAI for both embeddings and summarization. To customize the model, you can use a config dictionary as follows:

```python Code
```python
tool = XMLSearchTool(
    config=dict(
        llm=dict(
@@ -74,4 +79,4 @@ tool = XMLSearchTool(
        ),
    )
)
```
