Compare commits

...

6 Commits

Author SHA1 Message Date
Gui Vieira
9a210afd80 Fix types 2024-02-08 18:34:04 -03:00
João Moura
44b6bcbcaa preparing version 0.5.5 2024-02-07 23:13:39 -08:00
João Moura
a45c82c5f7 fixing RPM controller being set unnecessarily 2024-02-07 23:09:36 -08:00
João Moura
98133a4eb6 Adding new crew specific docs 2024-02-07 23:09:16 -08:00
João Moura
44c2fd223d preparing version 0.5.4 2024-02-07 22:22:33 -08:00
João Moura
fc249eefda adding initial telemetry 2024-02-07 22:21:44 -08:00
16 changed files with 581 additions and 32 deletions

View File

@@ -30,6 +30,7 @@
- [How CrewAI Compares](#how-crewai-compares)
- [Contribution](#contribution)
- [Hire CrewAI](#hire-crewai)
- [Telemetry](#telemetry)
- [License](#license)
## Why CrewAI?
@@ -243,6 +244,24 @@ pip install dist/*.tar.gz
We're a company developing crewAI and crewAI Enterprise. For a limited time, we are offering consulting to selected customers to give them early access to our enterprise solution.
If you are interested in having access to it and hiring weekly hours with our team, feel free to email us at [joao@crewai.com](mailto:joao@crewai.com).
## Telemetry
CrewAI uses anonymous telemetry to collect usage data, with the main purpose of helping us improve the library by focusing our efforts on the most used features, integrations, and tools.
There is NO data collected on prompts, task descriptions, agents' backstories or goals, tool usage, API calls, responses, any data processed by the agents, or any secrets and environment variables.
Data collected includes:
- Version of crewAI
- Version of Python
- General OS (e.g. number of CPUs, macOS/Windows/Linux)
- Number of agents and tasks in a crew
- Crew Process being used
- Whether agents are using memory or allowing delegation
- Whether tasks are executed in parallel or sequentially
- Language model being used
- Roles of agents in a crew
- Names of the tools available
## License
CrewAI is released under the MIT License.

View File

@@ -0,0 +1,75 @@
---
title: crewAI Crews
description: Understanding and utilizing crews in the crewAI framework.
---
## What is a Crew?
!!! note "Definition of a Crew"
A crew in crewAI represents a collaborative group of agents working together to achieve a set of tasks. Each crew defines the strategy for task execution, agent collaboration, and the overall workflow.
## Crew Attributes
| Attribute | Description |
| :------------------- | :----------------------------------------------------------- |
| **Tasks** | A list of tasks assigned to the crew. |
| **Agents** | A list of agents that are part of the crew. |
| **Process** | The process flow (e.g., sequential, hierarchical) the crew follows. |
| **Verbose** | The verbosity level for logging during execution. |
| **Manager LLM** | The language model used by the manager agent in a hierarchical process. |
| **Config** | Configuration settings for the crew. |
| **Max RPM** | Maximum requests per minute the crew adheres to during execution. |
| **Language** | Language setting for the crew's operation. |
!!! note "Crew Max RPM"
The `max_rpm` attribute sets the maximum number of requests per minute the crew can perform, to avoid rate limits, and it will override individual agents' `max_rpm` settings if you set it, as sketched below.
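For instance, a crew-wide cap can be set when the crew is assembled (a minimal sketch, assuming agents and tasks like those defined in the example below):
```python
from crewai import Crew, Process

# max_rpm throttles the whole crew to 10 requests per minute,
# overriding any max_rpm configured on individual agents.
rate_limited_crew = Crew(
    agents=[researcher, writer],
    tasks=[research_task, write_article_task],
    process=Process.sequential,
    max_rpm=10
)
```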
## Creating a Crew
!!! note "Crew Composition"
When assembling a crew, you combine agents with complementary roles and tools, assign tasks, and select a process that dictates their execution order and interaction.
### Example: Assembling a Crew
```python
from crewai import Crew, Agent, Task, Process
from langchain_community.tools import DuckDuckGoSearchRun
# Define agents with specific roles and tools
researcher = Agent(
    role='Senior Research Analyst',
    goal='Discover innovative AI technologies',
    backstory='An analyst with deep experience tracking emerging AI trends',
    tools=[DuckDuckGoSearchRun()]
)
writer = Agent(
    role='Content Writer',
    goal='Write engaging articles on AI discoveries',
    backstory='A writer skilled at making complex AI topics accessible'
)

# Create tasks for the agents
research_task = Task(description='Identify breakthrough AI technologies', agent=researcher)
write_article_task = Task(description='Draft an article on the latest AI technologies', agent=writer)

# Assemble the crew with a sequential process
my_crew = Crew(
    agents=[researcher, writer],
    tasks=[research_task, write_article_task],
    process=Process.sequential,
    verbose=True
)
```
## Crew Execution Process
- **Sequential Process**: Tasks are executed one after another, allowing for a linear flow of work.
- **Hierarchical Process**: A manager agent coordinates the crew, delegating tasks and validating outcomes before proceeding (see the sketch below).
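For the hierarchical flow, supply a manager LLM so the framework can run the manager agent; a minimal sketch reusing the agents and tasks from the example above (the `ChatOpenAI` model choice is illustrative):
```python
from crewai import Crew, Process
from langchain_openai import ChatOpenAI

# The manager LLM powers the manager agent, which delegates tasks
# and validates outcomes before the crew moves on.
hierarchical_crew = Crew(
    agents=[researcher, writer],
    tasks=[research_task, write_article_task],
    process=Process.hierarchical,
    manager_llm=ChatOpenAI(model='gpt-4')  # illustrative model choice
)
```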
### Kicking Off a Crew
Once your crew is assembled, initiate the workflow with the `kickoff()` method. This starts the execution process according to the defined process flow.
```python
# Start the crew's task execution
result = my_crew.kickoff()
print(result)
```
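For a sequential process, the result returned by `kickoff()` is the output of the final task in the crew's task list.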

View File

@@ -28,6 +28,11 @@ Cutting-edge framework for orchestrating role-playing, autonomous AI agents. By
Processes
</a>
</li>
<li>
<a href="./core-concepts/Crews">
Crews
</a>
</li>
</ul>
</div>
<div style="width:30%">

View File

@@ -0,0 +1,17 @@
## Telemetry
CrewAI uses anonymous telemetry to collect usage data, with the main purpose of helping us improve the library by focusing our efforts on the most used features, integrations, and tools.
There is NO data collected on prompts, task descriptions, agents' backstories or goals, tool usage, API calls, responses, any data processed by the agents, or any secrets and environment variables.
Data collected includes:
- Version of crewAI
- Version of Python
- General OS (e.g. number of CPUs, macOS/Windows/Linux)
- Number of agents and tasks in a crew
- Crew Process being used
- Whether agents are using memory or allowing delegation
- Whether tasks are executed in parallel or sequentially
- Language model being used
- Roles of agents in a crew
- Names of the tools available

View File

@@ -124,6 +124,7 @@ nav:
- Tasks: 'core-concepts/Tasks.md'
- Tools: 'core-concepts/Tools.md'
- Processes: 'core-concepts/Processes.md'
- Crews: 'core-concepts/Crews.md'
- Collaboration: 'core-concepts/Collaboration.md'
- How to Guides:
- Getting Started: 'how-to/Creating-a-Crew-and-kick-it-off.md'
@@ -140,6 +141,8 @@ nav:
- Drafting emails with LangGraph: https://github.com/joaomdmoura/crewAI-examples/tree/main/CrewAI-LangGraph
- Landing Page Generator: https://github.com/joaomdmoura/crewAI-examples/tree/main/landing_page_generator
- Prepare for meetings: https://github.com/joaomdmoura/crewAI-examples/tree/main/prep-for-a-meeting
- Telemetry: 'telemetry/Telemetry.md'
extra_css:
- stylesheets/output.css
- stylesheets/extra.css

poetry.lock generated
View File

@@ -202,6 +202,17 @@ files = [
[package.extras]
dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
[[package]]
name = "backoff"
version = "2.2.1"
description = "Function decoration for backoff and retry"
optional = false
python-versions = ">=3.7,<4.0"
files = [
{file = "backoff-2.2.1-py3-none-any.whl", hash = "sha256:63579f9a0628e06278f7e47b7d7d5b6ce20dc65c5e96a6f3ca99a6adca0396e8"},
{file = "backoff-2.2.1.tar.gz", hash = "sha256:03f829f5bb1923180821643f8753b0502c3b682293992485b0eef2807afa5cba"},
]
[[package]]
name = "black"
version = "24.1.1"
@@ -528,6 +539,23 @@ files = [
{file = "defusedxml-0.7.1.tar.gz", hash = "sha256:1bb3032db185915b62d7c6209c5a8792be6a32ab2fedacc84e01b52c51aa3e69"},
]
[[package]]
name = "deprecated"
version = "1.2.14"
description = "Python @deprecated decorator to deprecate old python classes, functions or methods."
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
files = [
{file = "Deprecated-1.2.14-py2.py3-none-any.whl", hash = "sha256:6fac8b097794a90302bdbb17b9b815e732d3c4720583ff1b198499d78470466c"},
{file = "Deprecated-1.2.14.tar.gz", hash = "sha256:e5323eb936458dccc2582dc6f9c322c852a775a27065ff2b0c4970b9d53d01b3"},
]
[package.dependencies]
wrapt = ">=1.10,<2"
[package.extras]
dev = ["PyTest", "PyTest-Cov", "bump2version (<1)", "sphinx (<2)", "tox"]
[[package]]
name = "distlib"
version = "0.3.8"
@@ -683,6 +711,23 @@ python-dateutil = ">=2.8.1"
[package.extras]
dev = ["flake8", "markdown", "twine", "wheel"]
[[package]]
name = "googleapis-common-protos"
version = "1.62.0"
description = "Common protobufs used in Google APIs"
optional = false
python-versions = ">=3.7"
files = [
{file = "googleapis-common-protos-1.62.0.tar.gz", hash = "sha256:83f0ece9f94e5672cced82f592d2a5edf527a96ed1794f0bab36d5735c996277"},
{file = "googleapis_common_protos-1.62.0-py2.py3-none-any.whl", hash = "sha256:4750113612205514f9f6aa4cb00d523a94f3e8c06c5ad2fee466387dc4875f07"},
]
[package.dependencies]
protobuf = ">=3.19.5,<3.20.0 || >3.20.0,<3.20.1 || >3.20.1,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<5.0.0.dev0"
[package.extras]
grpc = ["grpcio (>=1.44.0,<2.0.0.dev0)"]
[[package]]
name = "greenlet"
version = "3.0.3"
@@ -768,6 +813,72 @@ files = [
[package.dependencies]
colorama = ">=0.4"
[[package]]
name = "grpcio"
version = "1.60.1"
description = "HTTP/2-based RPC framework"
optional = false
python-versions = ">=3.7"
files = [
{file = "grpcio-1.60.1-cp310-cp310-linux_armv7l.whl", hash = "sha256:14e8f2c84c0832773fb3958240c69def72357bc11392571f87b2d7b91e0bb092"},
{file = "grpcio-1.60.1-cp310-cp310-macosx_12_0_universal2.whl", hash = "sha256:33aed0a431f5befeffd9d346b0fa44b2c01aa4aeae5ea5b2c03d3e25e0071216"},
{file = "grpcio-1.60.1-cp310-cp310-manylinux_2_17_aarch64.whl", hash = "sha256:fead980fbc68512dfd4e0c7b1f5754c2a8e5015a04dea454b9cada54a8423525"},
{file = "grpcio-1.60.1-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:082081e6a36b6eb5cf0fd9a897fe777dbb3802176ffd08e3ec6567edd85bc104"},
{file = "grpcio-1.60.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:55ccb7db5a665079d68b5c7c86359ebd5ebf31a19bc1a91c982fd622f1e31ff2"},
{file = "grpcio-1.60.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:9b54577032d4f235452f77a83169b6527bf4b77d73aeada97d45b2aaf1bf5ce0"},
{file = "grpcio-1.60.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:7d142bcd604166417929b071cd396aa13c565749a4c840d6c702727a59d835eb"},
{file = "grpcio-1.60.1-cp310-cp310-win32.whl", hash = "sha256:2a6087f234cb570008a6041c8ffd1b7d657b397fdd6d26e83d72283dae3527b1"},
{file = "grpcio-1.60.1-cp310-cp310-win_amd64.whl", hash = "sha256:f2212796593ad1d0235068c79836861f2201fc7137a99aa2fea7beeb3b101177"},
{file = "grpcio-1.60.1-cp311-cp311-linux_armv7l.whl", hash = "sha256:79ae0dc785504cb1e1788758c588c711f4e4a0195d70dff53db203c95a0bd303"},
{file = "grpcio-1.60.1-cp311-cp311-macosx_10_10_universal2.whl", hash = "sha256:4eec8b8c1c2c9b7125508ff7c89d5701bf933c99d3910e446ed531cd16ad5d87"},
{file = "grpcio-1.60.1-cp311-cp311-manylinux_2_17_aarch64.whl", hash = "sha256:8c9554ca8e26241dabe7951aa1fa03a1ba0856688ecd7e7bdbdd286ebc272e4c"},
{file = "grpcio-1.60.1-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:91422ba785a8e7a18725b1dc40fbd88f08a5bb4c7f1b3e8739cab24b04fa8a03"},
{file = "grpcio-1.60.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cba6209c96828711cb7c8fcb45ecef8c8859238baf15119daa1bef0f6c84bfe7"},
{file = "grpcio-1.60.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:c71be3f86d67d8d1311c6076a4ba3b75ba5703c0b856b4e691c9097f9b1e8bd2"},
{file = "grpcio-1.60.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:af5ef6cfaf0d023c00002ba25d0751e5995fa0e4c9eec6cd263c30352662cbce"},
{file = "grpcio-1.60.1-cp311-cp311-win32.whl", hash = "sha256:a09506eb48fa5493c58f946c46754ef22f3ec0df64f2b5149373ff31fb67f3dd"},
{file = "grpcio-1.60.1-cp311-cp311-win_amd64.whl", hash = "sha256:49c9b6a510e3ed8df5f6f4f3c34d7fbf2d2cae048ee90a45cd7415abab72912c"},
{file = "grpcio-1.60.1-cp312-cp312-linux_armv7l.whl", hash = "sha256:b58b855d0071575ea9c7bc0d84a06d2edfbfccec52e9657864386381a7ce1ae9"},
{file = "grpcio-1.60.1-cp312-cp312-macosx_10_10_universal2.whl", hash = "sha256:a731ac5cffc34dac62053e0da90f0c0b8560396a19f69d9703e88240c8f05858"},
{file = "grpcio-1.60.1-cp312-cp312-manylinux_2_17_aarch64.whl", hash = "sha256:cf77f8cf2a651fbd869fbdcb4a1931464189cd210abc4cfad357f1cacc8642a6"},
{file = "grpcio-1.60.1-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c557e94e91a983e5b1e9c60076a8fd79fea1e7e06848eb2e48d0ccfb30f6e073"},
{file = "grpcio-1.60.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:069fe2aeee02dfd2135d562d0663fe70fbb69d5eed6eb3389042a7e963b54de8"},
{file = "grpcio-1.60.1-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:cb0af13433dbbd1c806e671d81ec75bd324af6ef75171fd7815ca3074fe32bfe"},
{file = "grpcio-1.60.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:2f44c32aef186bbba254129cea1df08a20be414144ac3bdf0e84b24e3f3b2e05"},
{file = "grpcio-1.60.1-cp312-cp312-win32.whl", hash = "sha256:a212e5dea1a4182e40cd3e4067ee46be9d10418092ce3627475e995cca95de21"},
{file = "grpcio-1.60.1-cp312-cp312-win_amd64.whl", hash = "sha256:6e490fa5f7f5326222cb9f0b78f207a2b218a14edf39602e083d5f617354306f"},
{file = "grpcio-1.60.1-cp37-cp37m-linux_armv7l.whl", hash = "sha256:4216e67ad9a4769117433814956031cb300f85edc855252a645a9a724b3b6594"},
{file = "grpcio-1.60.1-cp37-cp37m-macosx_10_10_universal2.whl", hash = "sha256:73e14acd3d4247169955fae8fb103a2b900cfad21d0c35f0dcd0fdd54cd60367"},
{file = "grpcio-1.60.1-cp37-cp37m-manylinux_2_17_aarch64.whl", hash = "sha256:6ecf21d20d02d1733e9c820fb5c114c749d888704a7ec824b545c12e78734d1c"},
{file = "grpcio-1.60.1-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:33bdea30dcfd4f87b045d404388469eb48a48c33a6195a043d116ed1b9a0196c"},
{file = "grpcio-1.60.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:53b69e79d00f78c81eecfb38f4516080dc7f36a198b6b37b928f1c13b3c063e9"},
{file = "grpcio-1.60.1-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:39aa848794b887120b1d35b1b994e445cc028ff602ef267f87c38122c1add50d"},
{file = "grpcio-1.60.1-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:72153a0d2e425f45b884540a61c6639436ddafa1829a42056aa5764b84108b8e"},
{file = "grpcio-1.60.1-cp37-cp37m-win_amd64.whl", hash = "sha256:50d56280b482875d1f9128ce596e59031a226a8b84bec88cb2bf76c289f5d0de"},
{file = "grpcio-1.60.1-cp38-cp38-linux_armv7l.whl", hash = "sha256:6d140bdeb26cad8b93c1455fa00573c05592793c32053d6e0016ce05ba267549"},
{file = "grpcio-1.60.1-cp38-cp38-macosx_10_10_universal2.whl", hash = "sha256:bc808924470643b82b14fe121923c30ec211d8c693e747eba8a7414bc4351a23"},
{file = "grpcio-1.60.1-cp38-cp38-manylinux_2_17_aarch64.whl", hash = "sha256:70c83bb530572917be20c21f3b6be92cd86b9aecb44b0c18b1d3b2cc3ae47df0"},
{file = "grpcio-1.60.1-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9b106bc52e7f28170e624ba61cc7dc6829566e535a6ec68528f8e1afbed1c41f"},
{file = "grpcio-1.60.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:30e980cd6db1088c144b92fe376747328d5554bc7960ce583ec7b7d81cd47287"},
{file = "grpcio-1.60.1-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:0c5807e9152eff15f1d48f6b9ad3749196f79a4a050469d99eecb679be592acc"},
{file = "grpcio-1.60.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:f1c3dc536b3ee124e8b24feb7533e5c70b9f2ef833e3b2e5513b2897fd46763a"},
{file = "grpcio-1.60.1-cp38-cp38-win32.whl", hash = "sha256:d7404cebcdb11bb5bd40bf94131faf7e9a7c10a6c60358580fe83913f360f929"},
{file = "grpcio-1.60.1-cp38-cp38-win_amd64.whl", hash = "sha256:c8754c75f55781515a3005063d9a05878b2cfb3cb7e41d5401ad0cf19de14872"},
{file = "grpcio-1.60.1-cp39-cp39-linux_armv7l.whl", hash = "sha256:0250a7a70b14000fa311de04b169cc7480be6c1a769b190769d347939d3232a8"},
{file = "grpcio-1.60.1-cp39-cp39-macosx_10_10_universal2.whl", hash = "sha256:660fc6b9c2a9ea3bb2a7e64ba878c98339abaf1811edca904ac85e9e662f1d73"},
{file = "grpcio-1.60.1-cp39-cp39-manylinux_2_17_aarch64.whl", hash = "sha256:76eaaba891083fcbe167aa0f03363311a9f12da975b025d30e94b93ac7a765fc"},
{file = "grpcio-1.60.1-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e5d97c65ea7e097056f3d1ead77040ebc236feaf7f71489383d20f3b4c28412a"},
{file = "grpcio-1.60.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bb2a2911b028f01c8c64d126f6b632fcd8a9ac975aa1b3855766c94e4107180"},
{file = "grpcio-1.60.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:5a1ebbae7e2214f51b1f23b57bf98eeed2cf1ba84e4d523c48c36d5b2f8829ff"},
{file = "grpcio-1.60.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:9a66f4d2a005bc78e61d805ed95dedfcb35efa84b7bba0403c6d60d13a3de2d6"},
{file = "grpcio-1.60.1-cp39-cp39-win32.whl", hash = "sha256:8d488fbdbf04283f0d20742b64968d44825617aa6717b07c006168ed16488804"},
{file = "grpcio-1.60.1-cp39-cp39-win_amd64.whl", hash = "sha256:61b7199cd2a55e62e45bfb629a35b71fc2c0cb88f686a047f25b1112d3810904"},
{file = "grpcio-1.60.1.tar.gz", hash = "sha256:dd1d3a8d1d2e50ad9b59e10aa7f07c7d1be2b367f3f2d33c5fade96ed5460962"},
]
[package.extras]
protobuf = ["grpcio-tools (>=1.60.1)"]
[[package]]
name = "h11"
version = "0.14.0"
@@ -849,6 +960,25 @@ files = [
{file = "idna-3.6.tar.gz", hash = "sha256:9ecdbbd083b06798ae1e86adcbfe8ab1479cf864e4ee30fe4e46a003d12491ca"},
]
[[package]]
name = "importlib-metadata"
version = "6.11.0"
description = "Read metadata from Python packages"
optional = false
python-versions = ">=3.8"
files = [
{file = "importlib_metadata-6.11.0-py3-none-any.whl", hash = "sha256:f0afba6205ad8f8947c7d338b5342d5db2afbfd82f9cbef7879a9539cc12eb9b"},
{file = "importlib_metadata-6.11.0.tar.gz", hash = "sha256:1231cf92d825c9e03cfc4da076a16de6422c863558229ea0b22b675657463443"},
]
[package.dependencies]
zipp = ">=0.5"
[package.extras]
docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (<7.2.5)", "sphinx (>=3.5)", "sphinx-lint"]
perf = ["ipython"]
testing = ["flufl.flake8", "importlib-resources (>=1.3)", "packaging", "pyfakefs", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf (>=0.9.2)", "pytest-ruff"]
[[package]]
name = "iniconfig"
version = "2.0.0"
@@ -1471,6 +1601,125 @@ typing-extensions = ">=4.7,<5"
[package.extras]
datalib = ["numpy (>=1)", "pandas (>=1.2.3)", "pandas-stubs (>=1.1.0.11)"]
[[package]]
name = "opentelemetry-api"
version = "1.22.0"
description = "OpenTelemetry Python API"
optional = false
python-versions = ">=3.7"
files = [
{file = "opentelemetry_api-1.22.0-py3-none-any.whl", hash = "sha256:43621514301a7e9f5d06dd8013a1b450f30c2e9372b8e30aaeb4562abf2ce034"},
{file = "opentelemetry_api-1.22.0.tar.gz", hash = "sha256:15ae4ca925ecf9cfdfb7a709250846fbb08072260fca08ade78056c502b86bed"},
]
[package.dependencies]
deprecated = ">=1.2.6"
importlib-metadata = ">=6.0,<7.0"
[[package]]
name = "opentelemetry-exporter-otlp-proto-common"
version = "1.22.0"
description = "OpenTelemetry Protobuf encoding"
optional = false
python-versions = ">=3.7"
files = [
{file = "opentelemetry_exporter_otlp_proto_common-1.22.0-py3-none-any.whl", hash = "sha256:3f2538bec5312587f8676c332b3747f54c89fe6364803a807e217af4603201fa"},
{file = "opentelemetry_exporter_otlp_proto_common-1.22.0.tar.gz", hash = "sha256:71ae2f81bc6d6fe408d06388826edc8933759b2ca3a97d24054507dc7cfce52d"},
]
[package.dependencies]
backoff = {version = ">=1.10.0,<3.0.0", markers = "python_version >= \"3.7\""}
opentelemetry-proto = "1.22.0"
[[package]]
name = "opentelemetry-exporter-otlp-proto-grpc"
version = "1.22.0"
description = "OpenTelemetry Collector Protobuf over gRPC Exporter"
optional = false
python-versions = ">=3.7"
files = [
{file = "opentelemetry_exporter_otlp_proto_grpc-1.22.0-py3-none-any.whl", hash = "sha256:b5bcadc129272004316a455e9081216d3380c1fc2231a928ea6a70aa90e173fb"},
{file = "opentelemetry_exporter_otlp_proto_grpc-1.22.0.tar.gz", hash = "sha256:1e0e5aa4bbabc74942f06f268deffd94851d12a8dc30b02527472ef1729fe5b1"},
]
[package.dependencies]
backoff = {version = ">=1.10.0,<3.0.0", markers = "python_version >= \"3.7\""}
deprecated = ">=1.2.6"
googleapis-common-protos = ">=1.52,<2.0"
grpcio = ">=1.0.0,<2.0.0"
opentelemetry-api = ">=1.15,<2.0"
opentelemetry-exporter-otlp-proto-common = "1.22.0"
opentelemetry-proto = "1.22.0"
opentelemetry-sdk = ">=1.22.0,<1.23.0"
[package.extras]
test = ["pytest-grpc"]
[[package]]
name = "opentelemetry-exporter-otlp-proto-http"
version = "1.22.0"
description = "OpenTelemetry Collector Protobuf over HTTP Exporter"
optional = false
python-versions = ">=3.7"
files = [
{file = "opentelemetry_exporter_otlp_proto_http-1.22.0-py3-none-any.whl", hash = "sha256:e002e842190af45b91dc55a97789d0b98e4308c88d886b16049ee90e17a4d396"},
{file = "opentelemetry_exporter_otlp_proto_http-1.22.0.tar.gz", hash = "sha256:79ed108981ec68d5f7985355bca32003c2f3a5be1534a96d62d5861b758a82f4"},
]
[package.dependencies]
backoff = {version = ">=1.10.0,<3.0.0", markers = "python_version >= \"3.7\""}
deprecated = ">=1.2.6"
googleapis-common-protos = ">=1.52,<2.0"
opentelemetry-api = ">=1.15,<2.0"
opentelemetry-exporter-otlp-proto-common = "1.22.0"
opentelemetry-proto = "1.22.0"
opentelemetry-sdk = ">=1.22.0,<1.23.0"
requests = ">=2.7,<3.0"
[package.extras]
test = ["responses (==0.22.0)"]
[[package]]
name = "opentelemetry-proto"
version = "1.22.0"
description = "OpenTelemetry Python Proto"
optional = false
python-versions = ">=3.7"
files = [
{file = "opentelemetry_proto-1.22.0-py3-none-any.whl", hash = "sha256:ce7188d22c75b6d0fe53e7fb58501613d0feade5139538e79dedd9420610fa0c"},
{file = "opentelemetry_proto-1.22.0.tar.gz", hash = "sha256:9ec29169286029f17ca34ec1f3455802ffb90131642d2f545ece9a63e8f69003"},
]
[package.dependencies]
protobuf = ">=3.19,<5.0"
[[package]]
name = "opentelemetry-sdk"
version = "1.22.0"
description = "OpenTelemetry Python SDK"
optional = false
python-versions = ">=3.7"
files = [
{file = "opentelemetry_sdk-1.22.0-py3-none-any.whl", hash = "sha256:a730555713d7c8931657612a88a141e3a4fe6eb5523d9e2d5a8b1e673d76efa6"},
{file = "opentelemetry_sdk-1.22.0.tar.gz", hash = "sha256:45267ac1f38a431fc2eb5d6e0c0d83afc0b78de57ac345488aa58c28c17991d0"},
]
[package.dependencies]
opentelemetry-api = "1.22.0"
opentelemetry-semantic-conventions = "0.43b0"
typing-extensions = ">=3.7.4"
[[package]]
name = "opentelemetry-semantic-conventions"
version = "0.43b0"
description = "OpenTelemetry Semantic Conventions"
optional = false
python-versions = ">=3.7"
files = [
{file = "opentelemetry_semantic_conventions-0.43b0-py3-none-any.whl", hash = "sha256:291284d7c1bf15fdaddf309b3bd6d3b7ce12a253cec6d27144439819a15d8445"},
{file = "opentelemetry_semantic_conventions-0.43b0.tar.gz", hash = "sha256:b9576fb890df479626fa624e88dde42d3d60b8b6c8ae1152ad157a8b97358635"},
]
[[package]]
name = "packaging"
version = "23.2"
@@ -1636,6 +1885,26 @@ nodeenv = ">=0.11.1"
pyyaml = ">=5.1"
virtualenv = ">=20.10.0"
[[package]]
name = "protobuf"
version = "4.25.2"
description = ""
optional = false
python-versions = ">=3.8"
files = [
{file = "protobuf-4.25.2-cp310-abi3-win32.whl", hash = "sha256:b50c949608682b12efb0b2717f53256f03636af5f60ac0c1d900df6213910fd6"},
{file = "protobuf-4.25.2-cp310-abi3-win_amd64.whl", hash = "sha256:8f62574857ee1de9f770baf04dde4165e30b15ad97ba03ceac65f760ff018ac9"},
{file = "protobuf-4.25.2-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:2db9f8fa64fbdcdc93767d3cf81e0f2aef176284071507e3ede160811502fd3d"},
{file = "protobuf-4.25.2-cp37-abi3-manylinux2014_aarch64.whl", hash = "sha256:10894a2885b7175d3984f2be8d9850712c57d5e7587a2410720af8be56cdaf62"},
{file = "protobuf-4.25.2-cp37-abi3-manylinux2014_x86_64.whl", hash = "sha256:fc381d1dd0516343f1440019cedf08a7405f791cd49eef4ae1ea06520bc1c020"},
{file = "protobuf-4.25.2-cp38-cp38-win32.whl", hash = "sha256:33a1aeef4b1927431d1be780e87b641e322b88d654203a9e9d93f218ee359e61"},
{file = "protobuf-4.25.2-cp38-cp38-win_amd64.whl", hash = "sha256:47f3de503fe7c1245f6f03bea7e8d3ec11c6c4a2ea9ef910e3221c8a15516d62"},
{file = "protobuf-4.25.2-cp39-cp39-win32.whl", hash = "sha256:5e5c933b4c30a988b52e0b7c02641760a5ba046edc5e43d3b94a74c9fc57c1b3"},
{file = "protobuf-4.25.2-cp39-cp39-win_amd64.whl", hash = "sha256:d66a769b8d687df9024f2985d5137a337f957a0916cf5464d1513eee96a63ff0"},
{file = "protobuf-4.25.2-py3-none-any.whl", hash = "sha256:a8b7a98d4ce823303145bf3c1a8bdb0f2f4642a414b196f04ad9853ed0c8f830"},
{file = "protobuf-4.25.2.tar.gz", hash = "sha256:fe599e175cb347efc8ee524bcd4b902d11f7262c0e569ececcb89995c15f0a5e"},
]
[[package]]
name = "pycparser"
version = "2.21"
@@ -2633,7 +2902,22 @@ files = [
idna = ">=2.0"
multidict = ">=4.0"
[[package]]
name = "zipp"
version = "3.17.0"
description = "Backport of pathlib-compatible object wrapper for zip files"
optional = false
python-versions = ">=3.8"
files = [
{file = "zipp-3.17.0-py3-none-any.whl", hash = "sha256:0e923e726174922dce09c53c59ad483ff7bbb8e572e00c7f7c46b88556409f31"},
{file = "zipp-3.17.0.tar.gz", hash = "sha256:84e64a1c28cf7e91ed2078bb8cc8c259cb19b76942096c8d7b84947690cabaf0"},
]
[package.extras]
docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (<7.2.5)", "sphinx (>=3.5)", "sphinx-lint"]
testing = ["big-O", "jaraco.functools", "jaraco.itertools", "more-itertools", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-ignore-flaky", "pytest-mypy (>=0.9.1)", "pytest-ruff"]
[metadata]
lock-version = "2.0"
python-versions = ">=3.10,<4.0"
content-hash = "be62c4dcfaba5e9fc7c363895b9b1ea79aa3fdb8518cbb631953d8373b41c2fb"
content-hash = "2be3f98f57af7ea47f0985000be807e67c1bb3a95598b637e50f2cec54d11c80"

View File

@@ -1,7 +1,7 @@
[tool.poetry]
name = "crewai"
version = "0.5.3"
version = "0.5.5"
description = "Cutting-edge framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks."
authors = ["Joao Moura <joao@crewai.com>"]
readme = "README.md"
@@ -21,6 +21,9 @@ pydantic = "^2.4.2"
langchain = "0.1.0"
openai = "^1.7.1"
langchain-openai = "^0.0.2"
opentelemetry-api = "^1.22.0"
opentelemetry-sdk = "^1.22.0"
opentelemetry-exporter-otlp-proto-http = "^1.22.0"
[tool.poetry.group.dev.dependencies]
isort = "^5.13.2"
@@ -40,7 +43,6 @@ cairosvg = "^2.7.1"
profile = "black"
known_first_party = ["crewai"]
[tool.poetry.group.test.dependencies]
pytest = "^7.4"
pytest-vcr = "^1.0.2"

View File

@@ -1,6 +1,5 @@
from typing import Any
from pydantic import BaseModel, Field
from langchain_core.agents import AgentAction
from pydantic.v1 import BaseModel, Field
from .cache_handler import CacheHandler
@@ -11,8 +10,5 @@ class CacheHit(BaseModel):
class Config:
arbitrary_types_allowed = True
# Making it Any instead of AgentAction to avoid
# pydantic v1 vs v2 incompatibility, langchain should
# soon be updated to pydantic v2
action: Any = Field(description="Action taken")
action: AgentAction = Field(description="Action taken")
cache: CacheHandler = Field(description="Cache Handler for the tool")

View File

@@ -106,15 +106,16 @@ class CrewAgentExecutor(AgentExecutor):
**inputs,
)
if self._should_force_answer():
if isinstance(output, AgentAction) or isinstance(output, AgentFinish):
output = output
elif isinstance(output, CacheHit):
if isinstance(output, CacheHit):
output = output.action
else:
raise ValueError(
f"Unexpected output type from agent: {type(output)}"
)
yield self._force_answer(output)
if isinstance(output, AgentAction):
yield self._force_answer(output)
return
if isinstance(output, list):
yield from [self._force_answer(action) for action in output]
return
yield output
return
except OutputParserException as e:

View File

@@ -73,7 +73,7 @@ class CrewAgentOutputParser(ReActSingleInputOutputParser):
)
if self.cache.read(action, tool_input):
action = AgentAction(action, tool_input, text)
return CacheHit(action=action, cache=self.cache)
agent_action = AgentAction(action, tool_input, text)
return CacheHit(action=agent_action, cache=self.cache)
return super().parse(text)

View File

@@ -19,6 +19,7 @@ from crewai.agent import Agent
from crewai.agents.cache import CacheHandler
from crewai.process import Process
from crewai.task import Task
from crewai.telemetry import Telemetry
from crewai.tools.agent_tools import AgentTools
from crewai.utilities import I18N, Logger, RPMController
@@ -92,6 +93,8 @@ class Crew(BaseModel):
self._cache_handler = CacheHandler()
self._logger = Logger(self.verbose)
self._rpm_controller = RPMController(max_rpm=self.max_rpm, logger=self._logger)
self._telemetry = Telemetry()
self._telemetry.crew_creation(self)
return self
@model_validator(mode="after")
@@ -121,7 +124,8 @@ class Crew(BaseModel):
if self.agents:
for agent in self.agents:
agent.set_cache_handler(self._cache_handler)
agent.set_rpm_controller(self._rpm_controller)
if self.max_rpm:
agent.set_rpm_controller(self._rpm_controller)
return self
def _setup_from_config(self):
@@ -167,7 +171,7 @@ class Crew(BaseModel):
def _run_sequential_process(self) -> str:
"""Executes tasks sequentially and returns the final output."""
task_output = ""
task_output: str = ""
for task in self.tasks:
if task.agent is not None and task.agent.allow_delegation:
agents_for_delegation = [
@@ -181,6 +185,7 @@ class Crew(BaseModel):
output = task.execute(context=task_output)
if not task.async_execution:
assert output is not None
task_output = output
role = task.agent.role if task.agent is not None else "None"
@@ -204,14 +209,17 @@ class Crew(BaseModel):
verbose=True,
)
task_output = ""
task_output: str = ""
for task in self.tasks:
self._logger.log("debug", f"Working Agent: {manager.role}")
self._logger.log("info", f"Starting Task: {task.description}")
task_output = task.execute(
output = task.execute(
agent=manager, context=task_output, tools=manager.tools
)
if not task.async_execution:
assert output is not None
task_output = output
self._logger.log(
"debug", f"[{manager.role}] Task output: {task_output}\n\n"

View File

@@ -18,7 +18,7 @@ class Task(BaseModel):
__hash__ = object.__hash__ # type: ignore
i18n: I18N = I18N()
thread: threading.Thread = None
thread: threading.Thread | None = None
description: str = Field(description="Description of the actual task.")
callback: Optional[Any] = Field(
description="Callback to be executed after the task is completed.", default=None
@@ -71,7 +71,7 @@ class Task(BaseModel):
agent: Agent | None = None,
context: Optional[str] = None,
tools: Optional[List[Any]] = None,
) -> str:
) -> str | None:
"""Execute the task.
Returns:
@@ -85,12 +85,14 @@ class Task(BaseModel):
)
if self.context:
context = []
results = []
for task in self.context:
if task.async_execution:
assert task.thread is not None
task.thread.join()
context.append(task.output.result)
context = "\n".join(context)
if task.output is not None:
results.append(task.output.result)
context = "\n".join(results)
tools = tools or self.tools

View File

@@ -0,0 +1 @@
from .telemetry import Telemetry

View File

@@ -0,0 +1,114 @@
import json
import os
import platform
import socket
import pkg_resources
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import SERVICE_NAME, Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.trace import Status, StatusCode
class Telemetry:
"""A class to handle anonymous telemetry for the crewai package.
The data being collected is for development purposes; all data is anonymous.
There is NO data collected on prompts, task descriptions, agents'
backstories or goals, responses, any data processed by the agents,
or any secrets and environment variables.
Data collected includes:
- Version of crewAI
- Version of Python
- General OS (e.g. number of CPUs, macOS/Windows/Linux)
- Number of agents and tasks in a crew
- Crew Process being used
- Whether agents are using memory or allowing delegation
- Whether tasks are executed in parallel or sequentially
- Language model being used
- Roles of agents in a crew
- Names of the tools available
"""
def __init__(self):
telemetry_endpoint = "http://telemetry.crewai.com:4318"
self.resource = Resource(attributes={SERVICE_NAME: "crewAI-telemetry"})
provider = TracerProvider(resource=self.resource)
processor = BatchSpanProcessor(
OTLPSpanExporter(endpoint=f"{telemetry_endpoint}/v1/traces")
)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)
def crew_creation(self, crew):
"""Records the creation of a crew."""
try:
tracer = trace.get_tracer("crewai.telemetry")
span = tracer.start_span("Crew Created")
self.add_attribute(
span, "crewai_version", pkg_resources.get_distribution("crewai").version
)
self.add_attribute(span, "python_version", platform.python_version())
self.add_attribute(span, "hostname", socket.gethostname())
self.add_attribute(span, "crewid", str(crew.id))
self.add_attribute(span, "crew_process", crew.process)
self.add_attribute(span, "crew_language", crew.language)
self.add_attribute(span, "crew_number_of_tasks", len(crew.tasks))
self.add_attribute(span, "crew_number_of_agents", len(crew.agents))
self.add_attribute(
span,
"crew_agents",
json.dumps(
[
{
"id": str(agent.id),
"role": agent.role,
"memory_enabled?": agent.memory,
"llm": json.dumps(self._safe_llm_attributes(agent.llm)),
"delegation_enabled?": agent.allow_delegation,
"tools_names": [tool.name for tool in agent.tools],
}
for agent in crew.agents
]
),
)
self.add_attribute(
span,
"crew_tasks",
json.dumps(
[
{
"id": str(task.id),
"async_execution?": task.async_execution,
"tools_names": [tool.name for tool in task.tools],
}
for task in crew.tasks
]
),
)
self.add_attribute(span, "platform", platform.platform())
self.add_attribute(span, "platform_release", platform.release())
self.add_attribute(span, "platform_system", platform.system())
self.add_attribute(span, "platform_version", platform.version())
self.add_attribute(span, "cpus", os.cpu_count())
span.set_status(Status(StatusCode.OK))
span.end()
except Exception:
pass
def add_attribute(self, span, key, value):
"""Add an attribute to a span."""
try:
return span.set_attribute(key, value)
except Exception:
pass
def _safe_llm_attributes(self, llm):
attributes = ["name", "model_name", "base_url", "model", "top_k", "temperature"]
safe_attributes = {k: v for k, v in vars(llm).items() if k in attributes}
safe_attributes["class"] = llm.__class__.__name__
return safe_attributes

View File

@@ -14,12 +14,14 @@ class RPMController(BaseModel):
_current_rpm: int = PrivateAttr(default=0)
_timer: threading.Timer | None = PrivateAttr(default=None)
_lock: threading.Lock = PrivateAttr(default=None)
_shutdown_flag = False
@model_validator(mode="after")
def reset_counter(self):
if self.max_rpm:
self._lock = threading.Lock()
self._reset_request_count()
if not self._shutdown_flag:
self._lock = threading.Lock()
self._reset_request_count()
return self
def check_or_wait(self):
@@ -51,6 +53,7 @@ class RPMController(BaseModel):
with self._lock:
self._current_rpm = 0
if self._timer:
self._shutdown_flag = True
self._timer.cancel()
self._timer = threading.Timer(60.0, self._reset_request_count)
self._timer.start()

View File

@@ -356,6 +356,25 @@ def test_api_calls_throttling(capsys):
moveon.assert_called()
def test_agents_rpm_is_never_set_if_crew_max_RPM_is_not_set():
agent = Agent(
role="test role",
goal="test goal",
backstory="test backstory",
allow_delegation=False,
verbose=True,
)
task = Task(
description="just say hi!",
agent=agent,
)
Crew(agents=[agent], tasks=[task], verbose=2)
assert agent._rpm_controller is None
def test_async_task_execution():
import threading
from unittest.mock import patch
@@ -392,7 +411,7 @@ def test_async_task_execution():
with patch.object(threading.Thread, "start") as start:
thread = threading.Thread(target=lambda: None, args=()).start()
start.return_value = thread
with patch.object(threading.Thread, "join", wraps=thread.join()) as join:
with patch.object(threading.Thread, "join", wraps=thread.join()) as join: # type: ignore
list_ideas.output = TaskOutput(
description="A 4 paragraph article about AI.", result="ok"
)