Compare commits

..

71 Commits

Author SHA1 Message Date
Brandon Hancock
48e1505a0a Merge branch 'main' into undo-agentops-api-check 2024-10-16 11:18:19 -04:00
Vini Brasil
a6b7295092 Adapt Tools CLI to uv (#1455)
* Adapt Tools CLI to UV

* Fix failing test
2024-10-16 10:55:04 -03:00
Braelyn Boynton
161e2e20a5 remove extra code 2024-09-05 14:50:01 +09:00
Braelyn Boynton
a68f2cec41 remove extra code 2024-09-05 14:48:54 +09:00
Braelyn Boynton
9db3a4ab23 remove extra code 2024-09-05 14:48:28 +09:00
Braelyn Boynton
7d4cf9a7bc undo agentops api key check 2024-09-05 14:45:01 +09:00
Braelyn Boynton
7af89abe53 Merge remote-tracking branch 'refs/remotes/upstream/main' into undo-agentops-api-check 2024-09-05 14:41:51 +09:00
Braelyn Boynton
b3ae127d2c Merge remote-tracking branch 'refs/remotes/upstream/main' 2024-08-08 16:56:49 -07:00
Braelyn Boynton
0543059dbe Merge remote-tracking branch 'upstream/main'
# Conflicts:
#	pyproject.toml
#	src/crewai/agent.py
#	src/crewai/crew.py
#	src/crewai/task.py
#	src/crewai/tools/tool_usage.py
#	src/crewai/utilities/evaluators/task_evaluator.py
2024-07-23 17:55:15 -04:00
Braelyn Boynton
c3b8ea21d3 deprecation messages 2024-07-08 13:56:17 -07:00
Braelyn Boynton
fa9a42cd89 fix crew logger bug 2024-06-06 18:28:11 -07:00
Braelyn Boynton
9b965d9e33 fix crew logger bug 2024-06-06 18:26:09 -07:00
Braelyn Boynton
45655a956a conditional protect agentops use 2024-06-06 17:58:34 -07:00
Braelyn Boynton
f2d2804854 Merge remote-tracking branch 'origin/main' 2024-06-06 17:09:05 -07:00
Braelyn Boynton
ae65622bd0 Merge remote-tracking branch 'upstream/main'
# Conflicts:
#	src/crewai/task.py
2024-06-06 17:08:39 -07:00
Braelyn Boynton
f516fba9b6 Merge branch 'main' into main 2024-06-06 17:07:28 -07:00
Braelyn Boynton
a4622bfce8 support skip auto end session 2024-05-29 14:28:24 -07:00
theCyberTech - Rip&Tear
0dd4f444ea Added timestamp to logger (#646)
* Added timestamp to logger

Updated the logger.py file to include timestamps when logging output. For example:

 [2024-05-20 15:32:48][DEBUG]: == Working Agent: Researcher
 [2024-05-20 15:32:48][INFO]: == Starting Task: Research the topic
 [2024-05-20 15:33:22][DEBUG]: == [Researcher] Task output:

* Update tool_usage.py

* Revert "Update tool_usage.py"

This reverts commit 95d18d5b6f.

incorrect branch for this commit
2024-05-28 16:45:50 -07:00
Saif Mahmud
e2dfba63cd fixes #665 (#666) 2024-05-28 16:45:50 -07:00
theCyberTech - Rip&Tear
3bba04ac71 Update crew.py (#644)
Fixed Type on line 53
2024-05-28 16:45:50 -07:00
Mish Ushakov
b153bc1a80 Update BrowserbaseLoadTool.md (#647) 2024-05-28 16:45:50 -07:00
Mike Heavers
8e5bface29 Update README.md (#652)
Rework example so that if you use a custom LLM it doesn't throw code errors by uncommenting.
2024-05-28 16:45:50 -07:00
Anudeep Kolluri
9ac6752cbf Update agent.py (#655)
Changed default model value from gpt-4 to gpt-4o.
Reasoning: gpt-4 costs $30 per million tokens while gpt-4o costs $5,
making gpt-4o the more cost-friendly default option.
2024-05-28 16:45:50 -07:00
Paul Sanders
a08d0dfe12 Clarify text in docstring (#662) 2024-05-28 16:45:50 -07:00
Paul Sanders
96e0dacfc1 Enable search in docs (#663) 2024-05-28 16:45:50 -07:00
Olivier Roberdet
f4ce482eb7 Fix typo in instruction en.json (#676) 2024-05-28 16:45:50 -07:00
Braelyn Boynton
c6471814b3 merge upstream 2024-05-28 16:45:20 -07:00
Howard Gil
2d88109cc3 Merge branch 'main' of https://github.com/joaomdmoura/crewAI 2024-05-21 12:18:03 -07:00
Braelyn Boynton
54237c9974 track task evaluator 2024-05-09 13:15:12 -07:00
Braelyn Boynton
b4241a892e agentops version bump 2024-05-06 21:28:47 -07:00
Braelyn Boynton
a6de5253d5 Merge remote-tracking branch 'upstream/main' 2024-05-06 11:50:31 -07:00
Braelyn Boynton
b9d6ec5721 use langchain callback handler to support all LLMs 2024-05-03 15:07:17 -07:00
Braelyn Boynton
498bf77f08 black formatting 2024-05-02 13:06:34 -07:00
Braelyn Boynton
be91c32488 Merge remote-tracking branch 'upstream/main'
# Conflicts:
#	pyproject.toml
#	src/crewai/agent.py
#	src/crewai/crew.py
#	src/crewai/tools/tool_usage.py
2024-05-02 12:52:31 -07:00
Braelyn Boynton
f2c2a625b0 add crew tag 2024-05-02 12:28:06 -07:00
Braelyn Boynton
b160a52139 Merge remote-tracking branch 'upstream/main'
# Conflicts:
#	pyproject.toml
#	src/crewai/agent.py
#	src/crewai/crew.py
#	src/crewai/tools/tool_usage.py
2024-04-30 01:09:16 -07:00
Braelyn Boynton
a19a37bd9a noop 2024-04-29 23:31:48 -07:00
Braelyn Boynton
2f789800b7 Revert "Revert "Revert "true dependency"""
This reverts commit e9335e89
2024-04-29 23:30:02 -07:00
Braelyn Boynton
8be18c8e11 agentops update 2024-04-19 20:05:47 -07:00
João Moura
e366f006ac Update pyproject.toml 2024-04-19 23:38:20 -03:00
João Moura
d678190850 Forcing version 0.1.5 2024-04-19 23:18:43 -03:00
Braelyn Boynton
9005dc7c59 cleanup 2024-04-19 19:10:26 -07:00
Braelyn Boynton
e9335e89a6 Revert "Revert "true dependency""
This reverts commit 4d1b460b
2024-04-19 19:09:20 -07:00
Braelyn Boynton
fd7de7f2eb Revert "Revert "cleanup""
This reverts commit cea33d9a5d.
2024-04-19 19:08:22 -07:00
Braelyn Boynton
c52b5e9690 agentops 0.1.5 2024-04-19 19:07:53 -07:00
Braelyn Boynton
7725e7c52e optional parent key 2024-04-19 19:04:21 -07:00
Braelyn Boynton
7f8573e6cb Merge remote-tracking branch 'origin/main' 2024-04-19 19:02:39 -07:00
Braelyn Boynton
cea33d9a5d Revert "cleanup"
This reverts commit 7f5635fb9e.
2024-04-19 19:02:20 -07:00
Braelyn Boynton
4d1b460b80 Revert "true dependency"
This reverts commit e52e8e9568.
2024-04-19 19:01:52 -07:00
João Moura
906a5bd8ec Update pyproject.toml 2024-04-19 22:54:57 -03:00
Braelyn Boynton
216cc832dc Merge remote-tracking branch 'upstream/main'
# Conflicts:
#	poetry.lock
2024-04-18 16:21:19 -07:00
Braelyn Boynton
7f5635fb9e cleanup 2024-04-17 17:19:38 -07:00
Braelyn Boynton
0ce8d14742 add crew org key to agentops 2024-04-17 14:48:58 -07:00
Braelyn Boynton
e52e8e9568 true dependency 2024-04-17 14:39:23 -07:00
Braelyn Boynton
4f7a9a5b4b Merge remote-tracking branch 'upstream/main'
# Conflicts:
#	src/crewai/crew.py
2024-04-17 14:27:31 -07:00
Braelyn Boynton
2af85c35b4 remove org key 2024-04-15 15:39:24 -04:00
Braelyn Boynton
e82149aaf9 Merge remote-tracking branch 'upstream/main' 2024-04-11 12:32:17 -07:00
Braelyn Boynton
de0ee8ce41 Merge remote-tracking branch 'upstream/main'
# Conflicts:
#	src/crewai/crew.py
2024-04-05 15:48:35 -07:00
Braelyn Boynton
b20ae847c4 agentops version bump 2024-04-05 15:47:01 -07:00
Braelyn Boynton
59f56324ea Merge remote-tracking branch 'upstream/main'
# Conflicts:
#	poetry.lock
#	src/crewai/tools/tool_usage.py
2024-04-05 15:18:40 -07:00
Braelyn Boynton
79a0d8b94d optional agentops 2024-04-04 14:34:20 -07:00
Braelyn Boynton
750085498f remove telemetry code 2024-04-04 13:23:20 -07:00
Braelyn Boynton
215e39833a optional dependency usage 2024-04-03 23:14:37 -07:00
Braelyn Boynton
67bc1de4d6 make agentops optional 2024-04-03 15:36:47 -07:00
Braelyn Boynton
45e307b98a code cleanup 2024-04-02 12:25:52 -07:00
Braelyn Boynton
4402c9be74 merge upstream 2024-04-02 12:22:49 -07:00
Braelyn Boynton
5e46514398 better tool and llm tracking 2024-03-29 17:45:58 -07:00
Braelyn Boynton
c44c2b6808 track tool usage time 2024-03-29 14:28:33 -07:00
Braelyn Boynton
a9339fcef6 end session after completion 2024-03-26 14:09:58 -07:00
Braelyn Boynton
f67d0a26f1 track tool usage 2024-03-20 18:25:41 -07:00
Braelyn Boynton
f6ee12dbc5 implements agentops with a langchain handler, agent tracking and tool call recording 2024-03-19 18:47:22 -07:00
22 changed files with 248 additions and 576 deletions

View File

@@ -29,7 +29,6 @@ dependencies = [
"pyvis>=0.3.2",
"uv>=0.4.18",
"tomli-w>=1.1.0",
"blinker>=1.8.2",
]
[project.urls]

View File

@@ -1,10 +1,8 @@
import warnings
from crewai.agent import Agent
from crewai.crew import Crew
from crewai.flow.flow import Flow
from crewai.llm import LLM
from crewai.logging.event_logger import EventLogger
from crewai.pipeline import Pipeline
from crewai.process import Process
from crewai.routers import Router

View File

@@ -15,27 +15,18 @@ from crewai.utilities.constants import TRAINED_AGENTS_DATA_FILE, TRAINING_DATA_F
from crewai.utilities.token_counter_callback import TokenCalcHandler
from crewai.utilities.training_handler import CrewTrainingHandler
agentops = None
try:
import agentops # type: ignore # Name "agentops" already defined on line 21
from agentops import track_agent # type: ignore
except ImportError:
def mock_agent_ops_provider():
def track_agent(*args, **kwargs):
def track_agent():
def noop(f):
return f
return noop
return track_agent
agentops = None
if os.environ.get("AGENTOPS_API_KEY"):
try:
from agentops import track_agent
except ImportError:
track_agent = mock_agent_ops_provider()
else:
track_agent = mock_agent_ops_provider()
@track_agent()
class Agent(BaseAgent):
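The agent.py hunk above replaces the AGENTOPS_API_KEY gate with an unconditional try/except import. The fallback is a decorator factory that returns a no-op decorator, so `@track_agent()` stays safe to apply even when agentops is absent. A minimal self-contained sketch of that pattern (using the mock directly, since agentops may not be installed; the `Agent.role` attribute here is illustrative):

```python
def mock_agent_ops_provider():
    """Mimic agentops.track_agent when agentops is not installed."""
    def track_agent(*args, **kwargs):
        def noop(f):
            return f  # decorator that returns the class unchanged
        return noop
    return track_agent

# The real module first tries `from agentops import track_agent` and only
# falls back to the mock on ImportError; here we use the mock directly.
track_agent = mock_agent_ops_provider()

@track_agent()
class Agent:
    role = "demo"

assert Agent.role == "demo"  # the no-op decorator left the class untouched
```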

View File

@@ -212,38 +212,6 @@ class BaseAgent(ABC, BaseModel):
"""Get the converter class for the agent to create json/pydantic outputs."""
pass
def serialize(self) -> Dict[str, Any]:
"""Serialize the BaseAgent into a dictionary excluding complex objects."""
exclude = {
"_logger",
"_rpm_controller",
"_request_within_rpm_limit",
"_token_process",
"agent_executor",
"cache_handler",
"tools_handler",
"llm",
"crew",
# Add any other complex attributes that should be excluded
}
serialized_data = self.model_dump(exclude=exclude)
serialized_data = {k: v for k, v in serialized_data.items() if v is not None}
# Add any additional serialization logic if needed
serialized_data["role"] = self.role
serialized_data["goal"] = self.goal
serialized_data["backstory"] = self.backstory
serialized_data["tools"] = (
[str(tool) for tool in self.tools] if self.tools else None
)
# Include the UUID as a string
serialized_data["id"] = str(self.id)
return serialized_data
def copy(self: T) -> T: # type: ignore # Signature of "copy" incompatible with supertype "BaseModel"
"""Create a deep copy of the Agent."""
exclude = {

View File

@@ -25,7 +25,9 @@ class PlusAPI:
def _make_request(self, method: str, endpoint: str, **kwargs) -> requests.Response:
url = urljoin(self.base_url, endpoint)
return requests.request(method, url, headers=self.headers, **kwargs)
session = requests.Session()
session.trust_env = False
return session.request(method, url, headers=self.headers, **kwargs)
def login_to_tool_repository(self):
return self._make_request("POST", f"{self.TOOLS_RESOURCE}/login")
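The PlusAPI change routes every call through a `requests.Session` with `trust_env` disabled, so proxy environment variables and `~/.netrc` credentials cannot override the explicit headers. A hedged sketch of the changed `_make_request` (the base URL and endpoint below are illustrative, not the library's actual values):

```python
import requests
from urllib.parse import urljoin

def make_request(method: str, base_url: str, endpoint: str,
                 headers: dict, **kwargs) -> requests.Response:
    """Send a request that ignores environment proxies and netrc auth."""
    url = urljoin(base_url, endpoint)
    session = requests.Session()
    session.trust_env = False  # skip env proxies, CA bundles, and ~/.netrc
    return session.request(method, url, headers=headers, **kwargs)

# Usage (hypothetical endpoint):
# make_request("POST", "https://app.crewai.com/", "tools/login",
#              headers={"Authorization": "Bearer <token>"})
```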

View File

@@ -0,0 +1,10 @@
# Python-generated files
__pycache__/
*.py[oc]
build/
dist/
wheels/
*.egg-info
# Virtual environments
.venv

View File

@@ -1,14 +1,10 @@
[tool.poetry]
[project]
name = "{{folder_name}}"
version = "0.1.0"
description = "Power up your crews with {{folder_name}}"
authors = ["Your Name <you@example.com>"]
readme = "README.md"
requires-python = ">=3.10,<=3.13"
dependencies = [
"crewai[tools]>=0.70.1"
]
[tool.poetry.dependencies]
python = ">=3.10,<=3.13"
crewai = { extras = ["tools"], version = ">=0.70.1,<1.0.0" }
[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"

View File

@@ -4,6 +4,8 @@ import platform
import subprocess
import tempfile
from pathlib import Path
from netrc import netrc
import stat
import click
from rich.console import Console
@@ -147,7 +149,7 @@ class ToolCommand(BaseCommand, PlusAPIMixin):
if login_response.status_code != 200:
console.print(
"Failed to authenticate to the tool repository. Make sure you have the access to tools.",
"Authentication failed. Verify access to the tool repository, or try `crewai login`. ",
style="bold red",
)
raise SystemExit
@@ -159,23 +161,19 @@ class ToolCommand(BaseCommand, PlusAPIMixin):
"Successfully authenticated to the tool repository.", style="bold green"
)
def _set_netrc_credentials(self, credentials):
# Create .netrc or _netrc file
netrc_filename = "_netrc" if platform.system() == "Windows" else ".netrc"
netrc_path = Path.home() / netrc_filename
def _set_netrc_credentials(self, credentials, netrc_path=None):
if not netrc_path:
netrc_filename = "_netrc" if platform.system() == "Windows" else ".netrc"
netrc_path = Path.home() / netrc_filename
netrc_path.touch(mode=stat.S_IRUSR | stat.S_IWUSR, exist_ok=True)
netrc_content = f"""machine app.crewai.com
login {credentials['username']}
password {credentials['password']}
"""
netrc_instance = netrc(file=netrc_path)
netrc_instance.hosts["app.crewai.com"] = (credentials["username"], "", credentials["password"])
with open(netrc_path, "a") as netrc_file:
netrc_file.write(netrc_content)
with open(netrc_path, 'w') as file:
file.write(str(netrc_instance))
# Set appropriate permissions for Unix-like systems
if platform.system() != "Windows":
os.chmod(netrc_path, 0o600)
console.print(f"Added credentials to {netrc_filename}", style="bold green")
console.print(f"Added credentials to {netrc_path}", style="bold green")
def _add_package(self, tool_details):
tool_handle = tool_details["handle"]

View File

@@ -1,6 +1,5 @@
import asyncio
import json
import os
import uuid
import warnings
from concurrent.futures import Future
@@ -40,12 +39,6 @@ from crewai.utilities.constants import (
)
from crewai.utilities.evaluators.crew_evaluator_handler import CrewEvaluator
from crewai.utilities.evaluators.task_evaluator import TaskEvaluator
from crewai.utilities.event_helpers import (
emit_crew_finish,
emit_crew_start,
emit_task_finish,
emit_task_start,
)
from crewai.utilities.formatter import (
aggregate_raw_outputs_from_task_outputs,
aggregate_raw_outputs_from_tasks,
@@ -54,12 +47,10 @@ from crewai.utilities.planning_handler import CrewPlanner
from crewai.utilities.task_output_storage_handler import TaskOutputStorageHandler
from crewai.utilities.training_handler import CrewTrainingHandler
agentops = None
if os.environ.get("AGENTOPS_API_KEY"):
try:
import agentops # type: ignore
except ImportError:
pass
try:
import agentops
except ImportError:
agentops = None
if TYPE_CHECKING:
from crewai.pipeline.pipeline import Pipeline
@@ -92,9 +83,9 @@ class Crew(BaseModel):
"""
__hash__ = object.__hash__ # type: ignore
_execution_span: Any = PrivateAttr()
_rpm_controller: RPMController = PrivateAttr()
_logger: Logger = PrivateAttr()
# TODO: MAKE THIS ALSO USE EVENT EMITTER
_file_handler: FileHandler = PrivateAttr()
_cache_handler: InstanceOf[CacheHandler] = PrivateAttr(default=CacheHandler())
_short_term_memory: Optional[InstanceOf[ShortTermMemory]] = PrivateAttr()
@@ -106,7 +97,6 @@ class Crew(BaseModel):
_logging_color: str = PrivateAttr(
default="bold_purple",
)
# TODO: Figure out how to make this reference event emitter.
_task_output_handler: TaskOutputStorageHandler = PrivateAttr(
default_factory=TaskOutputStorageHandler
)
@@ -464,7 +454,7 @@ class Crew(BaseModel):
inputs: Optional[Dict[str, Any]] = None,
) -> CrewOutput:
"""Starts the crew to work on its assigned tasks."""
emit_crew_start(self, inputs)
self._execution_span = self._telemetry.crew_execution_span(self, inputs)
self._task_output_handler.reset()
self._logging_color = "bold_purple"
@@ -511,8 +501,6 @@ class Crew(BaseModel):
for metric in metrics:
self.usage_metrics.add_usage_metrics(metric)
emit_crew_finish(self, result)
return result
def kickoff_for_each(self, inputs: List[Dict[str, Any]]) -> List[CrewOutput]:
@@ -676,9 +664,7 @@ class Crew(BaseModel):
)
self._prepare_agent_tools(task)
emit_task_start(task, agent_to_use.role)
# TODO: ADD ELSEWHERE
# self._log_task_start(task, agent_to_use.role)
self._log_task_start(task, agent_to_use.role)
if isinstance(task, ConditionalTask):
skipped_task_output = self._handle_conditional_task(
@@ -709,17 +695,8 @@ class Crew(BaseModel):
tools=agent_to_use.tools,
)
task_outputs = [task_output]
emit_task_finish(
task,
self._inputs if self._inputs else {},
task_output,
task_index,
was_replayed,
)
# TODO: ADD ELSEWHERE
# self._process_task_result(task, task_output)
# self._store_execution_log(task, task_output, task_index, was_replayed)
self._process_task_result(task, task_output)
self._store_execution_log(task, task_output, task_index, was_replayed)
if futures:
task_outputs = self._process_async_tasks(futures, was_replayed)
@@ -794,9 +771,7 @@ class Crew(BaseModel):
def _log_task_start(self, task: Task, role: str = "None"):
if self.output_log_file:
self._file_handler.log(
task_name=task.name, task=task.description, agent=role, status="started"
)
self._file_handler.log(task_name=task.name, task=task.description, agent=role, status="started")
def _update_manager_tools(self, task: Task):
if self.manager_agent:
@@ -818,13 +793,7 @@ class Crew(BaseModel):
def _process_task_result(self, task: Task, output: TaskOutput) -> None:
role = task.agent.role if task.agent is not None else "None"
if self.output_log_file:
self._file_handler.log(
task_name=task.name,
task=task.description,
agent=role,
status="completed",
output=output.raw,
)
self._file_handler.log(task_name=task.name, task=task.description, agent=role, status="completed", output=output.raw)
def _create_crew_output(self, task_outputs: List[TaskOutput]) -> CrewOutput:
if len(task_outputs) != 1:
@@ -853,19 +822,10 @@ class Crew(BaseModel):
for future_task, future, task_index in futures:
task_output = future.result()
task_outputs.append(task_output)
emit_task_finish(
future_task,
self._inputs if self._inputs else {},
task_output,
task_index,
was_replayed,
self._process_task_result(future_task, task_output)
self._store_execution_log(
future_task, task_output, task_index, was_replayed
)
# TODO: ADD ELSEWHERE
# self._process_task_result(future_task, task_output)
# self._store_execution_log(
# future_task, task_output, task_index, was_replayed
# )
return task_outputs
def _find_task_index(
@@ -921,40 +881,6 @@ class Crew(BaseModel):
result = self._execute_tasks(self.tasks, start_index, True)
return result
def serialize(self) -> Dict[str, Any]:
"""Serialize the Crew into a dictionary excluding complex objects."""
exclude = {
"_rpm_controller",
"_logger",
"_execution_span",
"_file_handler",
"_cache_handler",
"_short_term_memory",
"_long_term_memory",
"_entity_memory",
"_telemetry",
"agents",
"tasks",
}
# Serialize agents and tasks to a simpler form if needed
serialized_agents = [agent.serialize() for agent in self.agents]
serialized_tasks = [task.serialize() for task in self.tasks]
serialized_data = self.model_dump(exclude=exclude)
serialized_data = {k: v for k, v in serialized_data.items() if v is not None}
# Add serialized agents and tasks
serialized_data["agents"] = serialized_agents
serialized_data["tasks"] = serialized_tasks
# Include the UUID as a string
serialized_data["id"] = str(self.id)
return serialized_data
# TODO: Come back and use the new _serialize method
def copy(self):
"""Create a deep copy of the Crew."""
@@ -1050,7 +976,6 @@ class Crew(BaseModel):
inputs: Optional[Dict[str, Any]] = None,
) -> None:
"""Test and evaluate the Crew with the given inputs for n iterations concurrently using concurrent.futures."""
# TODO: Event Emit test execution start
self._test_execution_span = self._telemetry.test_execution_span(
self,
n_iterations,
@@ -1064,7 +989,6 @@ class Crew(BaseModel):
self.kickoff(inputs=inputs)
evaluator.print_crew_evaluation_result()
# TODO:
def __rshift__(self, other: "Crew") -> "Pipeline":
"""

View File

@@ -41,19 +41,6 @@ class CrewOutput(BaseModel):
output_dict.update(self.pydantic.model_dump())
return output_dict
def serialize(self) -> Dict[str, Any]:
"""Serialize the CrewOutput into a dictionary excluding complex objects."""
serialized_data = {
"raw": self.raw,
"pydantic": self.pydantic.model_dump() if self.pydantic else None,
"json_dict": self.json_dict,
"tasks_output": [
task_output.serialize() for task_output in self.tasks_output
],
"token_usage": self.token_usage.model_dump(),
}
return {k: v for k, v in serialized_data.items() if v is not None}
def __getitem__(self, key):
if self.pydantic and hasattr(self.pydantic, key):
return getattr(self.pydantic, key)

View File

@@ -1 +0,0 @@
from .event_logger import Event, EventLogger

View File

@@ -1,94 +0,0 @@
import json
import os
import uuid
from datetime import datetime, timezone
from typing import Any, Optional
from dotenv import load_dotenv
from sqlalchemy import JSON, Column, DateTime, Integer, String, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import Session as SQLAlchemySession
from sqlalchemy.orm import sessionmaker
from crewai.utilities.event_emitter import crew_events
load_dotenv()
Base = declarative_base()
class Session:
_session_id: Optional[str] = None
@classmethod
def get_session_id(cls) -> str:
if cls._session_id is None:
cls._session_id = str(uuid.uuid4()) # Generate a new UUID
print(f"Generated new session ID: {cls._session_id}")
return cls._session_id
class Event(Base):
__tablename__ = "events"
id = Column(Integer, primary_key=True)
event_name = Column(String)
timestamp = Column(DateTime, default=datetime.now(timezone.utc))
session_id = Column(String)
data = Column(JSON)
error_type = Column(String)
error_message = Column(String)
traceback = Column(String)
DATABASE_URL = os.getenv("CREWAI_DATABASE_URL", "sqlite:///crew_events.db")
engine = create_engine(DATABASE_URL)
Base.metadata.create_all(engine)
SessionLocal = sessionmaker(bind=engine) # Use a different name to avoid confusion
# Create a session instance
session: SQLAlchemySession = SessionLocal()
class EventLogger:
def __init__(self, session: SQLAlchemySession):
self.session = session
def log_event(self, *args: Any, **kwargs: Any) -> None:
# Extract event name from kwargs
event_name = kwargs.pop("event", "unknown_event")
print("Logging event:", event_name)
print("Args:", args)
print("Kwargs:", kwargs)
# Check if args is a single dictionary and unpack it
if len(args) == 1 and isinstance(args[0], dict):
args_dict = args[0]
else:
# Convert args to a dictionary with keys like 'arg0', 'arg1', etc.
args_dict = {f"arg{i}": arg for i, arg in enumerate(args)}
# Merge args_dict and kwargs into a single dictionary
data = {**args_dict, **kwargs}
print("Data:", data)
event = Event(
event_name=event_name,
session_id=Session.get_session_id(),
data=json.dumps(data),
error_type=kwargs.get("error_type", ""),
error_message=kwargs.get("error_message", ""),
traceback=kwargs.get("traceback", ""),
)
self.session.add(event)
self.session.commit()
print("Successfully logged event:", event_name)
event_logger = EventLogger(session)
print("Connecting event_logger to all signals")
crew_events.on("*", event_logger.log_event)
print("Connected event_logger to all signals")

View File

@@ -317,35 +317,6 @@ class Task(BaseModel):
self.processed_by_agents.add(agent_name)
self.delegations += 1
def serialize(self) -> Dict[str, Any]:
"""Serialize the Task into a dictionary excluding complex objects."""
# Define attributes to exclude from serialization
exclude = {
"_telemetry",
"_execution_span",
"_thread",
"_execution_time",
"agent",
"context",
"tools",
"output",
}
# Use model_dump or similar to get a dictionary representation
serialized_data = self.model_dump(exclude=exclude)
# Add any additional serialization logic if needed
serialized_data["id"] = str(self.id)
serialized_data["name"] = self.name
serialized_data["description"] = self.description
serialized_data["expected_output"] = self.expected_output
serialized_data["output_file"] = self.output_file
serialized_data["human_input"] = self.human_input
serialized_data["processed_by_agents"] = list(self.processed_by_agents)
return serialized_data
def copy(
self, agents: List["BaseAgent"], task_mapping: Dict[str, "Task"]
) -> "Task":

View File

@@ -56,21 +56,6 @@ class TaskOutput(BaseModel):
output_dict.update(self.pydantic.model_dump())
return output_dict
def serialize(self) -> Dict[str, Any]:
"""Serialize the TaskOutput into a dictionary excluding complex objects."""
serialized_data = {
"description": self.description,
"name": self.name,
"expected_output": self.expected_output,
"summary": self.summary,
"raw": self.raw,
"pydantic": self.pydantic.model_dump() if self.pydantic else None,
"json_dict": self.json_dict,
"agent": self.agent,
"output_format": self.output_format.value,
}
return {k: v for k, v in serialized_data.items() if v is not None}
def __str__(self) -> str:
if self.pydantic:
return str(self.pydantic)

View File

@@ -6,7 +6,7 @@ import os
import platform
import warnings
from contextlib import contextmanager
from typing import TYPE_CHECKING, Any, Dict, Optional
from typing import TYPE_CHECKING, Any, Optional
@contextmanager
@@ -21,16 +21,12 @@ with suppress_warnings():
from opentelemetry import trace # noqa: E402
from opentelemetry.exporter.otlp.proto.http.trace_exporter import (
OTLPSpanExporter, # noqa: E402
)
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter # noqa: E402
from opentelemetry.sdk.resources import SERVICE_NAME, Resource # noqa: E402
from opentelemetry.sdk.trace import TracerProvider # noqa: E402
from opentelemetry.sdk.trace.export import BatchSpanProcessor # noqa: E402
from opentelemetry.trace import Span, Status, StatusCode # noqa: E402
from crewai.utilities.event_emitter import CrewEvents, crew_events
if TYPE_CHECKING:
from crewai.crew import Crew
from crewai.task import Task
@@ -87,37 +83,87 @@ class Telemetry:
self.ready = False
self.trace_set = False
def crew_creation(
self, crew_data: Dict[str, Any], inputs: Optional[Dict[str, Any]] = None
):
def crew_creation(self, crew: Crew, inputs: dict[str, Any] | None):
"""Records the creation of a crew."""
if self.ready:
try:
tracer = trace.get_tracer("crewai.telemetry")
span = tracer.start_span("Crew Created")
# Accessing data from the serialized crew dictionary
self._add_attribute(
span, "crewai_version", crew_data.get("crewai_version")
span,
"crewai_version",
pkg_resources.get_distribution("crewai").version,
)
self._add_attribute(span, "python_version", platform.python_version())
self._add_attribute(span, "crew_key", crew_data.get("key"))
self._add_attribute(span, "crew_id", crew_data.get("id"))
self._add_attribute(span, "crew_process", crew_data.get("process"))
self._add_attribute(span, "crew_memory", crew_data.get("memory"))
self._add_attribute(
span, "crew_number_of_tasks", len(crew_data.get("tasks", []))
)
self._add_attribute(
span, "crew_number_of_agents", len(crew_data.get("agents", []))
)
if crew_data.get("share_crew"):
self._add_attribute(span, "crew_key", crew.key)
self._add_attribute(span, "crew_id", str(crew.id))
self._add_attribute(span, "crew_process", crew.process)
self._add_attribute(span, "crew_memory", crew.memory)
self._add_attribute(span, "crew_number_of_tasks", len(crew.tasks))
self._add_attribute(span, "crew_number_of_agents", len(crew.agents))
if crew.share_crew:
self._add_attribute(
span, "crew_agents", json.dumps(crew_data.get("agents", []))
span,
"crew_agents",
json.dumps(
[
{
"key": agent.key,
"id": str(agent.id),
"role": agent.role,
"goal": agent.goal,
"backstory": agent.backstory,
"verbose?": agent.verbose,
"max_iter": agent.max_iter,
"max_rpm": agent.max_rpm,
"i18n": agent.i18n.prompt_file,
"function_calling_llm": (
agent.function_calling_llm.model
if agent.function_calling_llm
else ""
),
"llm": agent.llm.model,
"delegation_enabled?": agent.allow_delegation,
"allow_code_execution?": agent.allow_code_execution,
"max_retry_limit": agent.max_retry_limit,
"tools_names": [
tool.name.casefold()
for tool in agent.tools or []
],
}
for agent in crew.agents
]
),
)
self._add_attribute(
span, "crew_tasks", json.dumps(crew_data.get("tasks", []))
span,
"crew_tasks",
json.dumps(
[
{
"key": task.key,
"id": str(task.id),
"description": task.description,
"expected_output": task.expected_output,
"async_execution?": task.async_execution,
"human_input?": task.human_input,
"agent_role": (
task.agent.role if task.agent else "None"
),
"agent_key": task.agent.key if task.agent else None,
"context": (
[task.description for task in task.context]
if task.context
else None
),
"tools_names": [
tool.name.casefold()
for tool in task.tools or []
],
}
for task in crew.tasks
]
),
)
self._add_attribute(span, "platform", platform.platform())
self._add_attribute(span, "platform_release", platform.release())
@@ -128,10 +174,59 @@ class Telemetry:
span, "crew_inputs", json.dumps(inputs) if inputs else None
)
else:
# Handle the case where share_crew is False
# You might want to add limited data here
pass
self._add_attribute(
span,
"crew_agents",
json.dumps(
[
{
"key": agent.key,
"id": str(agent.id),
"role": agent.role,
"verbose?": agent.verbose,
"max_iter": agent.max_iter,
"max_rpm": agent.max_rpm,
"function_calling_llm": (
agent.function_calling_llm.model
if agent.function_calling_llm
else ""
),
"llm": agent.llm.model,
"delegation_enabled?": agent.allow_delegation,
"allow_code_execution?": agent.allow_code_execution,
"max_retry_limit": agent.max_retry_limit,
"tools_names": [
tool.name.casefold()
for tool in agent.tools or []
],
}
for agent in crew.agents
]
),
)
self._add_attribute(
span,
"crew_tasks",
json.dumps(
[
{
"key": task.key,
"id": str(task.id),
"async_execution?": task.async_execution,
"human_input?": task.human_input,
"agent_role": (
task.agent.role if task.agent else "None"
),
"agent_key": task.agent.key if task.agent else None,
"tools_names": [
tool.name.casefold()
for tool in task.tools or []
],
}
for task in crew.tasks
]
),
)
span.set_status(Status(StatusCode.OK))
span.end()
except Exception:
@@ -366,39 +461,77 @@ class Telemetry:
except Exception:
pass
def crew_execution_span(
self, crew_data: Dict[str, Any], inputs: Optional[Dict[str, Any]] = None
):
def crew_execution_span(self, crew: Crew, inputs: dict[str, Any] | None):
"""Records the complete execution of a crew.
This is only collected if the user has opted-in to share the crew.
"""
if self.ready and crew_data.get("share_crew"):
self.crew_creation(crew, inputs)
if (self.ready) and (crew.share_crew):
try:
tracer = trace.get_tracer("crewai.telemetry")
span = tracer.start_span("Crew Execution")
self._add_attribute(
span,
"crewai_version",
pkg_resources.get_distribution("crewai").version,
)
self._add_attribute(span, "crew_key", crew_data.get("key"))
self._add_attribute(span, "crew_id", crew_data.get("id"))
self._add_attribute(span, "crew_key", crew.key)
self._add_attribute(span, "crew_id", str(crew.id))
self._add_attribute(
span, "crew_inputs", json.dumps(inputs) if inputs else None
)
self._add_attribute(
span,
"crew_agents",
json.dumps(crew_data.get("agents", [])),
json.dumps(
[
{
"key": agent.key,
"id": str(agent.id),
"role": agent.role,
"goal": agent.goal,
"backstory": agent.backstory,
"verbose?": agent.verbose,
"max_iter": agent.max_iter,
"max_rpm": agent.max_rpm,
"i18n": agent.i18n.prompt_file,
"llm": agent.llm.model,
"delegation_enabled?": agent.allow_delegation,
"tools_names": [
tool.name.casefold() for tool in agent.tools or []
],
}
for agent in crew.agents
]
),
)
self._add_attribute(
span,
"crew_tasks",
json.dumps(crew_data.get("tasks", [])),
json.dumps(
[
{
"id": str(task.id),
"description": task.description,
"expected_output": task.expected_output,
"async_execution?": task.async_execution,
"human_input?": task.human_input,
"agent_role": task.agent.role if task.agent else "None",
"agent_key": task.agent.key if task.agent else None,
"context": (
[task.description for task in task.context]
if task.context
else None
),
"tools_names": [
tool.name.casefold() for tool in task.tools or []
],
}
for task in crew.tasks
]
),
)
span.set_status(Status(StatusCode.OK))
span.end()
return span
except Exception:
pass
@@ -474,9 +607,3 @@ class Telemetry:
span.end()
except Exception:
pass
telemetry = Telemetry()
crew_events.on(CrewEvents.CREW_START, telemetry.crew_execution_span)

View File

@@ -1,26 +1,22 @@
import ast
import datetime
import os
import time
from difflib import SequenceMatcher
from textwrap import dedent
from typing import Any, List, Union
-import crewai.utilities.events as events
from crewai.agents.tools_handler import ToolsHandler
from crewai.task import Task
from crewai.telemetry import Telemetry
from crewai.tools.tool_calling import InstructorToolCalling, ToolCalling
from crewai.tools.tool_usage_events import ToolUsageError, ToolUsageFinished
from crewai.utilities import I18N, Converter, ConverterError, Printer
import crewai.utilities.events as events
-agentops = None
-if os.environ.get("AGENTOPS_API_KEY"):
-    try:
-        import agentops  # type: ignore
-    except ImportError:
-        pass
+try:
+    import agentops
+except ImportError:
+    agentops = None
OPENAI_BIGGER_MODELS = ["gpt-4", "gpt-4o", "o1-preview", "o1-mini"]
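The hunk above drops the `AGENTOPS_API_KEY` gate in favor of a plain optional import. A minimal standalone sketch of the pattern (the `agentops_enabled` helper is illustrative, not the module's actual code):

```python
# Optional-dependency import: bind the module if installed, else None.
# Call sites then guard on the binding instead of on an env var.
try:
    import agentops  # third-party SDK; may be absent
except ImportError:
    agentops = None

def agentops_enabled() -> bool:
    """Hypothetical guard a call site would check before touching the SDK."""
    return agentops is not None
```

Compared with the removed version, this keeps one source of truth: the import either succeeded or it did not, regardless of environment configuration.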

@@ -1,4 +1,3 @@
-import os
from typing import List
from pydantic import BaseModel, Field
@@ -7,26 +6,16 @@ from crewai.utilities import Converter
from crewai.utilities.pydantic_schema_parser import PydanticSchemaParser
-def mock_agent_ops_provider():
-    def track_agent(*args, **kwargs):
+agentops = None
+try:
+    from agentops import track_agent
+except ImportError:
+    def track_agent(name):
        def noop(f):
            return f
        return noop
-    return track_agent
-agentops = None
-if os.environ.get("AGENTOPS_API_KEY"):
-    try:
-        from agentops import track_agent
-    except ImportError:
-        track_agent = mock_agent_ops_provider()
-else:
-    track_agent = mock_agent_ops_provider()
class Entity(BaseModel):
name: str = Field(description="The name of the entity.")
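The fallback branch above defines `track_agent` as a decorator factory that hands things back unchanged when agentops cannot be imported. A self-contained sketch of that no-op shape (standalone, without the real import; the decorated class and its name are invented):

```python
def track_agent(name):
    """No-op stand-in mirroring the ImportError branch above."""
    def noop(f):
        return f  # return the decorated class/function untouched
    return noop

@track_agent(name="entity-extractor")  # hypothetical name
class EntityExtractor:
    pass
```

Because the fallback has the same call shape (`@track_agent(name=...)`), call sites need no conditionals at all.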

@@ -1,46 +0,0 @@
from enum import Enum
from typing import Any, Callable
from blinker import signal
class CrewEvents(Enum):
CREW_START = "crew_start"
CREW_FINISH = "crew_finish"
CREW_FAILURE = "crew_failure"
TASK_START = "task_start"
TASK_FINISH = "task_finish"
TASK_FAILURE = "task_failure"
AGENT_ACTION = "agent_action"
TOOL_USE = "tool_use"
TOKEN_USAGE = "token_usage"
class CrewEventEmitter:
def __init__(self):
self._all_signal = signal("all")
def on(self, event_name: CrewEvents | str, callback: Callable) -> None:
if event_name == "*" or event_name == "all":
self._all_signal.connect(callback, weak=False)
else:
signal(
event_name.value if isinstance(event_name, CrewEvents) else event_name
).connect(callback, weak=False)
def emit(self, event_name: CrewEvents, *args: Any, **kwargs: Any) -> None:
signal(event_name.value).send(*args, **kwargs)
self._all_signal.send(*args, event=event_name.value, **kwargs)
crew_events = CrewEventEmitter()
def emit(event_name: CrewEvents, *args: Any, **kwargs: Any) -> None:
try:
crew_events.emit(event_name, *args, **kwargs)
except Exception as e:
if kwargs.get("raise_on_error", False):
raise e
else:
print(f"Error emitting event: {e}")
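The deleted module routes every event through per-name blinker signals plus a wildcard channel. A rough dependency-free sketch of the same idea, with plain dicts standing in for blinker:

```python
from collections import defaultdict
from typing import Any, Callable, Dict, List

class MiniEmitter:
    """Dict-backed stand-in for the blinker-based CrewEventEmitter."""

    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable]] = defaultdict(list)
        self._all: List[Callable] = []  # wildcard subscribers

    def on(self, event_name: str, callback: Callable) -> None:
        if event_name in ("*", "all"):
            self._all.append(callback)  # sees every event
        else:
            self._handlers[event_name].append(callback)

    def emit(self, event_name: str, **kwargs: Any) -> None:
        for cb in self._handlers[event_name]:
            cb(**kwargs)
        for cb in self._all:
            cb(event=event_name, **kwargs)  # wildcard gets the event name too

emitter = MiniEmitter()
seen = []
emitter.on("crew_start", lambda **kw: seen.append(("start", kw)))
emitter.on("*", lambda **kw: seen.append(("any", kw)))
emitter.emit("crew_start", crew_id="abc")
```

The real module additionally used `CrewEvents` enum members as signal names and swallowed emit errors unless `raise_on_error` was passed; this sketch keeps only the subscribe/emit core.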

@@ -1,121 +0,0 @@
# event_helpers.py
from datetime import datetime
from typing import TYPE_CHECKING, Any, Dict, Optional
from crewai.utilities.event_emitter import CrewEvents, emit
if TYPE_CHECKING:
from crewai.crew import Crew
from crewai.crews.crew_output import CrewOutput
from crewai.task import Task
from crewai.tasks.task_output import TaskOutput
def emit_crew_start(
crew: "Crew", # Use a forward reference
inputs: Optional[Dict[str, Any]] = None,
) -> None:
serialized_crew = crew.serialize()
emit(
CrewEvents.CREW_START,
{
**serialized_crew,
},
inputs=inputs,
)
def emit_crew_finish(crew: "Crew", result: "CrewOutput") -> None:
serialized_crew = crew.serialize()
serialized_result = result.serialize()
print("emit crew finish")
emit(
CrewEvents.CREW_FINISH,
{
**serialized_crew,
"result": serialized_result,
},
)
def emit_crew_failure(
crew_id: str, name: str, error: Exception, traceback: str, duration: float
) -> None:
emit(
CrewEvents.CREW_FAILURE,
{
"crew_id": crew_id,
"name": name,
"failure_time": datetime.now().isoformat(),
"error_type": type(error).__name__,
"error_message": str(error),
"traceback": traceback,
"duration": duration,
},
)
def emit_task_start(
task: "Task",
agent_role: str = "None",
) -> None:
serialized_task = task.serialize()
emit(
CrewEvents.TASK_START,
{
**serialized_task,
},
agent_role=agent_role,
)
def emit_task_finish(
task: "Task",
inputs: Dict[str, Any],
output: "TaskOutput",
task_index: int,
was_replayed: bool = False,
) -> None:
emit(
CrewEvents.TASK_FINISH,
{
"task": task.serialize(),
"output": {
"description": output.description,
"summary": output.summary,
"raw": output.raw,
"pydantic": output.pydantic,
"json_dict": output.json_dict,
"output_format": output.output_format,
"agent": output.agent,
},
"task_index": task_index,
"inputs": inputs,
"was_replayed": was_replayed,
},
)
def emit_task_failure(
crew_id: str,
task_id: str,
task_name: str,
error: Exception,
traceback: str,
duration: float,
) -> None:
emit(
CrewEvents.TASK_FAILURE,
{
"crew_id": crew_id,
"task_id": task_id,
"task_name": task_name,
"failure_time": datetime.now().isoformat(),
"error_type": type(error).__name__,
"error_message": str(error),
"traceback": traceback,
"duration": duration,
},
)

@@ -1,8 +1,8 @@
-from typing import Any, Callable, Generic, List, Dict, Type, TypeVar
+from functools import wraps
+from typing import Any, Callable, Dict, Generic, List, Type, TypeVar
from pydantic import BaseModel
T = TypeVar("T")
EVT = TypeVar("EVT", bound=BaseModel)

@@ -92,16 +92,20 @@ class TestPlusAPI(unittest.TestCase):
)
self.assertEqual(response, mock_response)
-    @patch("crewai.cli.plus_api.requests.request")
-    def test_make_request(self, mock_request):
+    @patch("crewai.cli.plus_api.requests.Session")
+    def test_make_request(self, mock_session):
        mock_response = MagicMock()
-        mock_request.return_value = mock_response
+        mock_session_instance = mock_session.return_value
+        mock_session_instance.request.return_value = mock_response
        response = self.api._make_request("GET", "test_endpoint")
-        mock_request.assert_called_once_with(
+        mock_session.assert_called_once()
+        mock_session_instance.request.assert_called_once_with(
            "GET", f"{self.api.base_url}/test_endpoint", headers=self.api.headers
        )
+        mock_session_instance.trust_env = False
        self.assertEqual(response, mock_response)
@patch("crewai.cli.plus_api.PlusAPI._make_request")
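The updated test patches the `Session` class rather than module-level `requests.request`, then asserts against the session instance. A dependency-free sketch of the same mocking shape against a toy client (`PlusAPIish` and all its fields are invented for illustration, not the real `PlusAPI`):

```python
from unittest.mock import MagicMock

class PlusAPIish:
    """Toy client mirroring the pattern under test; names are invented."""

    def __init__(self, session_factory, base_url, headers):
        self._session_factory = session_factory
        self.base_url = base_url
        self.headers = headers

    def _make_request(self, method, endpoint):
        # One session per request, with environment proxy settings disabled.
        session = self._session_factory()
        session.trust_env = False
        return session.request(
            method, f"{self.base_url}/{endpoint}", headers=self.headers
        )

# Inject a MagicMock in place of requests.Session and exercise the client.
mock_session = MagicMock()
api = PlusAPIish(lambda: mock_session, "https://example.test", {"X-Key": "1"})
resp = api._make_request("GET", "test_endpoint")
```

Mocking the session object (instead of a module function) lets the test also observe attribute writes like `trust_env = False`, which a function-level patch cannot see.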

uv.lock generated
@@ -290,15 +290,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/b1/fe/e8c672695b37eecc5cbf43e1d0638d88d66ba3a44c4d321c796f4e59167f/beautifulsoup4-4.12.3-py3-none-any.whl", hash = "sha256:b80878c9f40111313e55da8ba20bdba06d8fa3969fc68304167741bbf9e082ed", size = 147925 },
]
-[[package]]
-name = "blinker"
-version = "1.8.2"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/1e/57/a6a1721eff09598fb01f3c7cda070c1b6a0f12d63c83236edf79a440abcc/blinker-1.8.2.tar.gz", hash = "sha256:8f77b09d3bf7c795e969e9486f39c2c5e9c39d4ee07424be2bc594ece9642d83", size = 23161 }
-wheels = [
-    { url = "https://files.pythonhosted.org/packages/bb/2a/10164ed1f31196a2f7f3799368a821765c62851ead0e630ab52b8e14b4d0/blinker-1.8.2-py3-none-any.whl", hash = "sha256:1779309f71bf239144b9399d06ae925637cf6634cf6bd131104184531bf67c01", size = 9456 },
-]
[[package]]
name = "boto3"
version = "1.35.32"
@@ -636,13 +627,12 @@ wheels = [
[[package]]
name = "crewai"
-version = "0.70.1"
+version = "0.67.1"
source = { editable = "." }
dependencies = [
{ name = "agentops" },
{ name = "appdirs" },
{ name = "auth0-python" },
-    { name = "blinker" },
{ name = "click" },
{ name = "crewai-tools" },
{ name = "embedchain" },
@@ -697,7 +687,6 @@ requires-dist = [
{ name = "agentops", marker = "extra == 'agentops'", specifier = ">=0.3.0" },
{ name = "appdirs", specifier = ">=1.4.4" },
{ name = "auth0-python", specifier = ">=4.7.1" },
-    { name = "blinker" },
{ name = "click", specifier = ">=8.1.7" },
{ name = "crewai-tools", specifier = ">=0.12.1" },
{ name = "crewai-tools", marker = "extra == 'tools'", specifier = ">=0.12.1" },
@@ -2507,8 +2496,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/b2/07/8cbb75d6cfbe8712d8f7f6a5615f083c6e710ab916b748fbb20373ddb142/multiprocess-0.70.17-py311-none-any.whl", hash = "sha256:2884701445d0177aec5bd5f6ee0df296773e4fb65b11903b94c613fb46cfb7d1", size = 144346 },
{ url = "https://files.pythonhosted.org/packages/a4/69/d3f343a61a2f86ef10ed7865a26beda7c71554136ce187b0384b1c2c9ca3/multiprocess-0.70.17-py312-none-any.whl", hash = "sha256:2818af14c52446b9617d1b0755fa70ca2f77c28b25ed97bdaa2c69a22c47b46c", size = 147990 },
{ url = "https://files.pythonhosted.org/packages/c8/b7/2e9a4fcd871b81e1f2a812cd5c6fb52ad1e8da7bf0d7646c55eaae220484/multiprocess-0.70.17-py313-none-any.whl", hash = "sha256:20c28ca19079a6c879258103a6d60b94d4ffe2d9da07dda93fb1c8bc6243f522", size = 149843 },
-    { url = "https://files.pythonhosted.org/packages/ae/d7/fd7a092fc0ab1845a1a97ca88e61b9b7cc2e9d6fcf0ed24e9480590c2336/multiprocess-0.70.17-py38-none-any.whl", hash = "sha256:1d52f068357acd1e5bbc670b273ef8f81d57863235d9fbf9314751886e141968", size = 132635 },
-    { url = "https://files.pythonhosted.org/packages/f9/41/0618ac724b8a56254962c143759e04fa01c73b37aa69dd433f16643bd38b/multiprocess-0.70.17-py39-none-any.whl", hash = "sha256:c3feb874ba574fbccfb335980020c1ac631fbf2a3f7bee4e2042ede62558a021", size = 133359 },
]
[[package]]
@@ -3192,6 +3179,8 @@ version = "5.9.8"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/90/c7/6dc0a455d111f68ee43f27793971cf03fe29b6ef972042549db29eec39a2/psutil-5.9.8.tar.gz", hash = "sha256:6be126e3225486dff286a8fb9a06246a5253f4c7c53b475ea5f5ac934e64194c", size = 503247 }
wheels = [
+    { url = "https://files.pythonhosted.org/packages/fe/5f/c26deb822fd3daf8fde4bdb658bf87d9ab1ffd3fca483816e89a9a9a9084/psutil-5.9.8-cp27-none-win32.whl", hash = "sha256:36f435891adb138ed3c9e58c6af3e2e6ca9ac2f365efe1f9cfef2794e6c93b4e", size = 248660 },
+    { url = "https://files.pythonhosted.org/packages/32/1d/cf66073d74d6146187e2d0081a7616df4437214afa294ee4f16f80a2f96a/psutil-5.9.8-cp27-none-win_amd64.whl", hash = "sha256:bd1184ceb3f87651a67b2708d4c3338e9b10c5df903f2e3776b62303b26cb631", size = 251966 },
{ url = "https://files.pythonhosted.org/packages/e7/e3/07ae864a636d70a8a6f58da27cb1179192f1140d5d1da10886ade9405797/psutil-5.9.8-cp36-abi3-macosx_10_9_x86_64.whl", hash = "sha256:aee678c8720623dc456fa20659af736241f575d79429a0e5e9cf88ae0605cc81", size = 248702 },
{ url = "https://files.pythonhosted.org/packages/b3/bd/28c5f553667116b2598b9cc55908ec435cb7f77a34f2bff3e3ca765b0f78/psutil-5.9.8-cp36-abi3-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8cb6403ce6d8e047495a701dc7c5bd788add903f8986d523e3e20b98b733e421", size = 285242 },
{ url = "https://files.pythonhosted.org/packages/c5/4f/0e22aaa246f96d6ac87fe5ebb9c5a693fbe8877f537a1022527c47ca43c5/psutil-5.9.8-cp36-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d06016f7f8625a1825ba3732081d77c94589dca78b7a3fc072194851e88461a4", size = 288191 },