mirror of https://github.com/crewAIInc/crewAI.git, synced 2026-01-20 13:28:13 +00:00
feat: add docs about LLM tracking by Agents and Tasks
This commit is contained in:
@@ -752,6 +752,55 @@ CrewAI supports streaming responses from LLMs, allowing your application to rece
[Click here](https://docs.crewai.com/concepts/event-listener#event-listeners) for more details
</Tip>
</Tab>

<Tab title="Agent & Task Tracking">
All LLM events in CrewAI include agent and task information, allowing you to track and filter LLM interactions by specific agents or tasks:
```python
from crewai import LLM, Agent, Task, Crew
from crewai.utilities.events import LLMStreamChunkEvent
from crewai.utilities.events.base_event_listener import BaseEventListener


class MyCustomListener(BaseEventListener):
    def setup_listeners(self, crewai_event_bus):
        @crewai_event_bus.on(LLMStreamChunkEvent)
        def on_llm_stream_chunk(source, event):
            # Only react to chunks emitted on behalf of the researcher agent
            if researcher.id == event.agent_id:
                print("\n==============\n Got event:", event, "\n==============\n")


my_listener = MyCustomListener()

llm = LLM(model="gpt-4o-mini", temperature=0, stream=True)

researcher = Agent(
    role="About User",
    goal="You know everything about the user.",
    backstory="""You are a master at understanding people and their preferences.""",
    llm=llm,
)

search = Task(
    description="Answer the following questions about the user: {question}",
    expected_output="An answer to the question.",
    agent=researcher,
)

crew = Crew(agents=[researcher], tasks=[search])

result = crew.kickoff(
    inputs={"question": "..."}
)
```

<Info>
This feature is particularly useful for:

- Debugging specific agent behaviors
- Logging LLM usage by task type
- Auditing which agents are making what types of LLM calls
- Performance monitoring of specific tasks
</Info>
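As a sketch of the auditing and monitoring use cases above, a listener can tally events per agent and task instead of printing them. This is plain Python with hypothetical names (`LLMUsageTally`, `on_llm_event`); a real handler would read `event.agent_id` and `event.task_id` from the CrewAI event rather than taking them as arguments:

```python
from collections import defaultdict


class LLMUsageTally:
    """Hypothetical stand-in for a CrewAI listener that counts
    LLM events per (agent, task) pair for later auditing."""

    def __init__(self):
        self.counts = defaultdict(int)

    def on_llm_event(self, agent_id, task_id):
        # In a real listener this would be called from an
        # @crewai_event_bus.on(...) handler with event attributes.
        self.counts[(agent_id, task_id)] += 1

    def calls_for_agent(self, agent_id):
        return sum(n for (a, _), n in self.counts.items() if a == agent_id)


tally = LLMUsageTally()
tally.on_llm_event("researcher", "search")
tally.on_llm_event("researcher", "search")
tally.on_llm_event("writer", "report")
print(tally.calls_for_agent("researcher"))  # 2
```

The same pattern extends to recording token counts or latencies per task instead of raw event counts.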
</Tab>
</Tabs>

## Structured LLM Calls