mirror of
https://github.com/crewAIInc/crewAI.git
synced 2025-12-16 12:28:30 +00:00
Compare commits
15 Commits
bugfix/min
...
conditiona
| SHA1 |
| --- |
| 6ad218f9a0 |
| 36efa172ee |
| cb720143c7 |
| bde0a3e99c |
| 99ada42d97 |
| ef928ee3cb |
| bc7f601f84 |
| b0c2b15a3e |
| c0f04bbb37 |
| c320fc655e |
| ac2815c781 |
| dd8a199e99 |
| 161c4a6856 |
| 67b04b30bf |
| 7696b45fc3 |
3
.gitignore
vendored
@@ -14,4 +14,5 @@ test.py
|
||||
rc-tests/*
|
||||
*.pkl
|
||||
temp/*
|
||||
.vscode/*
|
||||
crew_tasks_output.json
|
||||
@@ -4,36 +4,37 @@ description: Understanding and utilizing crews in the crewAI framework with comp
|
||||
---
|
||||
|
||||
## What is a Crew?
|
||||
|
||||
A crew in crewAI represents a collaborative group of agents working together to achieve a set of tasks. Each crew defines the strategy for task execution, agent collaboration, and the overall workflow.
|
||||
|
||||
## Crew Attributes
|
||||
|
||||
| Attribute | Parameters | Description |
|
||||
| :------------------------------------ | :--------------------- | :-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
|
||||
| **Tasks** | `tasks` | A list of tasks assigned to the crew. |
|
||||
| **Agents** | `agents` | A list of agents that are part of the crew. |
|
||||
| **Process** _(optional)_ | `process` | The process flow (e.g., sequential, hierarchical) the crew follows. |
|
||||
| **Verbose** _(optional)_ | `verbose` | The verbosity level for logging during execution. |
|
||||
| **Manager LLM** _(optional)_ | `manager_llm` | The language model used by the manager agent in a hierarchical process. **Required when using a hierarchical process.** |
|
||||
| **Function Calling LLM** _(optional)_ | `function_calling_llm` | If passed, the crew will use this LLM to do function calling for tools for all agents in the crew. Each agent can have its own LLM, which overrides the crew's LLM for function calling. |
|
||||
| **Config** _(optional)_ | `config` | Optional configuration settings for the crew, in `Json` or `Dict[str, Any]` format. |
|
||||
| **Max RPM** _(optional)_ | `max_rpm` | Maximum requests per minute the crew adheres to during execution. |
|
||||
| **Language** _(optional)_ | `language` | Language used for the crew, defaults to English. |
|
||||
| **Language File** _(optional)_ | `language_file` | Path to the language file to be used for the crew. |
|
||||
| **Memory** _(optional)_ | `memory` | Utilized for storing execution memories (short-term, long-term, entity memory). |
|
||||
| **Cache** _(optional)_ | `cache` | Specifies whether to use a cache for storing the results of tools' execution. |
|
||||
| **Embedder** _(optional)_ | `embedder` | Configuration for the embedder to be used by the crew. Mostly used by memory for now. |
|
||||
| **Full Output** _(optional)_ | `full_output` | Whether the crew should return the full output with all tasks outputs or just the final output. |
|
||||
| **Step Callback** _(optional)_ | `step_callback` | A function that is called after each step of every agent. This can be used to log the agent's actions or to perform other operations; it won't override the agent-specific `step_callback`. |
|
||||
| **Task Callback** _(optional)_ | `task_callback` | A function that is called after the completion of each task. Useful for monitoring or additional operations post-task execution. |
|
||||
| **Share Crew** _(optional)_ | `share_crew` | Whether you want to share the complete crew information and execution with the crewAI team to make the library better, and allow us to train models. |
|
||||
| **Output Log File** _(optional)_ | `output_log_file` | Whether to save a file with the complete crew output and execution log. Set it to `True` to write a `logs.txt` file in the current folder, or pass a string with the full path and file name. |
| **Manager Agent** _(optional)_ | `manager_agent` | Sets a custom agent that will be used as the manager. |
| **Manager Callbacks** _(optional)_ | `manager_callbacks` | A list of callback handlers to be executed by the manager agent when a hierarchical process is used. |
|
||||
| **Prompt File** _(optional)_ | `prompt_file` | Path to the prompt JSON file to be used for the crew. |
|
||||
|
||||
!!! note "Crew Max RPM"
|
||||
The `max_rpm` attribute sets the maximum number of requests per minute the crew can perform to avoid rate limits and will override individual agents' `max_rpm` settings if you set it.
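For illustration, here is a minimal sketch of that override (the agent, task, and limit values are placeholder assumptions, not taken from the codebase):

```python
from crewai import Agent, Crew, Task

fast_agent = Agent(
    role="Researcher",
    goal="Answer questions quickly",
    backstory="A placeholder agent used only to illustrate rate limiting.",
    max_rpm=30,  # agent-level limit
)

question_task = Task(
    description="Answer a simple research question",
    expected_output="A one-sentence answer",
    agent=fast_agent,
)

rate_limited_crew = Crew(
    agents=[fast_agent],
    tasks=[question_task],
    max_rpm=10,  # crew-level limit; overrides the agent's max_rpm of 30
)
```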
|
||||
|
||||
## Creating a Crew
|
||||
|
||||
@@ -89,6 +90,57 @@ my_crew = Crew(
|
||||
)
|
||||
```
|
||||
|
||||
## Crew Output
|
||||
|
||||
!!! note "Understanding Crew Outputs"
|
||||
The output of a crew in the crewAI framework is encapsulated within the `CrewOutput` class.
|
||||
This class provides a structured way to access results of the crew's execution, including various formats such as raw strings, JSON, and Pydantic models.
|
||||
The `CrewOutput` includes the results from the final task output, token usage, and individual task outputs.
|
||||
|
||||
### Crew Output Attributes
|
||||
|
||||
| Attribute | Parameters | Type | Description |
|
||||
| :--------------- | :------------- | :------------------------- | :--------------------------------------------------------------------------------------------------- |
|
||||
| **Raw** | `raw` | `str` | The raw output of the crew. This is the default format for the output. |
|
||||
| **Pydantic** | `pydantic` | `Optional[BaseModel]` | A Pydantic model object representing the structured output of the crew. |
|
||||
| **JSON Dict** | `json_dict` | `Optional[Dict[str, Any]]` | A dictionary representing the JSON output of the crew. |
|
||||
| **Tasks Output** | `tasks_output` | `List[TaskOutput]` | A list of `TaskOutput` objects, each representing the output of a task in the crew. |
|
||||
| **Token Usage** | `token_usage` | `Dict[str, Any]` | A summary of token usage, providing insights into the language model's performance during execution. |
|
||||
|
||||
### Crew Output Methods and Properties
|
||||
|
||||
| Method/Property | Description |
|
||||
| :-------------- | :------------------------------------------------------------------------------------------------ |
|
||||
| **json** | Returns the JSON string representation of the crew output if the output format is JSON. |
|
||||
| **to_dict** | Converts the JSON and Pydantic outputs to a dictionary. |
|
||||
| **\_\_str\_\_** | Returns the string representation of the crew output, prioritizing Pydantic, then JSON, then raw. |
|
||||
|
||||
### Accessing Crew Outputs
|
||||
|
||||
Once a crew has been executed, its output can be accessed through the `output` attribute of the `Crew` object. The `CrewOutput` class provides various ways to interact with and present this output.
|
||||
|
||||
#### Example
|
||||
|
||||
```python
|
||||
import json

# Example crew execution
|
||||
crew = Crew(
|
||||
agents=[research_agent, writer_agent],
|
||||
tasks=[research_task, write_article_task],
|
||||
verbose=2
|
||||
)
|
||||
|
||||
crew_output = crew.kickoff()
|
||||
|
||||
# Accessing the crew output
|
||||
print(f"Raw Output: {crew_output.raw}")
|
||||
if crew_output.json_dict:
    print(f"JSON Output: {json.dumps(crew_output.json_dict, indent=2)}")
if crew_output.pydantic:
    print(f"Pydantic Output: {crew_output.pydantic}")
|
||||
print(f"Tasks Output: {crew_output.tasks_output}")
|
||||
print(f"Token Usage: {crew_output.token_usage}")
|
||||
```
|
||||
|
||||
## Memory Utilization
|
||||
|
||||
Crews can utilize memory (short-term, long-term, and entity memory) to enhance their execution and learning over time. This feature allows crews to store and recall execution memories, aiding in decision-making and task execution strategies.
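As a hedged sketch of enabling this (the agent, task, and embedder configuration below are illustrative assumptions; an OpenAI API key is assumed for the default embedder):

```python
from crewai import Agent, Crew, Process, Task

researcher = Agent(
    role="Researcher",
    goal="Track AI news over time",
    backstory="A placeholder analyst agent.",
)

research_task = Task(
    description="Summarize this week's AI news",
    expected_output="A short bullet-point summary",
    agent=researcher,
)

memory_crew = Crew(
    agents=[researcher],
    tasks=[research_task],
    process=Process.sequential,
    memory=True,  # store short-term, long-term, and entity memories across executions
    embedder={"provider": "openai"},  # assumed embedder config; adjust to your provider
)
```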
|
||||
@@ -156,3 +208,32 @@ for async_result in async_results:
|
||||
```
|
||||
|
||||
These methods provide flexibility in how you manage and execute tasks within your crew, allowing for both synchronous and asynchronous workflows tailored to your needs, as sketched below.
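As a brief sketch of the synchronous variants (assuming `my_crew` is defined as in the earlier example and its task descriptions interpolate a `{topic}` input):

```python
# Standard synchronous kickoff
result = my_crew.kickoff(inputs={"topic": "AI agents"})
print(result.raw)

# Run the same crew once per input set and collect one CrewOutput per run
results = my_crew.kickoff_for_each(
    inputs=[{"topic": "AI agents"}, {"topic": "LLM tooling"}]
)
for crew_output in results:
    print(crew_output.raw)
```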
|
||||
|
||||
|
||||
### Replaying from a Specific Task
|
||||
You can now replay from a specific task using the `crewai replay` CLI command.
|
||||
|
||||
The `replay_from_task` feature in CrewAI allows you to replay from a specific task using the command-line interface (CLI). By running the command `crewai replay -t <task_id>`, you specify the `task_id` for the replay process.
|
||||
|
||||
Kickoffs now save the task outputs returned by the latest kickoff locally so that you can replay from them.
|
||||
|
||||
|
||||
### Replaying from a Specific Task Using the CLI
|
||||
To use the replay feature, follow these steps:
|
||||
|
||||
1. Open your terminal or command prompt.
|
||||
2. Navigate to the directory where your CrewAI project is located.
|
||||
3. Run the following commands:
|
||||
|
||||
To view the `task_id`s from the latest kickoff, use:
|
||||
|
||||
```shell
|
||||
crewai log-tasks-outputs
|
||||
```
|
||||
|
||||
Once you have the `task_id` you want to replay from, use:
||||
```shell
|
||||
crewai replay -t <task_id>
|
||||
```
|
||||
|
||||
These commands let you replay from your latest kickoff tasks, still retaining context from previously executed tasks.
|
||||
@@ -4,27 +4,29 @@ description: Detailed guide on managing and creating tasks within the crewAI fra
|
||||
---
|
||||
|
||||
## Overview of a Task
|
||||
|
||||
!!! note "What is a Task?"
|
||||
In the crewAI framework, tasks are specific assignments completed by agents. They provide all necessary details for execution, such as a description, the agent responsible, required tools, and more, facilitating a wide range of action complexities.
|
||||
|
||||
Tasks within crewAI can be collaborative, requiring multiple agents to work together. This is managed through the task properties and orchestrated by the Crew's process, enhancing teamwork and efficiency.
|
||||
|
||||
## Task Attributes
|
||||
|
||||
| Attribute | Parameters | Description |
|
||||
| :------------------------------- | :---------------- | :------------------------------------------------------------------------------------------------------------------- |
|
||||
| **Description** | `description` | A clear, concise statement of what the task entails. |
|
||||
| **Agent** | `agent` | The agent responsible for the task, assigned either directly or by the crew's process. |
|
||||
| **Expected Output** | `expected_output` | A detailed description of what the task's completion looks like. |
|
||||
| **Tools** _(optional)_ | `tools` | The functions or capabilities the agent can utilize to perform the task. |
|
||||
| **Async Execution** _(optional)_ | `async_execution` | If set, the task executes asynchronously, allowing progression without waiting for completion. |
|
||||
| **Context** _(optional)_ | `context` | Specifies tasks whose outputs are used as context for this task. |
|
||||
| **Config** _(optional)_ | `config` | Additional configuration details for the agent executing the task, allowing further customization. |
|
||||
| **Output JSON** _(optional)_ | `output_json` | Outputs a JSON object, requiring an OpenAI client. Only one output format can be set. |
|
||||
| **Output Pydantic** _(optional)_ | `output_pydantic` | Outputs a Pydantic model object, requiring an OpenAI client. Only one output format can be set. |
|
||||
| **Output File** _(optional)_ | `output_file` | Saves the task output to a file. If used with `Output JSON` or `Output Pydantic`, specifies how the output is saved. |
|
||||
| **Output** _(optional)_ | `output` | The output of the task, containing the raw, JSON, and Pydantic output plus additional details. |
|
||||
| **Callback** _(optional)_ | `callback` | A Python callable that is executed with the task's output upon completion. |
|
||||
| **Human Input** _(optional)_ | `human_input` | Indicates if the task requires human feedback at the end, useful for tasks needing human oversight. |
|
||||
|
||||
## Creating a Task
|
||||
|
||||
@@ -35,12 +37,75 @@ from crewai import Task
|
||||
|
||||
task = Task(
|
||||
description='Find and summarize the latest and most relevant news on AI',
|
||||
agent=sales_agent
|
||||
agent=sales_agent,
|
||||
expected_output='A bullet list summary of the top 5 most important AI news',
|
||||
)
|
||||
```
|
||||
|
||||
!!! note "Task Assignment"
|
||||
Directly specify an `agent` for assignment or let the `hierarchical` CrewAI's process decide based on roles, availability, etc.
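A rough sketch of letting the hierarchical process decide (the agents, the `langchain_openai` `ChatOpenAI` wrapper, and the model name are assumptions for illustration):

```python
from crewai import Agent, Crew, Process, Task
from langchain_openai import ChatOpenAI  # assumed LLM wrapper for the manager

researcher = Agent(role="Researcher", goal="Find facts", backstory="Placeholder backstory.")
writer = Agent(role="Writer", goal="Write summaries", backstory="Placeholder backstory.")

# No agent is assigned to this task; the manager picks one at runtime
open_task = Task(
    description="Produce a short brief on the latest AI news",
    expected_output="A three-paragraph brief",
)

hierarchical_crew = Crew(
    agents=[researcher, writer],
    tasks=[open_task],
    process=Process.hierarchical,
    manager_llm=ChatOpenAI(model="gpt-4o"),  # required when using the hierarchical process
)
```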
|
||||
|
||||
## Task Output
|
||||
|
||||
!!! note "Understanding Task Outputs"
|
||||
The output of a task in the crewAI framework is encapsulated within the `TaskOutput` class. This class provides a structured way to access results of a task, including various formats such as raw strings, JSON, and Pydantic models.
|
||||
By default, the `TaskOutput` will only include the `raw` output. A `TaskOutput` will only include the `pydantic` or `json_dict` output if the original `Task` object was configured with `output_pydantic` or `output_json`, respectively.
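As an illustrative sketch (the `EventOutput` model and the agent are assumptions), setting `output_json` is what makes `json_dict` available on the resulting `TaskOutput`; `output_pydantic` would populate `pydantic` instead, and only one of the two may be set:

```python
from typing import List

from pydantic import BaseModel

from crewai import Agent, Task


class EventOutput(BaseModel):
    events: List[str]


analyst = Agent(role="Analyst", goal="List local events", backstory="Placeholder backstory.")

# After the crew runs, events_task.output.json_dict holds the parsed JSON;
# without output_json, only events_task.output.raw would be populated.
events_task = Task(
    description="List events happening in San Francisco this week",
    expected_output="A JSON object with an `events` list",
    agent=analyst,
    output_json=EventOutput,
)
```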
|
||||
|
||||
### Task Output Attributes
|
||||
|
||||
| Attribute | Parameters | Type | Description |
|
||||
| :---------------- | :-------------- | :------------------------- | :------------------------------------------------------------------------------------------------- |
|
||||
| **Description** | `description` | `str` | A brief description of the task. |
|
||||
| **Summary** | `summary` | `Optional[str]` | A short summary of the task, auto-generated from the description. |
|
||||
| **Raw** | `raw` | `str` | The raw output of the task. This is the default format for the output. |
|
||||
| **Pydantic** | `pydantic` | `Optional[BaseModel]` | A Pydantic model object representing the structured output of the task. |
|
||||
| **JSON Dict** | `json_dict` | `Optional[Dict[str, Any]]` | A dictionary representing the JSON output of the task. |
|
||||
| **Agent** | `agent` | `str` | The agent that executed the task. |
|
||||
| **Output Format** | `output_format` | `OutputFormat` | The format of the task output, with options including RAW, JSON, and Pydantic. The default is RAW. |
|
||||
|
||||
### Task Output Methods and Properties
|
||||
|
||||
| Method/Property | Description |
|
||||
| :-------------- | :------------------------------------------------------------------------------------------------ |
|
||||
| **json** | Returns the JSON string representation of the task output if the output format is JSON. |
|
||||
| **to_dict** | Converts the JSON and Pydantic outputs to a dictionary. |
|
||||
| **\_\_str\_\_** | Returns the string representation of the task output, prioritizing Pydantic, then JSON, then raw. |
|
||||
|
||||
### Accessing Task Outputs
|
||||
|
||||
Once a task has been executed, its output can be accessed through the `output` attribute of the `Task` object. The `TaskOutput` class provides various ways to interact with and present this output.
|
||||
|
||||
#### Example
|
||||
|
||||
```python
|
||||
import json

# Example task
|
||||
task = Task(
|
||||
description='Find and summarize the latest AI news',
|
||||
expected_output='A bullet list summary of the top 5 most important AI news',
|
||||
agent=research_agent,
|
||||
tools=[search_tool]
|
||||
)
|
||||
|
||||
# Execute the crew
|
||||
crew = Crew(
|
||||
agents=[research_agent],
|
||||
tasks=[task],
|
||||
verbose=2
|
||||
)
|
||||
|
||||
result = crew.kickoff()
|
||||
|
||||
# Accessing the task output
|
||||
task_output = task.output
|
||||
|
||||
print(f"Task Description: {task_output.description}")
|
||||
print(f"Task Summary: {task_output.summary}")
|
||||
print(f"Raw Output: {task_output.raw}")
|
||||
if task_output.json_dict:
    print(f"JSON Output: {json.dumps(task_output.json_dict, indent=2)}")
if task_output.pydantic:
    print(f"Pydantic Output: {task_output.pydantic}")
|
||||
```
|
||||
|
||||
## Integrating Tools with Tasks
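A minimal sketch of attaching tools to a task (assuming the separate `crewai_tools` package is installed and a Serper API key is configured):

```python
from crewai import Agent, Task
from crewai_tools import SerperDevTool

search_tool = SerperDevTool()

researcher = Agent(
    role="Researcher",
    goal="Find the latest AI news",
    backstory="Placeholder backstory.",
)

# Task-level tools define what the agent can use while executing this task,
# taking precedence over any tools set on the agent itself.
research_task = Task(
    description="Find and summarize the latest AI news",
    expected_output="A bullet list of the top 5 AI news items",
    agent=researcher,
    tools=[search_tool],
)
```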
|
||||
|
||||
|
||||
87
docs/how-to/Conditional-Tasks.md
Normal file
@@ -0,0 +1,87 @@
|
||||
---
|
||||
title: Conditional Tasks
|
||||
description: Learn how to use conditional tasks in a crewAI kickoff
|
||||
---
|
||||
|
||||
## Introduction
|
||||
|
||||
Conditional Tasks in crewAI allow for dynamic workflow adaptation based on the outcomes of previous tasks. This powerful feature enables crews to make decisions and execute tasks selectively, enhancing the flexibility and efficiency of your AI-driven processes.
|
||||
|
||||
```python
|
||||
from typing import List
|
||||
|
||||
from pydantic import BaseModel
|
||||
from crewai import Agent, Crew
|
||||
from crewai.tasks.conditional_task import ConditionalTask
|
||||
from crewai.tasks.task_output import TaskOutput
|
||||
from crewai.task import Task
|
||||
from crewai_tools import SerperDevTool
|
||||
|
||||
|
||||
# Define a condition function for the conditional task.
# If the condition returns False the task is skipped; if it returns True the task runs.
def is_data_missing(output: TaskOutput) -> bool:
    return len(output.pydantic.events) < 10  # run the task only when fewer than 10 events were fetched
|
||||
|
||||
# Define the agents
|
||||
data_fetcher_agent = Agent(
|
||||
role="Data Fetcher",
|
||||
goal="Fetch data online using Serper tool",
|
||||
backstory="Backstory 1",
|
||||
verbose=True,
|
||||
tools=[SerperDevTool()],
|
||||
)
|
||||
|
||||
data_processor_agent = Agent(
|
||||
role="Data Processor",
|
||||
goal="Process fetched data",
|
||||
backstory="Backstory 2",
|
||||
verbose=True,
|
||||
)
|
||||
|
||||
summary_generator_agent = Agent(
|
||||
role="Summary Generator",
|
||||
goal="Generate summary from fetched data",
|
||||
backstory="Backstory 3",
|
||||
verbose=True,
|
||||
)
|
||||
|
||||
|
||||
class EventOutput(BaseModel):
|
||||
events: List[str]
|
||||
|
||||
|
||||
task1 = Task(
|
||||
description="Fetch data about events in San Francisco using Serper tool",
|
||||
expected_output="List of 10 things to do in SF this week",
|
||||
agent=data_fetcher_agent,
|
||||
output_pydantic=EventOutput,
|
||||
)
|
||||
|
||||
conditional_task = ConditionalTask(
|
||||
description="""
|
||||
Check if data is missing. If we have less than 10 events,
|
||||
fetch more events using Serper tool so that
|
||||
we have a total of 10 events in SF this week.
|
||||
""",
|
||||
expected_output="List of 10 Things to do in SF this week ",
|
||||
condition=is_data_missing,
|
||||
agent=data_processor_agent,
|
||||
)
|
||||
|
||||
task3 = Task(
|
||||
description="Generate summary of events in San Francisco from fetched data",
|
||||
expected_output="summary_generated",
|
||||
agent=summary_generator_agent,
|
||||
)
|
||||
|
||||
# Create a crew with the tasks
|
||||
crew = Crew(
|
||||
agents=[data_fetcher_agent, data_processor_agent, summary_generator_agent],
|
||||
tasks=[task1, conditional_task, task3],
|
||||
verbose=2,
|
||||
)
|
||||
|
||||
result = crew.kickoff()
|
||||
print("results", result)
|
||||
```
|
||||
49
docs/how-to/Replay-tasks-from-latest-Crew-Kickoff.md
Normal file
@@ -0,0 +1,49 @@
|
||||
---
|
||||
title: Replay Tasks from Latest Crew Kickoff
|
||||
description: Replay tasks from the latest crew.kickoff(...)
|
||||
---
|
||||
|
||||
## Introduction
|
||||
CrewAI provides the ability to replay from a task specified from the latest crew kickoff. This is particularly useful when you have finished a kickoff and want to retry certain tasks without refetching data: your agents already have the context saved from the kickoff execution, so you only need to replay the tasks you want.
|
||||
|
||||
## Note:
|
||||
You must run `crew.kickoff()` before you can replay a task. Currently, only the latest kickoff is supported, so if you use `kickoff_for_each`, it will only allow you to replay from the most recent crew run.
|
||||
|
||||
Here's an example of how to replay from a task:
|
||||
|
||||
### Replaying from a Specific Task Using the CLI
|
||||
To use the replay feature, follow these steps:
|
||||
|
||||
1. Open your terminal or command prompt.
|
||||
2. Navigate to the directory where your CrewAI project is located.
|
||||
3. Run the following commands:
|
||||
|
||||
To view the `task_id`s from the latest kickoff, use:
|
||||
```shell
|
||||
crewai log-tasks-outputs
|
||||
```
|
||||
|
||||
Once you have the `task_id` you want to replay from, use:
|
||||
```shell
|
||||
crewai replay -t <task_id>
|
||||
```
|
||||
|
||||
|
||||
### Replaying from a Task Programmatically
|
||||
To replay from a task programmatically, use the following steps:
|
||||
|
||||
1. Specify the task_id and input parameters for the replay process.
|
||||
2. Execute the replay command within a try-except block to handle potential errors.
|
||||
|
||||
```python
|
||||
def replay_from_task():
    """
    Replay the crew execution from a specific task.
    """
    task_id = '<task_id>'
    inputs = {"topic": "CrewAI Training"}  # optional; if omitted, the inputs from the previous kickoff are reused
    try:
        YourCrewName_Crew().crew().replay_from_task(task_id=task_id, inputs=inputs)

    except Exception as e:
        raise Exception(f"An error occurred while replaying the crew: {e}")
|
||||
@@ -113,6 +113,16 @@ Cutting-edge framework for orchestrating role-playing, autonomous AI agents. By
|
||||
Kickoff a Crew for a List
|
||||
</a>
|
||||
</li>
|
||||
<li>
|
||||
<a href="./how-to/Replay-tasks-from-latest-Crew-Kickoff">
|
||||
Replay from a Task
|
||||
</a>
|
||||
</li>
|
||||
<li>
|
||||
<a href="./how-to/Conditional-Tasks">
|
||||
Conditional Tasks
|
||||
</a>
|
||||
</li>
|
||||
<li>
|
||||
<a href="./how-to/AgentOps-Observability">
|
||||
Agent Monitoring with AgentOps
|
||||
|
||||
@@ -145,6 +145,8 @@ nav:
|
||||
- Human Input on Execution: 'how-to/Human-Input-on-Execution.md'
|
||||
- Kickoff a Crew Asynchronously: 'how-to/Kickoff-async.md'
|
||||
- Kickoff a Crew for a List: 'how-to/Kickoff-for-each.md'
|
||||
- Replay from a specific task from a kickoff: 'how-to/Replay-tasks-from-latest-Crew-Kickoff.md'
|
||||
- Conditional Tasks: 'how-to/Conditional-Tasks.md'
|
||||
- Agent Monitoring with AgentOps: 'how-to/AgentOps-Observability.md'
|
||||
- Agent Monitoring with LangTrace: 'how-to/Langtrace-Observability.md'
|
||||
- Tools Docs:
|
||||
@@ -180,6 +182,7 @@ nav:
|
||||
- Landing Page Generator: https://github.com/joaomdmoura/crewAI-examples/tree/main/landing_page_generator"
|
||||
- Prepare for meetings: https://github.com/joaomdmoura/crewAI-examples/tree/main/prep-for-a-meeting"
|
||||
- Telemetry: 'telemetry/Telemetry.md'
|
||||
- Change Log: 'https://github.com/crewAIInc/crewAI/releases'
|
||||
|
||||
extra_css:
|
||||
- stylesheets/output.css
|
||||
|
||||
@@ -1,6 +1,7 @@
|
||||
import uuid
|
||||
from abc import ABC, abstractmethod
|
||||
from copy import copy as shallow_copy
|
||||
from hashlib import md5
|
||||
from typing import Any, Dict, List, Optional, TypeVar
|
||||
|
||||
from pydantic import (
|
||||
@@ -162,6 +163,11 @@ class BaseAgent(ABC, BaseModel):
|
||||
self._token_process = TokenProcess()
|
||||
return self
|
||||
|
||||
@property
|
||||
def key(self):
|
||||
source = [self.role, self.goal, self.backstory]
|
||||
return md5("|".join(source).encode()).hexdigest()
|
||||
|
||||
@abstractmethod
|
||||
def execute_task(
|
||||
self,
|
||||
@@ -180,7 +186,7 @@ class BaseAgent(ABC, BaseModel):
|
||||
pass
|
||||
|
||||
@abstractmethod
|
||||
def get_delegation_tools(self, agents: List["BaseAgent"]):
|
||||
def get_delegation_tools(self, agents: List["BaseAgent"]) -> List[Any]:
|
||||
"""Set the task tools that init BaseAgenTools class."""
|
||||
pass
|
||||
|
||||
|
||||
@@ -1,8 +1,14 @@
|
||||
import click
|
||||
import pkg_resources
|
||||
|
||||
from crewai.memory.storage.kickoff_task_outputs_storage import (
|
||||
KickoffTaskOutputsSQLiteStorage,
|
||||
)
|
||||
|
||||
|
||||
from .create_crew import create_crew
|
||||
from .train_crew import train_crew
|
||||
from .replay_from_task import replay_task_command
|
||||
|
||||
|
||||
@click.group()
|
||||
@@ -48,5 +54,50 @@ def train(n_iterations: int):
|
||||
train_crew(n_iterations)
|
||||
|
||||
|
||||
@crewai.command()
|
||||
@click.option(
|
||||
"-t",
|
||||
"--task_id",
|
||||
type=str,
|
||||
help="Replay the crew from this task ID, including all subsequent tasks.",
|
||||
)
|
||||
def replay(task_id: str) -> None:
|
||||
"""
|
||||
Replay the crew execution from a specific task.
|
||||
|
||||
Args:
|
||||
task_id (str): The ID of the task to replay from.
|
||||
"""
|
||||
try:
|
||||
click.echo(f"Replaying the crew from task {task_id}")
|
||||
replay_task_command(task_id)
|
||||
except Exception as e:
|
||||
click.echo(f"An error occurred while replaying: {e}", err=True)
|
||||
|
||||
|
||||
@crewai.command()
|
||||
def log_tasks_outputs() -> None:
|
||||
"""
|
||||
Retrieve your latest crew.kickoff() task outputs.
|
||||
"""
|
||||
try:
|
||||
storage = KickoffTaskOutputsSQLiteStorage()
|
||||
tasks = storage.load()
|
||||
|
||||
if not tasks:
|
||||
click.echo(
|
||||
"No task outputs found. Only crew kickoff task outputs are logged."
|
||||
)
|
||||
return
|
||||
|
||||
for index, task in enumerate(tasks, 1):
|
||||
click.echo(f"Task {index}: {task['task_id']}")
|
||||
click.echo(f"Description: {task['expected_output']}")
|
||||
click.echo("------")
|
||||
|
||||
except Exception as e:
|
||||
click.echo(f"An error occurred while logging task outputs: {e}", err=True)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
crewai()
|
||||
|
||||
24
src/crewai/cli/replay_from_task.py
Normal file
@@ -0,0 +1,24 @@
|
||||
import subprocess
|
||||
import click
|
||||
|
||||
|
||||
def replay_task_command(task_id: str) -> None:
|
||||
"""
|
||||
Replay the crew execution from a specific task.
|
||||
|
||||
Args:
|
||||
task_id (str): The ID of the task to replay from.
|
||||
"""
|
||||
command = ["poetry", "run", "replay", task_id]
|
||||
|
||||
try:
|
||||
result = subprocess.run(command, capture_output=False, text=True, check=True)
|
||||
if result.stderr:
|
||||
click.echo(result.stderr, err=True)
|
||||
|
||||
except subprocess.CalledProcessError as e:
|
||||
click.echo(f"An error occurred while replaying the task: {e}", err=True)
|
||||
click.echo(e.output, err=True)
|
||||
|
||||
except Exception as e:
|
||||
click.echo(f"An unexpected error occurred: {e}", err=True)
|
||||
@@ -21,3 +21,13 @@ def train():
|
||||
|
||||
except Exception as e:
|
||||
raise Exception(f"An error occurred while training the crew: {e}")
|
||||
|
||||
def replay_from_task():
|
||||
"""
|
||||
Replay the crew execution from a specific task.
|
||||
"""
|
||||
try:
|
||||
{{crew_name}}Crew().crew().replay_from_task(task_id=sys.argv[1])
|
||||
|
||||
except Exception as e:
|
||||
raise Exception(f"An error occurred while replaying the crew: {e}")
|
||||
|
||||
@@ -11,6 +11,7 @@ crewai = { extras = ["tools"], version = "^0.35.8" }
|
||||
[tool.poetry.scripts]
|
||||
{{folder_name}} = "{{folder_name}}.main:run"
|
||||
train = "{{folder_name}}.main:train"
|
||||
replay = "{{folder_name}}.main:replay_from_task"
|
||||
|
||||
[build-system]
|
||||
requires = ["poetry-core"]
|
||||
|
||||
@@ -2,19 +2,20 @@ import asyncio
|
||||
import json
|
||||
import uuid
|
||||
from concurrent.futures import Future
|
||||
from hashlib import md5
|
||||
from typing import Any, Dict, List, Optional, Tuple, Union
|
||||
|
||||
from langchain_core.callbacks import BaseCallbackHandler
|
||||
from pydantic import (
|
||||
UUID4,
|
||||
BaseModel,
|
||||
ConfigDict,
|
||||
Field,
|
||||
InstanceOf,
|
||||
Json,
|
||||
PrivateAttr,
|
||||
field_validator,
|
||||
model_validator,
|
||||
UUID4,
|
||||
BaseModel,
|
||||
ConfigDict,
|
||||
Field,
|
||||
InstanceOf,
|
||||
Json,
|
||||
PrivateAttr,
|
||||
field_validator,
|
||||
model_validator,
|
||||
)
|
||||
from pydantic_core import PydanticCustomError
|
||||
|
||||
@@ -27,16 +28,21 @@ from crewai.memory.long_term.long_term_memory import LongTermMemory
|
||||
from crewai.memory.short_term.short_term_memory import ShortTermMemory
|
||||
from crewai.process import Process
|
||||
from crewai.task import Task
|
||||
from crewai.tasks.conditional_task import ConditionalTask
|
||||
from crewai.tasks.task_output import TaskOutput
|
||||
from crewai.telemetry import Telemetry
|
||||
from crewai.tools.agent_tools import AgentTools
|
||||
from crewai.utilities import I18N, FileHandler, Logger, RPMController
|
||||
from crewai.utilities.constants import TRAINED_AGENTS_DATA_FILE, TRAINING_DATA_FILE
|
||||
from crewai.utilities.constants import (
|
||||
TRAINED_AGENTS_DATA_FILE,
|
||||
TRAINING_DATA_FILE,
|
||||
)
|
||||
from crewai.utilities.evaluators.task_evaluator import TaskEvaluator
|
||||
from crewai.utilities.formatter import (
|
||||
aggregate_raw_outputs_from_task_outputs,
|
||||
aggregate_raw_outputs_from_tasks,
|
||||
)
|
||||
from crewai.utilities.task_output_storage_handler import TaskOutputStorageHandler
|
||||
from crewai.utilities.training_handler import CrewTrainingHandler
|
||||
|
||||
try:
|
||||
@@ -80,6 +86,13 @@ class Crew(BaseModel):
|
||||
_entity_memory: Optional[InstanceOf[EntityMemory]] = PrivateAttr()
|
||||
_train: Optional[bool] = PrivateAttr(default=False)
|
||||
_train_iteration: Optional[int] = PrivateAttr()
|
||||
_inputs: Optional[Dict[str, Any]] = PrivateAttr(default=None)
|
||||
_logging_color: str = PrivateAttr(
|
||||
default="bold_purple",
|
||||
)
|
||||
_task_output_handler: TaskOutputStorageHandler = PrivateAttr(
|
||||
default_factory=TaskOutputStorageHandler
|
||||
)
|
||||
|
||||
cache: bool = Field(default=True)
|
||||
model_config = ConfigDict(arbitrary_types_allowed=True)
|
||||
@@ -135,6 +148,14 @@ class Crew(BaseModel):
|
||||
default=False,
|
||||
description="output_log_file",
|
||||
)
|
||||
task_execution_output_json_files: Optional[List[str]] = Field(
|
||||
default=None,
|
||||
description="List of file paths for task execution JSON files.",
|
||||
)
|
||||
execution_logs: List[Dict[str, Any]] = Field(
|
||||
default=[],
|
||||
description="List of execution logs for tasks",
|
||||
)
|
||||
|
||||
@field_validator("id", mode="before")
|
||||
@classmethod
|
||||
@@ -274,6 +295,29 @@ class Crew(BaseModel):
|
||||
|
||||
return self
|
||||
|
||||
@model_validator(mode="after")
|
||||
def validate_first_task(self) -> "Crew":
|
||||
"""Ensure the first task is not a ConditionalTask."""
|
||||
if self.tasks and isinstance(self.tasks[0], ConditionalTask):
|
||||
raise PydanticCustomError(
|
||||
"invalid_first_task",
|
||||
"The first task cannot be a ConditionalTask.",
|
||||
{},
|
||||
)
|
||||
return self
|
||||
|
||||
@model_validator(mode="after")
|
||||
def validate_async_tasks_not_async(self) -> "Crew":
|
||||
"""Ensure that ConditionalTask is not async."""
|
||||
for task in self.tasks:
|
||||
if task.async_execution and isinstance(task, ConditionalTask):
|
||||
raise PydanticCustomError(
|
||||
"invalid_async_conditional_task",
|
||||
f"Conditional Task: {task.description} , cannot be executed asynchronously.", # type: ignore # Argument of type "str" cannot be assigned to parameter "message_template" of type "LiteralString"
|
||||
{},
|
||||
)
|
||||
return self
|
||||
|
||||
@model_validator(mode="after")
|
||||
def validate_async_task_cannot_include_sequential_async_tasks_in_context(self):
|
||||
"""
|
||||
@@ -310,6 +354,13 @@ class Crew(BaseModel):
|
||||
)
|
||||
return self
|
||||
|
||||
@property
|
||||
def key(self) -> str:
|
||||
source = [agent.key for agent in self.agents] + [
|
||||
task.key for task in self.tasks
|
||||
]
|
||||
return md5("|".join(source).encode()).hexdigest()
|
||||
|
||||
def _setup_from_config(self):
|
||||
assert self.config is not None, "Config should not be None."
|
||||
|
||||
@@ -376,7 +427,11 @@ class Crew(BaseModel):
|
||||
) -> CrewOutput:
|
||||
"""Starts the crew to work on its assigned tasks."""
|
||||
self._execution_span = self._telemetry.crew_execution_span(self, inputs)
|
||||
self._task_output_handler.reset()
|
||||
self._logging_color = "bold_purple"
|
||||
|
||||
if inputs is not None:
|
||||
self._inputs = inputs
|
||||
self._interpolate_inputs(inputs)
|
||||
self._set_tasks_callbacks()
|
||||
|
||||
@@ -403,7 +458,7 @@ class Crew(BaseModel):
|
||||
if self.process == Process.sequential:
|
||||
result = self._run_sequential_process()
|
||||
elif self.process == Process.hierarchical:
|
||||
result = self._run_hierarchical_process() # type: ignore # Incompatible types in assignment (expression has type "str | dict[str, Any]", variable has type "str")
|
||||
result = self._run_hierarchical_process()
|
||||
else:
|
||||
raise NotImplementedError(
|
||||
f"The process '{self.process}' is not implemented yet."
|
||||
@@ -440,6 +495,7 @@ class Crew(BaseModel):
|
||||
results.append(output)
|
||||
|
||||
self.usage_metrics = total_usage_metrics
|
||||
self._task_output_handler.reset()
|
||||
return results
|
||||
|
||||
async def kickoff_async(self, inputs: Optional[Dict[str, Any]] = {}) -> CrewOutput:
|
||||
@@ -488,129 +544,48 @@ class Crew(BaseModel):
|
||||
total_usage_metrics[key] += crew.usage_metrics.get(key, 0)
|
||||
|
||||
self.usage_metrics = total_usage_metrics
|
||||
|
||||
self._task_output_handler.reset()
|
||||
return results
|
||||
|
||||
def _store_execution_log(
|
||||
self,
|
||||
task: Task,
|
||||
output: TaskOutput,
|
||||
task_index: int,
|
||||
was_replayed: bool = False,
|
||||
):
|
||||
if self._inputs:
|
||||
inputs = self._inputs
|
||||
else:
|
||||
inputs = {}
|
||||
|
||||
log = {
|
||||
"task": task,
|
||||
"output": {
|
||||
"description": output.description,
|
||||
"summary": output.summary,
|
||||
"raw": output.raw,
|
||||
"pydantic": output.pydantic,
|
||||
"json_dict": output.json_dict,
|
||||
"output_format": output.output_format,
|
||||
"agent": output.agent,
|
||||
},
|
||||
"task_index": task_index,
|
||||
"inputs": inputs,
|
||||
"was_replayed": was_replayed,
|
||||
}
|
||||
self._task_output_handler.update(task_index, log)
|
||||
|
||||
def _run_sequential_process(self) -> CrewOutput:
|
||||
"""Executes tasks sequentially and returns the final output."""
|
||||
task_outputs: List[TaskOutput] = []
|
||||
futures: List[Tuple[Task, Future[TaskOutput]]] = []
|
||||
return self._execute_tasks(self.tasks)
|
||||
|
||||
for task in self.tasks:
|
||||
if task.agent and task.agent.allow_delegation:
|
||||
agents_for_delegation = [
|
||||
agent for agent in self.agents if agent != task.agent
|
||||
]
|
||||
if len(self.agents) > 1 and len(agents_for_delegation) > 0:
|
||||
delegation_tools = task.agent.get_delegation_tools(
|
||||
agents_for_delegation
|
||||
)
|
||||
|
||||
# Add tools if they are not already in task.tools
|
||||
for new_tool in delegation_tools:
|
||||
# Find the index of the tool with the same name
|
||||
existing_tool_index = next(
|
||||
(
|
||||
index
|
||||
for index, tool in enumerate(task.tools or [])
|
||||
if tool.name == new_tool.name
|
||||
),
|
||||
None,
|
||||
)
|
||||
if not task.tools:
|
||||
task.tools = []
|
||||
|
||||
if existing_tool_index is not None:
|
||||
# Replace the existing tool
|
||||
task.tools[existing_tool_index] = new_tool
|
||||
else:
|
||||
# Add the new tool
|
||||
task.tools.append(new_tool)
|
||||
|
||||
role = task.agent.role if task.agent is not None else "None"
|
||||
self._logger.log("debug", f"== Working Agent: {role}", color="bold_purple")
|
||||
self._logger.log(
|
||||
"info", f"== Starting Task: {task.description}", color="bold_purple"
|
||||
)
|
||||
|
||||
if self.output_log_file:
|
||||
self._file_handler.log(
|
||||
agent=role, task=task.description, status="started"
|
||||
)
|
||||
|
||||
if task.async_execution:
|
||||
context = (
|
||||
aggregate_raw_outputs_from_tasks(task.context)
|
||||
if task.context
|
||||
else aggregate_raw_outputs_from_task_outputs(task_outputs)
|
||||
)
|
||||
future = task.execute_async(
|
||||
agent=task.agent, context=context, tools=task.tools
|
||||
)
|
||||
futures.append((task, future))
|
||||
else:
|
||||
# Before executing a synchronous task, wait for all async tasks to complete
|
||||
if futures:
|
||||
# Clear task_outputs before processing async tasks
|
||||
task_outputs = []
|
||||
for future_task, future in futures:
|
||||
task_output = future.result()
|
||||
task_outputs.append(task_output)
|
||||
self._process_task_result(future_task, task_output)
|
||||
|
||||
# Clear the futures list after processing all async results
|
||||
futures.clear()
|
||||
|
||||
context = (
|
||||
aggregate_raw_outputs_from_tasks(task.context)
|
||||
if task.context
|
||||
else aggregate_raw_outputs_from_task_outputs(task_outputs)
|
||||
)
|
||||
task_output = task.execute_sync(
|
||||
agent=task.agent, context=context, tools=task.tools
|
||||
)
|
||||
task_outputs = [task_output]
|
||||
self._process_task_result(task, task_output)
|
||||
|
||||
if futures:
|
||||
# Clear task_outputs before processing async tasks
|
||||
task_outputs = []
|
||||
for future_task, future in futures:
|
||||
task_output = future.result()
|
||||
task_outputs.append(task_output)
|
||||
self._process_task_result(future_task, task_output)
|
||||
|
||||
# Important: There should only be one task output in the list
|
||||
# If there are more or 0, something went wrong.
|
||||
if len(task_outputs) != 1:
|
||||
raise ValueError(
|
||||
"Something went wrong. Kickoff should return only one task output."
|
||||
)
|
||||
|
||||
final_task_output = task_outputs[0]
|
||||
|
||||
final_string_output = final_task_output.raw
|
||||
self._finish_execution(final_string_output)
|
||||
|
||||
token_usage = self.calculate_usage_metrics()
|
||||
|
||||
return CrewOutput(
|
||||
raw=final_task_output.raw,
|
||||
pydantic=final_task_output.pydantic,
|
||||
json_dict=final_task_output.json_dict,
|
||||
tasks_output=[task.output for task in self.tasks if task.output],
|
||||
token_usage=token_usage,
|
||||
)
|
||||
|
||||
def _process_task_result(self, task: Task, output: TaskOutput) -> None:
|
||||
role = task.agent.role if task.agent is not None else "None"
|
||||
self._logger.log("debug", f"== [{role}] Task output: {output}\n\n")
|
||||
if self.output_log_file:
|
||||
self._file_handler.log(agent=role, task=output, status="completed")
|
||||
|
||||
# TODO: @joao, Breaking change. Changed return type. Usage metrics is included in crewoutput
|
||||
def _run_hierarchical_process(self) -> CrewOutput:
|
||||
"""Creates and assigns a manager agent to make sure the crew completes the tasks."""
|
||||
self._create_manager_agent()
|
||||
return self._execute_tasks(self.tasks, self.manager_agent)
|
||||
|
||||
def _create_manager_agent(self):
|
||||
i18n = I18N(prompt_file=self.prompt_file)
|
||||
if self.manager_agent is not None:
|
||||
self.manager_agent.allow_delegation = True
|
||||
@@ -629,74 +604,179 @@ class Crew(BaseModel):
|
||||
)
|
||||
self.manager_agent = manager
|
||||
|
||||
def _execute_tasks(
|
||||
self,
|
||||
tasks: List[Task],
|
||||
manager: Optional[BaseAgent] = None,
|
||||
start_index: Optional[int] = 0,
|
||||
was_replayed: bool = False,
|
||||
) -> CrewOutput:
|
||||
"""Executes tasks sequentially and returns the final output.
|
||||
|
||||
Args:
|
||||
tasks (List[Task]): List of tasks to execute
|
||||
manager (Optional[BaseAgent], optional): Manager agent to use for delegation. Defaults to None.
|
||||
|
||||
Returns:
|
||||
CrewOutput: Final output of the crew
|
||||
"""
|
||||
|
||||
task_outputs: List[TaskOutput] = []
|
||||
futures: List[Tuple[Task, Future[TaskOutput]]] = []
|
||||
futures: List[Tuple[Task, Future[TaskOutput], int]] = []
|
||||
last_sync_output: Optional[TaskOutput] = None
|
||||
|
||||
# TODO: IF USER OVERRIDE THE CONTEXT, PASS THAT
|
||||
for task in self.tasks:
|
||||
self._logger.log("debug", f"Working Agent: {manager.role}")
|
||||
self._logger.log("info", f"Starting Task: {task.description}")
|
||||
for task_index, task in enumerate(tasks):
|
||||
if start_index is not None and task_index < start_index:
|
||||
if task.output:
|
||||
if task.async_execution:
|
||||
task_outputs.append(task.output)
|
||||
else:
|
||||
task_outputs = [task.output]
|
||||
last_sync_output = task.output
|
||||
continue
|
||||
|
||||
if self.output_log_file:
|
||||
self._file_handler.log(
|
||||
agent=manager.role, task=task.description, status="started"
|
||||
self._prepare_task(task, manager)
|
||||
if self.process == Process.hierarchical:
|
||||
agent_to_use = manager
|
||||
else:
|
||||
agent_to_use = task.agent
|
||||
if agent_to_use is None:
|
||||
raise ValueError(
|
||||
f"No agent available for task: {task.description}. Ensure that either the task has an assigned agent or a manager agent is provided."
|
||||
)
|
||||
self._log_task_start(task, agent_to_use)
|
||||
|
||||
if isinstance(task, ConditionalTask):
|
||||
skipped_task_output = self._handle_conditional_task(
|
||||
task, task_outputs, futures, task_index, was_replayed
|
||||
)
|
||||
if skipped_task_output:
|
||||
continue
|
||||
|
||||
if task.async_execution:
|
||||
context = (
|
||||
aggregate_raw_outputs_from_tasks(task.context)
|
||||
if task.context
|
||||
else aggregate_raw_outputs_from_task_outputs(task_outputs)
|
||||
context = self._get_context(
|
||||
task, [last_sync_output] if last_sync_output else []
|
||||
)
|
||||
future = task.execute_async(
|
||||
agent=manager, context=context, tools=manager.tools
|
||||
agent=agent_to_use,
|
||||
context=context,
|
||||
tools=agent_to_use.tools,
|
||||
)
|
||||
futures.append((task, future))
|
||||
futures.append((task, future, task_index))
|
||||
else:
|
||||
# Before executing a synchronous task, wait for all async tasks to complete
|
||||
if futures:
|
||||
# Clear task_outputs before processing async tasks
|
||||
task_outputs = []
|
||||
for future_task, future in futures:
|
||||
task_output = future.result()
|
||||
task_outputs.append(task_output)
|
||||
self._process_task_result(future_task, task_output)
|
||||
|
||||
# Clear the futures list after processing all async results
|
||||
task_outputs = self._process_async_tasks(futures, was_replayed)
|
||||
futures.clear()
|
||||
|
||||
context = (
|
||||
aggregate_raw_outputs_from_tasks(task.context)
|
||||
if task.context
|
||||
else aggregate_raw_outputs_from_task_outputs(task_outputs)
|
||||
)
|
||||
context = self._get_context(task, task_outputs)
|
||||
task_output = task.execute_sync(
|
||||
agent=manager, context=context, tools=manager.tools
|
||||
agent=agent_to_use,
|
||||
context=context,
|
||||
tools=agent_to_use.tools,
|
||||
)
|
||||
task_outputs = [task_output]
|
||||
self._process_task_result(task, task_output)
|
||||
self._store_execution_log(task, task_output, task_index, was_replayed)
|
||||
|
||||
# Process any remaining async results
|
||||
if futures:
|
||||
# Clear task_outputs before processing async tasks
|
||||
task_outputs = []
|
||||
for future_task, future in futures:
|
||||
task_output = future.result()
|
||||
task_outputs.append(task_output)
|
||||
self._process_task_result(future_task, task_output)
|
||||
task_outputs = self._process_async_tasks(futures, was_replayed)
|
||||
|
||||
# Important: There should only be one task output in the list
|
||||
# If there are more or 0, something went wrong.
|
||||
return self._create_crew_output(task_outputs)
|
||||
|
||||
def _handle_conditional_task(
|
||||
self,
|
||||
task: ConditionalTask,
|
||||
task_outputs: List[TaskOutput],
|
||||
futures: List[Tuple[Task, Future[TaskOutput], int]],
|
||||
task_index: int,
|
||||
was_replayed: bool,
|
||||
) -> Optional[TaskOutput]:
|
||||
if futures:
|
||||
task_outputs = self._process_async_tasks(futures, was_replayed)
|
||||
futures.clear()
|
||||
|
||||
previous_output = task_outputs[task_index - 1] if task_outputs else None
|
||||
if previous_output is not None and not task.should_execute(previous_output):
|
||||
self._logger.log(
|
||||
"debug",
|
||||
f"Skipping conditional task: {task.description}",
|
||||
color="yellow",
|
||||
)
|
||||
skipped_task_output = task.get_skipped_task_output()
|
||||
|
||||
if not was_replayed:
|
||||
self._store_execution_log(task, skipped_task_output, task_index)
|
||||
return skipped_task_output
|
||||
return None
|
||||
|
||||
def _prepare_task(self, task: Task, manager: Optional[BaseAgent]):
|
||||
if self.process == Process.hierarchical:
|
||||
self._update_manager_tools(task, manager)
|
||||
elif task.agent and task.agent.allow_delegation:
|
||||
self._add_delegation_tools(task)
|
||||
|
||||
def _add_delegation_tools(self, task: Task):
|
||||
agents_for_delegation = [agent for agent in self.agents if agent != task.agent]
|
||||
if len(self.agents) > 1 and len(agents_for_delegation) > 0 and task.agent:
|
||||
delegation_tools = task.agent.get_delegation_tools(agents_for_delegation)
|
||||
|
||||
# Add tools if they are not already in task.tools
|
||||
for new_tool in delegation_tools:
|
||||
# Find the index of the tool with the same name
|
||||
existing_tool_index = next(
|
||||
(
|
||||
index
|
||||
for index, tool in enumerate(task.tools or [])
|
||||
if tool.name == new_tool.name
|
||||
),
|
||||
None,
|
||||
)
|
||||
if not task.tools:
|
||||
task.tools = []
|
||||
|
||||
if existing_tool_index is not None:
|
||||
# Replace the existing tool
|
||||
task.tools[existing_tool_index] = new_tool
|
||||
else:
|
||||
# Add the new tool
|
||||
task.tools.append(new_tool)
|
||||
|
||||
def _log_task_start(self, task: Task, agent: Optional[BaseAgent]):
|
||||
color = self._logging_color
|
||||
role = agent.role if agent else "None"
|
||||
self._logger.log("debug", f"== Working Agent: {role}", color=color)
|
||||
self._logger.log("info", f"== Starting Task: {task.description}", color=color)
|
||||
if self.output_log_file:
|
||||
self._file_handler.log(agent=role, task=task.description, status="started")
|
||||
|
||||
def _update_manager_tools(self, task: Task, manager: Optional[BaseAgent]):
|
||||
if task.agent and manager:
|
||||
manager.tools = task.agent.get_delegation_tools([task.agent])
|
||||
if manager:
|
||||
manager.tools = manager.get_delegation_tools(self.agents)
|
||||
|
||||
def _get_context(self, task: Task, task_outputs: List[TaskOutput]):
|
||||
context = (
|
||||
aggregate_raw_outputs_from_tasks(task.context)
|
||||
if task.context
|
||||
else aggregate_raw_outputs_from_task_outputs(task_outputs)
|
||||
)
|
||||
return context
|
||||
|
||||
def _process_task_result(self, task: Task, output: TaskOutput) -> None:
|
||||
role = task.agent.role if task.agent is not None else "None"
|
||||
self._logger.log("debug", f"== [{role}] Task output: {output}\n\n")
|
||||
if self.output_log_file:
|
||||
self._file_handler.log(agent=role, task=output, status="completed")
|
||||
|
||||
def _create_crew_output(self, task_outputs: List[TaskOutput]) -> CrewOutput:
|
||||
if len(task_outputs) != 1:
|
||||
raise ValueError(
|
||||
"Something went wrong. Kickoff should return only one task output."
|
||||
)
|
||||
|
||||
final_task_output = task_outputs[0]
|
||||
|
||||
final_string_output = final_task_output.raw
|
||||
self._finish_execution(final_string_output)
|
||||
|
||||
token_usage = self.calculate_usage_metrics()
|
||||
|
||||
return CrewOutput(
|
||||
@@ -707,6 +787,74 @@ class Crew(BaseModel):
|
||||
token_usage=token_usage,
|
||||
)
|
||||
|
||||
def _process_async_tasks(
|
||||
self,
|
||||
futures: List[Tuple[Task, Future[TaskOutput], int]],
|
||||
was_replayed: bool = False,
|
||||
) -> List[TaskOutput]:
|
||||
task_outputs = []
|
||||
for future_task, future, task_index in futures:
|
||||
task_output = future.result()
|
||||
task_outputs.append(task_output)
|
||||
self._process_task_result(future_task, task_output)
|
||||
self._store_execution_log(
|
||||
future_task, task_output, task_index, was_replayed
|
||||
)
|
||||
return task_outputs
|
||||
|
||||
def _find_task_index(
|
||||
self, task_id: str, stored_outputs: List[Any]
|
||||
) -> Optional[int]:
|
||||
return next(
|
||||
(
|
||||
index
|
||||
for (index, d) in enumerate(stored_outputs)
|
||||
if d["task_id"] == str(task_id)
|
||||
),
|
||||
None,
|
||||
)
|
||||
|
||||
def replay_from_task(
|
||||
self, task_id: str, inputs: Optional[Dict[str, Any]] = None
|
||||
) -> CrewOutput:
|
||||
stored_outputs = self._task_output_handler.load()
|
||||
if not stored_outputs:
|
||||
raise ValueError(f"Task with id {task_id} not found in the crew's tasks.")
|
||||
|
||||
start_index = self._find_task_index(task_id, stored_outputs)
|
||||
|
||||
if start_index is None:
|
||||
raise ValueError(f"Task with id {task_id} not found in the crew's tasks.")
|
||||
|
||||
replay_inputs = (
|
||||
inputs if inputs is not None else stored_outputs[start_index]["inputs"]
|
||||
)
|
||||
self._inputs = replay_inputs
|
||||
|
||||
if replay_inputs:
|
||||
self._interpolate_inputs(replay_inputs)
|
||||
|
||||
if self.process == Process.hierarchical:
|
||||
self._create_manager_agent()
|
||||
|
||||
for i in range(start_index):
|
||||
stored_output = stored_outputs[i][
|
||||
"output"
|
||||
] # for adding context to the task
|
||||
task_output = TaskOutput(
|
||||
description=stored_output["description"],
|
||||
agent=stored_output["agent"],
|
||||
raw=stored_output["raw"],
|
||||
pydantic=stored_output["pydantic"],
|
||||
json_dict=stored_output["json_dict"],
|
||||
output_format=stored_output["output_format"],
|
||||
)
|
||||
self.tasks[i].output = task_output
|
||||
|
||||
self._logging_color = "bold_blue"
|
||||
result = self._execute_tasks(self.tasks, self.manager_agent, start_index, True)
|
||||
return result
|
||||
|
||||
def copy(self):
|
||||
"""Create a deep copy of the Crew."""
|
||||
|
||||
|
||||
@@ -24,18 +24,6 @@ class CrewOutput(BaseModel):
        description="Processed token summary", default={}
    )

    # TODO: Joao - Adding this safety check breakes when people want to see
    # The full output of a CrewOutput.
    # @property
    # def pydantic(self) -> Optional[BaseModel]:
    #     # Check if the final task output included a pydantic model
    #     if self.tasks_output[-1].output_format != OutputFormat.PYDANTIC:
    #         raise ValueError(
    #             "No pydantic model found in the final task. Please make sure to set the output_pydantic property in the final task in your crew."
    #         )

    #     return self._pydantic

    @property
    def json(self) -> Optional[str]:
        if self.tasks_output[-1].output_format != OutputFormat.JSON:
@@ -46,11 +34,13 @@ class CrewOutput(BaseModel):
        return json.dumps(self.json_dict)

    def to_dict(self) -> Dict[str, Any]:
        """Convert json_output and pydantic_output to a dictionary."""
        output_dict = {}
        if self.json_dict:
            return self.json_dict
        if self.pydantic:
            return self.pydantic.model_dump()
        raise ValueError("No output to convert to dictionary")
            output_dict.update(self.json_dict)
        elif self.pydantic:
            output_dict.update(self.pydantic.model_dump())
        return output_dict

    def __str__(self):
        if self.pydantic:
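With this change, CrewOutput.to_dict() no longer raises when neither JSON nor a pydantic model was produced; it merges whichever is present into a plain dict. A brief, hypothetical continuation of the sketch above, reusing that crew:

output = crew.kickoff()
print(output.to_dict())  # json_dict if set, else pydantic.model_dump(), else {}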
src/crewai/memory/storage/kickoff_task_outputs_storage.py (new file, 166 lines)
@@ -0,0 +1,166 @@
import json
import sqlite3
from typing import Any, Dict, List, Optional

from crewai.task import Task
from crewai.utilities import Printer
from crewai.utilities.crew_json_encoder import CrewJSONEncoder
from crewai.utilities.paths import db_storage_path


class KickoffTaskOutputsSQLiteStorage:
    """
    An updated SQLite storage class for kickoff task outputs storage.
    """

    def __init__(
        self, db_path: str = f"{db_storage_path()}/latest_kickoff_task_outputs.db"
    ) -> None:
        self.db_path = db_path
        self._printer: Printer = Printer()
        self._initialize_db()

    def _initialize_db(self):
        """
        Initializes the SQLite database and creates LTM table
        """
        try:
            with sqlite3.connect(self.db_path) as conn:
                cursor = conn.cursor()
                cursor.execute(
                    """
                    CREATE TABLE IF NOT EXISTS latest_kickoff_task_outputs (
                        task_id TEXT PRIMARY KEY,
                        expected_output TEXT,
                        output JSON,
                        task_index INTEGER,
                        inputs JSON,
                        was_replayed BOOLEAN,
                        timestamp DATETIME DEFAULT CURRENT_TIMESTAMP
                    )
                    """
                )

                conn.commit()
        except sqlite3.Error as e:
            self._printer.print(
                content=f"SAVING KICKOFF TASK OUTPUTS ERROR: An error occurred during database initialization: {e}",
                color="red",
            )

    def add(
        self,
        task: Task,
        output: Dict[str, Any],
        task_index: int,
        was_replayed: bool = False,
        inputs: Dict[str, Any] = {},
    ):
        try:
            with sqlite3.connect(self.db_path) as conn:
                cursor = conn.cursor()
                cursor.execute(
                    """
                    INSERT OR REPLACE INTO latest_kickoff_task_outputs
                    (task_id, expected_output, output, task_index, inputs, was_replayed)
                    VALUES (?, ?, ?, ?, ?, ?)
                    """,
                    (
                        str(task.id),
                        task.expected_output,
                        json.dumps(output, cls=CrewJSONEncoder),
                        task_index,
                        json.dumps(inputs),
                        was_replayed,
                    ),
                )
                conn.commit()
        except sqlite3.Error as e:
            self._printer.print(
                content=f"SAVING KICKOFF TASK OUTPUTS ERROR: An error occurred during database initialization: {e}",
                color="red",
            )

    def update(
        self,
        task_index: int,
        **kwargs,
    ):
        """
        Updates an existing row in the latest_kickoff_task_outputs table based on task_index.
        """
        try:
            with sqlite3.connect(self.db_path) as conn:
                cursor = conn.cursor()

                fields = []
                values = []
                for key, value in kwargs.items():
                    fields.append(f"{key} = ?")
                    values.append(
                        json.dumps(value, cls=CrewJSONEncoder)
                        if isinstance(value, dict)
                        else value
                    )

                query = f"UPDATE latest_kickoff_task_outputs SET {', '.join(fields)} WHERE task_index = ?"
                values.append(task_index)

                cursor.execute(query, tuple(values))
                conn.commit()

                if cursor.rowcount == 0:
                    self._printer.print(
                        f"No row found with task_index {task_index}. No update performed.",
                        color="red",
                    )
        except sqlite3.Error as e:
            self._printer.print(f"UPDATE KICKOFF TASK OUTPUTS ERROR: {e}", color="red")

    def load(self) -> Optional[List[Dict[str, Any]]]:
        try:
            with sqlite3.connect(self.db_path) as conn:
                cursor = conn.cursor()
                cursor.execute("""
                    SELECT *
                    FROM latest_kickoff_task_outputs
                    ORDER BY task_index
                """)

                rows = cursor.fetchall()
                results = []
                for row in rows:
                    result = {
                        "task_id": row[0],
                        "expected_output": row[1],
                        "output": json.loads(row[2]),
                        "task_index": row[3],
                        "inputs": json.loads(row[4]),
                        "was_replayed": row[5],
                        "timestamp": row[6],
                    }
                    results.append(result)

                return results

        except sqlite3.Error as e:
            self._printer.print(
                content=f"LOADING KICKOFF TASK OUTPUTS ERROR: An error occurred while querying kickoff task outputs: {e}",
                color="red",
            )
            return None

    def delete_all(self):
        """
        Deletes all rows from the latest_kickoff_task_outputs table.
        """
        try:
            with sqlite3.connect(self.db_path) as conn:
                cursor = conn.cursor()
                cursor.execute("DELETE FROM latest_kickoff_task_outputs")
                conn.commit()
        except sqlite3.Error as e:
            self._printer.print(
                content=f"ERROR: Failed to delete all kickoff task outputs: {e}",
                color="red",
            )
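A minimal usage sketch of the storage class above (illustrative, not part of the diff; the db_path and the example task are hypothetical, and by default the database lives under db_storage_path()):

from crewai.memory.storage.kickoff_task_outputs_storage import (
    KickoffTaskOutputsSQLiteStorage,
)
from crewai.task import Task

storage = KickoffTaskOutputsSQLiteStorage(db_path="/tmp/example_kickoff_outputs.db")
task = Task(description="Say Hi", expected_output="Hi")

# Rows are keyed by task_id and come back ordered by task_index.
storage.add(task, output={"raw": "Hi"}, task_index=0, inputs={"topic": "greeting"})
print(storage.load())
storage.delete_all()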
@@ -5,8 +5,10 @@ import threading
import uuid
from concurrent.futures import Future
from copy import copy
from hashlib import md5
from typing import Any, Dict, List, Optional, Tuple, Type, Union

from langchain_openai import ChatOpenAI
from opentelemetry.trace import Span
from pydantic import UUID4, BaseModel, Field, field_validator, model_validator
@@ -173,6 +175,14 @@ class Task(BaseModel):
        """Execute the task synchronously."""
        return self._execute_core(agent, context, tools)

    @property
    def key(self) -> str:
        description = self._original_description or self.description
        expected_output = self._original_expected_output or self.expected_output
        source = [description, expected_output]

        return md5("|".join(source).encode()).hexdigest()

    def execute_async(
        self,
        agent: BaseAgent | None = None,
@@ -204,6 +214,7 @@ class Task(BaseModel):
        tools: Optional[List[Any]],
    ) -> TaskOutput:
        """Run the core execution logic of the task."""
        self.agent = agent
        agent = agent or self.agent
        if not agent:
            raise Exception(
@@ -237,14 +248,16 @@ class Task(BaseModel):
            self.callback(self.output)

        if self._execution_span:
            self._telemetry.task_ended(self._execution_span, self)
            self._telemetry.task_ended(self._execution_span, self, agent.crew)
            self._execution_span = None

        if self.output_file:
            content = (
                json_output
                if json_output
                else pydantic_output.model_dump_json() if pydantic_output else result
                else pydantic_output.model_dump_json()
                if pydantic_output
                else result
            )
            self._save_file(content)

@@ -378,7 +391,7 @@ class Task(BaseModel):
    def _convert_with_instructions(
        self, result: str, model: Type[BaseModel]
    ) -> Union[dict, BaseModel, str]:
        llm = self.agent.function_calling_llm or self.agent.llm
        llm = self.agent.function_calling_llm or self.agent.llm  # type: ignore # Item "None" of "BaseAgent | None" has no attribute "function_calling_llm"
        instructions = self._get_conversion_instructions(model, llm)

        converter = self._create_converter(
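The new key property hashes the task's original description and expected_output, joined by "|". An illustrative check (not part of the diff; assumes crewai is installed):

from hashlib import md5

from crewai import Task

task = Task(description="What is 3 times 4?", expected_output="The result of the multiplication.")
assert task.key == md5("What is 3 times 4?|The result of the multiplication.".encode()).hexdigest()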
src/crewai/tasks/conditional_task.py (new file, 47 lines)
@@ -0,0 +1,47 @@
from typing import Any, Callable

from pydantic import Field

from crewai.task import Task
from crewai.tasks.output_format import OutputFormat
from crewai.tasks.task_output import TaskOutput


class ConditionalTask(Task):
    """
    A task that can be conditionally executed based on the output of another task.
    Note: This cannot be the only task you have in your crew and cannot be the first since its needs context from the previous task.
    """

    condition: Callable[[TaskOutput], bool] = Field(
        default=None,
        description="Maximum number of retries for an agent to execute a task when an error occurs.",
    )

    def __init__(
        self,
        condition: Callable[[Any], bool],
        **kwargs,
    ):
        super().__init__(**kwargs)
        self.condition = condition

    def should_execute(self, context: TaskOutput) -> bool:
        """
        Determines whether the conditional task should be executed based on the provided context.

        Args:
            context (Any): The context or output from the previous task that will be evaluated by the condition.

        Returns:
            bool: True if the task should be executed, False otherwise.
        """
        return self.condition(context)

    def get_skipped_task_output(self):
        return TaskOutput(
            description=self.description,
            raw="",
            agent=self.agent.role if self.agent else "",
            output_format=OutputFormat.RAW,
        )
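A hypothetical sketch of how a ConditionalTask is wired into a crew (not part of the diff; agent and task definitions are illustrative, and the conditional task only runs when the previous task's output satisfies the condition):

from crewai import Agent, Crew, Task
from crewai.tasks.conditional_task import ConditionalTask

researcher = Agent(role="Researcher", goal="Research AI", backstory="Expert researcher")

collect = Task(description="Collect AI news", expected_output="A list of items", agent=researcher)
summarize = ConditionalTask(
    description="Summarize the collected items",
    expected_output="A short summary",
    agent=researcher,
    condition=lambda output: len(output.raw) > 0,  # skipped when nothing was collected
)

crew = Crew(agents=[researcher], tasks=[collect, summarize])
# crew.kickoff()  # when the condition is False, get_skipped_task_output() is used instead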
@@ -11,9 +11,7 @@ class TaskOutput(BaseModel):

    description: str = Field(description="Description of the task")
    summary: Optional[str] = Field(description="Summary of the task", default=None)
    raw: str = Field(
        description="Raw output of the task", default=""
    )  # TODO: @joao: breaking change, by renaming raw_output to raw, but now consistent with CrewOutput
    raw: str = Field(description="Raw output of the task", default="")
    pydantic: Optional[BaseModel] = Field(
        description="Pydantic output of task", default=None
    )
@@ -32,22 +30,6 @@ class TaskOutput(BaseModel):
        self.summary = f"{excerpt}..."
        return self

    # TODO: Joao - Adding this safety check breakes when people want to see
    # The full output of a TaskOutput or CrewOutput.
    # @property
    # def pydantic(self) -> Optional[BaseModel]:
    #     # Check if the final task output included a pydantic model
    #     if self.output_format != OutputFormat.PYDANTIC:
    #         raise ValueError(
    #             """
    #             Invalid output format requested.
    #             If you would like to access the pydantic model,
    #             please make sure to set the output_pydantic property for the task.
    #             """
    #         )

    #     return self._pydantic

    @property
    def json(self) -> Optional[str]:
        if self.output_format != OutputFormat.JSON:
@@ -66,7 +48,7 @@ class TaskOutput(BaseModel):
        output_dict = {}
        if self.json_dict:
            output_dict.update(self.json_dict)
        if self.pydantic:
        elif self.pydantic:
            output_dict.update(self.pydantic.model_dump())
        return output_dict
@@ -92,13 +92,8 @@ class Telemetry:
                    pkg_resources.get_distribution("crewai").version,
                )
                self._add_attribute(span, "python_version", platform.python_version())
                self._add_attribute(span, "crew_key", crew.key)
                self._add_attribute(span, "crew_id", str(crew.id))

                if crew.share_crew:
                    self._add_attribute(
                        span, "crew_inputs", json.dumps(inputs) if inputs else None
                    )

                self._add_attribute(span, "crew_process", crew.process)
                self._add_attribute(span, "crew_memory", crew.memory)
                self._add_attribute(span, "crew_number_of_tasks", len(crew.tasks))
@@ -109,6 +104,7 @@ class Telemetry:
                    json.dumps(
                        [
                            {
                                "key": agent.key,
                                "id": str(agent.id),
                                "role": agent.role,
                                "goal": agent.goal,
@@ -133,12 +129,14 @@ class Telemetry:
                    json.dumps(
                        [
                            {
                                "key": task.key,
                                "id": str(task.id),
                                "description": task.description,
                                "expected_output": task.expected_output,
                                "async_execution?": task.async_execution,
                                "human_input?": task.human_input,
                                "agent_role": task.agent.role if task.agent else "None",
                                "agent_key": task.agent.key if task.agent else None,
                                "context": (
                                    [task.description for task in task.context]
                                    if task.context
@@ -157,6 +155,12 @@ class Telemetry:
                self._add_attribute(span, "platform_system", platform.system())
                self._add_attribute(span, "platform_version", platform.version())
                self._add_attribute(span, "cpus", os.cpu_count())

                if crew.share_crew:
                    self._add_attribute(
                        span, "crew_inputs", json.dumps(inputs) if inputs else None
                    )

                span.set_status(Status(StatusCode.OK))
                span.end()
            except Exception:
@@ -170,8 +174,9 @@ class Telemetry:

                created_span = tracer.start_span("Task Created")

                self._add_attribute(created_span, "crew_key", crew.key)
                self._add_attribute(created_span, "crew_id", str(crew.id))
                self._add_attribute(created_span, "task_index", crew.tasks.index(task))
                self._add_attribute(created_span, "task_key", task.key)
                self._add_attribute(created_span, "task_id", str(task.id))

                if crew.share_crew:
@@ -187,8 +192,9 @@ class Telemetry:

                span = tracer.start_span("Task Execution")

                self._add_attribute(span, "crew_key", crew.key)
                self._add_attribute(span, "crew_id", str(crew.id))
                self._add_attribute(span, "task_index", crew.tasks.index(task))
                self._add_attribute(span, "task_key", task.key)
                self._add_attribute(span, "task_id", str(task.id))

                if crew.share_crew:
@@ -203,13 +209,16 @@ class Telemetry:

        return None

    def task_ended(self, span: Span, task: Task):
    def task_ended(self, span: Span, task: Task, crew: Crew):
        """Records task execution in a crew."""
        if self.ready:
            try:
                self._add_attribute(
                    span, "output", task.output.raw_output if task.output else ""
                )
                if crew.share_crew:
                    self._add_attribute(
                        span,
                        "task_output",
                        task.output.raw if task.output else "",
                    )

                span.set_status(Status(StatusCode.OK))
                span.end()
@@ -284,10 +293,10 @@ class Telemetry:
        """Records the complete execution of a crew.
        This is only collected if the user has opted-in to share the crew.
        """
        self.crew_creation(crew, inputs)

        if (self.ready) and (crew.share_crew):
            try:
                self.crew_creation(crew, inputs)

                tracer = trace.get_tracer("crewai.telemetry")
                span = tracer.start_span("Crew Execution")
                self._add_attribute(
@@ -295,6 +304,7 @@ class Telemetry:
                    "crewai_version",
                    pkg_resources.get_distribution("crewai").version,
                )
                self._add_attribute(span, "crew_key", crew.key)
                self._add_attribute(span, "crew_id", str(crew.id))
                self._add_attribute(
                    span, "crew_inputs", json.dumps(inputs) if inputs else None
@@ -305,6 +315,7 @@ class Telemetry:
                    json.dumps(
                        [
                            {
                                "key": agent.key,
                                "id": str(agent.id),
                                "role": agent.role,
                                "goal": agent.goal,
@@ -335,6 +346,7 @@ class Telemetry:
                                "async_execution?": task.async_execution,
                                "human_input?": task.human_input,
                                "agent_role": task.agent.role if task.agent else "None",
                                "agent_key": task.agent.key if task.agent else None,
                                "context": (
                                    [task.description for task in task.context]
                                    if task.context
@@ -151,16 +151,12 @@ class ToolUsage:
                            for k, v in calling.arguments.items()
                            if k in acceptable_args
                        }
                        result = tool._run(**arguments)
                        result = tool.invoke(input=arguments)
                    except Exception:
                        if tool.args_schema:
                            arguments = calling.arguments
                            result = tool._run(**arguments)
                        else:
                            arguments = calling.arguments.values()  # type: ignore # Incompatible types in assignment (expression has type "dict_values[str, Any]", variable has type "dict[str, Any]")
                            result = tool._run(*arguments)
                            arguments = calling.arguments
                            result = tool.invoke(input=arguments)
                else:
                    result = tool._run()
                    result = tool.invoke(input={})
            except Exception as e:
                self._run_attempts += 1
                if self._run_attempts > self._max_parsing_attempts:
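The hunk above swaps the private tool._run(...) calls for the public invoke interface. A standalone sketch using a hypothetical LangChain tool (not from the diff; assumes langchain_core is installed):

from langchain_core.tools import tool

@tool
def multiplier(first_number: int, second_number: int) -> float:
    """Useful for when you need to multiply two numbers together."""
    return first_number * second_number

# Previously: multiplier._run(first_number=3, second_number=4)
print(multiplier.invoke(input={"first_number": 3, "second_number": 4}))  # 12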
src/crewai/utilities/crew_json_encoder.py (new file, 31 lines)
@@ -0,0 +1,31 @@
from datetime import datetime
import json
from uuid import UUID
from pydantic import BaseModel


class CrewJSONEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, BaseModel):
            return self._handle_pydantic_model(obj)
        elif isinstance(obj, UUID):
            return str(obj)

        elif isinstance(obj, datetime):
            return obj.isoformat()
        return super().default(obj)

    def _handle_pydantic_model(self, obj):
        try:
            data = obj.model_dump()
            # Remove circular references
            for key, value in data.items():
                if isinstance(value, BaseModel):
                    data[key] = str(
                        value
                    )  # Convert nested models to string representation
            return data
        except RecursionError:
            return str(
                obj
            )  # Fall back to string representation if circular reference is detected
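A minimal usage sketch of the encoder above (illustrative, not part of the diff; the model and field names are made up):

import json
from datetime import datetime
from uuid import uuid4

from pydantic import BaseModel

from crewai.utilities.crew_json_encoder import CrewJSONEncoder

class Example(BaseModel):
    name: str = "demo"

payload = {"id": uuid4(), "created": datetime.now(), "model": Example()}
print(json.dumps(payload, cls=CrewJSONEncoder))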
@@ -1,5 +1,7 @@
import os
import pickle

from datetime import datetime
@@ -8,6 +8,10 @@ class Printer:
            self._print_bold_green(content)
        elif color == "bold_purple":
            self._print_bold_purple(content)
        elif color == "bold_blue":
            self._print_bold_blue(content)
        elif color == "yellow":
            self._print_yellow(content)
        else:
            print(content)

@@ -22,3 +26,9 @@ class Printer:

    def _print_red(self, content):
        print("\033[91m {}\033[00m".format(content))

    def _print_bold_blue(self, content):
        print("\033[1m\033[94m {}\033[00m".format(content))

    def _print_yellow(self, content):
        print("\033[93m {}\033[00m".format(content))
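A tiny usage sketch of the new colours (illustrative, not part of the diff; assumes crewai is installed):

from crewai.utilities import Printer

printer = Printer()
printer.print("Replaying from task 1", color="bold_blue")
printer.print("Cache disabled for this run", color="yellow")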
src/crewai/utilities/task_output_storage_handler.py (new file, 61 lines)
@@ -0,0 +1,61 @@
from pydantic import BaseModel, Field
from datetime import datetime
from typing import Dict, Any, Optional, List
from crewai.memory.storage.kickoff_task_outputs_storage import (
    KickoffTaskOutputsSQLiteStorage,
)
from crewai.task import Task


class ExecutionLog(BaseModel):
    task_id: str
    expected_output: Optional[str] = None
    output: Dict[str, Any]
    timestamp: datetime = Field(default_factory=datetime.now)
    task_index: int
    inputs: Dict[str, Any] = Field(default_factory=dict)
    was_replayed: bool = False

    def __getitem__(self, key: str) -> Any:
        return getattr(self, key)


class TaskOutputStorageHandler:
    def __init__(self) -> None:
        self.storage = KickoffTaskOutputsSQLiteStorage()

    def update(self, task_index: int, log: Dict[str, Any]):
        saved_outputs = self.load()
        if saved_outputs is None:
            raise ValueError("Logs cannot be None")

        if log.get("was_replayed", False):
            replayed = {
                "task_id": str(log["task"].id),
                "expected_output": log["task"].expected_output,
                "output": log["output"],
                "was_replayed": log["was_replayed"],
                "inputs": log["inputs"],
            }
            self.storage.update(
                task_index,
                **replayed,
            )
        else:
            self.storage.add(**log)

    def add(
        self,
        task: Task,
        output: Dict[str, Any],
        task_index: int,
        inputs: Dict[str, Any] = {},
        was_replayed: bool = False,
    ):
        self.storage.add(task, output, task_index, was_replayed, inputs)

    def reset(self):
        self.storage.delete_all()

    def load(self) -> Optional[List[Dict[str, Any]]]:
        return self.storage.load()
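A hypothetical sketch of the handler above, which fronts the SQLite storage used for replays (not part of the diff; the example task is made up):

from crewai.task import Task
from crewai.utilities.task_output_storage_handler import TaskOutputStorageHandler

handler = TaskOutputStorageHandler()
task = Task(description="Say Hi", expected_output="Hi")

handler.add(task=task, output={"raw": "Hi"}, task_index=0, inputs={})
print(handler.load())  # ordered execution logs, one dict per task
handler.reset()        # clears latest_kickoff_task_outputs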
tests/agents/agent_builder/base_agent_test.py (new file, 36 lines)
@@ -0,0 +1,36 @@
import hashlib
from typing import Any, List, Optional

from crewai.agents.agent_builder.base_agent import BaseAgent
from pydantic import BaseModel


class TestAgent(BaseAgent):
    def execute_task(
        self,
        task: Any,
        context: Optional[str] = None,
        tools: Optional[List[Any]] = None,
    ) -> str:
        return ""

    def create_agent_executor(self, tools=None) -> None: ...

    def _parse_tools(self, tools: List[Any]) -> List[Any]:
        return []

    def get_delegation_tools(self, agents: List["BaseAgent"]): ...

    def get_output_converter(
        self, llm: Any, text: str, model: type[BaseModel] | None, instructions: str
    ): ...


def test_key():
    agent = TestAgent(
        role="test role",
        goal="test goal",
        backstory="test backstory",
    )
    hash = hashlib.md5("test role|test goal|test backstory".encode()).hexdigest()
    assert agent.key == hash
File diff suppressed because it is too large.
tests/cassettes/test_conditional_task_last_task.yaml (new file, 3098 lines): diff suppressed because it is too large.
@@ -0,0 +1,151 @@
interactions:
|
||||
- request:
|
||||
body: '{"messages": [{"content": "You are Researcher. You''re an expert researcher,
|
||||
specialized in technology, software engineering, AI and startups. You work as
|
||||
a freelancer and is now working on doing research and analysis for a new customer.\nYour
|
||||
personal goal is: Make the best research and analysis on content about AI and
|
||||
AI agentsTo give my best complete final answer to the task use the exact following
|
||||
format:\n\nThought: I now can give a great answer\nFinal Answer: my best complete
|
||||
final answer to the task.\nYour final answer must be the great and the most
|
||||
complete as possible, it must be outcome described.\n\nI MUST use these formats,
|
||||
my job depends on it!\nCurrent Task: Say Hi\n\nThis is the expect criteria for
|
||||
your final answer: Hi \n you MUST return the actual complete content as the
|
||||
final answer, not a summary.\n\nBegin! This is VERY important to you, use the
|
||||
tools available and give your best Final Answer, your job depends on it!\n\nThought:\n",
|
||||
"role": "user"}], "model": "gpt-4o", "n": 1, "stop": ["\nObservation"], "stream":
|
||||
true, "temperature": 0.7}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '1072'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- _cfuvid=Nxp5RE8CN2EyHxMIvB_SaTizIH5w0eWt9SPilMuIjMk-1721227661802-0.0.1.1-604800000;
|
||||
__cf_bm=jadAYV2gh7qPDzgKO9A4JzJTaI9c2fnnjxloIQZeOIw-1721227661-1.0.1.1-apaA8kQyGiEV3kOuXHe8z1zeyvxd_jBHCQpdqWirUlylrUo.uRZjRDueI.sSXS4hXoWkyIW6kIMt7lamQM2mdw
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.35.10
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.35.10
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.3
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: 'data: {"id":"chatcmpl-9m0FB4Oy6X0apYX2wcQ30rTBkmtxT","object":"chat.completion.chunk","created":1721227709,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_c4e5b6fa31","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9m0FB4Oy6X0apYX2wcQ30rTBkmtxT","object":"chat.completion.chunk","created":1721227709,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_c4e5b6fa31","choices":[{"index":0,"delta":{"content":"I"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9m0FB4Oy6X0apYX2wcQ30rTBkmtxT","object":"chat.completion.chunk","created":1721227709,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_c4e5b6fa31","choices":[{"index":0,"delta":{"content":"
|
||||
now"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9m0FB4Oy6X0apYX2wcQ30rTBkmtxT","object":"chat.completion.chunk","created":1721227709,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_c4e5b6fa31","choices":[{"index":0,"delta":{"content":"
|
||||
can"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9m0FB4Oy6X0apYX2wcQ30rTBkmtxT","object":"chat.completion.chunk","created":1721227709,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_c4e5b6fa31","choices":[{"index":0,"delta":{"content":"
|
||||
give"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9m0FB4Oy6X0apYX2wcQ30rTBkmtxT","object":"chat.completion.chunk","created":1721227709,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_c4e5b6fa31","choices":[{"index":0,"delta":{"content":"
|
||||
a"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9m0FB4Oy6X0apYX2wcQ30rTBkmtxT","object":"chat.completion.chunk","created":1721227709,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_c4e5b6fa31","choices":[{"index":0,"delta":{"content":"
|
||||
great"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9m0FB4Oy6X0apYX2wcQ30rTBkmtxT","object":"chat.completion.chunk","created":1721227709,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_c4e5b6fa31","choices":[{"index":0,"delta":{"content":"
|
||||
answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9m0FB4Oy6X0apYX2wcQ30rTBkmtxT","object":"chat.completion.chunk","created":1721227709,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_c4e5b6fa31","choices":[{"index":0,"delta":{"content":".\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9m0FB4Oy6X0apYX2wcQ30rTBkmtxT","object":"chat.completion.chunk","created":1721227709,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_c4e5b6fa31","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9m0FB4Oy6X0apYX2wcQ30rTBkmtxT","object":"chat.completion.chunk","created":1721227709,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_c4e5b6fa31","choices":[{"index":0,"delta":{"content":"
|
||||
Answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9m0FB4Oy6X0apYX2wcQ30rTBkmtxT","object":"chat.completion.chunk","created":1721227709,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_c4e5b6fa31","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9m0FB4Oy6X0apYX2wcQ30rTBkmtxT","object":"chat.completion.chunk","created":1721227709,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_c4e5b6fa31","choices":[{"index":0,"delta":{"content":"
|
||||
Hi"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9m0FB4Oy6X0apYX2wcQ30rTBkmtxT","object":"chat.completion.chunk","created":1721227709,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_c4e5b6fa31","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
|
||||
|
||||
data: [DONE]
|
||||
|
||||
|
||||
'
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8a4b087e7d024593-ATL
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Type:
|
||||
- text/event-stream; charset=utf-8
|
||||
Date:
|
||||
- Wed, 17 Jul 2024 14:48:29 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '142'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=15552000; includeSubDomains; preload
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '30000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '29999753'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_75100793afc289eaf8b56127e1cc0532
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
version: 1
|
||||
File diff suppressed because it is too large
Load Diff
File diff suppressed because it is too large
Load Diff
@@ -1,296 +1,266 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: '{"messages": [{"role": "user", "content": "You are test role. test backstory\nYour
|
||||
personal goal is: test goal\n\nYou ONLY have access to the following tools,
|
||||
and should NEVER make up tools that are not listed here:\n\nmultiplier: multiplier(first_number:
|
||||
int, second_number: int) -> float - Useful for when you need to multiply two
|
||||
numbers together.\n\nUse the following format:\n\nThought: you should always
|
||||
think about what to do\nAction: the action to take, only one name of [multiplier],
|
||||
just the name, exactly as it''s written.\nAction Input: the input to the action,
|
||||
just a simple a python dictionary using \" to wrap keys and values.\nObservation:
|
||||
the result of the action\n\nOnce all necessary information is gathered:\n\nThought:
|
||||
I now know the final answer\nFinal Answer: the final answer to the original
|
||||
input question\n\n\nCurrent Task: What is 3 times 4?\n\nThis is the expect criteria
|
||||
for your final answer: The result of the multiplication. \n you MUST return
|
||||
the actual complete content as the final answer, not a summary.\n\nBegin! This
|
||||
is VERY important to you, use the tools available and give your best Final Answer,
|
||||
your job depends on it!\n\nThought: \n"}], "model": "gpt-4", "n": 1, "stop":
|
||||
body: '{"messages": [{"content": "You are test role. test backstory\nYour personal
|
||||
goal is: test goal\nYou ONLY have access to the following tools, and should
|
||||
NEVER make up tools that are not listed here:\n\nmultiplier(first_number: int,
|
||||
second_number: int) -> float - Useful for when you need to multiply two numbers
|
||||
together.\n\nUse the following format:\n\nThought: you should always think about
|
||||
what to do\nAction: the action to take, only one name of [multiplier], just
|
||||
the name, exactly as it''s written.\nAction Input: the input to the action,
|
||||
just a simple python dictionary, enclosed in curly braces, using \" to wrap
|
||||
keys and values.\nObservation: the result of the action\n\nOnce all necessary
|
||||
information is gathered:\n\nThought: I now know the final answer\nFinal Answer:
|
||||
the final answer to the original input question\n\nCurrent Task: What is 3 times
|
||||
4?\n\nThis is the expect criteria for your final answer: The result of the multiplication.
|
||||
\n you MUST return the actual complete content as the final answer, not a summary.\n\nBegin!
|
||||
This is VERY important to you, use the tools available and give your best Final
|
||||
Answer, your job depends on it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o",
|
||||
"n": 1, "stop": ["\nObservation"], "stream": true, "temperature": 0.7}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '1277'
|
||||
content-type:
|
||||
- application/json
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.35.10
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.35.10
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: 'data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"I"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
need"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
to"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
determine"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
the"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
product"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"3"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
and"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"4"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":".\n\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"Action"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
multiplier"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"Action"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
Input"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
{\""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"first"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"_number"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"\":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"3"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":","},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
\""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"second"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"_number"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"\":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"4"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"}"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFmdFjejvg8ErQVjcpCsY8q7QbBI","object":"chat.completion.chunk","created":1720810787,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
|
||||
|
||||
data: [DONE]
|
||||
|
||||
|
||||
'
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8a2345bd1ffd742e-MIA
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Type:
|
||||
- text/event-stream; charset=utf-8
|
||||
Date:
|
||||
- Fri, 12 Jul 2024 18:59:47 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Set-Cookie:
|
||||
- __cf_bm=EvHnbspWLzEWYmRV1sLsvFlp1S5ePQd_KIGldEvula4-1720810787-1.0.1.1-fcfgfphTZearpEpqAdn5vCov8FO3hERf4Zij0dZmjoTuHkfcpXthynLGlq2sBt7SpE72ogziXHDlNZsSvmBQzA;
|
||||
path=/; expires=Fri, 12-Jul-24 19:29:47 GMT; domain=.api.openai.com; HttpOnly;
|
||||
Secure; SameSite=None
|
||||
- _cfuvid=QoRToNVlfxPsZucAm6jmW5xUqoEucDbQTYK4SkSwmUc-1720810787746-0.0.1.1-604800000;
|
||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '90'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '22000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '21999703'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_f9575c9cc6494a463ddd5681e599b56d
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"content": "You are test role. test backstory\nYour personal
|
||||
goal is: test goal\nYou ONLY have access to the following tools, and should
|
||||
NEVER make up tools that are not listed here:\n\nmultiplier(first_number: int,
|
||||
second_number: int) -> float - Useful for when you need to multiply two numbers
|
||||
together.\n\nUse the following format:\n\nThought: you should always think about
|
||||
what to do\nAction: the action to take, only one name of [multiplier], just
|
||||
the name, exactly as it''s written.\nAction Input: the input to the action,
|
||||
just a simple python dictionary, enclosed in curly braces, using \" to wrap
|
||||
keys and values.\nObservation: the result of the action\n\nOnce all necessary
|
||||
information is gathered:\n\nThought: I now know the final answer\nFinal Answer:
|
||||
the final answer to the original input question\n\nCurrent Task: What is 3 times
|
||||
4?\n\nThis is the expect criteria for your final answer: The result of the multiplication.
|
||||
\n you MUST return the actual complete content as the final answer, not a summary.\n\nBegin!
|
||||
This is VERY important to you, use the tools available and give your best Final
|
||||
Answer, your job depends on it!\n\nThought:\nI need to determine the product
|
||||
of 3 and 4.\n\nAction: multiplier\nAction Input: {\"first_number\": 3, \"second_number\":
|
||||
4}\nObservation: 12\n", "role": "user"}], "model": "gpt-4o", "n": 1, "stop":
|
||||
["\nObservation"], "stream": true, "temperature": 0.7}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate, br
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '1268'
|
||||
content-type:
|
||||
- application/json
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.12.0
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.12.0
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: 'data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"I"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
need"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
to"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
calculate"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
the"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
product"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"3"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
and"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"4"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":".\n\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"Action"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"multi"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"plier"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"\n\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"Action"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
Input"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"{\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
\""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"first"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"_number"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"\":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"3"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":",\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
\""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"second"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"_number"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"\":"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"4"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"}\n"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQXsDdREkfWKHVzRsYPnQXOyQNX","object":"chat.completion.chunk","created":1709396581,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
data: [DONE]
'
headers:
CF-Cache-Status:
- DYNAMIC
CF-RAY:
- 85e2ba970e5851e6-GRU
Cache-Control:
- no-cache, must-revalidate
Connection:
- keep-alive
Content-Type:
- text/event-stream
Date:
- Sat, 02 Mar 2024 16:23:01 GMT
Server:
- cloudflare
Set-Cookie:
- __cf_bm=C0VdYjpV_Hv7CqW8v18fK47fudXVna82_U1z_FcZ1Ng-1709396581-1.0.1.1-3mcMHbUjGImAUy9c5jtpPwFU1NQDoziKGjF8PNiFaGvST9S6PAOQVvyo4vHhKkznZM38Rs39YASCuQyyHRlkUg;
path=/; expires=Sat, 02-Mar-24 16:53:01 GMT; domain=.api.openai.com; HttpOnly;
Secure; SameSite=None
- _cfuvid=XwSpHIa8bLFKjduTc6qVKOscgm9TIoxw5Nm7uklFgXw-1709396581628-0.0.1.1-604800000;
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
Transfer-Encoding:
- chunked
access-control-allow-origin:
- '*'
alt-svc:
- h3=":443"; ma=86400
openai-model:
- gpt-4-0613
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '360'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=15724800; includeSubDomains
x-ratelimit-limit-requests:
- '10000'
x-ratelimit-limit-tokens:
- '300000'
x-ratelimit-remaining-requests:
- '9999'
x-ratelimit-remaining-tokens:
- '299706'
x-ratelimit-reset-requests:
- 6ms
x-ratelimit-reset-tokens:
- 58ms
x-request-id:
- req_e3515cf2ba7a535d68b03019b57dfbf1
status:
code: 200
message: OK
- request:
body: '{"messages": [{"role": "user", "content": "You are test role. test backstory\nYour
personal goal is: test goal\n\nYou ONLY have access to the following tools,
and should NEVER make up tools that are not listed here:\n\nmultiplier: multiplier(first_number:
int, second_number: int) -> float - Useful for when you need to multiply two
numbers together.\n\nUse the following format:\n\nThought: you should always
think about what to do\nAction: the action to take, only one name of [multiplier],
just the name, exactly as it''s written.\nAction Input: the input to the action,
just a simple a python dictionary using \" to wrap keys and values.\nObservation:
the result of the action\n\nOnce all necessary information is gathered:\n\nThought:
I now know the final answer\nFinal Answer: the final answer to the original
input question\n\n\nCurrent Task: What is 3 times 4?\n\nThis is the expect criteria
for your final answer: The result of the multiplication. \n you MUST return
the actual complete content as the final answer, not a summary.\n\nBegin! This
is VERY important to you, use the tools available and give your best Final Answer,
your job depends on it!\n\nThought: \nI need to calculate the product of 3 and
4.\n\nAction: \nmultiplier\n\nAction Input: \n{\n \"first_number\": 3,\n \"second_number\":
4\n}\n\nObservation: 12\n"}], "model": "gpt-4", "n": 1, "stop": ["\nObservation"],
"stream": true, "temperature": 0.7}'
headers:
accept:
- application/json
accept-encoding:
- gzip, deflate, br
connection:
- keep-alive
content-length:
- '1428'
- '1420'
content-type:
- application/json
cookie:
- __cf_bm=C0VdYjpV_Hv7CqW8v18fK47fudXVna82_U1z_FcZ1Ng-1709396581-1.0.1.1-3mcMHbUjGImAUy9c5jtpPwFU1NQDoziKGjF8PNiFaGvST9S6PAOQVvyo4vHhKkznZM38Rs39YASCuQyyHRlkUg;
_cfuvid=XwSpHIa8bLFKjduTc6qVKOscgm9TIoxw5Nm7uklFgXw-1709396581628-0.0.1.1-604800000
- __cf_bm=EvHnbspWLzEWYmRV1sLsvFlp1S5ePQd_KIGldEvula4-1720810787-1.0.1.1-fcfgfphTZearpEpqAdn5vCov8FO3hERf4Zij0dZmjoTuHkfcpXthynLGlq2sBt7SpE72ogziXHDlNZsSvmBQzA;
_cfuvid=QoRToNVlfxPsZucAm6jmW5xUqoEucDbQTYK4SkSwmUc-1720810787746-0.0.1.1-604800000
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.12.0
- OpenAI/Python 1.35.10
x-stainless-arch:
- arm64
x-stainless-async:
@@ -300,7 +270,7 @@ interactions:
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.12.0
- 1.35.10
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
@@ -309,60 +279,60 @@ interactions:
uri: https://api.openai.com/v1/chat/completions
response:
body:
string: 'data: {"id":"chatcmpl-8yMQZOVWWNho7BfFEeCd0GJq4Qstq","object":"chat.completion.chunk","created":1709396583,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
string: 'data: {"id":"chatcmpl-9kFme2PvSyqdXzFbvYY0WXafqzr5K","object":"chat.completion.chunk","created":1720810788,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQZOVWWNho7BfFEeCd0GJq4Qstq","object":"chat.completion.chunk","created":1709396583,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9kFme2PvSyqdXzFbvYY0WXafqzr5K","object":"chat.completion.chunk","created":1720810788,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQZOVWWNho7BfFEeCd0GJq4Qstq","object":"chat.completion.chunk","created":1709396583,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9kFme2PvSyqdXzFbvYY0WXafqzr5K","object":"chat.completion.chunk","created":1720810788,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQZOVWWNho7BfFEeCd0GJq4Qstq","object":"chat.completion.chunk","created":1709396583,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
data: {"id":"chatcmpl-9kFme2PvSyqdXzFbvYY0WXafqzr5K","object":"chat.completion.chunk","created":1720810788,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
I"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQZOVWWNho7BfFEeCd0GJq4Qstq","object":"chat.completion.chunk","created":1709396583,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
data: {"id":"chatcmpl-9kFme2PvSyqdXzFbvYY0WXafqzr5K","object":"chat.completion.chunk","created":1720810788,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
now"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQZOVWWNho7BfFEeCd0GJq4Qstq","object":"chat.completion.chunk","created":1709396583,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
data: {"id":"chatcmpl-9kFme2PvSyqdXzFbvYY0WXafqzr5K","object":"chat.completion.chunk","created":1720810788,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
know"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQZOVWWNho7BfFEeCd0GJq4Qstq","object":"chat.completion.chunk","created":1709396583,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
data: {"id":"chatcmpl-9kFme2PvSyqdXzFbvYY0WXafqzr5K","object":"chat.completion.chunk","created":1720810788,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
the"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQZOVWWNho7BfFEeCd0GJq4Qstq","object":"chat.completion.chunk","created":1709396583,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
data: {"id":"chatcmpl-9kFme2PvSyqdXzFbvYY0WXafqzr5K","object":"chat.completion.chunk","created":1720810788,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
final"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQZOVWWNho7BfFEeCd0GJq4Qstq","object":"chat.completion.chunk","created":1709396583,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
data: {"id":"chatcmpl-9kFme2PvSyqdXzFbvYY0WXafqzr5K","object":"chat.completion.chunk","created":1720810788,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
answer"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQZOVWWNho7BfFEeCd0GJq4Qstq","object":"chat.completion.chunk","created":1709396583,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9kFme2PvSyqdXzFbvYY0WXafqzr5K","object":"chat.completion.chunk","created":1720810788,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":".\n"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQZOVWWNho7BfFEeCd0GJq4Qstq","object":"chat.completion.chunk","created":1709396583,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9kFme2PvSyqdXzFbvYY0WXafqzr5K","object":"chat.completion.chunk","created":1720810788,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQZOVWWNho7BfFEeCd0GJq4Qstq","object":"chat.completion.chunk","created":1709396583,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
data: {"id":"chatcmpl-9kFme2PvSyqdXzFbvYY0WXafqzr5K","object":"chat.completion.chunk","created":1720810788,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
Answer"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQZOVWWNho7BfFEeCd0GJq4Qstq","object":"chat.completion.chunk","created":1709396583,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9kFme2PvSyqdXzFbvYY0WXafqzr5K","object":"chat.completion.chunk","created":1720810788,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQZOVWWNho7BfFEeCd0GJq4Qstq","object":"chat.completion.chunk","created":1709396583,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
data: {"id":"chatcmpl-9kFme2PvSyqdXzFbvYY0WXafqzr5K","object":"chat.completion.chunk","created":1720810788,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQZOVWWNho7BfFEeCd0GJq4Qstq","object":"chat.completion.chunk","created":1709396583,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"12"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9kFme2PvSyqdXzFbvYY0WXafqzr5K","object":"chat.completion.chunk","created":1720810788,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"12"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMQZOVWWNho7BfFEeCd0GJq4Qstq","object":"chat.completion.chunk","created":1709396583,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
data: {"id":"chatcmpl-9kFme2PvSyqdXzFbvYY0WXafqzr5K","object":"chat.completion.chunk","created":1720810788,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
data: [DONE]
@@ -373,47 +343,41 @@ interactions:
CF-Cache-Status:
- DYNAMIC
CF-RAY:
- 85e2baa3593951e6-GRU
Cache-Control:
- no-cache, must-revalidate
- 8a2345c21e0b742e-MIA
Connection:
- keep-alive
Content-Type:
- text/event-stream
- text/event-stream; charset=utf-8
Date:
- Sat, 02 Mar 2024 16:23:03 GMT
- Fri, 12 Jul 2024 18:59:48 GMT
Server:
- cloudflare
Transfer-Encoding:
- chunked
access-control-allow-origin:
- '*'
alt-svc:
- h3=":443"; ma=86400
openai-model:
- gpt-4-0613
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '321'
- '133'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=15724800; includeSubDomains
- max-age=31536000; includeSubDomains
x-ratelimit-limit-requests:
- '10000'
x-ratelimit-limit-tokens:
- '300000'
- '22000000'
x-ratelimit-remaining-requests:
- '9999'
x-ratelimit-remaining-tokens:
- '299670'
- '21999671'
x-ratelimit-reset-requests:
- 6ms
x-ratelimit-reset-tokens:
- 65ms
- 0s
x-request-id:
- req_5670da4792b11cfa6dd8ed3b6dd85cbc
- req_be7165ee4924469e40e6a6b89a758b39
status:
code: 200
message: OK
697
tests/cassettes/test_replay_feature.yaml
Normal file
@@ -0,0 +1,697 @@
interactions:
- request:
body: '{"messages": [{"content": "You are Researcher. You''re an expert researcher,
specialized in technology, software engineering, AI and startups. You work as
a freelancer and is now working on doing research and analysis for a new customer.\nYour
personal goal is: Make the best research and analysis on content about AI and
AI agentsTo give my best complete final answer to the task use the exact following
format:\n\nThought: I now can give a great answer\nFinal Answer: my best complete
final answer to the task.\nYour final answer must be the great and the most
complete as possible, it must be outcome described.\n\nI MUST use these formats,
my job depends on it!\nCurrent Task: Generate a list of 5 interesting ideas
to explore for an article, where each bulletpoint is under 15 words.\n\nThis
is the expect criteria for your final answer: Bullet point list of 5 important
events. No additional commentary. \n you MUST return the actual complete content
as the final answer, not a summary.\n\nBegin! This is VERY important to you,
use the tools available and give your best Final Answer, your job depends on
it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o", "n": 1, "stop": ["\nObservation"],
"stream": true, "temperature": 0.7}'
headers:
accept:
- application/json
accept-encoding:
- gzip, deflate, br
connection:
- keep-alive
content-length:
- '1237'
content-type:
- application/json
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.35.3
x-stainless-arch:
- arm64
x-stainless-async:
- 'false'
x-stainless-lang:
- python
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.35.3
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
- 3.11.5
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
body:
string: 'data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
I"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
now"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
can"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
give"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
a"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
great"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"-"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Ethical"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
implications"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
decision"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"-making"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
in"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
healthcare"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":".\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"-"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
agents"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
revolution"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"izing"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
customer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
service"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
in"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
e"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"-commerce"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":".\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"-"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Advances"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
in"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"-driven"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
predictive"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
maintenance"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
for"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
industries"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":".\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"-"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
The"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
role"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
in"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
autonomous"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
vehicle"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
safety"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":".\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"-"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
in"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
personalized"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
education"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Adaptive"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
learning"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
technologies"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"."},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7QxB9Y779gwWcC1EK2JtcMZ7vCS","object":"chat.completion.chunk","created":1720540363,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
|
||||
|
||||
data: [DONE]
|
||||
|
||||
|
||||
'
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8a097b99ad909e50-SJC
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Type:
|
||||
- text/event-stream; charset=utf-8
|
||||
Date:
|
||||
- Tue, 09 Jul 2024 15:52:44 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Set-Cookie:
|
||||
- __cf_bm=3B5vxI0ieroGmK5h7cD7a8bCSrrPh4hLjrbw87J9XRE-1720540364-1.0.1.1-BXhaEwefXZ7Ez0Fg7.8O4AAnOoPc5b7O.4CdzhLnbo9WIF30RlsTzH58YBRxoQipeSCQMxhePm2HaNR6nNfWEQ;
|
||||
path=/; expires=Tue, 09-Jul-24 16:22:44 GMT; domain=.api.openai.com; HttpOnly;
|
||||
Secure; SameSite=None
|
||||
- _cfuvid=D7VkuRYil_ytD3F4vcJzvO0gmVHyb3ZlnhCIjCrlyWE-1720540364005-0.0.1.1-604800000;
|
||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- user-soijsnuwuk3xvbf91w0jc33c
|
||||
openai-processing-ms:
|
||||
- '114'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains
|
||||
x-ratelimit-limit-requests:
|
||||
- '5000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '450000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '4999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '449712'
|
||||
x-ratelimit-reset-requests:
|
||||
- 12ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 38ms
|
||||
x-request-id:
|
||||
- req_94f0907cc8b2065f1e223070d2be2a85
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"content": "You are Researcher. You''re an expert researcher,
|
||||
specialized in technology, software engineering, AI and startups. You work as
|
||||
a freelancer and is now working on doing research and analysis for a new customer.\nYour
|
||||
personal goal is: Make the best research and analysis on content about AI and
|
||||
AI agentsTo give my best complete final answer to the task use the exact following
|
||||
format:\n\nThought: I now can give a great answer\nFinal Answer: my best complete
|
||||
final answer to the task.\nYour final answer must be the great and the most
|
||||
complete as possible, it must be outcome described.\n\nI MUST use these formats,
|
||||
my job depends on it!\nCurrent Task: Generate a list of 5 interesting ideas
|
||||
to explore for an article, where each bulletpoint is under 15 words.\n\nThis
|
||||
is the expect criteria for your final answer: Bullet point list of 5 important
|
||||
events. No additional commentary. \n you MUST return the actual complete content
|
||||
as the final answer, not a summary.\n\nBegin! This is VERY important to you,
|
||||
use the tools available and give your best Final Answer, your job depends on
|
||||
it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o", "n": 1, "stop": ["\nObservation"],
|
||||
"stream": true, "temperature": 0.7}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate, br
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '1237'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=3B5vxI0ieroGmK5h7cD7a8bCSrrPh4hLjrbw87J9XRE-1720540364-1.0.1.1-BXhaEwefXZ7Ez0Fg7.8O4AAnOoPc5b7O.4CdzhLnbo9WIF30RlsTzH58YBRxoQipeSCQMxhePm2HaNR6nNfWEQ;
|
||||
_cfuvid=D7VkuRYil_ytD3F4vcJzvO0gmVHyb3ZlnhCIjCrlyWE-1720540364005-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.35.3
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.35.3
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.5
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: 'data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
I"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
now"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
can"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
give"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
a"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
great"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"-"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Evolution"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Agents"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
in"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Customer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Service"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"-"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
in"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Healthcare"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Transform"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"ing"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Diagnostics"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
and"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Treatment"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"-"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Ethical"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Imp"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"lications"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Autonomous"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Systems"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"-"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"-"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"Driven"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Start"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"ups"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Dis"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"ruption"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
and"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Innovation"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"-"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
and"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Cyber"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"security"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Def"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"ending"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Against"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Modern"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
Threat"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"s"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j7Qz3vXpsGKDVKeZa6oBvQ1PkdmE","object":"chat.completion.chunk","created":1720540365,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
|
||||
|
||||
data: [DONE]
|
||||
|
||||
|
||||
'
|
||||
headers:
  CF-Cache-Status:
  - DYNAMIC
  CF-RAY:
  - 8a097ba2bc449e50-SJC
  Connection:
  - keep-alive
  Content-Type:
  - text/event-stream; charset=utf-8
  Date:
  - Tue, 09 Jul 2024 15:52:45 GMT
  Server:
  - cloudflare
  Transfer-Encoding:
  - chunked
  alt-svc:
  - h3=":443"; ma=86400
  openai-organization:
  - user-soijsnuwuk3xvbf91w0jc33c
  openai-processing-ms:
  - '117'
  openai-version:
  - '2020-10-01'
  strict-transport-security:
  - max-age=31536000; includeSubDomains
  x-ratelimit-limit-requests:
  - '5000'
  x-ratelimit-limit-tokens:
  - '450000'
  x-ratelimit-remaining-requests:
  - '4999'
  x-ratelimit-remaining-tokens:
  - '449712'
  x-ratelimit-reset-requests:
  - 12ms
  x-ratelimit-reset-tokens:
  - 38ms
  x-request-id:
  - req_a28d912698f7b75be87900d3a64bc91f
status:
  code: 200
  message: OK
version: 1
161
tests/cassettes/test_replay_from_task_setup_context.yaml
Normal file
@@ -0,0 +1,161 @@
interactions:
- request:
    body: '{"messages": [{"content": "You are test_agent. Test Description\nYour personal
      goal is: Test GoalTo give my best complete final answer to the task use the
      exact following format:\n\nThought: I now can give a great answer\nFinal Answer:
      my best complete final answer to the task.\nYour final answer must be the great
      and the most complete as possible, it must be outcome described.\n\nI MUST use
      these formats, my job depends on it!\nCurrent Task: Test Task\n\nThis is the
      expect criteria for your final answer: Say Hi to John \n you MUST return the
      actual complete content as the final answer, not a summary.\n\nThis is the context
      you''re working with:\ncontext raw output\n\nBegin! This is VERY important to
      you, use the tools available and give your best Final Answer, your job depends
      on it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o", "n": 1, "stop":
      ["\nObservation"], "stream": true, "temperature": 0.7}'
    headers:
      accept:
      - application/json
      accept-encoding:
      - gzip, deflate, br
      connection:
      - keep-alive
      content-length:
      - '918'
      content-type:
      - application/json
      host:
      - api.openai.com
      user-agent:
      - OpenAI/Python 1.35.3
      x-stainless-arch:
      - arm64
      x-stainless-async:
      - 'false'
      x-stainless-lang:
      - python
      x-stainless-os:
      - MacOS
      x-stainless-package-version:
      - 1.35.3
      x-stainless-runtime:
      - CPython
      x-stainless-runtime-version:
      - 3.11.5
    method: POST
    uri: https://api.openai.com/v1/chat/completions
  response:
    body:
      string: 'data: {"id":"chatcmpl-9kFSxQSL63v0sL4iqWFRWkul8QbhP","object":"chat.completion.chunk","created":1720809567,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}

        data: {"id":"chatcmpl-9kFSxQSL63v0sL4iqWFRWkul8QbhP","object":"chat.completion.chunk","created":1720809567,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}

        data: {"id":"chatcmpl-9kFSxQSL63v0sL4iqWFRWkul8QbhP","object":"chat.completion.chunk","created":1720809567,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}

        data: {"id":"chatcmpl-9kFSxQSL63v0sL4iqWFRWkul8QbhP","object":"chat.completion.chunk","created":1720809567,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
        I"},"logprobs":null,"finish_reason":null}]}

        data: {"id":"chatcmpl-9kFSxQSL63v0sL4iqWFRWkul8QbhP","object":"chat.completion.chunk","created":1720809567,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
        now"},"logprobs":null,"finish_reason":null}]}

        data: {"id":"chatcmpl-9kFSxQSL63v0sL4iqWFRWkul8QbhP","object":"chat.completion.chunk","created":1720809567,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
        can"},"logprobs":null,"finish_reason":null}]}

        data: {"id":"chatcmpl-9kFSxQSL63v0sL4iqWFRWkul8QbhP","object":"chat.completion.chunk","created":1720809567,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
        give"},"logprobs":null,"finish_reason":null}]}

        data: {"id":"chatcmpl-9kFSxQSL63v0sL4iqWFRWkul8QbhP","object":"chat.completion.chunk","created":1720809567,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
        a"},"logprobs":null,"finish_reason":null}]}

        data: {"id":"chatcmpl-9kFSxQSL63v0sL4iqWFRWkul8QbhP","object":"chat.completion.chunk","created":1720809567,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
        great"},"logprobs":null,"finish_reason":null}]}

        data: {"id":"chatcmpl-9kFSxQSL63v0sL4iqWFRWkul8QbhP","object":"chat.completion.chunk","created":1720809567,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
        answer"},"logprobs":null,"finish_reason":null}]}

        data: {"id":"chatcmpl-9kFSxQSL63v0sL4iqWFRWkul8QbhP","object":"chat.completion.chunk","created":1720809567,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}

        data: {"id":"chatcmpl-9kFSxQSL63v0sL4iqWFRWkul8QbhP","object":"chat.completion.chunk","created":1720809567,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}

        data: {"id":"chatcmpl-9kFSxQSL63v0sL4iqWFRWkul8QbhP","object":"chat.completion.chunk","created":1720809567,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
        Answer"},"logprobs":null,"finish_reason":null}]}

        data: {"id":"chatcmpl-9kFSxQSL63v0sL4iqWFRWkul8QbhP","object":"chat.completion.chunk","created":1720809567,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}

        data: {"id":"chatcmpl-9kFSxQSL63v0sL4iqWFRWkul8QbhP","object":"chat.completion.chunk","created":1720809567,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
        Hi"},"logprobs":null,"finish_reason":null}]}

        data: {"id":"chatcmpl-9kFSxQSL63v0sL4iqWFRWkul8QbhP","object":"chat.completion.chunk","created":1720809567,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
        John"},"logprobs":null,"finish_reason":null}]}

        data: {"id":"chatcmpl-9kFSxQSL63v0sL4iqWFRWkul8QbhP","object":"chat.completion.chunk","created":1720809567,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}

        data: [DONE]

        '
    headers:
      CF-Cache-Status:
      - DYNAMIC
      CF-RAY:
      - 8a2327f1190467c1-SJC
      Connection:
      - keep-alive
      Content-Type:
      - text/event-stream; charset=utf-8
      Date:
      - Fri, 12 Jul 2024 18:39:27 GMT
      Server:
      - cloudflare
      Set-Cookie:
      - __cf_bm=df.hIcEr2QTS045wWa7HSF0ATx6AeLAoPPW0FoIx7W4-1720809567-1.0.1.1-1Y2nQ4DHdc5HUHFO08LdQOoWZykmQ0xe67vzmv2dS4OnnKEHYd9GMzcq.vWODTXoI.BoSxQiRrylKYuuO2t8Tw;
        path=/; expires=Fri, 12-Jul-24 19:09:27 GMT; domain=.api.openai.com; HttpOnly;
        Secure; SameSite=None
      - _cfuvid=Zmb0XRHa49q2R664FqlS3F.aojtATJKKGnkUiQoH92I-1720809567257-0.0.1.1-604800000;
        path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
      Transfer-Encoding:
      - chunked
      alt-svc:
      - h3=":443"; ma=86400
      openai-organization:
      - crewai-iuxna1
      openai-processing-ms:
      - '83'
      openai-version:
      - '2020-10-01'
      strict-transport-security:
      - max-age=31536000; includeSubDomains
      x-ratelimit-limit-requests:
      - '10000'
      x-ratelimit-limit-tokens:
      - '22000000'
      x-ratelimit-remaining-requests:
      - '9999'
      x-ratelimit-remaining-tokens:
      - '21999792'
      x-ratelimit-reset-requests:
      - 6ms
      x-ratelimit-reset-tokens:
      - 0s
      x-request-id:
      - req_4fd8c7c8d47e20be017fb8de1ccb07c9
    status:
      code: 200
      message: OK
version: 1
472
tests/cassettes/test_replay_interpolates_inputs_properly.yaml
Normal file
@@ -0,0 +1,472 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: '{"messages": [{"content": "You are test_agent. Test Description\nYour personal
|
||||
goal is: Test GoalTo give my best complete final answer to the task use the
|
||||
exact following format:\n\nThought: I now can give a great answer\nFinal Answer:
|
||||
my best complete final answer to the task.\nYour final answer must be the great
|
||||
and the most complete as possible, it must be outcome described.\n\nI MUST use
|
||||
these formats, my job depends on it!\nCurrent Task: Context Task\n\nThis is
|
||||
the expect criteria for your final answer: Say John \n you MUST return the actual
|
||||
complete content as the final answer, not a summary.\n\nBegin! This is VERY
|
||||
important to you, use the tools available and give your best Final Answer, your
|
||||
job depends on it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o", "n":
|
||||
1, "stop": ["\nObservation"], "stream": true, "temperature": 0.7}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate, br
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '851'
|
||||
content-type:
|
||||
- application/json
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.35.3
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.35.3
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.5
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: 'data: {"id":"chatcmpl-9kF4hrEQNmcSJd7kCqA2oxUWQ4qOr","object":"chat.completion.chunk","created":1720808063,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4hrEQNmcSJd7kCqA2oxUWQ4qOr","object":"chat.completion.chunk","created":1720808063,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4hrEQNmcSJd7kCqA2oxUWQ4qOr","object":"chat.completion.chunk","created":1720808063,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4hrEQNmcSJd7kCqA2oxUWQ4qOr","object":"chat.completion.chunk","created":1720808063,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
I"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4hrEQNmcSJd7kCqA2oxUWQ4qOr","object":"chat.completion.chunk","created":1720808063,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
now"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4hrEQNmcSJd7kCqA2oxUWQ4qOr","object":"chat.completion.chunk","created":1720808063,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
can"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4hrEQNmcSJd7kCqA2oxUWQ4qOr","object":"chat.completion.chunk","created":1720808063,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
give"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4hrEQNmcSJd7kCqA2oxUWQ4qOr","object":"chat.completion.chunk","created":1720808063,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
a"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4hrEQNmcSJd7kCqA2oxUWQ4qOr","object":"chat.completion.chunk","created":1720808063,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
great"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4hrEQNmcSJd7kCqA2oxUWQ4qOr","object":"chat.completion.chunk","created":1720808063,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4hrEQNmcSJd7kCqA2oxUWQ4qOr","object":"chat.completion.chunk","created":1720808063,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4hrEQNmcSJd7kCqA2oxUWQ4qOr","object":"chat.completion.chunk","created":1720808063,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4hrEQNmcSJd7kCqA2oxUWQ4qOr","object":"chat.completion.chunk","created":1720808063,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
Answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4hrEQNmcSJd7kCqA2oxUWQ4qOr","object":"chat.completion.chunk","created":1720808063,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4hrEQNmcSJd7kCqA2oxUWQ4qOr","object":"chat.completion.chunk","created":1720808063,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
Say"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4hrEQNmcSJd7kCqA2oxUWQ4qOr","object":"chat.completion.chunk","created":1720808063,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
John"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4hrEQNmcSJd7kCqA2oxUWQ4qOr","object":"chat.completion.chunk","created":1720808063,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
|
||||
|
||||
data: [DONE]
|
||||
|
||||
|
||||
'
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8a23033dc9abce48-SJC
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Type:
|
||||
- text/event-stream; charset=utf-8
|
||||
Date:
|
||||
- Fri, 12 Jul 2024 18:14:23 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Set-Cookie:
|
||||
- __cf_bm=iykqFZ5ecR102MDyK48cHc9Ge3aXJBNKkesB4w9tCz4-1720808063-1.0.1.1-Eg_rjCINHV9hw7HzDFtJgxfwBfr9SahyJnbyo.JfBNFYax9M.ZcSVwmQwySE6AzVg.5AaLC05iljPfXmN26FrA;
|
||||
path=/; expires=Fri, 12-Jul-24 18:44:23 GMT; domain=.api.openai.com; HttpOnly;
|
||||
Secure; SameSite=None
|
||||
- _cfuvid=KWM5AhkXkvM2JvJ6J7QHiC9iposfEkI9eZRl8w6aVTY-1720808063923-0.0.1.1-604800000;
|
||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '84'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '22000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '21999807'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_e6b4b9610f2f254a228ad44dda349115
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"content": "You are test_agent. Test Description\nYour personal
|
||||
goal is: Test GoalTo give my best complete final answer to the task use the
|
||||
exact following format:\n\nThought: I now can give a great answer\nFinal Answer:
|
||||
my best complete final answer to the task.\nYour final answer must be the great
|
||||
and the most complete as possible, it must be outcome described.\n\nI MUST use
|
||||
these formats, my job depends on it!\nCurrent Task: Test Task\n\nThis is the
|
||||
expect criteria for your final answer: Say Hi to John \n you MUST return the
|
||||
actual complete content as the final answer, not a summary.\n\nThis is the context
|
||||
you''re working with:\nSay John\n\nBegin! This is VERY important to you, use
|
||||
the tools available and give your best Final Answer, your job depends on it!\n\nThought:\n",
|
||||
"role": "user"}], "model": "gpt-4o", "n": 1, "stop": ["\nObservation"], "stream":
|
||||
true, "temperature": 0.7}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate, br
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '908'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=iykqFZ5ecR102MDyK48cHc9Ge3aXJBNKkesB4w9tCz4-1720808063-1.0.1.1-Eg_rjCINHV9hw7HzDFtJgxfwBfr9SahyJnbyo.JfBNFYax9M.ZcSVwmQwySE6AzVg.5AaLC05iljPfXmN26FrA;
|
||||
_cfuvid=KWM5AhkXkvM2JvJ6J7QHiC9iposfEkI9eZRl8w6aVTY-1720808063923-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.35.3
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.35.3
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.5
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: 'data: {"id":"chatcmpl-9kF4iocDiUNkL6NucVvumIQYikY61","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iocDiUNkL6NucVvumIQYikY61","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iocDiUNkL6NucVvumIQYikY61","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iocDiUNkL6NucVvumIQYikY61","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
I"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iocDiUNkL6NucVvumIQYikY61","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
now"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iocDiUNkL6NucVvumIQYikY61","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
can"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iocDiUNkL6NucVvumIQYikY61","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
give"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iocDiUNkL6NucVvumIQYikY61","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
a"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iocDiUNkL6NucVvumIQYikY61","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
great"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iocDiUNkL6NucVvumIQYikY61","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iocDiUNkL6NucVvumIQYikY61","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iocDiUNkL6NucVvumIQYikY61","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iocDiUNkL6NucVvumIQYikY61","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
Answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iocDiUNkL6NucVvumIQYikY61","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iocDiUNkL6NucVvumIQYikY61","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
Hi"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iocDiUNkL6NucVvumIQYikY61","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
John"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iocDiUNkL6NucVvumIQYikY61","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
|
||||
|
||||
data: [DONE]
|
||||
|
||||
|
||||
'
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8a230340ec4cce48-SJC
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Type:
|
||||
- text/event-stream; charset=utf-8
|
||||
Date:
|
||||
- Fri, 12 Jul 2024 18:14:24 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '69'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '22000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '21999795'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_f3233314439a85e7a197be3d067c6d1c
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"content": "You are test_agent. Test Description\nYour personal
|
||||
goal is: Test GoalTo give my best complete final answer to the task use the
|
||||
exact following format:\n\nThought: I now can give a great answer\nFinal Answer:
|
||||
my best complete final answer to the task.\nYour final answer must be the great
|
||||
and the most complete as possible, it must be outcome described.\n\nI MUST use
|
||||
these formats, my job depends on it!\nCurrent Task: Test Task\n\nThis is the
|
||||
expect criteria for your final answer: Say Hi to John \n you MUST return the
|
||||
actual complete content as the final answer, not a summary.\n\nThis is the context
|
||||
you''re working with:\ncontext raw output\n\nBegin! This is VERY important to
|
||||
you, use the tools available and give your best Final Answer, your job depends
|
||||
on it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o", "n": 1, "stop":
|
||||
["\nObservation"], "stream": true, "temperature": 0.7}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate, br
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '918'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=iykqFZ5ecR102MDyK48cHc9Ge3aXJBNKkesB4w9tCz4-1720808063-1.0.1.1-Eg_rjCINHV9hw7HzDFtJgxfwBfr9SahyJnbyo.JfBNFYax9M.ZcSVwmQwySE6AzVg.5AaLC05iljPfXmN26FrA;
|
||||
_cfuvid=KWM5AhkXkvM2JvJ6J7QHiC9iposfEkI9eZRl8w6aVTY-1720808063923-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.35.3
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.35.3
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.5
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: 'data: {"id":"chatcmpl-9kF4iAwQ2K0M2fy86qpF660mrclmY","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iAwQ2K0M2fy86qpF660mrclmY","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iAwQ2K0M2fy86qpF660mrclmY","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iAwQ2K0M2fy86qpF660mrclmY","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
I"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iAwQ2K0M2fy86qpF660mrclmY","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
now"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iAwQ2K0M2fy86qpF660mrclmY","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
can"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iAwQ2K0M2fy86qpF660mrclmY","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
give"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iAwQ2K0M2fy86qpF660mrclmY","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
a"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iAwQ2K0M2fy86qpF660mrclmY","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
great"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iAwQ2K0M2fy86qpF660mrclmY","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iAwQ2K0M2fy86qpF660mrclmY","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iAwQ2K0M2fy86qpF660mrclmY","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iAwQ2K0M2fy86qpF660mrclmY","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
Answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iAwQ2K0M2fy86qpF660mrclmY","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iAwQ2K0M2fy86qpF660mrclmY","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
Hi"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iAwQ2K0M2fy86qpF660mrclmY","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{"content":"
|
||||
John"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kF4iAwQ2K0M2fy86qpF660mrclmY","object":"chat.completion.chunk","created":1720808064,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d33f7b429e","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
|
||||
|
||||
data: [DONE]
|
||||
|
||||
|
||||
'
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8a2303443f0cce48-SJC
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Type:
|
||||
- text/event-stream; charset=utf-8
|
||||
Date:
|
||||
- Fri, 12 Jul 2024 18:14:24 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '86'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '22000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '21999791'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_649687e24793d5b5782ecf58bc76386a
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
version: 1
|
||||
449
tests/cassettes/test_replay_task_with_context.yaml
Normal file
@@ -0,0 +1,449 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: !!binary |
|
||||
CrG5AQokCiIKDHNlcnZpY2UubmFtZRISChBjcmV3QUktdGVsZW1ldHJ5Eoe5AQoSChBjcmV3YWku
|
||||
dGVsZW1ldHJ5EpACChB1aPb7uNmPUxB3quhKypN4EgjnW4at80oLKCoOVGFzayBFeGVjdXRpb24w
|
||||
ATnIwfLEssXiF0FI2USctcXiF0ouCghjcmV3X2tleRIiCiA5OGE3ZDIxNDI1MjEwNzY5MzhjYzg3
|
||||
Yzc2OWRlZGNkM0oxCgdjcmV3X2lkEiYKJDQzOTNlYjRmLTRiMDMtNDgzOS1hMmMwLWViY2IwNTMy
|
||||
NDJjNEouCgh0YXNrX2tleRIiCiBhZmE2OThiMjYyZDM1NDNmOWE2MTFlNGQ1MTQ1ZWQ2YUoxCgd0
|
||||
YXNrX2lkEiYKJDM1ZTA0MTVkLTlhNzYtNDc5YS1hOGFkLTI2NjJiZDQ0NjRiMXoCGAGFAQABAAAS
|
||||
ig0KEBXTWyRbS66PohO2H7pX1jQSCBQxHUkWdA7dKgxDcmV3IENyZWF0ZWQwATmAAQ+ftcXiF0E4
|
||||
fhGftcXiF0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjM2LjBKGgoOcHl0aG9uX3ZlcnNpb24SCAoG
|
||||
My4xMS41Si4KCGNyZXdfa2V5EiIKIGNhN2MwMTM2ZWM3YmY1ZGU3NWRlNWQyNjY5OWRhM2I0SjEK
|
||||
B2NyZXdfaWQSJgokYzk5MzVhMDItOGJjYi00YTJlLThhZjktYTU5NWNmMTgyMzYzShwKDGNyZXdf
|
||||
cHJvY2VzcxIMCgpzZXF1ZW50aWFsShEKC2NyZXdfbWVtb3J5EgIQAEoaChRjcmV3X251bWJlcl9v
|
||||
Zl90YXNrcxICGAFKGwoVY3Jld19udW1iZXJfb2ZfYWdlbnRzEgIYAUr+BAoLY3Jld19hZ2VudHMS
|
||||
7gQK6wRbeyJrZXkiOiAiOGJkMjEzOWI1OTc1MTgxNTA2ZTQxZmQ5YzQ1NjNkNzUiLCAiaWQiOiAi
|
||||
Yjg4NzA5MjYtYzNiYi00MDUyLWIxYTctNDY3ZDdjNDJmMjVkIiwgInJvbGUiOiAiUmVzZWFyY2hl
|
||||
ciIsICJnb2FsIjogIk1ha2UgdGhlIGJlc3QgcmVzZWFyY2ggYW5kIGFuYWx5c2lzIG9uIGNvbnRl
|
||||
bnQgYWJvdXQgQUkgYW5kIEFJIGFnZW50cyIsICJiYWNrc3RvcnkiOiAiWW91J3JlIGFuIGV4cGVy
|
||||
dCByZXNlYXJjaGVyLCBzcGVjaWFsaXplZCBpbiB0ZWNobm9sb2d5LCBzb2Z0d2FyZSBlbmdpbmVl
|
||||
cmluZywgQUkgYW5kIHN0YXJ0dXBzLiBZb3Ugd29yayBhcyBhIGZyZWVsYW5jZXIgYW5kIGlzIG5v
|
||||
dyB3b3JraW5nIG9uIGRvaW5nIHJlc2VhcmNoIGFuZCBhbmFseXNpcyBmb3IgYSBuZXcgY3VzdG9t
|
||||
ZXIuIiwgInZlcmJvc2U/IjogZmFsc2UsICJtYXhfaXRlciI6IDI1LCAibWF4X3JwbSI6IG51bGws
|
||||
ICJpMThuIjogbnVsbCwgImxsbSI6ICJ7XCJuYW1lXCI6IG51bGwsIFwibW9kZWxfbmFtZVwiOiBc
|
||||
ImdwdC00b1wiLCBcInRlbXBlcmF0dXJlXCI6IDAuNywgXCJjbGFzc1wiOiBcIkNoYXRPcGVuQUlc
|
||||
In0iLCAiZGVsZWdhdGlvbl9lbmFibGVkPyI6IGZhbHNlLCAidG9vbHNfbmFtZXMiOiBbXX1dSskD
|
||||
CgpjcmV3X3Rhc2tzEroDCrcDW3sia2V5IjogIjk0NGFlZjBiYWM4NDBmMWMyN2JkODNhOTM3YmMz
|
||||
NjFiIiwgImlkIjogIjVkZDRjMTJhLThmOWItNDRmNi1iYjRhLWVkODdiNTgzZGU1MCIsICJkZXNj
|
||||
cmlwdGlvbiI6ICJHaXZlIG1lIGEgbGlzdCBvZiA1IGludGVyZXN0aW5nIGlkZWFzIHRvIGV4cGxv
|
||||
cmUgZm9yIG5hIGFydGljbGUsIHdoYXQgbWFrZXMgdGhlbSB1bmlxdWUgYW5kIGludGVyZXN0aW5n
|
||||
LiIsICJleHBlY3RlZF9vdXRwdXQiOiAiQnVsbGV0IHBvaW50IGxpc3Qgb2YgNSBpbXBvcnRhbnQg
|
||||
ZXZlbnRzLiIsICJhc3luY19leGVjdXRpb24/IjogdHJ1ZSwgImh1bWFuX2lucHV0PyI6IGZhbHNl
|
||||
LCAiYWdlbnRfcm9sZSI6ICJSZXNlYXJjaGVyIiwgImFnZW50X2tleSI6ICI4YmQyMTM5YjU5NzUx
|
||||
ODE1MDZlNDFmZDljNDU2M2Q3NSIsICJjb250ZXh0IjogbnVsbCwgInRvb2xzX25hbWVzIjogW119
|
||||
XUoqCghwbGF0Zm9ybRIeChxtYWNPUy0xNC4xLjEtYXJtNjQtYXJtLTY0Yml0ShwKEHBsYXRmb3Jt
|
||||
X3JlbGVhc2USCAoGMjMuMS4wShsKD3BsYXRmb3JtX3N5c3RlbRIICgZEYXJ3aW5KewoQcGxhdGZv
|
||||
cm1fdmVyc2lvbhJnCmVEYXJ3aW4gS2VybmVsIFZlcnNpb24gMjMuMS4wOiBNb24gT2N0ICA5IDIx
|
||||
OjI3OjI0IFBEVCAyMDIzOyByb290OnhudS0xMDAwMi40MS45fjYvUkVMRUFTRV9BUk02NF9UNjAw
|
||||
MEoKCgRjcHVzEgIYCnoCGAGFAQABAAASjgIKEIq7Mjq32M5Kt5dKJnjskZ0SCMx1TtvU1+T2KgxU
|
||||
YXNrIENyZWF0ZWQwATkI9iiftcXiF0HAfimftcXiF0ouCghjcmV3X2tleRIiCiBjYTdjMDEzNmVj
|
||||
N2JmNWRlNzVkZTVkMjY2OTlkYTNiNEoxCgdjcmV3X2lkEiYKJGM5OTM1YTAyLThiY2ItNGEyZS04
|
||||
YWY5LWE1OTVjZjE4MjM2M0ouCgh0YXNrX2tleRIiCiA5NDRhZWYwYmFjODQwZjFjMjdiZDgzYTkz
|
||||
N2JjMzYxYkoxCgd0YXNrX2lkEiYKJDVkZDRjMTJhLThmOWItNDRmNi1iYjRhLWVkODdiNTgzZGU1
|
||||
MHoCGAGFAQABAAASkAIKEE4G7Oi0pafO3nARt5VKez4SCM6Ua5kmBYFDKg5UYXNrIEV4ZWN1dGlv
|
||||
bjABOXC1KZ+1xeIXQWDhMJ+1xeIXSi4KCGNyZXdfa2V5EiIKIGNhN2MwMTM2ZWM3YmY1ZGU3NWRl
|
||||
NWQyNjY5OWRhM2I0SjEKB2NyZXdfaWQSJgokYzk5MzVhMDItOGJjYi00YTJlLThhZjktYTU5NWNm
|
||||
MTgyMzYzSi4KCHRhc2tfa2V5EiIKIDk0NGFlZjBiYWM4NDBmMWMyN2JkODNhOTM3YmMzNjFiSjEK
|
||||
B3Rhc2tfaWQSJgokNWRkNGMxMmEtOGY5Yi00NGY2LWJiNGEtZWQ4N2I1ODNkZTUwegIYAYUBAAEA
|
||||
ABLUGQoQX+5PRYVQ3O7KzTa1tWKiRxIIuu8x8CYW2KEqDENyZXcgQ3JlYXRlZDABOUjW1qO1xeIX
|
||||
QUjE2aO1xeIXShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuMzYuMEoaCg5weXRob25fdmVyc2lvbhII
|
||||
CgYzLjExLjVKLgoIY3Jld19rZXkSIgogYTBhYzk1MzU0ZDMyOWJhN2Y2OTQ4M2FkZmUwYzdkMjhK
|
||||
MQoHY3Jld19pZBImCiQ5MzgzYzU2Yy1lNzQ3LTQwYmItOTlhYi01NGE2NGRkY2Q3YTlKHAoMY3Jl
|
||||
d19wcm9jZXNzEgwKCnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdfbnVtYmVy
|
||||
X29mX3Rhc2tzEgIYBEobChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgCSvcHCgtjcmV3X2FnZW50
|
||||
cxLnBwrkB1t7ImtleSI6ICJkYWI1MjFlYmQzMWFlMTJmZWUxODlhMzhkMzA5MGQ5MyIsICJpZCI6
|
||||
ICJhMzAzNTliZi01Y2QwLTQ3MGYtOTk2NS0yYjZhZTU0YmVmY2MiLCAicm9sZSI6ICJXcml0ZXIi
|
||||
LCAiZ29hbCI6ICJZb3Ugd3JpdGUgbGVzc3NvbnMgb2YgbWF0aCBmb3Iga2lkcy4iLCAiYmFja3N0
|
||||
b3J5IjogIllvdSdyZSBhbiBleHBlcnQgaW4gd3JpdHRpbmcgYW5kIHlvdSBsb3ZlIHRvIHRlYWNo
|
||||
IGtpZHMgYnV0IHlvdSBrbm93IG5vdGhpbmcgb2YgbWF0aC4iLCAidmVyYm9zZT8iOiBmYWxzZSwg
|
||||
Im1heF9pdGVyIjogMjUsICJtYXhfcnBtIjogbnVsbCwgImkxOG4iOiBudWxsLCAibGxtIjogIntc
|
||||
Im5hbWVcIjogbnVsbCwgXCJtb2RlbF9uYW1lXCI6IFwiZ3B0LTRvXCIsIFwidGVtcGVyYXR1cmVc
|
||||
IjogMC43LCBcImNsYXNzXCI6IFwiQ2hhdE9wZW5BSVwifSIsICJkZWxlZ2F0aW9uX2VuYWJsZWQ/
|
||||
IjogZmFsc2UsICJ0b29sc19uYW1lcyI6IFsibXVsdGlwbGNhdGlvbl90b29sIl19LCB7ImtleSI6
|
||||
ICJkYWI1MjFlYmQzMWFlMTJmZWUxODlhMzhkMzA5MGQ5MyIsICJpZCI6ICJhN2JiNTI4ZS0yM2M4
|
||||
LTQ1MzItYjNiYS0yOWU1NDNiMGFhN2YiLCAicm9sZSI6ICJXcml0ZXIiLCAiZ29hbCI6ICJZb3Ug
|
||||
d3JpdGUgbGVzc3NvbnMgb2YgbWF0aCBmb3Iga2lkcy4iLCAiYmFja3N0b3J5IjogIllvdSdyZSBh
|
||||
biBleHBlcnQgaW4gd3JpdHRpbmcgYW5kIHlvdSBsb3ZlIHRvIHRlYWNoIGtpZHMgYnV0IHlvdSBr
|
||||
bm93IG5vdGhpbmcgb2YgbWF0aC4iLCAidmVyYm9zZT8iOiBmYWxzZSwgIm1heF9pdGVyIjogMjUs
|
||||
ICJtYXhfcnBtIjogbnVsbCwgImkxOG4iOiBudWxsLCAibGxtIjogIntcIm5hbWVcIjogbnVsbCwg
|
||||
XCJtb2RlbF9uYW1lXCI6IFwiZ3B0LTRvXCIsIFwidGVtcGVyYXR1cmVcIjogMC43LCBcImNsYXNz
|
||||
XCI6IFwiQ2hhdE9wZW5BSVwifSIsICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogZmFsc2UsICJ0b29s
|
||||
c19uYW1lcyI6IFsibXVsdGlwbGNhdGlvbl90b29sIl19XUqaDQoKY3Jld190YXNrcxKLDQqIDVt7
|
||||
ImtleSI6ICIzMGYzMjg2M2EyZWI3OThkMTA5NmM5MDcwMjgwOTgzMCIsICJpZCI6ICIxNjhlOWNm
|
||||
Ny1hYWJjLTRkMGUtYTg1OS01ODEyZGM4MDg4NTYiLCAiZGVzY3JpcHRpb24iOiAiV2hhdCBpcyAy
|
||||
IHRpbWVzIDY/IFJldHVybiBvbmx5IHRoZSBudW1iZXIgYWZ0ZXIgdXNpbmcgdGhlIG11bHRpcGxp
|
||||
Y2F0aW9uIHRvb2wuIiwgImV4cGVjdGVkX291dHB1dCI6ICJ0aGUgcmVzdWx0IG9mIG11bHRpcGxp
|
||||
Y2F0aW9uIiwgImFzeW5jX2V4ZWN1dGlvbj8iOiBmYWxzZSwgImh1bWFuX2lucHV0PyI6IGZhbHNl
|
||||
LCAiYWdlbnRfcm9sZSI6ICJXcml0ZXIiLCAiYWdlbnRfa2V5IjogImRhYjUyMWViZDMxYWUxMmZl
|
||||
ZTE4OWEzOGQzMDkwZDkzIiwgImNvbnRleHQiOiBudWxsLCAidG9vbHNfbmFtZXMiOiBbIm11bHRp
|
||||
cGxjYXRpb25fdG9vbCJdfSwgeyJrZXkiOiAiM2QwYmRlZTMxMjdhZjk5MGIzNjZjMTJkZGJkNGE4
|
||||
YTYiLCAiaWQiOiAiMDdkYmU0MzAtZDMxNC00MjgwLWEwODUtOGQ0MTQyZjNjZTFlIiwgImRlc2Ny
|
||||
aXB0aW9uIjogIldoYXQgaXMgMyB0aW1lcyAxPyBSZXR1cm4gb25seSB0aGUgbnVtYmVyIGFmdGVy
|
||||
IHVzaW5nIHRoZSBtdWx0aXBsaWNhdGlvbiB0b29sLiIsICJleHBlY3RlZF9vdXRwdXQiOiAidGhl
|
||||
IHJlc3VsdCBvZiBtdWx0aXBsaWNhdGlvbiIsICJhc3luY19leGVjdXRpb24/IjogZmFsc2UsICJo
|
||||
dW1hbl9pbnB1dD8iOiBmYWxzZSwgImFnZW50X3JvbGUiOiAiV3JpdGVyIiwgImFnZW50X2tleSI6
|
||||
ICJkYWI1MjFlYmQzMWFlMTJmZWUxODlhMzhkMzA5MGQ5MyIsICJjb250ZXh0IjogbnVsbCwgInRv
|
||||
b2xzX25hbWVzIjogWyJtdWx0aXBsY2F0aW9uX3Rvb2wiXX0sIHsia2V5IjogIjMwZjMyODYzYTJl
|
||||
Yjc5OGQxMDk2YzkwNzAyODA5ODMwIiwgImlkIjogImFlYjJmMjk1LTI3NDUtNGYwMi05MjQxLWU3
|
||||
NTk2MjVjOWQyMCIsICJkZXNjcmlwdGlvbiI6ICJXaGF0IGlzIDIgdGltZXMgNj8gUmV0dXJuIG9u
|
||||
bHkgdGhlIG51bWJlciBhZnRlciB1c2luZyB0aGUgbXVsdGlwbGljYXRpb24gdG9vbC4iLCAiZXhw
|
||||
ZWN0ZWRfb3V0cHV0IjogInRoZSByZXN1bHQgb2YgbXVsdGlwbGljYXRpb24iLCAiYXN5bmNfZXhl
|
||||
Y3V0aW9uPyI6IGZhbHNlLCAiaHVtYW5faW5wdXQ/IjogZmFsc2UsICJhZ2VudF9yb2xlIjogIldy
|
||||
aXRlciIsICJhZ2VudF9rZXkiOiAiZGFiNTIxZWJkMzFhZTEyZmVlMTg5YTM4ZDMwOTBkOTMiLCAi
|
||||
Y29udGV4dCI6IG51bGwsICJ0b29sc19uYW1lcyI6IFsibXVsdGlwbGNhdGlvbl90b29sIl19LCB7
|
||||
ImtleSI6ICIzZDBiZGVlMzEyN2FmOTkwYjM2NmMxMmRkYmQ0YThhNiIsICJpZCI6ICI5MGE0N2Fm
|
||||
Ny1jZjc5LTQ3NjYtOWJjNS02Nzg3MWMwOTFiOTYiLCAiZGVzY3JpcHRpb24iOiAiV2hhdCBpcyAz
|
||||
IHRpbWVzIDE/IFJldHVybiBvbmx5IHRoZSBudW1iZXIgYWZ0ZXIgdXNpbmcgdGhlIG11bHRpcGxp
|
||||
Y2F0aW9uIHRvb2wuIiwgImV4cGVjdGVkX291dHB1dCI6ICJ0aGUgcmVzdWx0IG9mIG11bHRpcGxp
|
||||
Y2F0aW9uIiwgImFzeW5jX2V4ZWN1dGlvbj8iOiBmYWxzZSwgImh1bWFuX2lucHV0PyI6IGZhbHNl
|
||||
LCAiYWdlbnRfcm9sZSI6ICJXcml0ZXIiLCAiYWdlbnRfa2V5IjogImRhYjUyMWViZDMxYWUxMmZl
|
||||
ZTE4OWEzOGQzMDkwZDkzIiwgImNvbnRleHQiOiBudWxsLCAidG9vbHNfbmFtZXMiOiBbIm11bHRp
|
||||
cGxjYXRpb25fdG9vbCJdfV1KKgoIcGxhdGZvcm0SHgocbWFjT1MtMTQuMS4xLWFybTY0LWFybS02
|
||||
NGJpdEocChBwbGF0Zm9ybV9yZWxlYXNlEggKBjIzLjEuMEobCg9wbGF0Zm9ybV9zeXN0ZW0SCAoG
|
||||
RGFyd2luSnsKEHBsYXRmb3JtX3ZlcnNpb24SZwplRGFyd2luIEtlcm5lbCBWZXJzaW9uIDIzLjEu
|
||||
MDogTW9uIE9jdCAgOSAyMToyNzoyNCBQRFQgMjAyMzsgcm9vdDp4bnUtMTAwMDIuNDEuOX42L1JF
|
||||
TEVBU0VfQVJNNjRfVDYwMDBKCgoEY3B1cxICGAp6AhgBhQEAAQAAEo4CChDMsDVWCtNnTZd78P6M
|
||||
F0qHEghu/cLCLfIljioMVGFzayBDcmVhdGVkMAE5ALHzo7XF4hdBMCb0o7XF4hdKLgoIY3Jld19r
|
||||
ZXkSIgogYTBhYzk1MzU0ZDMyOWJhN2Y2OTQ4M2FkZmUwYzdkMjhKMQoHY3Jld19pZBImCiQ5Mzgz
|
||||
YzU2Yy1lNzQ3LTQwYmItOTlhYi01NGE2NGRkY2Q3YTlKLgoIdGFza19rZXkSIgogMzBmMzI4NjNh
|
||||
MmViNzk4ZDEwOTZjOTA3MDI4MDk4MzBKMQoHdGFza19pZBImCiQxNjhlOWNmNy1hYWJjLTRkMGUt
|
||||
YTg1OS01ODEyZGM4MDg4NTZ6AhgBhQEAAQAAEpUBChB1ZPMBfoWizT2O65qegWcGEgjmR3EI+Dhz
|
||||
KSoKVG9vbCBVc2FnZTABOUCLN6W1xeIXQWBWOKW1xeIXShoKDmNyZXdhaV92ZXJzaW9uEggKBjAu
|
||||
MzYuMEohCgl0b29sX25hbWUSFAoSbXVsdGlwbGNhdGlvbl90b29sSg4KCGF0dGVtcHRzEgIYAXoC
|
||||
GAGFAQABAAASkAIKEEukGnCGulTfJ1/e50UIF5ASCBeMmMmX2ss5Kg5UYXNrIEV4ZWN1dGlvbjAB
|
||||
OShR9KO1xeIXQQBL+KW1xeIXSi4KCGNyZXdfa2V5EiIKIGEwYWM5NTM1NGQzMjliYTdmNjk0ODNh
|
||||
ZGZlMGM3ZDI4SjEKB2NyZXdfaWQSJgokOTM4M2M1NmMtZTc0Ny00MGJiLTk5YWItNTRhNjRkZGNk
|
||||
N2E5Si4KCHRhc2tfa2V5EiIKIDMwZjMyODYzYTJlYjc5OGQxMDk2YzkwNzAyODA5ODMwSjEKB3Rh
|
||||
c2tfaWQSJgokMTY4ZTljZjctYWFiYy00ZDBlLWE4NTktNTgxMmRjODA4ODU2egIYAYUBAAEAABKO
|
||||
AgoQeq27u45vVnfO3QW8y5TkKhII8pwpZBclUSUqDFRhc2sgQ3JlYXRlZDABOXCRBaa1xeIXQeAl
|
||||
Bqa1xeIXSi4KCGNyZXdfa2V5EiIKIGEwYWM5NTM1NGQzMjliYTdmNjk0ODNhZGZlMGM3ZDI4SjEK
|
||||
B2NyZXdfaWQSJgokOTM4M2M1NmMtZTc0Ny00MGJiLTk5YWItNTRhNjRkZGNkN2E5Si4KCHRhc2tf
|
||||
a2V5EiIKIDNkMGJkZWUzMTI3YWY5OTBiMzY2YzEyZGRiZDRhOGE2SjEKB3Rhc2tfaWQSJgokMDdk
|
||||
YmU0MzAtZDMxNC00MjgwLWEwODUtOGQ0MTQyZjNjZTFlegIYAYUBAAEAABKVAQoQDpoau444Nc2P
|
||||
TpqGSwuJUBII3odk2PVsECQqClRvb2wgVXNhZ2UwATlQXjmntcXiF0GwSDqntcXiF0oaCg5jcmV3
|
||||
YWlfdmVyc2lvbhIICgYwLjM2LjBKIQoJdG9vbF9uYW1lEhQKEm11bHRpcGxjYXRpb25fdG9vbEoO
|
||||
CghhdHRlbXB0cxICGAF6AhgBhQEAAQAAEpACChBW89qgKwxZoE48iGgYORjmEggn0gOg6T0j3SoO
|
||||
VGFzayBFeGVjdXRpb24wATnYUAamtcXiF0GwUfSntcXiF0ouCghjcmV3X2tleRIiCiBhMGFjOTUz
|
||||
NTRkMzI5YmE3ZjY5NDgzYWRmZTBjN2QyOEoxCgdjcmV3X2lkEiYKJDkzODNjNTZjLWU3NDctNDBi
|
||||
Yi05OWFiLTU0YTY0ZGRjZDdhOUouCgh0YXNrX2tleRIiCiAzZDBiZGVlMzEyN2FmOTkwYjM2NmMx
|
||||
MmRkYmQ0YThhNkoxCgd0YXNrX2lkEiYKJDA3ZGJlNDMwLWQzMTQtNDI4MC1hMDg1LThkNDE0MmYz
|
||||
Y2UxZXoCGAGFAQABAAASjgIKECX4kiD6RrdHOsL/u1WPaKISCEn8hmiuaOxzKgxUYXNrIENyZWF0
|
||||
ZWQwATngAQCotcXiF0GwhgCotcXiF0ouCghjcmV3X2tleRIiCiBhMGFjOTUzNTRkMzI5YmE3ZjY5
|
||||
NDgzYWRmZTBjN2QyOEoxCgdjcmV3X2lkEiYKJDkzODNjNTZjLWU3NDctNDBiYi05OWFiLTU0YTY0
|
||||
ZGRjZDdhOUouCgh0YXNrX2tleRIiCiAzMGYzMjg2M2EyZWI3OThkMTA5NmM5MDcwMjgwOTgzMEox
|
||||
Cgd0YXNrX2lkEiYKJGFlYjJmMjk1LTI3NDUtNGYwMi05MjQxLWU3NTk2MjVjOWQyMHoCGAGFAQAB
|
||||
AAASlQEKEA/spc2u3GsQSPtfjOb6NJISCG3tEmcOqAoDKgpUb29sIFVzYWdlMAE58GZDqbXF4hdB
|
||||
IFlEqbXF4hdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4zNi4wSiEKCXRvb2xfbmFtZRIUChJtdWx0
|
||||
aXBsY2F0aW9uX3Rvb2xKDgoIYXR0ZW1wdHMSAhgBegIYAYUBAAEAABKQAgoQsO+zsd47DClQTmS7
|
||||
ZfZYLxII2UxCduX9osEqDlRhc2sgRXhlY3V0aW9uMAE5qLEAqLXF4hdByLEBqrXF4hdKLgoIY3Jl
|
||||
d19rZXkSIgogYTBhYzk1MzU0ZDMyOWJhN2Y2OTQ4M2FkZmUwYzdkMjhKMQoHY3Jld19pZBImCiQ5
|
||||
MzgzYzU2Yy1lNzQ3LTQwYmItOTlhYi01NGE2NGRkY2Q3YTlKLgoIdGFza19rZXkSIgogMzBmMzI4
|
||||
NjNhMmViNzk4ZDEwOTZjOTA3MDI4MDk4MzBKMQoHdGFza19pZBImCiRhZWIyZjI5NS0yNzQ1LTRm
|
||||
MDItOTI0MS1lNzU5NjI1YzlkMjB6AhgBhQEAAQAAEo4CChD55L/w1V+ezIjuajyG1hwUEggBdDur
|
||||
+CTqUCoMVGFzayBDcmVhdGVkMAE52AcPqrXF4hdBSJwPqrXF4hdKLgoIY3Jld19rZXkSIgogYTBh
|
||||
Yzk1MzU0ZDMyOWJhN2Y2OTQ4M2FkZmUwYzdkMjhKMQoHY3Jld19pZBImCiQ5MzgzYzU2Yy1lNzQ3
|
||||
LTQwYmItOTlhYi01NGE2NGRkY2Q3YTlKLgoIdGFza19rZXkSIgogM2QwYmRlZTMxMjdhZjk5MGIz
|
||||
NjZjMTJkZGJkNGE4YTZKMQoHdGFza19pZBImCiQ5MGE0N2FmNy1jZjc5LTQ3NjYtOWJjNS02Nzg3
|
||||
MWMwOTFiOTZ6AhgBhQEAAQAAEpUBChA+cdY4kcsaUDolyJw3c952EggtUbX8zXBaySoKVG9vbCBV
|
||||
c2FnZTABOaAHUKu1xeIXQcDSUKu1xeIXShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuMzYuMEohCgl0
|
||||
b29sX25hbWUSFAoSbXVsdGlwbGNhdGlvbl90b29sSg4KCGF0dGVtcHRzEgIYAXoCGAGFAQABAAAS
|
||||
kAIKEAvnVK0WhsX8u4xShcr/sooSCC+aS9IRpThCKg5UYXNrIEV4ZWN1dGlvbjABOUDHD6q1xeIX
|
||||
QVC9IKy1xeIXSi4KCGNyZXdfa2V5EiIKIGEwYWM5NTM1NGQzMjliYTdmNjk0ODNhZGZlMGM3ZDI4
|
||||
SjEKB2NyZXdfaWQSJgokOTM4M2M1NmMtZTc0Ny00MGJiLTk5YWItNTRhNjRkZGNkN2E5Si4KCHRh
|
||||
c2tfa2V5EiIKIDNkMGJkZWUzMTI3YWY5OTBiMzY2YzEyZGRiZDRhOGE2SjEKB3Rhc2tfaWQSJgok
|
||||
OTBhNDdhZjctY2Y3OS00NzY2LTliYzUtNjc4NzFjMDkxYjk2egIYAYUBAAEAABKrCwoQfae3w+r8
|
||||
n0qGJnp7DP6RtxIILhPziA5VzRwqDENyZXcgQ3JlYXRlZDABOZipkrW1xeIXQZDIlLW1xeIXShoK
|
||||
DmNyZXdhaV92ZXJzaW9uEggKBjAuMzYuMEoaCg5weXRob25fdmVyc2lvbhIICgYzLjExLjVKLgoI
|
||||
Y3Jld19rZXkSIgogYzk3YjVmZWI1ZDFiNjZiYjU5MDA2YWFhMDFhMjljZDZKMQoHY3Jld19pZBIm
|
||||
CiQ2M2Y2MGU1Yi1kMjZkLTQwZjMtYTQyMC0wNzQ3Y2MwOWM5YWRKHAoMY3Jld19wcm9jZXNzEgwK
|
||||
CnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhABShoKFGNyZXdfbnVtYmVyX29mX3Rhc2tzEgIY
|
||||
AUobChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgBStIDCgtjcmV3X2FnZW50cxLCAwq/A1t7Imtl
|
||||
eSI6ICIwN2Q5OWI2MzA0MTFkMzVmZDkwNDdhNTMyZDUzZGRhNyIsICJpZCI6ICJhMGJlYjc4ZS04
|
||||
ZGUzLTQ5YjEtYWRkZS1mODg3NWMzNjM3ZGYiLCAicm9sZSI6ICJSZXNlYXJjaGVyIiwgImdvYWwi
|
||||
OiAiWW91IHJlc2VhcmNoIGFib3V0IG1hdGguIiwgImJhY2tzdG9yeSI6ICJZb3UncmUgYW4gZXhw
|
||||
ZXJ0IGluIHJlc2VhcmNoIGFuZCB5b3UgbG92ZSB0byBsZWFybiBuZXcgdGhpbmdzLiIsICJ2ZXJi
|
||||
b3NlPyI6IGZhbHNlLCAibWF4X2l0ZXIiOiAyNSwgIm1heF9ycG0iOiBudWxsLCAiaTE4biI6IG51
|
||||
bGwsICJsbG0iOiAie1wibmFtZVwiOiBudWxsLCBcIm1vZGVsX25hbWVcIjogXCJncHQtNG9cIiwg
|
||||
XCJ0ZW1wZXJhdHVyZVwiOiAwLjcsIFwiY2xhc3NcIjogXCJDaGF0T3BlbkFJXCJ9IiwgImRlbGVn
|
||||
YXRpb25fZW5hYmxlZD8iOiBmYWxzZSwgInRvb2xzX25hbWVzIjogW119XUqWAwoKY3Jld190YXNr
|
||||
cxKHAwqEA1t7ImtleSI6ICI2Mzk5NjUxN2YzZjNmMWM5NGQ2YmI2MTdhYTBiMWM0ZiIsICJpZCI6
|
||||
ICIwMmNiOWIyYS0yODIyLTQwNTYtYjQ5MC04YjA0YzA2MTg2M2IiLCAiZGVzY3JpcHRpb24iOiAi
|
||||
UmVzZWFyY2ggYSB0b3BpYyB0byB0ZWFjaCBhIGtpZCBhZ2VkIDYgYWJvdXQgbWF0aC4iLCAiZXhw
|
||||
ZWN0ZWRfb3V0cHV0IjogIkEgdG9waWMsIGV4cGxhbmF0aW9uLCBhbmdsZSwgYW5kIGV4YW1wbGVz
|
||||
LiIsICJhc3luY19leGVjdXRpb24/IjogZmFsc2UsICJodW1hbl9pbnB1dD8iOiBmYWxzZSwgImFn
|
||||
ZW50X3JvbGUiOiAiUmVzZWFyY2hlciIsICJhZ2VudF9rZXkiOiAiMDdkOTliNjMwNDExZDM1ZmQ5
|
||||
MDQ3YTUzMmQ1M2RkYTciLCAiY29udGV4dCI6IG51bGwsICJ0b29sc19uYW1lcyI6IFtdfV1KKgoI
|
||||
cGxhdGZvcm0SHgocbWFjT1MtMTQuMS4xLWFybTY0LWFybS02NGJpdEocChBwbGF0Zm9ybV9yZWxl
|
||||
YXNlEggKBjIzLjEuMEobCg9wbGF0Zm9ybV9zeXN0ZW0SCAoGRGFyd2luSnsKEHBsYXRmb3JtX3Zl
|
||||
cnNpb24SZwplRGFyd2luIEtlcm5lbCBWZXJzaW9uIDIzLjEuMDogTW9uIE9jdCAgOSAyMToyNzoy
|
||||
NCBQRFQgMjAyMzsgcm9vdDp4bnUtMTAwMDIuNDEuOX42L1JFTEVBU0VfQVJNNjRfVDYwMDBKCgoE
|
||||
Y3B1cxICGAp6AhgBhQEAAQAAEo4CChDwvE8C6FMSvgaW9DsUngzEEgi69DWPEIv7/ioMVGFzayBD
|
||||
cmVhdGVkMAE5QJOmtbXF4hdBSOWmtbXF4hdKLgoIY3Jld19rZXkSIgogYzk3YjVmZWI1ZDFiNjZi
|
||||
YjU5MDA2YWFhMDFhMjljZDZKMQoHY3Jld19pZBImCiQ2M2Y2MGU1Yi1kMjZkLTQwZjMtYTQyMC0w
|
||||
NzQ3Y2MwOWM5YWRKLgoIdGFza19rZXkSIgogNjM5OTY1MTdmM2YzZjFjOTRkNmJiNjE3YWEwYjFj
|
||||
NGZKMQoHdGFza19pZBImCiQwMmNiOWIyYS0yODIyLTQwNTYtYjQ5MC04YjA0YzA2MTg2M2J6AhgB
|
||||
hQEAAQAAEpACChA9ji0VC42jOLjDFsSGAZehEgi88WVi+J8WZyoOVGFzayBFeGVjdXRpb24wATlA
|
||||
EKe1tcXiF0GwOXS9tcXiF0ouCghjcmV3X2tleRIiCiBjOTdiNWZlYjVkMWI2NmJiNTkwMDZhYWEw
|
||||
MWEyOWNkNkoxCgdjcmV3X2lkEiYKJDYzZjYwZTViLWQyNmQtNDBmMy1hNDIwLTA3NDdjYzA5Yzlh
|
||||
ZEouCgh0YXNrX2tleRIiCiA2Mzk5NjUxN2YzZjNmMWM5NGQ2YmI2MTdhYTBiMWM0ZkoxCgd0YXNr
|
||||
X2lkEiYKJDAyY2I5YjJhLTI4MjItNDA1Ni1iNDkwLThiMDRjMDYxODYzYnoCGAGFAQABAAASqwsK
|
||||
EOSKFi2MNd2ArhJhwSRFLU8SCLUBjfAc1xVOKgxDcmV3IENyZWF0ZWQwATlAcNi/tcXiF0G4zdq/
|
||||
tcXiF0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjM2LjBKGgoOcHl0aG9uX3ZlcnNpb24SCAoGMy4x
|
||||
MS41Si4KCGNyZXdfa2V5EiIKIGM5N2I1ZmViNWQxYjY2YmI1OTAwNmFhYTAxYTI5Y2Q2SjEKB2Ny
|
||||
ZXdfaWQSJgokNmFlNzhhZjEtNDE5Ny00ODU2LWE1OTItYWEzZGNmMTI5OGJhShwKDGNyZXdfcHJv
|
||||
Y2VzcxIMCgpzZXF1ZW50aWFsShEKC2NyZXdfbWVtb3J5EgIQAEoaChRjcmV3X251bWJlcl9vZl90
|
||||
YXNrcxICGAFKGwoVY3Jld19udW1iZXJfb2ZfYWdlbnRzEgIYAUrSAwoLY3Jld19hZ2VudHMSwgMK
|
||||
vwNbeyJrZXkiOiAiMDdkOTliNjMwNDExZDM1ZmQ5MDQ3YTUzMmQ1M2RkYTciLCAiaWQiOiAiYWUz
|
||||
ZjEwYmYtNThlMy00NjNjLWI4YTUtMDZjNTAxYTVlNWY0IiwgInJvbGUiOiAiUmVzZWFyY2hlciIs
|
||||
ICJnb2FsIjogIllvdSByZXNlYXJjaCBhYm91dCBtYXRoLiIsICJiYWNrc3RvcnkiOiAiWW91J3Jl
|
||||
IGFuIGV4cGVydCBpbiByZXNlYXJjaCBhbmQgeW91IGxvdmUgdG8gbGVhcm4gbmV3IHRoaW5ncy4i
|
||||
LCAidmVyYm9zZT8iOiBmYWxzZSwgIm1heF9pdGVyIjogMjUsICJtYXhfcnBtIjogbnVsbCwgImkx
|
||||
OG4iOiBudWxsLCAibGxtIjogIntcIm5hbWVcIjogbnVsbCwgXCJtb2RlbF9uYW1lXCI6IFwiZ3B0
|
||||
LTRvXCIsIFwidGVtcGVyYXR1cmVcIjogMC43LCBcImNsYXNzXCI6IFwiQ2hhdE9wZW5BSVwifSIs
|
||||
ICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogZmFsc2UsICJ0b29sc19uYW1lcyI6IFtdfV1KlgMKCmNy
|
||||
ZXdfdGFza3MShwMKhANbeyJrZXkiOiAiNjM5OTY1MTdmM2YzZjFjOTRkNmJiNjE3YWEwYjFjNGYi
|
||||
LCAiaWQiOiAiNTQ4NzY5YWUtMmFjZS00NzIwLTlhYWEtZWY3YmVjZGRmMjAyIiwgImRlc2NyaXB0
|
||||
aW9uIjogIlJlc2VhcmNoIGEgdG9waWMgdG8gdGVhY2ggYSBraWQgYWdlZCA2IGFib3V0IG1hdGgu
|
||||
IiwgImV4cGVjdGVkX291dHB1dCI6ICJBIHRvcGljLCBleHBsYW5hdGlvbiwgYW5nbGUsIGFuZCBl
|
||||
eGFtcGxlcy4iLCAiYXN5bmNfZXhlY3V0aW9uPyI6IGZhbHNlLCAiaHVtYW5faW5wdXQ/IjogZmFs
|
||||
c2UsICJhZ2VudF9yb2xlIjogIlJlc2VhcmNoZXIiLCAiYWdlbnRfa2V5IjogIjA3ZDk5YjYzMDQx
|
||||
MWQzNWZkOTA0N2E1MzJkNTNkZGE3IiwgImNvbnRleHQiOiBudWxsLCAidG9vbHNfbmFtZXMiOiBb
|
||||
XX1dSioKCHBsYXRmb3JtEh4KHG1hY09TLTE0LjEuMS1hcm02NC1hcm0tNjRiaXRKHAoQcGxhdGZv
|
||||
cm1fcmVsZWFzZRIICgYyMy4xLjBKGwoPcGxhdGZvcm1fc3lzdGVtEggKBkRhcndpbkp7ChBwbGF0
|
||||
Zm9ybV92ZXJzaW9uEmcKZURhcndpbiBLZXJuZWwgVmVyc2lvbiAyMy4xLjA6IE1vbiBPY3QgIDkg
|
||||
MjE6Mjc6MjQgUERUIDIwMjM7IHJvb3Q6eG51LTEwMDAyLjQxLjl+Ni9SRUxFQVNFX0FSTTY0X1Q2
|
||||
MDAwSgoKBGNwdXMSAhgKegIYAYUBAAEAABKOAgoQJwZSCcuSu3diAuOYiVdE2BIIeF1BeIAKnP4q
|
||||
DFRhc2sgQ3JlYXRlZDABOYhX77+1xeIXQcCh77+1xeIXSi4KCGNyZXdfa2V5EiIKIGM5N2I1ZmVi
|
||||
NWQxYjY2YmI1OTAwNmFhYTAxYTI5Y2Q2SjEKB2NyZXdfaWQSJgokNmFlNzhhZjEtNDE5Ny00ODU2
|
||||
LWE1OTItYWEzZGNmMTI5OGJhSi4KCHRhc2tfa2V5EiIKIDYzOTk2NTE3ZjNmM2YxYzk0ZDZiYjYx
|
||||
N2FhMGIxYzRmSjEKB3Rhc2tfaWQSJgokNTQ4NzY5YWUtMmFjZS00NzIwLTlhYWEtZWY3YmVjZGRm
|
||||
MjAyegIYAYUBAAEAABKQAgoQxBgL6L3Ce74Cu4q6j0ErDRIIEnXenwnk/A4qDlRhc2sgRXhlY3V0
|
||||
aW9uMAE5uMzvv7XF4hdBAEPaxrXF4hdKLgoIY3Jld19rZXkSIgogYzk3YjVmZWI1ZDFiNjZiYjU5
|
||||
MDA2YWFhMDFhMjljZDZKMQoHY3Jld19pZBImCiQ2YWU3OGFmMS00MTk3LTQ4NTYtYTU5Mi1hYTNk
|
||||
Y2YxMjk4YmFKLgoIdGFza19rZXkSIgogNjM5OTY1MTdmM2YzZjFjOTRkNmJiNjE3YWEwYjFjNGZK
|
||||
MQoHdGFza19pZBImCiQ1NDg3NjlhZS0yYWNlLTQ3MjAtOWFhYS1lZjdiZWNkZGYyMDJ6AhgBhQEA
|
||||
AQAAEo0MChB2PHWFIMgXT4sbnL1Fk9tqEgj1iM6d8aLAFyoMQ3JldyBDcmVhdGVkMAE5SKovx7XF
|
||||
4hdBwAcyx7XF4hdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4zNi4wShoKDnB5dGhvbl92ZXJzaW9u
|
||||
EggKBjMuMTEuNUouCghjcmV3X2tleRIiCiA4YzI3NTJmNDllNWI5ZDJiNjhjYjM1Y2FjOGZjYzg2
|
||||
ZEoxCgdjcmV3X2lkEiYKJDA2MTdhZjU1LWEwNjQtNDEwMi05YjQ0LThhMThiZWU0NTg3OEocCgxj
|
||||
cmV3X3Byb2Nlc3MSDAoKc2VxdWVudGlhbEoRCgtjcmV3X21lbW9yeRICEABKGgoUY3Jld19udW1i
|
||||
ZXJfb2ZfdGFza3MSAhgBShsKFWNyZXdfbnVtYmVyX29mX2FnZW50cxICGAFK/gQKC2NyZXdfYWdl
|
||||
bnRzEu4ECusEW3sia2V5IjogIjhiZDIxMzliNTk3NTE4MTUwNmU0MWZkOWM0NTYzZDc1IiwgImlk
|
||||
IjogIjc4YmM1N2I4LWIwZTgtNDZlNy04OTIzLTQ2ZDJlMWQ4MjQwNyIsICJyb2xlIjogIlJlc2Vh
|
||||
cmNoZXIiLCAiZ29hbCI6ICJNYWtlIHRoZSBiZXN0IHJlc2VhcmNoIGFuZCBhbmFseXNpcyBvbiBj
|
||||
b250ZW50IGFib3V0IEFJIGFuZCBBSSBhZ2VudHMiLCAiYmFja3N0b3J5IjogIllvdSdyZSBhbiBl
|
||||
eHBlcnQgcmVzZWFyY2hlciwgc3BlY2lhbGl6ZWQgaW4gdGVjaG5vbG9neSwgc29mdHdhcmUgZW5n
|
||||
aW5lZXJpbmcsIEFJIGFuZCBzdGFydHVwcy4gWW91IHdvcmsgYXMgYSBmcmVlbGFuY2VyIGFuZCBp
|
||||
cyBub3cgd29ya2luZyBvbiBkb2luZyByZXNlYXJjaCBhbmQgYW5hbHlzaXMgZm9yIGEgbmV3IGN1
|
||||
c3RvbWVyLiIsICJ2ZXJib3NlPyI6IGZhbHNlLCAibWF4X2l0ZXIiOiAyNSwgIm1heF9ycG0iOiBu
|
||||
dWxsLCAiaTE4biI6IG51bGwsICJsbG0iOiAie1wibmFtZVwiOiBudWxsLCBcIm1vZGVsX25hbWVc
|
||||
IjogXCJncHQtNG9cIiwgXCJ0ZW1wZXJhdHVyZVwiOiAwLjcsIFwiY2xhc3NcIjogXCJDaGF0T3Bl
|
||||
bkFJXCJ9IiwgImRlbGVnYXRpb25fZW5hYmxlZD8iOiBmYWxzZSwgInRvb2xzX25hbWVzIjogW119
|
||||
XUrMAgoKY3Jld190YXNrcxK9Agq6Alt7ImtleSI6ICIwZDY4NWEyMTk5NGQ5NDkwOTdiYzVhNTZk
|
||||
NzM3ZTZkMSIsICJpZCI6ICI0M2JjZTUwNy0yZGJmLTRiMDMtYTIyMy0zYzMxMmU1MTI4YWQiLCAi
|
||||
ZGVzY3JpcHRpb24iOiAiU2F5IEhpIiwgImV4cGVjdGVkX291dHB1dCI6ICJUaGUgd29yZDogSGki
|
||||
LCAiYXN5bmNfZXhlY3V0aW9uPyI6IGZhbHNlLCAiaHVtYW5faW5wdXQ/IjogZmFsc2UsICJhZ2Vu
|
||||
dF9yb2xlIjogIlJlc2VhcmNoZXIiLCAiYWdlbnRfa2V5IjogIjhiZDIxMzliNTk3NTE4MTUwNmU0
|
||||
MWZkOWM0NTYzZDc1IiwgImNvbnRleHQiOiBudWxsLCAidG9vbHNfbmFtZXMiOiBbXX1dSioKCHBs
|
||||
YXRmb3JtEh4KHG1hY09TLTE0LjEuMS1hcm02NC1hcm0tNjRiaXRKHAoQcGxhdGZvcm1fcmVsZWFz
|
||||
ZRIICgYyMy4xLjBKGwoPcGxhdGZvcm1fc3lzdGVtEggKBkRhcndpbkp7ChBwbGF0Zm9ybV92ZXJz
|
||||
aW9uEmcKZURhcndpbiBLZXJuZWwgVmVyc2lvbiAyMy4xLjA6IE1vbiBPY3QgIDkgMjE6Mjc6MjQg
|
||||
UERUIDIwMjM7IHJvb3Q6eG51LTEwMDAyLjQxLjl+Ni9SRUxFQVNFX0FSTTY0X1Q2MDAwSgoKBGNw
|
||||
dXMSAhgKegIYAYUBAAEAABKOAgoQgFFdbbcq9b6XpdCgV4ygHxII5FvLDHBJWfYqDFRhc2sgQ3Jl
|
||||
YXRlZDABOcB3Sce1xeIXQfDsSce1xeIXSi4KCGNyZXdfa2V5EiIKIDhjMjc1MmY0OWU1YjlkMmI2
|
||||
OGNiMzVjYWM4ZmNjODZkSjEKB2NyZXdfaWQSJgokMDYxN2FmNTUtYTA2NC00MTAyLTliNDQtOGEx
|
||||
OGJlZTQ1ODc4Si4KCHRhc2tfa2V5EiIKIDBkNjg1YTIxOTk0ZDk0OTA5N2JjNWE1NmQ3MzdlNmQx
|
||||
SjEKB3Rhc2tfaWQSJgokNDNiY2U1MDctMmRiZi00YjAzLWEyMjMtM2MzMTJlNTEyOGFkegIYAYUB
|
||||
AAEAABKQAgoQrD1woM4vvML78Ycv2kKJPBIIQiPC5tnfBsEqDlRhc2sgRXhlY3V0aW9uMAE5uB9K
|
||||
x7XF4hdB0DoqyLXF4hdKLgoIY3Jld19rZXkSIgogOGMyNzUyZjQ5ZTViOWQyYjY4Y2IzNWNhYzhm
|
||||
Y2M4NmRKMQoHY3Jld19pZBImCiQwNjE3YWY1NS1hMDY0LTQxMDItOWI0NC04YTE4YmVlNDU4NzhK
|
||||
LgoIdGFza19rZXkSIgogMGQ2ODVhMjE5OTRkOTQ5MDk3YmM1YTU2ZDczN2U2ZDFKMQoHdGFza19p
|
||||
ZBImCiQ0M2JjZTUwNy0yZGJmLTRiMDMtYTIyMy0zYzMxMmU1MTI4YWR6AhgBhQEAAQAAEskSChDy
|
||||
v1twhKVTS2/hcmcIXG8LEginut5ahEsb9yoMQ3JldyBDcmVhdGVkMAE5eMNWyrXF4hdBUBFZyrXF
|
||||
4hdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4zNi4wShoKDnB5dGhvbl92ZXJzaW9uEggKBjMuMTEu
|
||||
NUouCghjcmV3X2tleRIiCiBlM2ZkYTBmMzExMGZlODBiMTg5NDdjMDE0NzE0MzBhNEoxCgdjcmV3
|
||||
X2lkEiYKJGJiZmUxOTdkLWFmM2QtNDUwYy04NzlhLWNkOWQyODQyYzk5NUoeCgxjcmV3X3Byb2Nl
|
||||
c3MSDgoMaGllcmFyY2hpY2FsShEKC2NyZXdfbWVtb3J5EgIQAEoaChRjcmV3X251bWJlcl9vZl90
|
||||
YXNrcxICGAFKGwoVY3Jld19udW1iZXJfb2ZfYWdlbnRzEgIYAkrFCQoLY3Jld19hZ2VudHMStQkK
|
||||
sglbeyJrZXkiOiAiOGJkMjEzOWI1OTc1MTgxNTA2ZTQxZmQ5YzQ1NjNkNzUiLCAiaWQiOiAiNzhi
|
||||
YzU3YjgtYjBlOC00NmU3LTg5MjMtNDZkMmUxZDgyNDA3IiwgInJvbGUiOiAiUmVzZWFyY2hlciIs
|
||||
ICJnb2FsIjogIk1ha2UgdGhlIGJlc3QgcmVzZWFyY2ggYW5kIGFuYWx5c2lzIG9uIGNvbnRlbnQg
|
||||
YWJvdXQgQUkgYW5kIEFJIGFnZW50cyIsICJiYWNrc3RvcnkiOiAiWW91J3JlIGFuIGV4cGVydCBy
|
||||
ZXNlYXJjaGVyLCBzcGVjaWFsaXplZCBpbiB0ZWNobm9sb2d5LCBzb2Z0d2FyZSBlbmdpbmVlcmlu
|
||||
ZywgQUkgYW5kIHN0YXJ0dXBzLiBZb3Ugd29yayBhcyBhIGZyZWVsYW5jZXIgYW5kIGlzIG5vdyB3
|
||||
b3JraW5nIG9uIGRvaW5nIHJlc2VhcmNoIGFuZCBhbmFseXNpcyBmb3IgYSBuZXcgY3VzdG9tZXIu
|
||||
IiwgInZlcmJvc2U/IjogZmFsc2UsICJtYXhfaXRlciI6IDI1LCAibWF4X3JwbSI6IG51bGwsICJp
|
||||
MThuIjogbnVsbCwgImxsbSI6ICJ7XCJuYW1lXCI6IG51bGwsIFwibW9kZWxfbmFtZVwiOiBcImdw
|
||||
dC00b1wiLCBcInRlbXBlcmF0dXJlXCI6IDAuNywgXCJjbGFzc1wiOiBcIkNoYXRPcGVuQUlcIn0i
|
||||
LCAiZGVsZWdhdGlvbl9lbmFibGVkPyI6IGZhbHNlLCAidG9vbHNfbmFtZXMiOiBbXX0sIHsia2V5
|
||||
IjogIjlhNTAxNWVmNDg5NWRjNjI3OGQ1NDgxOGJhNDQ2YWY3IiwgImlkIjogIjk2YzIxOTk1LWFl
|
||||
ZDItNGJjNC1iYjY4LTdiOTA3ZTYzYjQyNSIsICJyb2xlIjogIlNlbmlvciBXcml0ZXIiLCAiZ29h
|
||||
bCI6ICJXcml0ZSB0aGUgYmVzdCBjb250ZW50IGFib3V0IEFJIGFuZCBBSSBhZ2VudHMuIiwgImJh
|
||||
Y2tzdG9yeSI6ICJZb3UncmUgYSBzZW5pb3Igd3JpdGVyLCBzcGVjaWFsaXplZCBpbiB0ZWNobm9s
|
||||
b2d5LCBzb2Z0d2FyZSBlbmdpbmVlcmluZywgQUkgYW5kIHN0YXJ0dXBzLiBZb3Ugd29yayBhcyBh
|
||||
IGZyZWVsYW5jZXIgYW5kIGFyZSBub3cgd29ya2luZyBvbiB3cml0aW5nIGNvbnRlbnQgZm9yIGEg
|
||||
bmV3IGN1c3RvbWVyLiIsICJ2ZXJib3NlPyI6IGZhbHNlLCAibWF4X2l0ZXIiOiAyNSwgIm1heF9y
|
||||
cG0iOiBudWxsLCAiaTE4biI6IG51bGwsICJsbG0iOiAie1wibmFtZVwiOiBudWxsLCBcIm1vZGVs
|
||||
X25hbWVcIjogXCJncHQtNG9cIiwgXCJ0ZW1wZXJhdHVyZVwiOiAwLjcsIFwiY2xhc3NcIjogXCJD
|
||||
aGF0T3BlbkFJXCJ9IiwgImRlbGVnYXRpb25fZW5hYmxlZD8iOiBmYWxzZSwgInRvb2xzX25hbWVz
|
||||
IjogW119XUq/BAoKY3Jld190YXNrcxKwBAqtBFt7ImtleSI6ICI1ZmE2NWMwNmE5ZTMxZjJjNjk1
|
||||
NDMyNjY4YWNkNjJkZCIsICJpZCI6ICJjNTFlMzZjZC1iMzc3LTRiMjMtOTY1ZC0xMjE4ZjAxN2Rl
|
||||
NTkiLCAiZGVzY3JpcHRpb24iOiAiQ29tZSB1cCB3aXRoIGEgbGlzdCBvZiA1IGludGVyZXN0aW5n
|
||||
IGlkZWFzIHRvIGV4cGxvcmUgZm9yIGFuIGFydGljbGUsIHRoZW4gd3JpdGUgb25lIGFtYXppbmcg
|
||||
cGFyYWdyYXBoIGhpZ2hsaWdodCBmb3IgZWFjaCBpZGVhIHRoYXQgc2hvd2Nhc2VzIGhvdyBnb29k
|
||||
IGFuIGFydGljbGUgYWJvdXQgdGhpcyB0b3BpYyBjb3VsZCBiZS4gUmV0dXJuIHRoZSBsaXN0IG9m
|
||||
IGlkZWFzIHdpdGggdGhlaXIgcGFyYWdyYXBoIGFuZCB5b3VyIG5vdGVzLiIsICJleHBlY3RlZF9v
|
||||
dXRwdXQiOiAiNSBidWxsZXQgcG9pbnRzIHdpdGggYSBwYXJhZ3JhcGggZm9yIGVhY2ggaWRlYS4i
|
||||
LCAiYXN5bmNfZXhlY3V0aW9uPyI6IGZhbHNlLCAiaHVtYW5faW5wdXQ/IjogZmFsc2UsICJhZ2Vu
|
||||
dF9yb2xlIjogIk5vbmUiLCAiYWdlbnRfa2V5IjogbnVsbCwgImNvbnRleHQiOiBudWxsLCAidG9v
|
||||
bHNfbmFtZXMiOiBbXX1dSioKCHBsYXRmb3JtEh4KHG1hY09TLTE0LjEuMS1hcm02NC1hcm0tNjRi
|
||||
aXRKHAoQcGxhdGZvcm1fcmVsZWFzZRIICgYyMy4xLjBKGwoPcGxhdGZvcm1fc3lzdGVtEggKBkRh
|
||||
cndpbkp7ChBwbGF0Zm9ybV92ZXJzaW9uEmcKZURhcndpbiBLZXJuZWwgVmVyc2lvbiAyMy4xLjA6
|
||||
IE1vbiBPY3QgIDkgMjE6Mjc6MjQgUERUIDIwMjM7IHJvb3Q6eG51LTEwMDAyLjQxLjl+Ni9SRUxF
|
||||
QVNFX0FSTTY0X1Q2MDAwSgoKBGNwdXMSAhgKegIYAYUBAAEAABLJEgoQC4ERVCPi0URQcsRKT+/y
|
||||
GxIInSinsu8yhJsqDENyZXcgQ3JlYXRlZDABOcAcwM61xeIXQaiRws61xeIXShoKDmNyZXdhaV92
|
||||
ZXJzaW9uEggKBjAuMzYuMEoaCg5weXRob25fdmVyc2lvbhIICgYzLjExLjVKLgoIY3Jld19rZXkS
|
||||
IgogZTNmZGEwZjMxMTBmZTgwYjE4OTQ3YzAxNDcxNDMwYTRKMQoHY3Jld19pZBImCiQxMTUxYTA2
|
||||
Zi02YjdhLTQzZGYtOWZjNi05MzJmYTYzY2RhZjRKHgoMY3Jld19wcm9jZXNzEg4KDGhpZXJhcmNo
|
||||
aWNhbEoRCgtjcmV3X21lbW9yeRICEABKGgoUY3Jld19udW1iZXJfb2ZfdGFza3MSAhgBShsKFWNy
|
||||
ZXdfbnVtYmVyX29mX2FnZW50cxICGAJKxQkKC2NyZXdfYWdlbnRzErUJCrIJW3sia2V5IjogIjhi
|
||||
ZDIxMzliNTk3NTE4MTUwNmU0MWZkOWM0NTYzZDc1IiwgImlkIjogIjc4YmM1N2I4LWIwZTgtNDZl
|
||||
Ny04OTIzLTQ2ZDJlMWQ4MjQwNyIsICJyb2xlIjogIlJlc2VhcmNoZXIiLCAiZ29hbCI6ICJNYWtl
|
||||
IHRoZSBiZXN0IHJlc2VhcmNoIGFuZCBhbmFseXNpcyBvbiBjb250ZW50IGFib3V0IEFJIGFuZCBB
|
||||
SSBhZ2VudHMiLCAiYmFja3N0b3J5IjogIllvdSdyZSBhbiBleHBlcnQgcmVzZWFyY2hlciwgc3Bl
|
||||
Y2lhbGl6ZWQgaW4gdGVjaG5vbG9neSwgc29mdHdhcmUgZW5naW5lZXJpbmcsIEFJIGFuZCBzdGFy
|
||||
dHVwcy4gWW91IHdvcmsgYXMgYSBmcmVlbGFuY2VyIGFuZCBpcyBub3cgd29ya2luZyBvbiBkb2lu
|
||||
ZyByZXNlYXJjaCBhbmQgYW5hbHlzaXMgZm9yIGEgbmV3IGN1c3RvbWVyLiIsICJ2ZXJib3NlPyI6
|
||||
IGZhbHNlLCAibWF4X2l0ZXIiOiAyNSwgIm1heF9ycG0iOiBudWxsLCAiaTE4biI6IG51bGwsICJs
|
||||
bG0iOiAie1wibmFtZVwiOiBudWxsLCBcIm1vZGVsX25hbWVcIjogXCJncHQtNG9cIiwgXCJ0ZW1w
|
||||
ZXJhdHVyZVwiOiAwLjcsIFwiY2xhc3NcIjogXCJDaGF0T3BlbkFJXCJ9IiwgImRlbGVnYXRpb25f
|
||||
ZW5hYmxlZD8iOiBmYWxzZSwgInRvb2xzX25hbWVzIjogW119LCB7ImtleSI6ICI5YTUwMTVlZjQ4
|
||||
OTVkYzYyNzhkNTQ4MThiYTQ0NmFmNyIsICJpZCI6ICI5NmMyMTk5NS1hZWQyLTRiYzQtYmI2OC03
|
||||
YjkwN2U2M2I0MjUiLCAicm9sZSI6ICJTZW5pb3IgV3JpdGVyIiwgImdvYWwiOiAiV3JpdGUgdGhl
|
||||
IGJlc3QgY29udGVudCBhYm91dCBBSSBhbmQgQUkgYWdlbnRzLiIsICJiYWNrc3RvcnkiOiAiWW91
|
||||
J3JlIGEgc2VuaW9yIHdyaXRlciwgc3BlY2lhbGl6ZWQgaW4gdGVjaG5vbG9neSwgc29mdHdhcmUg
|
||||
ZW5naW5lZXJpbmcsIEFJIGFuZCBzdGFydHVwcy4gWW91IHdvcmsgYXMgYSBmcmVlbGFuY2VyIGFu
|
||||
ZCBhcmUgbm93IHdvcmtpbmcgb24gd3JpdGluZyBjb250ZW50IGZvciBhIG5ldyBjdXN0b21lci4i
|
||||
LCAidmVyYm9zZT8iOiBmYWxzZSwgIm1heF9pdGVyIjogMjUsICJtYXhfcnBtIjogbnVsbCwgImkx
|
||||
OG4iOiBudWxsLCAibGxtIjogIntcIm5hbWVcIjogbnVsbCwgXCJtb2RlbF9uYW1lXCI6IFwiZ3B0
|
||||
LTRvXCIsIFwidGVtcGVyYXR1cmVcIjogMC43LCBcImNsYXNzXCI6IFwiQ2hhdE9wZW5BSVwifSIs
|
||||
ICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogZmFsc2UsICJ0b29sc19uYW1lcyI6IFtdfV1KvwQKCmNy
|
||||
ZXdfdGFza3MSsAQKrQRbeyJrZXkiOiAiNWZhNjVjMDZhOWUzMWYyYzY5NTQzMjY2OGFjZDYyZGQi
|
||||
LCAiaWQiOiAiMDIzOGY1NWUtZTMwYy00MDIwLWE0ZmQtNWY2ZmUzNjE1NjA4IiwgImRlc2NyaXB0
|
||||
aW9uIjogIkNvbWUgdXAgd2l0aCBhIGxpc3Qgb2YgNSBpbnRlcmVzdGluZyBpZGVhcyB0byBleHBs
|
||||
b3JlIGZvciBhbiBhcnRpY2xlLCB0aGVuIHdyaXRlIG9uZSBhbWF6aW5nIHBhcmFncmFwaCBoaWdo
|
||||
bGlnaHQgZm9yIGVhY2ggaWRlYSB0aGF0IHNob3djYXNlcyBob3cgZ29vZCBhbiBhcnRpY2xlIGFi
|
||||
b3V0IHRoaXMgdG9waWMgY291bGQgYmUuIFJldHVybiB0aGUgbGlzdCBvZiBpZGVhcyB3aXRoIHRo
|
||||
ZWlyIHBhcmFncmFwaCBhbmQgeW91ciBub3Rlcy4iLCAiZXhwZWN0ZWRfb3V0cHV0IjogIjUgYnVs
|
||||
bGV0IHBvaW50cyB3aXRoIGEgcGFyYWdyYXBoIGZvciBlYWNoIGlkZWEuIiwgImFzeW5jX2V4ZWN1
|
||||
dGlvbj8iOiBmYWxzZSwgImh1bWFuX2lucHV0PyI6IGZhbHNlLCAiYWdlbnRfcm9sZSI6ICJOb25l
|
||||
IiwgImFnZW50X2tleSI6IG51bGwsICJjb250ZXh0IjogbnVsbCwgInRvb2xzX25hbWVzIjogW119
|
||||
XUoqCghwbGF0Zm9ybRIeChxtYWNPUy0xNC4xLjEtYXJtNjQtYXJtLTY0Yml0ShwKEHBsYXRmb3Jt
|
||||
X3JlbGVhc2USCAoGMjMuMS4wShsKD3BsYXRmb3JtX3N5c3RlbRIICgZEYXJ3aW5KewoQcGxhdGZv
|
||||
cm1fdmVyc2lvbhJnCmVEYXJ3aW4gS2VybmVsIFZlcnNpb24gMjMuMS4wOiBNb24gT2N0ICA5IDIx
|
||||
OjI3OjI0IFBEVCAyMDIzOyByb290OnhudS0xMDAwMi40MS45fjYvUkVMRUFTRV9BUk02NF9UNjAw
|
||||
MEoKCgRjcHVzEgIYCnoCGAGFAQABAAASwRUKEK+2FWN+LUsj3/10yDBx5MMSCPaor0csAVI6KgxD
|
||||
cmV3IENyZWF0ZWQwATlwEcXPtcXiF0FQscfPtcXiF0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjM2
|
||||
LjBKGgoOcHl0aG9uX3ZlcnNpb24SCAoGMy4xMS41Si4KCGNyZXdfa2V5EiIKIGQzODQ2YzlkMjc2
|
||||
ZThlNmU0M2UzMWY2MTc2MzU3YjRmSjEKB2NyZXdfaWQSJgokYzE0MGJlNTgtOTJjZS00YzQyLWJk
|
||||
YjQtOThjNTlmMGUzYzAxShwKDGNyZXdfcHJvY2VzcxIMCgpzZXF1ZW50aWFsShEKC2NyZXdfbWVt
|
||||
b3J5EgIQAEoaChRjcmV3X251bWJlcl9vZl90YXNrcxICGAJKGwoVY3Jld19udW1iZXJfb2ZfYWdl
|
||||
bnRzEgIYAkrFCQoLY3Jld19hZ2VudHMStQkKsglbeyJrZXkiOiAiOGJkMjEzOWI1OTc1MTgxNTA2
|
||||
ZTQxZmQ5YzQ1NjNkNzUiLCAiaWQiOiAiNzhiYzU3YjgtYjBlOC00NmU3LTg5MjMtNDZkMmUxZDgy
|
||||
NDA3IiwgInJvbGUiOiAiUmVzZWFyY2hlciIsICJnb2FsIjogIk1ha2UgdGhlIGJlc3QgcmVzZWFy
|
||||
Y2ggYW5kIGFuYWx5c2lzIG9uIGNvbnRlbnQgYWJvdXQgQUkgYW5kIEFJIGFnZW50cyIsICJiYWNr
|
||||
c3RvcnkiOiAiWW91J3JlIGFuIGV4cGVydCByZXNlYXJjaGVyLCBzcGVjaWFsaXplZCBpbiB0ZWNo
|
||||
bm9sb2d5LCBzb2Z0d2FyZSBlbmdpbmVlcmluZywgQUkgYW5kIHN0YXJ0dXBzLiBZb3Ugd29yayBh
|
||||
cyBhIGZyZWVsYW5jZXIgYW5kIGlzIG5vdyB3b3JraW5nIG9uIGRvaW5nIHJlc2VhcmNoIGFuZCBh
|
||||
bmFseXNpcyBmb3IgYSBuZXcgY3VzdG9tZXIuIiwgInZlcmJvc2U/IjogZmFsc2UsICJtYXhfaXRl
|
||||
ciI6IDI1LCAibWF4X3JwbSI6IG51bGwsICJpMThuIjogbnVsbCwgImxsbSI6ICJ7XCJuYW1lXCI6
|
||||
IG51bGwsIFwibW9kZWxfbmFtZVwiOiBcImdwdC00b1wiLCBcInRlbXBlcmF0dXJlXCI6IDAuNywg
|
||||
XCJjbGFzc1wiOiBcIkNoYXRPcGVuQUlcIn0iLCAiZGVsZWdhdGlvbl9lbmFibGVkPyI6IGZhbHNl
|
||||
LCAidG9vbHNfbmFtZXMiOiBbXX0sIHsia2V5IjogIjlhNTAxNWVmNDg5NWRjNjI3OGQ1NDgxOGJh
|
||||
NDQ2YWY3IiwgImlkIjogIjk2YzIxOTk1LWFlZDItNGJjNC1iYjY4LTdiOTA3ZTYzYjQyNSIsICJy
|
||||
b2xlIjogIlNlbmlvciBXcml0ZXIiLCAiZ29hbCI6ICJXcml0ZSB0aGUgYmVzdCBjb250ZW50IGFi
|
||||
b3V0IEFJIGFuZCBBSSBhZ2VudHMuIiwgImJhY2tzdG9yeSI6ICJZb3UncmUgYSBzZW5pb3Igd3Jp
|
||||
dGVyLCBzcGVjaWFsaXplZCBpbiB0ZWNobm9sb2d5LCBzb2Z0d2FyZSBlbmdpbmVlcmluZywgQUkg
|
||||
YW5kIHN0YXJ0dXBzLiBZb3Ugd29yayBhcyBhIGZyZWVsYW5jZXIgYW5kIGFyZSBub3cgd29ya2lu
|
||||
ZyBvbiB3cml0aW5nIGNvbnRlbnQgZm9yIGEgbmV3IGN1c3RvbWVyLiIsICJ2ZXJib3NlPyI6IGZh
|
||||
bHNlLCAibWF4X2l0ZXIiOiAyNSwgIm1heF9ycG0iOiBudWxsLCAiaTE4biI6IG51bGwsICJsbG0i
|
||||
OiAie1wibmFtZVwiOiBudWxsLCBcIm1vZGVsX25hbWVcIjogXCJncHQtNG9cIiwgXCJ0ZW1wZXJh
|
||||
dHVyZVwiOiAwLjcsIFwiY2xhc3NcIjogXCJDaGF0T3BlbkFJXCJ9IiwgImRlbGVnYXRpb25fZW5h
|
||||
YmxlZD8iOiBmYWxzZSwgInRvb2xzX25hbWVzIjogW119XUq5BwoKY3Jld190YXNrcxKqBwqnB1t7
|
||||
ImtleSI6ICJlOWU2YjcyYWFjMzI2NDU5ZGQ3MDY4ZjBiMTcxN2MxYyIsICJpZCI6ICI5NWE1NzQz
|
||||
Yy1lMjRmLTQ4ODItYmQ3Mi02ZmZlZjViYTMzNTYiLCAiZGVzY3JpcHRpb24iOiAiR2VuZXJhdGUg
|
||||
YSBsaXN0IG9mIDUgaW50ZXJlc3RpbmcgaWRlYXMgdG8gZXhwbG9yZSBmb3IgYW4gYXJ0aWNsZSwg
|
||||
d2hlcmUgZWFjaCBidWxsZXRwb2ludCBpcyB1bmRlciAxNSB3b3Jkcy4iLCAiZXhwZWN0ZWRfb3V0
|
||||
cHV0IjogIkJ1bGxldCBwb2ludCBsaXN0IG9mIDUgaW1wb3J0YW50IGV2ZW50cy4gTm8gYWRkaXRp
|
||||
b25hbCBjb21tZW50YXJ5LiIsICJhc3luY19leGVjdXRpb24/IjogZmFsc2UsICJodW1hbl9pbnB1
|
||||
dD8iOiBmYWxzZSwgImFnZW50X3JvbGUiOiAiUmVzZWFyY2hlciIsICJhZ2VudF9rZXkiOiAiOGJk
|
||||
MjEzOWI1OTc1MTgxNTA2ZTQxZmQ5YzQ1NjNkNzUiLCAiY29udGV4dCI6IG51bGwsICJ0b29sc19u
|
||||
YW1lcyI6IFtdfSwgeyJrZXkiOiAiZWVlZTdlNzNkNWRmNjZkNDhkMmQ4MDdiYWZmODc0ZjMiLCAi
|
||||
aWQiOiAiNzA4MGY4M2YtNmQ1Zi00MzM0LTgwZTctODI3NzEzNmQ4ZDdkIiwgImRlc2NyaXB0aW9u
|
||||
IjogIldyaXRlIGEgc2VudGVuY2UgYWJvdXQgdGhlIGV2ZW50cyIsICJleHBlY3RlZF9vdXRwdXQi
|
||||
OiAiQSBzZW50ZW5jZSBhYm91dCB0aGUgZXZlbnRzIiwgImFzeW5jX2V4ZWN1dGlvbj8iOiBmYWxz
|
||||
ZSwgImh1bWFuX2lucHV0PyI6IGZhbHNlLCAiYWdlbnRfcm9sZSI6ICJTZW5pb3IgV3JpdGVyIiwg
|
||||
ImFnZW50X2tleSI6ICI5YTUwMTVlZjQ4OTVkYzYyNzhkNTQ4MThiYTQ0NmFmNyIsICJjb250ZXh0
|
||||
IjogWyJHZW5lcmF0ZSBhIGxpc3Qgb2YgNSBpbnRlcmVzdGluZyBpZGVhcyB0byBleHBsb3JlIGZv
|
||||
ciBhbiBhcnRpY2xlLCB3aGVyZSBlYWNoIGJ1bGxldHBvaW50IGlzIHVuZGVyIDE1IHdvcmRzLiJd
|
||||
LCAidG9vbHNfbmFtZXMiOiBbXX1dSioKCHBsYXRmb3JtEh4KHG1hY09TLTE0LjEuMS1hcm02NC1h
|
||||
cm0tNjRiaXRKHAoQcGxhdGZvcm1fcmVsZWFzZRIICgYyMy4xLjBKGwoPcGxhdGZvcm1fc3lzdGVt
|
||||
EggKBkRhcndpbkp7ChBwbGF0Zm9ybV92ZXJzaW9uEmcKZURhcndpbiBLZXJuZWwgVmVyc2lvbiAy
|
||||
My4xLjA6IE1vbiBPY3QgIDkgMjE6Mjc6MjQgUERUIDIwMjM7IHJvb3Q6eG51LTEwMDAyLjQxLjl+
|
||||
Ni9SRUxFQVNFX0FSTTY0X1Q2MDAwSgoKBGNwdXMSAhgKegIYAYUBAAEAABKxCwoQrufF6FYNBLX1
|
||||
X9l1YHzEsBIIg+LU+V/XPr0qDENyZXcgQ3JlYXRlZDABOYhBV9K1xeIXQcB/WdK1xeIXShoKDmNy
|
||||
ZXdhaV92ZXJzaW9uEggKBjAuMzYuMEoaCg5weXRob25fdmVyc2lvbhIICgYzLjExLjVKLgoIY3Jl
|
||||
d19rZXkSIgogNjczOGFkNWI4Y2IzZTZmMWMxYzkzNTBiOTZjMmU2NzhKMQoHY3Jld19pZBImCiQw
|
||||
ZDFjYjk0Yy1kMjgyLTQwYWUtYWVlYy0yN2JjNzc1NGZmOThKHAoMY3Jld19wcm9jZXNzEgwKCnNl
|
||||
cXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdfbnVtYmVyX29mX3Rhc2tzEgIYAUob
|
||||
ChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgBSuEDCgtjcmV3X2FnZW50cxLRAwrOA1t7ImtleSI6
|
||||
ICI1MTJhNmRjMzc5ZjY2YjIxZWVhYjI0ZTYzNDgzNmY3MiIsICJpZCI6ICJjOTMwNzU2OS05OTRl
|
||||
LTRkMTItYWI5OS01NzlhYjZhNDMyMDkiLCAicm9sZSI6ICJDb250ZW50IFdyaXRlciIsICJnb2Fs
|
||||
IjogIldyaXRlIGVuZ2FnaW5nIGNvbnRlbnQgb24gdmFyaW91cyB0b3BpY3MuIiwgImJhY2tzdG9y
|
||||
eSI6ICJZb3UgaGF2ZSBhIGJhY2tncm91bmQgaW4gam91cm5hbGlzbSBhbmQgY3JlYXRpdmUgd3Jp
|
||||
dGluZy4iLCAidmVyYm9zZT8iOiBmYWxzZSwgIm1heF9pdGVyIjogMjUsICJtYXhfcnBtIjogbnVs
|
||||
bCwgImkxOG4iOiBudWxsLCAibGxtIjogIntcIm5hbWVcIjogbnVsbCwgXCJtb2RlbF9uYW1lXCI6
|
||||
IFwiZ3B0LTRvXCIsIFwidGVtcGVyYXR1cmVcIjogMC43LCBcImNsYXNzXCI6IFwiQ2hhdE9wZW5B
|
||||
SVwifSIsICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogdHJ1ZSwgInRvb2xzX25hbWVzIjogW119XUqN
|
||||
AwoKY3Jld190YXNrcxL+Agr7Alt7ImtleSI6ICIzNDc3MDc2YmUzYWY3MTMwNDYyZWRhYTJlYjhh
|
||||
MDQ4ZSIsICJpZCI6ICI4NDMxNmNhOS05ZDNmLTRkZjctODQ4OC0zMDBmNDMxYWE1MDgiLCAiZGVz
|
||||
Y3JpcHRpb24iOiAiV3JpdGUgYSBkZXRhaWxlZCBhcnRpY2xlIGFib3V0IEFJIGluIGhlYWx0aGNh
|
||||
cmUuIiwgImV4cGVjdGVkX291dHB1dCI6ICJBIDEgcGFyYWdyYXBoIGFydGljbGUgYWJvdXQgQUku
|
||||
IiwgImFzeW5jX2V4ZWN1dGlvbj8iOiBmYWxzZSwgImh1bWFuX2lucHV0PyI6IGZhbHNlLCAiYWdl
|
||||
bnRfcm9sZSI6ICJDb250ZW50IFdyaXRlciIsICJhZ2VudF9rZXkiOiAiNTEyYTZkYzM3OWY2NmIy
|
||||
MWVlYWIyNGU2MzQ4MzZmNzIiLCAiY29udGV4dCI6IG51bGwsICJ0b29sc19uYW1lcyI6IFtdfV1K
|
||||
KgoIcGxhdGZvcm0SHgocbWFjT1MtMTQuMS4xLWFybTY0LWFybS02NGJpdEocChBwbGF0Zm9ybV9y
|
||||
ZWxlYXNlEggKBjIzLjEuMEobCg9wbGF0Zm9ybV9zeXN0ZW0SCAoGRGFyd2luSnsKEHBsYXRmb3Jt
|
||||
X3ZlcnNpb24SZwplRGFyd2luIEtlcm5lbCBWZXJzaW9uIDIzLjEuMDogTW9uIE9jdCAgOSAyMToy
|
||||
NzoyNCBQRFQgMjAyMzsgcm9vdDp4bnUtMTAwMDIuNDEuOX42L1JFTEVBU0VfQVJNNjRfVDYwMDBK
|
||||
CgoEY3B1cxICGAp6AhgBhQEAAQAA
|
||||
    headers:
      Accept:
      - '*/*'
      Accept-Encoding:
      - gzip, deflate, br
      Connection:
      - keep-alive
      Content-Length:
      - '23733'
      Content-Type:
      - application/x-protobuf
      User-Agent:
      - OTel-OTLP-Exporter-Python/1.25.0
    method: POST
    uri: https://telemetry.crewai.com:4319/v1/traces
  response:
    body:
      string: "\n\0"
    headers:
      Content-Length:
      - '2'
      Content-Type:
      - application/x-protobuf
      Date:
      - Tue, 16 Jul 2024 18:43:12 GMT
    status:
      code: 200
      message: OK
version: 1
@@ -20,7 +20,7 @@ interactions:
      accept:
      - application/json
      accept-encoding:
      - gzip, deflate, br
      - gzip, deflate
      connection:
      - keep-alive
      content-length:
@@ -44,80 +44,103 @@ interactions:
      x-stainless-runtime:
      - CPython
      x-stainless-runtime-version:
      - 3.11.9
      - 3.11.7
    method: POST
    uri: https://api.openai.com/v1/chat/completions
  response:
    body:
string: 'data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
string: 'data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"I"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
I"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
need"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
to"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
decide"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
choose"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
on"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
an"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
appropriate"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
greeting"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":".\n\n"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
to"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"Action"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
make"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
everyone"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
feel"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
welcome"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":".\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"Action"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
Decide"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
Greetings"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"Action"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"Action"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
Input"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
{}\n"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{"content":"
|
||||
{}"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWRAEA0akLHaVsdYQP1dYZ73QJC","object":"chat.completion.chunk","created":1720137083,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
data: {"id":"chatcmpl-9kFn5wBmr3LxGG62NXc5GVlHvdhYu","object":"chat.completion.chunk","created":1720810815,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_298125635f","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
|
||||
|
||||
data: [DONE]
|
||||
@@ -128,20 +151,20 @@ interactions:
      CF-Cache-Status:
      - DYNAMIC
      CF-RAY:
      - 89e305e3c8e382f5-GIG
      - 8a23466b68c6da17-MIA
      Connection:
      - keep-alive
      Content-Type:
      - text/event-stream; charset=utf-8
      Date:
      - Thu, 04 Jul 2024 23:51:24 GMT
      - Fri, 12 Jul 2024 19:00:15 GMT
      Server:
      - cloudflare
      Set-Cookie:
- __cf_bm=y7BtDW9RWNaYoBExulKsMw50ppqr1itieWbcStDWqVc-1720137084-1.0.1.1-EYCEQ9jOimP45.FgXjdzWftUrV1HHm49W4wbcxFhbrj2DVC1LnMbz9.l.c._AqBRgFAE3xVolosvjmoFDAMPYQ;
|
||||
path=/; expires=Fri, 05-Jul-24 00:21:24 GMT; domain=.api.openai.com; HttpOnly;
|
||||
- __cf_bm=5Q4fR7Axcb9pOI3.xD7MckzP6nvjl7aUGw1NJkxLXiY-1720810815-1.0.1.1-nBCqeg7gu9__SL5Jstz.kPKlKOisvW1iKhTWpyXfY.wyoYUCxPPcaOkFtXjn4wZuibClBrGVEsFDAAuB1Jwj1g;
|
||||
path=/; expires=Fri, 12-Jul-24 19:30:15 GMT; domain=.api.openai.com; HttpOnly;
|
||||
Secure; SameSite=None
|
||||
- _cfuvid=pZBoWQ1_gTeUh2oe6ta.S2mxWtdaHvAtn6m2HszLdwk-1720137084219-0.0.1.1-604800000;
|
||||
- _cfuvid=VtZy6_mT_xZheA53qWxlfPk4NhcrOjGYsGlKoh66LSw-1720810815947-0.0.1.1-604800000;
|
||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
||||
      Transfer-Encoding:
      - chunked
@@ -150,7 +173,7 @@ interactions:
      openai-organization:
      - crewai-iuxna1
      openai-processing-ms:
      - '335'
      - '537'
      openai-version:
      - '2020-10-01'
      strict-transport-security:
@@ -158,17 +181,17 @@ interactions:
      x-ratelimit-limit-requests:
      - '10000'
      x-ratelimit-limit-tokens:
      - '16000000'
      - '22000000'
      x-ratelimit-remaining-requests:
      - '9999'
      x-ratelimit-remaining-tokens:
      - '15999700'
      - '21999701'
      x-ratelimit-reset-requests:
      - 6ms
      x-ratelimit-reset-tokens:
      - 1ms
      - 0s
      x-request-id:
      - req_b3f7e3c47df2641d6bef704ef3ae8a0f
      - req_1b582ffd4f0b3972542c2dff29ded287
    status:
      code: 200
      message: OK
@@ -187,24 +210,24 @@ interactions:
      greeting.\n\nThis is the expect criteria for your final answer: The greeting.
      \n you MUST return the actual complete content as the final answer, not a summary.\n\nBegin!
      This is VERY important to you, use the tools available and give your best Final
      Answer, your job depends on it!\n\nThought:\nI need to decide on an appropriate
      greeting.\n\nAction: Decide Greetings\nAction Input: {}\n\nObservation: Howdy!\n",
      "role": "user"}], "model": "gpt-4o", "n": 1, "stop": ["\nObservation"], "stream":
      true, "temperature": 0.7}'
      Answer, your job depends on it!\n\nThought:\nThought: I need to choose an appropriate
      greeting to make everyone feel welcome.\nAction: Decide Greetings\nAction Input:
      {}\nObservation: Howdy!\n", "role": "user"}], "model": "gpt-4o", "n": 1, "stop":
      ["\nObservation"], "stream": true, "temperature": 0.7}'
    headers:
      accept:
      - application/json
      accept-encoding:
      - gzip, deflate, br
      - gzip, deflate
      connection:
      - keep-alive
      content-length:
      - '1404'
      - '1436'
      content-type:
      - application/json
      cookie:
- __cf_bm=y7BtDW9RWNaYoBExulKsMw50ppqr1itieWbcStDWqVc-1720137084-1.0.1.1-EYCEQ9jOimP45.FgXjdzWftUrV1HHm49W4wbcxFhbrj2DVC1LnMbz9.l.c._AqBRgFAE3xVolosvjmoFDAMPYQ;
|
||||
_cfuvid=pZBoWQ1_gTeUh2oe6ta.S2mxWtdaHvAtn6m2HszLdwk-1720137084219-0.0.1.1-604800000
|
||||
- __cf_bm=5Q4fR7Axcb9pOI3.xD7MckzP6nvjl7aUGw1NJkxLXiY-1720810815-1.0.1.1-nBCqeg7gu9__SL5Jstz.kPKlKOisvW1iKhTWpyXfY.wyoYUCxPPcaOkFtXjn4wZuibClBrGVEsFDAAuB1Jwj1g;
|
||||
_cfuvid=VtZy6_mT_xZheA53qWxlfPk4NhcrOjGYsGlKoh66LSw-1720810815947-0.0.1.1-604800000
|
||||
      host:
      - api.openai.com
      user-agent:
@@ -222,68 +245,68 @@ interactions:
      x-stainless-runtime:
      - CPython
      x-stainless-runtime-version:
      - 3.11.9
      - 3.11.7
    method: POST
    uri: https://api.openai.com/v1/chat/completions
  response:
    body:
string: 'data: {"id":"chatcmpl-9hQWSD5B35ANI9JLmbxUdPECfNd43","object":"chat.completion.chunk","created":1720137084,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
string: 'data: {"id":"chatcmpl-9kFn6xVncEzHOGdnzGN9R8fMONMEi","object":"chat.completion.chunk","created":1720810816,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_dd932ca5d1","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWSD5B35ANI9JLmbxUdPECfNd43","object":"chat.completion.chunk","created":1720137084,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9kFn6xVncEzHOGdnzGN9R8fMONMEi","object":"chat.completion.chunk","created":1720810816,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_dd932ca5d1","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWSD5B35ANI9JLmbxUdPECfNd43","object":"chat.completion.chunk","created":1720137084,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9kFn6xVncEzHOGdnzGN9R8fMONMEi","object":"chat.completion.chunk","created":1720810816,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_dd932ca5d1","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWSD5B35ANI9JLmbxUdPECfNd43","object":"chat.completion.chunk","created":1720137084,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9kFn6xVncEzHOGdnzGN9R8fMONMEi","object":"chat.completion.chunk","created":1720810816,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_dd932ca5d1","choices":[{"index":0,"delta":{"content":"
|
||||
I"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWSD5B35ANI9JLmbxUdPECfNd43","object":"chat.completion.chunk","created":1720137084,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9kFn6xVncEzHOGdnzGN9R8fMONMEi","object":"chat.completion.chunk","created":1720810816,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_dd932ca5d1","choices":[{"index":0,"delta":{"content":"
|
||||
now"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWSD5B35ANI9JLmbxUdPECfNd43","object":"chat.completion.chunk","created":1720137084,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9kFn6xVncEzHOGdnzGN9R8fMONMEi","object":"chat.completion.chunk","created":1720810816,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_dd932ca5d1","choices":[{"index":0,"delta":{"content":"
|
||||
know"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWSD5B35ANI9JLmbxUdPECfNd43","object":"chat.completion.chunk","created":1720137084,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9kFn6xVncEzHOGdnzGN9R8fMONMEi","object":"chat.completion.chunk","created":1720810816,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_dd932ca5d1","choices":[{"index":0,"delta":{"content":"
|
||||
the"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWSD5B35ANI9JLmbxUdPECfNd43","object":"chat.completion.chunk","created":1720137084,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9kFn6xVncEzHOGdnzGN9R8fMONMEi","object":"chat.completion.chunk","created":1720810816,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_dd932ca5d1","choices":[{"index":0,"delta":{"content":"
|
||||
final"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWSD5B35ANI9JLmbxUdPECfNd43","object":"chat.completion.chunk","created":1720137084,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9kFn6xVncEzHOGdnzGN9R8fMONMEi","object":"chat.completion.chunk","created":1720810816,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_dd932ca5d1","choices":[{"index":0,"delta":{"content":"
|
||||
answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWSD5B35ANI9JLmbxUdPECfNd43","object":"chat.completion.chunk","created":1720137084,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":".\n\n"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9kFn6xVncEzHOGdnzGN9R8fMONMEi","object":"chat.completion.chunk","created":1720810816,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_dd932ca5d1","choices":[{"index":0,"delta":{"content":".\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWSD5B35ANI9JLmbxUdPECfNd43","object":"chat.completion.chunk","created":1720137084,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9kFn6xVncEzHOGdnzGN9R8fMONMEi","object":"chat.completion.chunk","created":1720810816,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_dd932ca5d1","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWSD5B35ANI9JLmbxUdPECfNd43","object":"chat.completion.chunk","created":1720137084,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9kFn6xVncEzHOGdnzGN9R8fMONMEi","object":"chat.completion.chunk","created":1720810816,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_dd932ca5d1","choices":[{"index":0,"delta":{"content":"
|
||||
Answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWSD5B35ANI9JLmbxUdPECfNd43","object":"chat.completion.chunk","created":1720137084,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9kFn6xVncEzHOGdnzGN9R8fMONMEi","object":"chat.completion.chunk","created":1720810816,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_dd932ca5d1","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWSD5B35ANI9JLmbxUdPECfNd43","object":"chat.completion.chunk","created":1720137084,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9kFn6xVncEzHOGdnzGN9R8fMONMEi","object":"chat.completion.chunk","created":1720810816,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_dd932ca5d1","choices":[{"index":0,"delta":{"content":"
|
||||
How"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWSD5B35ANI9JLmbxUdPECfNd43","object":"chat.completion.chunk","created":1720137084,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"dy"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9kFn6xVncEzHOGdnzGN9R8fMONMEi","object":"chat.completion.chunk","created":1720810816,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_dd932ca5d1","choices":[{"index":0,"delta":{"content":"dy"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWSD5B35ANI9JLmbxUdPECfNd43","object":"chat.completion.chunk","created":1720137084,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"!"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9kFn6xVncEzHOGdnzGN9R8fMONMEi","object":"chat.completion.chunk","created":1720810816,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_dd932ca5d1","choices":[{"index":0,"delta":{"content":"!"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hQWSD5B35ANI9JLmbxUdPECfNd43","object":"chat.completion.chunk","created":1720137084,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
data: {"id":"chatcmpl-9kFn6xVncEzHOGdnzGN9R8fMONMEi","object":"chat.completion.chunk","created":1720810816,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_dd932ca5d1","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
|
||||
|
||||
data: [DONE]
|
||||
@@ -294,13 +317,13 @@ interactions:
      CF-Cache-Status:
      - DYNAMIC
      CF-RAY:
      - 89e305ea4abc82f5-GIG
      - 8a234672c9fbda17-MIA
      Connection:
      - keep-alive
      Content-Type:
      - text/event-stream; charset=utf-8
      Date:
      - Thu, 04 Jul 2024 23:51:24 GMT
      - Fri, 12 Jul 2024 19:00:16 GMT
      Server:
      - cloudflare
      Transfer-Encoding:
@@ -310,7 +333,7 @@ interactions:
      openai-organization:
      - crewai-iuxna1
      openai-processing-ms:
      - '91'
      - '122'
      openai-version:
      - '2020-10-01'
      strict-transport-security:
@@ -318,17 +341,17 @@ interactions:
      x-ratelimit-limit-requests:
      - '10000'
      x-ratelimit-limit-tokens:
      - '16000000'
      - '22000000'
      x-ratelimit-remaining-requests:
      - '9999'
      x-ratelimit-remaining-tokens:
      - '15999673'
      - '21999666'
      x-ratelimit-reset-requests:
      - 6ms
      x-ratelimit-reset-tokens:
      - 1ms
      - 0s
      x-request-id:
      - req_10032db16fa190e8435947a6aaa700ff
      - req_484dde23b57ef3f715ab183690fe73b9
    status:
      code: 200
      message: OK

@@ -1,13 +1,13 @@
"""Test Agent creation and execution basic functionality."""

import hashlib
import json
from concurrent.futures import Future
from unittest import mock
from unittest.mock import patch
from unittest.mock import MagicMock, patch

import pydantic_core
import pytest

from crewai.agent import Agent
from crewai.agents.cache import CacheHandler
from crewai.crew import Crew
@@ -15,8 +15,11 @@ from crewai.crews.crew_output import CrewOutput
from crewai.memory.contextual.contextual_memory import ContextualMemory
from crewai.process import Process
from crewai.task import Task
from crewai.tasks.conditional_task import ConditionalTask
from crewai.tasks.output_format import OutputFormat
from crewai.tasks.task_output import TaskOutput
from crewai.utilities import Logger, RPMController
from crewai.utilities.task_output_storage_handler import TaskOutputStorageHandler

ceo = Agent(
    role="CEO",
@@ -136,7 +139,6 @@ def test_async_task_cannot_include_sequential_async_tasks_in_context():


def test_context_no_future_tasks():

    task2 = Task(
        description="Task 2",
        expected_output="output",
@@ -584,7 +586,6 @@ def test_sequential_async_task_execution_completion():

@pytest.mark.vcr(filter_headers=["authorization"])
def test_single_task_with_async_execution():

    researcher_agent = Agent(
        role="Researcher",
        goal="Make the best research and analysis on content about AI and AI agents",
@@ -740,7 +741,6 @@ def test_async_task_execution_call_count():
    ) as mock_execute_sync, patch.object(
        Task, "execute_async", return_value=mock_future
    ) as mock_execute_async:

        crew.kickoff()

        assert mock_execute_async.call_count == 2
@@ -917,9 +917,7 @@ async def test_kickoff_async_basic_functionality_and_output():
@pytest.mark.vcr(filter_headers=["authorization"])
@pytest.mark.asyncio
async def test_async_kickoff_for_each_async_basic_functionality_and_output():
    """Tests the basic functionality and output of akickoff_for_each_async."""
    from unittest.mock import patch

    """Tests the basic functionality and output of kickoff_for_each_async."""
    inputs = [
        {"topic": "dog"},
        {"topic": "cat"},
@@ -945,8 +943,13 @@ async def test_async_kickoff_for_each_async_basic_functionality_and_output():
        agent=agent,
    )

    async def mock_kickoff_async(**kwargs):
        input_data = kwargs.get("inputs")
        index = [input_["topic"] for input_ in inputs].index(input_data["topic"])
        return expected_outputs[index]

    with patch.object(
        Crew, "kickoff_async", side_effect=expected_outputs
        Crew, "kickoff_async", side_effect=mock_kickoff_async
    ) as mock_kickoff_async:
        crew = Crew(agents=[agent], tasks=[task])

@@ -1148,7 +1151,6 @@ def test_code_execution_flag_adds_code_tool_upon_kickoff():

@pytest.mark.vcr(filter_headers=["authorization"])
def test_delegation_is_not_enabled_if_there_are_only_one_agent():

    researcher = Agent(
        role="Researcher",
        goal="Make the best research and analysis on content about AI and AI agents",
@@ -1238,10 +1240,10 @@ def test_agent_usage_metrics_are_captured_for_hierarchical_process():
    print(crew.usage_metrics)

    assert crew.usage_metrics == {
        "total_tokens": 2217,
        "prompt_tokens": 1847,
        "completion_tokens": 370,
        "successful_requests": 4,
        "total_tokens": 311,
        "prompt_tokens": 224,
        "completion_tokens": 87,
        "successful_requests": 1,
    }


@@ -1380,7 +1382,6 @@ def test_crew_does_not_interpolate_without_inputs():
        interpolate_task_inputs.assert_not_called()


# TODO: Ask @joao if we want to start throwing errors if inputs are not provided
# def test_crew_partial_inputs():
#     agent = Agent(
#         role="{topic} Researcher",
@@ -1404,7 +1405,6 @@ def test_crew_does_not_interpolate_without_inputs():
#     assert crew.agents[0].backstory == "You have a lot of experience with AI."


# TODO: If we do want ot throw errors if we are missing inputs. Add in this test.
# def test_crew_invalid_inputs():
#     agent = Agent(
#         role="{topic} Researcher",
@@ -1806,3 +1806,617 @@ def test__setup_for_training():

    for agent in agents:
        assert agent.allow_delegation is False

@pytest.mark.vcr(filter_headers=["authorization"])
|
||||
def test_replay_feature():
|
||||
list_ideas = Task(
|
||||
description="Generate a list of 5 interesting ideas to explore for an article, where each bulletpoint is under 15 words.",
|
||||
expected_output="Bullet point list of 5 important events. No additional commentary.",
|
||||
agent=researcher,
|
||||
)
|
||||
write = Task(
|
||||
description="Write a sentence about the events",
|
||||
expected_output="A sentence about the events",
|
||||
agent=writer,
|
||||
context=[list_ideas],
|
||||
)
|
||||
|
||||
crew = Crew(
|
||||
agents=[researcher, writer],
|
||||
tasks=[list_ideas, write],
|
||||
process=Process.sequential,
|
||||
)
|
||||
|
||||
with patch.object(Task, "execute_sync") as mock_execute_task:
|
||||
mock_execute_task.return_value = TaskOutput(
|
||||
description="Mock description",
|
||||
raw="Mocked output for list of ideas",
|
||||
agent="Researcher",
|
||||
json_dict=None,
|
||||
output_format=OutputFormat.RAW,
|
||||
pydantic=None,
|
||||
summary="Mocked output for list of ideas",
|
||||
)
|
||||
|
||||
crew.kickoff()
|
||||
crew.replay_from_task(str(write.id))
|
||||
# Ensure context was passed correctly
|
||||
assert mock_execute_task.call_count == 3
|
||||
|
||||
|
||||
@pytest.mark.vcr(filter_headers=["authorization"])
|
||||
def test_crew_replay_from_task_error():
|
||||
task = Task(
|
||||
description="Come up with a list of 5 interesting ideas to explore for an article",
|
||||
expected_output="5 bullet points with a paragraph for each idea.",
|
||||
agent=researcher,
|
||||
)
|
||||
|
||||
crew = Crew(
|
||||
agents=[researcher, writer],
|
||||
tasks=[task],
|
||||
)
|
||||
|
||||
with pytest.raises(TypeError) as e:
|
||||
crew.replay_from_task() # type: ignore purposefully throwing err
|
||||
assert "task_id is required" in str(e)
|
||||
|
||||
|
||||
@pytest.mark.vcr(filter_headers=["authorization"])
|
||||
def test_crew_task_db_init():
|
||||
agent = Agent(
|
||||
role="Content Writer",
|
||||
goal="Write engaging content on various topics.",
|
||||
backstory="You have a background in journalism and creative writing.",
|
||||
)
|
||||
|
||||
task = Task(
|
||||
description="Write a detailed article about AI in healthcare.",
|
||||
expected_output="A 1 paragraph article about AI.",
|
||||
agent=agent,
|
||||
)
|
||||
|
||||
crew = Crew(agents=[agent], tasks=[task])
|
||||
|
||||
with patch.object(Task, "execute_sync") as mock_execute_task:
|
||||
mock_execute_task.return_value = TaskOutput(
|
||||
description="Write about AI in healthcare.",
|
||||
raw="Artificial Intelligence (AI) is revolutionizing healthcare by enhancing diagnostic accuracy, personalizing treatment plans, and streamlining administrative tasks.",
|
||||
agent="Content Writer",
|
||||
json_dict=None,
|
||||
output_format=OutputFormat.RAW,
|
||||
pydantic=None,
|
||||
summary="Write about AI in healthcare...",
|
||||
)
|
||||
|
||||
crew.kickoff()
|
||||
|
||||
# Check if this runs without raising an exception
|
||||
try:
|
||||
db_handler = TaskOutputStorageHandler()
|
||||
db_handler.load()
|
||||
assert True # If we reach this point, no exception was raised
|
||||
except Exception as e:
|
||||
pytest.fail(f"An exception was raised: {str(e)}")
|
||||
|
||||
|
||||
@pytest.mark.vcr(filter_headers=["authorization"])
|
||||
def test_replay_task_with_context():
|
||||
agent1 = Agent(
|
||||
role="Researcher",
|
||||
goal="Research AI advancements.",
|
||||
backstory="You are an expert in AI research.",
|
||||
)
|
||||
agent2 = Agent(
|
||||
role="Writer",
|
||||
goal="Write detailed articles on AI.",
|
||||
backstory="You have a background in journalism and AI.",
|
||||
)
|
||||
|
||||
task1 = Task(
|
||||
description="Research the latest advancements in AI.",
|
||||
expected_output="A detailed report on AI advancements.",
|
||||
agent=agent1,
|
||||
)
|
||||
task2 = Task(
|
||||
description="Summarize the AI advancements report.",
|
||||
expected_output="A summary of the AI advancements report.",
|
||||
agent=agent2,
|
||||
)
|
||||
task3 = Task(
|
||||
description="Write an article based on the AI advancements summary.",
|
||||
expected_output="An article on AI advancements.",
|
||||
agent=agent2,
|
||||
)
|
||||
task4 = Task(
|
||||
description="Create a presentation based on the AI advancements article.",
|
||||
expected_output="A presentation on AI advancements.",
|
||||
agent=agent2,
|
||||
context=[task1],
|
||||
)
|
||||
|
||||
crew = Crew(
|
||||
agents=[agent1, agent2],
|
||||
tasks=[task1, task2, task3, task4],
|
||||
process=Process.sequential,
|
||||
)
|
||||
|
||||
mock_task_output1 = TaskOutput(
|
||||
description="Research the latest advancements in AI.",
|
||||
raw="Detailed report on AI advancements...",
|
||||
agent="Researcher",
|
||||
json_dict=None,
|
||||
output_format=OutputFormat.RAW,
|
||||
pydantic=None,
|
||||
summary="Detailed report on AI advancements...",
|
||||
)
|
||||
mock_task_output2 = TaskOutput(
|
||||
description="Summarize the AI advancements report.",
|
||||
raw="Summary of the AI advancements report...",
|
||||
agent="Writer",
|
||||
json_dict=None,
|
||||
output_format=OutputFormat.RAW,
|
||||
pydantic=None,
|
||||
summary="Summary of the AI advancements report...",
|
||||
)
|
||||
mock_task_output3 = TaskOutput(
|
||||
description="Write an article based on the AI advancements summary.",
|
||||
raw="Article on AI advancements...",
|
||||
agent="Writer",
|
||||
json_dict=None,
|
||||
output_format=OutputFormat.RAW,
|
||||
pydantic=None,
|
||||
summary="Article on AI advancements...",
|
||||
)
|
||||
mock_task_output4 = TaskOutput(
|
||||
description="Create a presentation based on the AI advancements article.",
|
||||
raw="Presentation on AI advancements...",
|
||||
agent="Writer",
|
||||
json_dict=None,
|
||||
output_format=OutputFormat.RAW,
|
||||
pydantic=None,
|
||||
summary="Presentation on AI advancements...",
|
||||
)
|
||||
|
||||
with patch.object(Task, "execute_sync") as mock_execute_task:
|
||||
mock_execute_task.side_effect = [
|
||||
mock_task_output1,
|
||||
mock_task_output2,
|
||||
mock_task_output3,
|
||||
mock_task_output4,
|
||||
]
|
||||
|
||||
crew.kickoff()
|
||||
db_handler = TaskOutputStorageHandler()
|
||||
assert db_handler.load() != []
|
||||
|
||||
with patch.object(Task, "execute_sync") as mock_replay_task:
|
||||
mock_replay_task.return_value = mock_task_output4
|
||||
|
||||
replayed_output = crew.replay_from_task(str(task4.id))
|
||||
assert replayed_output.raw == "Presentation on AI advancements..."
|
||||
|
||||
db_handler.reset()
|
||||
|
||||
|
def test_replay_from_task_with_context():
    agent = Agent(role="test_agent", backstory="Test Description", goal="Test Goal")
    task1 = Task(
        description="Context Task", expected_output="Say Task Output", agent=agent
    )
    task2 = Task(
        description="Test Task", expected_output="Say Hi", agent=agent, context=[task1]
    )

    context_output = TaskOutput(
        description="Context Task Output",
        agent="test_agent",
        raw="context raw output",
        pydantic=None,
        json_dict={},
        output_format=OutputFormat.RAW,
    )
    task1.output = context_output

    crew = Crew(agents=[agent], tasks=[task1, task2], process=Process.sequential)

    with patch(
        "crewai.utilities.task_output_storage_handler.TaskOutputStorageHandler.load",
        return_value=[
            {
                "task_id": str(task1.id),
                "output": {
                    "description": context_output.description,
                    "summary": context_output.summary,
                    "raw": context_output.raw,
                    "pydantic": context_output.pydantic,
                    "json_dict": context_output.json_dict,
                    "output_format": context_output.output_format,
                    "agent": context_output.agent,
                },
                "inputs": {},
            },
            {
                "task_id": str(task2.id),
                "output": {
                    "description": "Test Task Output",
                    "summary": None,
                    "raw": "test raw output",
                    "pydantic": None,
                    "json_dict": {},
                    "output_format": "json",
                    "agent": "test_agent",
                },
                "inputs": {},
            },
        ],
    ):
        crew.replay_from_task(str(task2.id))

        assert crew.tasks[1].context[0].output.raw == "context raw output"


@pytest.mark.vcr(filter_headers=["authorization"])
def test_replay_with_invalid_task_id():
    agent = Agent(role="test_agent", backstory="Test Description", goal="Test Goal")
    task1 = Task(
        description="Context Task", expected_output="Say Task Output", agent=agent
    )
    task2 = Task(
        description="Test Task", expected_output="Say Hi", agent=agent, context=[task1]
    )

    context_output = TaskOutput(
        description="Context Task Output",
        agent="test_agent",
        raw="context raw output",
        pydantic=None,
        json_dict={},
        output_format=OutputFormat.RAW,
    )
    task1.output = context_output

    crew = Crew(agents=[agent], tasks=[task1, task2], process=Process.sequential)

    with patch(
        "crewai.utilities.task_output_storage_handler.TaskOutputStorageHandler.load",
        return_value=[
            {
                "task_id": str(task1.id),
                "output": {
                    "description": context_output.description,
                    "summary": context_output.summary,
                    "raw": context_output.raw,
                    "pydantic": context_output.pydantic,
                    "json_dict": context_output.json_dict,
                    "output_format": context_output.output_format,
                    "agent": context_output.agent,
                },
                "inputs": {},
            },
            {
                "task_id": str(task2.id),
                "output": {
                    "description": "Test Task Output",
                    "summary": None,
                    "raw": "test raw output",
                    "pydantic": None,
                    "json_dict": {},
                    "output_format": "json",
                    "agent": "test_agent",
                },
                "inputs": {},
            },
        ],
    ):
        with pytest.raises(
            ValueError,
            match="Task with id bf5b09c9-69bd-4eb8-be12-f9e5bae31c2d not found in the crew's tasks.",
        ):
            crew.replay_from_task("bf5b09c9-69bd-4eb8-be12-f9e5bae31c2d")


@patch.object(Crew, "_interpolate_inputs")
def test_replay_interpolates_inputs_properly(mock_interpolate_inputs):
    agent = Agent(role="test_agent", backstory="Test Description", goal="Test Goal")
    task1 = Task(description="Context Task", expected_output="Say {name}", agent=agent)
    task2 = Task(
        description="Test Task",
        expected_output="Say Hi to {name}",
        agent=agent,
        context=[task1],
    )

    context_output = TaskOutput(
        description="Context Task Output",
        agent="test_agent",
        raw="context raw output",
        pydantic=None,
        json_dict={},
        output_format=OutputFormat.RAW,
    )
    task1.output = context_output

    crew = Crew(agents=[agent], tasks=[task1, task2], process=Process.sequential)
    crew.kickoff(inputs={"name": "John"})

    with patch(
        "crewai.utilities.task_output_storage_handler.TaskOutputStorageHandler.load",
        return_value=[
            {
                "task_id": str(task1.id),
                "output": {
                    "description": context_output.description,
                    "summary": context_output.summary,
                    "raw": context_output.raw,
                    "pydantic": context_output.pydantic,
                    "json_dict": context_output.json_dict,
                    "output_format": context_output.output_format,
                    "agent": context_output.agent,
                },
                "inputs": {"name": "John"},
            },
            {
                "task_id": str(task2.id),
                "output": {
                    "description": "Test Task Output",
                    "summary": None,
                    "raw": "test raw output",
                    "pydantic": None,
                    "json_dict": {},
                    "output_format": "json",
                    "agent": "test_agent",
                },
                "inputs": {"name": "John"},
            },
        ],
    ):
        crew.replay_from_task(str(task2.id))
        assert crew._inputs == {"name": "John"}
        assert mock_interpolate_inputs.call_count == 2


@pytest.mark.vcr(filter_headers=["authorization"])
def test_replay_from_task_setup_context():
    agent = Agent(role="test_agent", backstory="Test Description", goal="Test Goal")
    task1 = Task(description="Context Task", expected_output="Say {name}", agent=agent)
    task2 = Task(
        description="Test Task",
        expected_output="Say Hi to {name}",
        agent=agent,
    )
    context_output = TaskOutput(
        description="Context Task Output",
        agent="test_agent",
        raw="context raw output",
        pydantic=None,
        json_dict={},
        output_format=OutputFormat.RAW,
    )
    task1.output = context_output
    crew = Crew(agents=[agent], tasks=[task1, task2], process=Process.sequential)
    with patch(
        "crewai.utilities.task_output_storage_handler.TaskOutputStorageHandler.load",
        return_value=[
            {
                "task_id": str(task1.id),
                "output": {
                    "description": context_output.description,
                    "summary": context_output.summary,
                    "raw": context_output.raw,
                    "pydantic": context_output.pydantic,
                    "json_dict": context_output.json_dict,
                    "output_format": context_output.output_format,
                    "agent": context_output.agent,
                },
                "inputs": {"name": "John"},
            },
            {
                "task_id": str(task2.id),
                "output": {
                    "description": "Test Task Output",
                    "summary": None,
                    "raw": "test raw output",
                    "pydantic": None,
                    "json_dict": {},
                    "output_format": "json",
                    "agent": "test_agent",
                },
                "inputs": {"name": "John"},
            },
        ],
    ):
        crew.replay_from_task(str(task2.id))

        # Check if the first task's output was set correctly
        assert crew.tasks[0].output is not None
        assert isinstance(crew.tasks[0].output, TaskOutput)
        assert crew.tasks[0].output.description == "Context Task Output"
        assert crew.tasks[0].output.agent == "test_agent"
        assert crew.tasks[0].output.raw == "context raw output"
        assert crew.tasks[0].output.output_format == OutputFormat.RAW

        assert crew.tasks[1].prompt_context == "context raw output"


def test_key():
    tasks = [
        Task(
            description="Give me a list of 5 interesting ideas to explore for na article, what makes them unique and interesting.",
            expected_output="Bullet point list of 5 important events.",
            agent=researcher,
        ),
        Task(
            description="Write a 1 amazing paragraph highlight for each idea that showcases how good an article about this topic could be. Return the list of ideas with their paragraph and your notes.",
            expected_output="A 4 paragraph article about AI.",
            agent=writer,
        ),
    ]
    crew = Crew(
        agents=[researcher, writer],
        process=Process.sequential,
        tasks=tasks,
    )
    hash = hashlib.md5(
        f"{researcher.key}|{writer.key}|{tasks[0].key}|{tasks[1].key}".encode()
    ).hexdigest()

    assert crew.key == hash


def test_conditional_task_requirement_breaks_when_singular_conditional_task():
    def condition_fn(output) -> bool:
        return output.raw.startswith("Andrew Ng has!!")

    task = ConditionalTask(
        description="Come up with a list of 5 interesting ideas to explore for an article, then write one amazing paragraph highlight for each idea that showcases how good an article about this topic could be. Return the list of ideas with their paragraph and your notes.",
        expected_output="5 bullet points with a paragraph for each idea.",
        condition=condition_fn,
    )

    with pytest.raises(pydantic_core._pydantic_core.ValidationError):
        Crew(
            agents=[researcher, writer],
            tasks=[task],
        )


@pytest.mark.vcr(filter_headers=["authorization"])
def test_conditional_task_last_task_when_conditional_is_true():
    def condition_fn(output) -> bool:
        return True

    task1 = Task(
        description="Say Hi",
        expected_output="Hi",
        agent=researcher,
    )
    task2 = ConditionalTask(
        description="Come up with a list of 5 interesting ideas to explore for an article, then write one amazing paragraph highlight for each idea that showcases how good an article about this topic could be. Return the list of ideas with their paragraph and your notes.",
        expected_output="5 bullet points with a paragraph for each idea.",
        condition=condition_fn,
        agent=writer,
    )

    crew = Crew(
        agents=[researcher, writer],
        tasks=[task1, task2],
    )
    result = crew.kickoff()
    assert result.raw.startswith(
        "1. **The Rise of AI Agents in Customer Service: Revolutionizing Customer Interactions**"
    )


@pytest.mark.vcr(filter_headers=["authorization"])
def test_conditional_task_last_task_when_conditional_is_false():
    def condition_fn(output) -> bool:
        return False

    task1 = Task(
        description="Say Hi",
        expected_output="Hi",
        agent=researcher,
    )
    task2 = ConditionalTask(
        description="Come up with a list of 5 interesting ideas to explore for an article, then write one amazing paragraph highlight for each idea that showcases how good an article about this topic could be. Return the list of ideas with their paragraph and your notes.",
        expected_output="5 bullet points with a paragraph for each idea.",
        condition=condition_fn,
        agent=writer,
    )

    crew = Crew(
        agents=[researcher, writer],
        tasks=[task1, task2],
    )
    result = crew.kickoff()
    print(result.raw)
    assert result.raw == "Hi"


def test_conditional_task_requirement_breaks_when_task_async():
    def my_condition(context):
        return context.get("some_value") > 10

    task = ConditionalTask(
        description="Come up with a list of 5 interesting ideas to explore for an article, then write one amazing paragraph highlight for each idea that showcases how good an article about this topic could be. Return the list of ideas with their paragraph and your notes.",
        expected_output="5 bullet points with a paragraph for each idea.",
        execute_async=True,
        condition=my_condition,
        agent=researcher,
    )
    task2 = Task(
        description="Say Hi",
        expected_output="Hi",
        agent=writer,
    )

    with pytest.raises(pydantic_core._pydantic_core.ValidationError):
        Crew(
            agents=[researcher, writer],
            tasks=[task, task2],
        )


@pytest.mark.vcr(filter_headers=["authorization"])
def test_conditional_should_skip():
    task1 = Task(description="Return hello", expected_output="say hi", agent=researcher)

    condition_mock = MagicMock(return_value=False)
    task2 = ConditionalTask(
        description="Come up with a list of 5 interesting ideas to explore for an article, then write one amazing paragraph highlight for each idea that showcases how good an article about this topic could be. Return the list of ideas with their paragraph and your notes.",
        expected_output="5 bullet points with a paragraph for each idea.",
        condition=condition_mock,
        agent=writer,
    )
    crew_met = Crew(
        agents=[researcher, writer],
        tasks=[task1, task2],
    )
    with patch.object(Task, "execute_sync") as mock_execute_sync:
        mock_execute_sync.return_value = TaskOutput(
            description="Task 1 description",
            raw="Task 1 output",
            agent="Researcher",
        )

        result = crew_met.kickoff()
        assert mock_execute_sync.call_count == 1

        assert condition_mock.call_count == 1
        assert condition_mock() is False

        assert task2.output is None
        assert result.raw.startswith("Task 1 output")


@pytest.mark.vcr(filter_headers=["authorization"])
def test_conditional_should_execute():
    task1 = Task(description="Return hello", expected_output="say hi", agent=researcher)

    condition_mock = MagicMock(
        return_value=True
    )  # should execute this conditional task
    task2 = ConditionalTask(
        description="Come up with a list of 5 interesting ideas to explore for an article, then write one amazing paragraph highlight for each idea that showcases how good an article about this topic could be. Return the list of ideas with their paragraph and your notes.",
        expected_output="5 bullet points with a paragraph for each idea.",
        condition=condition_mock,
        agent=writer,
    )
    crew_met = Crew(
        agents=[researcher, writer],
        tasks=[task1, task2],
    )
    with patch.object(Task, "execute_sync") as mock_execute_sync:
        mock_execute_sync.return_value = TaskOutput(
            description="Task 1 description",
            raw="Task 1 output",
            agent="Researcher",
        )

        crew_met.kickoff()

        assert condition_mock.call_count == 1
        assert condition_mock() is True
        assert mock_execute_sync.call_count == 2

@@ -1,15 +1,16 @@
"""Test Agent creation and execution basic functionality."""

import hashlib
import json
from unittest.mock import MagicMock, patch

import pytest
from pydantic import BaseModel
from pydantic_core import ValidationError

from crewai import Agent, Crew, Process, Task
from crewai.tasks.conditional_task import ConditionalTask
from crewai.tasks.task_output import TaskOutput
from crewai.utilities.converter import Converter
from pydantic import BaseModel
from pydantic_core import ValidationError


def test_task_tool_reflect_agent_tools():
@@ -81,7 +82,7 @@ def test_task_prompt_includes_expected_output():

    with patch.object(Agent, "execute_task") as execute:
        execute.return_value = "ok"
        task.execute_sync()
        task.execute_sync(agent=researcher)
        execute.assert_called_once_with(task=task, context=None, tools=[])


@@ -104,7 +105,7 @@ def test_task_callback():

    with patch.object(Agent, "execute_task") as execute:
        execute.return_value = "ok"
        task.execute_sync()
        task.execute_sync(agent=researcher)
        task_completed.assert_called_once_with(task.output)


@@ -129,7 +130,7 @@ def test_task_callback_returns_task_ouput():

    with patch.object(Agent, "execute_task") as execute:
        execute.return_value = "exported_ok"
        task.execute_sync()
        task.execute_sync(agent=researcher)
        # Ensure the callback is called with a TaskOutput object serialized to JSON
        task_completed.assert_called_once()
        callback_data = task_completed.call_args[0][0]
@@ -521,9 +522,7 @@ def test_save_task_json_output():
    with patch.object(Task, "_save_file") as save_file:
        save_file.return_value = None
        crew.kickoff()
        save_file.assert_called_once_with(
            {"score": 4}
        )  # TODO: @Joao, should this be a dict or a json string?
        save_file.assert_called_once_with({"score": 4})


@pytest.mark.vcr(filter_headers=["authorization"])
@@ -697,6 +696,19 @@ def test_task_definition_based_on_dict():
    assert task.agent is None


def test_conditional_task_definition_based_on_dict():
    config = {
        "description": "Give me an integer score between 1-5 for the following title: 'The impact of AI in the future of work', check examples to based your evaluation.",
        "expected_output": "The score of the title.",
    }

    task = ConditionalTask(config=config, condition=lambda x: True)

    assert task.description == config["description"]
    assert task.expected_output == config["expected_output"]
    assert task.agent is None


def test_interpolate_inputs():
    task = Task(
        description="Give me a list of 5 interesting ideas about {topic} to explore for an article, what makes them unique and interesting.",
@@ -793,3 +805,22 @@ def test_task_output_str_with_none():
    )

    assert str(task_output) == ""


def test_key():
    original_description = "Give me a list of 5 interesting ideas about {topic} to explore for an article, what makes them unique and interesting."
    original_expected_output = "Bullet point list of 5 interesting ideas about {topic}."
    task = Task(
        description=original_description,
        expected_output=original_expected_output,
    )
    hash = hashlib.md5(
        f"{original_description}|{original_expected_output}".encode()
    ).hexdigest()

    assert task.key == hash, "The key should be the hash of the description."

    task.interpolate_inputs(inputs={"topic": "AI"})
    assert (
        task.key == hash
    ), "The key should be the hash of the non-interpolated description."