Mirror of https://github.com/crewAIInc/crewAI.git, synced 2025-12-16 12:28:30 +00:00

Compare commits: v0.35.3...bugfix/lan (48 commits)
| Author | SHA1 | Date | |
|---|---|---|---|
| | 040e5a78d2 | | |
| | b93632a53a | | |
| | 09938641cd | | |
| | 7acf0b2107 | | |
| | 4eb4073661 | | |
| | 7b53457ef3 | | |
| | 691b094a40 | | |
| | 68e9e54c88 | | |
| | d0d99125c4 | | |
| | 129000d01f | | |
| | 47f9d026dd | | |
| | b75b0b5552 | | |
| | 3dd6249f1e | | |
| | 8451113039 | | |
| | a79b216875 | | |
| | 52217c2f63 | | |
| | 7edacf6e24 | | |
| | 58558a1950 | | |
| | 1607c85ae5 | | |
| | a6ff342948 | | |
| | d2eb54ebf8 | | |
| | a41bd18599 | | |
| | bb64c80964 | | |
| | 2fb56f1f9f | | |
| | 35676fe2f5 | | |
| | 81ed6f177e | | |
| | 4bcd1df6bb | | |
| | 6fae56dd60 | | |
| | 430f0e9013 | | |
| | d7f080a978 | | |
| | 5d18f73654 | | |
| | 57fc079267 | | |
| | 706f4cd74a | | |
| | 2e3646cc96 | | |
| | 844cc515d5 | | |
| | f47904134b | | |
| | d72b00af3c | | |
| | bd053a98c7 | | |
| | c18208ca59 | | |
| | acbe5af8ce | | |
| | c81146505a | | |
| | 6b9a1d4040 | | |
| | 508fbd49e9 | | |
| | e18a6c6bb8 | | |
| | 16237ef393 | | |
| | 5332d02f36 | | |
| | 7258120a0d | | |
| | 8b7bc69ba1 | | |
.github/workflows/tests.yml (vendored, 6 changes)
@@ -19,13 +19,13 @@ jobs:
      - name: Setup Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.10"
          python-version: "3.11.9"

      - name: Install Requirements
        run: |
          set -e
          pip install poetry
          poetry lock &&
          poetry install

      - name: Run tests
        run: poetry run pytest tests
        run: poetry run pytest
.gitignore (vendored, 3 changes)
@@ -13,4 +13,5 @@ db/
test.py
rc-tests/*
*.pkl
temp/*
temp/*
.vscode/*
README.md (40 changes)
@@ -197,46 +197,6 @@ Please refer to the [Connect crewAI to LLMs](https://docs.crewai.com/how-to/LLM-

**CrewAI's Advantage**: CrewAI is built with production in mind. It offers the flexibility of Autogen's conversational agents and the structured process approach of ChatDev, but without the rigidity. CrewAI's processes are designed to be dynamic and adaptable, fitting seamlessly into both development and production workflows.

## Training

The training feature in CrewAI allows you to train your AI agents using the command-line interface (CLI). By running the command `crewai train -n <n_iterations>`, you can specify the number of iterations for the training process.

During training, CrewAI utilizes techniques to optimize the performance of your agents along with human feedback. This helps the agents improve their understanding, decision-making, and problem-solving abilities.

To use the training feature, follow these steps:

1. Open your terminal or command prompt.
2. Navigate to the directory where your CrewAI project is located.
3. Run the following command:

```shell
crewai train -n <n_iterations>
```

Replace `<n_iterations>` with the desired number of training iterations. This determines how many times the agents will go through the training process.

Remember to also replace the placeholder inputs with the actual values you want to use on the main.py file in the `train` function.

```python
def train():
    """
    Train the crew for a given number of iterations.
    """
    inputs = {"topic": "AI LLMs"}
    try:
        ProjectCreationCrew().crew().train(n_iterations=int(sys.argv[1]), inputs=inputs)
    ...
```

It is important to note that the training process may take some time, depending on the complexity of your agents and will also require your feedback on each iteration.

Once the training is complete, your agents will be equipped with enhanced capabilities and knowledge, ready to tackle complex tasks and provide more consistent and valuable insights.

Remember to regularly update and retrain your agents to ensure they stay up-to-date with the latest information and advancements in the field.

Happy training with CrewAI!

## Contribution

CrewAI is open-source and we welcome contributions. If you're looking to contribute, please:
@@ -16,24 +16,24 @@ description: What are crewAI Agents and how to use them.

## Agent Attributes

| Attribute | Description |
| :--- | :--- |
| **Role** | Defines the agent's function within the crew. It determines the kind of tasks the agent is best suited for. |
| **Goal** | The individual objective that the agent aims to achieve. It guides the agent's decision-making process. |
| **Backstory** | Provides context to the agent's role and goal, enriching the interaction and collaboration dynamics. |
| **LLM** *(optional)* | Represents the language model that will run the agent. It dynamically fetches the model name from the `OPENAI_MODEL_NAME` environment variable, defaulting to "gpt-4" if not specified. |
| **Tools** *(optional)* | Set of capabilities or functions that the agent can use to perform tasks. Expected to be instances of custom classes compatible with the agent's execution environment. Tools are initialized with a default value of an empty list. |
| **Function Calling LLM** *(optional)* | Specifies the language model that will handle the tool calling for this agent, overriding the crew function calling LLM if passed. Default is `None`. |
| **Max Iter** *(optional)* | `max_iter` is the maximum number of iterations the agent can perform before being forced to give its best answer. Default is `25`. |
| **Max RPM** *(optional)* | `max_rpm` is Tte maximum number of requests per minute the agent can perform to avoid rate limits. It's optional and can be left unspecified, with a default value of `None`. |
| **Max Execution Time** *(optional)* | `max_execution_time` is the Maximum execution time for an agent to execute a task. It's optional and can be left unspecified, with a default value of `None`, meaning no max execution time. |
| **Verbose** *(optional)* | Setting this to `True` configures the internal logger to provide detailed execution logs, aiding in debugging and monitoring. Default is `False`. |
| **Allow Delegation** *(optional)* | Agents can delegate tasks or questions to one another, ensuring that each task is handled by the most suitable agent. Default is `True`. |
| **Step Callback** *(optional)* | A function that is called after each step of the agent. This can be used to log the agent's actions or to perform other operations. It will overwrite the crew `step_callback`. |
| **Cache** *(optional)* | Indicates if the agent should use a cache for tool usage. Default is `True`. |
| **System Template** *(optional)* | Specifies the system format for the agent. Default is `None`. |
| **Prompt Template** *(optional)* | Specifies the prompt format for the agent. Default is `None`. |
| **Response Template** *(optional)* | Specifies the response format for the agent. Default is `None`. |
| Attribute | Parameter | Description |
| :--- | :--- | :--- |
| **Role** | `role` | Defines the agent's function within the crew. It determines the kind of tasks the agent is best suited for. |
| **Goal** | `goal` | The individual objective that the agent aims to achieve. It guides the agent's decision-making process. |
| **Backstory** | `backstory` | Provides context to the agent's role and goal, enriching the interaction and collaboration dynamics. |
| **LLM** *(optional)* | `llm` | Represents the language model that will run the agent. It dynamically fetches the model name from the `OPENAI_MODEL_NAME` environment variable, defaulting to "gpt-4" if not specified. |
| **Tools** *(optional)* | `tools` | Set of capabilities or functions that the agent can use to perform tasks. Expected to be instances of custom classes compatible with the agent's execution environment. Tools are initialized with a default value of an empty list. |
| **Function Calling LLM** *(optional)* | `function_calling_llm` | Specifies the language model that will handle the tool calling for this agent, overriding the crew function calling LLM if passed. Default is `None`. |
| **Max Iter** *(optional)* | `max_iter` | Max Iter is the maximum number of iterations the agent can perform before being forced to give its best answer. Default is `25`. |
| **Max RPM** *(optional)* | `max_rpm` | Max RPM is the maximum number of requests per minute the agent can perform to avoid rate limits. It's optional and can be left unspecified, with a default value of `None`. |
| **Max Execution Time** *(optional)* | `max_execution_time` | Max Execution Time is the Maximum execution time for an agent to execute a task. It's optional and can be left unspecified, with a default value of `None`, meaning no max execution time. |
| **Verbose** *(optional)* | `verbose` | Setting this to `True` configures the internal logger to provide detailed execution logs, aiding in debugging and monitoring. Default is `False`. |
| **Allow Delegation** *(optional)* | `allow_delegation` | Agents can delegate tasks or questions to one another, ensuring that each task is handled by the most suitable agent. Default is `True`. |
| **Step Callback** *(optional)* | `step_callback` | A function that is called after each step of the agent. This can be used to log the agent's actions or to perform other operations. It will overwrite the crew `step_callback`. |
| **Cache** *(optional)* | `cache` | Indicates if the agent should use a cache for tool usage. Default is `True`. |
| **System Template** *(optional)* | `system_template` | Specifies the system format for the agent. Default is `None`. |
| **Prompt Template** *(optional)* | `prompt_template` | Specifies the prompt format for the agent. Default is `None`. |
| **Response Template** *(optional)* | `response_template` | Specifies the response format for the agent. Default is `None`. |

## Creating an Agent
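For orientation, a minimal agent built from a few of the parameters in the table above might look like the following sketch (the role, goal, and backstory strings are illustrative, not from the source):

```python
from crewai import Agent

researcher = Agent(
    role="Senior Researcher",  # required: the agent's function within the crew
    goal="Uncover the latest developments in AI agents",  # required: individual objective
    backstory="A meticulous analyst who enjoys digging into new frameworks.",
    verbose=True,            # detailed execution logs (default False)
    allow_delegation=True,   # may hand work to other agents (default True)
    max_iter=25,             # iteration cap before forcing a best answer
    cache=True,              # cache tool results (default True)
)
```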
@@ -8,29 +8,29 @@ A crew in crewAI represents a collaborative group of agents working together to

## Crew Attributes

| Attribute | Description |
| :--- | :--- |
| **Tasks** | A list of tasks assigned to the crew. |
| **Agents** | A list of agents that are part of the crew. |
| **Process** *(optional)* | The process flow (e.g., sequential, hierarchical) the crew follows. |
| **Verbose** *(optional)* | The verbosity level for logging during execution. |
| **Manager LLM** *(optional)*| The language model used by the manager agent in a hierarchical process. **Required when using a hierarchical process.** |
| **Function Calling LLM** *(optional)* | If passed, the crew will use this LLM to do function calling for tools for all agents in the crew. Each agent can have its own LLM, which overrides the crew's LLM for function calling. |
| **Config** *(optional)* | Optional configuration settings for the crew, in `Json` or `Dict[str, Any]` format. |
| **Max RPM** *(optional)* | Maximum requests per minute the crew adheres to during execution. |
| **Language** *(optional)* | Language used for the crew, defaults to English. |
| **Language File** *(optional)* | Path to the language file to be used for the crew. |
| **Memory** *(optional)* | Utilized for storing execution memories (short-term, long-term, entity memory). |
| **Cache** *(optional)* | Specifies whether to use a cache for storing the results of tools' execution. |
| **Embedder** *(optional)* | Configuration for the embedder to be used by the crew. Mostly used by memory for now. |
| **Full Output** *(optional)*| Whether the crew should return the full output with all tasks outputs or just the final output. |
| **Step Callback** *(optional)* | A function that is called after each step of every agent. This can be used to log the agent's actions or to perform other operations; it won't override the agent-specific `step_callback`. |
| **Task Callback** *(optional)* | A function that is called after the completion of each task. Useful for monitoring or additional operations post-task execution. |
| **Share Crew** *(optional)* | Whether you want to share the complete crew information and execution with the crewAI team to make the library better, and allow us to train models. |
| **Output Log File** *(optional)* | Whether you want to have a file with the complete crew output and execution. You can set it using True and it will default to the folder you are currently in and it will be called logs.txt or passing a string with the full path and name of the file. |
| **Manager Agent** *(optional)* | `manager` sets a custom agent that will be used as a manager. |
| **Manager Callbacks** *(optional)* | `manager_callbacks` takes a list of callback handlers to be executed by the manager agent when a hierarchical process is used. |
| **Prompt File** *(optional)* | Path to the prompt JSON file to be used for the crew. |
| Attribute | Parameters | Description |
| :--- | :--- | :--- |
| **Tasks** | `tasks` | A list of tasks assigned to the crew. |
| **Agents** | `agents` | A list of agents that are part of the crew. |
| **Process** *(optional)* | `process` | The process flow (e.g., sequential, hierarchical) the crew follows. |
| **Verbose** *(optional)* | `verbose` | The verbosity level for logging during execution. |
| **Manager LLM** *(optional)*| `manager_llm` | The language model used by the manager agent in a hierarchical process. **Required when using a hierarchical process.** |
| **Function Calling LLM** *(optional)* | `function_calling_llm` | If passed, the crew will use this LLM to do function calling for tools for all agents in the crew. Each agent can have its own LLM, which overrides the crew's LLM for function calling. |
| **Config** *(optional)* | `config` | Optional configuration settings for the crew, in `Json` or `Dict[str, Any]` format. |
| **Max RPM** *(optional)* | `max_rpm` | Maximum requests per minute the crew adheres to during execution. |
| **Language** *(optional)* | `language` | Language used for the crew, defaults to English. |
| **Language File** *(optional)* | `language_file` | Path to the language file to be used for the crew. |
| **Memory** *(optional)* | `memory` | Utilized for storing execution memories (short-term, long-term, entity memory). |
| **Cache** *(optional)* | `cache` | Specifies whether to use a cache for storing the results of tools' execution. |
| **Embedder** *(optional)* | `embedder` | Configuration for the embedder to be used by the crew. Mostly used by memory for now. |
| **Full Output** *(optional)*| `full_output` | Whether the crew should return the full output with all tasks outputs or just the final output. |
| **Step Callback** *(optional)* | `step_callback` | A function that is called after each step of every agent. This can be used to log the agent's actions or to perform other operations; it won't override the agent-specific `step_callback`. |
| **Task Callback** *(optional)* | `task_callback` | A function that is called after the completion of each task. Useful for monitoring or additional operations post-task execution. |
| **Share Crew** *(optional)* | `share_crew` | Whether you want to share the complete crew information and execution with the crewAI team to make the library better, and allow us to train models. |
| **Output Log File** *(optional)* | `output_log_file` | Whether you want to have a file with the complete crew output and execution. You can set it using True and it will default to the folder you are currently in and it will be called logs.txt or passing a string with the full path and name of the file. |
| **Manager Agent** *(optional)* | `manager_agent` | `manager` sets a custom agent that will be used as a manager. |
| **Manager Callbacks** *(optional)* | `manager_callbacks` | `manager_callbacks` takes a list of callback handlers to be executed by the manager agent when a hierarchical process is used. |
| **Prompt File** *(optional)* | `prompt_file` | Path to the prompt JSON file to be used for the crew. |

!!! note "Crew Max RPM"
    The `max_rpm` attribute sets the maximum number of requests per minute the crew can perform to avoid rate limits and will override individual agents' `max_rpm` settings if you set it.
@@ -123,7 +123,7 @@ result = my_crew.kickoff()
print(result)
```

### Different wayt to Kicking Off a Crew
### Different ways to Kicking Off a Crew

Once your crew is assembled, initiate the workflow with the appropriate kickoff method. CrewAI provides several methods for better control over the kickoff process: `kickoff()`, `kickoff_for_each()`, `kickoff_async()`, and `kickoff_for_each_async()`.
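As a quick illustration of the table and the kickoff methods just mentioned, a small crew wired from the parameters above could look like this sketch (the agent and task objects are assumed to be defined elsewhere):

```python
from crewai import Crew, Process

my_crew = Crew(
    agents=[researcher, writer],          # agents that are part of the crew
    tasks=[research_task, writing_task],  # tasks assigned to the crew
    process=Process.sequential,           # process flow the crew follows
    verbose=True,
    memory=True,                          # store short-term, long-term, and entity memories
    max_rpm=10,                           # crew-wide request cap; overrides agents' max_rpm
)

result = my_crew.kickoff()
print(result)
```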
@@ -12,7 +12,7 @@ description: Leveraging memory systems in the crewAI framework to enhance agent

| Component | Description |
| :--- | :--- |
| **Short-Term Memory**| Temporarily stores recent interactions and outcomes, enabling agents to recall and utilize information relevant to their current context during the current executions. |
| **Long-Term Memory** | Preserves valuable insights and learnings from past executions, allowing agents to build and refine their knowledge over time. So Agents can remeber what they did right and wrong across multiple executions |
| **Long-Term Memory** | Preserves valuable insights and learnings from past executions, allowing agents to build and refine their knowledge over time. So Agents can remember what they did right and wrong across multiple executions |
| **Entity Memory** | Captures and organizes information about entities (people, places, concepts) encountered during tasks, facilitating deeper understanding and relationship mapping. |
| **Contextual Memory**| Maintains the context of interactions by combining `ShortTermMemory`, `LongTermMemory`, and `EntityMemory`, aiding in the coherence and relevance of agent responses over a sequence of tasks or a conversation. |
@@ -11,20 +11,20 @@ Tasks within crewAI can be collaborative, requiring multiple agents to work toge

## Task Attributes

| Attribute | Description |
| :--- | :--- |
| **Description** | A clear, concise statement of what the task entails. |
| **Agent** | The agent responsible for the task, assigned either directly or by the crew's process. |
| **Expected Output** | A detailed description of what the task's completion looks like. |
| **Tools** *(optional)* | The functions or capabilities the agent can utilize to perform the task. |
| **Async Execution** *(optional)* | If set, the task executes asynchronously, allowing progression without waiting for completion.|
| **Context** *(optional)* | Specifies tasks whose outputs are used as context for this task. |
| **Config** *(optional)* | Additional configuration details for the agent executing the task, allowing further customization. |
| **Output JSON** *(optional)* | Outputs a JSON object, requiring an OpenAI client. Only one output format can be set. |
| **Output Pydantic** *(optional)* | Outputs a Pydantic model object, requiring an OpenAI client. Only one output format can be set. |
| **Output File** *(optional)* | Saves the task output to a file. If used with `Output JSON` or `Output Pydantic`, specifies how the output is saved. |
| **Callback** *(optional)* | A Python callable that is executed with the task's output upon completion. |
| **Human Input** *(optional)* | Indicates if the task requires human feedback at the end, useful for tasks needing human oversight. |
| Attribute | Parameters | Description |
| :--- | :--- | :--- |
| **Description** | `description` | A clear, concise statement of what the task entails. |
| **Agent** | `agent` | The agent responsible for the task, assigned either directly or by the crew's process. |
| **Expected Output** | `expected_output` | A detailed description of what the task's completion looks like. |
| **Tools** *(optional)* | `tools` | The functions or capabilities the agent can utilize to perform the task. |
| **Async Execution** *(optional)* | `async_execution` | If set, the task executes asynchronously, allowing progression without waiting for completion.|
| **Context** *(optional)* | `context` | Specifies tasks whose outputs are used as context for this task. |
| **Config** *(optional)* | `config` | Additional configuration details for the agent executing the task, allowing further customization. |
| **Output JSON** *(optional)* | `output_json` | Outputs a JSON object, requiring an OpenAI client. Only one output format can be set. |
| **Output Pydantic** *(optional)* | `output_pydantic` | Outputs a Pydantic model object, requiring an OpenAI client. Only one output format can be set. |
| **Output File** *(optional)* | `output_file` | Saves the task output to a file. If used with `Output JSON` or `Output Pydantic`, specifies how the output is saved. |
| **Callback** *(optional)* | `callback` | A Python callable that is executed with the task's output upon completion. |
| **Human Input** *(optional)* | `human_input` | Indicates if the task requires human feedback at the end, useful for tasks needing human oversight. |

## Creating a Task
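Tying the table to the heading above, a minimal task might be sketched like this (the `researcher` agent and `search_tool` are assumed to exist, and the output file name is illustrative):

```python
from crewai import Task

research_task = Task(
    description="Research the latest trends in AI agent frameworks.",
    expected_output="A bullet-point summary of the five most relevant trends.",
    agent=researcher,          # agent responsible for the task
    tools=[search_tool],       # optional: tools the agent may use
    human_input=False,         # optional: ask for human feedback at the end
    output_file="trends.md",   # optional: save the task output to a file
)
```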
@@ -8,6 +8,7 @@ The training feature in CrewAI allows you to train your AI agents using the comm

During training, CrewAI utilizes techniques to optimize the performance of your agents along with human feedback. This helps the agents improve their understanding, decision-making, and problem-solving abilities.

### Training Your Crew Using the CLI
To use the training feature, follow these steps:

1. Open your terminal or command prompt.
@@ -18,7 +19,26 @@ To use the training feature, follow these steps:
crewai train -n <n_iterations>
```

Replace `<n_iterations>` with the desired number of training iterations. This determines how many times the agents will go through the training process.
### Training Your Crew Programmatically
To train your crew programmatically, use the following steps:

1. Define the number of iterations for training.
2. Specify the input parameters for the training process.
3. Execute the training command within a try-except block to handle potential errors.

```python
n_iterations = 2
inputs = {"topic": "CrewAI Training"}

try:
    YourCrewName_Crew().crew().train(n_iterations=n_iterations, inputs=inputs)

except Exception as e:
    raise Exception(f"An error occurred while training the crew: {e}")
```

!!! note "Replace `<n_iterations>` with the desired number of training iterations. This determines how many times the agents will go through the training process."


### Key Points to Note:
- **Positive Integer Requirement:** Ensure that the number of iterations (`n_iterations`) is a positive integer. The code will raise a `ValueError` if this condition is not met.
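Mirroring the positive-integer requirement in the key point above, a minimal guard before calling `train` could look like this sketch:

```python
n_iterations = 2
if not isinstance(n_iterations, int) or n_iterations <= 0:
    # The training code rejects non-positive values with a ValueError.
    raise ValueError("n_iterations must be a positive integer")
```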
@@ -51,7 +51,7 @@ To optimize tool performance with caching, define custom caching strategies usin
@tool("Tool with Caching")
def cached_tool(argument: str) -> str:
    """Tool functionality description."""
    return "Cachable result"
    return "Cacheable result"

def my_cache_strategy(arguments: dict, result: str) -> bool:
    # Define custom caching logic
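The hunk ends inside `my_cache_strategy`; in crewAI the strategy is hooked up through the tool's `cache_function` attribute, so the wiring could be completed roughly as follows (treat the attribute name as an assumption if your version differs):

```python
def my_cache_strategy(arguments: dict, result: str) -> bool:
    # Cache only results worth reusing; here, anything non-empty.
    return bool(result)

# Attach the custom strategy to the decorated tool (assumed hook).
cached_tool.cache_function = my_cache_strategy
```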
@@ -79,5 +79,4 @@ manager = Agent(

1. `allow_code_execution`: Enable or disable code execution capabilities for the agent (default is False).
2. `max_execution_time`: Set a maximum execution time (in seconds) for the agent to complete a task.
3. `function_calling_llm`: Specify a separate language model for function calling.
4
3. `function_calling_llm`: Specify a separate language model for function calling.
docs/how-to/Force-Tool-Ouput-as-Result.md (new file, 31 lines)
@@ -0,0 +1,31 @@
---
title: Forcing Tool Output as Result
description: Learn how to force tool output as the result of an Agent's task in crewAI.
---

## Introduction
In CrewAI, you can force the output of a tool as the result of an agent's task. This feature is useful when you want to ensure that the tool output is captured and returned as the task result, and to prevent the agent from modifying the output during task execution.

## Forcing Tool Output as Result
To force the tool output as the result of an agent's task, you can set the `force_tool_output` parameter to `True` when creating the task. This parameter ensures that the tool output is captured and returned as the task result, without any modifications by the agent.

Here's an example of how to force the tool output as the result of an agent's task:

```python
# ...
# Define an agent whose tool returns its result as the final answer
coding_agent = Agent(
    role="Data Scientist",
    goal="Produce amazing reports on AI",
    backstory="You work with data and AI",
    tools=[MyCustomTool(result_as_answer=True)],
)
# ...
```

### Workflow in Action

1. **Task Execution**: The agent executes the task using the tool provided.
2. **Tool Output**: The tool generates the output, which is captured as the task result.
3. **Agent Interaction**: The agent may reflect on and take learnings from the tool, but the output is not modified.
4. **Result Return**: The tool output is returned as the task result without any modifications.
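The example above passes `result_as_answer=True` to a `MyCustomTool` that is not shown on this page; a minimal sketch of such a tool, assuming the `crewai_tools` `BaseTool` interface, might look like this:

```python
from crewai_tools import BaseTool

class MyCustomTool(BaseTool):
    name: str = "Report fetcher"
    description: str = "Fetches the raw report that should be returned verbatim."

    def _run(self) -> str:
        # With result_as_answer=True on the instance, whatever this returns
        # becomes the task result without further edits by the agent.
        return "Raw tool output"
```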
@@ -127,7 +127,7 @@ llm = HuggingFaceHub(
```

## OpenAI Compatible API Endpoints
Switch between APIs and models seamlessly using environment variables, supporting platforms like FastChat, LM Studio, and Mistral AI.
Switch between APIs and models seamlessly using environment variables, supporting platforms like FastChat, LM Studio, Groq, and Mistral AI.

### Configuration Examples
#### FastChat
@@ -144,6 +144,13 @@ OPENAI_API_BASE="http://localhost:1234/v1"
OPENAI_API_KEY="lm-studio"
```

#### Groq API
```sh
OPENAI_API_KEY=your-groq-api-key
OPENAI_MODEL_NAME='llama3-8b-8192'
OPENAI_API_BASE=https://api.groq.com/openai/v1
```
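If you prefer configuring the client in code rather than through environment variables, the same Groq endpoint can be set explicitly; a sketch assuming the `langchain_openai` client used elsewhere on this page:

```python
from langchain_openai import ChatOpenAI
from crewai import Agent

groq_llm = ChatOpenAI(
    model="llama3-8b-8192",
    base_url="https://api.groq.com/openai/v1",
    api_key="your-groq-api-key",  # placeholder, not a real key
)

groq_agent = Agent(
    role="Researcher",
    goal="Summarize recent AI news",
    backstory="An illustrative agent for this sketch.",
    llm=groq_llm,
)
```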
#### Mistral API
```sh
OPENAI_API_KEY=your-mistral-api-key
@@ -211,4 +218,4 @@ azure_agent = Agent(
```

## Conclusion
Integrating CrewAI with different LLMs expands the framework's versatility, allowing for customized, efficient AI solutions across various domains and platforms.
Integrating CrewAI with different LLMs expands the framework's versatility, allowing for customized, efficient AI solutions across various domains and platforms.
@@ -37,10 +37,9 @@ writer = Agent(
    backstory='A skilled writer with a talent for crafting compelling narratives'
)

# Define the tasks in sequence
research_task = Task(description='Gather relevant data...', agent=researcher)
analysis_task = Task(description='Analyze the data...', agent=analyst)
writing_task = Task(description='Compose the report...', agent=writer)
research_task = Task(description='Gather relevant data...', agent=researcher, expected_output='Raw Data')
analysis_task = Task(description='Analyze the data...', agent=analyst, expected_output='Data Insights')
writing_task = Task(description='Compose the report...', agent=writer, expected_output='Final Report')

# Form the crew with a sequential process
report_crew = Crew(
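The hunk cuts off right after `report_crew = Crew(`; under the assumptions of the snippet above, the crew formation typically continues along these lines:

```python
report_crew = Crew(
    agents=[researcher, analyst, writer],
    tasks=[research_task, analysis_task, writing_task],
    process=Process.sequential,  # run the tasks in the order listed (Process imported from crewai)
)

result = report_crew.kickoff()
```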
@@ -83,4 +82,4 @@ CrewAI tracks token usage across all tasks and agents. You can access these metr
1. **Order Matters**: Arrange tasks in a logical sequence where each task builds upon the previous one.
2. **Clear Task Descriptions**: Provide detailed descriptions for each task to guide the agents effectively.
3. **Appropriate Agent Selection**: Match agents' skills and roles to the requirements of each task.
4. **Use Context**: Leverage the context from previous tasks to inform subsequent ones
4. **Use Context**: Leverage the context from previous tasks to inform subsequent ones
docs/how-to/Start-a-New-CrewAI-Project.md (new file, 137 lines)
@@ -0,0 +1,137 @@
---
title: Starting a New CrewAI Project
description: A comprehensive guide to starting a new CrewAI project, including the latest updates and project setup methods.
---

# Starting Your CrewAI Project

Welcome to the ultimate guide for starting a new CrewAI project. This document will walk you through the steps to create, customize, and run your CrewAI project, ensuring you have everything you need to get started.

## Prerequisites

We assume you have already installed CrewAI. If not, please refer to the [installation guide](how-to/Installing-CrewAI.md) to install CrewAI and its dependencies.

## Creating a New Project

To create a new project, run the following CLI command:

```shell
$ crewai create my_project
```

This command will create a new project folder with the following structure:

```shell
my_project/
├── .gitignore
├── pyproject.toml
├── README.md
└── src/
    └── my_project/
        ├── __init__.py
        ├── main.py
        ├── crew.py
        ├── tools/
        │   ├── custom_tool.py
        │   └── __init__.py
        └── config/
            ├── agents.yaml
            └── tasks.yaml
```

You can now start developing your project by editing the files in the `src/my_project` folder. The `main.py` file is the entry point of your project, and the `crew.py` file is where you define your agents and tasks.

## Customizing Your Project

To customize your project, you can:
- Modify `src/my_project/config/agents.yaml` to define your agents.
- Modify `src/my_project/config/tasks.yaml` to define your tasks.
- Modify `src/my_project/crew.py` to add your own logic, tools, and specific arguments (a sketch of this file follows the YAML examples below).
- Modify `src/my_project/main.py` to add custom inputs for your agents and tasks.
- Add your environment variables into the `.env` file.

### Example: Defining Agents and Tasks

#### agents.yaml

```yaml
researcher:
  role: >
    Job Candidate Researcher
  goal: >
    Find potential candidates for the job
  backstory: >
    You are adept at finding the right candidates by exploring various online
    resources. Your skill in identifying suitable candidates ensures the best
    match for job positions.
```

#### tasks.yaml

```yaml
research_candidates_task:
  description: >
    Conduct thorough research to find potential candidates for the specified job.
    Utilize various online resources and databases to gather a comprehensive list of potential candidates.
    Ensure that the candidates meet the job requirements provided.

    Job Requirements:
    {job_requirements}
  expected_output: >
    A list of 10 potential candidates with their contact information and brief profiles highlighting their suitability.
```
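The YAML above only declares agents and tasks; `crew.py` is where they are wired together. A rough sketch of that file, assuming the `CrewBase` decorators used by the generated project template:

```python
# src/my_project/crew.py (sketch)
from crewai import Agent, Crew, Process, Task
from crewai.project import CrewBase, agent, crew, task


@CrewBase
class MyProjectCrew:
    """MyProject crew"""

    agents_config = "config/agents.yaml"
    tasks_config = "config/tasks.yaml"

    @agent
    def researcher(self) -> Agent:
        # Pulls the "researcher" block defined in agents.yaml above.
        return Agent(config=self.agents_config["researcher"], verbose=True)

    @task
    def research_candidates_task(self) -> Task:
        # Pulls the task block defined in tasks.yaml above.
        return Task(config=self.tasks_config["research_candidates_task"])

    @crew
    def crew(self) -> Crew:
        # The decorators collect the agents and tasks declared above.
        return Crew(agents=self.agents, tasks=self.tasks, process=Process.sequential)
```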
## Installing Dependencies

To install the dependencies for your project, you can use Poetry. First, navigate to your project directory:

```shell
$ cd my_project
$ poetry lock
$ poetry install
```

This will install the dependencies specified in the `pyproject.toml` file.

## Interpolating Variables

Any variable interpolated in your `agents.yaml` and `tasks.yaml` files like `{variable}` will be replaced by the value of the variable in the `main.py` file.

#### tasks.yaml

```yaml
research_task:
  description: >
    Conduct a thorough research about the customer and competitors in the context
    of {customer_domain}.
    Make sure you find any interesting and relevant information given the
    current year is 2024.
  expected_output: >
    A complete report on the customer and their customers and competitors,
    including their demographics, preferences, market positioning and audience engagement.
```

#### main.py

```python
# main.py
def run():
    inputs = {
        "customer_domain": "crewai.com"
    }
    MyProjectCrew(inputs).crew().kickoff(inputs=inputs)
```

## Running Your Project

To run your project, use the following command:

```shell
$ poetry run my_project
```

This will initialize your crew of AI agents and begin task execution as defined in your configuration in the `main.py` file.

## Deploying Your Project

The easiest way to deploy your crew is through [CrewAI+](https://www.crewai.com/crewaiplus), where you can deploy your crew in a few clicks.
@@ -48,6 +48,11 @@ Cutting-edge framework for orchestrating role-playing, autonomous AI agents. By
<div style="width:30%">
  <h2>How-To Guides</h2>
  <ul>
    <li>
      <a href="./how-to/Start-a-New-CrewAI-Project">
        Starting Your crewAI Project
      </a>
    </li>
    <li>
      <a href="./how-to/Installing-CrewAI">
        Installing crewAI
@@ -88,6 +93,11 @@ Cutting-edge framework for orchestrating role-playing, autonomous AI agents. By
        Coding Agents
      </a>
    </li>
    <li>
      <a href="./how-to/Force-Tool-Ouput-as-Result">
        Forcing Tool Output as Result
      </a>
    </li>
    <li>
      <a href="./how-to/Human-Input-on-Execution">
        Human Input on Execution
@@ -20,6 +20,7 @@ pip install 'crewai[tools]'
Remember that when using this tool, the code must be generated by the Agent itself. The code must be a Python3 code. And it will take some time for the first time to run because it needs to build the Docker image.

```python
from crewai import Agent
from crewai_tools import CodeInterpreterTool

Agent(
@@ -27,3 +28,14 @@ Agent(
    tools=[CodeInterpreterTool()],
)
```

We also provide a simple way to use it directly from the Agent.

```python
from crewai import Agent

agent = Agent(
    ...
    allow_code_execution=True,
)
```
docs/tools/ComposioTool.md (new file, 72 lines)
@@ -0,0 +1,72 @@
# ComposioTool Documentation

## Description

This tool is a wrapper around the Composio toolset and gives your agent access to a wide variety of tools from the Composio SDK.

## Installation

To incorporate this tool into your project, follow the installation instructions below:

```shell
pip install composio-core
pip install 'crewai[tools]'
```

After the installation is complete, either run `composio login` or export your Composio API key as `COMPOSIO_API_KEY`.

## Example

The following example demonstrates how to initialize the tool and execute a GitHub action:

1. Initialize the toolset

```python
from composio import Action, App
from crewai_tools import ComposioTool
from crewai import Agent, Task


tools = [ComposioTool.from_action(action=Action.GITHUB_ACTIVITY_STAR_REPO_FOR_AUTHENTICATED_USER)]
```

If you don't know what action you want to use, use `from_app` and the `tags` filter to get relevant actions:

```python
tools = ComposioTool.from_app(App.GITHUB, tags=["important"])
```

or use `use_case` to search for relevant actions:

```python
tools = ComposioTool.from_app(App.GITHUB, use_case="Star a github repository")
```

2. Define the agent

```python
crewai_agent = Agent(
    role="Github Agent",
    goal="You take action on Github using Github APIs",
    backstory=(
        "You are AI agent that is responsible for taking actions on Github "
        "on users behalf. You need to take action on Github using Github APIs"
    ),
    verbose=True,
    tools=tools,
)
```

3. Execute the task

```python
task = Task(
    description="Star a repo ComposioHQ/composio on GitHub",
    agent=crewai_agent,
    expected_output="if the star happened",
)

task.execute()
```

* A more detailed list of tools can be found [here](https://app.composio.dev)
||||
@@ -4,7 +4,7 @@
|
||||
We are still working on improving tools, so there might be unexpected behavior or changes in the future.
|
||||
|
||||
## Description
|
||||
The GithubSearchTool is a Read, Append, and Generate (RAG) tool specifically designed for conducting semantic searches within GitHub repositories. Utilizing advanced semantic search capabilities, it sifts through code, pull requests, issues, and repositories, making it an essential tool for developers, researchers, or anyone in need of precise information from GitHub.
|
||||
The GithubSearchTool is a Retrieval-Augmented Generation (RAG) tool specifically designed for conducting semantic searches within GitHub repositories. Utilizing advanced semantic search capabilities, it sifts through code, pull requests, issues, and repositories, making it an essential tool for developers, researchers, or anyone in need of precise information from GitHub.
|
||||
|
||||
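For orientation, initializing the tool typically looks like the sketch below; the argument names (`github_repo`, `gh_token`, `content_types`) are not shown on this page and should be treated as assumptions:

```python
from crewai_tools import GithubSearchTool

tool = GithubSearchTool(
    github_repo="https://github.com/crewAIInc/crewAI",
    gh_token="<your GitHub personal access token>",  # assumed placeholder
    content_types=["code", "issue"],  # restrict the semantic search
)
```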
## Installation
To use the GithubSearchTool, first ensure the crewai_tools package is installed in your Python environment:
@@ -4,7 +4,7 @@
The MDXSearchTool is in continuous development. Features may be added or removed, and functionality could change unpredictably as we refine the tool.

## Description
The MDX Search Tool is a component of the `crewai_tools` package aimed at facilitating advanced market data extraction. This tool is invaluable for researchers and analysts seeking quick access to market insights, especially within the AI sector. It simplifies the task of acquiring, interpreting, and organizing market data by interfacing with various data sources.
The MDX Search Tool is a component of the `crewai_tools` package aimed at facilitating advanced markdown language extraction. It enables users to effectively search and extract relevant information from MD files using query-based searches. This tool is invaluable for data analysis, information management, and research tasks, streamlining the process of finding specific information within large document collections.

## Installation
Before using the MDX Search Tool, ensure the `crewai_tools` package is installed. If it is not, you can install it with the following command:
@@ -59,4 +59,4 @@ tool = MDXSearchTool(
    ),
)
)
```
```
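A minimal initialization, with the `mdx` argument name assumed from the tool's documented usage, could look like:

```python
from crewai_tools import MDXSearchTool

# Restrict searches to a single MDX document (path is illustrative).
tool = MDXSearchTool(mdx="path/to/document.mdx")
```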
@@ -31,7 +31,7 @@ tool = TXTSearchTool(txt='path/to/text/file.txt')
```

## Arguments
- `txt` (str): **Optinal**. The path to the text file you want to search. This argument is only required if the tool was not initialized with a specific text file; otherwise, the search will be conducted within the initially provided text file.
- `txt` (str): **Optional**. The path to the text file you want to search. This argument is only required if the tool was not initialized with a specific text file; otherwise, the search will be conducted within the initially provided text file.

## Custom model and embeddings
@@ -131,6 +131,7 @@ nav:
    - Using LangChain Tools: 'core-concepts/Using-LangChain-Tools.md'
    - Using LlamaIndex Tools: 'core-concepts/Using-LlamaIndex-Tools.md'
  - How to Guides:
    - Starting Your crewAI Project: 'how-to/Start-a-New-CrewAI-Project.md'
    - Installing CrewAI: 'how-to/Installing-CrewAI.md'
    - Getting Started: 'how-to/Creating-a-Crew-and-kick-it-off.md'
    - Create Custom Tools: 'how-to/Create-Custom-Tools.md'
@@ -140,6 +141,7 @@ nav:
    - Connecting to any LLM: 'how-to/LLM-Connections.md'
    - Customizing Agents: 'how-to/Customizing-Agents.md'
    - Coding Agents: 'how-to/Coding-Agents.md'
    - Forcing Tool Output as Result: 'how-to/Force-Tool-Ouput-as-Result.md'
    - Human Input on Execution: 'how-to/Human-Input-on-Execution.md'
    - Kickoff a Crew Asynchronously: 'how-to/Kickoff-async.md'
    - Kickoff a Crew for a List: 'how-to/Kickoff-for-each.md'
@@ -148,6 +150,8 @@ nav:
  - Tools Docs:
    - Google Serper Search: 'tools/SerperDevTool.md'
    - Browserbase Web Loader: 'tools/BrowserbaseLoadTool.md'
    - Composio Tools: 'tools/ComposioTool.md'
    - Code Interpreter: 'tools/CodeInterpreterTool.md'
    - Scrape Website: 'tools/ScrapeWebsiteTool.md'
    - Directory Read: 'tools/DirectoryReadTool.md'
    - Exa Serch Web Loader: 'tools/EXASearchTool.md'
poetry.lock (generated, 1436 changes): file diff suppressed because it is too large.
@@ -1,6 +1,6 @@
[tool.poetry]
name = "crewai"
version = "0.35.3"
version = "0.36.0"
description = "Cutting-edge framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks."
authors = ["Joao Moura <joao@crewai.com>"]
readme = "README.md"
@@ -14,22 +14,25 @@ Repository = "https://github.com/joaomdmoura/crewai"
[tool.poetry.dependencies]
python = ">=3.10,<=3.13"
pydantic = "^2.4.2"
langchain = ">=0.2,<=0.3"
langchain = ">0.2,<=0.3"
openai = "^1.13.3"
opentelemetry-api = "^1.22.0"
opentelemetry-sdk = "^1.22.0"
opentelemetry-exporter-otlp-proto-http = "^1.22.0"
instructor = "1.3.3"
regex = "^2023.12.25"
crewai-tools = { version = "^0.4.1", optional = true }
crewai-tools = { version = "^0.4.8", optional = true }
click = "^8.1.7"
python-dotenv = "^1.0.0"
embedchain-crewai = {extras = ["github", "youtube"], version = "^0.1.114"}
appdirs = "^1.4.4"
jsonref = "^1.1.0"
agentops = { version = "^0.1.9", optional = true }
embedchain = "^0.1.114"
json-repair = "^0.25.2"

[tool.poetry.extras]
tools = ["crewai-tools"]
agentops = ["agentops"]

[tool.poetry.group.dev.dependencies]
isort = "^5.13.2"
@@ -43,7 +46,7 @@ mkdocs-material = { extras = ["imaging"], version = "^9.5.7" }
mkdocs-material-extensions = "^1.3.1"
pillow = "^10.2.0"
cairosvg = "^2.7.1"
crewai-tools = "^0.4.1"
crewai-tools = "^0.4.8"

[tool.poetry.group.test.dependencies]
pytest = "^8.0.0"
@@ -60,4 +63,4 @@ exclude = ["cli/templates/main.py", "cli/templates/crew.py"]

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
build-backend = "poetry.core.masonry.api"
@@ -1,25 +1,38 @@
import os
from inspect import signature
from typing import Any, List, Optional, Tuple

from langchain.agents.agent import RunnableAgent
from langchain.agents.tools import BaseTool
from langchain.agents.tools import tool as LangChainTool
from langchain.tools.render import render_text_description
from langchain_core.agents import AgentAction
from langchain_core.callbacks import BaseCallbackHandler
from langchain_openai import ChatOpenAI

from pydantic import Field, InstanceOf, model_validator
from pydantic import Field, InstanceOf, PrivateAttr, model_validator

from crewai.agents import CacheHandler, CrewAgentExecutor, CrewAgentParser
from crewai.agents.agent_builder.base_agent import BaseAgent
from crewai.memory.contextual.contextual_memory import ContextualMemory
from crewai.tools.agent_tools import AgentTools
from crewai.utilities import Prompts, Converter
from crewai.utilities import Converter, Prompts
from crewai.utilities.constants import TRAINED_AGENTS_DATA_FILE, TRAINING_DATA_FILE
from crewai.utilities.token_counter_callback import TokenCalcHandler
from crewai.agents.agent_builder.base_agent import BaseAgent
from crewai.utilities.training_handler import CrewTrainingHandler

agentops = None
try:
    import agentops  # type: ignore # Name "agentops" already defined on line 21
    from agentops import track_agent
except ImportError:

    def track_agent():
        def noop(f):
            return f

        return noop


@track_agent()
class Agent(BaseAgent):
    """Represents an agent in a system.
@@ -42,12 +55,17 @@ class Agent(BaseAgent):
            tools: Tools at agents disposal
            step_callback: Callback to be executed after each step of the agent execution.
            callbacks: A list of callback functions from the langchain library that are triggered during the agent's execution process
            allow_code_execution: Enable code execution for the agent.
            max_retry_limit: Maximum number of retries for an agent to execute a task when an error occurs.
    """

    _times_executed: int = PrivateAttr(default=0)
    max_execution_time: Optional[int] = Field(
        default=None,
        description="Maximum execution time for an agent to execute a task",
    )
    agent_ops_agent_name: str = None  # type: ignore # Incompatible types in assignment (expression has type "None", variable has type "str")
    agent_ops_agent_id: str = None  # type: ignore # Incompatible types in assignment (expression has type "None", variable has type "str")
    cache_handler: InstanceOf[CacheHandler] = Field(
        default=None, description="An instance of the CacheHandler class."
    )
@@ -76,18 +94,25 @@ class Agent(BaseAgent):
    response_template: Optional[str] = Field(
        default=None, description="Response format for the agent."
    )

    tools_results: Optional[List[Any]] = Field(
        default=[], description="Results of the tools used by the agent."
    )
    allow_code_execution: Optional[bool] = Field(
        default=False, description="Enable code execution for the agent."
    )
    max_retry_limit: int = Field(
        default=2,
        description="Maximum number of retries for an agent to execute a task when an error occurs.",
    )

    def __init__(__pydantic_self__, **data):
        config = data.pop("config", {})
        super().__init__(**config, **data)
        __pydantic_self__.agent_ops_agent_name = __pydantic_self__.role

    @model_validator(mode="after")
    def set_agent_executor(self) -> "Agent":
        """Ensure agent executor and token process is set."""
        """Ensure agent executor and token process are set."""
        if hasattr(self.llm, "model_name"):
            token_handler = TokenCalcHandler(self.llm.model_name, self._token_process)
@@ -101,6 +126,13 @@ class Agent(BaseAgent):
            ):
                self.llm.callbacks.append(token_handler)

            if agentops and not any(
                isinstance(handler, agentops.LangchainCallbackHandler)
                for handler in self.llm.callbacks
            ):
                agentops.stop_instrumenting()
                self.llm.callbacks.append(agentops.LangchainCallbackHandler())

        if not self.agent_executor:
            if not self.cache_handler:
                self.cache_handler = CacheHandler()
@@ -124,8 +156,7 @@ class Agent(BaseAgent):
            Output of the agent
        """
        if self.tools_handler:
            # type: ignore # Incompatible types in assignment (expression has type "dict[Never, Never]", variable has type "ToolCalling")
            self.tools_handler.last_used_tool = {}
            self.tools_handler.last_used_tool = {}  # type: ignore # Incompatible types in assignment (expression has type "dict[Never, Never]", variable has type "ToolCalling")

        task_prompt = task.prompt()

@@ -144,14 +175,15 @@ class Agent(BaseAgent):
            if memory.strip() != "":
                task_prompt += self.i18n.slice("memory").format(memory=memory)

        tools = tools or self.tools
        # type: ignore # Argument 1 to "_parse_tools" of "Agent" has incompatible type "list[Any] | None"; expected "list[Any]"
        parsed_tools = self._parse_tools(tools or [])
        tools = tools or self.tools or []
        parsed_tools = self._parse_tools(tools)
        self.create_agent_executor(tools=tools)
        self.agent_executor.tools = parsed_tools
        self.agent_executor.task = task

        self.agent_executor.tools_description = render_text_description(parsed_tools)
        self.agent_executor.tools_description = self._render_text_description_and_args(
            parsed_tools
        )
        self.agent_executor.tools_names = self.__tools_names(parsed_tools)

        if self.crew and self.crew._train:
@@ -159,15 +191,30 @@ class Agent(BaseAgent):
        else:
            task_prompt = self._use_trained_data(task_prompt=task_prompt)

        result = self.agent_executor.invoke(
            {
                "input": task_prompt,
                "tool_names": self.agent_executor.tools_names,
                "tools": self.agent_executor.tools_description,
            }
        )["output"]
        try:
            result = self.agent_executor.invoke(
                {
                    "input": task_prompt,
                    "tool_names": self.agent_executor.tools_names,
                    "tools": self.agent_executor.tools_description,
                }
            )["output"]
        except Exception as e:
            self._times_executed += 1
            if self._times_executed > self.max_retry_limit:
                raise e
            self.execute_task(task, context, tools)

        if self.max_rpm:
            self._rpm_controller.stop_rpm_counter()

        # If there was any tool in self.tools_results that had result_as_answer
        # set to True, return the results of the last tool that had
        # result_as_answer set to True
        for tool_result in self.tools_results:  # type: ignore # Item "None" of "list[Any] | None" has no attribute "__iter__" (not iterable)
            if tool_result.get("result_as_answer", False):
                result = tool_result["result"]

        return result

    def format_log_to_str(
@@ -189,7 +236,7 @@ class Agent(BaseAgent):
        Returns:
            An instance of the CrewAgentExecutor class.
        """
        tools = tools or self.tools
        tools = tools or self.tools or []

        agent_args = {
            "input": lambda x: x["input"],
@@ -268,7 +315,7 @@ class Agent(BaseAgent):
    def get_output_converter(self, llm, text, model, instructions):
        return Converter(llm=llm, text=text, model=model, instructions=instructions)

    def _parse_tools(self, tools: List[Any]) -> List[LangChainTool]:
    def _parse_tools(self, tools: List[Any]) -> List[LangChainTool]:  # type: ignore # Function "langchain_core.tools.tool" is not valid as a type
        """Parse tools to be used for the task."""
        tools_list = []
        try:
@@ -284,6 +331,7 @@ class Agent(BaseAgent):
            tools_list = []
            for tool in tools:
                tools_list.append(tool)

        return tools_list

    def _training_handler(self, task_prompt: str) -> str:
@@ -310,6 +358,52 @@ class Agent(BaseAgent):
        )
        return task_prompt

    def _render_text_description(self, tools: List[BaseTool]) -> str:
        """Render the tool name and description in plain text.

        Output will be in the format of:

        .. code-block:: markdown

            search: This tool is used for search
            calculator: This tool is used for math
        """
        description = "\n".join(
            [
                f"Tool name: {tool.name}\nTool description:\n{tool.description}"
                for tool in tools
            ]
        )

        return description

    def _render_text_description_and_args(self, tools: List[BaseTool]) -> str:
        """Render the tool name, description, and args in plain text.

        Output will be in the format of:

        .. code-block:: markdown

            search: This tool is used for search, args: {"query": {"type": "string"}}
            calculator: This tool is used for math, \
            args: {"expression": {"type": "string"}}
        """
        tool_strings = []
        for tool in tools:
            args_schema = str(tool.args)
            if hasattr(tool, "func") and tool.func:
                sig = signature(tool.func)
                description = (
                    f"Tool Name: {tool.name}{sig}\nTool Description: {tool.description}"
                )
            else:
                description = (
                    f"Tool Name: {tool.name}\nTool Description: {tool.description}"
                )
            tool_strings.append(f"{description}\nTool Arguments: {args_schema}")

        return "\n".join(tool_strings)

    @staticmethod
    def __tools_names(tools) -> str:
        return ", ".join([t.name for t in tools])
@@ -1,22 +1,26 @@
|
||||
from copy import deepcopy
|
||||
import uuid
|
||||
from typing import Any, Dict, List, Optional
|
||||
from abc import ABC, abstractmethod
|
||||
from copy import copy as shallow_copy
|
||||
from typing import Any, Dict, List, Optional, TypeVar
|
||||
|
||||
from pydantic import (
|
||||
UUID4,
|
||||
BaseModel,
|
||||
ConfigDict,
|
||||
Field,
|
||||
InstanceOf,
|
||||
PrivateAttr,
|
||||
field_validator,
|
||||
model_validator,
|
||||
ConfigDict,
|
||||
PrivateAttr,
|
||||
)
|
||||
from pydantic_core import PydanticCustomError
|
||||
|
||||
from crewai.utilities import I18N, RPMController, Logger
|
||||
from crewai.agents import CacheHandler, ToolsHandler
|
||||
from crewai.utilities.token_counter_callback import TokenProcess
|
||||
from crewai.agents.agent_builder.utilities.base_token_process import TokenProcess
|
||||
from crewai.agents.cache.cache_handler import CacheHandler
|
||||
from crewai.agents.tools_handler import ToolsHandler
|
||||
from crewai.utilities import I18N, Logger, RPMController
|
||||
|
||||
T = TypeVar("T", bound="BaseAgent")
|
||||
|
||||
|
||||
class BaseAgent(ABC, BaseModel):
|
||||
@@ -187,6 +191,31 @@ class BaseAgent(ABC, BaseModel):
|
||||
"""Get the converter class for the agent to create json/pydantic outputs."""
|
||||
pass
|
||||
|
||||
def copy(self: T) -> T: # type: ignore # Signature of "copy" incompatible with supertype "BaseModel"
|
||||
"""Create a deep copy of the Agent."""
|
||||
exclude = {
|
||||
"id",
|
||||
"_logger",
|
||||
"_rpm_controller",
|
||||
"_request_within_rpm_limit",
|
||||
"_token_process",
|
||||
"agent_executor",
|
||||
"tools",
|
||||
"tools_handler",
|
||||
"cache_handler",
|
||||
"llm",
|
||||
}
|
||||
|
||||
# Copy llm and clear callbacks
|
||||
existing_llm = shallow_copy(self.llm)
|
||||
existing_llm.callbacks = []
|
||||
copied_data = self.model_dump(exclude=exclude)
|
||||
copied_data = {k: v for k, v in copied_data.items() if v is not None}
|
||||
|
||||
copied_agent = type(self)(**copied_data, llm=existing_llm, tools=self.tools)
|
||||
|
||||
return copied_agent
|
||||
|
||||
def interpolate_inputs(self, inputs: Dict[str, Any]) -> None:
|
||||
"""Interpolate inputs into the agent description and backstory."""
|
||||
if self._original_role is None:
|
||||
@@ -216,35 +245,6 @@ class BaseAgent(ABC, BaseModel):
|
||||
def increment_formatting_errors(self) -> None:
|
||||
self.formatting_errors += 1
|
||||
|
||||
def copy(self):
|
||||
exclude = {
|
||||
"id",
|
||||
"_logger",
|
||||
"_rpm_controller",
|
||||
"_request_within_rpm_limit",
|
||||
"token_process",
|
||||
"agent_executor",
|
||||
"tools",
|
||||
"tools_handler",
|
||||
"cache_handler",
|
||||
"crew",
|
||||
"llm",
|
||||
}
|
||||
|
||||
copied_data = self.model_dump(exclude=exclude, exclude_unset=True)
|
||||
copied_agent = self.__class__(**copied_data)
|
||||
|
||||
# Copy mutable attributes separately
|
||||
copied_agent.tools = deepcopy(self.tools)
|
||||
copied_agent.config = deepcopy(self.config)
|
||||
|
||||
# Preserve original values for interpolation
|
||||
copied_agent._original_role = self._original_role
|
||||
copied_agent._original_goal = self._original_goal
|
||||
copied_agent._original_backstory = self._original_backstory
|
||||
|
||||
return copied_agent
|
||||
|
||||
def set_rpm_controller(self, rpm_controller: RPMController) -> None:
|
||||
"""Set the rpm controller for the agent.
|
||||
|
||||
|
||||
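The `copy()` shown above dumps the model while excluding runtime-only fields and rebuilds a fresh instance. A hedged sketch of that idea with a toy pydantic v2 model standing in for `BaseAgent` (field names here are illustrative, not the real agent fields):

```python
from uuid import UUID, uuid4

from pydantic import BaseModel, Field


class ToyAgent(BaseModel):
    # Illustrative fields only; not the real BaseAgent schema.
    id: UUID = Field(default_factory=uuid4)
    role: str = "Researcher"
    goal: str = "Find facts"
    cache: dict = Field(default_factory=dict)  # runtime-only state we do not want to clone


def copy_agent(agent: ToyAgent) -> ToyAgent:
    # Dump everything except the id and runtime-only state, drop Nones, rebuild.
    data = {k: v for k, v in agent.model_dump(exclude={"id", "cache"}).items() if v is not None}
    return ToyAgent(**data)


original = ToyAgent()
clone = copy_agent(original)
assert clone.id != original.id and clone.role == original.role
```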
@@ -1,65 +1,109 @@
|
||||
import time
|
||||
from typing import TYPE_CHECKING, Optional
|
||||
|
||||
from crewai.memory.entity.entity_memory_item import EntityMemoryItem
|
||||
from crewai.memory.long_term.long_term_memory_item import LongTermMemoryItem
|
||||
from crewai.memory.short_term.short_term_memory_item import ShortTermMemoryItem
|
||||
from crewai.utilities.converter import ConverterError
|
||||
from crewai.utilities.evaluators.task_evaluator import TaskEvaluator
|
||||
from crewai.utilities import I18N
|
||||
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from crewai.crew import Crew
|
||||
from crewai.task import Task
|
||||
from crewai.agents.agent_builder.base_agent import BaseAgent
|
||||
|
||||
|
||||
class CrewAgentExecutorMixin:
|
||||
crew: Optional["Crew"]
|
||||
crew_agent: Optional["BaseAgent"]
|
||||
task: Optional["Task"]
|
||||
iterations: int
|
||||
force_answer_max_iterations: int
|
||||
have_forced_answer: bool
|
||||
_i18n: I18N
|
||||
|
||||
def _should_force_answer(self) -> bool:
|
||||
"""Determine if a forced answer is required based on iteration count."""
|
||||
return (
|
||||
self.iterations == self.force_answer_max_iterations
|
||||
) and not self.have_forced_answer
|
||||
|
||||
def _create_short_term_memory(self, output) -> None:
|
||||
"""Create and save a short-term memory item if conditions are met."""
|
||||
if (
|
||||
self.crew
|
||||
and self.crew_agent
|
||||
and self.task
|
||||
and "Action: Delegate work to coworker" not in output.log
|
||||
):
|
||||
try:
|
||||
memory = ShortTermMemoryItem(
|
||||
data=output.log,
|
||||
agent=self.crew_agent.role,
|
||||
metadata={
|
||||
"observation": self.task.description,
|
||||
},
|
||||
)
|
||||
if (
|
||||
hasattr(self.crew, "_short_term_memory")
|
||||
and self.crew._short_term_memory
|
||||
):
|
||||
self.crew._short_term_memory.save(memory)
|
||||
except Exception as e:
|
||||
print(f"Failed to add to short term memory: {e}")
|
||||
pass
|
||||
|
||||
def _create_long_term_memory(self, output) -> None:
|
||||
"""Create and save long-term and entity memory items based on evaluation."""
|
||||
if (
|
||||
self.crew
|
||||
and self.crew.memory
|
||||
and "Action: Delegate work to coworker" not in output.log
|
||||
and self.crew._long_term_memory
|
||||
and self.crew._entity_memory
|
||||
and self.task
|
||||
and self.crew_agent
|
||||
):
|
||||
memory = ShortTermMemoryItem(
|
||||
data=output.log,
|
||||
agent=self.crew_agent.role,
|
||||
metadata={
|
||||
"observation": self.task.description,
|
||||
},
|
||||
)
|
||||
self.crew._short_term_memory.save(memory)
|
||||
try:
|
||||
ltm_agent = TaskEvaluator(self.crew_agent)
|
||||
evaluation = ltm_agent.evaluate(self.task, output.log)
|
||||
|
||||
def _create_long_term_memory(self, output) -> None:
|
||||
if self.crew and self.crew.memory:
|
||||
ltm_agent = TaskEvaluator(self.crew_agent)
|
||||
evaluation = ltm_agent.evaluate(self.task, output.log)
|
||||
if isinstance(evaluation, ConverterError):
|
||||
return
|
||||
|
||||
if isinstance(evaluation, ConverterError):
|
||||
return
|
||||
|
||||
long_term_memory = LongTermMemoryItem(
|
||||
task=self.task.description,
|
||||
agent=self.crew_agent.role,
|
||||
quality=evaluation.quality,
|
||||
datetime=str(time.time()),
|
||||
expected_output=self.task.expected_output,
|
||||
metadata={
|
||||
"suggestions": evaluation.suggestions,
|
||||
"quality": evaluation.quality,
|
||||
},
|
||||
)
|
||||
self.crew._long_term_memory.save(long_term_memory)
|
||||
|
||||
for entity in evaluation.entities:
|
||||
entity_memory = EntityMemoryItem(
|
||||
name=entity.name,
|
||||
type=entity.type,
|
||||
description=entity.description,
|
||||
relationships="\n".join([f"- {r}" for r in entity.relationships]),
|
||||
long_term_memory = LongTermMemoryItem(
|
||||
task=self.task.description,
|
||||
agent=self.crew_agent.role,
|
||||
quality=evaluation.quality,
|
||||
datetime=str(time.time()),
|
||||
expected_output=self.task.expected_output,
|
||||
metadata={
|
||||
"suggestions": evaluation.suggestions,
|
||||
"quality": evaluation.quality,
|
||||
},
|
||||
)
|
||||
self.crew._entity_memory.save(entity_memory)
|
||||
self.crew._long_term_memory.save(long_term_memory)
|
||||
|
||||
for entity in evaluation.entities:
|
||||
entity_memory = EntityMemoryItem(
|
||||
name=entity.name,
|
||||
type=entity.type,
|
||||
description=entity.description,
|
||||
relationships="\n".join(
|
||||
[f"- {r}" for r in entity.relationships]
|
||||
),
|
||||
)
|
||||
self.crew._entity_memory.save(entity_memory)
|
||||
except AttributeError as e:
|
||||
print(f"Missing attributes for long term memory: {e}")
|
||||
pass
|
||||
except Exception as e:
|
||||
print(f"Failed to add to long term memory: {e}")
|
||||
pass
|
||||
|
||||
def _ask_human_input(self, final_answer: dict) -> str:
|
||||
"""Get human input."""
|
||||
"""Prompt human input for final decision making."""
|
||||
return input(
|
||||
self._i18n.slice("getting_input").format(final_answer=final_answer)
|
||||
)
|
||||
|
||||
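The mixin's long-term memory path follows an evaluate, bail out on conversion errors, then persist flow. A rough, self-contained sketch of that flow; every class below is a made-up stub, not a crewAI type:

```python
import time

# Made-up stubs standing in for TaskEvaluator and the crew's long-term memory store.


class FakeEvaluation:
    quality = 8
    suggestions = ["tighten the summary"]


class FakeEvaluator:
    def evaluate(self, task_description: str, log: str):
        return FakeEvaluation()  # the real evaluator may instead return a ConverterError


def save_long_term_memory(store: list, task_description: str, agent_role: str, log: str) -> None:
    evaluation = FakeEvaluator().evaluate(task_description, log)
    if evaluation is None:  # mirrors the early return on ConverterError above
        return
    store.append({
        "task": task_description,
        "agent": agent_role,
        "quality": evaluation.quality,
        "datetime": str(time.time()),
        "metadata": {"suggestions": evaluation.suggestions, "quality": evaluation.quality},
    })


memory_store: list = []
save_long_term_memory(memory_store, "Summarize the findings", "Researcher", "Final Answer: ...")
```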
@@ -1,6 +1,8 @@
|
||||
from abc import ABC, abstractmethod
|
||||
from typing import List, Optional, Union
|
||||
|
||||
from pydantic import BaseModel, Field
|
||||
|
||||
from crewai.agents.agent_builder.base_agent import BaseAgent
|
||||
from crewai.task import Task
|
||||
from crewai.utilities import I18N
|
||||
@@ -22,6 +24,7 @@ class BaseAgentTools(BaseModel, ABC):
|
||||
is_list = coworker.startswith("[") and coworker.endswith("]")
|
||||
if is_list:
|
||||
coworker = coworker[1:-1].split(",")[0]
|
||||
|
||||
return coworker
|
||||
|
||||
def delegate_work(
|
||||
@@ -38,11 +41,13 @@ class BaseAgentTools(BaseModel, ABC):
|
||||
coworker = self._get_coworker(coworker, **kwargs)
|
||||
return self._execute(coworker, question, context)
|
||||
|
||||
def _execute(self, agent: Union[str, None], task: str, context: Union[str, None]):
|
||||
def _execute(
|
||||
self, agent_name: Union[str, None], task: str, context: Union[str, None]
|
||||
):
|
||||
"""Execute the command."""
|
||||
try:
|
||||
if agent is None:
|
||||
agent = ""
|
||||
if agent_name is None:
|
||||
agent_name = ""
|
||||
|
||||
# It is important to remove the quotes from the agent name.
|
||||
# The reason we have to do this is because less-powerful LLM's
|
||||
@@ -51,9 +56,9 @@ class BaseAgentTools(BaseModel, ABC):
|
||||
# {"task": "....", "coworker": "....
|
||||
# when it should look like this:
|
||||
# {"task": "....", "coworker": "...."}
|
||||
agent_name = agent.casefold().replace('"', "").replace("\n", "")
|
||||
agent_name = agent_name.casefold().replace('"', "").replace("\n", "")
|
||||
|
||||
agent = [
|
||||
agent = [ # type: ignore # Incompatible types in assignment (expression has type "list[BaseAgent]", variable has type "str | None")
|
||||
available_agent
|
||||
for available_agent in self.agents
|
||||
if available_agent.role.casefold().replace("\n", "") == agent_name
|
||||
@@ -73,9 +78,9 @@ class BaseAgentTools(BaseModel, ABC):
|
||||
)
|
||||
|
||||
agent = agent[0]
|
||||
task = Task(
|
||||
task_with_assigned_agent = Task( # type: ignore # Incompatible types in assignment (expression has type "Task", variable has type "str")
|
||||
description=task,
|
||||
agent=agent,
|
||||
expected_output="Your best answer to your coworker asking you this, accounting for the context shared.",
|
||||
)
|
||||
return agent.execute_task(task, context)
|
||||
return agent.execute_task(task_with_assigned_agent, context)
|
||||
|
||||
@@ -1,8 +1,7 @@
|
||||
from abc import ABC, abstractmethod
|
||||
from typing import Any, Optional
|
||||
|
||||
|
||||
from pydantic import BaseModel, Field, PrivateAttr
|
||||
from pydantic import BaseModel, Field
|
||||
|
||||
|
||||
class OutputConverter(BaseModel, ABC):
|
||||
@@ -22,13 +21,12 @@ class OutputConverter(BaseModel, ABC):
|
||||
max_attempts (int): Maximum number of conversion attempts (default: 3).
|
||||
"""
|
||||
|
||||
_is_gpt: bool = PrivateAttr(default=True)
|
||||
text: str = Field(description="Text to be converted.")
|
||||
llm: Any = Field(description="The language model to be used to convert the text.")
|
||||
model: Any = Field(description="The model to be used to convert the text.")
|
||||
instructions: str = Field(description="Conversion instructions to the LLM.")
|
||||
max_attemps: Optional[int] = Field(
|
||||
description="Max number of attemps to try to get the output formated.",
|
||||
max_attempts: Optional[int] = Field(
|
||||
description="Max number of attempts to try to get the output formatted.",
|
||||
default=3,
|
||||
)
|
||||
|
||||
@@ -42,7 +40,8 @@ class OutputConverter(BaseModel, ABC):
|
||||
"""Convert text to json."""
|
||||
pass
|
||||
|
||||
@property
|
||||
@abstractmethod
|
||||
def _is_gpt(self, llm):
|
||||
def is_gpt(self) -> bool:
|
||||
"""Return if llm provided is of gpt from openai."""
|
||||
pass
|
||||
@@ -7,12 +7,11 @@ from langchain.agents.agent import ExceptionTool
|
||||
from langchain.callbacks.manager import CallbackManagerForChainRun
|
||||
from langchain_core.agents import AgentAction, AgentFinish, AgentStep
|
||||
from langchain_core.exceptions import OutputParserException
|
||||
|
||||
from langchain_core.tools import BaseTool
|
||||
from langchain_core.utils.input import get_color_mapping
|
||||
from pydantic import InstanceOf
|
||||
from crewai.agents.agent_builder.base_agent_executor_mixin import CrewAgentExecutorMixin
|
||||
|
||||
from crewai.agents.agent_builder.base_agent_executor_mixin import CrewAgentExecutorMixin
|
||||
from crewai.agents.tools_handler import ToolsHandler
|
||||
from crewai.tools.tool_usage import ToolUsage, ToolUsageErrorException
|
||||
from crewai.utilities import I18N
|
||||
@@ -36,7 +35,7 @@ class CrewAgentExecutor(AgentExecutor, CrewAgentExecutorMixin):
|
||||
tools_handler: Optional[InstanceOf[ToolsHandler]] = None
|
||||
max_iterations: Optional[int] = 15
|
||||
have_forced_answer: bool = False
|
||||
force_answer_max_iterations: Optional[int] = None
|
||||
force_answer_max_iterations: Optional[int] = None # type: ignore # Incompatible types in assignment (expression has type "int | None", base class "CrewAgentExecutorMixin" defined the type as "int")
|
||||
step_callback: Optional[Any] = None
|
||||
system_template: Optional[str] = None
|
||||
prompt_template: Optional[str] = None
|
||||
@@ -233,6 +232,7 @@ class CrewAgentExecutor(AgentExecutor, CrewAgentExecutorMixin):
|
||||
tools_names=self.tools_names,
|
||||
function_calling_llm=self.function_calling_llm,
|
||||
task=self.task,
|
||||
agent=self.crew_agent,
|
||||
action=agent_action,
|
||||
)
|
||||
tool_calling = tool_usage.parse(agent_action.log)
|
||||
|
||||
@@ -1,6 +1,7 @@
|
||||
import re
|
||||
from typing import Any, Union
|
||||
|
||||
from json_repair import repair_json
|
||||
from langchain.agents.output_parsers import ReActSingleInputOutputParser
|
||||
from langchain_core.agents import AgentAction, AgentFinish
|
||||
from langchain_core.exceptions import OutputParserException
|
||||
@@ -48,11 +49,15 @@ class CrewAgentParser(ReActSingleInputOutputParser):
|
||||
raise OutputParserException(
|
||||
f"{FINAL_ANSWER_AND_PARSABLE_ACTION_ERROR_MESSAGE}: {text}"
|
||||
)
|
||||
action = action_match.group(1).strip()
|
||||
action_input = action_match.group(2)
|
||||
tool_input = action_input.strip(" ")
|
||||
tool_input = tool_input.strip('"')
|
||||
return AgentAction(action, tool_input, text)
|
||||
action = action_match.group(1)
|
||||
clean_action = self._clean_action(action)
|
||||
|
||||
action_input = action_match.group(2).strip()
|
||||
|
||||
tool_input = action_input.strip(" ").strip('"')
|
||||
safe_tool_input = self._safe_repair_json(tool_input)
|
||||
|
||||
return AgentAction(clean_action, safe_tool_input, text)
|
||||
|
||||
elif includes_answer:
|
||||
return AgentFinish(
|
||||
@@ -87,3 +92,30 @@ class CrewAgentParser(ReActSingleInputOutputParser):
            llm_output=text,
            send_to_llm=True,
        )

    def _clean_action(self, text: str) -> str:
        """Clean action string by removing non-essential formatting characters."""
        return re.sub(r"^\s*\*+\s*|\s*\*+\s*$", "", text).strip()

    def _safe_repair_json(self, tool_input: str) -> str:
        UNABLE_TO_REPAIR_JSON_RESULTS = ['""', "{}"]

        # Skip repair if the input starts and ends with square brackets
        # Explanation: The JSON parser has issues handling inputs that are enclosed in square brackets ('[]').
        # These are typically valid JSON arrays or strings that do not require repair. Attempting to repair such inputs
        # might lead to unintended alterations, such as wrapping the entire input in additional layers or modifying
        # the structure in a way that changes its meaning. By skipping the repair for inputs that start and end with
        # square brackets, we preserve the integrity of these valid JSON structures and avoid unnecessary modifications.
        if tool_input.startswith("[") and tool_input.endswith("]"):
            return tool_input

        # Before repair, handle common LLM issues:
        # 1. Replace """ with " to avoid JSON parser errors

        tool_input = tool_input.replace('"""', '"')

        result = repair_json(tool_input)
        if result in UNABLE_TO_REPAIR_JSON_RESULTS:
            return tool_input

        return str(result)

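For readers unfamiliar with `json_repair`, a small usage sketch of the repair-plus-guard behaviour described above (the example strings are hypothetical tool inputs):

```python
from json_repair import repair_json

# Malformed tool input of the kind weaker LLMs emit: unterminated string, missing closing brace.
broken = '{"task": "Summarize the report", "coworker": "Researcher'
print(repair_json(broken))  # expected to come back as valid JSON with the quote and brace closed

# Inputs already wrapped in square brackets are passed through untouched by the parser above,
# since repairing a valid JSON array risks rewrapping or altering its structure.
already_bracketed = '["a", "b"]'
print(already_bracketed)
```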
@@ -12,4 +12,4 @@ reporting_task:
    Make sure the report is detailed and contains any and all relevant information.
  expected_output: >
    A fully fledge reports with the mains topics, each with a full section of information.
    Formated as markdown with out '```'
    Formatted as markdown without '```'

@@ -6,7 +6,7 @@ authors = ["Your Name <you@example.com>"]

[tool.poetry.dependencies]
python = ">=3.10,<=3.13"
crewai = { extras = ["tools"], version = "^0.35.0" }
crewai = { extras = ["tools"], version = "^0.35.8" }

[tool.poetry.scripts]
{{folder_name}} = "{{folder_name}}.main:run"

@@ -1,36 +1,49 @@
|
||||
import asyncio
|
||||
import json
|
||||
import uuid
|
||||
from typing import Any, Dict, List, Optional, Union
|
||||
from concurrent.futures import Future
|
||||
from typing import Any, Dict, List, Optional, Tuple, Union
|
||||
|
||||
from langchain_core.callbacks import BaseCallbackHandler
|
||||
from pydantic import (
|
||||
UUID4,
|
||||
BaseModel,
|
||||
ConfigDict,
|
||||
Field,
|
||||
InstanceOf,
|
||||
Json,
|
||||
PrivateAttr,
|
||||
field_validator,
|
||||
model_validator,
|
||||
UUID4,
|
||||
BaseModel,
|
||||
ConfigDict,
|
||||
Field,
|
||||
InstanceOf,
|
||||
Json,
|
||||
PrivateAttr,
|
||||
field_validator,
|
||||
model_validator,
|
||||
)
|
||||
from pydantic_core import PydanticCustomError
|
||||
|
||||
from crewai.agent import Agent
|
||||
from crewai.agents.agent_builder.base_agent import BaseAgent
|
||||
from crewai.agents.cache import CacheHandler
|
||||
from crewai.crews.crew_output import CrewOutput
|
||||
from crewai.memory.entity.entity_memory import EntityMemory
|
||||
from crewai.memory.long_term.long_term_memory import LongTermMemory
|
||||
from crewai.memory.short_term.short_term_memory import ShortTermMemory
|
||||
from crewai.process import Process
|
||||
from crewai.task import Task
|
||||
from crewai.tasks.task_output import TaskOutput
|
||||
from crewai.telemetry import Telemetry
|
||||
from crewai.tools.agent_tools import AgentTools
|
||||
from crewai.utilities import I18N, FileHandler, Logger, RPMController
|
||||
from crewai.utilities.constants import TRAINED_AGENTS_DATA_FILE, TRAINING_DATA_FILE
|
||||
from crewai.utilities.evaluators.task_evaluator import TaskEvaluator
|
||||
from crewai.utilities.formatter import (
|
||||
aggregate_raw_outputs_from_task_outputs,
|
||||
aggregate_raw_outputs_from_tasks,
|
||||
)
|
||||
from crewai.utilities.training_handler import CrewTrainingHandler
|
||||
|
||||
try:
|
||||
import agentops
|
||||
except ImportError:
|
||||
agentops = None
|
||||
|
||||
|
||||
class Crew(BaseModel):
|
||||
"""
|
||||
@@ -51,7 +64,6 @@ class Crew(BaseModel):
|
||||
max_rpm: Maximum number of requests per minute for the crew execution to be respected.
|
||||
prompt_file: Path to the prompt json file to be used for the crew.
|
||||
id: A unique identifier for the crew instance.
|
||||
full_output: Whether the crew should return the full output with all tasks outputs and token usage metrics or just the final output.
|
||||
task_callback: Callback to be executed after each task for every agents execution.
|
||||
step_callback: Callback to be executed after each step for every agents execution.
|
||||
share_crew: Whether you want to share the complete crew information and execution with crewAI to make the library better, and allow us to train models.
|
||||
@@ -87,10 +99,6 @@ class Crew(BaseModel):
|
||||
default=None,
|
||||
description="Metrics for the LLM usage during all tasks execution.",
|
||||
)
|
||||
full_output: Optional[bool] = Field(
|
||||
default=False,
|
||||
description="Whether the crew should return the full output with all tasks outputs and token usage metrics or just the final output.",
|
||||
)
|
||||
manager_llm: Optional[Any] = Field(
|
||||
description="Language model that will run the agent.", default=None
|
||||
)
|
||||
@@ -162,7 +170,6 @@ class Crew(BaseModel):
|
||||
self._rpm_controller = RPMController(max_rpm=self.max_rpm, logger=self._logger)
|
||||
self._telemetry = Telemetry()
|
||||
self._telemetry.set_tracer()
|
||||
self._telemetry.crew_creation(self)
|
||||
return self
|
||||
|
||||
@model_validator(mode="after")
|
||||
@@ -219,6 +226,90 @@ class Crew(BaseModel):
|
||||
agent.set_rpm_controller(self._rpm_controller)
|
||||
return self
|
||||
|
||||
@model_validator(mode="after")
|
||||
def validate_tasks(self):
|
||||
if self.process == Process.sequential:
|
||||
for task in self.tasks:
|
||||
if task.agent is None:
|
||||
raise PydanticCustomError(
|
||||
"missing_agent_in_task",
|
||||
f"Sequential process error: Agent is missing in the task with the following description: {task.description}", # type: ignore # Argument of type "str" cannot be assigned to parameter "message_template" of type "LiteralString"
|
||||
{},
|
||||
)
|
||||
|
||||
return self
|
||||
|
||||
@model_validator(mode="after")
|
||||
def check_tasks_in_hierarchical_process_not_async(self):
|
||||
"""Validates that the tasks in hierarchical process are not flagged with async_execution."""
|
||||
if self.process == Process.hierarchical:
|
||||
for task in self.tasks:
|
||||
if task.async_execution:
|
||||
raise PydanticCustomError(
|
||||
"async_execution_in_hierarchical_process",
|
||||
"Hierarchical process error: Tasks cannot be flagged with async_execution.",
|
||||
{},
|
||||
)
|
||||
|
||||
return self
|
||||
|
||||
@model_validator(mode="after")
|
||||
def validate_end_with_at_most_one_async_task(self):
|
||||
"""Validates that the crew ends with at most one asynchronous task."""
|
||||
final_async_task_count = 0
|
||||
|
||||
# Traverse tasks backward
|
||||
for task in reversed(self.tasks):
|
||||
if task.async_execution:
|
||||
final_async_task_count += 1
|
||||
else:
|
||||
break # Stop traversing as soon as a non-async task is encountered
|
||||
|
||||
if final_async_task_count > 1:
|
||||
raise PydanticCustomError(
|
||||
"async_task_count",
|
||||
"The crew must end with at most one asynchronous task.",
|
||||
{},
|
||||
)
|
||||
|
||||
return self
|
||||
|
||||
@model_validator(mode="after")
|
||||
def validate_async_task_cannot_include_sequential_async_tasks_in_context(self):
|
||||
"""
|
||||
Validates that if a task is set to be executed asynchronously,
|
||||
it cannot include other asynchronous tasks in its context unless
|
||||
separated by a synchronous task.
|
||||
"""
|
||||
for i, task in enumerate(self.tasks):
|
||||
if task.async_execution and task.context:
|
||||
for context_task in task.context:
|
||||
if context_task.async_execution:
|
||||
for j in range(i - 1, -1, -1):
|
||||
if self.tasks[j] == context_task:
|
||||
raise ValueError(
|
||||
f"Task '{task.description}' is asynchronous and cannot include other sequential asynchronous tasks in its context."
|
||||
)
|
||||
if not self.tasks[j].async_execution:
|
||||
break
|
||||
return self
|
||||
|
||||
@model_validator(mode="after")
|
||||
def validate_context_no_future_tasks(self):
|
||||
"""Validates that a task's context does not include future tasks."""
|
||||
task_indices = {id(task): i for i, task in enumerate(self.tasks)}
|
||||
|
||||
for task in self.tasks:
|
||||
if task.context:
|
||||
for context_task in task.context:
|
||||
if id(context_task) not in task_indices:
|
||||
continue # Skip context tasks not in the main tasks list
|
||||
if task_indices[id(context_task)] > task_indices[id(task)]:
|
||||
raise ValueError(
|
||||
f"Task '{task.description}' has a context dependency on a future task '{context_task.description}', which is not allowed."
|
||||
)
|
||||
return self
|
||||
|
||||
def _setup_from_config(self):
|
||||
assert self.config is not None, "Config should not be None."
|
||||
|
||||
@@ -257,6 +348,9 @@ class Crew(BaseModel):
|
||||
for agent in self.agents:
|
||||
agent.allow_delegation = False
|
||||
|
||||
CrewTrainingHandler(TRAINING_DATA_FILE).initialize_file()
|
||||
CrewTrainingHandler(TRAINED_AGENTS_DATA_FILE).initialize_file()
|
||||
|
||||
def train(self, n_iterations: int, inputs: Optional[Dict[str, Any]] = {}) -> None:
|
||||
"""Trains the crew for a given number of iterations."""
|
||||
self._setup_for_training()
|
||||
@@ -265,46 +359,42 @@ class Crew(BaseModel):
|
||||
self._train_iteration = n_iteration
|
||||
self.kickoff(inputs=inputs)
|
||||
|
||||
training_data = CrewTrainingHandler("training_data.pkl").load()
|
||||
training_data = CrewTrainingHandler(TRAINING_DATA_FILE).load()
|
||||
|
||||
for agent in self.agents:
|
||||
result = TaskEvaluator(agent).evaluate_training_data(
|
||||
training_data=training_data, agent_id=str(agent.id)
|
||||
)
|
||||
|
||||
CrewTrainingHandler("trained_agents_data.pkl").save_trained_data(
|
||||
CrewTrainingHandler(TRAINED_AGENTS_DATA_FILE).save_trained_data(
|
||||
agent_id=str(agent.role), trained_data=result.model_dump()
|
||||
)
|
||||
|
||||
def kickoff(
|
||||
self,
|
||||
inputs: Optional[Dict[str, Any]] = {},
|
||||
) -> Union[str, Dict[str, Any]]:
|
||||
inputs: Optional[Dict[str, Any]] = None,
|
||||
) -> CrewOutput:
|
||||
"""Starts the crew to work on its assigned tasks."""
|
||||
self._execution_span = self._telemetry.crew_execution_span(self, inputs)
|
||||
# type: ignore # Argument 1 to "_interpolate_inputs" of "Crew" has incompatible type "dict[str, Any] | None"; expected "dict[str, Any]"
|
||||
self._interpolate_inputs(inputs)
|
||||
if inputs is not None:
|
||||
self._interpolate_inputs(inputs)
|
||||
self._set_tasks_callbacks()
|
||||
|
||||
i18n = I18N(prompt_file=self.prompt_file)
|
||||
|
||||
for agent in self.agents:
|
||||
# type: ignore # Argument 1 to "_interpolate_inputs" of "Crew" has incompatible type "dict[str, Any] | None"; expected "dict[str, Any]"
|
||||
agent.i18n = i18n
|
||||
# type: ignore[attr-defined] # Argument 1 to "_interpolate_inputs" of "Crew" has incompatible type "dict[str, Any] | None"; expected "dict[str, Any]"
|
||||
agent.crew = self # type: ignore[attr-defined]
|
||||
# TODO: Create an AgentFunctionCalling protocol for future refactoring
|
||||
if (
|
||||
hasattr(agent, "function_calling_llm")
|
||||
and not agent.function_calling_llm
|
||||
):
|
||||
agent.function_calling_llm = self.function_calling_llm
|
||||
if not agent.function_calling_llm: # type: ignore # "BaseAgent" has no attribute "function_calling_llm"
|
||||
agent.function_calling_llm = self.function_calling_llm # type: ignore # "BaseAgent" has no attribute "function_calling_llm"
|
||||
|
||||
if hasattr(agent, "allow_code_execution") and agent.allow_code_execution:
|
||||
agent.tools += agent.get_code_execution_tools()
|
||||
if agent.allow_code_execution: # type: ignore # BaseAgent" has no attribute "allow_code_execution"
|
||||
agent.tools += agent.get_code_execution_tools() # type: ignore # "BaseAgent" has no attribute "get_code_execution_tools"; maybe "get_delegation_tools"?
|
||||
|
||||
if hasattr(agent, "step_callback") and not agent.step_callback:
|
||||
agent.step_callback = self.step_callback
|
||||
if not agent.step_callback: # type: ignore # "BaseAgent" has no attribute "step_callback"
|
||||
agent.step_callback = self.step_callback # type: ignore # "BaseAgent" has no attribute "step_callback"
|
||||
|
||||
agent.create_agent_executor()
|
||||
|
||||
@@ -313,17 +403,12 @@ class Crew(BaseModel):
|
||||
if self.process == Process.sequential:
|
||||
result = self._run_sequential_process()
|
||||
elif self.process == Process.hierarchical:
|
||||
# type: ignore # Unpacking a string is disallowed
|
||||
result, manager_metrics = self._run_hierarchical_process()
|
||||
# type: ignore # Cannot determine type of "manager_metrics"
|
||||
metrics.append(manager_metrics)
|
||||
result = self._run_hierarchical_process() # type: ignore # Incompatible types in assignment (expression has type "str | dict[str, Any]", variable has type "str")
|
||||
else:
|
||||
raise NotImplementedError(
|
||||
f"The process '{self.process}' is not implemented yet."
|
||||
)
|
||||
metrics = metrics + [
|
||||
agent._token_process.get_summary() for agent in self.agents
|
||||
]
|
||||
metrics += [agent._token_process.get_summary() for agent in self.agents]
|
||||
|
||||
self.usage_metrics = {
|
||||
key: sum([m[key] for m in metrics if m is not None]) for key in metrics[0]
|
||||
@@ -331,57 +416,116 @@ class Crew(BaseModel):
|
||||
|
||||
return result
|
||||
|
||||
def kickoff_for_each(self, inputs: List[Dict[str, Any]]) -> List:
|
||||
def kickoff_for_each(self, inputs: List[Dict[str, Any]]) -> List[CrewOutput]:
|
||||
"""Executes the Crew's workflow for each input in the list and aggregates results."""
|
||||
results = []
|
||||
results: List[CrewOutput] = []
|
||||
|
||||
# Initialize the parent crew's usage metrics
|
||||
total_usage_metrics = {
|
||||
"total_tokens": 0,
|
||||
"prompt_tokens": 0,
|
||||
"completion_tokens": 0,
|
||||
"successful_requests": 0,
|
||||
}
|
||||
|
||||
for input_data in inputs:
|
||||
crew = self.copy()
|
||||
|
||||
for task in crew.tasks:
|
||||
task.interpolate_inputs(input_data)
|
||||
for agent in crew.agents:
|
||||
agent.interpolate_inputs(input_data)
|
||||
output = crew.kickoff(inputs=input_data)
|
||||
|
||||
if crew.usage_metrics:
|
||||
for key in total_usage_metrics:
|
||||
total_usage_metrics[key] += crew.usage_metrics.get(key, 0)
|
||||
|
||||
output = crew.kickoff()
|
||||
results.append(output)
|
||||
|
||||
self.usage_metrics = total_usage_metrics
|
||||
return results
|
||||
|
||||
async def kickoff_async(
|
||||
self, inputs: Optional[Dict[str, Any]] = {}
|
||||
) -> Union[str, Dict]:
|
||||
async def kickoff_async(self, inputs: Optional[Dict[str, Any]] = {}) -> CrewOutput:
|
||||
"""Asynchronous kickoff method to start the crew execution."""
|
||||
return await asyncio.to_thread(self.kickoff, inputs)
|
||||
|
||||
async def kickoff_for_each_async(self, inputs: List[Dict]) -> List[Any]:
|
||||
async def run_crew(input_data):
|
||||
crew = self.copy()
|
||||
async def kickoff_for_each_async(self, inputs: List[Dict]) -> List[CrewOutput]:
|
||||
crew_copies = [self.copy() for _ in inputs]
|
||||
|
||||
for task in crew.tasks:
|
||||
task.interpolate_inputs(input_data)
|
||||
for agent in crew.agents:
|
||||
agent.interpolate_inputs(input_data)
|
||||
async def run_crew(crew, input_data):
|
||||
return await crew.kickoff_async(inputs=input_data)
|
||||
|
||||
return await crew.kickoff_async()
|
||||
|
||||
tasks = [asyncio.create_task(run_crew(input_data)) for input_data in inputs]
|
||||
tasks = [
|
||||
asyncio.create_task(run_crew(crew_copies[i], inputs[i]))
|
||||
for i in range(len(inputs))
|
||||
]
|
||||
tasks = [
|
||||
asyncio.create_task(run_crew(crew_copies[i], inputs[i]))
|
||||
for i in range(len(inputs))
|
||||
]
|
||||
|
||||
results = await asyncio.gather(*tasks)
|
||||
|
||||
total_usage_metrics = {
|
||||
"total_tokens": 0,
|
||||
"prompt_tokens": 0,
|
||||
"completion_tokens": 0,
|
||||
"successful_requests": 0,
|
||||
}
|
||||
for crew in crew_copies:
|
||||
if crew.usage_metrics:
|
||||
for key in total_usage_metrics:
|
||||
total_usage_metrics[key] += crew.usage_metrics.get(key, 0)
|
||||
|
||||
self.usage_metrics = total_usage_metrics
|
||||
|
||||
total_usage_metrics = {
|
||||
"total_tokens": 0,
|
||||
"prompt_tokens": 0,
|
||||
"completion_tokens": 0,
|
||||
"successful_requests": 0,
|
||||
}
|
||||
for crew in crew_copies:
|
||||
if crew.usage_metrics:
|
||||
for key in total_usage_metrics:
|
||||
total_usage_metrics[key] += crew.usage_metrics.get(key, 0)
|
||||
|
||||
self.usage_metrics = total_usage_metrics
|
||||
|
||||
return results
|
||||
|
||||
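The async kickoff methods above boil down to running the blocking `kickoff` in a thread and fanning out one asyncio task per input. A self-contained sketch of that pattern, with a sleep standing in for the real crew run:

```python
import asyncio
import time


def blocking_kickoff(inputs: dict) -> str:
    time.sleep(0.1)  # stands in for the crew's synchronous kickoff()
    return f"done: {inputs}"


async def kickoff_async(inputs: dict) -> str:
    # Run the blocking call in a worker thread so the event loop stays free.
    return await asyncio.to_thread(blocking_kickoff, inputs)


async def kickoff_for_each_async(all_inputs: list[dict]) -> list[str]:
    # One copy of the work per input, gathered concurrently.
    tasks = [asyncio.create_task(kickoff_async(i)) for i in all_inputs]
    return await asyncio.gather(*tasks)


print(asyncio.run(kickoff_for_each_async([{"topic": "AI LLMs"}, {"topic": "RAG"}])))
```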
def _run_sequential_process(self) -> str:
|
||||
def _run_sequential_process(self) -> CrewOutput:
|
||||
"""Executes tasks sequentially and returns the final output."""
|
||||
task_output = ""
|
||||
token_usage = []
|
||||
task_outputs: List[TaskOutput] = []
|
||||
futures: List[Tuple[Task, Future[TaskOutput]]] = []
|
||||
|
||||
for task in self.tasks:
|
||||
if task.agent.allow_delegation: # type: ignore # Item "None" of "Agent | None" has no attribute "allow_delegation"
|
||||
if task.agent and task.agent.allow_delegation:
|
||||
agents_for_delegation = [
|
||||
agent for agent in self.agents if agent != task.agent
|
||||
]
|
||||
if len(self.agents) > 1 and len(agents_for_delegation) > 0:
|
||||
task.tools += task.agent.get_delegation_tools(agents_for_delegation)
|
||||
delegation_tools = task.agent.get_delegation_tools(
|
||||
agents_for_delegation
|
||||
)
|
||||
|
||||
# Add tools if they are not already in task.tools
|
||||
for new_tool in delegation_tools:
|
||||
# Find the index of the tool with the same name
|
||||
existing_tool_index = next(
|
||||
(
|
||||
index
|
||||
for index, tool in enumerate(task.tools or [])
|
||||
if tool.name == new_tool.name
|
||||
),
|
||||
None,
|
||||
)
|
||||
if not task.tools:
|
||||
task.tools = []
|
||||
|
||||
if existing_tool_index is not None:
|
||||
# Replace the existing tool
|
||||
task.tools[existing_tool_index] = new_tool
|
||||
else:
|
||||
# Add the new tool
|
||||
task.tools.append(new_tool)
|
||||
|
||||
role = task.agent.role if task.agent is not None else "None"
|
||||
self._logger.log("debug", f"== Working Agent: {role}", color="bold_purple")
|
||||
@@ -393,34 +537,85 @@ class Crew(BaseModel):
|
||||
self._file_handler.log(
|
||||
agent=role, task=task.description, status="started"
|
||||
)
|
||||
output = task.execute(context=task_output)
|
||||
|
||||
if not task.async_execution:
|
||||
task_output = output
|
||||
if task.async_execution:
|
||||
context = (
|
||||
aggregate_raw_outputs_from_tasks(task.context)
|
||||
if task.context
|
||||
else aggregate_raw_outputs_from_task_outputs(task_outputs)
|
||||
)
|
||||
future = task.execute_async(
|
||||
agent=task.agent, context=context, tools=task.tools
|
||||
)
|
||||
futures.append((task, future))
|
||||
else:
|
||||
# Before executing a synchronous task, wait for all async tasks to complete
|
||||
if futures:
|
||||
# Clear task_outputs before processing async tasks
|
||||
task_outputs = []
|
||||
for future_task, future in futures:
|
||||
task_output = future.result()
|
||||
task_outputs.append(task_output)
|
||||
self._process_task_result(future_task, task_output)
|
||||
|
||||
role = task.agent.role if task.agent is not None else "None"
|
||||
self._logger.log("debug", f"== [{role}] Task output: {task_output}\n\n")
|
||||
token_summ = task.agent._token_process.get_summary()
|
||||
# Clear the futures list after processing all async results
|
||||
futures.clear()
|
||||
|
||||
token_usage.append(token_summ)
|
||||
context = (
|
||||
aggregate_raw_outputs_from_tasks(task.context)
|
||||
if task.context
|
||||
else aggregate_raw_outputs_from_task_outputs(task_outputs)
|
||||
)
|
||||
task_output = task.execute_sync(
|
||||
agent=task.agent, context=context, tools=task.tools
|
||||
)
|
||||
task_outputs = [task_output]
|
||||
self._process_task_result(task, task_output)
|
||||
|
||||
if self.output_log_file:
|
||||
self._file_handler.log(agent=role, task=task_output, status="completed")
|
||||
if futures:
|
||||
# Clear task_outputs before processing async tasks
|
||||
task_outputs = []
|
||||
for future_task, future in futures:
|
||||
task_output = future.result()
|
||||
task_outputs.append(task_output)
|
||||
self._process_task_result(future_task, task_output)
|
||||
|
||||
token_usage_formatted = self.aggregate_token_usage(token_usage)
|
||||
self._finish_execution(task_output)
|
||||
# Important: There should only be one task output in the list
|
||||
# If there are more or 0, something went wrong.
|
||||
if len(task_outputs) != 1:
|
||||
raise ValueError(
|
||||
"Something went wrong. Kickoff should return only one task output."
|
||||
)
|
||||
|
||||
# type: ignore # Incompatible return value type (got "tuple[str, Any]", expected "str")
|
||||
return self._format_output(task_output, token_usage_formatted)
|
||||
final_task_output = task_outputs[0]
|
||||
|
||||
def _run_hierarchical_process(self) -> Union[str, Dict[str, Any]]:
|
||||
final_string_output = final_task_output.raw
|
||||
self._finish_execution(final_string_output)
|
||||
|
||||
token_usage = self.calculate_usage_metrics()
|
||||
|
||||
return CrewOutput(
|
||||
raw=final_task_output.raw,
|
||||
pydantic=final_task_output.pydantic,
|
||||
json_dict=final_task_output.json_dict,
|
||||
tasks_output=[task.output for task in self.tasks if task.output],
|
||||
token_usage=token_usage,
|
||||
)
|
||||
|
||||
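Both process loops above queue async tasks as futures and drain them before any synchronous task (and again at the end). A stripped-down, hedged sketch of that control flow with plain callables standing in for tasks:

```python
import threading
from concurrent.futures import Future


def run_async(fn) -> Future:
    # Same mechanism the diff uses: a thread computes the result and resolves a Future.
    future: Future = Future()
    threading.Thread(target=lambda: future.set_result(fn())).start()
    return future


def run_pipeline(tasks):
    """tasks is a list of (is_async, callable) pairs."""
    outputs, futures = [], []
    for is_async, fn in tasks:
        if is_async:
            futures.append(run_async(fn))
            continue
        # Before a synchronous step, wait for every queued async step to finish.
        outputs.extend(f.result() for f in futures)
        futures.clear()
        outputs.append(fn())
    outputs.extend(f.result() for f in futures)  # drain anything still pending at the end
    return outputs


print(run_pipeline([(True, lambda: "a"), (True, lambda: "b"), (False, lambda: "c")]))
```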
def _process_task_result(self, task: Task, output: TaskOutput) -> None:
|
||||
role = task.agent.role if task.agent is not None else "None"
|
||||
self._logger.log("debug", f"== [{role}] Task output: {output}\n\n")
|
||||
if self.output_log_file:
|
||||
self._file_handler.log(agent=role, task=output, status="completed")
|
||||
|
||||
# TODO: @joao, Breaking change. Changed return type. Usage metrics is included in crewoutput
|
||||
def _run_hierarchical_process(self) -> CrewOutput:
|
||||
"""Creates and assigns a manager agent to make sure the crew completes the tasks."""
|
||||
|
||||
i18n = I18N(prompt_file=self.prompt_file)
|
||||
if self.manager_agent is not None:
|
||||
self.manager_agent.allow_delegation = True
|
||||
manager = self.manager_agent
|
||||
if len(manager.tools) > 0:
|
||||
if manager.tools is not None and len(manager.tools) > 0:
|
||||
raise Exception("Manager agent should not have tools")
|
||||
manager.tools = self.manager_agent.get_delegation_tools(self.agents)
|
||||
else:
|
||||
@@ -432,9 +627,12 @@ class Crew(BaseModel):
|
||||
llm=self.manager_llm,
|
||||
verbose=self.verbose,
|
||||
)
|
||||
self.manager_agent = manager
|
||||
|
||||
task_output = ""
|
||||
token_usage = []
|
||||
task_outputs: List[TaskOutput] = []
|
||||
futures: List[Tuple[Task, Future[TaskOutput]]] = []
|
||||
|
||||
# TODO: IF USER OVERRIDE THE CONTEXT, PASS THAT
|
||||
for task in self.tasks:
|
||||
self._logger.log("debug", f"Working Agent: {manager.role}")
|
||||
self._logger.log("info", f"Starting Task: {task.description}")
|
||||
@@ -444,29 +642,70 @@ class Crew(BaseModel):
|
||||
agent=manager.role, task=task.description, status="started"
|
||||
)
|
||||
|
||||
task_output = task.execute(
|
||||
agent=manager, context=task_output, tools=manager.tools
|
||||
if task.async_execution:
|
||||
context = (
|
||||
aggregate_raw_outputs_from_tasks(task.context)
|
||||
if task.context
|
||||
else aggregate_raw_outputs_from_task_outputs(task_outputs)
|
||||
)
|
||||
future = task.execute_async(
|
||||
agent=manager, context=context, tools=manager.tools
|
||||
)
|
||||
futures.append((task, future))
|
||||
else:
|
||||
# Before executing a synchronous task, wait for all async tasks to complete
|
||||
if futures:
|
||||
# Clear task_outputs before processing async tasks
|
||||
task_outputs = []
|
||||
for future_task, future in futures:
|
||||
task_output = future.result()
|
||||
task_outputs.append(task_output)
|
||||
self._process_task_result(future_task, task_output)
|
||||
|
||||
# Clear the futures list after processing all async results
|
||||
futures.clear()
|
||||
|
||||
context = (
|
||||
aggregate_raw_outputs_from_tasks(task.context)
|
||||
if task.context
|
||||
else aggregate_raw_outputs_from_task_outputs(task_outputs)
|
||||
)
|
||||
task_output = task.execute_sync(
|
||||
agent=manager, context=context, tools=manager.tools
|
||||
)
|
||||
task_outputs = [task_output]
|
||||
self._process_task_result(task, task_output)
|
||||
|
||||
# Process any remaining async results
|
||||
if futures:
|
||||
# Clear task_outputs before processing async tasks
|
||||
task_outputs = []
|
||||
for future_task, future in futures:
|
||||
task_output = future.result()
|
||||
task_outputs.append(task_output)
|
||||
self._process_task_result(future_task, task_output)
|
||||
|
||||
# Important: There should only be one task output in the list
|
||||
# If there are more or 0, something went wrong.
|
||||
if len(task_outputs) != 1:
|
||||
raise ValueError(
|
||||
"Something went wrong. Kickoff should return only one task output."
|
||||
)
|
||||
|
||||
self._logger.log("debug", f"[{manager.role}] Task output: {task_output}")
|
||||
if hasattr(task, "agent._token_process"):
|
||||
token_summ = task.agent._token_process.get_summary()
|
||||
token_usage.append(token_summ)
|
||||
if self.output_log_file:
|
||||
self._file_handler.log(
|
||||
agent=manager.role, task=task_output, status="completed"
|
||||
)
|
||||
final_task_output = task_outputs[0]
|
||||
|
||||
self._finish_execution(task_output)
|
||||
final_string_output = final_task_output.raw
|
||||
self._finish_execution(final_string_output)
|
||||
|
||||
# type: ignore # Incompatible return value type (got "tuple[str, Any]", expected "str")
|
||||
manager_token_usage = manager._token_process.get_summary()
|
||||
token_usage.append(manager_token_usage)
|
||||
token_usage_formatted = self.aggregate_token_usage(token_usage)
|
||||
token_usage = self.calculate_usage_metrics()
|
||||
|
||||
return self._format_output(
|
||||
task_output, token_usage_formatted
|
||||
), manager_token_usage
|
||||
return CrewOutput(
|
||||
raw=final_task_output.raw,
|
||||
pydantic=final_task_output.pydantic,
|
||||
json_dict=final_task_output.json_dict,
|
||||
tasks_output=[task.output for task in self.tasks if task.output],
|
||||
token_usage=token_usage,
|
||||
)
|
||||
|
||||
def copy(self):
|
||||
"""Create a deep copy of the Crew."""
|
||||
@@ -481,12 +720,13 @@ class Crew(BaseModel):
|
||||
"_short_term_memory",
|
||||
"_long_term_memory",
|
||||
"_entity_memory",
|
||||
"_telemetry",
|
||||
"agents",
|
||||
"tasks",
|
||||
}
|
||||
|
||||
cloned_agents = [agent.copy() for agent in self.agents]
|
||||
cloned_tasks = [task.copy() for task in self.tasks]
|
||||
cloned_tasks = [task.copy(cloned_agents) for task in self.tasks]
|
||||
|
||||
copied_data = self.model_dump(exclude=exclude)
|
||||
copied_data = {k: v for k, v in copied_data.items() if v is not None}
|
||||
@@ -517,32 +757,37 @@ class Crew(BaseModel):
|
||||
for agent in self.agents:
|
||||
agent.interpolate_inputs(inputs)
|
||||
|
||||
def _format_output(
|
||||
self, output: str, token_usage: Optional[Dict[str, Any]] = None
|
||||
) -> Union[str, Dict[str, Any]]:
|
||||
"""
|
||||
Formats the output of the crew execution.
|
||||
If full_output is True, then returned data type will be a dictionary else returned outputs are string
|
||||
"""
|
||||
if self.full_output:
|
||||
return { # type: ignore # Incompatible return value type (got "dict[str, Sequence[str | TaskOutput | None]]", expected "str")
|
||||
"final_output": output,
|
||||
"tasks_outputs": [task.output for task in self.tasks if task],
|
||||
"usage_metrics": token_usage,
|
||||
}
|
||||
else:
|
||||
return output
|
||||
|
||||
def _finish_execution(self, output) -> None:
|
||||
def _finish_execution(self, final_string_output: str) -> None:
|
||||
if self.max_rpm:
|
||||
self._rpm_controller.stop_rpm_counter()
|
||||
self._telemetry.end_crew(self, output)
|
||||
if agentops:
|
||||
agentops.end_session(
|
||||
end_state="Success",
|
||||
end_state_reason="Finished Execution",
|
||||
)
|
||||
self._telemetry.end_crew(self, final_string_output)
|
||||
|
||||
def calculate_usage_metrics(self) -> Dict[str, int]:
|
||||
"""Calculates and returns the usage metrics."""
|
||||
total_usage_metrics = {
|
||||
"total_tokens": 0,
|
||||
"prompt_tokens": 0,
|
||||
"completion_tokens": 0,
|
||||
"successful_requests": 0,
|
||||
}
|
||||
|
||||
for agent in self.agents:
|
||||
if hasattr(agent, "_token_process"):
|
||||
token_sum = agent._token_process.get_summary()
|
||||
for key in total_usage_metrics:
|
||||
total_usage_metrics[key] += token_sum.get(key, 0)
|
||||
|
||||
if self.manager_agent and hasattr(self.manager_agent, "_token_process"):
|
||||
token_sum = self.manager_agent._token_process.get_summary()
|
||||
for key in total_usage_metrics:
|
||||
total_usage_metrics[key] += token_sum.get(key, 0)
|
||||
|
||||
return total_usage_metrics
|
||||
|
||||
def __repr__(self):
|
||||
return f"Crew(id={self.id}, process={self.process}, number_of_agents={len(self.agents)}, number_of_tasks={len(self.tasks)})"
|
||||
|
||||
def aggregate_token_usage(self, token_usage_list: List[Dict[str, Any]]):
|
||||
return {
|
||||
key: sum([m[key] for m in token_usage_list if m is not None])
|
||||
for key in token_usage_list[0]
|
||||
}
|
||||
|
||||
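A tiny illustration of the key-by-key aggregation done by `calculate_usage_metrics` and `aggregate_token_usage`, using made-up per-agent summaries:

```python
per_agent_summaries = [
    {"total_tokens": 1200, "prompt_tokens": 900, "completion_tokens": 300, "successful_requests": 3},
    {"total_tokens": 800, "prompt_tokens": 600, "completion_tokens": 200, "successful_requests": 2},
    None,  # agents without a token process contribute nothing
]

totals = {k: 0 for k in ("total_tokens", "prompt_tokens", "completion_tokens", "successful_requests")}
for summary in per_agent_summaries:
    if summary:
        for key in totals:
            totals[key] += summary.get(key, 0)

print(totals)  # {'total_tokens': 2000, 'prompt_tokens': 1500, 'completion_tokens': 500, 'successful_requests': 5}
```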
1
src/crewai/crews/__init__.py
Normal file
@@ -0,0 +1 @@
from .crew_output import CrewOutput

60
src/crewai/crews/crew_output.py
Normal file
@@ -0,0 +1,60 @@
import json
from typing import Any, Dict, Optional

from pydantic import BaseModel, Field

from crewai.tasks.output_format import OutputFormat
from crewai.tasks.task_output import TaskOutput


class CrewOutput(BaseModel):
    """Class that represents the result of a crew."""

    raw: str = Field(description="Raw output of crew", default="")
    pydantic: Optional[BaseModel] = Field(
        description="Pydantic output of Crew", default=None
    )
    json_dict: Optional[Dict[str, Any]] = Field(
        description="JSON dict output of Crew", default=None
    )
    tasks_output: list[TaskOutput] = Field(
        description="Output of each task", default=[]
    )
    token_usage: Dict[str, Any] = Field(
        description="Processed token summary", default={}
    )

    # TODO: Joao - Adding this safety check breaks when people want to see
    # the full output of a CrewOutput.
    # @property
    # def pydantic(self) -> Optional[BaseModel]:
    #     # Check if the final task output included a pydantic model
    #     if self.tasks_output[-1].output_format != OutputFormat.PYDANTIC:
    #         raise ValueError(
    #             "No pydantic model found in the final task. Please make sure to set the output_pydantic property in the final task in your crew."
    #         )

    #     return self._pydantic

    @property
    def json(self) -> Optional[str]:
        if self.tasks_output[-1].output_format != OutputFormat.JSON:
            raise ValueError(
                "No JSON output found in the final task. Please make sure to set the output_json property in the final task in your crew."
            )

        return json.dumps(self.json_dict)

    def to_dict(self) -> Dict[str, Any]:
        if self.json_dict:
            return self.json_dict
        if self.pydantic:
            return self.pydantic.model_dump()
        raise ValueError("No output to convert to dictionary")

    def __str__(self):
        if self.pydantic:
            return str(self.pydantic)
        if self.json_dict:
            return str(self.json_dict)
        return self.raw

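Based only on the fields defined in `CrewOutput` above, consuming a result might look like the sketch below; it assumes `crew` is an already-configured `Crew` instance, so it is not runnable on its own:

```python
# Not self-contained: assumes `crew` is a configured Crew whose final task sets output_json or output_pydantic.
result = crew.kickoff(inputs={"topic": "AI LLMs"})

print(result.raw)             # raw text of the final task, always populated
if result.json_dict:
    print(result.to_dict())   # dict view when the final task produced JSON
if result.pydantic:
    print(result.pydantic)    # model instance when the final task produced a pydantic output

for task_output in result.tasks_output:
    print(task_output.description, task_output.raw)

print(result.token_usage)     # aggregated token summary for the whole run
```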
@@ -1,9 +1,11 @@
|
||||
import json
|
||||
import os
|
||||
import re
|
||||
import threading
|
||||
import uuid
|
||||
from copy import deepcopy
|
||||
from typing import Any, Dict, List, Optional, Type
|
||||
from concurrent.futures import Future
|
||||
from copy import copy
|
||||
from typing import Any, Dict, List, Optional, Tuple, Type, Union
|
||||
|
||||
from langchain_openai import ChatOpenAI
|
||||
from opentelemetry.trace import Span
|
||||
@@ -11,9 +13,12 @@ from pydantic import UUID4, BaseModel, Field, field_validator, model_validator
|
||||
from pydantic_core import PydanticCustomError
|
||||
|
||||
from crewai.agents.agent_builder.base_agent import BaseAgent
|
||||
from crewai.tasks.output_format import OutputFormat
|
||||
from crewai.tasks.task_output import TaskOutput
|
||||
from crewai.telemetry.telemetry import Telemetry
|
||||
from crewai.utilities import I18N, ConverterError, Printer
|
||||
from crewai.utilities.converter import Converter, ConverterError
|
||||
from crewai.utilities.i18n import I18N
|
||||
from crewai.utilities.printer import Printer
|
||||
from crewai.utilities.pydantic_schema_parser import PydanticSchemaParser
|
||||
|
||||
|
||||
@@ -95,6 +100,10 @@ class Task(BaseModel):
|
||||
description="Whether the task should have a human review the final answer of the agent",
|
||||
default=False,
|
||||
)
|
||||
converter_cls: Optional[Type[Converter]] = Field(
|
||||
description="A converter class used to export structured output",
|
||||
default=None,
|
||||
)
|
||||
|
||||
_telemetry: Telemetry
|
||||
_execution_span: Span | None = None
|
||||
@@ -155,82 +164,74 @@ class Task(BaseModel):
|
||||
)
|
||||
return self
|
||||
|
||||
def wait_for_completion(self) -> str | BaseModel:
|
||||
"""Wait for asynchronous task completion and return the output."""
|
||||
assert self.async_execution, "Task is not set to be executed asynchronously."
|
||||
def execute_sync(
|
||||
self,
|
||||
agent: Optional[BaseAgent] = None,
|
||||
context: Optional[str] = None,
|
||||
tools: Optional[List[Any]] = None,
|
||||
) -> TaskOutput:
|
||||
"""Execute the task synchronously."""
|
||||
return self._execute_core(agent, context, tools)
|
||||
|
||||
if self._thread:
|
||||
self._thread.join()
|
||||
self._thread = None
|
||||
|
||||
assert self.output, "Task output is not set."
|
||||
|
||||
return self.output.exported_output
|
||||
|
||||
def execute( # type: ignore # Missing return statement
|
||||
def execute_async(
|
||||
self,
|
||||
agent: BaseAgent | None = None,
|
||||
context: Optional[str] = None,
|
||||
tools: Optional[List[Any]] = None,
|
||||
) -> str:
|
||||
"""Execute the task.
|
||||
) -> Future[TaskOutput]:
|
||||
"""Execute the task asynchronously."""
|
||||
future: Future[TaskOutput] = Future()
|
||||
threading.Thread(
|
||||
target=self._execute_task_async, args=(agent, context, tools, future)
|
||||
).start()
|
||||
return future
|
||||
|
||||
Returns:
|
||||
Output of the task.
|
||||
"""
|
||||
|
||||
self._execution_span = self._telemetry.task_started(self)
|
||||
def _execute_task_async(
|
||||
self,
|
||||
agent: Optional[BaseAgent],
|
||||
context: Optional[str],
|
||||
tools: Optional[List[Any]],
|
||||
future: Future[TaskOutput],
|
||||
) -> None:
|
||||
"""Execute the task asynchronously with context handling."""
|
||||
result = self._execute_core(agent, context, tools)
|
||||
future.set_result(result)
|
||||
|
||||
def _execute_core(
|
||||
self,
|
||||
agent: Optional[BaseAgent],
|
||||
context: Optional[str],
|
||||
tools: Optional[List[Any]],
|
||||
) -> TaskOutput:
|
||||
"""Run the core execution logic of the task."""
|
||||
agent = agent or self.agent
|
||||
if not agent:
|
||||
raise Exception(
|
||||
f"The task '{self.description}' has no agent assigned, therefore it can't be executed directly and should be executed in a Crew using a specific process that support that, like hierarchical."
|
||||
)
|
||||
|
||||
if self.context:
|
||||
# type: ignore # Incompatible types in assignment (expression has type "list[Never]", variable has type "str | None")
|
||||
context = []
|
||||
for task in self.context:
|
||||
if task.async_execution:
|
||||
task.wait_for_completion()
|
||||
if task.output:
|
||||
# type: ignore # Item "str" of "str | None" has no attribute "append"
|
||||
context.append(task.output.raw_output)
|
||||
# type: ignore # Argument 1 to "join" of "str" has incompatible type "str | None"; expected "Iterable[str]"
|
||||
context = "\n".join(context)
|
||||
self._execution_span = self._telemetry.task_started(crew=agent.crew, task=self)
|
||||
|
||||
self.prompt_context = context
|
||||
tools = tools or self.tools
|
||||
tools = tools or self.tools or []
|
||||
|
||||
if self.async_execution:
|
||||
self._thread = threading.Thread(
|
||||
target=self._execute, args=(agent, self, context, tools)
|
||||
)
|
||||
self._thread.start()
|
||||
else:
|
||||
result = self._execute(
|
||||
task=self,
|
||||
agent=agent,
|
||||
context=context,
|
||||
tools=tools,
|
||||
)
|
||||
return result
|
||||
|
||||
def _execute(self, agent, task, context, tools):
|
||||
result = agent.execute_task(
|
||||
task=task,
|
||||
task=self,
|
||||
context=context,
|
||||
tools=tools,
|
||||
)
|
||||
exported_output = self._export_output(result)
|
||||
|
||||
# type: ignore # the responses are usually str but need to figure out a more elegant solution here
|
||||
self.output = TaskOutput(
|
||||
pydantic_output, json_output = self._export_output(result)
|
||||
|
||||
task_output = TaskOutput(
|
||||
description=self.description,
|
||||
exported_output=exported_output,
|
||||
raw_output=result,
|
||||
raw=result,
|
||||
pydantic=pydantic_output,
|
||||
json_dict=json_output,
|
||||
agent=agent.role,
|
||||
output_format=self._get_output_format(),
|
||||
)
|
||||
self.output = task_output
|
||||
|
||||
if self.callback:
|
||||
self.callback(self.output)
|
||||
@@ -239,7 +240,15 @@ class Task(BaseModel):
|
||||
self._telemetry.task_ended(self._execution_span, self)
|
||||
self._execution_span = None
|
||||
|
||||
return exported_output
|
||||
if self.output_file:
|
||||
content = (
|
||||
json_output
|
||||
if json_output
|
||||
else pydantic_output.model_dump_json() if pydantic_output else result
|
||||
)
|
||||
self._save_file(content)
|
||||
|
||||
return task_output
|
||||
|
||||
def prompt(self) -> str:
|
||||
"""Prompt the task.
|
||||
@@ -274,7 +283,7 @@ class Task(BaseModel):
|
||||
"""Increment the delegations counter."""
|
||||
self.delegations += 1
|
||||
|
||||
def copy(self):
|
||||
def copy(self, agents: List["BaseAgent"]) -> "Task":
|
||||
"""Create a deep copy of the Task."""
|
||||
exclude = {
|
||||
"id",
|
||||
@@ -287,10 +296,14 @@ class Task(BaseModel):
|
||||
copied_data = {k: v for k, v in copied_data.items() if v is not None}
|
||||
|
||||
cloned_context = (
|
||||
[task.copy() for task in self.context] if self.context else None
|
||||
[task.copy(agents) for task in self.context] if self.context else None
|
||||
)
|
||||
cloned_agent = self.agent.copy() if self.agent else None
|
||||
cloned_tools = deepcopy(self.tools) if self.tools else []
|
||||
|
||||
def get_agent_by_role(role: str) -> Union["BaseAgent", None]:
|
||||
return next((agent for agent in agents if agent.role == role), None)
|
||||
|
||||
cloned_agent = get_agent_by_role(self.agent.role) if self.agent else None
|
||||
cloned_tools = copy(self.tools) if self.tools else []
|
||||
|
||||
copied_task = Task(
|
||||
**copied_data,
|
||||
@@ -298,81 +311,126 @@ class Task(BaseModel):
|
||||
agent=cloned_agent,
|
||||
tools=cloned_tools,
|
||||
)
|
||||
|
||||
return copied_task
|
||||
|
||||
def _export_output(self, result: str) -> Any:
|
||||
exported_result = result
|
||||
instructions = "I'm gonna convert this raw text into valid JSON."
|
||||
def _create_converter(self, *args, **kwargs) -> Converter:
|
||||
"""Create a converter instance."""
|
||||
converter = self.agent.get_output_converter(*args, **kwargs)
|
||||
if self.converter_cls:
|
||||
converter = self.converter_cls(*args, **kwargs)
|
||||
return converter
|
||||
|
||||
def _export_output(
|
||||
self, result: str
|
||||
) -> Tuple[Optional[BaseModel], Optional[Dict[str, Any]]]:
|
||||
pydantic_output: Optional[BaseModel] = None
|
||||
json_output: Optional[Dict[str, Any]] = None
|
||||
|
||||
if self.output_pydantic or self.output_json:
|
||||
model = self.output_pydantic or self.output_json
|
||||
model_output = self._convert_to_model(result)
|
||||
pydantic_output = (
|
||||
model_output if isinstance(model_output, BaseModel) else None
|
||||
)
|
||||
if isinstance(model_output, str):
|
||||
try:
|
||||
json_output = json.loads(model_output)
|
||||
except json.JSONDecodeError:
|
||||
json_output = None
|
||||
else:
|
||||
json_output = model_output if isinstance(model_output, dict) else None
|
||||
|
||||
# try to convert task_output directly to pydantic/json
|
||||
return pydantic_output, json_output
|
||||
|
||||
def _convert_to_model(self, result: str) -> Union[dict, BaseModel, str]:
|
||||
model = self.output_pydantic or self.output_json
|
||||
if model is None:
|
||||
return result
|
||||
|
||||
try:
|
||||
return self._validate_model(result, model)
|
||||
except Exception:
|
||||
return self._handle_partial_json(result, model)
|
||||
|
||||
def _validate_model(
|
||||
self, result: str, model: Type[BaseModel]
|
||||
) -> Union[dict, BaseModel]:
|
||||
exported_result = model.model_validate_json(result)
|
||||
if self.output_json:
|
||||
return exported_result.model_dump()
|
||||
return exported_result
|
||||
|
||||
def _handle_partial_json(
|
||||
self, result: str, model: Type[BaseModel]
|
||||
) -> Union[dict, BaseModel, str]:
|
||||
match = re.search(r"({.*})", result, re.DOTALL)
|
||||
if match:
|
||||
try:
|
||||
# type: ignore # Item "None" of "type[BaseModel] | None" has no attribute "model_validate_json"
|
||||
exported_result = model.model_validate_json(result)
|
||||
exported_result = model.model_validate_json(match.group(0))
|
||||
if self.output_json:
|
||||
# type: ignore # "str" has no attribute "model_dump"
|
||||
return exported_result.model_dump()
|
||||
return exported_result
|
||||
except Exception:
|
||||
# sometimes the response contains valid JSON in the middle of text
|
||||
match = re.search(r"({.*})", result, re.DOTALL)
|
||||
if match:
|
||||
try:
|
||||
# type: ignore # Item "None" of "type[BaseModel] | None" has no attribute "model_validate_json"
|
||||
exported_result = model.model_validate_json(match.group(0))
|
||||
if self.output_json:
|
||||
# type: ignore # "str" has no attribute "model_dump"
|
||||
return exported_result.model_dump()
|
||||
return exported_result
|
||||
except Exception:
|
||||
pass
|
||||
pass
|
||||
|
||||
# type: ignore # Item "None" of "Agent | None" has no attribute "function_calling_llm"
|
||||
llm = getattr(self.agent, "function_calling_llm", None) or self.agent.llm
|
||||
if not self._is_gpt(llm):
|
||||
# type: ignore # Argument "model" to "PydanticSchemaParser" has incompatible type "type[BaseModel] | None"; expected "type[BaseModel]"
|
||||
model_schema = PydanticSchemaParser(model=model).get_schema()
|
||||
instructions = f"{instructions}\n\nThe json should have the following structure, with the following keys:\n{model_schema}"
|
||||
return self._convert_with_instructions(result, model)
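When strict validation of the full result fails, _handle_partial_json looks for a JSON object embedded in the surrounding text and validates just that span before falling back to instruction-based conversion. A self-contained sketch of the pattern, with an illustrative model:

```python
import re
from typing import Union

from pydantic import BaseModel, ValidationError


class Answer(BaseModel):
    city: str
    temperature: int


def handle_partial_json(result: str) -> Union[Answer, str]:
    # Grab the first {...} span in the text and try to validate it; keep the raw string otherwise.
    match = re.search(r"({.*})", result, re.DOTALL)
    if match:
        try:
            return Answer.model_validate_json(match.group(0))
        except ValidationError:
            pass
    return result


text = 'Sure! Here you go: {"city": "SF", "temperature": 18} Hope that helps.'
parsed = handle_partial_json(text)
assert isinstance(parsed, Answer) and parsed.city == "SF"
```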
|
||||
|
||||
converter = self.agent.get_output_converter(
|
||||
llm=llm, text=result, model=model, instructions=instructions
|
||||
def _convert_with_instructions(
|
||||
self, result: str, model: Type[BaseModel]
|
||||
) -> Union[dict, BaseModel, str]:
|
||||
llm = self.agent.function_calling_llm or self.agent.llm
|
||||
instructions = self._get_conversion_instructions(model, llm)
|
||||
|
||||
converter = self._create_converter(
|
||||
llm=llm, text=result, model=model, instructions=instructions
|
||||
)
|
||||
exported_result = (
|
||||
converter.to_pydantic() if self.output_pydantic else converter.to_json()
|
||||
)
|
||||
|
||||
if isinstance(exported_result, ConverterError):
|
||||
Printer().print(
|
||||
content=f"{exported_result.message} Using raw output instead.",
|
||||
color="red",
|
||||
)
|
||||
|
||||
if self.output_pydantic:
|
||||
exported_result = converter.to_pydantic()
|
||||
elif self.output_json:
|
||||
exported_result = converter.to_json()
|
||||
|
||||
if isinstance(exported_result, ConverterError):
|
||||
Printer().print(
|
||||
content=f"{exported_result.message} Using raw output instead.",
|
||||
color="red",
|
||||
)
|
||||
exported_result = result
|
||||
|
||||
if self.output_file:
|
||||
content = (
|
||||
# type: ignore # "str" has no attribute "json"
|
||||
exported_result if not self.output_pydantic else exported_result.json()
|
||||
)
|
||||
self._save_file(content)
|
||||
return result
|
||||
|
||||
return exported_result
|
||||
|
||||
def _get_output_format(self) -> OutputFormat:
|
||||
if self.output_json:
|
||||
return OutputFormat.JSON
|
||||
if self.output_pydantic:
|
||||
return OutputFormat.PYDANTIC
|
||||
return OutputFormat.RAW
|
||||
|
||||
def _get_conversion_instructions(self, model: Type[BaseModel], llm: Any) -> str:
|
||||
instructions = "I'm gonna convert this raw text into valid JSON."
|
||||
if not self._is_gpt(llm):
|
||||
model_schema = PydanticSchemaParser(model=model).get_schema()
|
||||
instructions = f"{instructions}\n\nThe json should have the following structure, with the following keys:\n{model_schema}"
|
||||
return instructions
|
||||
|
||||
def _save_output(self, content: str) -> None:
|
||||
if not self.output_file:
|
||||
raise Exception("Output file path is not set.")
|
||||
|
||||
directory = os.path.dirname(self.output_file)
|
||||
if directory and not os.path.exists(directory):
|
||||
os.makedirs(directory)
|
||||
with open(self.output_file, "w", encoding="utf-8") as file:
|
||||
file.write(content)
|
||||
|
||||
def _is_gpt(self, llm) -> bool:
|
||||
return isinstance(llm, ChatOpenAI) and llm.openai_api_base is None
|
||||
|
||||
def _save_file(self, result: Any) -> None:
|
||||
# type: ignore # Value of type variable "AnyOrLiteralStr" of "dirname" cannot be "str | None"
|
||||
directory = os.path.dirname(self.output_file)
|
||||
directory = os.path.dirname(self.output_file) # type: ignore # Value of type variable "AnyOrLiteralStr" of "dirname" cannot be "str | None"
|
||||
|
||||
if directory and not os.path.exists(directory):
|
||||
os.makedirs(directory)
|
||||
|
||||
# type: ignore # Argument 1 to "open" has incompatible type "str | None"; expected "int | str | bytes | PathLike[str] | PathLike[bytes]"
|
||||
with open(self.output_file, "w", encoding="utf-8") as file:
|
||||
with open(self.output_file, "w", encoding="utf-8") as file: # type: ignore # Argument 1 to "open" has incompatible type "str | None"; expected "int | str | bytes | PathLike[str] | PathLike[bytes]"
|
||||
file.write(result)
|
||||
return None
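Stripped of the inline type-ignore notes, the file-saving helper reduces to the snippet below; the output path is illustrative:

```python
import os


def save_file(output_file: str, content: str) -> None:
    # Create the parent directory on demand, then write the content as UTF-8 text.
    directory = os.path.dirname(output_file)
    if directory and not os.path.exists(directory):
        os.makedirs(directory)
    with open(output_file, "w", encoding="utf-8") as file:
        file.write(content)


save_file("temp/reports/ai.md", "Dogs are loyal companions.")
```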
|
||||
|
||||
|
||||
@@ -0,0 +1,4 @@
from crewai.tasks.output_format import OutputFormat
from crewai.tasks.task_output import TaskOutput

__all__ = ["OutputFormat", "TaskOutput"]
|
||||
|
||||
9
src/crewai/tasks/output_format.py
Normal file
@@ -0,0 +1,9 @@
from enum import Enum


class OutputFormat(str, Enum):
    """Enum that represents the output format of a task."""

    JSON = "json"
    PYDANTIC = "pydantic"
    RAW = "raw"
|
||||
@@ -1,24 +1,78 @@
|
||||
from typing import Optional, Union
|
||||
import json
|
||||
from typing import Any, Dict, Optional
|
||||
|
||||
from pydantic import BaseModel, Field, model_validator
|
||||
|
||||
from crewai.tasks.output_format import OutputFormat
|
||||
|
||||
|
||||
class TaskOutput(BaseModel):
|
||||
"""Class that represents the result of a task."""
|
||||
|
||||
description: str = Field(description="Description of the task")
|
||||
summary: Optional[str] = Field(description="Summary of the task", default=None)
|
||||
exported_output: Union[str, BaseModel] = Field(
|
||||
description="Output of the task", default=None
|
||||
raw: str = Field(
|
||||
description="Raw output of the task", default=""
|
||||
) # TODO: @joao: breaking change: renamed raw_output to raw, now consistent with CrewOutput
|
||||
pydantic: Optional[BaseModel] = Field(
|
||||
description="Pydantic output of task", default=None
|
||||
)
|
||||
json_dict: Optional[Dict[str, Any]] = Field(
|
||||
description="JSON dictionary of task", default=None
|
||||
)
|
||||
agent: str = Field(description="Agent that executed the task")
|
||||
raw_output: str = Field(description="Result of the task")
|
||||
output_format: OutputFormat = Field(
|
||||
description="Output format of the task", default=OutputFormat.RAW
|
||||
)
|
||||
|
||||
@model_validator(mode="after")
|
||||
def set_summary(self):
|
||||
"""Set the summary field based on the description."""
|
||||
excerpt = " ".join(self.description.split(" ")[:10])
|
||||
self.summary = f"{excerpt}..."
|
||||
return self
|
||||
|
||||
def result(self):
|
||||
return self.exported_output
|
||||
# TODO: Joao - Adding this safety check breaks when people want to see
|
||||
# The full output of a TaskOutput or CrewOutput.
|
||||
# @property
|
||||
# def pydantic(self) -> Optional[BaseModel]:
|
||||
# # Check if the final task output included a pydantic model
|
||||
# if self.output_format != OutputFormat.PYDANTIC:
|
||||
# raise ValueError(
|
||||
# """
|
||||
# Invalid output format requested.
|
||||
# If you would like to access the pydantic model,
|
||||
# please make sure to set the output_pydantic property for the task.
|
||||
# """
|
||||
# )
|
||||
|
||||
# return self._pydantic
|
||||
|
||||
@property
|
||||
def json(self) -> Optional[str]:
|
||||
if self.output_format != OutputFormat.JSON:
|
||||
raise ValueError(
|
||||
"""
|
||||
Invalid output format requested.
|
||||
If you would like to access the JSON output,
|
||||
please make sure to set the output_json property for the task
|
||||
"""
|
||||
)
|
||||
|
||||
return json.dumps(self.json_dict)
|
||||
|
||||
def to_dict(self) -> Dict[str, Any]:
|
||||
"""Convert json_output and pydantic_output to a dictionary."""
|
||||
output_dict = {}
|
||||
if self.json_dict:
|
||||
output_dict.update(self.json_dict)
|
||||
if self.pydantic:
|
||||
output_dict.update(self.pydantic.model_dump())
|
||||
return output_dict
|
||||
|
||||
def __str__(self) -> str:
|
||||
if self.pydantic:
|
||||
return str(self.pydantic)
|
||||
if self.json_dict:
|
||||
return str(self.json_dict)
|
||||
return self.raw
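With the fields shown in this diff, a TaskOutput carries the raw text plus optional structured views, and the json property only answers when the task was configured for JSON output. A short usage sketch; the values are made up:

```python
from crewai.tasks.output_format import OutputFormat
from crewai.tasks.task_output import TaskOutput

output = TaskOutput(
    description="Summarize the latest findings on AI agents",
    agent="Researcher",
    raw="Agents collaborate through delegated tasks.",
    json_dict={"headline": "Agents collaborate"},
    output_format=OutputFormat.JSON,
)

print(output.summary)    # first ten words of the description, followed by "..."
print(output.to_dict())  # {'headline': 'Agents collaborate'}
print(output.json)       # '{"headline": "Agents collaborate"}'; raises ValueError if format is not JSON
print(str(output))       # prefers pydantic, then json_dict, then raw
```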
|
||||
|
||||
@@ -80,7 +80,7 @@ class Telemetry:
|
||||
self.ready = False
|
||||
self.trace_set = False
|
||||
|
||||
def crew_creation(self, crew):
|
||||
def crew_creation(self, crew: Crew, inputs: dict[str, Any] | None):
|
||||
"""Records the creation of a crew."""
|
||||
if self.ready:
|
||||
try:
|
||||
@@ -93,6 +93,12 @@ class Telemetry:
|
||||
)
|
||||
self._add_attribute(span, "python_version", platform.python_version())
|
||||
self._add_attribute(span, "crew_id", str(crew.id))
|
||||
|
||||
if crew.share_crew:
|
||||
self._add_attribute(
|
||||
span, "crew_inputs", json.dumps(inputs) if inputs else None
|
||||
)
|
||||
|
||||
self._add_attribute(span, "crew_process", crew.process)
|
||||
self._add_attribute(span, "crew_memory", crew.memory)
|
||||
self._add_attribute(span, "crew_number_of_tasks", len(crew.tasks))
|
||||
@@ -114,7 +120,7 @@ class Telemetry:
|
||||
"llm": json.dumps(self._safe_llm_attributes(agent.llm)),
|
||||
"delegation_enabled?": agent.allow_delegation,
|
||||
"tools_names": [
|
||||
tool.name.casefold() for tool in agent.tools
|
||||
tool.name.casefold() for tool in agent.tools or []
|
||||
],
|
||||
}
|
||||
for agent in crew.agents
|
||||
@@ -139,7 +145,7 @@ class Telemetry:
|
||||
else None
|
||||
),
|
||||
"tools_names": [
|
||||
tool.name.casefold() for tool in task.tools
|
||||
tool.name.casefold() for tool in task.tools or []
|
||||
],
|
||||
}
|
||||
for task in crew.tasks
|
||||
@@ -156,18 +162,40 @@ class Telemetry:
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
def task_started(self, task: Task) -> Span | None:
|
||||
def task_started(self, crew: Crew, task: Task) -> Span | None:
|
||||
"""Records task started in a crew."""
|
||||
if self.ready:
|
||||
try:
|
||||
tracer = trace.get_tracer("crewai.telemetry")
|
||||
|
||||
created_span = tracer.start_span("Task Created")
|
||||
|
||||
self._add_attribute(created_span, "crew_id", str(crew.id))
|
||||
self._add_attribute(created_span, "task_index", crew.tasks.index(task))
|
||||
self._add_attribute(created_span, "task_id", str(task.id))
|
||||
|
||||
if crew.share_crew:
|
||||
self._add_attribute(
|
||||
created_span, "formatted_description", task.description
|
||||
)
|
||||
self._add_attribute(
|
||||
created_span, "formatted_expected_output", task.expected_output
|
||||
)
|
||||
|
||||
created_span.set_status(Status(StatusCode.OK))
|
||||
created_span.end()
|
||||
|
||||
span = tracer.start_span("Task Execution")
|
||||
|
||||
self._add_attribute(span, "crew_id", str(crew.id))
|
||||
self._add_attribute(span, "task_index", crew.tasks.index(task))
|
||||
self._add_attribute(span, "task_id", str(task.id))
|
||||
self._add_attribute(span, "formatted_description", task.description)
|
||||
self._add_attribute(
|
||||
span, "formatted_expected_output", task.expected_output
|
||||
)
|
||||
|
||||
if crew.share_crew:
|
||||
self._add_attribute(span, "formatted_description", task.description)
|
||||
self._add_attribute(
|
||||
span, "formatted_expected_output", task.expected_output
|
||||
)
|
||||
|
||||
return span
|
||||
except Exception:
|
||||
@@ -258,6 +286,8 @@ class Telemetry:
|
||||
"""
|
||||
if (self.ready) and (crew.share_crew):
|
||||
try:
|
||||
self.crew_creation(crew, inputs)
|
||||
|
||||
tracer = trace.get_tracer("crewai.telemetry")
|
||||
span = tracer.start_span("Crew Execution")
|
||||
self._add_attribute(
|
||||
@@ -266,7 +296,9 @@ class Telemetry:
|
||||
pkg_resources.get_distribution("crewai").version,
|
||||
)
|
||||
self._add_attribute(span, "crew_id", str(crew.id))
|
||||
self._add_attribute(span, "inputs", json.dumps(inputs))
|
||||
self._add_attribute(
|
||||
span, "crew_inputs", json.dumps(inputs) if inputs else None
|
||||
)
|
||||
self._add_attribute(
|
||||
span,
|
||||
"crew_agents",
|
||||
@@ -320,7 +352,7 @@ class Telemetry:
|
||||
except Exception:
|
||||
pass
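The telemetry changes follow the standard OpenTelemetry span lifecycle: start a span, attach attributes only when the data exists (for example behind the share_crew gate), set a status, and end it. A minimal sketch with made-up attribute values:

```python
import json

from opentelemetry import trace
from opentelemetry.trace import Status, StatusCode

tracer = trace.get_tracer("crewai.telemetry")
span = tracer.start_span("Task Created")

inputs = {"topic": "AI LLMs"}  # illustrative kickoff inputs
span.set_attribute("crew_id", "crew-123")
if inputs:
    # Only serialize and attach inputs when they were actually provided.
    span.set_attribute("crew_inputs", json.dumps(inputs))

span.set_status(Status(StatusCode.OK))
span.end()
```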
|
||||
|
||||
def end_crew(self, crew, output):
|
||||
def end_crew(self, crew, final_string_output):
|
||||
if (self.ready) and (crew.share_crew):
|
||||
try:
|
||||
self._add_attribute(
|
||||
@@ -328,7 +360,9 @@ class Telemetry:
|
||||
"crewai_version",
|
||||
pkg_resources.get_distribution("crewai").version,
|
||||
)
|
||||
self._add_attribute(crew._execution_span, "crew_output", output)
|
||||
self._add_attribute(
|
||||
crew._execution_span, "crew_output", final_string_output
|
||||
)
|
||||
self._add_attribute(
|
||||
crew._execution_span,
|
||||
"crew_tasks_output",
|
||||
|
||||
@@ -7,7 +7,7 @@ class AgentTools(BaseAgentTools):
|
||||
"""Default tools around agent delegation"""
|
||||
|
||||
def tools(self):
|
||||
coworkers = f"[{', '.join([f'{agent.role}' for agent in self.agents])}]"
|
||||
coworkers = ", ".join([f"{agent.role}" for agent in self.agents])
|
||||
tools = [
|
||||
StructuredTool.from_function(
|
||||
func=self.delegate_work,
|
||||
|
||||
@@ -8,7 +8,7 @@ from pydantic.v1 import BaseModel, Field
|
||||
class ToolCalling(BaseModel):
|
||||
tool_name: str = Field(..., description="The name of the tool to be called.")
|
||||
arguments: Optional[Dict[str, Any]] = Field(
|
||||
..., description="A dictinary of arguments to be passed to the tool."
|
||||
..., description="A dictionary of arguments to be passed to the tool."
|
||||
)
|
||||
|
||||
|
||||
@@ -17,5 +17,5 @@ class InstructorToolCalling(PydanticBaseModel):
|
||||
..., description="The name of the tool to be called."
|
||||
)
|
||||
arguments: Optional[Dict[str, Any]] = PydanticField(
|
||||
..., description="A dictinary of arguments to be passed to the tool."
|
||||
..., description="A dictionary of arguments to be passed to the tool."
|
||||
)
|
||||
|
||||
@@ -11,6 +11,11 @@ from crewai.telemetry import Telemetry
|
||||
from crewai.tools.tool_calling import InstructorToolCalling, ToolCalling
|
||||
from crewai.utilities import I18N, Converter, ConverterError, Printer
|
||||
|
||||
try:
|
||||
import agentops
|
||||
except ImportError:
|
||||
agentops = None
|
||||
|
||||
OPENAI_BIGGER_MODELS = ["gpt-4"]
|
||||
|
||||
|
||||
@@ -45,6 +50,7 @@ class ToolUsage:
|
||||
tools_names: str,
|
||||
task: Any,
|
||||
function_calling_llm: Any,
|
||||
agent: Any,
|
||||
action: Any,
|
||||
) -> None:
|
||||
self._i18n: I18N = I18N()
|
||||
@@ -53,6 +59,7 @@ class ToolUsage:
|
||||
self._run_attempts: int = 1
|
||||
self._max_parsing_attempts: int = 3
|
||||
self._remember_format_after_usages: int = 3
|
||||
self.agent = agent
|
||||
self.tools_description = tools_description
|
||||
self.tools_names = tools_names
|
||||
self.tools_handler = tools_handler
|
||||
@@ -98,7 +105,8 @@ class ToolUsage:
|
||||
tool_string: str,
|
||||
tool: BaseTool,
|
||||
calling: Union[ToolCalling, InstructorToolCalling],
|
||||
) -> str: # TODO: Fix this return type --> finecwg : I updated return type to str
|
||||
) -> str: # TODO: Fix this return type
|
||||
tool_event = agentops.ToolEvent(name=calling.tool_name) if agentops else None
|
||||
if self._check_tool_repeated_usage(calling=calling): # type: ignore # _check_tool_repeated_usage of "ToolUsage" does not return a value (it only ever returns None)
|
||||
try:
|
||||
result = self._i18n.errors("task_repeated_usage").format(
|
||||
@@ -111,7 +119,7 @@ class ToolUsage:
|
||||
attempts=self._run_attempts,
|
||||
)
|
||||
result = self._format_result(result=result) # type: ignore # "_format_result" of "ToolUsage" does not return a value (it only ever returns None)
|
||||
return result # type: ignore # Fix the reutrn type of this function
|
||||
return result # type: ignore # Fix the return type of this function
|
||||
|
||||
except Exception:
|
||||
self.task.increment_tools_errors()
|
||||
@@ -123,7 +131,11 @@ class ToolUsage:
|
||||
tool=calling.tool_name, input=calling.arguments
|
||||
)
|
||||
|
||||
if result is None: #! finecwg: if not result --> if result is None
|
||||
original_tool = next(
|
||||
(ot for ot in self.original_tools if ot.name == tool.name), None
|
||||
)
|
||||
|
||||
if result is None: #! finecwg: if not result --> if result is None
|
||||
try:
|
||||
if calling.tool_name in [
|
||||
"Delegate work to coworker",
|
||||
@@ -139,16 +151,12 @@ class ToolUsage:
|
||||
for k, v in calling.arguments.items()
|
||||
if k in acceptable_args
|
||||
}
|
||||
result = tool._run(**arguments)
|
||||
result = tool.invoke(input=arguments)
|
||||
except Exception:
|
||||
if tool.args_schema:
|
||||
arguments = calling.arguments
|
||||
result = tool._run(**arguments)
|
||||
else:
|
||||
arguments = calling.arguments.values() # type: ignore # Incompatible types in assignment (expression has type "dict_values[str, Any]", variable has type "dict[str, Any]")
|
||||
result = tool._run(*arguments)
|
||||
arguments = calling.arguments
|
||||
result = tool.invoke(input=arguments)
|
||||
else:
|
||||
result = tool._run()
|
||||
result = tool.invoke(input={})
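Replacing tool._run(**arguments) with tool.invoke(input=arguments) routes calls through LangChain's Runnable interface, which validates the arguments against the tool's schema before execution. A small sketch with an illustrative tool:

```python
from langchain.tools import StructuredTool


def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b


tool = StructuredTool.from_function(func=multiply)

# invoke() checks the dict against the tool's args schema, then calls the wrapped function.
print(tool.invoke(input={"a": 6, "b": 7}))  # 42
```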
|
||||
except Exception as e:
|
||||
self._run_attempts += 1
|
||||
if self._run_attempts > self._max_parsing_attempts:
|
||||
@@ -164,13 +172,14 @@ class ToolUsage:
|
||||
return error # type: ignore # No return value expected
|
||||
|
||||
self.task.increment_tools_errors()
|
||||
if agentops:
|
||||
agentops.record(
|
||||
agentops.ErrorEvent(exception=e, trigger_event=tool_event)
|
||||
)
|
||||
return self.use(calling=calling, tool_string=tool_string) # type: ignore # No return value expected
|
||||
|
||||
if self.tools_handler:
|
||||
should_cache = True
|
||||
original_tool = next(
|
||||
(ot for ot in self.original_tools if ot.name == tool.name), None
|
||||
)
|
||||
if (
|
||||
hasattr(original_tool, "cache_function")
|
||||
and original_tool.cache_function # type: ignore # Item "None" of "Any | None" has no attribute "cache_function"
|
||||
@@ -184,12 +193,29 @@ class ToolUsage:
|
||||
)
|
||||
|
||||
self._printer.print(content=f"\n\n{result}\n", color="purple")
|
||||
if agentops:
|
||||
agentops.record(tool_event)
|
||||
self._telemetry.tool_usage(
|
||||
llm=self.function_calling_llm,
|
||||
tool_name=tool.name,
|
||||
attempts=self._run_attempts,
|
||||
)
|
||||
result = self._format_result(result=result) # type: ignore # "_format_result" of "ToolUsage" does not return a value (it only ever returns None)
|
||||
data = {
|
||||
"result": result,
|
||||
"tool_name": tool.name,
|
||||
"tool_args": calling.arguments,
|
||||
}
|
||||
|
||||
if (
|
||||
hasattr(original_tool, "result_as_answer")
|
||||
and original_tool.result_as_answer # type: ignore # Item "None" of "Any | None" has no attribute "cache_function"
|
||||
):
|
||||
result_as_answer = original_tool.result_as_answer # type: ignore # Item "None" of "Any | None" has no attribute "result_as_answer"
|
||||
data["result_as_answer"] = result_as_answer
|
||||
|
||||
self.agent.tools_results.append(data)
|
||||
|
||||
return result # type: ignore # No return value expected
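Each successful tool call is now recorded on the agent as a small dict, and result_as_answer is included only when the tool was flagged to short-circuit the final answer. The recorded shape, using the values from the test further below:

```python
from typing import Any, Dict, List

tools_results: List[Dict[str, Any]] = []

result = "Howdy!"
tool_name = "Get Greetings"
tool_args: Dict[str, Any] = {}
result_as_answer = True  # set by the user on the tool

data: Dict[str, Any] = {"result": result, "tool_name": tool_name, "tool_args": tool_args}
if result_as_answer:
    # The flag travels with the recorded usage so the executor can stop at this result.
    data["result_as_answer"] = result_as_answer

tools_results.append(data)
assert tools_results[-1]["result"] == "Howdy!"
```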
|
||||
|
||||
def _format_result(self, result: Any) -> None:
|
||||
@@ -290,7 +316,7 @@ class ToolUsage:
|
||||
Example:
|
||||
{"tool_name": "tool name", "arguments": {"arg_name1": "value", "arg_name2": 2}}""",
|
||||
),
|
||||
max_attemps=1,
|
||||
max_attempts=1,
|
||||
)
|
||||
calling = converter.to_pydantic()
|
||||
|
||||
|
||||
@@ -16,8 +16,8 @@
|
||||
"format_without_tools": "\nSorry, I didn't use the right format. I MUST either use a tool (among the available ones), OR give my best final answer.\nI just remembered the expected format I must follow:\n\nQuestion: the input question you must answer\nThought: you should always think about what to do\nAction: the action to take, should be one of [{tool_names}]\nAction Input: the input to the action\nObservation: the result of the action\n... (this Thought/Action/Action Input/Observation can repeat N times)\nThought: I now can give a great answer\nFinal Answer: my best complete final answer to the task\nYour final answer must be the great and the most complete as possible, it must be outcome described\n\n",
|
||||
"task_with_context": "{task}\n\nThis is the context you're working with:\n{context}",
|
||||
"expected_output": "\nThis is the expect criteria for your final answer: {expected_output} \n you MUST return the actual complete content as the final answer, not a summary.",
|
||||
"human_feedback": "You got human feedback on your work, re-avaluate it and give a new Final Answer when ready.\n {human_feedback}",
|
||||
"getting_input": "This is the agent final answer: {final_answer}\nPlease provide a feedback: "
|
||||
"human_feedback": "You got human feedback on your work, re-evaluate it and give a new Final Answer when ready.\n {human_feedback}",
|
||||
"getting_input": "This is the agent's final answer: {final_answer}\nPlease provide feedback: "
|
||||
},
|
||||
"errors": {
|
||||
"force_final_answer": "Tool won't be use because it's time to give your final answer. Don't use tools and just your absolute BEST Final answer.",
|
||||
|
||||
@@ -2,10 +2,8 @@ import json
|
||||
|
||||
from langchain.schema import HumanMessage, SystemMessage
|
||||
from langchain_openai import ChatOpenAI
|
||||
from pydantic import model_validator
|
||||
from crewai.agents.agent_builder.utilities.base_output_converter_base import (
|
||||
OutputConverter,
|
||||
)
|
||||
|
||||
from crewai.agents.agent_builder.utilities.base_output_converter import OutputConverter
|
||||
|
||||
|
||||
class ConverterError(Exception):
|
||||
@@ -19,20 +17,15 @@ class ConverterError(Exception):
|
||||
class Converter(OutputConverter):
|
||||
"""Class that converts text into either pydantic or json."""
|
||||
|
||||
@model_validator(mode="after")
|
||||
def check_llm_provider(self):
|
||||
if not self._is_gpt(self.llm):
|
||||
self._is_gpt = False
|
||||
|
||||
def to_pydantic(self, current_attempt=1):
|
||||
"""Convert text to pydantic."""
|
||||
try:
|
||||
if self._is_gpt:
|
||||
if self.is_gpt:
|
||||
return self._create_instructor().to_pydantic()
|
||||
else:
|
||||
return self._create_chain().invoke({})
|
||||
except Exception as e:
|
||||
if current_attempt < self.max_attemps:
|
||||
if current_attempt < self.max_attempts:
|
||||
return self.to_pydantic(current_attempt + 1)
|
||||
return ConverterError(
|
||||
f"Failed to convert text into a pydantic model due to the following error: {e}"
|
||||
@@ -41,12 +34,12 @@ class Converter(OutputConverter):
|
||||
def to_json(self, current_attempt=1):
|
||||
"""Convert text to json."""
|
||||
try:
|
||||
if self._is_gpt:
|
||||
if self.is_gpt:
|
||||
return self._create_instructor().to_json()
|
||||
else:
|
||||
return json.dumps(self._create_chain().invoke({}).model_dump())
|
||||
except Exception:
|
||||
if current_attempt < self.max_attemps:
|
||||
if current_attempt < self.max_attempts:
|
||||
return self.to_json(current_attempt + 1)
|
||||
return ConverterError("Failed to convert text into JSON.")
|
||||
|
||||
@@ -56,7 +49,7 @@ class Converter(OutputConverter):
|
||||
|
||||
inst = Instructor(
|
||||
llm=self.llm,
|
||||
max_attemps=self.max_attemps,
|
||||
max_attempts=self.max_attempts,
|
||||
model=self.model,
|
||||
content=self.text,
|
||||
instructions=self.instructions,
|
||||
@@ -75,5 +68,7 @@ class Converter(OutputConverter):
|
||||
)
|
||||
return new_prompt | self.llm | parser
|
||||
|
||||
def _is_gpt(self, llm) -> bool: # type: ignore # BUG? Name "_is_gpt" defined on line 20 hides name from outer scope
|
||||
return isinstance(llm, ChatOpenAI) and llm.openai_api_base is None
|
||||
@property
|
||||
def is_gpt(self) -> bool:
|
||||
"""Return if llm provided is of gpt from openai."""
|
||||
return isinstance(self.llm, ChatOpenAI) and self.llm.openai_api_base is None
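The converter now exposes is_gpt as a property and retries a failed conversion up to max_attempts before surfacing a ConverterError. A hedged sketch of both ideas; the helper names here are illustrative, not the Converter API:

```python
from langchain_openai import ChatOpenAI


def is_gpt(llm) -> bool:
    # Treat the llm as an OpenAI GPT model only when it talks to the default OpenAI endpoint.
    return isinstance(llm, ChatOpenAI) and llm.openai_api_base is None


def convert_with_retries(convert, max_attempts: int = 3):
    # Retry the conversion a bounded number of times before giving up.
    last_error = None
    for _ in range(max_attempts):
        try:
            return convert()
        except Exception as error:  # broad on purpose, mirroring Converter
            last_error = error
    raise last_error
```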
|
||||
|
||||
@@ -6,6 +6,17 @@ from pydantic import BaseModel, Field
|
||||
from crewai.utilities import Converter
|
||||
from crewai.utilities.pydantic_schema_parser import PydanticSchemaParser
|
||||
|
||||
agentops = None
|
||||
try:
|
||||
from agentops import track_agent
|
||||
except ImportError:
|
||||
|
||||
def track_agent(name):
|
||||
def noop(f):
|
||||
return f
|
||||
|
||||
return noop
|
||||
|
||||
|
||||
class Entity(BaseModel):
|
||||
name: str = Field(description="The name of the entity.")
|
||||
@@ -38,6 +49,7 @@ class TrainingTaskEvaluation(BaseModel):
|
||||
)
|
||||
|
||||
|
||||
@track_agent(name="Task Evaluator")
|
||||
class TaskEvaluator:
|
||||
def __init__(self, original_agent):
|
||||
self.llm = original_agent.llm
|
||||
|
||||
@@ -31,9 +31,8 @@ class PickleHandler:
|
||||
- file_name (str): The name of the file for saving and loading data.
|
||||
"""
|
||||
self.file_path = os.path.join(os.getcwd(), file_name)
|
||||
self._initialize_file()
|
||||
|
||||
def _initialize_file(self) -> None:
|
||||
def initialize_file(self) -> None:
|
||||
"""
|
||||
Initialize the file with an empty dictionary if it does not exist or is empty.
|
||||
"""
|
||||
|
||||
20
src/crewai/utilities/formatter.py
Normal file
@@ -0,0 +1,20 @@
from typing import List

from crewai.task import Task
from crewai.tasks.task_output import TaskOutput


def aggregate_raw_outputs_from_task_outputs(task_outputs: List[TaskOutput]) -> str:
    """Generate string context from the task outputs."""
    dividers = "\n\n----------\n\n"

    # Join task outputs with dividers
    context = dividers.join(output.raw for output in task_outputs)
    return context


def aggregate_raw_outputs_from_tasks(tasks: List[Task]) -> str:
    """Generate string context from the tasks."""
    task_outputs = [task.output for task in tasks if task.output is not None]

    return aggregate_raw_outputs_from_task_outputs(task_outputs)
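Assuming the module lands at crewai.utilities.formatter as the file path above suggests, usage looks roughly like this:

```python
from crewai.tasks.task_output import TaskOutput
from crewai.utilities.formatter import aggregate_raw_outputs_from_task_outputs

outputs = [
    TaskOutput(description="Research dogs", agent="Researcher", raw="Dogs are loyal."),
    TaskOutput(description="Write summary", agent="Writer", raw="Loyalty defines dogs."),
]

# The raw strings are joined with the divider defined above.
print(aggregate_raw_outputs_from_task_outputs(outputs))
```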
|
||||
@@ -8,18 +8,18 @@ from crewai.agents.agent_builder.utilities.base_token_process import TokenProces
|
||||
|
||||
|
||||
class TokenCalcHandler(BaseCallbackHandler):
|
||||
model: str = ""
|
||||
model_name: str = ""
|
||||
token_cost_process: TokenProcess
|
||||
|
||||
def __init__(self, model, token_cost_process):
|
||||
self.model = model
|
||||
def __init__(self, model_name, token_cost_process):
|
||||
self.model_name = model_name
|
||||
self.token_cost_process = token_cost_process
|
||||
|
||||
def on_llm_start(
|
||||
self, serialized: Dict[str, Any], prompts: List[str], **kwargs: Any
|
||||
) -> None:
|
||||
try:
|
||||
encoding = tiktoken.encoding_for_model(self.model)
|
||||
encoding = tiktoken.encoding_for_model(self.model_name)
|
||||
except KeyError:
|
||||
encoding = tiktoken.get_encoding("cl100k_base")
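The callback now keys the tiktoken lookup on model_name and falls back to the cl100k_base encoding for model names tiktoken does not know. A standalone sketch of that lookup:

```python
import tiktoken


def count_tokens(model_name: str, text: str) -> int:
    # Fall back to the general-purpose cl100k_base encoding when the model name is unknown.
    try:
        encoding = tiktoken.encoding_for_model(model_name)
    except KeyError:
        encoding = tiktoken.get_encoding("cl100k_base")
    return len(encoding.encode(text))


print(count_tokens("gpt-4o", "Dogs are incredibly loyal."))
print(count_tokens("my-custom-model", "Dogs are incredibly loyal."))
```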
|
||||
|
||||
|
||||
@@ -12,7 +12,6 @@ from crewai import Agent, Crew, Task
|
||||
from crewai.agents.cache import CacheHandler
|
||||
from crewai.agents.executor import CrewAgentExecutor
|
||||
from crewai.agents.parser import CrewAgentParser
|
||||
|
||||
from crewai.tools.tool_calling import InstructorToolCalling
|
||||
from crewai.tools.tool_usage import ToolUsage
|
||||
from crewai.utilities import RPMController
|
||||
@@ -632,8 +631,9 @@ def test_agent_use_specific_tasks_output_as_context(capsys):
|
||||
|
||||
crew = Crew(agents=[agent1, agent2], tasks=tasks)
|
||||
result = crew.kickoff()
|
||||
assert "bye" not in result.lower()
|
||||
assert "hi" in result.lower() or "hello" in result.lower()
|
||||
print("LOWER RESULT", result.raw)
|
||||
assert "bye" not in result.raw.lower()
|
||||
assert "hi" in result.raw.lower() or "hello" in result.raw.lower()
|
||||
|
||||
|
||||
@pytest.mark.vcr(filter_headers=["authorization"])
|
||||
@@ -645,7 +645,7 @@ def test_agent_step_callback():
|
||||
with patch.object(StepCallback, "callback") as callback:
|
||||
|
||||
@tool
|
||||
def learn_about_AI(topic) -> float:
|
||||
def learn_about_AI(topic) -> str:
|
||||
"""Useful for when you need to learn about AI to write an paragraph about it."""
|
||||
return "AI is a very broad field."
|
||||
|
||||
@@ -679,7 +679,7 @@ def test_agent_function_calling_llm():
|
||||
with patch.object(llm.client, "create", wraps=llm.client.create) as private_mock:
|
||||
|
||||
@tool
|
||||
def learn_about_AI(topic) -> float:
|
||||
def learn_about_AI(topic) -> str:
|
||||
"""Useful for when you need to learn about AI to write an paragraph about it."""
|
||||
return "AI is a very broad field."
|
||||
|
||||
@@ -724,6 +724,74 @@ def test_agent_count_formatting_error():
|
||||
mock_count_errors.assert_called_once()
|
||||
|
||||
|
||||
@pytest.mark.vcr(filter_headers=["authorization"])
|
||||
def test_tool_result_as_answer_is_the_final_answer_for_the_agent():
|
||||
from crewai_tools import BaseTool
|
||||
|
||||
class MyCustomTool(BaseTool):
|
||||
name: str = "Get Greetings"
|
||||
description: str = "Get a random greeting back"
|
||||
|
||||
def _run(self) -> str:
|
||||
return "Howdy!"
|
||||
|
||||
agent1 = Agent(
|
||||
role="Data Scientist",
|
||||
goal="Product amazing resports on AI",
|
||||
backstory="You work with data and AI",
|
||||
tools=[MyCustomTool(result_as_answer=True)],
|
||||
)
|
||||
|
||||
essay = Task(
|
||||
description="Write and then review an small paragraph on AI until it's AMAZING. But first use the `Get Greetings` tool to get a greeting.",
|
||||
expected_output="The final paragraph with the full review on AI and no greeting.",
|
||||
agent=agent1,
|
||||
)
|
||||
tasks = [essay]
|
||||
crew = Crew(agents=[agent1], tasks=tasks)
|
||||
|
||||
result = crew.kickoff()
|
||||
print("RESULT: ", result.raw)
|
||||
assert result.raw == "Howdy!"
|
||||
|
||||
|
||||
@pytest.mark.vcr(filter_headers=["authorization"])
|
||||
def test_tool_usage_information_is_appended_to_agent():
|
||||
from crewai_tools import BaseTool
|
||||
|
||||
class MyCustomTool(BaseTool):
|
||||
name: str = "Decide Greetings"
|
||||
description: str = "Decide what is the appropriate greeting to use"
|
||||
|
||||
def _run(self) -> str:
|
||||
return "Howdy!"
|
||||
|
||||
agent1 = Agent(
|
||||
role="Friendly Neighbor",
|
||||
goal="Make everyone feel welcome",
|
||||
backstory="You are the friendly neighbor",
|
||||
tools=[MyCustomTool(result_as_answer=True)],
|
||||
)
|
||||
|
||||
greeting = Task(
|
||||
description="Say an appropriate greeting.",
|
||||
expected_output="The greeting.",
|
||||
agent=agent1,
|
||||
)
|
||||
tasks = [greeting]
|
||||
crew = Crew(agents=[agent1], tasks=tasks)
|
||||
|
||||
crew.kickoff()
|
||||
assert agent1.tools_results == [
|
||||
{
|
||||
"result": "Howdy!",
|
||||
"tool_name": "Decide Greetings",
|
||||
"tool_args": {},
|
||||
"result_as_answer": True,
|
||||
}
|
||||
]
|
||||
|
||||
|
||||
def test_agent_llm_uses_token_calc_handler_with_llm_has_model_name():
|
||||
agent1 = Agent(
|
||||
role="test role",
|
||||
@@ -734,7 +802,7 @@ def test_agent_llm_uses_token_calc_handler_with_llm_has_model_name():
|
||||
|
||||
assert len(agent1.llm.callbacks) == 1
|
||||
assert agent1.llm.callbacks[0].__class__.__name__ == "TokenCalcHandler"
|
||||
assert agent1.llm.callbacks[0].model == "gpt-4o"
|
||||
assert agent1.llm.callbacks[0].model_name == "gpt-4o"
|
||||
assert (
|
||||
agent1.llm.callbacks[0].token_cost_process.__class__.__name__ == "TokenProcess"
|
||||
)
|
||||
@@ -895,3 +963,54 @@ def test_agent_use_trained_data(crew_training_handler):
|
||||
crew_training_handler.assert_has_calls(
|
||||
[mock.call(), mock.call("trained_agents_data.pkl"), mock.call().load()]
|
||||
)
|
||||
|
||||
|
||||
def test_agent_max_retry_limit():
|
||||
agent = Agent(
|
||||
role="test role",
|
||||
goal="test goal",
|
||||
backstory="test backstory",
|
||||
max_retry_limit=1,
|
||||
)
|
||||
|
||||
task = Task(
|
||||
agent=agent,
|
||||
description="Say the word: Hi",
|
||||
expected_output="The word: Hi",
|
||||
human_input=True,
|
||||
)
|
||||
|
||||
error_message = "Error happening while sending prompt to model."
|
||||
with patch.object(
|
||||
CrewAgentExecutor, "invoke", wraps=agent.agent_executor.invoke
|
||||
) as invoke_mock:
|
||||
invoke_mock.side_effect = Exception(error_message)
|
||||
|
||||
assert agent._times_executed == 0
|
||||
assert agent.max_retry_limit == 1
|
||||
|
||||
with pytest.raises(Exception) as e:
|
||||
agent.execute_task(
|
||||
task=task,
|
||||
)
|
||||
assert e.value.args[0] == error_message
|
||||
assert agent._times_executed == 2
|
||||
|
||||
invoke_mock.assert_has_calls(
|
||||
[
|
||||
mock.call(
|
||||
{
|
||||
"input": "Say the word: Hi\n\nThis is the expect criteria for your final answer: The word: Hi \n you MUST return the actual complete content as the final answer, not a summary.",
|
||||
"tool_names": "",
|
||||
"tools": "",
|
||||
}
|
||||
),
|
||||
mock.call(
|
||||
{
|
||||
"input": "Say the word: Hi\n\nThis is the expect criteria for your final answer: The word: Hi \n you MUST return the actual complete content as the final answer, not a summary.",
|
||||
"tool_names": "",
|
||||
"tools": "",
|
||||
}
|
||||
),
|
||||
]
|
||||
)
|
||||
|
||||
0
tests/agents/__init__.py
Normal file
378
tests/agents/test_crew_agent_parser.py
Normal file
@@ -0,0 +1,378 @@
|
||||
import pytest
|
||||
from crewai.agents.parser import CrewAgentParser
|
||||
from langchain_core.agents import AgentAction, AgentFinish
|
||||
from langchain_core.exceptions import OutputParserException
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def parser():
|
||||
p = CrewAgentParser()
|
||||
p.agent = MockAgent()
|
||||
return p
|
||||
|
||||
|
||||
def test_valid_action_parsing_special_characters(parser):
|
||||
text = "Thought: Let's find the temperature\nAction: search\nAction Input: what's the temperature in SF?"
|
||||
result = parser.parse(text)
|
||||
assert isinstance(result, AgentAction)
|
||||
assert result.tool == "search"
|
||||
assert result.tool_input == "what's the temperature in SF?"
|
||||
|
||||
|
||||
def test_valid_action_parsing_with_json_tool_input(parser):
|
||||
text = """
|
||||
Thought: Let's find the information
|
||||
Action: query
|
||||
Action Input: ** {"task": "What are some common challenges or barriers that you have observed or experienced when implementing AI-powered solutions in healthcare settings?", "context": "As we've discussed recent advancements in AI applications in healthcare, it's crucial to acknowledge the potential hurdles. Some possible obstacles include...", "coworker": "Senior Researcher"}
|
||||
"""
|
||||
result = parser.parse(text)
|
||||
assert isinstance(result, AgentAction)
|
||||
expected_tool_input = '{"task": "What are some common challenges or barriers that you have observed or experienced when implementing AI-powered solutions in healthcare settings?", "context": "As we\'ve discussed recent advancements in AI applications in healthcare, it\'s crucial to acknowledge the potential hurdles. Some possible obstacles include...", "coworker": "Senior Researcher"}'
|
||||
assert result.tool == "query"
|
||||
assert result.tool_input == expected_tool_input
|
||||
|
||||
|
||||
def test_valid_action_parsing_with_quotes(parser):
|
||||
text = 'Thought: Let\'s find the temperature\nAction: search\nAction Input: "temperature in SF"'
|
||||
result = parser.parse(text)
|
||||
assert isinstance(result, AgentAction)
|
||||
assert result.tool == "search"
|
||||
assert result.tool_input == "temperature in SF"
|
||||
|
||||
|
||||
def test_valid_action_parsing_with_curly_braces(parser):
|
||||
text = "Thought: Let's find the temperature\nAction: search\nAction Input: {temperature in SF}"
|
||||
result = parser.parse(text)
|
||||
assert isinstance(result, AgentAction)
|
||||
assert result.tool == "search"
|
||||
assert result.tool_input == "{temperature in SF}"
|
||||
|
||||
|
||||
def test_valid_action_parsing_with_angle_brackets(parser):
|
||||
text = "Thought: Let's find the temperature\nAction: search\nAction Input: <temperature in SF>"
|
||||
result = parser.parse(text)
|
||||
assert isinstance(result, AgentAction)
|
||||
assert result.tool == "search"
|
||||
assert result.tool_input == "<temperature in SF>"
|
||||
|
||||
|
||||
def test_valid_action_parsing_with_parentheses(parser):
|
||||
text = "Thought: Let's find the temperature\nAction: search\nAction Input: (temperature in SF)"
|
||||
result = parser.parse(text)
|
||||
assert isinstance(result, AgentAction)
|
||||
assert result.tool == "search"
|
||||
assert result.tool_input == "(temperature in SF)"
|
||||
|
||||
|
||||
def test_valid_action_parsing_with_mixed_brackets(parser):
|
||||
text = "Thought: Let's find the temperature\nAction: search\nAction Input: [temperature in {SF}]"
|
||||
result = parser.parse(text)
|
||||
assert isinstance(result, AgentAction)
|
||||
assert result.tool == "search"
|
||||
assert result.tool_input == "[temperature in {SF}]"
|
||||
|
||||
|
||||
def test_valid_action_parsing_with_nested_quotes(parser):
|
||||
text = "Thought: Let's find the temperature\nAction: search\nAction Input: \"what's the temperature in 'SF'?\""
|
||||
result = parser.parse(text)
|
||||
assert isinstance(result, AgentAction)
|
||||
assert result.tool == "search"
|
||||
assert result.tool_input == "what's the temperature in 'SF'?"
|
||||
|
||||
|
||||
def test_valid_action_parsing_with_incomplete_json(parser):
|
||||
text = 'Thought: Let\'s find the temperature\nAction: search\nAction Input: {"query": "temperature in SF"'
|
||||
result = parser.parse(text)
|
||||
assert isinstance(result, AgentAction)
|
||||
assert result.tool == "search"
|
||||
assert result.tool_input == '{"query": "temperature in SF"}'
|
||||
|
||||
|
||||
def test_valid_action_parsing_with_special_characters(parser):
|
||||
text = "Thought: Let's find the temperature\nAction: search\nAction Input: what is the temperature in SF? @$%^&*"
|
||||
result = parser.parse(text)
|
||||
assert isinstance(result, AgentAction)
|
||||
assert result.tool == "search"
|
||||
assert result.tool_input == "what is the temperature in SF? @$%^&*"
|
||||
|
||||
|
||||
def test_valid_action_parsing_with_combination(parser):
|
||||
text = 'Thought: Let\'s find the temperature\nAction: search\nAction Input: "[what is the temperature in SF?]"'
|
||||
result = parser.parse(text)
|
||||
assert isinstance(result, AgentAction)
|
||||
assert result.tool == "search"
|
||||
assert result.tool_input == "[what is the temperature in SF?]"
|
||||
|
||||
|
||||
def test_valid_action_parsing_with_mixed_quotes(parser):
|
||||
text = "Thought: Let's find the temperature\nAction: search\nAction Input: \"what's the temperature in SF?\""
|
||||
result = parser.parse(text)
|
||||
assert isinstance(result, AgentAction)
|
||||
assert result.tool == "search"
|
||||
assert result.tool_input == "what's the temperature in SF?"
|
||||
|
||||
|
||||
def test_valid_action_parsing_with_newlines(parser):
|
||||
text = "Thought: Let's find the temperature\nAction: search\nAction Input: what is\nthe temperature in SF?"
|
||||
result = parser.parse(text)
|
||||
assert isinstance(result, AgentAction)
|
||||
assert result.tool == "search"
|
||||
assert result.tool_input == "what is\nthe temperature in SF?"
|
||||
|
||||
|
||||
def test_valid_action_parsing_with_escaped_characters(parser):
|
||||
text = "Thought: Let's find the temperature\nAction: search\nAction Input: what is the temperature in SF? \\n"
|
||||
result = parser.parse(text)
|
||||
assert isinstance(result, AgentAction)
|
||||
assert result.tool == "search"
|
||||
assert result.tool_input == "what is the temperature in SF? \\n"
|
||||
|
||||
|
||||
def test_valid_action_parsing_with_json_string(parser):
|
||||
text = 'Thought: Let\'s find the temperature\nAction: search\nAction Input: {"query": "temperature in SF"}'
|
||||
result = parser.parse(text)
|
||||
assert isinstance(result, AgentAction)
|
||||
assert result.tool == "search"
|
||||
assert result.tool_input == '{"query": "temperature in SF"}'
|
||||
|
||||
|
||||
def test_valid_action_parsing_with_unbalanced_quotes(parser):
|
||||
text = "Thought: Let's find the temperature\nAction: search\nAction Input: \"what is the temperature in SF?"
|
||||
result = parser.parse(text)
|
||||
assert isinstance(result, AgentAction)
|
||||
assert result.tool == "search"
|
||||
assert result.tool_input == "what is the temperature in SF?"
|
||||
|
||||
|
||||
def test_clean_action_no_formatting(parser):
|
||||
action = "Ask question to senior researcher"
|
||||
cleaned_action = parser._clean_action(action)
|
||||
assert cleaned_action == "Ask question to senior researcher"
|
||||
|
||||
|
||||
def test_clean_action_with_leading_asterisks(parser):
|
||||
action = "** Ask question to senior researcher"
|
||||
cleaned_action = parser._clean_action(action)
|
||||
assert cleaned_action == "Ask question to senior researcher"
|
||||
|
||||
|
||||
def test_clean_action_with_trailing_asterisks(parser):
|
||||
action = "Ask question to senior researcher **"
|
||||
cleaned_action = parser._clean_action(action)
|
||||
assert cleaned_action == "Ask question to senior researcher"
|
||||
|
||||
|
||||
def test_clean_action_with_leading_and_trailing_asterisks(parser):
|
||||
action = "** Ask question to senior researcher **"
|
||||
cleaned_action = parser._clean_action(action)
|
||||
assert cleaned_action == "Ask question to senior researcher"
|
||||
|
||||
|
||||
def test_clean_action_with_multiple_leading_asterisks(parser):
|
||||
action = "**** Ask question to senior researcher"
|
||||
cleaned_action = parser._clean_action(action)
|
||||
assert cleaned_action == "Ask question to senior researcher"
|
||||
|
||||
|
||||
def test_clean_action_with_multiple_trailing_asterisks(parser):
|
||||
action = "Ask question to senior researcher ****"
|
||||
cleaned_action = parser._clean_action(action)
|
||||
assert cleaned_action == "Ask question to senior researcher"
|
||||
|
||||
|
||||
def test_clean_action_with_spaces_and_asterisks(parser):
|
||||
action = " ** Ask question to senior researcher ** "
|
||||
cleaned_action = parser._clean_action(action)
|
||||
print(f"Original action: '{action}'")
|
||||
print(f"Cleaned action: '{cleaned_action}'")
|
||||
assert cleaned_action == "Ask question to senior researcher"
|
||||
|
||||
|
||||
def test_clean_action_with_only_asterisks(parser):
|
||||
action = "****"
|
||||
cleaned_action = parser._clean_action(action)
|
||||
assert cleaned_action == ""
|
||||
|
||||
|
||||
def test_clean_action_with_empty_string(parser):
|
||||
action = ""
|
||||
cleaned_action = parser._clean_action(action)
|
||||
assert cleaned_action == ""
|
||||
|
||||
|
||||
def test_valid_final_answer_parsing(parser):
|
||||
text = (
|
||||
"Thought: I found the information\nFinal Answer: The temperature is 100 degrees"
|
||||
)
|
||||
result = parser.parse(text)
|
||||
assert isinstance(result, AgentFinish)
|
||||
assert result.return_values["output"] == "The temperature is 100 degrees"
|
||||
|
||||
|
||||
def test_missing_action_error(parser):
|
||||
text = "Thought: Let's find the temperature\nAction Input: what is the temperature in SF?"
|
||||
with pytest.raises(OutputParserException) as exc_info:
|
||||
parser.parse(text)
|
||||
assert "Could not parse LLM output" in str(exc_info.value)
|
||||
|
||||
|
||||
def test_missing_action_input_error(parser):
|
||||
text = "Thought: Let's find the temperature\nAction: search"
|
||||
with pytest.raises(OutputParserException) as exc_info:
|
||||
parser.parse(text)
|
||||
assert "Could not parse LLM output" in str(exc_info.value)
|
||||
|
||||
|
||||
def test_action_and_final_answer_error(parser):
|
||||
text = "Thought: I found the information\nAction: search\nAction Input: what is the temperature in SF?\nFinal Answer: The temperature is 100 degrees"
|
||||
with pytest.raises(OutputParserException) as exc_info:
|
||||
parser.parse(text)
|
||||
assert "both perform Action and give a Final Answer" in str(exc_info.value)
|
||||
|
||||
|
||||
def test_safe_repair_json(parser):
|
||||
invalid_json = '{"task": "Research XAI", "context": "Explainable AI", "coworker": Senior Researcher'
|
||||
expected_repaired_json = '{"task": "Research XAI", "context": "Explainable AI", "coworker": "Senior Researcher"}'
|
||||
result = parser._safe_repair_json(invalid_json)
|
||||
assert result == expected_repaired_json
|
||||
|
||||
|
||||
def test_safe_repair_json_unrepairable(parser):
|
||||
invalid_json = "{invalid_json"
|
||||
result = parser._safe_repair_json(invalid_json)
|
||||
print("result:", invalid_json)
|
||||
assert result == invalid_json # Should return the original if unrepairable
|
||||
|
||||
|
||||
def test_safe_repair_json_missing_quotes(parser):
|
||||
invalid_json = (
|
||||
'{task: "Research XAI", context: "Explainable AI", coworker: Senior Researcher}'
|
||||
)
|
||||
expected_repaired_json = '{"task": "Research XAI", "context": "Explainable AI", "coworker": "Senior Researcher"}'
|
||||
result = parser._safe_repair_json(invalid_json)
|
||||
assert result == expected_repaired_json
|
||||
|
||||
|
||||
def test_safe_repair_json_unclosed_brackets(parser):
|
||||
invalid_json = '{"task": "Research XAI", "context": "Explainable AI", "coworker": "Senior Researcher"'
|
||||
expected_repaired_json = '{"task": "Research XAI", "context": "Explainable AI", "coworker": "Senior Researcher"}'
|
||||
result = parser._safe_repair_json(invalid_json)
|
||||
assert result == expected_repaired_json
|
||||
|
||||
|
||||
def test_safe_repair_json_extra_commas(parser):
|
||||
invalid_json = '{"task": "Research XAI", "context": "Explainable AI", "coworker": "Senior Researcher",}'
|
||||
expected_repaired_json = '{"task": "Research XAI", "context": "Explainable AI", "coworker": "Senior Researcher"}'
|
||||
result = parser._safe_repair_json(invalid_json)
|
||||
assert result == expected_repaired_json
|
||||
|
||||
|
||||
def test_safe_repair_json_trailing_commas(parser):
|
||||
invalid_json = '{"task": "Research XAI", "context": "Explainable AI", "coworker": "Senior Researcher",}'
|
||||
expected_repaired_json = '{"task": "Research XAI", "context": "Explainable AI", "coworker": "Senior Researcher"}'
|
||||
result = parser._safe_repair_json(invalid_json)
|
||||
assert result == expected_repaired_json
|
||||
|
||||
|
||||
def test_safe_repair_json_single_quotes(parser):
|
||||
invalid_json = "{'task': 'Research XAI', 'context': 'Explainable AI', 'coworker': 'Senior Researcher'}"
|
||||
expected_repaired_json = '{"task": "Research XAI", "context": "Explainable AI", "coworker": "Senior Researcher"}'
|
||||
result = parser._safe_repair_json(invalid_json)
|
||||
assert result == expected_repaired_json
|
||||
|
||||
|
||||
def test_safe_repair_json_mixed_quotes(parser):
|
||||
invalid_json = "{'task': \"Research XAI\", 'context': \"Explainable AI\", 'coworker': 'Senior Researcher'}"
|
||||
expected_repaired_json = '{"task": "Research XAI", "context": "Explainable AI", "coworker": "Senior Researcher"}'
|
||||
result = parser._safe_repair_json(invalid_json)
|
||||
assert result == expected_repaired_json
|
||||
|
||||
|
||||
def test_safe_repair_json_unescaped_characters(parser):
|
||||
invalid_json = '{"task": "Research XAI", "context": "Explainable AI", "coworker": "Senior Researcher\n"}'
|
||||
expected_repaired_json = '{"task": "Research XAI", "context": "Explainable AI", "coworker": "Senior Researcher"}'
|
||||
result = parser._safe_repair_json(invalid_json)
|
||||
print("result:", result)
|
||||
assert result == expected_repaired_json
|
||||
|
||||
|
||||
def test_safe_repair_json_missing_colon(parser):
|
||||
invalid_json = '{"task" "Research XAI", "context": "Explainable AI", "coworker": "Senior Researcher"}'
|
||||
expected_repaired_json = '{"task": "Research XAI", "context": "Explainable AI", "coworker": "Senior Researcher"}'
|
||||
result = parser._safe_repair_json(invalid_json)
|
||||
assert result == expected_repaired_json
|
||||
|
||||
|
||||
def test_safe_repair_json_missing_comma(parser):
|
||||
invalid_json = '{"task": "Research XAI" "context": "Explainable AI", "coworker": "Senior Researcher"}'
|
||||
expected_repaired_json = '{"task": "Research XAI", "context": "Explainable AI", "coworker": "Senior Researcher"}'
|
||||
result = parser._safe_repair_json(invalid_json)
|
||||
assert result == expected_repaired_json
|
||||
|
||||
|
||||
def test_safe_repair_json_unexpected_trailing_characters(parser):
|
||||
invalid_json = '{"task": "Research XAI", "context": "Explainable AI", "coworker": "Senior Researcher"} random text'
|
||||
expected_repaired_json = '{"task": "Research XAI", "context": "Explainable AI", "coworker": "Senior Researcher"}'
|
||||
result = parser._safe_repair_json(invalid_json)
|
||||
assert result == expected_repaired_json
|
||||
|
||||
|
||||
def test_safe_repair_json_special_characters_key(parser):
|
||||
invalid_json = '{"task!@#": "Research XAI", "context$%^": "Explainable AI", "coworker&*()": "Senior Researcher"}'
|
||||
expected_repaired_json = '{"task!@#": "Research XAI", "context$%^": "Explainable AI", "coworker&*()": "Senior Researcher"}'
|
||||
result = parser._safe_repair_json(invalid_json)
|
||||
assert result == expected_repaired_json
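These tests pin down the repair behavior without showing the implementation. A package such as json_repair can perform the same kind of fix for the simpler cases, e.g. a trailing comma; this is a sketch under that assumption, not the parser's actual code:

```python
from json_repair import repair_json

broken = '{"task": "Research XAI", "context": "Explainable AI", "coworker": "Senior Researcher",}'
# Expected per the tests above: the trailing comma is dropped and valid JSON comes back.
print(repair_json(broken))
```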
|
||||
|
||||
|
||||
def test_parsing_with_whitespace(parser):
|
||||
text = " Thought: Let's find the temperature \n Action: search \n Action Input: what is the temperature in SF? "
|
||||
result = parser.parse(text)
|
||||
assert isinstance(result, AgentAction)
|
||||
assert result.tool == "search"
|
||||
assert result.tool_input == "what is the temperature in SF?"
|
||||
|
||||
|
||||
def test_parsing_with_special_characters(parser):
|
||||
text = 'Thought: Let\'s find the temperature\nAction: search\nAction Input: "what is the temperature in SF?"'
|
||||
result = parser.parse(text)
|
||||
assert isinstance(result, AgentAction)
|
||||
assert result.tool == "search"
|
||||
assert result.tool_input == "what is the temperature in SF?"
|
||||
|
||||
|
||||
def test_integration_valid_and_invalid(parser):
|
||||
text = """
|
||||
Thought: Let's find the temperature
|
||||
Action: search
|
||||
Action Input: what is the temperature in SF?
|
||||
|
||||
Thought: I found the information
|
||||
Final Answer: The temperature is 100 degrees
|
||||
|
||||
Thought: Missing action
|
||||
Action Input: invalid
|
||||
|
||||
Thought: Missing action input
|
||||
Action: invalid
|
||||
"""
|
||||
parts = text.strip().split("\n\n")
|
||||
results = []
|
||||
for part in parts:
|
||||
try:
|
||||
result = parser.parse(part.strip())
|
||||
results.append(result)
|
||||
except OutputParserException as e:
|
||||
results.append(e)
|
||||
|
||||
assert isinstance(results[0], AgentAction)
|
||||
assert isinstance(results[1], AgentFinish)
|
||||
assert isinstance(results[2], OutputParserException)
|
||||
assert isinstance(results[3], OutputParserException)
|
||||
|
||||
|
||||
class MockAgent:
|
||||
def increment_formatting_errors(self):
|
||||
pass
|
||||
|
||||
|
||||
# TODO: ADD TEST TO MAKE SURE ** REMOVAL DOESN'T MESS UP ANYTHING
|
||||
File diff suppressed because it is too large
1804
tests/cassettes/test_async_execution_single_task.yaml
Normal file
File diff suppressed because it is too large
4694
tests/cassettes/test_async_task_execution.yaml
Normal file
File diff suppressed because it is too large
File diff suppressed because it is too large
591
tests/cassettes/test_crew_async_kickoff.yaml
Normal file
@@ -0,0 +1,591 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: '{"messages": [{"content": "You are dog Researcher. You have a lot of experience
|
||||
with dog.\nYour personal goal is: Express hot takes on dog.To give my best complete
|
||||
final answer to the task use the exact following format:\n\nThought: I now can
|
||||
give a great answer\nFinal Answer: my best complete final answer to the task.\nYour
|
||||
final answer must be the great and the most complete as possible, it must be
|
||||
outcome described.\n\nI MUST use these formats, my job depends on it!\nCurrent
|
||||
Task: Give me an analysis around dog.\n\nThis is the expect criteria for your
|
||||
final answer: 1 bullet point about dog that''s under 15 words. \n you MUST return
|
||||
the actual complete content as the final answer, not a summary.\n\nBegin! This
|
||||
is VERY important to you, use the tools available and give your best Final Answer,
|
||||
your job depends on it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o",
|
||||
"n": 1, "stop": ["\nObservation"], "stream": true, "temperature": 0.7}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate, br
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '951'
|
||||
content-type:
|
||||
- application/json
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.34.0
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.34.0
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.3
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
body:
string: [streamed chat.completion.chunk events from gpt-4o-2024-05-13 assembling "Thought: I now can give a great answer / Final Answer: Dogs are incredibly loyal and provide unmatched companionship to humans.", followed by a final data: DONE event]
headers: [standard Cloudflare and OpenAI streaming response headers; x-request-id req_2c5219e228ce79f0131c497230904013]
status:
code: 200
message: OK
- request:
body: '{"messages": [{"content": "You are apple Researcher. You have a lot of
experience with apple.\nYour personal goal is: Express hot takes on apple.To
give my best complete final answer to the task use the exact following format:\n\nThought:
I now can give a great answer\nFinal Answer: my best complete final answer to
the task.\nYour final answer must be the great and the most complete as possible,
it must be outcome described.\n\nI MUST use these formats, my job depends on
it!\nCurrent Task: Give me an analysis around apple.\n\nThis is the expect criteria
for your final answer: 1 bullet point about apple that''s under 15 words. \n
you MUST return the actual complete content as the final answer, not a summary.\n\nBegin!
This is VERY important to you, use the tools available and give your best Final
Answer, your job depends on it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o",
"n": 1, "stop": ["\nObservation"], "stream": true, "temperature": 0.7}'
headers: [standard OpenAI Python 1.34.0 client request headers; content-length '961']
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
body:
string: [streamed chat.completion.chunk events from gpt-4o-2024-05-13 assembling "I now can give a great answer. / Final Answer: Apple revolutionizes technology with sleek designs, seamless integration, and innovative user experiences.", followed by a final data: DONE event]
headers: [standard Cloudflare and OpenAI streaming response headers; x-request-id req_e6dfeda5935eae030bcc2da526234635]
status:
code: 200
message: OK
- request:
body: [the same prompt template as the apple request above, with "cat Researcher" and "analysis around cat"; content-length '951']
headers: [standard OpenAI Python 1.34.0 client request headers]
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
body:
string: [streamed chat.completion.chunk events from gpt-4o-2024-05-13 assembling "Thought: I now can give a great answer / Final Answer: Cats are masterful hunters and brilliant problem-solvers.", followed by a final data: DONE event]
headers: [standard Cloudflare and OpenAI streaming response headers; x-request-id req_a06bde4044d3ee75edf08f333139679c]
status:
code: 200
message: OK
version: 1
585
tests/cassettes/test_crew_async_kickoff_for_each_full_ouput.yaml
Normal file
@@ -0,0 +1,585 @@
interactions:
- request:
body: '{"messages": [{"content": "You are dog Researcher. You have a lot of experience
with dog.\nYour personal goal is: Express hot takes on dog.To give my best complete
final answer to the task use the exact following format:\n\nThought: I now can
give a great answer\nFinal Answer: my best complete final answer to the task.\nYour
final answer must be the great and the most complete as possible, it must be
outcome described.\n\nI MUST use these formats, my job depends on it!\nCurrent
Task: Give me an analysis around dog.\n\nThis is the expect criteria for your
final answer: 1 bullet point about dog that''s under 15 words. \n you MUST return
the actual complete content as the final answer, not a summary.\n\nBegin! This
is VERY important to you, use the tools available and give your best Final Answer,
your job depends on it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o",
"n": 1, "stop": ["\nObservation"], "stream": true, "temperature": 0.7}'
headers: [standard OpenAI Python 1.34.0 client request headers; content-length '951']
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
body:
string: [streamed chat.completion.chunk events from gpt-4o-2024-05-13 assembling "Thought: I now can give a great answer / Final Answer: Dogs are incredibly loyal creatures and make excellent companions.", followed by a final data: DONE event]
headers: [standard Cloudflare and OpenAI streaming response headers; x-request-id req_266fff38f6e0a997187154e25d6615e8]
status:
code: 200
message: OK
- request:
body: [the same prompt template as the dog request above, with "apple Researcher" and "analysis around apple"; content-length '961']
headers: [standard OpenAI Python 1.34.0 client request headers]
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
body:
string: [streamed chat.completion.chunk events from gpt-4o-2024-05-13 assembling "I now can give a great answer. / Final Answer: Apple's ecosystem fosters seamless integration but can limit user flexibility and choice.", followed by a final data: DONE event]
headers: [standard Cloudflare and OpenAI streaming response headers; x-request-id req_39f52bd8b06ab4ada2c16853345eb6dc]
status:
code: 200
message: OK
- request:
body: [the same prompt template as the dog request above, with "cat Researcher" and "analysis around cat"; content-length '951']
headers: [standard OpenAI Python 1.34.0 client request headers]
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
body:
string: [streamed chat.completion.chunk events from gpt-4o-2024-05-13 assembling "Thought: I now can give a great answer / Final Answer: Cats are natural hunters with unique personalities and strong territorial instincts.", followed by a final data: DONE event]
headers:
CF-Cache-Status:
- DYNAMIC
CF-RAY:
- 89c8b85b1fde4546-ATL
Connection:
- keep-alive
Content-Type:
- text/event-stream; charset=utf-8
Date:
- Mon, 01 Jul 2024 19:14:38 GMT
Server:
- cloudflare
Set-Cookie:
- __cf_bm=KdRY51WYOZSedMQIfG_vPmrPdO67._RkSjV0nq.khUk-1719861278-1.0.1.1-r0uJtNVxaGm2OCkQliG.tsX3vekPCDQb2IET3ywu41Igu1Qfz01rhz_WlvKIllsZlXIyd6rvHT7NxLo.UOtD7Q;
path=/; expires=Mon, 01-Jul-24 19:44:38 GMT; domain=.api.openai.com; HttpOnly;
Secure; SameSite=None
- _cfuvid=Qqu1FBant56uM9Al0JmDl7QMk9cRcojzFUt8volvWPo-1719861278308-0.0.1.1-604800000;
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
Transfer-Encoding:
- chunked
alt-svc:
- h3=":443"; ma=86400
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '156'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=31536000; includeSubDomains
x-ratelimit-limit-requests:
- '10000'
x-ratelimit-limit-tokens:
- '16000000'
x-ratelimit-remaining-requests:
- '9999'
x-ratelimit-remaining-tokens:
- '15999783'
x-ratelimit-reset-requests:
- 6ms
x-ratelimit-reset-tokens:
- 0s
x-request-id:
- req_a0c6e5796af340d6720b9cfd55df703e
status:
code: 200
message: OK
version: 1
@@ -400126,7 +400126,7 @@ interactions:
|
||||
Content-Type:
|
||||
- application/octet-stream
|
||||
Date:
|
||||
- Wed, 26 Jun 2024 00:47:57 GMT
|
||||
- Sun, 23 Jun 2024 14:23:37 GMT
|
||||
ETag:
|
||||
- '0x8DC736E200CD241'
|
||||
Last-Modified:
|
||||
@@ -400138,128 +400138,24 @@ interactions:
|
||||
x-ms-lease-status:
|
||||
- unlocked
|
||||
x-ms-request-id:
|
||||
- 8134b189-f01e-002a-4362-c7d69f000000
|
||||
- fcf86155-f01e-0015-1278-c51e3c000000
|
||||
x-ms-version:
|
||||
- '2009-09-19'
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: !!binary |
|
||||
CqIhCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkS+SAKEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRKTAQoQeFaYfr5OxyZaJbtlPmhykRIIyBZe8r8vT78qClRvb2wgVXNhZ2UwATmAmXe6
|
||||
bGfcF0FgRXi6bGfcF0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjMyLjJKHwoJdG9vbF9uYW1lEhIK
|
||||
EGdldF9maW5hbF9hbnN3ZXJKDgoIYXR0ZW1wdHMSAhgBegIYAYUBAAEAABKTAQoQDWl8VO5eDhpC
|
||||
8WTRhE7lIxIIjCV/tIXBg+gqClRvb2wgVXNhZ2UwATmQ09a7bGfcF0EYZNe7bGfcF0oaCg5jcmV3
|
||||
YWlfdmVyc2lvbhIICgYwLjMyLjJKHwoJdG9vbF9uYW1lEhIKEGdldF9maW5hbF9hbnN3ZXJKDgoI
|
||||
YXR0ZW1wdHMSAhgBegIYAYUBAAEAABKTAQoQdZyAk3NfWDJ95Rl9fwm9JBIIeOLAjtRHX8YqClRv
|
||||
b2wgVXNhZ2UwATn4LiPAbGfcF0Hw1iPAbGfcF0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjMyLjJK
|
||||
HwoJdG9vbF9uYW1lEhIKEGdldF9maW5hbF9hbnN3ZXJKDgoIYXR0ZW1wdHMSAhgBegIYAYUBAAEA
|
||||
ABLQAQoQ9B/WTcQbD55N9my9PwwJ/RII61GsN4ZK8lsqEFRvb2wgVXNhZ2UgRXJyb3IwATn4XSTC
|
||||
bGfcF0F4GSXCbGfcF0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjMyLjJKZgoDbGxtEl8KXXsibmFt
|
||||
ZSI6IG51bGwsICJtb2RlbF9uYW1lIjogImdwdC00LTAxMjUtcHJldmlldyIsICJ0ZW1wZXJhdHVy
|
||||
ZSI6IDAuNywgImNsYXNzIjogIkNoYXRPcGVuQUkifXoCGAGFAQABAAAS0AEKEN0yjmyVck//dsMA
|
||||
AJt7gVsSCE64+GH0u+SCKhBUb29sIFVzYWdlIEVycm9yMAE5iIgTw2xn3BdBODwUw2xn3BdKGgoO
|
||||
Y3Jld2FpX3ZlcnNpb24SCAoGMC4zMi4ySmYKA2xsbRJfCl17Im5hbWUiOiBudWxsLCAibW9kZWxf
|
||||
bmFtZSI6ICJncHQtNC0wMTI1LXByZXZpZXciLCAidGVtcGVyYXR1cmUiOiAwLjcsICJjbGFzcyI6
|
||||
ICJDaGF0T3BlbkFJIn16AhgBhQEAAQAAEtABChDHSt0G/EqmGlI1IdhjDtotEgj83Lu7mYSDsyoQ
|
||||
VG9vbCBVc2FnZSBFcnJvcjABOZCYGMRsZ9wXQXBEGcRsZ9wXShoKDmNyZXdhaV92ZXJzaW9uEggK
|
||||
BjAuMzIuMkpmCgNsbG0SXwpdeyJuYW1lIjogbnVsbCwgIm1vZGVsX25hbWUiOiAiZ3B0LTQtMDEy
|
||||
NS1wcmV2aWV3IiwgInRlbXBlcmF0dXJlIjogMC43LCAiY2xhc3MiOiAiQ2hhdE9wZW5BSSJ9egIY
|
||||
AYUBAAEAABJoChBpV7cx2tTNYorpmU9Iezq6EgjT9Wm5SX16tioQVG9vbCBVc2FnZSBFcnJvcjAB
|
||||
ObDiWsdsZ9wXQfB+W8dsZ9wXShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuMzIuMnoCGAGFAQABAAAS
|
||||
kwEKECe14QwK8YEX5k/HAyTJJOESCLC5kWxi9XjbKgpUb29sIFVzYWdlMAE5AHmbyGxn3BdBcA2c
|
||||
yGxn3BdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4zMi4ySh8KCXRvb2xfbmFtZRISChBnZXRfZmlu
|
||||
YWxfYW5zd2VySg4KCGF0dGVtcHRzEgIYAXoCGAGFAQABAAASkwEKECkLzxPcPFShQTRBKOuA5VwS
|
||||
CEXEpf9vgT0qKgpUb29sIFVzYWdlMAE5uIX9yWxn3BdBiAr+yWxn3BdKGgoOY3Jld2FpX3ZlcnNp
|
||||
b24SCAoGMC4zMi4ySh8KCXRvb2xfbmFtZRISChBnZXRfZmluYWxfYW5zd2VySg4KCGF0dGVtcHRz
|
||||
EgIYAXoCGAGFAQABAAASkwEKELKxl/VCmmnFuHppNDaUdLYSCMxTFAfY7LKZKgpUb29sIFVzYWdl
|
||||
MAE5cEBDy2xn3BdBKMlDy2xn3BdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4zMi4ySh8KCXRvb2xf
|
||||
bmFtZRISChBnZXRfZmluYWxfYW5zd2VySg4KCGF0dGVtcHRzEgIYAXoCGAGFAQABAAASaAoQWsd1
|
||||
ufYSiTjFfOLUvTXiGhIIUqeG6O/KOEsqEFRvb2wgVXNhZ2UgRXJyb3IwATnonMXObGfcF0FANcbO
|
||||
bGfcF0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjMyLjJ6AhgBhQEAAQAAEpwBChDePaRaxJbHM9ti
|
||||
mgjFTw4eEgiJJ6aTsD4j1CoKVG9vbCBVc2FnZTABObD+Q9FsZ9wXQSCTRNFsZ9wXShoKDmNyZXdh
|
||||
aV92ZXJzaW9uEggKBjAuMzIuMkooCgl0b29sX25hbWUSGwoZRGVsZWdhdGUgd29yayB0byBjb3dv
|
||||
cmtlckoOCghhdHRlbXB0cxICGAF6AhgBhQEAAQAAEpsBChB8AxpI4Iiqli5YpSzNuAHkEgiuoKKr
|
||||
AonvqioKVG9vbCBVc2FnZTABOQhPr9ZsZ9wXQRjzr9ZsZ9wXShoKDmNyZXdhaV92ZXJzaW9uEggK
|
||||
BjAuMzIuMkonCgl0b29sX25hbWUSGgoYQXNrIHF1ZXN0aW9uIHRvIGNvd29ya2VySg4KCGF0dGVt
|
||||
cHRzEgIYAXoCGAGFAQABAAASkQEKEHWST7FZaG5Ij0kYfqRwQFISCGo4iAiKiKriKgpUb29sIFVz
|
||||
YWdlMAE5qKLl2Wxn3BdBuEbm2Wxn3BdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4zMi4ySh0KCXRv
|
||||
b2xfbmFtZRIQCg5sZWFybl9hYm91dF9BSUoOCghhdHRlbXB0cxICGAF6AhgBhQEAAQAAEpEBChBW
|
||||
s1jZuOvGnPk5qZAn/3bvEghcLUd+91YchSoKVG9vbCBVc2FnZTABOVBUJttsZ9wXQfDgJttsZ9wX
|
||||
ShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuMzIuMkodCgl0b29sX25hbWUSEAoObGVhcm5fYWJvdXRf
|
||||
QUlKDgoIYXR0ZW1wdHMSAhgBegIYAYUBAAEAABJoChB390oxWJKggBM1q5uqyAIWEgilZ4dcV9H3
|
||||
zyoQVG9vbCBVc2FnZSBFcnJvcjABObCnTtxsZ9wXQYAsT9xsZ9wXShoKDmNyZXdhaV92ZXJzaW9u
|
||||
EggKBjAuMzIuMnoCGAGFAQABAAAS+QEKEGWFgsouWvCkBbw9hOp41fkSCHuQEq4LICdiKgpUb29s
|
||||
IFVzYWdlMAE5aLsF4Wxn3BdBcIoG4Wxn3BdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4zMi4ySh0K
|
||||
CXRvb2xfbmFtZRIQCg5sZWFybl9hYm91dF9BSUoOCghhdHRlbXB0cxICGAFKZgoDbGxtEl8KXXsi
|
||||
bmFtZSI6IG51bGwsICJtb2RlbF9uYW1lIjogImdwdC0zLjUtdHVyYm8tMDEyNSIsICJ0ZW1wZXJh
|
||||
dHVyZSI6IDAuNywgImNsYXNzIjogIkNoYXRPcGVuQUkifXoCGAGFAQABAAAS+QEKEH74JqRDD3nP
|
||||
pYFzHXVuRBMSCOVLhi1+G6VtKgpUb29sIFVzYWdlMAE5GDRy4mxn3BdB+N9y4mxn3BdKGgoOY3Jl
|
||||
d2FpX3ZlcnNpb24SCAoGMC4zMi4ySh0KCXRvb2xfbmFtZRIQCg5sZWFybl9hYm91dF9BSUoOCghh
|
||||
dHRlbXB0cxICGAFKZgoDbGxtEl8KXXsibmFtZSI6IG51bGwsICJtb2RlbF9uYW1lIjogImdwdC0z
|
||||
LjUtdHVyYm8tMDEyNSIsICJ0ZW1wZXJhdHVyZSI6IDAuNywgImNsYXNzIjogIkNoYXRPcGVuQUki
|
||||
fXoCGAGFAQABAAAS+QEKELEPPr1EMWxY9eCI/9VxYRISCBifMoJVgs46KgpUb29sIFVzYWdlMAE5
|
||||
WMIV5Gxn3BdBOG4W5Gxn3BdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4zMi4ySh0KCXRvb2xfbmFt
|
||||
ZRIQCg5sZWFybl9hYm91dF9BSUoOCghhdHRlbXB0cxICGAFKZgoDbGxtEl8KXXsibmFtZSI6IG51
|
||||
bGwsICJtb2RlbF9uYW1lIjogImdwdC0zLjUtdHVyYm8tMDEyNSIsICJ0ZW1wZXJhdHVyZSI6IDAu
|
||||
NywgImNsYXNzIjogIkNoYXRPcGVuQUkifXoCGAGFAQABAAASnAEKEIj74gWv+Pwo4caZDgnygaYS
|
||||
CETmyodLQY8/KgpUb29sIFVzYWdlMAE5sBTAFm1n3BdBANjAFm1n3BdKGgoOY3Jld2FpX3ZlcnNp
|
||||
b24SCAoGMC4zMi4ySigKCXRvb2xfbmFtZRIbChlEZWxlZ2F0ZSB3b3JrIHRvIGNvd29ya2VySg4K
|
||||
CGF0dGVtcHRzEgIYAXoCGAGFAQABAAASnAEKEDq45gMODvysUf5wQQf4F+0SCHZHV4/9LnhnKgpU
|
||||
b29sIFVzYWdlMAE5EDHQIG1n3BdBwOTQIG1n3BdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4zMi4y
|
||||
SigKCXRvb2xfbmFtZRIbChlEZWxlZ2F0ZSB3b3JrIHRvIGNvd29ya2VySg4KCGF0dGVtcHRzEgIY
|
||||
AXoCGAGFAQABAAASnAEKEJHNsGUrqNbXyAOQPZnty9kSCAASe0iLg1zKKgpUb29sIFVzYWdlMAE5
|
||||
kCpPLW1n3BdBmPlPLW1n3BdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4zMi4ySigKCXRvb2xfbmFt
|
||||
ZRIbChlEZWxlZ2F0ZSB3b3JrIHRvIGNvd29ya2VySg4KCGF0dGVtcHRzEgIYAXoCGAGFAQABAAAS
|
||||
jQEKEP5ZHkp/aYaAyGuu53Anw2kSCDu8tanZxniRKgpUb29sIFVzYWdlMAE5iFVHWG1n3BdBUAVI
|
||||
WG1n3BdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4zMi4yShkKCXRvb2xfbmFtZRIMCgptdWx0aXBs
|
||||
aWVySg4KCGF0dGVtcHRzEgIYAXoCGAGFAQABAAASjQEKEFPBMl1rYMxgfMdCym2d+yUSCIu9WYA8
|
||||
9juvKgpUb29sIFVzYWdlMAE5YK3oWW1n3BdB6D3pWW1n3BdKGgoOY3Jld2FpX3ZlcnNpb24SCAoG
|
||||
MC4zMi4yShkKCXRvb2xfbmFtZRIMCgptdWx0aXBsaWVySg4KCGF0dGVtcHRzEgIYAXoCGAGFAQAB
|
||||
AAASaAoQCQJWjJpe1L2HUfWpwOVCVRIIb/BALasmjQEqEFRvb2wgVXNhZ2UgRXJyb3IwATmAGdJc
|
||||
bWfcF0F4wdJcbWfcF0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjMyLjJ6AhgBhQEAAQAAEmgKED/Z
|
||||
tmI7I3o+4n/FgbLHhtMSCGEtw+VqWPc+KhBUb29sIFVzYWdlIEVycm9yMAE5aDyyXW1n3BdBIMWy
|
||||
XW1n3BdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4zMi4yegIYAYUBAAEAAA==
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate, br
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '4261'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.25.0
|
||||
method: POST
|
||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Wed, 26 Jun 2024 00:47:59 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"content": "You are test role. test backstory\nYour personal
|
||||
goal is: test goalTo give my best complete final answer to the task use the
|
||||
exact following format:\n\nThought: I now can give a great answer\nFinal Answer:
|
||||
my best complete final answer to the task.\nYour final answer must be the great
|
||||
and the most complete as possible, it must be outcome described.\n\nI MUST use
|
||||
these formats, my job depends on it!\nCurrent Task: just say hi!\n\nThis is
|
||||
the expect criteria for your final answer: your greeting \n you MUST return
|
||||
the actual complete content as the final answer, not a summary.\n\nBegin! This
|
||||
is VERY important to you, use the tools available and give your best Final Answer,
|
||||
your job depends on it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o",
|
||||
body: '{"messages": [{"content": "You are {topic} Researcher. You have a lot of
|
||||
experience with {topic}.\nYour personal goal is: Express hot takes on {topic}.To
|
||||
give my best complete final answer to the task use the exact following format:\n\nThought:
|
||||
I now can give a great answer\nFinal Answer: my best complete final answer to
|
||||
the task.\nYour final answer must be the great and the most complete as possible,
|
||||
it must be outcome described.\n\nI MUST use these formats, my job depends on
|
||||
it!\nCurrent Task: Give me an analysis around dog.\n\nThis is the expect criteria
|
||||
for your final answer: 1 bullet point about dog that''s under 15 words. \n you
|
||||
MUST return the actual complete content as the final answer, not a summary.\n\nBegin!
|
||||
This is VERY important to you, use the tools available and give your best Final
|
||||
Answer, your job depends on it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o",
|
||||
"n": 1, "stop": ["\nObservation"], "stream": true, "temperature": 0.7}'
|
||||
headers:
accept:
@@ -400269,13 +400165,13 @@ interactions:
connection:
- keep-alive
content-length:
- '853'
- '963'
content-type:
- application/json
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.35.3
- OpenAI/Python 1.34.0
x-stainless-arch:
- arm64
x-stainless-async:
@@ -400285,73 +400181,116 @@ interactions:
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.35.3
- 1.34.0
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
- 3.11.9
- 3.12.3
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
|
||||
body:
|
||||
string: 'data: {"id":"chatcmpl-9eB7HGovGTNNv8jlZBbEHKCMAa2Sn","object":"chat.completion.chunk","created":1719362879,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
string: 'data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7HGovGTNNv8jlZBbEHKCMAa2Sn","object":"chat.completion.chunk","created":1719362879,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7HGovGTNNv8jlZBbEHKCMAa2Sn","object":"chat.completion.chunk","created":1719362879,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7HGovGTNNv8jlZBbEHKCMAa2Sn","object":"chat.completion.chunk","created":1719362879,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
I"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7HGovGTNNv8jlZBbEHKCMAa2Sn","object":"chat.completion.chunk","created":1719362879,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
now"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7HGovGTNNv8jlZBbEHKCMAa2Sn","object":"chat.completion.chunk","created":1719362879,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
can"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7HGovGTNNv8jlZBbEHKCMAa2Sn","object":"chat.completion.chunk","created":1719362879,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
give"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7HGovGTNNv8jlZBbEHKCMAa2Sn","object":"chat.completion.chunk","created":1719362879,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
a"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7HGovGTNNv8jlZBbEHKCMAa2Sn","object":"chat.completion.chunk","created":1719362879,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
great"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7HGovGTNNv8jlZBbEHKCMAa2Sn","object":"chat.completion.chunk","created":1719362879,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7HGovGTNNv8jlZBbEHKCMAa2Sn","object":"chat.completion.chunk","created":1719362879,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7HGovGTNNv8jlZBbEHKCMAa2Sn","object":"chat.completion.chunk","created":1719362879,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7HGovGTNNv8jlZBbEHKCMAa2Sn","object":"chat.completion.chunk","created":1719362879,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
Answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7HGovGTNNv8jlZBbEHKCMAa2Sn","object":"chat.completion.chunk","created":1719362879,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7HGovGTNNv8jlZBbEHKCMAa2Sn","object":"chat.completion.chunk","created":1719362879,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"
|
||||
Hi"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
Dogs"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7HGovGTNNv8jlZBbEHKCMAa2Sn","object":"chat.completion.chunk","created":1719362879,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"!"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
are"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7HGovGTNNv8jlZBbEHKCMAa2Sn","object":"chat.completion.chunk","created":1719362879,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
loyal"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
companions"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
and"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
enhance"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
human"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
well"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"-being"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
through"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
emotional"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
support"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"."},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIPzOXprkfuIluBhM1o1gL8lm3yu","object":"chat.completion.chunk","created":1719152619,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
|
||||
|
||||
data: [DONE]
|
||||
@@ -400362,20 +400301,20 @@ interactions:
CF-Cache-Status:
- DYNAMIC
CF-RAY:
- 8999306a3c2b97ff-SJC
- 8985231e8e72453b-ATL
Connection:
- keep-alive
Content-Type:
- text/event-stream; charset=utf-8
Date:
- Wed, 26 Jun 2024 00:48:00 GMT
- Sun, 23 Jun 2024 14:23:39 GMT
Server:
- cloudflare
Set-Cookie:
- __cf_bm=cZ1tqEJHf90QH5Wpf8u5w22pMiXEVmHqh.l_FNQzPUE-1719362880-1.0.1.1-ukAMHHbJtb1dXgdiJ9rZPnK9Dr9955CracMEW21Vc4n80TvL8Aj_Ke_B7G85jBPWrNAAQ0LYFLomosxQMd_qSg;
path=/; expires=Wed, 26-Jun-24 01:18:00 GMT; domain=.api.openai.com; HttpOnly;
- __cf_bm=eo7Wv_7IoFobCr06vVNbn6fiTV000XuWJ5mG6A5XfCI-1719152619-1.0.1.1-VWt6JrnrLWxl1Heg2Mc9q1an5j9.JHISIS2VP5qbC_YCwRn5WkI_QykzIBPo8kil7ndx45QBS0fUvVDo22tWKQ;
path=/; expires=Sun, 23-Jun-24 14:53:39 GMT; domain=.api.openai.com; HttpOnly;
Secure; SameSite=None
- _cfuvid=jB8ShrBkfWJf2HStRWtCrxPo9TmM70i91UOQVSXB_ik-1719362880048-0.0.1.1-604800000;
- _cfuvid=zHfpY3hveQ2EtdkjxmAEpZPcSMSo.R1IXEnggmqkQEI-1719152619731-0.0.1.1-604800000;
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
Transfer-Encoding:
- chunked
@@ -400384,41 +400323,41 @@ interactions:
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '766'
- '115'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=31536000; includeSubDomains
- max-age=15724800; includeSubDomains
x-ratelimit-limit-requests:
- '10000'
x-ratelimit-limit-tokens:
- '16000000'
- '12000000'
x-ratelimit-remaining-requests:
- '9999'
x-ratelimit-remaining-tokens:
- '15999808'
- '11999780'
x-ratelimit-reset-requests:
- 6ms
x-ratelimit-reset-tokens:
- 0s
- 1ms
x-request-id:
- e2e310e93ef808475e7ac73c1ef60ec0
- req_6e1809edb78fae06d8a20ed070c6e64a
status:
code: 200
message: OK
- request:
|
||||
body: '{"messages": [{"content": "You are test role. test backstory\nYour personal
|
||||
goal is: test goalTo give my best complete final answer to the task use the
|
||||
exact following format:\n\nThought: I now can give a great answer\nFinal Answer:
|
||||
my best complete final answer to the task.\nYour final answer must be the great
|
||||
and the most complete as possible, it must be outcome described.\n\nI MUST use
|
||||
these formats, my job depends on it!\nCurrent Task: just say hello!\n\nThis
|
||||
is the expect criteria for your final answer: your greeting \n you MUST return
|
||||
the actual complete content as the final answer, not a summary.\n\nThis is the
|
||||
context you''re working with:\nHi!\n\nBegin! This is VERY important to you,
|
||||
use the tools available and give your best Final Answer, your job depends on
|
||||
it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o", "n": 1, "stop": ["\nObservation"],
|
||||
"stream": true, "temperature": 0.7}'
|
||||
body: '{"messages": [{"content": "You are {topic} Researcher. You have a lot of
|
||||
experience with {topic}.\nYour personal goal is: Express hot takes on {topic}.To
|
||||
give my best complete final answer to the task use the exact following format:\n\nThought:
|
||||
I now can give a great answer\nFinal Answer: my best complete final answer to
|
||||
the task.\nYour final answer must be the great and the most complete as possible,
|
||||
it must be outcome described.\n\nI MUST use these formats, my job depends on
|
||||
it!\nCurrent Task: Give me an analysis around cat.\n\nThis is the expect criteria
|
||||
for your final answer: 1 bullet point about cat that''s under 15 words. \n you
|
||||
MUST return the actual complete content as the final answer, not a summary.\n\nBegin!
|
||||
This is VERY important to you, use the tools available and give your best Final
|
||||
Answer, your job depends on it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o",
|
||||
"n": 1, "stop": ["\nObservation"], "stream": true, "temperature": 0.7}'
|
||||
headers:
accept:
- application/json
@@ -400427,16 +400366,16 @@ interactions:
connection:
- keep-alive
content-length:
- '905'
- '963'
content-type:
- application/json
cookie:
- __cf_bm=cZ1tqEJHf90QH5Wpf8u5w22pMiXEVmHqh.l_FNQzPUE-1719362880-1.0.1.1-ukAMHHbJtb1dXgdiJ9rZPnK9Dr9955CracMEW21Vc4n80TvL8Aj_Ke_B7G85jBPWrNAAQ0LYFLomosxQMd_qSg;
_cfuvid=jB8ShrBkfWJf2HStRWtCrxPo9TmM70i91UOQVSXB_ik-1719362880048-0.0.1.1-604800000
- __cf_bm=eo7Wv_7IoFobCr06vVNbn6fiTV000XuWJ5mG6A5XfCI-1719152619-1.0.1.1-VWt6JrnrLWxl1Heg2Mc9q1an5j9.JHISIS2VP5qbC_YCwRn5WkI_QykzIBPo8kil7ndx45QBS0fUvVDo22tWKQ;
_cfuvid=zHfpY3hveQ2EtdkjxmAEpZPcSMSo.R1IXEnggmqkQEI-1719152619731-0.0.1.1-604800000
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.35.3
- OpenAI/Python 1.34.0
x-stainless-arch:
- arm64
x-stainless-async:
@@ -400446,73 +400385,116 @@ interactions:
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.35.3
- 1.34.0
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
- 3.11.9
- 3.12.3
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
|
||||
body:
|
||||
string: 'data: {"id":"chatcmpl-9eB7I99F3dYCupsw3MZZeCiuDZbnU","object":"chat.completion.chunk","created":1719362880,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
string: 'data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7I99F3dYCupsw3MZZeCiuDZbnU","object":"chat.completion.chunk","created":1719362880,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7I99F3dYCupsw3MZZeCiuDZbnU","object":"chat.completion.chunk","created":1719362880,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7I99F3dYCupsw3MZZeCiuDZbnU","object":"chat.completion.chunk","created":1719362880,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
I"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7I99F3dYCupsw3MZZeCiuDZbnU","object":"chat.completion.chunk","created":1719362880,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
now"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7I99F3dYCupsw3MZZeCiuDZbnU","object":"chat.completion.chunk","created":1719362880,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
can"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7I99F3dYCupsw3MZZeCiuDZbnU","object":"chat.completion.chunk","created":1719362880,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
give"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7I99F3dYCupsw3MZZeCiuDZbnU","object":"chat.completion.chunk","created":1719362880,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
a"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7I99F3dYCupsw3MZZeCiuDZbnU","object":"chat.completion.chunk","created":1719362880,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
great"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7I99F3dYCupsw3MZZeCiuDZbnU","object":"chat.completion.chunk","created":1719362880,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7I99F3dYCupsw3MZZeCiuDZbnU","object":"chat.completion.chunk","created":1719362880,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7I99F3dYCupsw3MZZeCiuDZbnU","object":"chat.completion.chunk","created":1719362880,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7I99F3dYCupsw3MZZeCiuDZbnU","object":"chat.completion.chunk","created":1719362880,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
Answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7I99F3dYCupsw3MZZeCiuDZbnU","object":"chat.completion.chunk","created":1719362880,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7I99F3dYCupsw3MZZeCiuDZbnU","object":"chat.completion.chunk","created":1719362880,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"
|
||||
Hello"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
Cats"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7I99F3dYCupsw3MZZeCiuDZbnU","object":"chat.completion.chunk","created":1719362880,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{"content":"!"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
are"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9eB7I99F3dYCupsw3MZZeCiuDZbnU","object":"chat.completion.chunk","created":1719362880,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5bf7397cd3","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
independent"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":","},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
curious"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
creatures"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
with"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
a"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
strong"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
sense"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
territory"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"."},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ0Z7g9gLoEyFe4tuHZY3zDHIxh","object":"chat.completion.chunk","created":1719152620,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
|
||||
|
||||
data: [DONE]
|
||||
@@ -400523,13 +400505,13 @@ interactions:
CF-Cache-Status:
- DYNAMIC
CF-RAY:
- 89993072097897ff-SJC
- 898523253f8c453b-ATL
Connection:
- keep-alive
Content-Type:
- text/event-stream; charset=utf-8
Date:
- Wed, 26 Jun 2024 00:48:00 GMT
- Sun, 23 Jun 2024 14:23:40 GMT
Server:
- cloudflare
Transfer-Encoding:
@@ -400539,26 +400521,230 @@ interactions:
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '134'
- '121'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=31536000; includeSubDomains
- max-age=15724800; includeSubDomains
x-ratelimit-limit-requests:
- '10000'
x-ratelimit-limit-tokens:
- '16000000'
- '12000000'
x-ratelimit-remaining-requests:
- '9999'
x-ratelimit-remaining-tokens:
- '15999794'
- '11999779'
x-ratelimit-reset-requests:
- 6ms
x-ratelimit-reset-tokens:
- 0s
- 1ms
x-request-id:
- 95f7e7603df593de5f778bfbd77838c9
- req_fcdeaba57c8c0b5c80fdaaf4f3949da3
status:
code: 200
message: OK
version: 1
- request:
|
||||
body: '{"messages": [{"content": "You are {topic} Researcher. You have a lot of
|
||||
experience with {topic}.\nYour personal goal is: Express hot takes on {topic}.To
|
||||
give my best complete final answer to the task use the exact following format:\n\nThought:
|
||||
I now can give a great answer\nFinal Answer: my best complete final answer to
|
||||
the task.\nYour final answer must be the great and the most complete as possible,
|
||||
it must be outcome described.\n\nI MUST use these formats, my job depends on
|
||||
it!\nCurrent Task: Give me an analysis around apple.\n\nThis is the expect criteria
|
||||
for your final answer: 1 bullet point about apple that''s under 15 words. \n
|
||||
you MUST return the actual complete content as the final answer, not a summary.\n\nBegin!
|
||||
This is VERY important to you, use the tools available and give your best Final
|
||||
Answer, your job depends on it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o",
|
||||
"n": 1, "stop": ["\nObservation"], "stream": true, "temperature": 0.7}'
|
||||
headers:
accept:
- application/json
accept-encoding:
- gzip, deflate, br
connection:
- keep-alive
content-length:
- '967'
content-type:
- application/json
cookie:
- __cf_bm=eo7Wv_7IoFobCr06vVNbn6fiTV000XuWJ5mG6A5XfCI-1719152619-1.0.1.1-VWt6JrnrLWxl1Heg2Mc9q1an5j9.JHISIS2VP5qbC_YCwRn5WkI_QykzIBPo8kil7ndx45QBS0fUvVDo22tWKQ;
_cfuvid=zHfpY3hveQ2EtdkjxmAEpZPcSMSo.R1IXEnggmqkQEI-1719152619731-0.0.1.1-604800000
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.34.0
x-stainless-arch:
- arm64
x-stainless-async:
- 'false'
x-stainless-lang:
- python
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.34.0
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
- 3.12.3
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
body:
string: 'data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
I"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
now"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
can"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
give"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
a"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
great"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":".\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
Answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
Apple"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
revolution"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"ized"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
technology"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
with"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
the"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
i"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"Phone"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":","},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
transforming"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
communication"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
and"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
media"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"
|
||||
consumption"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{"content":"."},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9dIQ14W5gaHA4p3dYi6n7FrRBnzhN","object":"chat.completion.chunk","created":1719152621,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_5e6c71d4a8","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
|
||||
|
||||
data: [DONE]
|
||||
|
||||
|
||||
'
|
||||
headers:
CF-Cache-Status:
- DYNAMIC
CF-RAY:
- 8985232a4e71453b-ATL
Connection:
- keep-alive
Content-Type:
- text/event-stream; charset=utf-8
Date:
- Sun, 23 Jun 2024 14:23:41 GMT
Server:
- cloudflare
Transfer-Encoding:
- chunked
alt-svc:
- h3=":443"; ma=86400
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '108'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=15724800; includeSubDomains
x-ratelimit-limit-requests:
- '10000'
x-ratelimit-limit-tokens:
- '12000000'
x-ratelimit-remaining-requests:
- '9999'
x-ratelimit-remaining-tokens:
- '11999779'
x-ratelimit-reset-requests:
- 6ms
x-ratelimit-reset-tokens:
- 1ms
x-request-id:
- req_d8c11c52c26f88d19413eb84f91587da
status:
code: 200
message: OK
version: 1
1431
tests/cassettes/test_crew_kickoff_usage_metrics.yaml
Normal file
File diff suppressed because it is too large
Load Diff
258
tests/cassettes/test_custom_converter_cls.yaml
Normal file
@@ -0,0 +1,258 @@
interactions:
|
||||
- request:
|
||||
body: '{"messages": [{"content": "You are Scorer. You''re an expert scorer, specialized
|
||||
in scoring titles.\nYour personal goal is: Score the titleTo give my best complete
|
||||
final answer to the task use the exact following format:\n\nThought: I now can
|
||||
give a great answer\nFinal Answer: my best complete final answer to the task.\nYour
|
||||
final answer must be the great and the most complete as possible, it must be
|
||||
outcome described.\n\nI MUST use these formats, my job depends on it!\nCurrent
|
||||
Task: Give me an integer score between 1-5 for the following title: ''The impact
|
||||
of AI in the future of work''\n\nThis is the expect criteria for your final
|
||||
answer: The score of the title. \n you MUST return the actual complete content
|
||||
as the final answer, not a summary.\n\nBegin! This is VERY important to you,
|
||||
use the tools available and give your best Final Answer, your job depends on
|
||||
it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o", "n": 1, "stop": ["\nObservation"],
|
||||
"stream": true, "temperature": 0.7}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '997'
|
||||
content-type:
|
||||
- application/json
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.35.10
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.35.10
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: 'data: {"id":"chatcmpl-9hrsMKHuOkxqftWK9DtuC10VCJ17t","object":"chat.completion.chunk","created":1720242230,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hrsMKHuOkxqftWK9DtuC10VCJ17t","object":"chat.completion.chunk","created":1720242230,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hrsMKHuOkxqftWK9DtuC10VCJ17t","object":"chat.completion.chunk","created":1720242230,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hrsMKHuOkxqftWK9DtuC10VCJ17t","object":"chat.completion.chunk","created":1720242230,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"
|
||||
I"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hrsMKHuOkxqftWK9DtuC10VCJ17t","object":"chat.completion.chunk","created":1720242230,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"
|
||||
now"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hrsMKHuOkxqftWK9DtuC10VCJ17t","object":"chat.completion.chunk","created":1720242230,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"
|
||||
can"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hrsMKHuOkxqftWK9DtuC10VCJ17t","object":"chat.completion.chunk","created":1720242230,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"
|
||||
give"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hrsMKHuOkxqftWK9DtuC10VCJ17t","object":"chat.completion.chunk","created":1720242230,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"
|
||||
a"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hrsMKHuOkxqftWK9DtuC10VCJ17t","object":"chat.completion.chunk","created":1720242230,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"
|
||||
great"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hrsMKHuOkxqftWK9DtuC10VCJ17t","object":"chat.completion.chunk","created":1720242230,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"
|
||||
answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hrsMKHuOkxqftWK9DtuC10VCJ17t","object":"chat.completion.chunk","created":1720242230,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hrsMKHuOkxqftWK9DtuC10VCJ17t","object":"chat.completion.chunk","created":1720242230,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hrsMKHuOkxqftWK9DtuC10VCJ17t","object":"chat.completion.chunk","created":1720242230,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"
|
||||
Answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hrsMKHuOkxqftWK9DtuC10VCJ17t","object":"chat.completion.chunk","created":1720242230,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hrsMKHuOkxqftWK9DtuC10VCJ17t","object":"chat.completion.chunk","created":1720242230,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"
|
||||
"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hrsMKHuOkxqftWK9DtuC10VCJ17t","object":"chat.completion.chunk","created":1720242230,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"4"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hrsMKHuOkxqftWK9DtuC10VCJ17t","object":"chat.completion.chunk","created":1720242230,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
|
||||
|
||||
data: [DONE]
|
||||
|
||||
|
||||
'
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 89ed0cf0dc05741a-MIA
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Type:
|
||||
- text/event-stream; charset=utf-8
|
||||
Date:
|
||||
- Sat, 06 Jul 2024 05:03:50 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Set-Cookie:
|
||||
- __cf_bm=JI76H4xxreAnMx1JJoPragplAdYdjbDNA68Hr3Cs_0k-1720242230-1.0.1.1-oHSrtm.ejkvCiAHC11lg0MnvmopYZayTZRq09IcH2yh5BA6FyyufGH7Rm59BAz.gdZHc0izmjElXfLiu2bZ_jQ;
|
||||
path=/; expires=Sat, 06-Jul-24 05:33:50 GMT; domain=.api.openai.com; HttpOnly;
|
||||
Secure; SameSite=None
|
||||
- _cfuvid=X4.n0cNP9j1jseIPV4H1aDJu2xrsAwcUI8rY0tbLc40-1720242230210-0.0.1.1-604800000;
|
||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '71'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '16000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '15999772'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_8dc1d49d85fcf8e39601e32ca80abd6b
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "user", "content": "4"}, {"role": "system", "content":
|
||||
"I''m gonna convert this raw text into valid JSON."}], "model": "gpt-4o", "tool_choice":
|
||||
{"type": "function", "function": {"name": "ScoreOutput"}}, "tools": [{"type":
|
||||
"function", "function": {"name": "ScoreOutput", "description": "Correctly extracted
|
||||
`ScoreOutput` with all the required parameters with correct types", "parameters":
|
||||
{"properties": {"score": {"title": "Score", "type": "integer"}}, "required":
|
||||
["score"], "type": "object"}}}]}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '519'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=JI76H4xxreAnMx1JJoPragplAdYdjbDNA68Hr3Cs_0k-1720242230-1.0.1.1-oHSrtm.ejkvCiAHC11lg0MnvmopYZayTZRq09IcH2yh5BA6FyyufGH7Rm59BAz.gdZHc0izmjElXfLiu2bZ_jQ;
|
||||
_cfuvid=X4.n0cNP9j1jseIPV4H1aDJu2xrsAwcUI8rY0tbLc40-1720242230210-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.35.10
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.35.10
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAA2xSS2/bMAy++1cIPNeF81pT34YBG9A13aHAgL5gKArtKJVFTaKBtkH++yDFi91g
|
||||
PggEP34PkN5nQoDeQClAbSWr1pn8euvD6id9OP3V3C6bzZ2qd7eT1f3DykgFF5FB6x0q/se6VNQ6
|
||||
g6zJHmHlUTJG1cnVtJjOp9NZkYCWNmgirXGczymPYF4s8smsJ25JKwxQiqdMCCH26Y0R7QbfoBRJ
|
||||
JnVaDEE2COVpSAjwZGIHZAg6sLQMFwOoyDLamNp2xowAJjKVksYMxsdvP6qHPUljqsI2duX13ePN
|
||||
w2/7HT/+8Lfd45cfYeR3lH53KVDdWXXazwg/9cszMyHAyjZx7xV5/NWx6/iMLgRI33QtWo7RYf8M
|
||||
IQ4/Qzk/wKfRQ/a/+qWvDqe1Gmqcp3U42xLU2uqwrTzKkNJCYHJHiyj3ks7XfboIOE+t44rpFW0U
|
||||
XPbXg+F/GcBFjzGxNCPOIuvjQXgPjG1Va9ugd16nU0LtqnlRLHG2vppcQ3bI/gIAAP//AwCtLU45
|
||||
0wIAAA==
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 89ed0cf40ebc741a-MIA
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Sat, 06 Jul 2024 05:03:50 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '186'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '16000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '15999969'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_5da164d15ccb331864aeb5d3562969aa
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
version: 1
@@ -0,0 +1,193 @@
interactions:
|
||||
- request:
|
||||
body: '{"messages": [{"content": "You are Researcher. You''re an expert researcher,
|
||||
specialized in technology, software engineering, AI and startups. You work as
|
||||
a freelancer and is now working on doing research and analysis for a new customer.\nYour
|
||||
personal goal is: Make the best research and analysis on content about AI and
|
||||
AI agentsTo give my best complete final answer to the task use the exact following
|
||||
format:\n\nThought: I now can give a great answer\nFinal Answer: my best complete
|
||||
final answer to the task.\nYour final answer must be the great and the most
|
||||
complete as possible, it must be outcome described.\n\nI MUST use these formats,
|
||||
my job depends on it!\nCurrent Task: Look at the available data nd give me a
|
||||
sense on the total number of sales.\n\nThis is the expect criteria for your
|
||||
final answer: The total number of sales as an integer \n you MUST return the
|
||||
actual complete content as the final answer, not a summary.\n\nBegin! This is
|
||||
VERY important to you, use the tools available and give your best Final Answer,
|
||||
your job depends on it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o",
|
||||
"n": 1, "stop": ["\nObservation"], "stream": true, "temperature": 0.7}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate, br
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '1178'
|
||||
content-type:
|
||||
- application/json
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.34.0
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.34.0
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.3
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: 'data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
I"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
now"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
can"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
give"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
a"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
great"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
Answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
The"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
total"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
number"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
sales"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
is"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"150"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"0"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"."},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9gJkkQs40FNqD9UjPrPbDEUN4XeLR","object":"chat.completion.chunk","created":1719872734,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
|
||||
|
||||
data: [DONE]
|
||||
|
||||
|
||||
'
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 89c9d0107c8abd30-ATL
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Type:
|
||||
- text/event-stream; charset=utf-8
|
||||
Date:
|
||||
- Mon, 01 Jul 2024 22:25:35 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Set-Cookie:
|
||||
- __cf_bm=xIvvDveyc7bpEywphx5N4EscKoZiGAT_yDVu3aFAWZ4-1719872735-1.0.1.1-ZOUYc2kEes8fxrMFgGdVppzOh9nPbl4y1Syv73ORt38FBXePWFSTJrFZCZRU.zob6ks9nWzr2vBIZbBQdAOOGQ;
|
||||
path=/; expires=Mon, 01-Jul-24 22:55:35 GMT; domain=.api.openai.com; HttpOnly;
|
||||
Secure; SameSite=None
|
||||
- _cfuvid=aG1BGRRkNAyxmctM98.DLqSNJ2Cx_OQYsMRQbd03.bo-1719872735091-0.0.1.1-604800000;
|
||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '80'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '16000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '15999725'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 1ms
|
||||
x-request-id:
|
||||
- req_c90015b7584729268f48a8b33ff7c5ea
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
version: 1
|
||||
235903
tests/cassettes/test_hierarchical_async_task_execution_completion.yaml
Normal file
File diff suppressed because it is too large
Load Diff
1464
tests/cassettes/test_kickoff_for_each_multiple_inputs.yaml
Normal file
File diff suppressed because it is too large
Load Diff
192
tests/cassettes/test_kickoff_for_each_single_input.yaml
Normal file
@@ -0,0 +1,192 @@
interactions:
|
||||
- request:
|
||||
body: '{"messages": [{"content": "You are dog Researcher. You have a lot of experience
|
||||
with dog.\nYour personal goal is: Express hot takes on dog.To give my best complete
|
||||
final answer to the task use the exact following format:\n\nThought: I now can
|
||||
give a great answer\nFinal Answer: my best complete final answer to the task.\nYour
|
||||
final answer must be the great and the most complete as possible, it must be
|
||||
outcome described.\n\nI MUST use these formats, my job depends on it!\nCurrent
|
||||
Task: Give me an analysis around dog.\n\nThis is the expect criteria for your
|
||||
final answer: 1 bullet point about dog that''s under 15 words. \n you MUST return
|
||||
the actual complete content as the final answer, not a summary.\n\nBegin! This
|
||||
is VERY important to you, use the tools available and give your best Final Answer,
|
||||
your job depends on it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o",
|
||||
"n": 1, "stop": ["\nObservation"], "stream": true, "temperature": 0.7}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '951'
|
||||
content-type:
|
||||
- application/json
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.35.10
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.35.10
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.3
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: 'data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
I"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
now"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
can"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
give"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
a"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
great"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
Answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
Dogs"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
are"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
unparalleled"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
in"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
loyalty"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
and"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
companionship"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
to"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
humans"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"."},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9j74gJiY9YxFxbeZ5jmpPclWEeaiP","object":"chat.completion.chunk","created":1720538982,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
|
||||
|
||||
data: [DONE]
|
||||
|
||||
|
||||
'
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8a0959de6b916783-ATL
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Type:
|
||||
- text/event-stream; charset=utf-8
|
||||
Date:
|
||||
- Tue, 09 Jul 2024 15:29:42 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Set-Cookie:
|
||||
- __cf_bm=LA.xC.jE_aMjiSgGgU6kDsBPhb_akgqn_4Rx.jXYfnQ-1720538982-1.0.1.1-l5Q1BHprIz5Jxb4HWyYsMfbg6mEnP2H95Vxt89ez24pKOb__90s8LJBBqK52zmPNcPYSSUcaR0wRAaSVFoa4Fw;
|
||||
path=/; expires=Tue, 09-Jul-24 15:59:42 GMT; domain=.api.openai.com; HttpOnly;
|
||||
Secure; SameSite=None
|
||||
- _cfuvid=zzJ51X.VwRkIq7VLCg9xPQGbXoUmAH6b.2g6sf6Y58Y-1720538982657-0.0.1.1-604800000;
|
||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '240'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '22000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '21999783'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_abdec68aded596628dfd5b999919447d
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
version: 1
|
||||
@@ -1,82 +1,33 @@
interactions:
|
||||
- request:
|
||||
body: !!binary |
|
||||
Cp4ICiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkS9QcKEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRLeBwoQLUn1O+7mxvq8vHmZBVzP4hII4O2Vm5CqaQMqDENyZXcgQ3JlYXRlZDABOQj8
|
||||
EY+F/7gXQZhVFI+F/7gXShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuMTYuM0oaCg5weXRob25fdmVy
|
||||
c2lvbhIICgYzLjExLjdKMQoHY3Jld19pZBImCiQxNGEyZTE0YS0yZmU3LTQxMTEtYWI4OS05YTRl
|
||||
YmI2MmQ2OGFKHAoMY3Jld19wcm9jZXNzEgwKCnNlcXVlbnRpYWxKFQoNY3Jld19sYW5ndWFnZRIE
|
||||
CgJlbkoaChRjcmV3X251bWJlcl9vZl90YXNrcxICGAFKGwoVY3Jld19udW1iZXJfb2ZfYWdlbnRz
|
||||
EgIYAUrJAgoLY3Jld19hZ2VudHMSuQIKtgJbeyJpZCI6ICIyNWE2NzNmNy1kM2NkLTQxMWYtOWFh
|
||||
Ni0wODZlY2IzMTgyODUiLCAicm9sZSI6ICJTY29yZXIiLCAibWVtb3J5X2VuYWJsZWQ/IjogZmFs
|
||||
c2UsICJ2ZXJib3NlPyI6IGZhbHNlLCAibWF4X2l0ZXIiOiAxNSwgIm1heF9ycG0iOiBudWxsLCAi
|
||||
aTE4biI6ICJlbiIsICJsbG0iOiAie1wibmFtZVwiOiBudWxsLCBcIm1vZGVsX25hbWVcIjogXCJn
|
||||
cHQtNFwiLCBcInRlbXBlcmF0dXJlXCI6IDAuNywgXCJjbGFzc1wiOiBcIkNoYXRPcGVuQUlcIn0i
|
||||
LCAiZGVsZWdhdGlvbl9lbmFibGVkPyI6IGZhbHNlLCAidG9vbHNfbmFtZXMiOiBbXX1dSoYBCgpj
|
||||
cmV3X3Rhc2tzEngKdlt7ImlkIjogIjYxMjIyYjU0LTFlNDYtNDhlYy1hNTkwLTU3Y2VkNTI1OTE0
|
||||
OSIsICJhc3luY19leGVjdXRpb24/IjogZmFsc2UsICJhZ2VudF9yb2xlIjogIlNjb3JlciIsICJ0
|
||||
b29sc19uYW1lcyI6IFtdfV1KKAoIcGxhdGZvcm0SHAoabWFjT1MtMTQuMy1hcm02NC1hcm0tNjRi
|
||||
aXRKHAoQcGxhdGZvcm1fcmVsZWFzZRIICgYyMy4zLjBKGwoPcGxhdGZvcm1fc3lzdGVtEggKBkRh
|
||||
cndpbkp7ChBwbGF0Zm9ybV92ZXJzaW9uEmcKZURhcndpbiBLZXJuZWwgVmVyc2lvbiAyMy4zLjA6
|
||||
IFdlZCBEZWMgMjAgMjE6MzA6NTkgUFNUIDIwMjM7IHJvb3Q6eG51LTEwMDAyLjgxLjV+Ny9SRUxF
|
||||
QVNFX0FSTTY0X1Q2MDMwSgoKBGNwdXMSAhgMegIYAQ==
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate, br
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '1057'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.23.0
|
||||
method: POST
|
||||
uri: http://telemetry.crewai.com:4318/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Sat, 02 Mar 2024 16:30:10 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "user", "content": "You are Scorer. You''re an expert
|
||||
scorer, specialized in scoring titles.\nYour personal goal is: Score the titleTo
|
||||
give my best complete final answer to the task use the exact following format:\n\nThought:
|
||||
I now can give a great answer\nFinal Answer: my best complete final answer to
|
||||
the task.\nYour final answer must be the great and the most complete as possible,
|
||||
it must be outcome described.\n\nI MUST use these formats, my job depends on
|
||||
it!\n\nThought: \n\nCurrent Task: Give me an integer score between 1-5 for the
|
||||
following title: ''The impact of AI in the future of work''\n\nThis is the expect
|
||||
criteria for your final answer: The score of the title. \n you MUST return the
|
||||
actual complete content as the final answer, not a summary.\n\nBegin! This is
|
||||
VERY important to you, use the tools available and give your best Final Answer,
|
||||
your job depends on it!\n\nThought: \n"}], "model": "gpt-4", "n": 1, "stop":
|
||||
["\nObservation"], "stream": true, "temperature": 0.7}'
|
||||
body: '{"messages": [{"content": "You are Scorer. You''re an expert scorer, specialized
|
||||
in scoring titles.\nYour personal goal is: Score the titleTo give my best complete
|
||||
final answer to the task use the exact following format:\n\nThought: I now can
|
||||
give a great answer\nFinal Answer: my best complete final answer to the task.\nYour
|
||||
final answer must be the great and the most complete as possible, it must be
|
||||
outcome described.\n\nI MUST use these formats, my job depends on it!\nCurrent
|
||||
Task: Give me an integer score between 1-5 for the following title: ''The impact
|
||||
of AI in the future of work''\n\nThis is the expect criteria for your final
|
||||
answer: The score of the title. \n you MUST return the actual complete content
|
||||
as the final answer, not a summary.\n\nBegin! This is VERY important to you,
|
||||
use the tools available and give your best Final Answer, your job depends on
|
||||
it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o", "n": 1, "stop": ["\nObservation"],
|
||||
"stream": true, "temperature": 0.7}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate, br
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '1012'
|
||||
- '997'
|
||||
content-type:
|
||||
- application/json
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.12.0
|
||||
- OpenAI/Python 1.35.10
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
@@ -86,7 +37,7 @@ interactions:
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.12.0
|
||||
- 1.35.10
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
@@ -95,424 +46,64 @@ interactions:
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: 'data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
string: 'data: {"id":"chatcmpl-9hEEAWWzwQR7GSV8yjob2TVqUZZFF","object":"chat.completion.chunk","created":1720089822,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"The"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9hEEAWWzwQR7GSV8yjob2TVqUZZFF","object":"chat.completion.chunk","created":1720089822,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
title"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9hEEAWWzwQR7GSV8yjob2TVqUZZFF","object":"chat.completion.chunk","created":1720089822,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
''"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"The"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
impact"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
in"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
the"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
future"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
work"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"''"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
is"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
highly"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
relevant"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
in"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
today"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"''s"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
world"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
where"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
artificial"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
intelligence"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
is"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
increasingly"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
becoming"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
a"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
part"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
our"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
daily"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
lives"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"."},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
It"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
is"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
a"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
future"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"-focused"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
topic"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":","},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
hint"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"ing"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
at"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
the"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
exploration"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA","object":"chat.completion.chunk","created":1709397008,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
[Remaining streamed chat.completion.chunk events, one word per chunk, with the old and new cassette streams interleaved by the diff: the old cassette (model gpt-4-0613, id chatcmpl-8yMXQdHZBcMpwt1od7QnLwjVHeuWA) continues "... of how AI could potentially transform the workforce. The title is clear, concise, and thought-provoking. However, it could be more engaging if it posed a question or suggested a particular stance or perspective on the issue." and ends with "Final Answer: I would give the title a score of 4 out of 5."; the new cassette (model gpt-4o-2024-05-13, id chatcmpl-9hEEAWWzwQR7GSV8yjob2TVqUZZFF) ends with "Final Answer: 4". Both streams close with a finish_reason "stop" chunk followed by "data: [DONE]".]
@@ -523,83 +114,76 @@ interactions:
[Response-header diff for the streamed completion: CF-Cache-Status DYNAMIC; CF-RAY 85e2c5046bf300ef-GRU -> 89de840c8999da1f-MIA; Content-Type text/event-stream -> text/event-stream; charset=utf-8; Date Sat, 02 Mar 2024 16:30:08 GMT -> Thu, 04 Jul 2024 10:43:42 GMT; Set-Cookie __cf_bm/_cfuvid values refreshed; openai-model gpt-4-0613; openai-organization crewai-iuxna1; openai-processing-ms '250' -> '120'; strict-transport-security max-age=15724800 -> max-age=31536000; x-ratelimit-limit-tokens '300000' -> '16000000'; x-ratelimit-remaining-tokens '299768' -> '15999772'; x-ratelimit-reset-tokens 46ms -> 0s; x-request-id req_c05576522c6a556b562ddcf905cd08ee -> req_16b0326c59b800232bd3b81982efca66; status: code 200, message OK.]
[Follow-up request diff: the body changes from model "gpt-4" with user content "I would give the title a score of 4 out of 5." to model "gpt-4o" with user content "4"; both carry the system message "I'm gonna convert this raw text into valid JSON." and the ScoreOutput function tool ({"score": integer, required}). Request headers change accordingly: accept-encoding "gzip, deflate, br" -> "gzip, deflate", content-length '562' -> '519', refreshed __cf_bm/_cfuvid cookies, user-agent OpenAI/Python 1.12.0 -> 1.35.10, x-stainless-arch arm64, x-stainless-async:]
@@ -609,7 +193,7 @@ interactions:
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.12.0
- 1.35.10
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
@@ -619,60 +203,55 @@ interactions:
[Response diff for the ScoreOutput tool call: the body is a !!binary string (old response br-compressed, new response gzip-compressed). Header changes: CF-Cache-Status DYNAMIC; CF-RAY 85e2c51b9f7900ef-GRU -> 89de84103ab8da1f-MIA; Content-Encoding br -> gzip; Date Sat, 02 Mar 2024 16:30:12 GMT -> Thu, 04 Jul 2024 10:43:43 GMT; openai-model gpt-4-0613; openai-organization crewai-iuxna1; openai-processing-ms '518' -> '235'; strict-transport-security max-age=15724800 -> max-age=31536000; x-ratelimit-limit-tokens '300000' -> '16000000'; x-ratelimit-remaining-tokens '299958' -> '15999969'; x-ratelimit-reset-tokens 8ms -> 0s; x-request-id req_6cf57525fad025cb1fe9bddb5d669312 -> req_5d283f799cb8d11c8280f1c07e4132a1; status: code 200, message OK.]
1209 tests/cassettes/test_output_json_dict_hierarchical.yaml Normal file
File diff suppressed because it is too large
453 tests/cassettes/test_output_json_dict_sequential.yaml Normal file
@@ -0,0 +1,453 @@
[New cassette tests/cassettes/test_output_json_dict_sequential.yaml records three interactions against api.openai.com (client OpenAI/Python 1.35.10, CPython 3.12.3, MacOS/arm64), all returning status code 200, message OK:
1. POST /v1/chat/completions, model "gpt-4o", stream true, temperature 0.7, stop ["\nObservation"], with the Scorer prompt "Give me an integer score between 1-5 for the following title: 'The impact of AI in the future of work'". The streamed response (id chatcmpl-9jCJmkP063CQ01vF8ENhkPSwN9BQH, model gpt-4o-2024-05-13) yields "I now can give a great answer\nFinal Answer: 4" and ends with "data: [DONE]".
2. POST /v1/chat/completions, model "gpt-4o", messages [user "4", system "I'm gonna convert this raw text into valid JSON."], tool_choice ScoreOutput ({"score": integer, required}); the response body is a gzip-compressed !!binary string.
3. POST /v1/embeddings, model "text-embedding-ada-002", encoding_format "base64", input "examples(Examples): Specific scenarios used to explain counting, addition, and subtraction through the zoo theme, such as counting animals, adding new ones, and subtracting adopted ones."; the response body is a gzip-compressed !!binary string.]
version: 1
1516 tests/cassettes/test_output_json_hierarchical.yaml Normal file
File diff suppressed because it is too large
258 tests/cassettes/test_output_json_sequential.yaml Normal file
@@ -0,0 +1,258 @@
[New cassette tests/cassettes/test_output_json_sequential.yaml records two interactions against api.openai.com (client OpenAI/Python 1.35.10, CPython 3.12.3, MacOS/arm64), both returning status code 200, message OK:
1. POST /v1/chat/completions, model "gpt-4o", stream true, temperature 0.7, stop ["\nObservation"], with the same Scorer prompt for the title 'The impact of AI in the future of work'. The streamed response (id chatcmpl-9jCJkXh6z3EfmS24VIKd3az5QmUI6, model gpt-4o-2024-05-13) yields "Thought: I now can give a great answer\nFinal Answer: 4" and ends with "data: [DONE]".
2. POST /v1/chat/completions, model "gpt-4o", messages [user "4", system "I'm gonna convert this raw text into valid JSON."], tool_choice ScoreOutput ({"score": integer, required}); the response body is a gzip-compressed !!binary string.]
version: 1
File diff suppressed because it is too large
File diff suppressed because it is too large
1649 tests/cassettes/test_output_pydantic_hierarchical.yaml (Normal file)
File diff suppressed because it is too large
258 tests/cassettes/test_output_pydantic_sequential.yaml (Normal file)
@@ -0,0 +1,258 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: '{"messages": [{"content": "You are Scorer. You''re an expert scorer, specialized
|
||||
in scoring titles.\nYour personal goal is: Score the titleTo give my best complete
|
||||
final answer to the task use the exact following format:\n\nThought: I now can
|
||||
give a great answer\nFinal Answer: my best complete final answer to the task.\nYour
|
||||
final answer must be the great and the most complete as possible, it must be
|
||||
outcome described.\n\nI MUST use these formats, my job depends on it!\nCurrent
|
||||
Task: Give me an integer score between 1-5 for the following title: ''The impact
|
||||
of AI in the future of work''\n\nThis is the expect criteria for your final
|
||||
answer: The score of the title. \n you MUST return the actual complete content
|
||||
as the final answer, not a summary.\n\nBegin! This is VERY important to you,
|
||||
use the tools available and give your best Final Answer, your job depends on
|
||||
it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o", "n": 1, "stop": ["\nObservation"],
|
||||
"stream": true, "temperature": 0.7}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '997'
|
||||
content-type:
|
||||
- application/json
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.35.10
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.35.10
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.3
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: 'data: {"id":"chatcmpl-9jCJjSE1CUTbcdPQgWhGnINQBfDJr","object":"chat.completion.chunk","created":1720559135,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9jCJjSE1CUTbcdPQgWhGnINQBfDJr","object":"chat.completion.chunk","created":1720559135,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9jCJjSE1CUTbcdPQgWhGnINQBfDJr","object":"chat.completion.chunk","created":1720559135,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9jCJjSE1CUTbcdPQgWhGnINQBfDJr","object":"chat.completion.chunk","created":1720559135,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
I"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9jCJjSE1CUTbcdPQgWhGnINQBfDJr","object":"chat.completion.chunk","created":1720559135,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
now"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9jCJjSE1CUTbcdPQgWhGnINQBfDJr","object":"chat.completion.chunk","created":1720559135,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
can"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9jCJjSE1CUTbcdPQgWhGnINQBfDJr","object":"chat.completion.chunk","created":1720559135,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
give"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9jCJjSE1CUTbcdPQgWhGnINQBfDJr","object":"chat.completion.chunk","created":1720559135,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
a"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9jCJjSE1CUTbcdPQgWhGnINQBfDJr","object":"chat.completion.chunk","created":1720559135,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
great"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9jCJjSE1CUTbcdPQgWhGnINQBfDJr","object":"chat.completion.chunk","created":1720559135,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
answer"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9jCJjSE1CUTbcdPQgWhGnINQBfDJr","object":"chat.completion.chunk","created":1720559135,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9jCJjSE1CUTbcdPQgWhGnINQBfDJr","object":"chat.completion.chunk","created":1720559135,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9jCJjSE1CUTbcdPQgWhGnINQBfDJr","object":"chat.completion.chunk","created":1720559135,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
Answer"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9jCJjSE1CUTbcdPQgWhGnINQBfDJr","object":"chat.completion.chunk","created":1720559135,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9jCJjSE1CUTbcdPQgWhGnINQBfDJr","object":"chat.completion.chunk","created":1720559135,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"
"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9jCJjSE1CUTbcdPQgWhGnINQBfDJr","object":"chat.completion.chunk","created":1720559135,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{"content":"4"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9jCJjSE1CUTbcdPQgWhGnINQBfDJr","object":"chat.completion.chunk","created":1720559135,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_ce0793330f","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
data: [DONE]
'
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8a0b45e24b19bd4d-ATL
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Type:
|
||||
- text/event-stream; charset=utf-8
|
||||
Date:
|
||||
- Tue, 09 Jul 2024 21:05:35 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Set-Cookie:
|
||||
- __cf_bm=N7yNe.ilaHt2MJPusthFVyL5PrE._f_nyf4RfU.oIv0-1720559135-1.0.1.1-oCOj_tvpNYp16zBvNbxW.TwSHAFXRiB_i23X4XBw_o01D1_7OKj_HwRNZWdwg9DjDh_C_FSMKTonmzQmsUmtdg;
|
||||
path=/; expires=Tue, 09-Jul-24 21:35:35 GMT; domain=.api.openai.com; HttpOnly;
|
||||
Secure; SameSite=None
|
||||
- _cfuvid=aiUOV0PnMjHles7YFoHcFY7PK2Ag6MdKr0GWZzZ_rZo-1720559135403-0.0.1.1-604800000;
|
||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '105'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '22000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '21999771'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_759b74b995b84a531eae7df3eddf1196
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "user", "content": "4"}, {"role": "system", "content":
|
||||
"I''m gonna convert this raw text into valid JSON."}], "model": "gpt-4o", "tool_choice":
|
||||
{"type": "function", "function": {"name": "ScoreOutput"}}, "tools": [{"type":
|
||||
"function", "function": {"name": "ScoreOutput", "description": "Correctly extracted
|
||||
`ScoreOutput` with all the required parameters with correct types", "parameters":
|
||||
{"properties": {"score": {"title": "Score", "type": "integer"}}, "required":
|
||||
["score"], "type": "object"}}}]}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '519'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=N7yNe.ilaHt2MJPusthFVyL5PrE._f_nyf4RfU.oIv0-1720559135-1.0.1.1-oCOj_tvpNYp16zBvNbxW.TwSHAFXRiB_i23X4XBw_o01D1_7OKj_HwRNZWdwg9DjDh_C_FSMKTonmzQmsUmtdg;
|
||||
_cfuvid=aiUOV0PnMjHles7YFoHcFY7PK2Ag6MdKr0GWZzZ_rZo-1720559135403-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.35.10
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.35.10
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.3
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAA2xS22rcMBB991eIeV4XX9ZJ1m8lEGgKSZtSUnLBKPLYq40sCWmcdFn234u8ztpZ
|
||||
6gcxzJlzYca7iDGQNZQMxJqT6KyKV5vL680mzX58Xd4l773D7O63vMjP7rfrKwuLwDAvGxT0wfoi
|
||||
TGcVkjT6AAuHnDCopudZUhSrNC8GoDM1qkBrLcVLE2dJtoyTIk7zkbg2UqCHkj1GjDG2G94QUdf4
|
||||
F0qWLD46HXrPW4TyOMQYOKNCB7j30hPXBIsJFEYT6pBa90rNADJGVYIrNRkfvt2snvbElareb27S
|
||||
b28/v7/VRXtP13X75+GMF+hmfgfprR0CNb0Wx/3M8GO/PDFjDDTvBu4vYRze9mR7OqEzBty1fYea
|
||||
QnTYPYEPw09QLvfwaXQf/a9+Hqv9ca3KtNaZF3+yJWikln5dOeR+SAuejD1YBLnn4Xz9p4uAdaaz
|
||||
VJF5RR0EL8brwfS/TGAxYmSIqxmniMZ44LeesKsaqVt01snhlNDYSmByvsrzPGkg2kf/AAAA//8D
|
||||
AKtPZkLTAgAA
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8a0b45e5ffd7bd4d-ATL
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Tue, 09 Jul 2024 21:05:36 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '153'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '22000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '21999969'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_36cd16b74a2085c72139d09309d21e39
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
version: 1
|
||||
File diff suppressed because it is too large
@@ -1,82 +1,33 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: !!binary |
|
||||
Cp4ICiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkS9QcKEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRLeBwoQbcSgVp+c/HPtPueafW+iDBII6Vtopi7FQgAqDENyZXcgQ3JlYXRlZDABOcjB
|
||||
8hWO/7gXQRAt9BWO/7gXShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuMTYuM0oaCg5weXRob25fdmVy
|
||||
c2lvbhIICgYzLjExLjdKMQoHY3Jld19pZBImCiQ1OGI3YjUzMS1jYmQxLTQ5Y2UtOGFjOC0wNTkz
|
||||
NWJlYjdlNGVKHAoMY3Jld19wcm9jZXNzEgwKCnNlcXVlbnRpYWxKFQoNY3Jld19sYW5ndWFnZRIE
|
||||
CgJlbkoaChRjcmV3X251bWJlcl9vZl90YXNrcxICGAFKGwoVY3Jld19udW1iZXJfb2ZfYWdlbnRz
|
||||
EgIYAUrJAgoLY3Jld19hZ2VudHMSuQIKtgJbeyJpZCI6ICI1NTk2M2UzNS02NjU4LTRmMmItOTJi
|
||||
My04ZDE0MGNhMDYwOGUiLCAicm9sZSI6ICJTY29yZXIiLCAibWVtb3J5X2VuYWJsZWQ/IjogZmFs
|
||||
c2UsICJ2ZXJib3NlPyI6IGZhbHNlLCAibWF4X2l0ZXIiOiAxNSwgIm1heF9ycG0iOiBudWxsLCAi
|
||||
aTE4biI6ICJlbiIsICJsbG0iOiAie1wibmFtZVwiOiBudWxsLCBcIm1vZGVsX25hbWVcIjogXCJn
|
||||
cHQtNFwiLCBcInRlbXBlcmF0dXJlXCI6IDAuNywgXCJjbGFzc1wiOiBcIkNoYXRPcGVuQUlcIn0i
|
||||
LCAiZGVsZWdhdGlvbl9lbmFibGVkPyI6IGZhbHNlLCAidG9vbHNfbmFtZXMiOiBbXX1dSoYBCgpj
|
||||
cmV3X3Rhc2tzEngKdlt7ImlkIjogIjc1OWM0ZGYwLTI4MWUtNDNlMy05Yzc0LWNmMDg3NTczZTJi
|
||||
YSIsICJhc3luY19leGVjdXRpb24/IjogZmFsc2UsICJhZ2VudF9yb2xlIjogIlNjb3JlciIsICJ0
|
||||
b29sc19uYW1lcyI6IFtdfV1KKAoIcGxhdGZvcm0SHAoabWFjT1MtMTQuMy1hcm02NC1hcm0tNjRi
|
||||
aXRKHAoQcGxhdGZvcm1fcmVsZWFzZRIICgYyMy4zLjBKGwoPcGxhdGZvcm1fc3lzdGVtEggKBkRh
|
||||
cndpbkp7ChBwbGF0Zm9ybV92ZXJzaW9uEmcKZURhcndpbiBLZXJuZWwgVmVyc2lvbiAyMy4zLjA6
|
||||
IFdlZCBEZWMgMjAgMjE6MzA6NTkgUFNUIDIwMjM7IHJvb3Q6eG51LTEwMDAyLjgxLjV+Ny9SRUxF
|
||||
QVNFX0FSTTY0X1Q2MDMwSgoKBGNwdXMSAhgMegIYAQ==
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate, br
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '1057'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.23.0
|
||||
method: POST
|
||||
uri: http://telemetry.crewai.com:4318/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Sat, 02 Mar 2024 16:30:45 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "user", "content": "You are Scorer. You''re an expert
|
||||
scorer, specialized in scoring titles.\nYour personal goal is: Score the titleTo
|
||||
give my best complete final answer to the task use the exact following format:\n\nThought:
|
||||
I now can give a great answer\nFinal Answer: my best complete final answer to
|
||||
the task.\nYour final answer must be the great and the most complete as possible,
|
||||
it must be outcome described.\n\nI MUST use these formats, my job depends on
|
||||
it!\n\nThought: \n\nCurrent Task: Give me an integer score between 1-5 for the
|
||||
following title: ''The impact of AI in the future of work''\n\nThis is the expect
|
||||
criteria for your final answer: The score of the title. \n you MUST return the
|
||||
actual complete content as the final answer, not a summary.\n\nBegin! This is
|
||||
VERY important to you, use the tools available and give your best Final Answer,
|
||||
your job depends on it!\n\nThought: \n"}], "model": "gpt-4", "n": 1, "stop":
|
||||
["\nObservation"], "stream": true, "temperature": 0.7}'
|
||||
body: '{"messages": [{"content": "You are Scorer. You''re an expert scorer, specialized
|
||||
in scoring titles.\nYour personal goal is: Score the titleTo give my best complete
|
||||
final answer to the task use the exact following format:\n\nThought: I now can
|
||||
give a great answer\nFinal Answer: my best complete final answer to the task.\nYour
|
||||
final answer must be the great and the most complete as possible, it must be
|
||||
outcome described.\n\nI MUST use these formats, my job depends on it!\nCurrent
|
||||
Task: Give me an integer score between 1-5 for the following title: ''The impact
|
||||
of AI in the future of work''\n\nThis is the expect criteria for your final
|
||||
answer: The score of the title. \n you MUST return the actual complete content
|
||||
as the final answer, not a summary.\n\nBegin! This is VERY important to you,
|
||||
use the tools available and give your best Final Answer, your job depends on
|
||||
it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o", "n": 1, "stop": ["\nObservation"],
|
||||
"stream": true, "temperature": 0.7}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate, br
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '1012'
|
||||
- '997'
|
||||
content-type:
|
||||
- application/json
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.12.0
|
||||
- OpenAI/Python 1.35.10
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
@@ -86,7 +37,7 @@ interactions:
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.12.0
|
||||
- 1.35.10
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
@@ -95,265 +46,64 @@ interactions:
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: 'data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
string: 'data: {"id":"chatcmpl-9hED3nsqYzSiiorjVCmuS96jevbba","object":"chat.completion.chunk","created":1720089753,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"Considering"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9hED3nsqYzSiiorjVCmuS96jevbba","object":"chat.completion.chunk","created":1720089753,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
the"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-9hED3nsqYzSiiorjVCmuS96jevbba","object":"chat.completion.chunk","created":1720089753,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
relevance"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
of"},"logprobs":null,"finish_reason":null}]}
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
the"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
topic"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":","},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
the"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
straightforward"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"ness"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
the"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
title"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
and"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
its"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
ability"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
to"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
capt"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"ivate"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
attention"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
due"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
to"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
the"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
current"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
global"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
interest"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
in"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
and"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
its"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
future"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
implications"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":","},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9hED3nsqYzSiiorjVCmuS96jevbba","object":"chat.completion.chunk","created":1720089753,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
I"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
would"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9hED3nsqYzSiiorjVCmuS96jevbba","object":"chat.completion.chunk","created":1720089753,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
now"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9hED3nsqYzSiiorjVCmuS96jevbba","object":"chat.completion.chunk","created":1720089753,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
can"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hED3nsqYzSiiorjVCmuS96jevbba","object":"chat.completion.chunk","created":1720089753,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
give"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
it"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9hED3nsqYzSiiorjVCmuS96jevbba","object":"chat.completion.chunk","created":1720089753,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
a"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
high"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9hED3nsqYzSiiorjVCmuS96jevbba","object":"chat.completion.chunk","created":1720089753,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
great"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
score"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9hED3nsqYzSiiorjVCmuS96jevbba","object":"chat.completion.chunk","created":1720089753,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"."},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9hED3nsqYzSiiorjVCmuS96jevbba","object":"chat.completion.chunk","created":1720089753,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
\n\n"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9hED3nsqYzSiiorjVCmuS96jevbba","object":"chat.completion.chunk","created":1720089753,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9hED3nsqYzSiiorjVCmuS96jevbba","object":"chat.completion.chunk","created":1720089753,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
Answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9hED3nsqYzSiiorjVCmuS96jevbba","object":"chat.completion.chunk","created":1720089753,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
The"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
title"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
''"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"The"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
impact"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
in"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
the"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
future"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
work"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"''"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
scores"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
a"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9hED3nsqYzSiiorjVCmuS96jevbba","object":"chat.completion.chunk","created":1720089753,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"
|
||||
"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"4"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9hED3nsqYzSiiorjVCmuS96jevbba","object":"chat.completion.chunk","created":1720089753,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{"content":"4"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
out"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"5"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"."},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-8yMY0ocCeiaktjmCcObeikdUZPfLi","object":"chat.completion.chunk","created":1709397044,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
data: {"id":"chatcmpl-9hED3nsqYzSiiorjVCmuS96jevbba","object":"chat.completion.chunk","created":1720089753,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_4008e3b719","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
data: [DONE]
@@ -364,60 +114,53 @@ interactions:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 85e2c5e93ad900d7-GRU
|
||||
Cache-Control:
|
||||
- no-cache, must-revalidate
|
||||
- 89de825e5af34c0d-MIA
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Type:
|
||||
- text/event-stream
|
||||
- text/event-stream; charset=utf-8
|
||||
Date:
|
||||
- Sat, 02 Mar 2024 16:30:45 GMT
|
||||
- Thu, 04 Jul 2024 10:42:33 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Set-Cookie:
|
||||
- __cf_bm=29uZO.hVYKS5gYK5X6QpUY7F78cYAZS5Re6SlF1k0IE-1709397045-1.0.1.1-0R8xzeW3JkesWRs7meP9a8me.3lSCf1DoFGhgIP2AQEe8B6hF98sBDkh4JwHVXhf7vAWVGSdnfsBCbZpuXm8wQ;
|
||||
path=/; expires=Sat, 02-Mar-24 17:00:45 GMT; domain=.api.openai.com; HttpOnly;
|
||||
- __cf_bm=XNWuPzzvDCR6vj8X4pwaZq_1zuK8TwGTpIqHQc0EbWw-1720089753-1.0.1.1-f61Hw2P4yRgm8mOUN2RhRrvndJQwdxwAS5T8bsfbqXLXSlbSKQONzTKvwOzVDnhHR3gy56nDVq.uAOE1cvvDDQ;
|
||||
path=/; expires=Thu, 04-Jul-24 11:12:33 GMT; domain=.api.openai.com; HttpOnly;
|
||||
Secure; SameSite=None
|
||||
- _cfuvid=NyDbNO5ZfRQYnj5YFoyLtS.qoTk641EufpijridgyHo-1709397045493-0.0.1.1-604800000;
|
||||
- _cfuvid=ZIy1L3HZwWapuY1KTKhqiOCKReYrjZwlhU2BUCsEpUs-1720089753602-0.0.1.1-604800000;
|
||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
access-control-allow-origin:
|
||||
- '*'
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-model:
|
||||
- gpt-4-0613
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '234'
|
||||
- '135'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=15724800; includeSubDomains
|
||||
- max-age=31536000; includeSubDomains
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '300000'
|
||||
- '16000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '299768'
|
||||
- '15999772'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 46ms
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_42d5dbd362d2ba837a2d9a0303d93854
|
||||
- req_ae342ee6e026c54b420e69ccb8235272
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "user", "content": "The title ''The impact of AI
|
||||
in the future of work'' scores a 4 out of 5."}, {"role": "system", "content":
|
||||
"I''m gonna convert this raw text into valid JSON."}], "model": "gpt-4", "tool_choice":
|
||||
body: '{"messages": [{"role": "user", "content": "4"}, {"role": "system", "content":
|
||||
"I''m gonna convert this raw text into valid JSON."}], "model": "gpt-4o", "tool_choice":
|
||||
{"type": "function", "function": {"name": "ScoreOutput"}}, "tools": [{"type":
|
||||
"function", "function": {"name": "ScoreOutput", "description": "Correctly extracted
|
||||
`ScoreOutput` with all the required parameters with correct types", "parameters":
|
||||
@@ -427,20 +170,20 @@ interactions:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate, br
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '588'
|
||||
- '519'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=29uZO.hVYKS5gYK5X6QpUY7F78cYAZS5Re6SlF1k0IE-1709397045-1.0.1.1-0R8xzeW3JkesWRs7meP9a8me.3lSCf1DoFGhgIP2AQEe8B6hF98sBDkh4JwHVXhf7vAWVGSdnfsBCbZpuXm8wQ;
|
||||
_cfuvid=NyDbNO5ZfRQYnj5YFoyLtS.qoTk641EufpijridgyHo-1709397045493-0.0.1.1-604800000
|
||||
- __cf_bm=XNWuPzzvDCR6vj8X4pwaZq_1zuK8TwGTpIqHQc0EbWw-1720089753-1.0.1.1-f61Hw2P4yRgm8mOUN2RhRrvndJQwdxwAS5T8bsfbqXLXSlbSKQONzTKvwOzVDnhHR3gy56nDVq.uAOE1cvvDDQ;
|
||||
_cfuvid=ZIy1L3HZwWapuY1KTKhqiOCKReYrjZwlhU2BUCsEpUs-1720089753602-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.12.0
|
||||
- OpenAI/Python 1.35.10
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
@@ -450,7 +193,7 @@ interactions:
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.12.0
|
||||
- 1.35.10
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
@@ -460,60 +203,55 @@ interactions:
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
ISALACBGzZmPUTurS3j2kVTLAT6a5FL++b8ZuDRoD4rG1Db90i0Y9gKZWqIJBNTZ2ToXtzv0emRN
|
||||
ePbqAcwjCjDMfBcqK9uz88diNH4LvscP964vn+/L7+hu+vN6NvL0w5YH0ARFHLqeeMfFQM2NhiAn
|
||||
l5/FEQX60958OJ/2RjPhHWWiWFKAqXXtUbs36Q97nWYmD+OaAisPAK6lEcB377sU6LVaES8MgQL4
|
||||
EcDKyJgC9Os6r52vHVuAXNi6SQHdSCmcc8bIXehLSQ7l4LWeW6S/+lLuksPx1H/5/SlN9tw/x+PH
|
||||
sa9/Xh5OnJB65mwHhXpglGJQWZ0gjADUvprQW3+hqeKvxtnGqTAAmvj3FOB1rYE169BU8ZoCo7X+
|
||||
p+rGf6+Obejxf4pllCa1lQnqBcMZtQ7Y3RA2RgHWzlga698DNpasNsa5QlsZZd3OmTLWNQXmM5UA
|
||||
90weAKfcDw/kRGa/N/XGSDCMwMouyXUaV7bKzSp7/54BAw==
|
||||
H4sIAAAAAAAAA2xSS2/bMAy++1cIPMeDm9pN49se3YYVWwKsww5rYagy7biTRUGit6RB/vsgx43d
|
||||
YD4IBD9+D5DeR0JAU0IuQG0kq9bqeLm5+ZDWqvz4nJZ/7nbrBOV2/c0/+XdvP2uYBQY9PqHiF9Yb
|
||||
Ra3VyA2ZI6wcSsagerGYJ8n1cpGlPdBSiTrQastxSvE8madxksUXlwNxQ41CD7n4FQkhxL5/Q0RT
|
||||
4hZykcxeOi16L2uE/DQkBDjSoQPS+8azNAyzEVRkGE1IbTqtJwAT6UJJrUfj47ef1OOepNbF8uuP
|
||||
9ep2O7+6/Zvebd//VKvdzacvyk78jtI72weqOqNO+5ngp35+ZiYEGNn23O+KHK46th2f0YUA6equ
|
||||
RcMhOuzvwYfhe8jTA7waPUT/qx+G6nBaq6baOnr0Z1uCqjGN3xQOpe/TgmeyR4sg99Cfr3t1EbCO
|
||||
WssF0280QfB6uB6M/8sIZgPGxFJPOFk0xAO/84xtUTWmRmdd058SKluU2eLqMllUywSiQ/QPAAD/
|
||||
/wMAo1FYu9MCAAA=
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 85e2c5fe7ccc00d7-GRU
|
||||
Cache-Control:
|
||||
- no-cache, must-revalidate
|
||||
- 89de8261dc6c4c0d-MIA
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- br
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Sat, 02 Mar 2024 16:30:48 GMT
|
||||
- Thu, 04 Jul 2024 10:42:34 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
access-control-allow-origin:
|
||||
- '*'
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-model:
|
||||
- gpt-4-0613
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '516'
|
||||
- '193'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=15724800; includeSubDomains
|
||||
- max-age=31536000; includeSubDomains
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '300000'
|
||||
- '16000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '299951'
|
||||
- '15999969'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 9ms
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_f5cc008a441e72f5c41ca547cac30f03
|
||||
- req_7cf743796bf4a52626b923135ff8f936
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
|
||||
@@ -1,168 +1,33 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: !!binary |
|
||||
CqcuCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkS/i0KEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRKNAQoQWsK2m2g3VyJHWaXnQym/VhIII86T/ZrUN8EqClRvb2wgVXNhZ2UwATnYaVBJ
|
||||
jjbPF0FYJVFJjjbPF0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjMwLjZKHwoJdG9vbF9uYW1lEhIK
|
||||
EGdldF9maW5hbF9hbnN3ZXJKDgoIYXR0ZW1wdHMSAhgBegIYARKNAQoQ6Z32Q3lxb6RU5NbYTjLU
|
||||
SxIIqcY2PQT6l3gqClRvb2wgVXNhZ2UwATnAus9KjjbPF0EYU9BKjjbPF0oaCg5jcmV3YWlfdmVy
|
||||
c2lvbhIICgYwLjMwLjZKHwoJdG9vbF9uYW1lEhIKEGdldF9maW5hbF9hbnN3ZXJKDgoIYXR0ZW1w
|
||||
dHMSAhgBegIYARKNAQoQwek7gO4ckm9Jvn6BwHW9QxII3T6AbeLq7EIqClRvb2wgVXNhZ2UwATlY
|
||||
cF9PjjbPF0GoM2BPjjbPF0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjMwLjZKHwoJdG9vbF9uYW1l
|
||||
EhIKEGdldF9maW5hbF9hbnN3ZXJKDgoIYXR0ZW1wdHMSAhgBegIYARLKAQoQxj4ww0iUXwfHLphC
|
||||
57m2khIIjJFrK75dr+gqEFRvb2wgVXNhZ2UgRXJyb3IwATlQt3hRjjbPF0GIfnlRjjbPF0oaCg5j
|
||||
cmV3YWlfdmVyc2lvbhIICgYwLjMwLjZKZgoDbGxtEl8KXXsibmFtZSI6IG51bGwsICJtb2RlbF9u
|
||||
YW1lIjogImdwdC00LTAxMjUtcHJldmlldyIsICJ0ZW1wZXJhdHVyZSI6IDAuNywgImNsYXNzIjog
|
||||
IkNoYXRPcGVuQUkifXoCGAESygEKEArpOi/Us2A3PpB8Gii7j88SCPzLVYFtDQSsKhBUb29sIFVz
|
||||
YWdlIEVycm9yMAE56MF6Uo42zxdBaH17Uo42zxdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4zMC42
|
||||
SmYKA2xsbRJfCl17Im5hbWUiOiBudWxsLCAibW9kZWxfbmFtZSI6ICJncHQtNC0wMTI1LXByZXZp
|
||||
ZXciLCAidGVtcGVyYXR1cmUiOiAwLjcsICJjbGFzcyI6ICJDaGF0T3BlbkFJIn16AhgBEsoBChBt
|
||||
yk7Qq7Sgw6WMHkcEj6WfEgijl4wq3/wt7SoQVG9vbCBVc2FnZSBFcnJvcjABOeAyk1OONs8XQdja
|
||||
k1OONs8XShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuMzAuNkpmCgNsbG0SXwpdeyJuYW1lIjogbnVs
|
||||
bCwgIm1vZGVsX25hbWUiOiAiZ3B0LTQtMDEyNS1wcmV2aWV3IiwgInRlbXBlcmF0dXJlIjogMC43
|
||||
LCAiY2xhc3MiOiAiQ2hhdE9wZW5BSSJ9egIYARJiChAmBkQ6MDSlxvmv4YVRGzmdEgj1PJQNKV6C
|
||||
sCoQVG9vbCBVc2FnZSBFcnJvcjABOfgzEleONs8XQQjYEleONs8XShoKDmNyZXdhaV92ZXJzaW9u
|
||||
EggKBjAuMzAuNnoCGAESjQEKEAwZ3MFe3tvKT/3fZgpV4ywSCEicyz0QPrdVKgpUb29sIFVzYWdl
|
||||
MAE5CPdvWI42zxdBWLpwWI42zxdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4zMC42Sh8KCXRvb2xf
|
||||
bmFtZRISChBnZXRfZmluYWxfYW5zd2VySg4KCGF0dGVtcHRzEgIYAXoCGAESjQEKEGIhNyFGf0X0
|
||||
iubEVUn96JASCCPQkQmTahyuKgpUb29sIFVzYWdlMAE5gFEoWo42zxdBgEspWo42zxdKGgoOY3Jl
|
||||
d2FpX3ZlcnNpb24SCAoGMC4zMC42Sh8KCXRvb2xfbmFtZRISChBnZXRfZmluYWxfYW5zd2VySg4K
|
||||
CGF0dGVtcHRzEgIYAXoCGAESjQEKECluqAOjM96Xcuo2pY/AxrUSCE2CDrSmkuFVKgpUb29sIFVz
|
||||
YWdlMAE5YJvZW442zxdBoLTaW442zxdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4zMC42Sh8KCXRv
|
||||
b2xfbmFtZRISChBnZXRfZmluYWxfYW5zd2VySg4KCGF0dGVtcHRzEgIYAXoCGAESYgoQ3rypXzEG
|
||||
OaBuogXtWu23OxIIJU2tAn4rSesqEFRvb2wgVXNhZ2UgRXJyb3IwATmAst9fjjbPF0GgfeBfjjbP
|
||||
F0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjMwLjZ6AhgBEpcBChCoJakn4gJ/bsmLqhOQ+pzJEggZ
|
||||
FE39y55kSSoKVG9vbCBVc2FnZTABOdAenmKONs8XQdAYn2KONs8XShoKDmNyZXdhaV92ZXJzaW9u
|
||||
EggKBjAuMzAuNkopCgl0b29sX25hbWUSHAoaRGVsZWdhdGUgd29yayB0byBjby13b3JrZXJKDgoI
|
||||
YXR0ZW1wdHMSAhgBegIYARKWAQoQbG8DmDglapO/4qWaAMeMhhII+ZHmV8V3pYsqClRvb2wgVXNh
|
||||
Z2UwATlwxGFojjbPF0Goi2JojjbPF0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjMwLjZKKAoJdG9v
|
||||
bF9uYW1lEhsKGUFzayBxdWVzdGlvbiB0byBjby13b3JrZXJKDgoIYXR0ZW1wdHMSAhgBegIYARKL
|
||||
AQoQA3elhWektmDyQtq61PMfARIInyy2MM6Y3gwqClRvb2wgVXNhZ2UwATnoK/drjjbPF0EI9/dr
|
||||
jjbPF0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjMwLjZKHQoJdG9vbF9uYW1lEhAKDmxlYXJuX2Fi
|
||||
b3V0X0FJSg4KCGF0dGVtcHRzEgIYAXoCGAESiwEKEGOUc3nLZT3TTF1UqTEf3aYSCAIG5j6E1JLk
|
||||
KgpUb29sIFVzYWdlMAE5wIxYbY42zxdBuDRZbY42zxdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4z
|
||||
MC42Sh0KCXRvb2xfbmFtZRIQCg5sZWFybl9hYm91dF9BSUoOCghhdHRlbXB0cxICGAF6AhgBEmIK
|
||||
ELpJQh4iRyb0LtlYOjZkn+8SCCPJgujx+G0+KhBUb29sIFVzYWdlIEVycm9yMAE5ULmabo42zxdB
|
||||
wE2bbo42zxdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4zMC42egIYARLzAQoQFQ74Wok+KKV4rR5E
|
||||
kSYVMRIIJRc3G0tgxY4qClRvb2wgVXNhZ2UwATlYx7FzjjbPF0EYorJzjjbPF0oaCg5jcmV3YWlf
|
||||
dmVyc2lvbhIICgYwLjMwLjZKHQoJdG9vbF9uYW1lEhAKDmxlYXJuX2Fib3V0X0FJSg4KCGF0dGVt
|
||||
cHRzEgIYAUpmCgNsbG0SXwpdeyJuYW1lIjogbnVsbCwgIm1vZGVsX25hbWUiOiAiZ3B0LTMuNS10
|
||||
dXJiby0wMTI1IiwgInRlbXBlcmF0dXJlIjogMC43LCAiY2xhc3MiOiAiQ2hhdE9wZW5BSSJ9egIY
|
||||
ARLzAQoQnJMORTVCM7CSablsUBxMJxIIQCKuv5AXvFEqClRvb2wgVXNhZ2UwATl4LDp1jjbPF0EQ
|
||||
5Dp1jjbPF0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjMwLjZKHQoJdG9vbF9uYW1lEhAKDmxlYXJu
|
||||
X2Fib3V0X0FJSg4KCGF0dGVtcHRzEgIYAUpmCgNsbG0SXwpdeyJuYW1lIjogbnVsbCwgIm1vZGVs
|
||||
X25hbWUiOiAiZ3B0LTMuNS10dXJiby0wMTI1IiwgInRlbXBlcmF0dXJlIjogMC43LCAiY2xhc3Mi
|
||||
OiAiQ2hhdE9wZW5BSSJ9egIYARLzAQoQAfw9BLcOs9bjmKBOkNP2pRII8jFQggGaAhsqClRvb2wg
|
||||
VXNhZ2UwATlAbQJ3jjbPF0HwIAN3jjbPF0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjMwLjZKHQoJ
|
||||
dG9vbF9uYW1lEhAKDmxlYXJuX2Fib3V0X0FJSg4KCGF0dGVtcHRzEgIYAUpmCgNsbG0SXwpdeyJu
|
||||
YW1lIjogbnVsbCwgIm1vZGVsX25hbWUiOiAiZ3B0LTMuNS10dXJiby0wMTI1IiwgInRlbXBlcmF0
|
||||
dXJlIjogMC43LCAiY2xhc3MiOiAiQ2hhdE9wZW5BSSJ9egIYARKXAQoQuzP2zqalo852qSEZJPoW
|
||||
/RIInedZySqvqygqClRvb2wgVXNhZ2UwATkYAGWujjbPF0Go4mWujjbPF0oaCg5jcmV3YWlfdmVy
|
||||
c2lvbhIICgYwLjMwLjZKKQoJdG9vbF9uYW1lEhwKGkRlbGVnYXRlIHdvcmsgdG8gY28td29ya2Vy
|
||||
Sg4KCGF0dGVtcHRzEgIYAXoCGAESlwEKEATlyOCYjHSp1K3QoCeqMmUSCP/SCa7gOMy+KgpUb29s
|
||||
IFVzYWdlMAE5QKeKuY42zxdBqGaLuY42zxdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4zMC42SikK
|
||||
CXRvb2xfbmFtZRIcChpEZWxlZ2F0ZSB3b3JrIHRvIGNvLXdvcmtlckoOCghhdHRlbXB0cxICGAF6
|
||||
AhgBEpcBChDCNHUdgM9SPykW9jZLAszpEgjxF3ox7N4UJyoKVG9vbCBVc2FnZTABOeDPiMqONs8X
|
||||
QdCiicqONs8XShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuMzAuNkopCgl0b29sX25hbWUSHAoaRGVs
|
||||
ZWdhdGUgd29yayB0byBjby13b3JrZXJKDgoIYXR0ZW1wdHMSAhgBegIYARKHAQoQ79rs2SVt97mT
|
||||
uX/cWDscnBII6rrlGA8RtbMqClRvb2wgVXNhZ2UwATkYFcz6jjbPF0FQ3Mz6jjbPF0oaCg5jcmV3
|
||||
YWlfdmVyc2lvbhIICgYwLjMwLjZKGQoJdG9vbF9uYW1lEgwKCm11bHRpcGxpZXJKDgoIYXR0ZW1w
|
||||
dHMSAhgBegIYARKHAQoQVP9tyrp7Wv9IolbUgao9uBII8inECsdSecYqClRvb2wgVXNhZ2UwATl4
|
||||
JZH8jjbPF0GIyZH8jjbPF0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjMwLjZKGQoJdG9vbF9uYW1l
|
||||
EgwKCm11bHRpcGxpZXJKDgoIYXR0ZW1wdHMSAhgBegIYARJiChCQ/ikD6wODQRRK4sOy00xbEgh3
|
||||
W3X41OAJQSoQVG9vbCBVc2FnZSBFcnJvcjABOXi4pf+ONs8XQeB3pv+ONs8XShoKDmNyZXdhaV92
|
||||
ZXJzaW9uEggKBjAuMzAuNnoCGAESYgoQ53JWAKq1hJb9rscUFhC6GhIIPzTRaeMJnJYqEFRvb2wg
|
||||
VXNhZ2UgRXJyb3IwATkIPJsAjzbPF0GQzJsAjzbPF0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjMw
|
||||
LjZ6AhgBEvMBChC/rx/MIKAdxGOMAfLmXNJoEghCqOptNDfKXSoKVG9vbCBVc2FnZTABOaAAQwuP
|
||||
Ns8XQejuQwuPNs8XShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuMzAuNkodCgl0b29sX25hbWUSEAoO
|
||||
bGVhcm5fYWJvdXRfQUlKDgoIYXR0ZW1wdHMSAhgBSmYKA2xsbRJfCl17Im5hbWUiOiBudWxsLCAi
|
||||
bW9kZWxfbmFtZSI6ICJncHQtMy41LXR1cmJvLTAxMjUiLCAidGVtcGVyYXR1cmUiOiAwLjcsICJj
|
||||
bGFzcyI6ICJDaGF0T3BlbkFJIn16AhgBEsoBChAJc/cdYOh6bOxXLE4BSMa0Egg1m5FL4NyjeCoQ
|
||||
VG9vbCBVc2FnZSBFcnJvcjABOXiAigyPNs8XQZhLiwyPNs8XShoKDmNyZXdhaV92ZXJzaW9uEggK
|
||||
BjAuMzAuNkpmCgNsbG0SXwpdeyJuYW1lIjogbnVsbCwgIm1vZGVsX25hbWUiOiAiZ3B0LTMuNS10
|
||||
dXJiby0wMTI1IiwgInRlbXBlcmF0dXJlIjogMC43LCAiY2xhc3MiOiAiQ2hhdE9wZW5BSSJ9egIY
|
||||
ARLzAQoQIwsE8AeUGx2Zlpsj/8cIpxIIGiYkqh1O8V8qClRvb2wgVXNhZ2UwATmIou0NjzbPF0EI
|
||||
Xu4NjzbPF0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjMwLjZKHQoJdG9vbF9uYW1lEhAKDmxlYXJu
|
||||
X2Fib3V0X0FJSg4KCGF0dGVtcHRzEgIYAUpmCgNsbG0SXwpdeyJuYW1lIjogbnVsbCwgIm1vZGVs
|
||||
X25hbWUiOiAiZ3B0LTMuNS10dXJiby0wMTI1IiwgInRlbXBlcmF0dXJlIjogMC43LCAiY2xhc3Mi
|
||||
OiAiQ2hhdE9wZW5BSSJ9egIYARLzAQoQzE4sWSyDKyW+MsjMu3qaURII0Ek+EFuG+pQqClRvb2wg
|
||||
VXNhZ2UwATl4knUPjzbPF0FYPnYPjzbPF0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjMwLjZKHQoJ
|
||||
dG9vbF9uYW1lEhAKDmxlYXJuX2Fib3V0X0FJSg4KCGF0dGVtcHRzEgIYAUpmCgNsbG0SXwpdeyJu
|
||||
YW1lIjogbnVsbCwgIm1vZGVsX25hbWUiOiAiZ3B0LTMuNS10dXJiby0wMTI1IiwgInRlbXBlcmF0
|
||||
dXJlIjogMC43LCAiY2xhc3MiOiAiQ2hhdE9wZW5BSSJ9egIYARKIAQoQV5Ebj7z0WSmnDL6cuEwc
|
||||
aRIIjQwTYaUnU+0qClRvb2wgVXNhZ2UwATmoPboXjzbPF0Eo+boXjzbPF0oaCg5jcmV3YWlfdmVy
|
||||
c2lvbhIICgYwLjMwLjZKGgoJdG9vbF9uYW1lEg0KC3JldHVybl9kYXRhSg4KCGF0dGVtcHRzEgIY
|
||||
AXoCGAESlwEKEC6MfuEYt3qySHDPP/fGhDoSCME60nPc2PP5KgpUb29sIFVzYWdlMAE5kBWpIo82
|
||||
zxdBmOSpIo82zxdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4zMC42SikKCXRvb2xfbmFtZRIcChpE
|
||||
ZWxlZ2F0ZSB3b3JrIHRvIGNvLXdvcmtlckoOCghhdHRlbXB0cxICGAF6AhgBEo8BChAW4yhha9hM
|
||||
Bq622l4ljFgrEgiqbqTTWlbblCoKVG9vbCBVc2FnZTABOcgCViiPNs8XQVjlViiPNs8XShoKDmNy
|
||||
ZXdhaV92ZXJzaW9uEggKBjAuMzAuNkohCgl0b29sX25hbWUSFAoSbXVsdGlwbGNhdGlvbl90b29s
|
||||
Sg4KCGF0dGVtcHRzEgIYAXoCGAESjwEKEDMq2VDCq4/7QaQhcHACuUQSCN8bD12jrjruKgpUb29s
|
||||
IFVzYWdlMAE5OAsOKo82zxdBSK8OKo82zxdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC4zMC42SiEK
|
||||
CXRvb2xfbmFtZRIUChJtdWx0aXBsY2F0aW9uX3Rvb2xKDgoIYXR0ZW1wdHMSAhgBegIYARKPAQoQ
|
||||
7Q6do5ZLFG1iUnTRbk8ZOBIISweZORdlmm8qClRvb2wgVXNhZ2UwATngLc4rjzbPF0HY1c4rjzbP
|
||||
F0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjMwLjZKIQoJdG9vbF9uYW1lEhQKEm11bHRpcGxjYXRp
|
||||
b25fdG9vbEoOCghhdHRlbXB0cxICGAF6AhgBEo8BChDAqYWXAeTztQaza44KbHvPEgjLnc1ioqFZ
|
||||
2yoKVG9vbCBVc2FnZTABOWAPky2PNs8XQaCrky2PNs8XShoKDmNyZXdhaV92ZXJzaW9uEggKBjAu
|
||||
MzAuNkohCgl0b29sX25hbWUSFAoSbXVsdGlwbGNhdGlvbl90b29sSg4KCGF0dGVtcHRzEgIYAXoC
|
||||
GAE=
headers:
Accept:
- '*/*'
Accept-Encoding:
- gzip, deflate, br
Connection:
- keep-alive
Content-Length:
- '5930'
Content-Type:
- application/x-protobuf
User-Agent:
- OTel-OTLP-Exporter-Python/1.24.0
method: POST
uri: https://telemetry.crewai.com:4319/v1/traces
response:
body:
string: "\n\0"
headers:
Content-Length:
- '2'
Content-Type:
- application/x-protobuf
Date:
- Tue, 14 May 2024 01:26:13 GMT
status:
code: 200
message: OK
- request:
body: '{"messages": [{"role": "user", "content": "You are Scorer. You''re an expert
scorer, specialized in scoring titles.\nYour personal goal is: Score the titleTo
give my best complete final answer to the task use the exact following format:\n\nThought:
I now can give a great answer\nFinal Answer: my best complete final answer to
the task.\nYour final answer must be the great and the most complete as possible,
it must be outcome described.\n\nI MUST use these formats, my job depends on
it!\nCurrent Task: Give me an integer score between 1-5 for the following title:
''The impact of AI in the future of work''\n\nThis is the expect criteria for
your final answer: The score of the title. \n you MUST return the actual complete
content as the final answer, not a summary.\n\nBegin! This is VERY important
to you, use the tools available and give your best Final Answer, your job depends
on it!\n\nThought:\n"}], "model": "gpt-4", "n": 1, "stop": ["\nObservation"],
body: '{"messages": [{"content": "You are Scorer. You''re an expert scorer, specialized
in scoring titles.\nYour personal goal is: Score the titleTo give my best complete
final answer to the task use the exact following format:\n\nThought: I now can
give a great answer\nFinal Answer: my best complete final answer to the task.\nYour
final answer must be the great and the most complete as possible, it must be
outcome described.\n\nI MUST use these formats, my job depends on it!\nCurrent
Task: Give me an integer score between 1-5 for the following title: ''The impact
of AI in the future of work''\n\nThis is the expect criteria for your final
answer: The score of the title. \n you MUST return the actual complete content
as the final answer, not a summary.\n\nBegin! This is VERY important to you,
use the tools available and give your best Final Answer, your job depends on
it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o", "n": 1, "stop": ["\nObservation"],
"stream": true, "temperature": 0.7}'
headers:
accept:
- application/json
accept-encoding:
- gzip, deflate, br
- gzip, deflate
connection:
- keep-alive
content-length:
- '996'
- '997'
content-type:
- application/json
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.29.0
- OpenAI/Python 1.35.10
x-stainless-arch:
- arm64
x-stainless-async:
@@ -172,7 +37,7 @@ interactions:
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.29.0
- 1.35.10
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
@@ -181,268 +46,64 @@ interactions:
uri: https://api.openai.com/v1/chat/completions
response:
body:
string: 'data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
string: 'data: {"id":"chatcmpl-9hry2om1JBkreHpDHFbfD2YDtg2oA","object":"chat.completion.chunk","created":1720242582,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"After"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9hry2om1JBkreHpDHFbfD2YDtg2oA","object":"chat.completion.chunk","created":1720242582,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
considering"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9hry2om1JBkreHpDHFbfD2YDtg2oA","object":"chat.completion.chunk","created":1720242582,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
the"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
relevance"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":","},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
depth"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":","},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
and"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
potential"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
interest"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
in"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
the"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
topic"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":","},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
as"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
well"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
as"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
the"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
clarity"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
and"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
simplicity"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
the"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
title"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
''"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"The"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
impact"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
in"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
the"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
future"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
work"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"''.\n\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
Answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
Based"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
on"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
my"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
evaluation"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":","},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9hry2om1JBkreHpDHFbfD2YDtg2oA","object":"chat.completion.chunk","created":1720242582,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"
|
||||
I"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9hry2om1JBkreHpDHFbfD2YDtg2oA","object":"chat.completion.chunk","created":1720242582,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"
|
||||
now"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hry2om1JBkreHpDHFbfD2YDtg2oA","object":"chat.completion.chunk","created":1720242582,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"
|
||||
can"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hry2om1JBkreHpDHFbfD2YDtg2oA","object":"chat.completion.chunk","created":1720242582,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"
|
||||
give"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
the"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
title"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
''"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"The"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
impact"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
in"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
the"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
future"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
work"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"''"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9hry2om1JBkreHpDHFbfD2YDtg2oA","object":"chat.completion.chunk","created":1720242582,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"
|
||||
a"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
score"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9hry2om1JBkreHpDHFbfD2YDtg2oA","object":"chat.completion.chunk","created":1720242582,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"
|
||||
great"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9hry2om1JBkreHpDHFbfD2YDtg2oA","object":"chat.completion.chunk","created":1720242582,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"
|
||||
answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
data: {"id":"chatcmpl-9hry2om1JBkreHpDHFbfD2YDtg2oA","object":"chat.completion.chunk","created":1720242582,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hry2om1JBkreHpDHFbfD2YDtg2oA","object":"chat.completion.chunk","created":1720242582,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hry2om1JBkreHpDHFbfD2YDtg2oA","object":"chat.completion.chunk","created":1720242582,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"
|
||||
Answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hry2om1JBkreHpDHFbfD2YDtg2oA","object":"chat.completion.chunk","created":1720242582,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9hry2om1JBkreHpDHFbfD2YDtg2oA","object":"chat.completion.chunk","created":1720242582,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"
|
||||
"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"4"},"logprobs":null,"finish_reason":null}]}
|
||||
data: {"id":"chatcmpl-9hry2om1JBkreHpDHFbfD2YDtg2oA","object":"chat.completion.chunk","created":1720242582,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{"content":"4"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
out"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"
|
||||
"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"5"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"."},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ObDhPtGi5k6ALX2jaFMR0vwXhSx0","object":"chat.completion.chunk","created":1715649973,"model":"gpt-4-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
data: {"id":"chatcmpl-9hry2om1JBkreHpDHFbfD2YDtg2oA","object":"chat.completion.chunk","created":1720242582,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_d576307f90","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
|
||||
|
||||
data: [DONE]
@@ -453,20 +114,20 @@ interactions:
CF-Cache-Status:
- DYNAMIC
CF-RAY:
- 8837194f3860a686-MIA
- 89ed158b8bf0a566-MIA
Connection:
- keep-alive
Content-Type:
- text/event-stream; charset=utf-8
Date:
- Tue, 14 May 2024 01:26:14 GMT
- Sat, 06 Jul 2024 05:09:42 GMT
Server:
- cloudflare
Set-Cookie:
- __cf_bm=o_N1eUBbqWWHvbE622tXs5BC9yN1X1bxp3npQXI46JU-1715649974-1.0.1.1-ALOCo.oJW08V4.F5WqVArJDbxakTERtLPfwUlDjTYrrg_WiJox.Pw_n.PnDfEdxa52BbaPI8M80h2S3wdJy2sw;
path=/; expires=Tue, 14-May-24 01:56:14 GMT; domain=.api.openai.com; HttpOnly;
- __cf_bm=5C3MG9ni0I5bZoHGzfXZq16obGaD1INR3_.wX4CRPAk-1720242582-1.0.1.1-fZiD6L1FdBiC0gqcmBK9_IaHhbHPQi4z04fxYQtoDc9KbYqPvxm_sxP_RkuZX_AyPkHgu85IRq9E6MUAZJGzwQ;
path=/; expires=Sat, 06-Jul-24 05:39:42 GMT; domain=.api.openai.com; HttpOnly;
Secure; SameSite=None
- _cfuvid=vYif.qp5YND_OTGCG3IgxPXT3sqDTSpeVEIUapP.n.k-1715649974367-0.0.1.1-604800000;
- _cfuvid=YP7Z3XnHPKQDU2nOhrLzkxr8InOv42HLWchJd1ogneQ-1720242582534-0.0.1.1-604800000;
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
Transfer-Encoding:
- chunked
@@ -475,55 +136,54 @@ interactions:
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '510'
- '90'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=15724800; includeSubDomains
- max-age=31536000; includeSubDomains
x-ratelimit-limit-requests:
- '10000'
x-ratelimit-limit-tokens:
- '300000'
- '16000000'
x-ratelimit-remaining-requests:
- '9997'
- '9999'
x-ratelimit-remaining-tokens:
- '299098'
- '15999772'
x-ratelimit-reset-requests:
- 13ms
- 6ms
x-ratelimit-reset-tokens:
- 180ms
- 0s
x-request-id:
- req_26f0f1e02a54a0aaf7e8b96ba7d56837
- req_36d283adbca77945609f0da658047ba0
status:
code: 200
message: OK
- request:
body: '{"messages": [{"role": "user", "content": "Based on my evaluation, I give
the title ''The impact of AI in the future of work'' a score of 4 out of 5."},
{"role": "system", "content": "I''m gonna convert this raw text into valid JSON."}],
"model": "gpt-4", "tool_choice": {"type": "function", "function": {"name": "ScoreOutput"}},
"tools": [{"type": "function", "function": {"name": "ScoreOutput", "description":
"Correctly extracted `ScoreOutput` with all the required parameters with correct
types", "parameters": {"properties": {"score": {"title": "Score", "type": "integer"}},
"required": ["score"], "type": "object"}}}]}'
body: '{"messages": [{"role": "user", "content": "4"}, {"role": "system", "content":
"I''m gonna convert this raw text into valid JSON."}], "model": "gpt-4o", "tool_choice":
{"type": "function", "function": {"name": "ScoreOutput"}}, "tools": [{"type":
"function", "function": {"name": "ScoreOutput", "description": "Correctly extracted
`ScoreOutput` with all the required parameters with correct types", "parameters":
{"properties": {"score": {"title": "Score", "type": "integer"}}, "required":
["score"], "type": "object"}}}]}'
headers:
accept:
- application/json
accept-encoding:
- gzip, deflate, br
- gzip, deflate
connection:
- keep-alive
content-length:
- '621'
- '519'
content-type:
- application/json
cookie:
- __cf_bm=o_N1eUBbqWWHvbE622tXs5BC9yN1X1bxp3npQXI46JU-1715649974-1.0.1.1-ALOCo.oJW08V4.F5WqVArJDbxakTERtLPfwUlDjTYrrg_WiJox.Pw_n.PnDfEdxa52BbaPI8M80h2S3wdJy2sw;
_cfuvid=vYif.qp5YND_OTGCG3IgxPXT3sqDTSpeVEIUapP.n.k-1715649974367-0.0.1.1-604800000
- __cf_bm=5C3MG9ni0I5bZoHGzfXZq16obGaD1INR3_.wX4CRPAk-1720242582-1.0.1.1-fZiD6L1FdBiC0gqcmBK9_IaHhbHPQi4z04fxYQtoDc9KbYqPvxm_sxP_RkuZX_AyPkHgu85IRq9E6MUAZJGzwQ;
_cfuvid=YP7Z3XnHPKQDU2nOhrLzkxr8InOv42HLWchJd1ogneQ-1720242582534-0.0.1.1-604800000
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.29.0
- OpenAI/Python 1.35.10
x-stainless-arch:
- arm64
x-stainless-async:
@@ -533,7 +193,7 @@ interactions:
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.29.0
- 1.35.10
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
@@ -543,26 +203,27 @@ interactions:
response:
body:
string: !!binary |
g2QBAMTINfMxamd1CeMeEI6KoYFref5vBi4N2oOiMbVNv3QLhr1AppZoAgF1drbOzezQ65E14dmt
|
||||
ADjNKcFsksRMO1Ud3qXHqqnW3c7ZUq/M23vj7mI9eXr198urN1YEQJvOiiz2xGsuBurUGghycvlZ
|
||||
kVOi2W92e53hsN8X3tE2LxQlOHax2qk2es12r9OJnWZFoMSXAIBtaQTw3fsuJRqVVsQLQ6AEfgTQ
|
||||
W1VQgkkI0xATE1kB5MLWTUqYUinhXLRW/WWJUuRQDm7ruUX6a6LU3/3z09v8MH9W3WF6MLi0g3h7
|
||||
/LJO+pyQembtBoV6YJRiUFmdJIwANIme0FtPmfXFXRldGVUYAE38e0pw+22Ab4bM+uKbEp1vs6fq
|
||||
xr2oYz/0+J5iGZUdO2/TsGA4o9YBfzeEjVGCIVpHY+0F8GPJammcS3Teahf/op0XJlCi2ejpBLhp
|
||||
cghsmHJfPJATmc1mVwySYByBlb/R1IwL7/zUrrLYCwMD
|
||||
H4sIAAAAAAAAA2xS30/bMBB+z19h3XMzhbShJW8wiW1MGogixDRQ5DpOanB8ln1hK1X/d+Q0NKFa
|
||||
HqzTfff90F22EWOgSsgZiDUn0Vgdn63dJl3d3rSv5xuf/b7Di/PLq/JUVYub5wwmgYGrZynog/VF
|
||||
YGO1JIVmDwsnOcmgejJPk3SWZou0AxospQ602lI8wziAcZLFJ9OeuEYlpIec/YkYY2zbvSGiKeU/
|
||||
yFky+eg00nteS8gPQ4yBQx06wL1XnrghmAygQEPShNSm1XoEEKIuBNd6MN5/21E97IlrXdxePlzN
|
||||
/s6/rpZv3x+WF9P7nz++vdz/8iO/vfTGdoGq1ojDfkb4oZ8fmTEGhjcddynQyeuWbEtHdMaAu7pt
|
||||
pKEQHbaP4MPwI+SzHXwa3UX/q5/6andYq8baOlz5oy1BpYzy68JJ7ru04Ant3iLIPXXnaz9dBKzD
|
||||
xlJB+CJNEFz014PhfxnArMcIiesRJ4v6eOA3nmRTVMrU0lmnulNCZYsym59Ok3l1lkC0i94BAAD/
|
||||
/wMAylx2sdMCAAA=
headers:
CF-Cache-Status:
- DYNAMIC
CF-RAY:
- 88371965caa0a686-MIA
- 89ed158dee46a566-MIA
Connection:
- keep-alive
Content-Encoding:
- br
- gzip
Content-Type:
- application/json
Date:
- Tue, 14 May 2024 01:26:18 GMT
- Sat, 06 Jul 2024 05:09:42 GMT
Server:
- cloudflare
Transfer-Encoding:
@@ -572,25 +233,25 @@ interactions:
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '994'
- '144'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=15724800; includeSubDomains
- max-age=31536000; includeSubDomains
x-ratelimit-limit-requests:
- '10000'
x-ratelimit-limit-tokens:
- '300000'
- '16000000'
x-ratelimit-remaining-requests:
- '9999'
x-ratelimit-remaining-tokens:
- '299944'
- '15999969'
x-ratelimit-reset-requests:
- 6ms
x-ratelimit-reset-tokens:
- 11ms
- 0s
x-request-id:
- req_a9b61926ad5fdaf004dd4b99738bcadb
- req_990566332b9b1851c581486c0a4da0e6
status:
code: 200
message: OK

6244
tests/cassettes/test_sequential_async_task_execution_completion.yaml
Normal file
File diff suppressed because it is too large
@@ -0,0 +1,146 @@
interactions:
- request:
body: !!binary |
CtoyCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSsTIKEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRJpChCqSW3lVdefwRBLNXi6Xhm8EgijxkgGOCJOMSoQVG9vbCBVc2FnZSBFcnJvcjAB
|
||||
OQh+giEMv94XQYgthSEMv94XShsKDmNyZXdhaV92ZXJzaW9uEgkKBzAuMzAuMTF6AhgBhQEAAQAA
|
||||
EmkKEITX9xWzBQ0KqeidbMtD1zESCLEDq6L4lZGPKhBUb29sIFVzYWdlIEVycm9yMAE5OF5weAy/
|
||||
3hdBqOZyeAy/3hdKGwoOY3Jld2FpX3ZlcnNpb24SCQoHMC4zMC4xMXoCGAGFAQABAAASgAIKEPfc
|
||||
duYrDUwkgj4RUzAtD2cSCFdOYgVbkCuLKg5UYXNrIEV4ZWN1dGlvbjABOciuqqkLv94XQcA1NNcM
|
||||
v94XSjEKB3Rhc2tfaWQSJgokMzg2ZTBkMWQtMWVjNy00M2QzLTg3MWItNWU3ZjBiZjBkOWVmSi0K
|
||||
FWZvcm1hdHRlZF9kZXNjcmlwdGlvbhIUChJIb3cgbXVjaCBpcyAyICsgMj9KQwoZZm9ybWF0dGVk
|
||||
X2V4cGVjdGVkX291dHB1dBImCiRUaGUgcmVzdWx0IG9mIHRoZSBzdW0gYXMgYW4gaW50ZWdlci5K
|
||||
DQoGb3V0cHV0EgMKATR6AhgBhQEAAQAAErYLChC8sKh5qQC3H99eVsmrP4vCEgizWQpCjdW1WSoM
|
||||
Q3JldyBDcmVhdGVkMAE5eIbY2Qy/3hdBSILa2Qy/3hdKGwoOY3Jld2FpX3ZlcnNpb24SCQoHMC4z
|
||||
MC4xMUoaCg5weXRob25fdmVyc2lvbhIICgYzLjExLjVKMQoHY3Jld19pZBImCiQ1MjA2MDE0Zi0w
|
||||
NDUyLTQzNjQtOWZiMS1lNWM4MzZmNmZiYmJKHAoMY3Jld19wcm9jZXNzEgwKCnNlcXVlbnRpYWxK
|
||||
EQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdfbnVtYmVyX29mX3Rhc2tzEgIYAUobChVjcmV3X251
|
||||
bWJlcl9vZl9hZ2VudHMSAhgBStIECgtjcmV3X2FnZW50cxLCBAq/BFt7ImlkIjogIjU2OTBjOGJm
|
||||
LWVkMzEtNGU1OC04NzBhLTE2OWM3OTQ1ODdjOSIsICJyb2xlIjogIlJlc2VhcmNoZXIiLCAiZ29h
|
||||
bCI6ICJNYWtlIHRoZSBiZXN0IHJlc2VhcmNoIGFuZCBhbmFseXNpcyBvbiBjb250ZW50IGFib3V0
|
||||
IEFJIGFuZCBBSSBhZ2VudHMiLCAiYmFja3N0b3J5IjogIllvdSdyZSBhbiBleHBlcnQgcmVzZWFy
|
||||
Y2hlciwgc3BlY2lhbGl6ZWQgaW4gdGVjaG5vbG9neSwgc29mdHdhcmUgZW5naW5lZXJpbmcsIEFJ
|
||||
IGFuZCBzdGFydHVwcy4gWW91IHdvcmsgYXMgYSBmcmVlbGFuY2VyIGFuZCBpcyBub3cgd29ya2lu
|
||||
ZyBvbiBkb2luZyByZXNlYXJjaCBhbmQgYW5hbHlzaXMgZm9yIGEgbmV3IGN1c3RvbWVyLiIsICJ2
|
||||
ZXJib3NlPyI6IGZhbHNlLCAibWF4X2l0ZXIiOiAyNSwgIm1heF9ycG0iOiBudWxsLCAiaTE4biI6
|
||||
IG51bGwsICJsbG0iOiAie1wibmFtZVwiOiBudWxsLCBcIm1vZGVsX25hbWVcIjogXCJncHQtNG9c
|
||||
IiwgXCJ0ZW1wZXJhdHVyZVwiOiAwLjcsIFwiY2xhc3NcIjogXCJDaGF0T3BlbkFJXCJ9IiwgImRl
|
||||
bGVnYXRpb25fZW5hYmxlZD8iOiB0cnVlLCAidG9vbHNfbmFtZXMiOiBbXX1dStACCgpjcmV3X3Rh
|
||||
c2tzEsECCr4CW3siaWQiOiAiZTE5ODM4Y2EtOGNhMi00MzhiLThiNmMtNDFmM2VlYjJmMDA1Iiwg
|
||||
ImRlc2NyaXB0aW9uIjogIkxvb2sgYXQgdGhlIGF2YWlsYWJsZSBkYXRhIG5kIGdpdmUgbWUgYSBz
|
||||
ZW5zZSBvbiB0aGUgdG90YWwgbnVtYmVyIG9mIHNhbGVzLiIsICJleHBlY3RlZF9vdXRwdXQiOiAi
|
||||
VGhlIHRvdGFsIG51bWJlciBvZiBzYWxlcyBhcyBhbiBpbnRlZ2VyIiwgImFzeW5jX2V4ZWN1dGlv
|
||||
bj8iOiBmYWxzZSwgImh1bWFuX2lucHV0PyI6IGZhbHNlLCAiYWdlbnRfcm9sZSI6ICJSZXNlYXJj
|
||||
aGVyIiwgImNvbnRleHQiOiBudWxsLCAidG9vbHNfbmFtZXMiOiBbXX1dSioKCHBsYXRmb3JtEh4K
|
||||
HG1hY09TLTE0LjEuMS1hcm02NC1hcm0tNjRiaXRKHAoQcGxhdGZvcm1fcmVsZWFzZRIICgYyMy4x
|
||||
LjBKGwoPcGxhdGZvcm1fc3lzdGVtEggKBkRhcndpbkp7ChBwbGF0Zm9ybV92ZXJzaW9uEmcKZURh
|
||||
cndpbiBLZXJuZWwgVmVyc2lvbiAyMy4xLjA6IE1vbiBPY3QgIDkgMjE6Mjc6MjQgUERUIDIwMjM7
|
||||
IHJvb3Q6eG51LTEwMDAyLjQxLjl+Ni9SRUxFQVNFX0FSTTY0X1Q2MDAwSgoKBGNwdXMSAhgKegIY
|
||||
AYUBAAEAABL/CAoQ96LQBkWdya2FFyVOx27tLRIIXv2S0EgHgE4qDENyZXcgQ3JlYXRlZDABOdDJ
|
||||
bNwMv94XQRiybtwMv94XShsKDmNyZXdhaV92ZXJzaW9uEgkKBzAuMzAuMTFKGgoOcHl0aG9uX3Zl
|
||||
cnNpb24SCAoGMy4xMS41SjEKB2NyZXdfaWQSJgokOGM5OTQzNDMtYzcyYi00MzIwLTlkNTItNzI2
|
||||
MTlkNmNmZTc3ShwKDGNyZXdfcHJvY2VzcxIMCgpzZXF1ZW50aWFsShEKC2NyZXdfbWVtb3J5EgIQ
|
||||
AEoaChRjcmV3X251bWJlcl9vZl90YXNrcxICGAFKGwoVY3Jld19udW1iZXJfb2ZfYWdlbnRzEgIY
|
||||
AUr+AgoLY3Jld19hZ2VudHMS7gIK6wJbeyJpZCI6ICI1YThjOGRjZS00YmQyLTQyMWEtYjUzZC1k
|
||||
ZjE5ODEzMDFiYjEiLCAicm9sZSI6ICJSZXNlYXJjaGVyIiwgImdvYWwiOiAiQmUgc3VwZXIgZW1w
|
||||
YXRoZXRpYy4iLCAiYmFja3N0b3J5IjogIllvdSdyZSBsb3ZlIHRvIHNleSBob3dkeS4iLCAidmVy
|
||||
Ym9zZT8iOiBmYWxzZSwgIm1heF9pdGVyIjogMjUsICJtYXhfcnBtIjogbnVsbCwgImkxOG4iOiBu
|
||||
dWxsLCAibGxtIjogIntcIm5hbWVcIjogbnVsbCwgXCJtb2RlbF9uYW1lXCI6IFwiZ3B0LTRvXCIs
|
||||
IFwidGVtcGVyYXR1cmVcIjogMC43LCBcImNsYXNzXCI6IFwiQ2hhdE9wZW5BSVwifSIsICJkZWxl
|
||||
Z2F0aW9uX2VuYWJsZWQ/IjogZmFsc2UsICJ0b29sc19uYW1lcyI6IFtdfV1K7QEKCmNyZXdfdGFz
|
||||
a3MS3gEK2wFbeyJpZCI6ICJmNWEwMjU3Ni1iNTYxLTRiNzQtOTNhMC0yNmYxMDc2YWI5M2MiLCAi
|
||||
ZGVzY3JpcHRpb24iOiAic2F5IGhvd2R5IiwgImV4cGVjdGVkX291dHB1dCI6ICJIb3dkeSEiLCAi
|
||||
YXN5bmNfZXhlY3V0aW9uPyI6IGZhbHNlLCAiaHVtYW5faW5wdXQ/IjogZmFsc2UsICJhZ2VudF9y
|
||||
b2xlIjogIlJlc2VhcmNoZXIiLCAiY29udGV4dCI6IG51bGwsICJ0b29sc19uYW1lcyI6IFtdfV1K
|
||||
KgoIcGxhdGZvcm0SHgocbWFjT1MtMTQuMS4xLWFybTY0LWFybS02NGJpdEocChBwbGF0Zm9ybV9y
|
||||
ZWxlYXNlEggKBjIzLjEuMEobCg9wbGF0Zm9ybV9zeXN0ZW0SCAoGRGFyd2luSnsKEHBsYXRmb3Jt
|
||||
X3ZlcnNpb24SZwplRGFyd2luIEtlcm5lbCBWZXJzaW9uIDIzLjEuMDogTW9uIE9jdCAgOSAyMToy
|
||||
NzoyNCBQRFQgMjAyMzsgcm9vdDp4bnUtMTAwMDIuNDEuOX42L1JFTEVBU0VfQVJNNjRfVDYwMDBK
|
||||
CgoEY3B1cxICGAp6AhgBhQEAAQAAErgCChCZQqdNoQoZm/bgnzRGX2OfEgguN/7szlsLYioOVGFz
|
||||
ayBFeGVjdXRpb24wATmwOXfcDL/eF0FoPgTeDL/eF0oxCgd0YXNrX2lkEiYKJGY1YTAyNTc2LWI1
|
||||
NjEtNGI3NC05M2EwLTI2ZjEwNzZhYjkzY0okChVmb3JtYXR0ZWRfZGVzY3JpcHRpb24SCwoJc2F5
|
||||
IGhvd2R5SiUKGWZvcm1hdHRlZF9leHBlY3RlZF9vdXRwdXQSCAoGSG93ZHkhSmwKBm91dHB1dBJi
|
||||
CmBIb3dkeSEgSSBob3BlIHRoaXMgbWVzc2FnZSBmaW5kcyB5b3Ugd2VsbCBhbmQgYnJpbmdzIGEg
|
||||
c21pbGUgdG8geW91ciBmYWNlLiBIYXZlIGEgZmFudGFzdGljIGRheSF6AhgBhQEAAQAAEv8IChBG
|
||||
RLGvZYMTRLcpQuhEq3MREggMeXM0BUGtiSoMQ3JldyBDcmVhdGVkMAE5KKNZ4Qy/3hdBWI9b4Qy/
|
||||
3hdKGwoOY3Jld2FpX3ZlcnNpb24SCQoHMC4zMC4xMUoaCg5weXRob25fdmVyc2lvbhIICgYzLjEx
|
||||
LjVKMQoHY3Jld19pZBImCiQxN2Q3YmMzYi04MjE2LTQ4MWMtOGU2YS0zN2U4MDA3M2E1YmJKHAoM
|
||||
Y3Jld19wcm9jZXNzEgwKCnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdfbnVt
|
||||
YmVyX29mX3Rhc2tzEgIYAUobChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgBSv4CCgtjcmV3X2Fn
|
||||
ZW50cxLuAgrrAlt7ImlkIjogIjRiYTAzMmMzLWE1NDktNDQyMS05MjY0LTY1ZmVmODZhMTI3MiIs
|
||||
ICJyb2xlIjogIlJlc2VhcmNoZXIiLCAiZ29hbCI6ICJCZSBzdXBlciBlbXBhdGhldGljLiIsICJi
|
||||
YWNrc3RvcnkiOiAiWW91J3JlIGxvdmUgdG8gc2V5IGhvd2R5LiIsICJ2ZXJib3NlPyI6IGZhbHNl
|
||||
LCAibWF4X2l0ZXIiOiAyNSwgIm1heF9ycG0iOiBudWxsLCAiaTE4biI6IG51bGwsICJsbG0iOiAi
|
||||
e1wibmFtZVwiOiBudWxsLCBcIm1vZGVsX25hbWVcIjogXCJncHQtNG9cIiwgXCJ0ZW1wZXJhdHVy
|
||||
ZVwiOiAwLjcsIFwiY2xhc3NcIjogXCJDaGF0T3BlbkFJXCJ9IiwgImRlbGVnYXRpb25fZW5hYmxl
|
||||
ZD8iOiBmYWxzZSwgInRvb2xzX25hbWVzIjogW119XUrtAQoKY3Jld190YXNrcxLeAQrbAVt7Imlk
|
||||
IjogImJlOGZjMWNmLTVjMTYtNDQ4NS04NDZkLTlmNzkxNjcxZmE0NCIsICJkZXNjcmlwdGlvbiI6
|
||||
ICJzYXkgaG93ZHkiLCAiZXhwZWN0ZWRfb3V0cHV0IjogIkhvd2R5ISIsICJhc3luY19leGVjdXRp
|
||||
b24/IjogZmFsc2UsICJodW1hbl9pbnB1dD8iOiBmYWxzZSwgImFnZW50X3JvbGUiOiAiUmVzZWFy
|
||||
Y2hlciIsICJjb250ZXh0IjogbnVsbCwgInRvb2xzX25hbWVzIjogW119XUoqCghwbGF0Zm9ybRIe
|
||||
ChxtYWNPUy0xNC4xLjEtYXJtNjQtYXJtLTY0Yml0ShwKEHBsYXRmb3JtX3JlbGVhc2USCAoGMjMu
|
||||
MS4wShsKD3BsYXRmb3JtX3N5c3RlbRIICgZEYXJ3aW5KewoQcGxhdGZvcm1fdmVyc2lvbhJnCmVE
|
||||
YXJ3aW4gS2VybmVsIFZlcnNpb24gMjMuMS4wOiBNb24gT2N0ICA5IDIxOjI3OjI0IFBEVCAyMDIz
|
||||
OyByb290OnhudS0xMDAwMi40MS45fjYvUkVMRUFTRV9BUk02NF9UNjAwMEoKCgRjcHVzEgIYCnoC
|
||||
GAGFAQABAAAS3gEKEPaxZgbRdGE/aC7TcpTEU3USCO2gTkTtndVwKg5UYXNrIEV4ZWN1dGlvbjAB
|
||||
OSjeZOEMv94XQXgCO+IMv94XSjEKB3Rhc2tfaWQSJgokYmU4ZmMxY2YtNWMxNi00NDg1LTg0NmQt
|
||||
OWY3OTE2NzFmYTQ0SiQKFWZvcm1hdHRlZF9kZXNjcmlwdGlvbhILCglzYXkgaG93ZHlKJQoZZm9y
|
||||
bWF0dGVkX2V4cGVjdGVkX291dHB1dBIICgZIb3dkeSFKEgoGb3V0cHV0EggKBkhvd2R5IXoCGAGF
|
||||
AQABAAAS6AwKEDPpREZrHZXFHl0sOJRtgesSCBN6824xY2RxKgxDcmV3IENyZWF0ZWQwATl40Vfi
|
||||
DL/eF0HQY1niDL/eF0obCg5jcmV3YWlfdmVyc2lvbhIJCgcwLjMwLjExShoKDnB5dGhvbl92ZXJz
|
||||
aW9uEggKBjMuMTEuNUoxCgdjcmV3X2lkEiYKJGU3YjFlNDdjLTM4OTQtNDUyYi1hNzRjLWZiZTU3
|
||||
NDUxOWQxOEocCgxjcmV3X3Byb2Nlc3MSDAoKc2VxdWVudGlhbEoRCgtjcmV3X21lbW9yeRICEABK
|
||||
GgoUY3Jld19udW1iZXJfb2ZfdGFza3MSAhgBShsKFWNyZXdfbnVtYmVyX29mX2FnZW50cxICGAFK
|
||||
0wQKC2NyZXdfYWdlbnRzEsMECsAEW3siaWQiOiAiYzRmZTBkOTYtNzRkMy00MTk4LWI5MDQtYWFi
|
||||
NzBlZGMxYjQ1IiwgInJvbGUiOiAiUmVzZWFyY2hlciIsICJnb2FsIjogIk1ha2UgdGhlIGJlc3Qg
|
||||
cmVzZWFyY2ggYW5kIGFuYWx5c2lzIG9uIGNvbnRlbnQgYWJvdXQgQUkgYW5kIEFJIGFnZW50cyIs
|
||||
ICJiYWNrc3RvcnkiOiAiWW91J3JlIGFuIGV4cGVydCByZXNlYXJjaGVyLCBzcGVjaWFsaXplZCBp
|
||||
biB0ZWNobm9sb2d5LCBzb2Z0d2FyZSBlbmdpbmVlcmluZywgQUkgYW5kIHN0YXJ0dXBzLiBZb3Ug
|
||||
d29yayBhcyBhIGZyZWVsYW5jZXIgYW5kIGlzIG5vdyB3b3JraW5nIG9uIGRvaW5nIHJlc2VhcmNo
|
||||
IGFuZCBhbmFseXNpcyBmb3IgYSBuZXcgY3VzdG9tZXIuIiwgInZlcmJvc2U/IjogZmFsc2UsICJt
|
||||
YXhfaXRlciI6IDI1LCAibWF4X3JwbSI6IG51bGwsICJpMThuIjogbnVsbCwgImxsbSI6ICJ7XCJu
|
||||
YW1lXCI6IG51bGwsIFwibW9kZWxfbmFtZVwiOiBcImdwdC00b1wiLCBcInRlbXBlcmF0dXJlXCI6
|
||||
IDAuNywgXCJjbGFzc1wiOiBcIkNoYXRPcGVuQUlcIn0iLCAiZGVsZWdhdGlvbl9lbmFibGVkPyI6
|
||||
IGZhbHNlLCAidG9vbHNfbmFtZXMiOiBbXX1dSoEECgpjcmV3X3Rhc2tzEvIDCu8DW3siaWQiOiAi
|
||||
YzE2NTkyZjktZDM2Yy00MDFlLWJiMTEtYzdlMGY5ODkxZjA2IiwgImRlc2NyaXB0aW9uIjogIkNv
|
||||
bWUgdXAgd2l0aCBhIGxpc3Qgb2YgNSBpbnRlcmVzdGluZyBpZGVhcyB0byBleHBsb3JlIGZvciBh
|
||||
biBhcnRpY2xlLCB0aGVuIHdyaXRlIG9uZSBhbWF6aW5nIHBhcmFncmFwaCBoaWdobGlnaHQgZm9y
|
||||
IGVhY2ggaWRlYSB0aGF0IHNob3djYXNlcyBob3cgZ29vZCBhbiBhcnRpY2xlIGFib3V0IHRoaXMg
|
||||
dG9waWMgY291bGQgYmUuIFJldHVybiB0aGUgbGlzdCBvZiBpZGVhcyB3aXRoIHRoZWlyIHBhcmFn
|
||||
cmFwaCBhbmQgeW91ciBub3Rlcy4iLCAiZXhwZWN0ZWRfb3V0cHV0IjogIjUgYnVsbGV0IHBvaW50
|
||||
cyB3aXRoIGEgcGFyYWdyYXBoIGZvciBlYWNoIGlkZWEuIiwgImFzeW5jX2V4ZWN1dGlvbj8iOiBm
|
||||
YWxzZSwgImh1bWFuX2lucHV0PyI6IGZhbHNlLCAiYWdlbnRfcm9sZSI6ICJOb25lIiwgImNvbnRl
|
||||
eHQiOiBudWxsLCAidG9vbHNfbmFtZXMiOiBbXX1dSioKCHBsYXRmb3JtEh4KHG1hY09TLTE0LjEu
|
||||
MS1hcm02NC1hcm0tNjRiaXRKHAoQcGxhdGZvcm1fcmVsZWFzZRIICgYyMy4xLjBKGwoPcGxhdGZv
|
||||
cm1fc3lzdGVtEggKBkRhcndpbkp7ChBwbGF0Zm9ybV92ZXJzaW9uEmcKZURhcndpbiBLZXJuZWwg
|
||||
VmVyc2lvbiAyMy4xLjA6IE1vbiBPY3QgIDkgMjE6Mjc6MjQgUERUIDIwMjM7IHJvb3Q6eG51LTEw
|
||||
MDAyLjQxLjl+Ni9SRUxFQVNFX0FSTTY0X1Q2MDAwSgoKBGNwdXMSAhgKegIYAYUBAAEAAA==
headers:
Accept:
- '*/*'
Accept-Encoding:
- gzip, deflate, br
Connection:
- keep-alive
Content-Length:
- '6493'
Content-Type:
- application/x-protobuf
User-Agent:
- OTel-OTLP-Exporter-Python/1.25.0
method: POST
uri: https://telemetry.crewai.com:4319/v1/traces
response:
body:
string: "\n\0"
headers:
Content-Length:
- '2'
Content-Type:
- application/x-protobuf
Date:
- Wed, 03 Jul 2024 15:56:09 GMT
status:
code: 200
message: OK
version: 1
333
tests/cassettes/test_single_task_with_async_execution.yaml
Normal file
@@ -0,0 +1,333 @@
interactions:
- request:
body: '{"messages": [{"content": "You are Researcher. You''re an expert researcher,
specialized in technology, software engineering, AI and startups. You work as
a freelancer and is now working on doing research and analysis for a new customer.\nYour
personal goal is: Make the best research and analysis on content about AI and
AI agentsTo give my best complete final answer to the task use the exact following
format:\n\nThought: I now can give a great answer\nFinal Answer: my best complete
final answer to the task.\nYour final answer must be the great and the most
complete as possible, it must be outcome described.\n\nI MUST use these formats,
my job depends on it!\nCurrent Task: Generate a list of 5 interesting ideas
to explore for an article, where each bulletpoint is under 15 words.\n\nThis
is the expect criteria for your final answer: Bullet point list of 5 important
events. No additional commentary. \n you MUST return the actual complete content
as the final answer, not a summary.\n\nBegin! This is VERY important to you,
use the tools available and give your best Final Answer, your job depends on
it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o", "n": 1, "stop": ["\nObservation"],
"stream": true, "temperature": 0.7}'
headers:
accept:
- application/json
accept-encoding:
- gzip, deflate, br
connection:
- keep-alive
content-length:
- '1237'
content-type:
- application/json
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.34.0
x-stainless-arch:
- arm64
x-stainless-async:
- 'false'
x-stainless-lang:
- python
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.34.0
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
- 3.12.3
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
body:
string: 'data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"Thought"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
I"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
now"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
can"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
give"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
a"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
great"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"Final"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
Answer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":":"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"-"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
The"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
impact"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
agents"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
on"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
remote"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
work"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
productivity"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":".\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"-"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
Ethical"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
considerations"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
in"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"-driven"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
decision"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"-making"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":".\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"-"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
How"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
agents"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
are"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
transforming"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
customer"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
service"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":".\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"-"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
The"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
role"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
of"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
in"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
personalized"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
learning"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
experiences"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":".\n"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"-"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
AI"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
advancements"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
in"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
healthcare"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"
|
||||
diagnostics"},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{"content":"."},"logprobs":null,"finish_reason":null}]}
|
||||
|
||||
|
||||
data: {"id":"chatcmpl-9ce1Nupvw1SEEUL1MxkSS1S2KMYoY","object":"chat.completion.chunk","created":1718997333,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_3e7d703517","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}
|
||||
|
||||
|
||||
data: [DONE]
|
||||
|
||||
|
||||
'
|
||||
    headers:
      CF-Cache-Status:
      - DYNAMIC
      CF-RAY:
      - 897653f3e8ba7ba2-ATL
      Connection:
      - keep-alive
      Content-Type:
      - text/event-stream; charset=utf-8
      Date:
      - Fri, 21 Jun 2024 19:15:33 GMT
      Server:
      - cloudflare
      Set-Cookie:
      - __cf_bm=9ch02HraQXiYJx8jBtYzKXOBjm4nToP.1sBISDFt9Gc-1718997333-1.0.1.1-Ykz1rbMzc2Zo8VV5rBwixPedTuO8s_38psrpuLCSy2B.YIyCCXWMGI_JT5WGQVp2gacOcxjWMSVhOOY85gf9QQ;
        path=/; expires=Fri, 21-Jun-24 19:45:33 GMT; domain=.api.openai.com; HttpOnly;
        Secure; SameSite=None
      - _cfuvid=0srdhmUvYEBaQ2xn7BzySIPRoIiEPWzmvngtQRdnpUY-1718997333518-0.0.1.1-604800000;
        path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
      Transfer-Encoding:
      - chunked
      alt-svc:
      - h3=":443"; ma=86400
      openai-organization:
      - crewai-iuxna1
      openai-processing-ms:
      - '165'
      openai-version:
      - '2020-10-01'
      strict-transport-security:
      - max-age=31536000; includeSubDomains
      x-ratelimit-limit-requests:
      - '10000'
      x-ratelimit-limit-tokens:
      - '12000000'
      x-ratelimit-remaining-requests:
      - '9999'
      x-ratelimit-remaining-tokens:
      - '11999712'
      x-ratelimit-reset-requests:
      - 6ms
      x-ratelimit-reset-tokens:
      - 1ms
      x-request-id:
      - 92f00e3ecc754086e0ddf2d998f6f671
    status:
      code: 200
      message: OK
version: 1
1097
tests/cassettes/test_three_task_with_async_execution.yaml
Normal file
File diff suppressed because it is too large
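The suppressed cassette above backs the `test_three_task_with_async_execution` test. As a rough sketch only (the agent role, task wording, and wiring below are assumptions, not the repository's actual test code), a crew that exercises three tasks with `async_execution` looks roughly like this:

```python
# Hypothetical sketch of a three-task async-execution crew; not the PR's test code.
from crewai import Agent, Crew, Task

researcher = Agent(
    role="Researcher",
    goal="List interesting ideas",
    backstory="You research and summarize topics.",
)

# The first two tasks are marked async; the third consumes their results as context.
task1 = Task(
    description="Give me a list of 5 interesting ideas about AI.",
    expected_output="Bullet point list of 5 ideas.",
    agent=researcher,
    async_execution=True,
)
task2 = Task(
    description="Give me a list of 5 interesting ideas about AI agents.",
    expected_output="Bullet point list of 5 ideas.",
    agent=researcher,
    async_execution=True,
)
task3 = Task(
    description="Combine both lists into a single report.",
    expected_output="A short report.",
    agent=researcher,
    context=[task1, task2],
)

crew = Crew(agents=[researcher], tasks=[task1, task2, task3])
print(crew.kickoff())  # replayed from the recorded cassette when run under pytest-vcr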
@@ -0,0 +1,335 @@
interactions:
- request:
    body: '{"messages": [{"content": "You are Friendly Neighbor. You are the friendly
      neighbor\nYour personal goal is: Make everyone feel welcome\nYou ONLY have access
      to the following tools, and should NEVER make up tools that are not listed here:\n\nDecide
      Greetings() -> str - Decide Greetings() - Decide what is the appropriate greeting
      to use\n\nUse the following format:\n\nThought: you should always think about
      what to do\nAction: the action to take, only one name of [Decide Greetings],
      just the name, exactly as it''s written.\nAction Input: the input to the action,
      just a simple python dictionary, enclosed in curly braces, using \" to wrap
      keys and values.\nObservation: the result of the action\n\nOnce all necessary
      information is gathered:\n\nThought: I now know the final answer\nFinal Answer:
      the final answer to the original input question\n\nCurrent Task: Say an appropriate
      greeting.\n\nThis is the expect criteria for your final answer: The greeting.
      \n you MUST return the actual complete content as the final answer, not a summary.\n\nBegin!
      This is VERY important to you, use the tools available and give your best Final
      Answer, your job depends on it!\n\nThought:\n", "role": "user"}], "model": "gpt-4o",
      "n": 1, "stop": ["\nObservation"], "stream": true, "temperature": 0.7}'
    headers:
      [standard OpenAI/Python 1.35.10 request headers omitted; content-length '1289']
    method: POST
    uri: https://api.openai.com/v1/chat/completions
  response:
    body:
      string: '[streamed chunks omitted; token by token the completion reads "I need
        to decide on an appropriate greeting.\n\nAction: Decide Greetings\nAction
        Input: {}", followed by a chunk with "finish_reason":"stop" and data: [DONE]]'
    headers:
      [OpenAI response headers omitted; HTTP 200 OK, openai-processing-ms '335']
    status:
      code: 200
      message: OK
- request:
    body: '[same prompt as above with "I need to decide on an appropriate greeting.\n\nAction:
      Decide Greetings\nAction Input: {}\n\nObservation: Howdy!\n" appended after the
      trailing "Thought:\n"; same model and parameters, content-length ''1404'']'
    headers:
      [same request headers as above, plus the Cloudflare cookies set by the first
      response]
    method: POST
    uri: https://api.openai.com/v1/chat/completions
  response:
    body:
      string: '[streamed chunks omitted; token by token the completion reads "Thought:
        I now know the final answer.\n\nFinal Answer: Howdy!", followed by a chunk
        with "finish_reason":"stop" and data: [DONE]]'
    headers:
      [OpenAI response headers omitted; HTTP 200 OK, openai-processing-ms '91']
    status:
      code: 200
      message: OK
version: 1
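For orientation, here is a minimal, hypothetical sketch of the kind of crewAI test this cassette would back: an agent with a single `Decide Greetings` tool whose recorded completions end in `Final Answer: Howdy!`. The wiring and names below are assumptions, not code from this PR.

```python
# Hypothetical reconstruction of the test behind this cassette; names are assumptions.
from crewai import Agent, Crew, Task
from langchain.tools import tool


@tool("Decide Greetings")
def decide_greetings() -> str:
    """Decide what is the appropriate greeting to use"""
    return "Howdy!"


friendly_neighbor = Agent(
    role="Friendly Neighbor",
    goal="Make everyone feel welcome",
    backstory="You are the friendly neighbor",
    tools=[decide_greetings],
)

greet = Task(
    description="Say an appropriate greeting.",
    expected_output="The greeting.",
    agent=friendly_neighbor,
)

crew = Crew(agents=[friendly_neighbor], tasks=[greet])
result = crew.kickoff()  # replayed from this cassette when run under pytest-vcr
assert "Howdy!" in str(result)
```

The recorded exchange shows the usual two-step tool loop: the first completion picks the `Decide Greetings` tool, the second folds the `Observation: Howdy!` back into the prompt and returns the final answer.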
File diff suppressed because it is too large
@@ -8,7 +8,7 @@ interactions:
      accept:
      - application/json
      accept-encoding:
      - gzip, deflate, br
      - gzip, deflate
      connection:
      - keep-alive
      content-length:
@@ -18,7 +18,7 @@ interactions:
      host:
      - api.openai.com
      user-agent:
      - OpenAI/Python 1.25.1
      - OpenAI/Python 1.35.10
      x-stainless-arch:
      - arm64
      x-stainless-async:
@@ -28,7 +28,7 @@ interactions:
      x-stainless-os:
      - MacOS
      x-stainless-package-version:
      - 1.25.1
      - 1.35.10
      x-stainless-runtime:
      - CPython
      x-stainless-runtime-version:
@@ -38,137 +38,135 @@ interactions:
  response:
    body:
      string: !!binary |
        [base64-encoded response body omitted (old recording)]
        [base64-encoded response body omitted (new recording)]
    headers:
      CF-Cache-Status:
      - DYNAMIC
      CF-RAY:
      - 88142860ce7a82e2-GIG
      - 89de8402cc8ddab1-MIA
      Connection:
      - keep-alive
      Content-Encoding:
      - br
      - gzip
      Content-Type:
      - application/json
      Date:
      - Thu, 09 May 2024 19:39:49 GMT
      - Thu, 04 Jul 2024 10:43:40 GMT
      Server:
      - cloudflare
      Set-Cookie:
      - __cf_bm=JUH9MsoOBdCquA9K9qOgVbrLUkBpuF4d2k7EvjnBOLg-1715283589-1.0.1.1-96.yHqUmYzRiWC7xZ4.TPKu5RqvB1EFecasHfb_ix62oXcsw_Sp.gz0gDk4Inl119IxRMINM613CI5HheoJM2A;
        path=/; expires=Thu, 09-May-24 20:09:49 GMT; domain=.api.openai.com; HttpOnly;
      - __cf_bm=4yTFGytKZgiKE08enzMel3PhLQLu9mwQY7gk_lH43Bc-1720089820-1.0.1.1-4V06JbYY1zuYkbV7SSudBUPyqHqm6yFze0ufE2rHtJTtOmU.XLh_k1M9h.a.oVJGC44GhO25bv5s5224Ic.p6w;
        path=/; expires=Thu, 04-Jul-24 11:13:40 GMT; domain=.api.openai.com; HttpOnly;
        Secure; SameSite=None
      - _cfuvid=DPNTnqcoTitPoe12n1vvVltuYsIzavX_dftsSEhQQcc-1715283589489-0.0.1.1-604800000;
      - _cfuvid=04ERWHryLOXBKqyH.MpOILpw8OtW9Tchzbutsc9OJnE-1720089820726-0.0.1.1-604800000;
        path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
      Transfer-Encoding:
      - chunked
@@ -181,210 +179,207 @@ interactions:
      openai-organization:
      - crewai-iuxna1
      openai-processing-ms:
      - '15'
      openai-version:
      - '2020-10-01'
      strict-transport-security:
      - max-age=15724800; includeSubDomains
      x-ratelimit-limit-requests:
      - '10000'
      x-ratelimit-limit-tokens:
      - '10000000'
      x-ratelimit-remaining-requests:
      - '9999'
      x-ratelimit-remaining-tokens:
      - '9999946'
      x-ratelimit-reset-requests:
      - 6ms
      x-ratelimit-reset-tokens:
      - 0s
      x-request-id:
      - req_ace5babe6674a08d07fce90792eef7eb
    status:
      code: 200
      message: OK
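The second interaction in this hunk (below) records an embedding request for the literal input "test value" against `text-embedding-ada-002`, the kind of call crewAI's memory storage makes when it saves an entry. A minimal, hypothetical equivalent with the OpenAI Python client (not repository code) would be:

```python
# Hypothetical illustration of the embeddings call this cassette records.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
response = client.embeddings.create(
    input=["test value"],
    model="text-embedding-ada-002",
)
vector = response.data[0].embedding  # list of floats for "test value"
print(len(vector))  # 1536 dimensions for ada-002
```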
- request:
    body: '{"input": ["test value"], "model": "text-embedding-ada-002", "encoding_format":
      "base64"}'
    headers:
      accept:
      - application/json
      accept-encoding:
      - gzip, deflate, br
      connection:
      - keep-alive
      content-length:
      - '89'
      content-type:
      - application/json
      cookie:
      - __cf_bm=JUH9MsoOBdCquA9K9qOgVbrLUkBpuF4d2k7EvjnBOLg-1715283589-1.0.1.1-96.yHqUmYzRiWC7xZ4.TPKu5RqvB1EFecasHfb_ix62oXcsw_Sp.gz0gDk4Inl119IxRMINM613CI5HheoJM2A;
        _cfuvid=DPNTnqcoTitPoe12n1vvVltuYsIzavX_dftsSEhQQcc-1715283589489-0.0.1.1-604800000
      host:
      - api.openai.com
      user-agent:
      - OpenAI/Python 1.25.1
      x-stainless-arch:
      - arm64
      x-stainless-async:
      - 'false'
      x-stainless-lang:
      - python
      x-stainless-os:
      - MacOS
      x-stainless-package-version:
      - 1.25.1
      x-stainless-runtime:
      - CPython
      x-stainless-runtime-version:
      - 3.11.7
    method: POST
    uri: https://api.openai.com/v1/embeddings
  response:
    body:
      string: !!binary |
        [base64-encoded embedding response body omitted]
|
||||
1sN7fis+sCq0TlkWLGmSPGAjgIwo8xYGpPd9Bz6/0MIes8/NMm8XC9YHhUPNFPgmL9haD07XZMVB
|
||||
lbfmfGnsFHTa8MFh6m8TvflhDub0smHzXRGwV0VEYCndG+pNTpasNFdS+EoyHWtT8GZL8qscwF3H
|
||||
FKksbYOfgpEGbbLNaG3CX0NP9zRWtUs+Yudl4Gb1X6oGnst2ok6bewHb1HcORLeNkVhBbTryU9vC
|
||||
63qTieRfHZO1cq1A8GpCtPlOn0xpqYXwZ3QW1o8yTAhrJAt+DQ/g0y8Kms2Rkh2Wg+sTbiL91E9h
|
||||
YcATPsU0zWwHrKIrSPAOdgMjPCsTsaLhAWaJ03EQfKdgU8J8Bt/Pw6TOQuqEdMEiQmvI79gDVsto
|
||||
uq0ERO4a4udZpICcD/cZHkB7IjC3jGbVrIujzr9MxdrhPjJyGu0W2AMcyFsf+oKpZCRQtS2bYgKS
|
||||
aRsePwR+7CxirfwpgB5I/lbRsZfxySmuAVEToIEyVN6I659JszqsbMFzmO7YPH97MMjbm1PhashY
|
||||
Ay+ZTbcXL8Hbg7hE0C0PiPWutWA+kQBtR/I1mXb7Emg96xPGMYmTfT1rCAbr+sQ5dJNmC6JrDHg9
|
||||
tRHdfje2EmvpQbdKN8LsgU4rv4IVmPVcYFfrxmJVdqWG3vU0YuP35qfdjPYeBFlyoi5JHubmt3wI
|
||||
OufVYWtMIrZ+R41TVV/2qC7YXrJb0fAANvdYqZXUN0aD4piDf//5+/v7+99//v7+n//65+/v3354
|
||||
Vt2///n7d6m25b+rvqiez/r7/u/H8/HfPC/++1///P39S+bHu/r3P3//88/f39/fv79p6H/L/7cM
|
||||
bfWd//3Pn/hf//z9/f39uwzLo/v/lqGtvvO///kT//n7+99//vcfAwM=
|
||||
headers:
CF-Cache-Status:
- DYNAMIC
CF-RAY:
- 88142862e9f182e2-GIG
Connection:
- keep-alive
Content-Encoding:
- br
Content-Type:
- application/json
Date:
- Thu, 09 May 2024 19:39:49 GMT
Server:
- cloudflare
Transfer-Encoding:
- chunked
access-control-allow-origin:
- '*'
alt-svc:
- h3=":443"; ma=86400
openai-model:
- text-embedding-ada-002
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '22'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=15724800; includeSubDomains
- max-age=31536000; includeSubDomains
x-ratelimit-limit-requests:
- '10000'
x-ratelimit-limit-tokens:
- '10000000'
x-ratelimit-remaining-requests:
- '9999'
x-ratelimit-remaining-tokens:
- '9999946'
x-ratelimit-reset-requests:
- 6ms
x-ratelimit-reset-tokens:
- 0s
x-request-id:
- req_49a762eade78fd2313d74a32fa22ead7
status:
code: 200
message: OK
- request:
body: '{"input": ["test value"], "model": "text-embedding-ada-002", "encoding_format":
"base64"}'
headers:
accept:
- application/json
accept-encoding:
- gzip, deflate
connection:
- keep-alive
content-length:
- '89'
content-type:
- application/json
cookie:
- __cf_bm=4yTFGytKZgiKE08enzMel3PhLQLu9mwQY7gk_lH43Bc-1720089820-1.0.1.1-4V06JbYY1zuYkbV7SSudBUPyqHqm6yFze0ufE2rHtJTtOmU.XLh_k1M9h.a.oVJGC44GhO25bv5s5224Ic.p6w;
_cfuvid=04ERWHryLOXBKqyH.MpOILpw8OtW9Tchzbutsc9OJnE-1720089820726-0.0.1.1-604800000
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.35.10
x-stainless-arch:
- arm64
x-stainless-async:
- 'false'
x-stainless-lang:
- python
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.35.10
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
- 3.11.7
method: POST
uri: https://api.openai.com/v1/embeddings
response:
body:
string: !!binary |
[base64-encoded embedding response payload omitted]
headers:
CF-Cache-Status:
- DYNAMIC
CF-RAY:
- 89de84040d6cdab1-MIA
Connection:
- keep-alive
Content-Encoding:
- gzip
Content-Type:
- application/json
Date:
- Thu, 04 Jul 2024 10:43:40 GMT
Server:
- cloudflare
Transfer-Encoding:
- chunked
access-control-allow-origin:
- '*'
alt-svc:
- h3=":443"; ma=86400
openai-model:
- text-embedding-ada-002
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '15'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=31536000; includeSubDomains
x-ratelimit-limit-requests:
- '10000'
x-ratelimit-limit-tokens:
@@ -398,7 +393,7 @@ interactions:
x-ratelimit-reset-tokens:
- 0s
x-request-id:
- req_834c84237ace9a79492cb9e8d1f68737
- req_c684605ab95c8ee822c1b082fc95d416
status:
code: 200
message: OK
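The recorded request/response pairs above are VCR-style cassette fixtures (note the `interactions:` key and the base64-encoded embedding payloads). The test changes below replay them through the `@pytest.mark.vcr(filter_headers=["authorization"])` marker rather than calling the OpenAI API live. A minimal sketch of that pattern, assuming the pytest VCR plugin resolves a cassette by test name; the test name here is illustrative, while `ScoreOutput`, `output_json`, `Process.sequential`, and the agent/task fields are taken from the tests in this diff:

```python
import pytest
from pydantic import BaseModel

from crewai import Agent, Crew, Process, Task


class ScoreOutput(BaseModel):
    score: int


# The vcr marker replays the recorded HTTP interactions from the cassette and
# drops the Authorization header, so no real API key is required in CI.
@pytest.mark.vcr(filter_headers=["authorization"])
def test_scoring_against_recorded_responses():  # illustrative test name
    scorer = Agent(
        role="Scorer",
        goal="Score the title",
        backstory="You're an expert scorer, specialized in scoring titles.",
        allow_delegation=False,
    )
    task = Task(
        description="Give me an integer score between 1-5 for the following title: 'The impact of AI in the future of work'",
        expected_output="The score of the title.",
        output_json=ScoreOutput,
        agent=scorer,
    )
    crew = Crew(agents=[scorer], tasks=[task], process=Process.sequential)

    # LLM and embedding calls are served from the cassette, not the network.
    result = crew.kickoff()
    assert result.to_dict() == {"score": 4}
```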
@@ -9,6 +9,7 @@ from pydantic_core import ValidationError

from crewai import Agent, Crew, Process, Task
from crewai.tasks.task_output import TaskOutput
from crewai.utilities.converter import Converter


def test_task_tool_reflect_agent_tools():
@@ -80,7 +81,7 @@ def test_task_prompt_includes_expected_output():

with patch.object(Agent, "execute_task") as execute:
execute.return_value = "ok"
task.execute()
task.execute_sync()
execute.assert_called_once_with(task=task, context=None, tools=[])


@@ -103,11 +104,13 @@ def test_task_callback():

with patch.object(Agent, "execute_task") as execute:
execute.return_value = "ok"
task.execute()
task.execute_sync()
task_completed.assert_called_once_with(task.output)


def test_task_callback_returns_task_ouput():
from crewai.tasks.output_format import OutputFormat

researcher = Agent(
role="Researcher",
goal="Make the best research and analysis on content about AI and AI agents",
@@ -126,7 +129,7 @@ def test_task_callback_returns_task_ouput():

with patch.object(Agent, "execute_task") as execute:
execute.return_value = "exported_ok"
task.execute()
task.execute_sync()
# Ensure the callback is called with a TaskOutput object serialized to JSON
task_completed.assert_called_once()
callback_data = task_completed.call_args[0][0]
@@ -139,10 +142,12 @@ def test_task_callback_returns_task_ouput():
output_dict = json.loads(callback_data)
expected_output = {
"description": task.description,
"exported_output": "exported_ok",
"raw_output": "exported_ok",
"raw": "exported_ok",
"pydantic": None,
"json_dict": None,
"agent": researcher.role,
"summary": "Give me a list of 5 interesting ideas to explore...",
"output_format": OutputFormat.RAW,
}
assert output_dict == expected_output


@@ -161,7 +166,7 @@ def test_execute_with_agent():
)

with patch.object(Agent, "execute_task", return_value="ok") as execute:
task.execute(agent=researcher)
task.execute_sync(agent=researcher)
execute.assert_called_once_with(task=task, context=None, tools=[])


@@ -181,7 +186,7 @@ def test_async_execution():
)

with patch.object(Agent, "execute_task", return_value="ok") as execute:
task.execute(agent=researcher)
task.execute_async(agent=researcher)
execute.assert_called_once_with(task=task, context=None, tools=[])


@@ -199,7 +204,7 @@ def test_multiple_output_type_error():


@pytest.mark.vcr(filter_headers=["authorization"])
def test_output_pydantic():
def test_output_pydantic_sequential():
class ScoreOutput(BaseModel):
score: int

@@ -217,13 +222,46 @@ def test_output_pydantic():
agent=scorer,
)

crew = Crew(agents=[scorer], tasks=[task])
crew = Crew(agents=[scorer], tasks=[task], process=Process.sequential)
result = crew.kickoff()
assert isinstance(result, ScoreOutput)
assert isinstance(result.pydantic, ScoreOutput)
assert result.to_dict() == {"score": 4}


@pytest.mark.vcr(filter_headers=["authorization"])
def test_output_json():
def test_output_pydantic_hierarchical():
from langchain_openai import ChatOpenAI

class ScoreOutput(BaseModel):
score: int

scorer = Agent(
role="Scorer",
goal="Score the title",
backstory="You're an expert scorer, specialized in scoring titles.",
allow_delegation=False,
)

task = Task(
description="Give me an integer score between 1-5 for the following title: 'The impact of AI in the future of work'",
expected_output="The score of the title.",
output_pydantic=ScoreOutput,
agent=scorer,
)

crew = Crew(
agents=[scorer],
tasks=[task],
process=Process.hierarchical,
manager_llm=ChatOpenAI(model="gpt-4o"),
)
result = crew.kickoff()
assert isinstance(result.pydantic, ScoreOutput)
assert result.to_dict() == {"score": 4}


@pytest.mark.vcr(filter_headers=["authorization"])
def test_output_json_sequential():
class ScoreOutput(BaseModel):
score: int

@@ -241,9 +279,126 @@ def test_output_json():
agent=scorer,
)

crew = Crew(agents=[scorer], tasks=[task])
crew = Crew(agents=[scorer], tasks=[task], process=Process.sequential)
result = crew.kickoff()
assert '{\n "score": 4\n}' == result
assert '{"score": 4}' == result.json
assert result.to_dict() == {"score": 4}


@pytest.mark.vcr(filter_headers=["authorization"])
def test_output_json_hierarchical():
from langchain_openai import ChatOpenAI

class ScoreOutput(BaseModel):
score: int

scorer = Agent(
role="Scorer",
goal="Score the title",
backstory="You're an expert scorer, specialized in scoring titles.",
allow_delegation=False,
)

task = Task(
description="Give me an integer score between 1-5 for the following title: 'The impact of AI in the future of work'",
expected_output="The score of the title.",
output_json=ScoreOutput,
agent=scorer,
)

crew = Crew(
agents=[scorer],
tasks=[task],
process=Process.hierarchical,
manager_llm=ChatOpenAI(model="gpt-4o"),
)
result = crew.kickoff()
assert '{"score": 4}' == result.json
assert result.to_dict() == {"score": 4}


def test_json_property_without_output_json():
class ScoreOutput(BaseModel):
score: int

scorer = Agent(
role="Scorer",
goal="Score the title",
backstory="You're an expert scorer, specialized in scoring titles.",
allow_delegation=False,
)

task = Task(
description="Give me an integer score between 1-5 for the following title: 'The impact of AI in the future of work'",
expected_output="The score of the title.",
output_pydantic=ScoreOutput, # Using output_pydantic instead of output_json
agent=scorer,
)

crew = Crew(agents=[scorer], tasks=[task], process=Process.sequential)
result = crew.kickoff()

with pytest.raises(ValueError) as excinfo:
_ = result.json # Attempt to access the json property

assert "No JSON output found in the final task." in str(excinfo.value)


@pytest.mark.vcr(filter_headers=["authorization"])
def test_output_json_dict_sequential():
class ScoreOutput(BaseModel):
score: int

scorer = Agent(
role="Scorer",
goal="Score the title",
backstory="You're an expert scorer, specialized in scoring titles.",
allow_delegation=False,
)

task = Task(
description="Give me an integer score between 1-5 for the following title: 'The impact of AI in the future of work'",
expected_output="The score of the title.",
output_json=ScoreOutput,
agent=scorer,
)

crew = Crew(agents=[scorer], tasks=[task], process=Process.sequential)
result = crew.kickoff()
assert {"score": 4} == result.json_dict
assert result.to_dict() == {"score": 4}


@pytest.mark.vcr(filter_headers=["authorization"])
def test_output_json_dict_hierarchical():
from langchain_openai import ChatOpenAI

class ScoreOutput(BaseModel):
score: int

scorer = Agent(
role="Scorer",
goal="Score the title",
backstory="You're an expert scorer, specialized in scoring titles.",
allow_delegation=False,
)

task = Task(
description="Give me an integer score between 1-5 for the following title: 'The impact of AI in the future of work'",
expected_output="The score of the title.",
output_json=ScoreOutput,
agent=scorer,
)

crew = Crew(
agents=[scorer],
tasks=[task],
process=Process.hierarchical,
manager_llm=ChatOpenAI(model="gpt-4o"),
)
result = crew.kickoff()
assert {"score": 4} == result.json_dict
assert result.to_dict() == {"score": 4}


@pytest.mark.vcr(filter_headers=["authorization"])
@@ -279,7 +434,11 @@ def test_output_pydantic_to_another_task():

crew = Crew(agents=[scorer], tasks=[task1, task2], verbose=2)
result = crew.kickoff()
assert 5 == result.score
pydantic_result = result.pydantic
assert isinstance(
pydantic_result, ScoreOutput
), "Expected pydantic result to be of type ScoreOutput"
assert 5 == pydantic_result.score


@pytest.mark.vcr(filter_headers=["authorization"])
@@ -310,7 +469,7 @@ def test_output_json_to_another_task():

crew = Crew(agents=[scorer], tasks=[task1, task2])
result = crew.kickoff()
assert '{\n "score": 3\n}' == result
assert '{"score": 5}' == result.json


@pytest.mark.vcr(filter_headers=["authorization"])
@@ -362,7 +521,9 @@ def test_save_task_json_output():
with patch.object(Task, "_save_file") as save_file:
save_file.return_value = None
crew.kickoff()
save_file.assert_called_once_with('{\n "score": 4\n}')
save_file.assert_called_once_with(
{"score": 4}
) # TODO: @Joao, should this be a dict or a json string?


@pytest.mark.vcr(filter_headers=["authorization"])
@@ -393,6 +554,38 @@ def test_save_task_pydantic_output():
save_file.assert_called_once_with('{"score":4}')


@pytest.mark.vcr(filter_headers=["authorization"])
def test_custom_converter_cls():
class ScoreOutput(BaseModel):
score: int

class ScoreConverter(Converter):
pass

scorer = Agent(
role="Scorer",
goal="Score the title",
backstory="You're an expert scorer, specialized in scoring titles.",
allow_delegation=False,
)

task = Task(
description="Give me an integer score between 1-5 for the following title: 'The impact of AI in the future of work'",
expected_output="The score of the title.",
output_pydantic=ScoreOutput,
converter_cls=ScoreConverter,
agent=scorer,
)

crew = Crew(agents=[scorer], tasks=[task])

with patch.object(
ScoreConverter, "to_pydantic", return_value=ScoreOutput(score=5)
) as mock_to_pydantic:
crew.kickoff()
mock_to_pydantic.assert_called_once()


@pytest.mark.vcr(filter_headers=["authorization"])
def test_increment_delegations_for_hierarchical_process():
from langchain_openai import ChatOpenAI
@@ -413,31 +606,29 @@ def test_increment_delegations_for_hierarchical_process():
agents=[scorer],
tasks=[task],
process=Process.hierarchical,
manager_llm=ChatOpenAI(model="gpt-4-0125-preview"),
manager_llm=ChatOpenAI(model="gpt-4o"),
)

with patch.object(Task, "increment_delegations") as increment_delegations:
increment_delegations.return_value = None
crew.kickoff()
increment_delegations.assert_called_once
increment_delegations.assert_called_once()


@pytest.mark.vcr(filter_headers=["authorization"])
def test_increment_delegations_for_sequential_process():
pass

manager = Agent(
role="Manager",
goal="Coordinate scoring processes",
backstory="You're great at delegating work about scoring.",
allow_delegation=False,
allow_delegation=True,
)

scorer = Agent(
role="Scorer",
goal="Score the title",
backstory="You're an expert scorer, specialized in scoring titles.",
allow_delegation=False,
allow_delegation=True,
)

task = Task(
@@ -455,7 +646,7 @@ def test_increment_delegations_for_sequential_process():
with patch.object(Task, "increment_delegations") as increment_delegations:
increment_delegations.return_value = None
crew.kickoff()
increment_delegations.assert_called_once
increment_delegations.assert_called_once()


@pytest.mark.vcr(filter_headers=["authorization"])
@@ -490,7 +681,7 @@ def test_increment_tool_errors():
with patch.object(Task, "increment_tools_errors") as increment_tools_errors:
increment_tools_errors.return_value = None
crew.kickoff()
increment_tools_errors.assert_called_once
assert len(increment_tools_errors.mock_calls) == 3


def test_task_definition_based_on_dict():
@@ -525,3 +716,80 @@ def test_interpolate_inputs():
== "Give me a list of 5 interesting ideas about ML to explore for an article, what makes them unique and interesting."
)
assert task.expected_output == "Bullet point list of 5 interesting ideas about ML."


def test_task_output_str_with_pydantic():
from crewai.tasks.output_format import OutputFormat

class ScoreOutput(BaseModel):
score: int

score_output = ScoreOutput(score=4)
task_output = TaskOutput(
description="Test task",
agent="Test Agent",
pydantic=score_output,
output_format=OutputFormat.PYDANTIC,
)

assert str(task_output) == str(score_output)


def test_task_output_str_with_json_dict():
from crewai.tasks.output_format import OutputFormat

json_dict = {"score": 4}
task_output = TaskOutput(
description="Test task",
agent="Test Agent",
json_dict=json_dict,
output_format=OutputFormat.JSON,
)

assert str(task_output) == str(json_dict)


def test_task_output_str_with_raw():
from crewai.tasks.output_format import OutputFormat

raw_output = "Raw task output"
task_output = TaskOutput(
description="Test task",
agent="Test Agent",
raw=raw_output,
output_format=OutputFormat.RAW,
)

assert str(task_output) == raw_output


def test_task_output_str_with_pydantic_and_json_dict():
from crewai.tasks.output_format import OutputFormat

class ScoreOutput(BaseModel):
score: int

score_output = ScoreOutput(score=4)
json_dict = {"score": 4}
task_output = TaskOutput(
description="Test task",
agent="Test Agent",
pydantic=score_output,
json_dict=json_dict,
output_format=OutputFormat.PYDANTIC,
)

# When both pydantic and json_dict are present, pydantic should take precedence
assert str(task_output) == str(score_output)


def test_task_output_str_with_none():
from crewai.tasks.output_format import OutputFormat

task_output = TaskOutput(
description="Test task",
agent="Test Agent",
output_format=OutputFormat.RAW,
)

assert str(task_output) == ""

@@ -17,6 +17,10 @@ class TestPickleHandler(unittest.TestCase):
os.remove(self.file_path)

def test_initialize_file(self):
assert os.path.exists(self.file_path) is False

self.handler.initialize_file()

assert os.path.exists(self.file_path) is True
assert os.path.getsize(self.file_path) >= 0