Mirror of https://github.com/crewAIInc/crewAI.git (synced 2025-12-17 12:58:31 +00:00)

Compare commits: feat/add-i ... knowledge (88 commits)
Commits in this comparison: c0ad4576e2 through 9b142e580b (88 total).
.github/workflows/security-checker.yml (vendored): 2 lines changed
@@ -19,5 +19,5 @@ jobs:
        run: pip install bandit

      - name: Run Bandit
-       run: bandit -c pyproject.toml -r src/ -lll
+       run: bandit -c pyproject.toml -r src/ -ll
.github/workflows/tests.yml (vendored): 4 lines changed
@@ -26,7 +26,7 @@ jobs:
        run: uv python install 3.11.9

      - name: Install the project
-       run: uv sync --dev
+       run: uv sync --dev --all-extras

      - name: Run tests
-       run: uv run pytest tests
+       run: uv run pytest tests -vv
.gitignore (vendored): 4 lines changed
@@ -17,3 +17,7 @@ rc-tests/*
 temp/*
 .vscode/*
 crew_tasks_output.json
+.codesight
+.mypy_cache
+.ruff_cache
+.venv
@@ -22,7 +22,8 @@ A crew in crewAI represents a collaborative group of agents working together to
 | **Max RPM** _(optional)_ | `max_rpm` | Maximum requests per minute the crew adheres to during execution. Defaults to `None`. |
 | **Language** _(optional)_ | `language` | Language used for the crew, defaults to English. |
 | **Language File** _(optional)_ | `language_file` | Path to the language file to be used for the crew. |
-| **Memory** _(optional)_ | `memory` | Utilized for storing execution memories (short-term, long-term, entity memory). Defaults to `False`. |
+| **Memory** _(optional)_ | `memory` | Utilized for storing execution memories (short-term, long-term, entity memory). |
+| **Memory Config** _(optional)_ | `memory_config` | Configuration for the memory provider to be used by the crew. |
 | **Cache** _(optional)_ | `cache` | Specifies whether to use a cache for storing the results of tools' execution. Defaults to `True`. |
 | **Embedder** _(optional)_ | `embedder` | Configuration for the embedder to be used by the crew. Mostly used by memory for now. Default is `{"provider": "openai"}`. |
 | **Full Output** _(optional)_ | `full_output` | Whether the crew should return the full output with all tasks outputs or just the final output. Defaults to `False`. |
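The `memory` and new `memory_config` attributes are intended to be used together. A minimal sketch (provider settings are illustrative and mirror the memory documentation later in this change):

```python Code
from crewai import Crew, Process

# Agents and tasks are defined elsewhere; memory_config values are illustrative.
crew = Crew(
    agents=[...],
    tasks=[...],
    process=Process.sequential,
    memory=True,
    memory_config={
        "provider": "mem0",
        "config": {"user_id": "john"},
    },
)
```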
docs/concepts/knowledge.mdx (new file): 75 lines
@@ -0,0 +1,75 @@
---
title: Knowledge
description: What is knowledge in CrewAI and how to use it.
icon: book
---

# Using Knowledge in CrewAI

## Introduction

The Knowledge class in CrewAI provides a powerful way to manage and query knowledge sources for your AI agents. This guide shows you how to implement knowledge management in your CrewAI projects.
Additionally, there are dedicated source classes for generating knowledge sources from strings, text files, PDFs, and spreadsheets. You can support any other source type by extending the `KnowledgeSource` class.

## Basic Implementation

Here's a simple example of how to use the Knowledge class:

```python
from crewai import Agent, Task, Crew, Process, LLM
from crewai.knowledge.source.string_knowledge_source import StringKnowledgeSource

# Create a knowledge source
content = "User's name is John. He is 30 years old and lives in San Francisco."
string_source = StringKnowledgeSource(
    content=content, metadata={"preference": "personal"}
)

llm = LLM(model="gpt-4o-mini", temperature=0)

# Create an agent with the knowledge store
agent = Agent(
    role="About User",
    goal="You know everything about the user.",
    backstory="""You are a master at understanding people and their preferences.""",
    verbose=True,
    allow_delegation=False,
    llm=llm,
)
task = Task(
    description="Answer the following questions about the user: {question}",
    expected_output="An answer to the question.",
    agent=agent,
)

crew = Crew(
    agents=[agent],
    tasks=[task],
    verbose=True,
    process=Process.sequential,
    # Enable knowledge by adding the sources here; you can add more sources to the list.
    knowledge={"sources": [string_source], "metadata": {"preference": "personal"}},
)

result = crew.kickoff(inputs={"question": "What city does John live in and how old is he?"})
```

## Embedder Configuration

You can also configure the embedder for the knowledge store. This is useful if you want the knowledge store to use a different embedder than the one used by the agents.

```python
...
string_source = StringKnowledgeSource(
    content="User's name is John. He is 30 years old and lives in San Francisco.",
    metadata={"preference": "personal"}
)
crew = Crew(
    ...
    knowledge={
        "sources": [string_source],
        "metadata": {"preference": "personal"},
        "embedder_config": {"provider": "openai", "config": {"model": "text-embedding-3-small"}},
    },
)
```
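The same pattern extends to file-backed sources. A minimal sketch, assuming a `TextFileKnowledgeSource` class analogous to `StringKnowledgeSource` (the import path, constructor arguments, and file path below are assumptions; check your installed version):

```python
from crewai import Crew, Process
# Assumed import path for a file-backed source; verify against your crewai version.
from crewai.knowledge.source.text_file_knowledge_source import TextFileKnowledgeSource

# Load a local text file as a knowledge source (path and metadata are illustrative).
text_source = TextFileKnowledgeSource(
    file_path="knowledge/user_preferences.txt",
    metadata={"preference": "personal"},
)

crew = Crew(
    agents=[...],   # same agent/task setup as the basic example above
    tasks=[...],
    process=Process.sequential,
    knowledge={"sources": [text_source], "metadata": {"preference": "personal"}},
)
```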
@@ -25,7 +25,102 @@ By default, CrewAI uses the `gpt-4o-mini` model. It uses environment variables i
- `OPENAI_API_BASE`
- `OPENAI_API_KEY`

-### 2. Custom LLM Objects
+### 2. Updating YAML files

You can update the `agents.yml` file to refer to the LLM you want to use:

```yaml Code
researcher:
    role: Research Specialist
    goal: Conduct comprehensive research and analysis to gather relevant information,
      synthesize findings, and produce well-documented insights.
    backstory: A dedicated research professional with years of experience in academic
      investigation, literature review, and data analysis, known for thorough and
      methodical approaches to complex research questions.
    verbose: true
    llm: openai/gpt-4o
    # llm: azure/gpt-4o-mini
    # llm: gemini/gemini-pro
    # llm: anthropic/claude-3-5-sonnet-20240620
    # llm: bedrock/anthropic.claude-3-sonnet-20240229-v1:0
    # llm: mistral/mistral-large-latest
    # llm: ollama/llama3:70b
    # llm: groq/llama-3.2-90b-vision-preview
    # llm: watsonx/meta-llama/llama-3-1-70b-instruct
    # llm: nvidia_nim/meta/llama3-70b-instruct
    # llm: sambanova/Meta-Llama-3.1-8B-Instruct
    # ...
```

Keep in mind that, depending on the model you use, you will need to set certain environment variables for the credentials, or configure a custom LLM object as described below.
Here are the required environment variables for some of the common LLM integrations:

<AccordionGroup>
<Accordion title="OpenAI">
```python Code
OPENAI_API_KEY=<your-api-key>
OPENAI_API_BASE=<optional-custom-base-url>
OPENAI_MODEL_NAME=<openai-model-name>
OPENAI_ORGANIZATION=<your-org-id> # OPTIONAL
OPENAI_API_BASE=<openai-api-base> # OPTIONAL
```
</Accordion>

<Accordion title="Anthropic">
```python Code
ANTHROPIC_API_KEY=<your-api-key>
```
</Accordion>

<Accordion title="Google">
```python Code
GEMINI_API_KEY=<your-api-key>
```
</Accordion>

<Accordion title="Azure">
```python Code
AZURE_API_KEY=<your-api-key> # "my-azure-api-key"
AZURE_API_BASE=<your-resource-url> # "https://example-endpoint.openai.azure.com"
AZURE_API_VERSION=<api-version> # "2023-05-15"
AZURE_AD_TOKEN=<your-azure-ad-token> # Optional
AZURE_API_TYPE=<your-azure-api-type> # Optional
```
</Accordion>

<Accordion title="AWS Bedrock">
```python Code
AWS_ACCESS_KEY_ID=<your-access-key>
AWS_SECRET_ACCESS_KEY=<your-secret-key>
AWS_DEFAULT_REGION=<your-region>
```
</Accordion>

<Accordion title="Mistral">
```python Code
MISTRAL_API_KEY=<your-api-key>
```
</Accordion>

<Accordion title="Groq">
```python Code
GROQ_API_KEY=<your-api-key>
```
</Accordion>

<Accordion title="IBM watsonx.ai">
```python Code
WATSONX_URL=<your-url> # (required) Base URL of your WatsonX instance
WATSONX_APIKEY=<your-apikey> # (required) IBM cloud API key
WATSONX_TOKEN=<your-token> # (required) IAM auth token (alternative to APIKEY)
WATSONX_PROJECT_ID=<your-project-id> # (optional) Project ID of your WatsonX instance
WATSONX_DEPLOYMENT_SPACE_ID=<your-space-id> # (optional) ID of deployment space for deployed models
```
</Accordion>
</AccordionGroup>

### 3. Custom LLM Objects

Pass a custom LLM implementation or object from another library.
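For instance, a minimal sketch of building an `LLM` object explicitly and handing it to an agent (values are illustrative):

```python Code
from crewai import Agent, LLM

# Build the LLM explicitly instead of relying on environment-variable defaults.
llm = LLM(
    model="openai/gpt-4o",       # provider-prefixed model name
    temperature=0.2,             # illustrative parameter
    api_key="your-api-key-here",
)

agent = Agent(
    role="Research Specialist",
    goal="Conduct comprehensive research and analysis.",
    backstory="A dedicated research professional.",
    llm=llm,
)
```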
@@ -102,7 +197,7 @@ When configuring an LLM for your agent, you have access to a wide range of param

These are examples of how to configure LLMs for your agent.

<AccordionGroup>
<Accordion title="OpenAI">

```python Code
@@ -133,10 +228,10 @@ These are examples of how to configure LLMs for your agent.
    model="cerebras/llama-3.1-70b",
    api_key="your-api-key-here"
)
agent = Agent(llm=llm, ...)
```
</Accordion>

<Accordion title="Ollama (Local LLMs)">

CrewAI supports using Ollama for running open-source models locally:
@@ -150,7 +245,7 @@ These are examples of how to configure LLMs for your agent.

agent = Agent(
    llm=LLM(
        model="ollama/llama3.1",
        base_url="http://localhost:11434"
    ),
    ...
@@ -164,7 +259,7 @@ These are examples of how to configure LLMs for your agent.
from crewai import LLM

llm = LLM(
    model="groq/llama3-8b-8192",
    api_key="your-api-key-here"
)
agent = Agent(llm=llm, ...)
@@ -189,7 +284,7 @@ These are examples of how to configure LLMs for your agent.
from crewai import LLM

llm = LLM(
    model="fireworks_ai/accounts/fireworks/models/llama-v3-70b-instruct",
    api_key="your-api-key-here"
)
agent = Agent(llm=llm, ...)
@@ -224,6 +319,29 @@ These are examples of how to configure LLMs for your agent.
</Accordion>

<Accordion title="IBM watsonx.ai">
You can use IBM Watson by setting the following environment variables:

```python Code
WATSONX_URL=<your-url>
WATSONX_APIKEY=<your-apikey>
WATSONX_PROJECT_ID=<your-project-id>
```

You can then define your agents' LLMs by updating `agents.yml`:

```yaml Code
researcher:
    role: Research Specialist
    goal: Conduct comprehensive research and analysis to gather relevant information,
      synthesize findings, and produce well-documented insights.
    backstory: A dedicated research professional with years of experience in academic
      investigation, literature review, and data analysis, known for thorough and
      methodical approaches to complex research questions.
    verbose: true
    llm: watsonx/meta-llama/llama-3-1-70b-instruct
```

You can also set up agents more dynamically with a base-level LLM instance, like below:

```python Code
from crewai import LLM
@@ -247,7 +365,7 @@ These are examples of how to configure LLMs for your agent.
    api_key="your-api-key-here",
    base_url="your_api_endpoint"
)
agent = Agent(llm=llm, ...)
```
</Accordion>
</AccordionGroup>
@@ -18,6 +18,7 @@ reason, and learn from past interactions.
 | **Long-Term Memory** | Preserves valuable insights and learnings from past executions, allowing agents to build and refine their knowledge over time. |
 | **Entity Memory** | Captures and organizes information about entities (people, places, concepts) encountered during tasks, facilitating deeper understanding and relationship mapping. Uses `RAG` for storing entity information. |
 | **Contextual Memory** | Maintains the context of interactions by combining `ShortTermMemory`, `LongTermMemory`, and `EntityMemory`, aiding in the coherence and relevance of agent responses over a sequence of tasks or a conversation. |
+| **User Memory** | Stores user-specific information and preferences, enhancing personalization and user experience. |

## How Memory Systems Empower Agents

@@ -92,6 +93,47 @@ my_crew = Crew(
)
```

## Integrating Mem0 for Enhanced User Memory

[Mem0](https://mem0.ai/) is a self-improving memory layer for LLM applications, enabling personalized AI experiences.

To include user-specific memory, get your API key [here](https://app.mem0.ai/dashboard/api-keys) and refer to the [docs](https://docs.mem0.ai/platform/quickstart#4-1-create-memories) for adding user preferences.

```python Code
import os
from crewai import Crew, Process
from mem0 import MemoryClient

# Set environment variables for Mem0
os.environ["MEM0_API_KEY"] = "m0-xx"

# Step 1: Record preferences based on past conversation or user input
client = MemoryClient()
messages = [
    {"role": "user", "content": "Hi there! I'm planning a vacation and could use some advice."},
    {"role": "assistant", "content": "Hello! I'd be happy to help with your vacation planning. What kind of destination do you prefer?"},
    {"role": "user", "content": "I am more of a beach person than a mountain person."},
    {"role": "assistant", "content": "That's interesting. Do you like hotels or Airbnb?"},
    {"role": "user", "content": "I like Airbnb more."},
]
client.add(messages, user_id="john")

# Step 2: Create a Crew with User Memory
crew = Crew(
    agents=[...],
    tasks=[...],
    verbose=True,
    process=Process.sequential,
    memory=True,
    memory_config={
        "provider": "mem0",
        "config": {"user_id": "john"},
    },
)
```

## Additional Embedding Providers
@@ -330,4 +330,4 @@ This will clear the crew's memory, allowing for a fresh start.

## Deploying Your Project

-The easiest way to deploy your crew is through [CrewAI Enterprise](https://www.crewai.com/crewaiplus), where you can deploy your crew in a few clicks.
+The easiest way to deploy your crew is through [CrewAI Enterprise](http://app.crewai.com/), where you can deploy your crew in a few clicks.
@@ -34,6 +34,7 @@ from crewai_tools import GithubSearchTool

# Initialize the tool for semantic searches within a specific GitHub repository
tool = GithubSearchTool(
    github_repo='https://github.com/example/repo',
    gh_token='your_github_personal_access_token',
    content_types=['code', 'issue'] # Options: code, repo, pr, issue
)

@@ -41,6 +42,7 @@ tool = GithubSearchTool(

# Initialize the tool without a specific repository, so the agent can search any repository it learns about during its execution
tool = GithubSearchTool(
    gh_token='your_github_personal_access_token',
    content_types=['code', 'issue'] # Options: code, repo, pr, issue
)
```
@@ -48,6 +50,7 @@ tool = GithubSearchTool(
## Arguments

- `github_repo` : The URL of the GitHub repository where the search will be conducted. This is a mandatory field and specifies the target repository for your search.
- `gh_token` : Your GitHub Personal Access Token (PAT) required for authentication. You can create one in your GitHub account settings under Developer Settings > Personal Access Tokens.
- `content_types` : Specifies the types of content to include in your search. You must provide a list of content types from the following options: `code` for searching within the code,
  `repo` for searching within the repository's general information, `pr` for searching within pull requests, and `issue` for searching within issues.
  This field is mandatory and allows tailoring the search to specific content types within the GitHub repository.
@@ -77,5 +80,4 @@ tool = GithubSearchTool(
    ),
  ),
)
)
```
)
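To put the arguments above in context, here is a hedged sketch of wiring the tool into an agent (repository URL and token are placeholders):

```python Code
from crewai import Agent
from crewai_tools import GithubSearchTool

tool = GithubSearchTool(
    github_repo='https://github.com/example/repo',
    gh_token='your_github_personal_access_token',
    content_types=['code', 'issue']  # Options: code, repo, pr, issue
)

# Give the tool to an agent so it can run semantic searches over the repository.
agent = Agent(
    role="Repository Analyst",
    goal="Answer questions about the example repository.",
    backstory="An engineer who knows the codebase well.",
    tools=[tool],
)
```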
poetry.lock (generated): 6 lines changed
@@ -1597,12 +1597,12 @@ files = [
|
||||
google-auth = ">=2.14.1,<3.0.dev0"
|
||||
googleapis-common-protos = ">=1.56.2,<2.0.dev0"
|
||||
grpcio = [
|
||||
{version = ">=1.49.1,<2.0dev", optional = true, markers = "python_version >= \"3.11\" and extra == \"grpc\""},
|
||||
{version = ">=1.33.2,<2.0dev", optional = true, markers = "python_version < \"3.11\" and extra == \"grpc\""},
|
||||
{version = ">=1.49.1,<2.0dev", optional = true, markers = "python_version >= \"3.11\" and extra == \"grpc\""},
|
||||
]
|
||||
grpcio-status = [
|
||||
{version = ">=1.49.1,<2.0.dev0", optional = true, markers = "python_version >= \"3.11\" and extra == \"grpc\""},
|
||||
{version = ">=1.33.2,<2.0.dev0", optional = true, markers = "python_version < \"3.11\" and extra == \"grpc\""},
|
||||
{version = ">=1.49.1,<2.0.dev0", optional = true, markers = "python_version >= \"3.11\" and extra == \"grpc\""},
|
||||
]
|
||||
proto-plus = ">=1.22.3,<2.0.0dev"
|
||||
protobuf = ">=3.19.5,<3.20.0 || >3.20.0,<3.20.1 || >3.20.1,<4.21.0 || >4.21.0,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<6.0.0.dev0"
|
||||
@@ -4286,8 +4286,8 @@ files = [
|
||||
|
||||
[package.dependencies]
|
||||
numpy = [
|
||||
{version = ">=1.23.2", markers = "python_version == \"3.11\""},
|
||||
{version = ">=1.22.4", markers = "python_version < \"3.11\""},
|
||||
{version = ">=1.23.2", markers = "python_version == \"3.11\""},
|
||||
{version = ">=1.26.0", markers = "python_version >= \"3.12\""},
|
||||
]
|
||||
python-dateutil = ">=2.8.2"
|
||||
|
||||
@@ -1,6 +1,6 @@
[project]
name = "crewai"
-version = "0.76.9"
+version = "0.80.0"
description = "Cutting-edge framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks."
readme = "README.md"
requires-python = ">=3.10,<=3.13"
@@ -16,7 +16,7 @@ dependencies = [
    "opentelemetry-exporter-otlp-proto-http>=1.22.0",
    "instructor>=1.3.3",
    "regex>=2024.9.11",
-    "crewai-tools>=0.13.4",
+    "crewai-tools>=0.14.0",
    "click>=8.1.7",
    "python-dotenv>=1.0.0",
    "appdirs>=1.4.4",
@@ -27,8 +27,8 @@ dependencies = [
    "pyvis>=0.3.2",
    "uv>=0.4.25",
    "tomli-w>=1.1.0",
    "chromadb>=0.4.24",
    "tomli>=2.0.2",
    "chromadb>=0.5.18",
]

[project.urls]
@@ -37,8 +37,19 @@ Documentation = "https://docs.crewai.com"
Repository = "https://github.com/crewAIInc/crewAI"

[project.optional-dependencies]
-tools = ["crewai-tools>=0.13.4"]
+tools = ["crewai-tools>=0.14.0"]
agentops = ["agentops>=0.3.0"]
fastembed = ["fastembed>=0.4.1"]
pdfplumber = [
    "pdfplumber>=0.11.4",
]
pandas = [
    "pandas>=2.2.3",
]
openpyxl = [
    "openpyxl>=3.1.5",
]
mem0 = ["mem0ai>=0.1.29"]

[tool.uv]
dev-dependencies = [
@@ -52,7 +63,7 @@ dev-dependencies = [
    "mkdocs-material-extensions>=1.3.1",
    "pillow>=10.2.0",
    "cairosvg>=2.7.1",
-    "crewai-tools>=0.13.4",
+    "crewai-tools>=0.14.0",
    "pytest>=8.0.0",
    "pytest-vcr>=1.0.2",
    "python-dotenv>=1.0.0",
|
||||
@@ -1,7 +1,9 @@
import warnings

from crewai.agent import Agent
from crewai.crew import Crew
from crewai.flow.flow import Flow
+from crewai.knowledge.knowledge import Knowledge
from crewai.llm import LLM
from crewai.pipeline import Pipeline
from crewai.process import Process
@@ -14,5 +16,15 @@ warnings.filterwarnings(
    category=UserWarning,
    module="pydantic.main",
)
-__version__ = "0.76.9"
-__all__ = ["Agent", "Crew", "Process", "Task", "Pipeline", "Router", "LLM", "Flow"]
+__version__ = "0.80.0"
+__all__ = [
+    "Agent",
+    "Crew",
+    "Process",
+    "Task",
+    "Pipeline",
+    "Router",
+    "LLM",
+    "Flow",
+    "Knowledge",
+]
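With `Knowledge` now exported from the package root, it can be imported directly. A hedged sketch, assuming the constructor accepts the same keys as the crew-level `knowledge` dict and that `query` takes a list of strings, as used elsewhere in this change:

```python
from crewai import Knowledge
from crewai.knowledge.source.string_knowledge_source import StringKnowledgeSource

source = StringKnowledgeSource(
    content="User's name is John.", metadata={"preference": "personal"}
)

# Assumption: Knowledge mirrors the crew-level `knowledge` dict keys.
knowledge = Knowledge(sources=[source])
snippets = knowledge.query(["Who is the user?"])  # expected: a list of dicts with a "context" key
```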
|
||||
@@ -8,10 +8,11 @@ from pydantic import Field, InstanceOf, PrivateAttr, model_validator
|
||||
from crewai.agents import CacheHandler
|
||||
from crewai.agents.agent_builder.base_agent import BaseAgent
|
||||
from crewai.agents.crew_agent_executor import CrewAgentExecutor
|
||||
from crewai.cli.constants import ENV_VARS
|
||||
from crewai.llm import LLM
|
||||
from crewai.memory.contextual.contextual_memory import ContextualMemory
|
||||
from crewai.tools.agent_tools.agent_tools import AgentTools
|
||||
from crewai.tools import BaseTool
|
||||
from crewai.tools.agent_tools.agent_tools import AgentTools
|
||||
from crewai.utilities import Converter, Prompts
|
||||
from crewai.utilities.constants import TRAINED_AGENTS_DATA_FILE, TRAINING_DATA_FILE
|
||||
from crewai.utilities.token_counter_callback import TokenCalcHandler
|
||||
@@ -51,6 +52,7 @@ class Agent(BaseAgent):
|
||||
role: The role of the agent.
|
||||
goal: The objective of the agent.
|
||||
backstory: The backstory of the agent.
|
||||
knowledge: The knowledge base of the agent.
|
||||
config: Dict representation of agent configuration.
|
||||
llm: The language model that will run the agent.
|
||||
function_calling_llm: The language model that will handle the tool calling for this agent; it overrides the crew's function_calling_llm.
|
||||
@@ -122,6 +124,11 @@ class Agent(BaseAgent):
|
||||
@model_validator(mode="after")
|
||||
def post_init_setup(self):
|
||||
self.agent_ops_agent_name = self.role
|
||||
unnacepted_attributes = [
|
||||
"AWS_ACCESS_KEY_ID",
|
||||
"AWS_SECRET_ACCESS_KEY",
|
||||
"AWS_REGION_NAME",
|
||||
]
|
||||
|
||||
# Handle different cases for self.llm
|
||||
if isinstance(self.llm, str):
|
||||
@@ -131,8 +138,12 @@ class Agent(BaseAgent):
|
||||
# If it's already an LLM instance, keep it as is
|
||||
pass
|
||||
elif self.llm is None:
|
||||
# If it's None, use environment variables or default
|
||||
model_name = os.environ.get("OPENAI_MODEL_NAME", "gpt-4o-mini")
|
||||
# Determine the model name from environment variables or use default
|
||||
model_name = (
|
||||
os.environ.get("OPENAI_MODEL_NAME")
|
||||
or os.environ.get("MODEL")
|
||||
or "gpt-4o-mini"
|
||||
)
|
||||
llm_params = {"model": model_name}
|
||||
|
||||
api_base = os.environ.get("OPENAI_API_BASE") or os.environ.get(
|
||||
@@ -141,9 +152,44 @@ class Agent(BaseAgent):
|
||||
if api_base:
|
||||
llm_params["base_url"] = api_base
|
||||
|
||||
api_key = os.environ.get("OPENAI_API_KEY")
|
||||
if api_key:
|
||||
llm_params["api_key"] = api_key
|
||||
set_provider = model_name.split("/")[0] if "/" in model_name else "openai"
|
||||
|
||||
# Iterate over all environment variables to find matching API keys or use defaults
|
||||
for provider, env_vars in ENV_VARS.items():
|
||||
if provider == set_provider:
|
||||
for env_var in env_vars:
|
||||
if env_var["key_name"] in unnacepted_attributes:
|
||||
continue
|
||||
# Check if the environment variable is set
|
||||
if "key_name" in env_var:
|
||||
env_value = os.environ.get(env_var["key_name"])
|
||||
if env_value:
|
||||
# Map key names containing "API_KEY" to "api_key"
|
||||
key_name = (
|
||||
"api_key"
|
||||
if "API_KEY" in env_var["key_name"]
|
||||
else env_var["key_name"]
|
||||
)
|
||||
# Map key names containing "API_BASE" to "api_base"
|
||||
key_name = (
|
||||
"api_base"
|
||||
if "API_BASE" in env_var["key_name"]
|
||||
else key_name
|
||||
)
|
||||
# Map key names containing "API_VERSION" to "api_version"
|
||||
key_name = (
|
||||
"api_version"
|
||||
if "API_VERSION" in env_var["key_name"]
|
||||
else key_name
|
||||
)
|
||||
llm_params[key_name] = env_value
|
||||
# Check for default values if the environment variable is not set
|
||||
elif env_var.get("default", False):
|
||||
for key, value in env_var.items():
|
||||
if key not in ["prompt", "key_name", "default"]:
|
||||
# Only add default if the key is already set in os.environ
|
||||
if key in os.environ:
|
||||
llm_params[key] = value
|
||||
|
||||
self.llm = LLM(**llm_params)
|
||||
else:
|
||||
@@ -217,14 +263,28 @@

        if self.crew and self.crew.memory:
            contextual_memory = ContextualMemory(
                self.crew.memory_config,
                self.crew._short_term_memory,
                self.crew._long_term_memory,
                self.crew._entity_memory,
                self.crew._user_memory,
            )
            memory = contextual_memory.build_context_for_task(task, context)
            if memory.strip() != "":
                task_prompt += self.i18n.slice("memory").format(memory=memory)

        # Integrate the knowledge base
        if self.crew and self.crew.knowledge:
            knowledge_snippets = self.crew.knowledge.query([task.prompt()])
            valid_snippets = [
                result["context"]
                for result in knowledge_snippets
                if result and result.get("context")
            ]
            if valid_snippets:
                formatted_knowledge = "\n".join(valid_snippets)
                task_prompt += f"\n\nAdditional Information:\n{formatted_knowledge}"

        tools = tools or self.tools or []
        self.create_agent_executor(tools=tools, task=task)
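Read in isolation, the knowledge-augmentation step above amounts to the following standalone sketch (the function name and sample data are illustrative):

```python
def augment_prompt_with_knowledge(task_prompt: str, knowledge_snippets: list) -> str:
    # Keep only results that actually carry a "context" payload.
    valid_snippets = [
        result["context"]
        for result in knowledge_snippets
        if result and result.get("context")
    ]
    if valid_snippets:
        formatted_knowledge = "\n".join(valid_snippets)
        task_prompt += f"\n\nAdditional Information:\n{formatted_knowledge}"
    return task_prompt


# Example with snippets shaped like Knowledge.query() results (a list of dicts).
print(augment_prompt_with_knowledge(
    "Answer the question about the user.",
    [{"context": "User's name is John."}, {"context": ""}],
))
```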
||||
@@ -4,6 +4,7 @@ from crewai.types.usage_metrics import UsageMetrics

class TokenProcess:
    total_tokens: int = 0
    prompt_tokens: int = 0
+   cached_prompt_tokens: int = 0
    completion_tokens: int = 0
    successful_requests: int = 0

@@ -15,6 +16,9 @@
        self.completion_tokens = self.completion_tokens + tokens
        self.total_tokens = self.total_tokens + tokens

+   def sum_cached_prompt_tokens(self, tokens: int):
+       self.cached_prompt_tokens = self.cached_prompt_tokens + tokens
+
    def sum_successful_requests(self, requests: int):
        self.successful_requests = self.successful_requests + requests

@@ -22,6 +26,7 @@
        return UsageMetrics(
            total_tokens=self.total_tokens,
            prompt_tokens=self.prompt_tokens,
+           cached_prompt_tokens=self.cached_prompt_tokens,
            completion_tokens=self.completion_tokens,
            successful_requests=self.successful_requests,
        )
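A small usage sketch of the new cached-token counter; the import path below is an assumption, since the module location is not shown in this view:

```python
# Assumed import path; verify where TokenProcess lives in your installed version.
from crewai.agents.agent_builder.utilities.base_token_process import TokenProcess

tp = TokenProcess()
tp.sum_cached_prompt_tokens(128)   # new counter added in this change
tp.sum_successful_requests(1)

summary = tp.get_summary()         # UsageMetrics now carries cached_prompt_tokens as well
print(summary)
```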
||||
@@ -117,6 +117,15 @@ class CrewAgentExecutor(CrewAgentExecutorMixin):
|
||||
callbacks=self.callbacks,
|
||||
)
|
||||
|
||||
if answer is None or answer == "":
|
||||
self._printer.print(
|
||||
content="Received None or empty response from LLM call.",
|
||||
color="red",
|
||||
)
|
||||
raise ValueError(
|
||||
"Invalid response from LLM call - None or empty."
|
||||
)
|
||||
|
||||
if not self.use_stop_words:
|
||||
try:
|
||||
self._format_answer(answer)
|
||||
@@ -136,25 +145,26 @@ class CrewAgentExecutor(CrewAgentExecutorMixin):
|
||||
formatted_answer.result = action_result
|
||||
self._show_logs(formatted_answer)
|
||||
|
||||
if self.step_callback:
|
||||
self.step_callback(formatted_answer)
|
||||
if self.step_callback:
|
||||
self.step_callback(formatted_answer)
|
||||
|
||||
if self._should_force_answer():
|
||||
if self.have_forced_answer:
|
||||
return AgentFinish(
|
||||
output=self._i18n.errors(
|
||||
"force_final_answer_error"
|
||||
).format(formatted_answer.text),
|
||||
text=formatted_answer.text,
|
||||
)
|
||||
else:
|
||||
formatted_answer.text += (
|
||||
f'\n{self._i18n.errors("force_final_answer")}'
|
||||
)
|
||||
self.have_forced_answer = True
|
||||
self.messages.append(
|
||||
self._format_msg(formatted_answer.text, role="assistant")
|
||||
)
|
||||
if self._should_force_answer():
|
||||
if self.have_forced_answer:
|
||||
return AgentFinish(
|
||||
thought="",
|
||||
output=self._i18n.errors(
|
||||
"force_final_answer_error"
|
||||
).format(formatted_answer.text),
|
||||
text=formatted_answer.text,
|
||||
)
|
||||
else:
|
||||
formatted_answer.text += (
|
||||
f'\n{self._i18n.errors("force_final_answer")}'
|
||||
)
|
||||
self.have_forced_answer = True
|
||||
self.messages.append(
|
||||
self._format_msg(formatted_answer.text, role="assistant")
|
||||
)
|
||||
|
||||
except OutputParserException as e:
|
||||
self.messages.append({"role": "user", "content": e.error})
|
||||
@@ -323,9 +333,9 @@ class CrewAgentExecutor(CrewAgentExecutorMixin):
|
||||
if self.crew is not None and hasattr(self.crew, "_train_iteration"):
|
||||
train_iteration = self.crew._train_iteration
|
||||
if agent_id in training_data and isinstance(train_iteration, int):
|
||||
training_data[agent_id][train_iteration][
|
||||
"improved_output"
|
||||
] = result.output
|
||||
training_data[agent_id][train_iteration]["improved_output"] = (
|
||||
result.output
|
||||
)
|
||||
training_handler.save(training_data)
|
||||
else:
|
||||
self._logger.log(
|
||||
@@ -376,4 +386,5 @@ class CrewAgentExecutor(CrewAgentExecutorMixin):
|
||||
return CrewAgentParser(agent=self.agent).parse(answer)
|
||||
|
||||
def _format_msg(self, prompt: str, role: str = "user") -> Dict[str, str]:
|
||||
prompt = prompt.rstrip()
|
||||
return {"role": role, "content": prompt}
|
||||
|
||||
@@ -54,7 +54,7 @@ def create_embedded_crew(crew_name: str, parent_folder: Path) -> None:
|
||||
|
||||
templates_dir = Path(__file__).parent / "templates" / "crew"
|
||||
config_template_files = ["agents.yaml", "tasks.yaml"]
|
||||
crew_template_file = f"{folder_name}_crew.py" # Updated file name
|
||||
crew_template_file = f"{folder_name}.py" # Updated file name
|
||||
|
||||
for file_name in config_template_files:
|
||||
src_file = templates_dir / "config" / file_name
|
||||
|
||||
@@ -34,7 +34,9 @@ class AuthenticationCommand:
|
||||
"scope": "openid",
|
||||
"audience": AUTH0_AUDIENCE,
|
||||
}
|
||||
response = requests.post(url=self.DEVICE_CODE_URL, data=device_code_payload)
|
||||
response = requests.post(
|
||||
url=self.DEVICE_CODE_URL, data=device_code_payload, timeout=20
|
||||
)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
@@ -54,7 +56,7 @@ class AuthenticationCommand:
|
||||
|
||||
attempts = 0
|
||||
while True and attempts < 5:
|
||||
response = requests.post(self.TOKEN_URL, data=token_payload)
|
||||
response = requests.post(self.TOKEN_URL, data=token_payload, timeout=30)
|
||||
token_data = response.json()
|
||||
|
||||
if response.status_code == 200:
|
||||
|
||||
@@ -136,6 +136,7 @@ def log_tasks_outputs() -> None:
|
||||
@click.option("-l", "--long", is_flag=True, help="Reset LONG TERM memory")
|
||||
@click.option("-s", "--short", is_flag=True, help="Reset SHORT TERM memory")
|
||||
@click.option("-e", "--entities", is_flag=True, help="Reset ENTITIES memory")
|
||||
@click.option("-kn", "--knowledge", is_flag=True, help="Reset KNOWLEDGE storage")
|
||||
@click.option(
|
||||
"-k",
|
||||
"--kickoff-outputs",
|
||||
@@ -143,17 +144,24 @@ def log_tasks_outputs() -> None:
|
||||
help="Reset LATEST KICKOFF TASK OUTPUTS",
|
||||
)
|
||||
@click.option("-a", "--all", is_flag=True, help="Reset ALL memories")
|
||||
def reset_memories(long, short, entities, kickoff_outputs, all):
|
||||
def reset_memories(
|
||||
long: bool,
|
||||
short: bool,
|
||||
entities: bool,
|
||||
knowledge: bool,
|
||||
kickoff_outputs: bool,
|
||||
all: bool,
|
||||
) -> None:
|
||||
"""
|
||||
Reset the crew memories (long, short, entity, latest_crew_kickoff_outputs). This will delete all the data saved.
|
||||
"""
|
||||
try:
|
||||
if not all and not (long or short or entities or kickoff_outputs):
|
||||
if not all and not (long or short or entities or knowledge or kickoff_outputs):
|
||||
click.echo(
|
||||
"Please specify at least one memory type to reset using the appropriate flags."
|
||||
)
|
||||
return
|
||||
reset_memories_command(long, short, entities, kickoff_outputs, all)
|
||||
reset_memories_command(long, short, entities, knowledge, kickoff_outputs, all)
|
||||
except Exception as e:
|
||||
click.echo(f"An error occurred while resetting memories: {e}", err=True)
|
||||
|
||||
|
||||
@@ -1,19 +1,161 @@
|
||||
ENV_VARS = {
|
||||
'openai': ['OPENAI_API_KEY'],
|
||||
'anthropic': ['ANTHROPIC_API_KEY'],
|
||||
'gemini': ['GEMINI_API_KEY'],
|
||||
'groq': ['GROQ_API_KEY'],
|
||||
'ollama': ['FAKE_KEY'],
|
||||
"openai": [
|
||||
{
|
||||
"prompt": "Enter your OPENAI API key (press Enter to skip)",
|
||||
"key_name": "OPENAI_API_KEY",
|
||||
}
|
||||
],
|
||||
"anthropic": [
|
||||
{
|
||||
"prompt": "Enter your ANTHROPIC API key (press Enter to skip)",
|
||||
"key_name": "ANTHROPIC_API_KEY",
|
||||
}
|
||||
],
|
||||
"gemini": [
|
||||
{
|
||||
"prompt": "Enter your GEMINI API key (press Enter to skip)",
|
||||
"key_name": "GEMINI_API_KEY",
|
||||
}
|
||||
],
|
||||
"groq": [
|
||||
{
|
||||
"prompt": "Enter your GROQ API key (press Enter to skip)",
|
||||
"key_name": "GROQ_API_KEY",
|
||||
}
|
||||
],
|
||||
"watson": [
|
||||
{
|
||||
"prompt": "Enter your WATSONX URL (press Enter to skip)",
|
||||
"key_name": "WATSONX_URL",
|
||||
},
|
||||
{
|
||||
"prompt": "Enter your WATSONX API Key (press Enter to skip)",
|
||||
"key_name": "WATSONX_APIKEY",
|
||||
},
|
||||
{
|
||||
"prompt": "Enter your WATSONX Project Id (press Enter to skip)",
|
||||
"key_name": "WATSONX_PROJECT_ID",
|
||||
},
|
||||
],
|
||||
"ollama": [
|
||||
{
|
||||
"default": True,
|
||||
"API_BASE": "http://localhost:11434",
|
||||
}
|
||||
],
|
||||
"bedrock": [
|
||||
{
|
||||
"prompt": "Enter your AWS Access Key ID (press Enter to skip)",
|
||||
"key_name": "AWS_ACCESS_KEY_ID",
|
||||
},
|
||||
{
|
||||
"prompt": "Enter your AWS Secret Access Key (press Enter to skip)",
|
||||
"key_name": "AWS_SECRET_ACCESS_KEY",
|
||||
},
|
||||
{
|
||||
"prompt": "Enter your AWS Region Name (press Enter to skip)",
|
||||
"key_name": "AWS_REGION_NAME",
|
||||
},
|
||||
],
|
||||
"azure": [
|
||||
{
|
||||
"prompt": "Enter your Azure deployment name (must start with 'azure/')",
|
||||
"key_name": "model",
|
||||
},
|
||||
{
|
||||
"prompt": "Enter your AZURE API key (press Enter to skip)",
|
||||
"key_name": "AZURE_API_KEY",
|
||||
},
|
||||
{
|
||||
"prompt": "Enter your AZURE API base URL (press Enter to skip)",
|
||||
"key_name": "AZURE_API_BASE",
|
||||
},
|
||||
{
|
||||
"prompt": "Enter your AZURE API version (press Enter to skip)",
|
||||
"key_name": "AZURE_API_VERSION",
|
||||
},
|
||||
],
|
||||
"cerebras": [
|
||||
{
|
||||
"prompt": "Enter your Cerebras model name (must start with 'cerebras/')",
|
||||
"key_name": "model",
|
||||
},
|
||||
{
|
||||
"prompt": "Enter your Cerebras API version (press Enter to skip)",
|
||||
"key_name": "CEREBRAS_API_KEY",
|
||||
},
|
||||
],
|
||||
}
|
||||
|
||||
PROVIDERS = ['openai', 'anthropic', 'gemini', 'groq', 'ollama']
|
||||
|
||||
PROVIDERS = [
|
||||
"openai",
|
||||
"anthropic",
|
||||
"gemini",
|
||||
"groq",
|
||||
"ollama",
|
||||
"watson",
|
||||
"bedrock",
|
||||
"azure",
|
||||
"cerebras",
|
||||
]
|
||||
|
||||
MODELS = {
|
||||
'openai': ['gpt-4', 'gpt-4o', 'gpt-4o-mini', 'o1-mini', 'o1-preview'],
|
||||
'anthropic': ['claude-3-5-sonnet-20240620', 'claude-3-sonnet-20240229', 'claude-3-opus-20240229', 'claude-3-haiku-20240307'],
|
||||
'gemini': ['gemini-1.5-flash', 'gemini-1.5-pro', 'gemini-gemma-2-9b-it', 'gemini-gemma-2-27b-it'],
|
||||
'groq': ['llama-3.1-8b-instant', 'llama-3.1-70b-versatile', 'llama-3.1-405b-reasoning', 'gemma2-9b-it', 'gemma-7b-it'],
|
||||
'ollama': ['llama3.1', 'mixtral'],
|
||||
"openai": ["gpt-4", "gpt-4o", "gpt-4o-mini", "o1-mini", "o1-preview"],
|
||||
"anthropic": [
|
||||
"claude-3-5-sonnet-20240620",
|
||||
"claude-3-sonnet-20240229",
|
||||
"claude-3-opus-20240229",
|
||||
"claude-3-haiku-20240307",
|
||||
],
|
||||
"gemini": [
|
||||
"gemini/gemini-1.5-flash",
|
||||
"gemini/gemini-1.5-pro",
|
||||
"gemini/gemini-gemma-2-9b-it",
|
||||
"gemini/gemini-gemma-2-27b-it",
|
||||
],
|
||||
"groq": [
|
||||
"groq/llama-3.1-8b-instant",
|
||||
"groq/llama-3.1-70b-versatile",
|
||||
"groq/llama-3.1-405b-reasoning",
|
||||
"groq/gemma2-9b-it",
|
||||
"groq/gemma-7b-it",
|
||||
],
|
||||
"ollama": ["ollama/llama3.1", "ollama/mixtral"],
|
||||
"watson": [
|
||||
"watsonx/meta-llama/llama-3-1-70b-instruct",
|
||||
"watsonx/meta-llama/llama-3-1-8b-instruct",
|
||||
"watsonx/meta-llama/llama-3-2-11b-vision-instruct",
|
||||
"watsonx/meta-llama/llama-3-2-1b-instruct",
|
||||
"watsonx/meta-llama/llama-3-2-90b-vision-instruct",
|
||||
"watsonx/meta-llama/llama-3-405b-instruct",
|
||||
"watsonx/mistral/mistral-large",
|
||||
"watsonx/ibm/granite-3-8b-instruct",
|
||||
],
|
||||
"bedrock": [
|
||||
"bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0",
|
||||
"bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
|
||||
"bedrock/anthropic.claude-3-haiku-20240307-v1:0",
|
||||
"bedrock/anthropic.claude-3-opus-20240229-v1:0",
|
||||
"bedrock/anthropic.claude-v2:1",
|
||||
"bedrock/anthropic.claude-v2",
|
||||
"bedrock/anthropic.claude-instant-v1",
|
||||
"bedrock/meta.llama3-1-405b-instruct-v1:0",
|
||||
"bedrock/meta.llama3-1-70b-instruct-v1:0",
|
||||
"bedrock/meta.llama3-1-8b-instruct-v1:0",
|
||||
"bedrock/meta.llama3-70b-instruct-v1:0",
|
||||
"bedrock/meta.llama3-8b-instruct-v1:0",
|
||||
"bedrock/amazon.titan-text-lite-v1",
|
||||
"bedrock/amazon.titan-text-express-v1",
|
||||
"bedrock/cohere.command-text-v14",
|
||||
"bedrock/ai21.j2-mid-v1",
|
||||
"bedrock/ai21.j2-ultra-v1",
|
||||
"bedrock/ai21.jamba-instruct-v1:0",
|
||||
"bedrock/meta.llama2-13b-chat-v1",
|
||||
"bedrock/meta.llama2-70b-chat-v1",
|
||||
"bedrock/mistral.mistral-7b-instruct-v0:2",
|
||||
"bedrock/mistral.mixtral-8x7b-instruct-v0:1",
|
||||
],
|
||||
}
|
||||
|
||||
JSON_URL = "https://raw.githubusercontent.com/BerriAI/litellm/main/model_prices_and_context_window.json"
|
||||
JSON_URL = "https://raw.githubusercontent.com/BerriAI/litellm/main/model_prices_and_context_window.json"
|
||||
|
||||
@@ -1,11 +1,11 @@
|
||||
import shutil
|
||||
import sys
|
||||
from pathlib import Path
|
||||
|
||||
import click
|
||||
|
||||
from crewai.cli.constants import ENV_VARS
|
||||
from crewai.cli.constants import ENV_VARS, MODELS
|
||||
from crewai.cli.provider import (
|
||||
PROVIDERS,
|
||||
get_provider_data,
|
||||
select_model,
|
||||
select_provider,
|
||||
@@ -29,20 +29,20 @@ def create_folder_structure(name, parent_folder=None):
|
||||
click.secho("Operation cancelled.", fg="yellow")
|
||||
sys.exit(0)
|
||||
click.secho(f"Overriding folder {folder_name}...", fg="green", bold=True)
|
||||
else:
|
||||
click.secho(
|
||||
f"Creating {'crew' if parent_folder else 'folder'} {folder_name}...",
|
||||
fg="green",
|
||||
bold=True,
|
||||
)
|
||||
shutil.rmtree(folder_path) # Delete the existing folder and its contents
|
||||
|
||||
if not folder_path.exists():
|
||||
folder_path.mkdir(parents=True)
|
||||
(folder_path / "tests").mkdir(exist_ok=True)
|
||||
if not parent_folder:
|
||||
(folder_path / "src" / folder_name).mkdir(parents=True)
|
||||
(folder_path / "src" / folder_name / "tools").mkdir(parents=True)
|
||||
(folder_path / "src" / folder_name / "config").mkdir(parents=True)
|
||||
click.secho(
|
||||
f"Creating {'crew' if parent_folder else 'folder'} {folder_name}...",
|
||||
fg="green",
|
||||
bold=True,
|
||||
)
|
||||
|
||||
folder_path.mkdir(parents=True)
|
||||
(folder_path / "tests").mkdir(exist_ok=True)
|
||||
if not parent_folder:
|
||||
(folder_path / "src" / folder_name).mkdir(parents=True)
|
||||
(folder_path / "src" / folder_name / "tools").mkdir(parents=True)
|
||||
(folder_path / "src" / folder_name / "config").mkdir(parents=True)
|
||||
|
||||
return folder_path, folder_name, class_name
|
||||
|
||||
@@ -92,7 +92,10 @@ def create_crew(name, provider=None, skip_provider=False, parent_folder=None):
|
||||
|
||||
existing_provider = None
|
||||
for provider, env_keys in ENV_VARS.items():
|
||||
if any(key in env_vars for key in env_keys):
|
||||
if any(
|
||||
"key_name" in details and details["key_name"] in env_vars
|
||||
for details in env_keys
|
||||
):
|
||||
existing_provider = provider
|
||||
break
|
||||
|
||||
@@ -118,47 +121,48 @@ def create_crew(name, provider=None, skip_provider=False, parent_folder=None):
|
||||
"No provider selected. Please try again or press 'q' to exit.", fg="red"
|
||||
)
|
||||
|
||||
while True:
|
||||
selected_model = select_model(selected_provider, provider_models)
|
||||
if selected_model is None: # User typed 'q'
|
||||
click.secho("Exiting...", fg="yellow")
|
||||
sys.exit(0)
|
||||
if selected_model: # Valid selection
|
||||
break
|
||||
click.secho(
|
||||
"No model selected. Please try again or press 'q' to exit.", fg="red"
|
||||
)
|
||||
# Check if the selected provider has predefined models
|
||||
if selected_provider in MODELS and MODELS[selected_provider]:
|
||||
while True:
|
||||
selected_model = select_model(selected_provider, provider_models)
|
||||
if selected_model is None: # User typed 'q'
|
||||
click.secho("Exiting...", fg="yellow")
|
||||
sys.exit(0)
|
||||
if selected_model: # Valid selection
|
||||
break
|
||||
click.secho(
|
||||
"No model selected. Please try again or press 'q' to exit.",
|
||||
fg="red",
|
||||
)
|
||||
env_vars["MODEL"] = selected_model
|
||||
|
||||
if selected_provider in PROVIDERS:
|
||||
api_key_var = ENV_VARS[selected_provider][0]
|
||||
else:
|
||||
api_key_var = click.prompt(
|
||||
f"Enter the environment variable name for your {selected_provider.capitalize()} API key",
|
||||
type=str,
|
||||
default="",
|
||||
)
|
||||
# Check if the selected provider requires API keys
|
||||
if selected_provider in ENV_VARS:
|
||||
provider_env_vars = ENV_VARS[selected_provider]
|
||||
for details in provider_env_vars:
|
||||
if details.get("default", False):
|
||||
# Automatically add default key-value pairs
|
||||
for key, value in details.items():
|
||||
if key not in ["prompt", "key_name", "default"]:
|
||||
env_vars[key] = value
|
||||
elif "key_name" in details:
|
||||
# Prompt for non-default key-value pairs
|
||||
prompt = details["prompt"]
|
||||
key_name = details["key_name"]
|
||||
api_key_value = click.prompt(prompt, default="", show_default=False)
|
||||
|
||||
api_key_value = ""
|
||||
click.echo(
|
||||
f"Enter your {selected_provider.capitalize()} API key (press Enter to skip): ",
|
||||
nl=False,
|
||||
)
|
||||
try:
|
||||
api_key_value = input()
|
||||
except (KeyboardInterrupt, EOFError):
|
||||
api_key_value = ""
|
||||
if api_key_value.strip():
|
||||
env_vars[key_name] = api_key_value
|
||||
|
||||
if api_key_value.strip():
|
||||
env_vars = {api_key_var: api_key_value}
|
||||
if env_vars:
|
||||
write_env_file(folder_path, env_vars)
|
||||
click.secho("API key saved to .env file", fg="green")
|
||||
click.secho("API keys and model saved to .env file", fg="green")
|
||||
else:
|
||||
click.secho(
|
||||
"No API key provided. Skipping .env file creation.", fg="yellow"
|
||||
"No API keys provided. Skipping .env file creation.", fg="yellow"
|
||||
)
|
||||
|
||||
env_vars["MODEL"] = selected_model
|
||||
click.secho(f"Selected model: {selected_model}", fg="green")
|
||||
click.secho(f"Selected model: {env_vars.get('MODEL', 'N/A')}", fg="green")
|
||||
|
||||
package_dir = Path(__file__).parent
|
||||
templates_dir = package_dir / "templates" / "crew"
|
||||
|
||||
@@ -164,7 +164,7 @@ def fetch_provider_data(cache_file):
|
||||
- dict or None: The fetched provider data or None if the operation fails.
|
||||
"""
|
||||
try:
|
||||
response = requests.get(JSON_URL, stream=True, timeout=10)
|
||||
response = requests.get(JSON_URL, stream=True, timeout=60)
|
||||
response.raise_for_status()
|
||||
data = download_data(response)
|
||||
with open(cache_file, "w") as f:
|
||||
|
||||
@@ -5,9 +5,17 @@ from crewai.memory.entity.entity_memory import EntityMemory
|
||||
from crewai.memory.long_term.long_term_memory import LongTermMemory
|
||||
from crewai.memory.short_term.short_term_memory import ShortTermMemory
|
||||
from crewai.utilities.task_output_storage_handler import TaskOutputStorageHandler
|
||||
from crewai.knowledge.storage.knowledge_storage import KnowledgeStorage
|
||||
|
||||
|
||||
def reset_memories_command(long, short, entity, kickoff_outputs, all) -> None:
|
||||
def reset_memories_command(
|
||||
long,
|
||||
short,
|
||||
entity,
|
||||
knowledge,
|
||||
kickoff_outputs,
|
||||
all,
|
||||
) -> None:
|
||||
"""
|
||||
Reset the crew memories.
|
||||
|
||||
@@ -17,6 +25,7 @@ def reset_memories_command(long, short, entity, kickoff_outputs, all) -> None:
|
||||
entity (bool): Whether to reset the entity memory.
|
||||
kickoff_outputs (bool): Whether to reset the latest kickoff task outputs.
|
||||
all (bool): Whether to reset all memories.
|
||||
knowledge (bool): Whether to reset the knowledge.
|
||||
"""
|
||||
|
||||
try:
|
||||
@@ -25,6 +34,7 @@ def reset_memories_command(long, short, entity, kickoff_outputs, all) -> None:
|
||||
EntityMemory().reset()
|
||||
LongTermMemory().reset()
|
||||
TaskOutputStorageHandler().reset()
|
||||
KnowledgeStorage().reset()
|
||||
click.echo("All memories have been reset.")
|
||||
else:
|
||||
if long:
|
||||
@@ -40,6 +50,9 @@ def reset_memories_command(long, short, entity, kickoff_outputs, all) -> None:
|
||||
if kickoff_outputs:
|
||||
TaskOutputStorageHandler().reset()
|
||||
click.echo("Latest Kickoff outputs stored has been reset.")
|
||||
if knowledge:
|
||||
KnowledgeStorage().reset()
|
||||
click.echo("Knowledge has been reset.")
|
||||
|
||||
except subprocess.CalledProcessError as e:
|
||||
click.echo(f"An error occurred while resetting the memories: {e}", err=True)
|
||||
|
||||
@@ -24,7 +24,6 @@ def run_crew() -> None:
|
||||
f"Please run `crewai update` to update your pyproject.toml to use uv.",
|
||||
fg="red",
|
||||
)
|
||||
print()
|
||||
|
||||
try:
|
||||
subprocess.run(command, capture_output=False, text=True, check=True)
|
||||
|
||||
@@ -8,9 +8,12 @@ from crewai.project import CrewBase, agent, crew, task
|
||||
# from crewai_tools import SerperDevTool
|
||||
|
||||
@CrewBase
|
||||
class {{crew_name}}Crew():
|
||||
class {{crew_name}}():
|
||||
"""{{crew_name}} crew"""
|
||||
|
||||
agents_config = 'config/agents.yaml'
|
||||
tasks_config = 'config/tasks.yaml'
|
||||
|
||||
@agent
|
||||
def researcher(self) -> Agent:
|
||||
return Agent(
|
||||
@@ -48,4 +51,4 @@ class {{crew_name}}Crew():
|
||||
process=Process.sequential,
|
||||
verbose=True,
|
||||
# process=Process.hierarchical, # In case you wanna use that instead https://docs.crewai.com/how-to/Hierarchical/
|
||||
)
|
||||
)
|
||||
|
||||
@@ -1,6 +1,10 @@
|
||||
#!/usr/bin/env python
|
||||
import sys
|
||||
from {{folder_name}}.crew import {{crew_name}}Crew
|
||||
import warnings
|
||||
|
||||
from {{folder_name}}.crew import {{crew_name}}
|
||||
|
||||
warnings.filterwarnings("ignore", category=SyntaxWarning, module="pysbd")
|
||||
|
||||
# This main file is intended to be a way for you to run your
|
||||
# crew locally, so refrain from adding unnecessary logic into this file.
|
||||
@@ -14,7 +18,7 @@ def run():
|
||||
inputs = {
|
||||
'topic': 'AI LLMs'
|
||||
}
|
||||
{{crew_name}}Crew().crew().kickoff(inputs=inputs)
|
||||
{{crew_name}}().crew().kickoff(inputs=inputs)
|
||||
|
||||
|
||||
def train():
|
||||
@@ -25,7 +29,7 @@ def train():
|
||||
"topic": "AI LLMs"
|
||||
}
|
||||
try:
|
||||
{{crew_name}}Crew().crew().train(n_iterations=int(sys.argv[1]), filename=sys.argv[2], inputs=inputs)
|
||||
{{crew_name}}().crew().train(n_iterations=int(sys.argv[1]), filename=sys.argv[2], inputs=inputs)
|
||||
|
||||
except Exception as e:
|
||||
raise Exception(f"An error occurred while training the crew: {e}")
|
||||
@@ -35,7 +39,7 @@ def replay():
|
||||
Replay the crew execution from a specific task.
|
||||
"""
|
||||
try:
|
||||
{{crew_name}}Crew().crew().replay(task_id=sys.argv[1])
|
||||
{{crew_name}}().crew().replay(task_id=sys.argv[1])
|
||||
|
||||
except Exception as e:
|
||||
raise Exception(f"An error occurred while replaying the crew: {e}")
|
||||
@@ -48,7 +52,7 @@ def test():
|
||||
"topic": "AI LLMs"
|
||||
}
|
||||
try:
|
||||
{{crew_name}}Crew().crew().test(n_iterations=int(sys.argv[1]), openai_model_name=sys.argv[2], inputs=inputs)
|
||||
{{crew_name}}().crew().test(n_iterations=int(sys.argv[1]), openai_model_name=sys.argv[2], inputs=inputs)
|
||||
|
||||
except Exception as e:
|
||||
raise Exception(f"An error occurred while replaying the crew: {e}")
|
||||
|
||||
@@ -5,7 +5,7 @@ description = "{{name}} using crewAI"
|
||||
authors = [{ name = "Your Name", email = "you@example.com" }]
|
||||
requires-python = ">=3.10,<=3.13"
|
||||
dependencies = [
|
||||
"crewai[tools]>=0.76.9,<1.0.0"
|
||||
"crewai[tools]>=0.80.0,<1.0.0"
|
||||
]
|
||||
|
||||
[project.scripts]
|
||||
|
||||
@@ -5,7 +5,7 @@ description = "{{name}} using crewAI"
|
||||
authors = [{ name = "Your Name", email = "you@example.com" }]
|
||||
requires-python = ">=3.10,<=3.13"
|
||||
dependencies = [
|
||||
"crewai[tools]>=0.76.9,<1.0.0",
|
||||
"crewai[tools]>=0.80.0,<1.0.0",
|
||||
]
|
||||
|
||||
[project.scripts]
|
||||
|
||||
@@ -6,7 +6,7 @@ authors = ["Your Name <you@example.com>"]
|
||||
|
||||
[tool.poetry.dependencies]
|
||||
python = ">=3.10,<=3.13"
|
||||
crewai = { extras = ["tools"], version = ">=0.76.9,<1.0.0" }
|
||||
crewai = { extras = ["tools"], version = ">=0.80.0,<1.0.0" }
|
||||
asyncio = "*"
|
||||
|
||||
[tool.poetry.scripts]
|
||||
|
||||
@@ -5,7 +5,7 @@ description = "{{name}} using crewAI"
|
||||
authors = ["Your Name <you@example.com>"]
|
||||
requires-python = ">=3.10,<=3.13"
|
||||
dependencies = [
|
||||
"crewai[tools]>=0.76.9,<1.0.0"
|
||||
"crewai[tools]>=0.80.0,<1.0.0"
|
||||
]
|
||||
|
||||
[project.scripts]
|
||||
|
||||
@@ -5,6 +5,6 @@ description = "Power up your crews with {{folder_name}}"
|
||||
readme = "README.md"
|
||||
requires-python = ">=3.10,<=3.13"
|
||||
dependencies = [
|
||||
"crewai[tools]>=0.76.9"
|
||||
"crewai[tools]>=0.80.0"
|
||||
]
|
||||
|
||||
|
||||
@@ -5,7 +5,7 @@ import uuid
|
||||
import warnings
|
||||
from concurrent.futures import Future
|
||||
from hashlib import md5
|
||||
from typing import TYPE_CHECKING, Any, Dict, List, Optional, Tuple, Union
|
||||
from typing import TYPE_CHECKING, Any, Callable, Dict, List, Optional, Tuple, Union
|
||||
|
||||
from pydantic import (
|
||||
UUID4,
|
||||
@@ -27,6 +27,8 @@ from crewai.llm import LLM
|
||||
from crewai.memory.entity.entity_memory import EntityMemory
|
||||
from crewai.memory.long_term.long_term_memory import LongTermMemory
|
||||
from crewai.memory.short_term.short_term_memory import ShortTermMemory
|
||||
from crewai.knowledge.knowledge import Knowledge
|
||||
from crewai.memory.user.user_memory import UserMemory
|
||||
from crewai.process import Process
|
||||
from crewai.task import Task
|
||||
from crewai.tasks.conditional_task import ConditionalTask
|
||||
@@ -35,9 +37,7 @@ from crewai.telemetry import Telemetry
|
||||
from crewai.tools.agent_tools.agent_tools import AgentTools
|
||||
from crewai.types.usage_metrics import UsageMetrics
|
||||
from crewai.utilities import I18N, FileHandler, Logger, RPMController
|
||||
from crewai.utilities.constants import (
|
||||
TRAINING_DATA_FILE,
|
||||
)
|
||||
from crewai.utilities.constants import TRAINING_DATA_FILE
|
||||
from crewai.utilities.evaluators.crew_evaluator_handler import CrewEvaluator
|
||||
from crewai.utilities.evaluators.task_evaluator import TaskEvaluator
|
||||
from crewai.utilities.formatter import (
|
||||
@@ -71,6 +71,7 @@ class Crew(BaseModel):
|
||||
manager_llm: The language model that will run manager agent.
|
||||
manager_agent: Custom agent that will be used as manager.
|
||||
memory: Whether the crew should use memory to store memories of its execution.
|
||||
memory_config: Configuration for the memory to be used for the crew.
|
||||
cache: Whether the crew should use a cache to store the results of the tools execution.
|
||||
function_calling_llm: The language model that will run the tool calling for all the agents.
|
||||
process: The process flow that the crew will follow (e.g., sequential, hierarchical).
|
||||
@@ -94,6 +95,7 @@ class Crew(BaseModel):
|
||||
_short_term_memory: Optional[InstanceOf[ShortTermMemory]] = PrivateAttr()
|
||||
_long_term_memory: Optional[InstanceOf[LongTermMemory]] = PrivateAttr()
|
||||
_entity_memory: Optional[InstanceOf[EntityMemory]] = PrivateAttr()
|
||||
_user_memory: Optional[InstanceOf[UserMemory]] = PrivateAttr()
|
||||
_train: Optional[bool] = PrivateAttr(default=False)
|
||||
_train_iteration: Optional[int] = PrivateAttr()
|
||||
_inputs: Optional[Dict[str, Any]] = PrivateAttr(default=None)
|
||||
@@ -114,6 +116,10 @@ class Crew(BaseModel):
|
||||
default=False,
|
||||
description="Whether the crew should use memory to store memories of it's execution",
|
||||
)
|
||||
memory_config: Optional[Dict[str, Any]] = Field(
|
||||
default=None,
|
||||
description="Configuration for the memory to be used for the crew.",
|
||||
)
|
||||
short_term_memory: Optional[InstanceOf[ShortTermMemory]] = Field(
|
||||
default=None,
|
||||
description="An Instance of the ShortTermMemory to be used by the Crew",
|
||||
@@ -126,7 +132,11 @@ class Crew(BaseModel):
|
||||
default=None,
|
||||
description="An Instance of the EntityMemory to be used by the Crew",
|
||||
)
|
||||
embedder: Optional[Any] = Field(
|
||||
user_memory: Optional[InstanceOf[UserMemory]] = Field(
|
||||
default=None,
|
||||
description="An instance of the UserMemory to be used by the Crew to store/fetch memories of a specific user.",
|
||||
)
|
||||
embedder: Optional[dict] = Field(
|
||||
default=None,
|
||||
description="Configuration for the embedder to be used for the crew.",
|
||||
)
|
||||
@@ -154,6 +164,16 @@ class Crew(BaseModel):
|
||||
default=None,
|
||||
description="Callback to be executed after each task for all agents execution.",
|
||||
)
|
||||
before_kickoff_callbacks: List[
|
||||
Callable[[Optional[Dict[str, Any]]], Optional[Dict[str, Any]]]
|
||||
] = Field(
|
||||
default_factory=list,
|
||||
description="List of callbacks to be executed before crew kickoff. It may be used to adjust inputs before the crew is executed.",
|
||||
)
|
||||
after_kickoff_callbacks: List[Callable[[CrewOutput], CrewOutput]] = Field(
|
||||
default_factory=list,
|
||||
description="List of callbacks to be executed after crew kickoff. It may be used to adjust the output of the crew.",
|
||||
)
|
||||
max_rpm: Optional[int] = Field(
|
||||
default=None,
|
||||
description="Maximum number of requests per minute for the crew execution to be respected.",
|
||||
@@ -182,6 +202,10 @@ class Crew(BaseModel):
|
||||
default=[],
|
||||
description="List of execution logs for tasks",
|
||||
)
|
||||
knowledge: Optional[Dict[str, Any]] = Field(
|
||||
default=None, description="Knowledge for the crew. Add knowledge sources to the knowledge object."
|
||||
)
|
||||
|
||||
|
||||
@field_validator("id", mode="before")
|
||||
@classmethod
|
||||
@@ -238,13 +262,31 @@ class Crew(BaseModel):
|
||||
self._short_term_memory = (
|
||||
self.short_term_memory
|
||||
if self.short_term_memory
|
||||
else ShortTermMemory(crew=self, embedder_config=self.embedder)
|
||||
else ShortTermMemory(
|
||||
crew=self,
|
||||
embedder_config=self.embedder,
|
||||
)
|
||||
)
|
||||
self._entity_memory = (
|
||||
self.entity_memory
|
||||
if self.entity_memory
|
||||
else EntityMemory(crew=self, embedder_config=self.embedder)
|
||||
)
|
||||
if hasattr(self, "memory_config") and self.memory_config is not None:
|
||||
self._user_memory = (
|
||||
self.user_memory if self.user_memory else UserMemory(crew=self)
|
||||
)
|
||||
else:
|
||||
self._user_memory = None
|
||||
return self
|
||||
|
||||
@model_validator(mode="after")
|
||||
def create_crew_knowledge(self) -> "Crew":
|
||||
if self.knowledge:
|
||||
try:
|
||||
self.knowledge = Knowledge(**self.knowledge) if isinstance(self.knowledge, dict) else self.knowledge
|
||||
except (TypeError, ValueError) as e:
|
||||
raise ValueError(f"Invalid knowledge configuration: {str(e)}")
|
||||
return self
|
||||
|
||||
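For context, a minimal sketch of how this validator might be exercised: the knowledge field accepts a plain dictionary of Knowledge constructor arguments, which is converted to a Knowledge instance here. The source class comes from this changeset; the content string and the commented-out crew wiring are illustrative.

from crewai.knowledge.source.string_knowledge_source import StringKnowledgeSource

# Dict form; an already-constructed Knowledge instance is passed through unchanged.
knowledge_config = {
    "sources": [
        StringKnowledgeSource(content="Our refund policy allows returns within 30 days.")
    ],
}
# crew = Crew(agents=[researcher], tasks=[report_task], knowledge=knowledge_config)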
@model_validator(mode="after")
|
||||
@@ -445,18 +487,22 @@ class Crew(BaseModel):
|
||||
training_data = CrewTrainingHandler(TRAINING_DATA_FILE).load()
|
||||
|
||||
for agent in train_crew.agents:
|
||||
result = TaskEvaluator(agent).evaluate_training_data(
|
||||
training_data=training_data, agent_id=str(agent.id)
|
||||
)
|
||||
if training_data.get(str(agent.id)):
|
||||
result = TaskEvaluator(agent).evaluate_training_data(
|
||||
training_data=training_data, agent_id=str(agent.id)
|
||||
)
|
||||
|
||||
CrewTrainingHandler(filename).save_trained_data(
|
||||
agent_id=str(agent.role), trained_data=result.model_dump()
|
||||
)
|
||||
CrewTrainingHandler(filename).save_trained_data(
|
||||
agent_id=str(agent.role), trained_data=result.model_dump()
|
||||
)
|
||||
|
||||
def kickoff(
|
||||
self,
|
||||
inputs: Optional[Dict[str, Any]] = None,
|
||||
) -> CrewOutput:
"""Starts the crew to work on its assigned tasks."""
for before_callback in self.before_kickoff_callbacks:
inputs = before_callback(inputs)
|
||||
self._execution_span = self._telemetry.crew_execution_span(self, inputs)
|
||||
self._task_output_handler.reset()
|
||||
@@ -499,6 +545,9 @@ class Crew(BaseModel):
|
||||
f"The process '{self.process}' is not implemented yet."
|
||||
)
|
||||
|
||||
for after_callback in self.after_kickoff_callbacks:
|
||||
result = after_callback(result)
|
||||
|
||||
metrics += [agent._token_process.get_summary() for agent in self.agents]
|
||||
|
||||
self.usage_metrics = UsageMetrics()
|
||||
|
||||
@@ -131,7 +131,6 @@ class FlowMeta(type):
|
||||
condition_type = getattr(attr_value, "__condition_type__", "OR")
|
||||
listeners[attr_name] = (condition_type, methods)
|
||||
|
||||
# TODO: should we add a check for __condition_type__ 'AND'?
|
||||
elif hasattr(attr_value, "__is_router__"):
|
||||
routers[attr_value.__router_for__] = attr_name
|
||||
possible_returns = get_possible_return_constants(attr_value)
|
||||
@@ -171,8 +170,7 @@ class Flow(Generic[T], metaclass=FlowMeta):
|
||||
def __init__(self) -> None:
|
||||
self._methods: Dict[str, Callable] = {}
|
||||
self._state: T = self._create_initial_state()
|
||||
self._executed_methods: Set[str] = set()
|
||||
self._scheduled_tasks: Set[str] = set()
|
||||
self._method_execution_counts: Dict[str, int] = {}
|
||||
self._pending_and_listeners: Dict[str, Set[str]] = {}
|
||||
self._method_outputs: List[Any] = [] # List to store all method outputs
|
||||
|
||||
@@ -309,7 +307,10 @@ class Flow(Generic[T], metaclass=FlowMeta):
|
||||
)
|
||||
self._method_outputs.append(result) # Store the output
|
||||
|
||||
self._executed_methods.add(method_name)
|
||||
# Track method execution counts
|
||||
self._method_execution_counts[method_name] = (
|
||||
self._method_execution_counts.get(method_name, 0) + 1
|
||||
)
|
||||
|
||||
return result
|
||||
|
||||
@@ -319,35 +320,34 @@ class Flow(Generic[T], metaclass=FlowMeta):
|
||||
if trigger_method in self._routers:
|
||||
router_method = self._methods[self._routers[trigger_method]]
|
||||
path = await self._execute_method(
|
||||
trigger_method, router_method
|
||||
) # TODO: Change or not?
|
||||
# Use the path as the new trigger method
|
||||
self._routers[trigger_method], router_method
|
||||
)
|
||||
trigger_method = path
|
||||
|
||||
for listener_name, (condition_type, methods) in self._listeners.items():
|
||||
if condition_type == "OR":
|
||||
if trigger_method in methods:
|
||||
if (
|
||||
listener_name not in self._executed_methods
|
||||
and listener_name not in self._scheduled_tasks
|
||||
):
|
||||
self._scheduled_tasks.add(listener_name)
|
||||
listener_tasks.append(
|
||||
self._execute_single_listener(listener_name, result)
|
||||
)
|
||||
# Schedule the listener without preventing re-execution
|
||||
listener_tasks.append(
|
||||
self._execute_single_listener(listener_name, result)
|
||||
)
|
||||
elif condition_type == "AND":
|
||||
if all(method in self._executed_methods for method in methods):
|
||||
if (
|
||||
listener_name not in self._executed_methods
|
||||
and listener_name not in self._scheduled_tasks
|
||||
):
|
||||
self._scheduled_tasks.add(listener_name)
|
||||
listener_tasks.append(
|
||||
self._execute_single_listener(listener_name, result)
|
||||
)
|
||||
# Initialize pending methods for this listener if not already done
|
||||
if listener_name not in self._pending_and_listeners:
|
||||
self._pending_and_listeners[listener_name] = set(methods)
|
||||
# Remove the trigger method from pending methods
|
||||
self._pending_and_listeners[listener_name].discard(trigger_method)
|
||||
if not self._pending_and_listeners[listener_name]:
|
||||
# All required methods have been executed
|
||||
listener_tasks.append(
|
||||
self._execute_single_listener(listener_name, result)
|
||||
)
|
||||
# Reset pending methods for this listener
|
||||
self._pending_and_listeners.pop(listener_name, None)
|
||||
|
||||
# Run all listener tasks concurrently and wait for them to complete
|
||||
await asyncio.gather(*listener_tasks)
|
||||
if listener_tasks:
|
||||
await asyncio.gather(*listener_tasks)
|
||||
|
||||
async def _execute_single_listener(self, listener_name: str, result: Any) -> None:
|
||||
try:
|
||||
@@ -367,9 +367,6 @@ class Flow(Generic[T], metaclass=FlowMeta):
|
||||
# If listener does not expect parameters, call without arguments
|
||||
listener_result = await self._execute_method(listener_name, method)
|
||||
|
||||
# Remove from scheduled tasks after execution
|
||||
self._scheduled_tasks.discard(listener_name)
|
||||
|
||||
# Execute listeners of this listener
|
||||
await self._execute_listeners(listener_name, listener_result)
|
||||
except Exception as e:
|
||||
|
||||
0   src/crewai/knowledge/__init__.py   Normal file
0   src/crewai/knowledge/embedder/__init__.py   Normal file
55  src/crewai/knowledge/embedder/base_embedder.py   Normal file
@@ -0,0 +1,55 @@
from abc import ABC, abstractmethod
from typing import List

import numpy as np


class BaseEmbedder(ABC):
    """
    Abstract base class for text embedding models
    """

    @abstractmethod
    def embed_chunks(self, chunks: List[str]) -> np.ndarray:
        """
        Generate embeddings for a list of text chunks

        Args:
            chunks: List of text chunks to embed

        Returns:
            Array of embeddings
        """
        pass

    @abstractmethod
    def embed_texts(self, texts: List[str]) -> np.ndarray:
        """
        Generate embeddings for a list of texts

        Args:
            texts: List of texts to embed

        Returns:
            Array of embeddings
        """
        pass

    @abstractmethod
    def embed_text(self, text: str) -> np.ndarray:
        """
        Generate embedding for a single text

        Args:
            text: Text to embed

        Returns:
            Embedding array
        """
        pass

    @property
    @abstractmethod
    def dimension(self) -> int:
        """Get the dimension of the embeddings"""
        pass
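As a quick illustration of the contract defined above, a toy subclass might look like the sketch below. It is not part of this changeset; the hash-seeded vectors are purely illustrative, and FastEmbed (next file) is the real implementation.

from typing import List

import numpy as np

from crewai.knowledge.embedder.base_embedder import BaseEmbedder


class ToyEmbedder(BaseEmbedder):
    """Illustrative embedder mapping texts to fixed-size pseudo-random vectors."""

    def __init__(self, dim: int = 8):
        self._dim = dim

    def embed_text(self, text: str) -> np.ndarray:
        # Seed the RNG with the text hash so identical texts map to identical vectors.
        rng = np.random.default_rng(abs(hash(text)) % (2**32))
        return rng.random(self._dim)

    def embed_texts(self, texts: List[str]) -> np.ndarray:
        return np.stack([self.embed_text(t) for t in texts])

    def embed_chunks(self, chunks: List[str]) -> np.ndarray:
        return self.embed_texts(chunks)

    @property
    def dimension(self) -> int:
        return self._dim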
93
src/crewai/knowledge/embedder/fastembed.py
Normal file
@@ -0,0 +1,93 @@
|
||||
from pathlib import Path
|
||||
from typing import List, Optional, Union
|
||||
|
||||
import numpy as np
|
||||
|
||||
from .base_embedder import BaseEmbedder
|
||||
|
||||
try:
|
||||
from fastembed_gpu import TextEmbedding # type: ignore
|
||||
|
||||
FASTEMBED_AVAILABLE = True
|
||||
except ImportError:
|
||||
try:
|
||||
from fastembed import TextEmbedding
|
||||
|
||||
FASTEMBED_AVAILABLE = True
|
||||
except ImportError:
|
||||
FASTEMBED_AVAILABLE = False
|
||||
|
||||
|
||||
class FastEmbed(BaseEmbedder):
|
||||
"""
|
||||
A wrapper class for text embedding models using FastEmbed
|
||||
"""
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
model_name: str = "BAAI/bge-small-en-v1.5",
|
||||
cache_dir: Optional[Union[str, Path]] = None,
|
||||
):
|
||||
"""
|
||||
Initialize the embedding model
|
||||
|
||||
Args:
|
||||
model_name: Name of the model to use
|
||||
cache_dir: Directory to cache the model
|
||||
"""
|
||||
if not FASTEMBED_AVAILABLE:
|
||||
raise ImportError(
|
||||
"FastEmbed is not installed. Please install it with: "
|
||||
"uv pip install fastembed or uv pip install fastembed-gpu for GPU support"
|
||||
)
|
||||
|
||||
self.model = TextEmbedding(
|
||||
model_name=model_name,
|
||||
cache_dir=str(cache_dir) if cache_dir else None,
|
||||
)
|
||||
|
||||
def embed_chunks(self, chunks: List[str]) -> List[np.ndarray]:
|
||||
"""
|
||||
Generate embeddings for a list of text chunks
|
||||
|
||||
Args:
|
||||
chunks: List of text chunks to embed
|
||||
|
||||
Returns:
|
||||
List of embeddings
|
||||
"""
|
||||
embeddings = list(self.model.embed(chunks))
|
||||
return embeddings
|
||||
|
||||
def embed_texts(self, texts: List[str]) -> List[np.ndarray]:
|
||||
"""
|
||||
Generate embeddings for a list of texts
|
||||
|
||||
Args:
|
||||
texts: List of texts to embed
|
||||
|
||||
Returns:
|
||||
List of embeddings
|
||||
"""
|
||||
embeddings = list(self.model.embed(texts))
|
||||
return embeddings
|
||||
|
||||
def embed_text(self, text: str) -> np.ndarray:
|
||||
"""
|
||||
Generate embedding for a single text
|
||||
|
||||
Args:
|
||||
text: Text to embed
|
||||
|
||||
Returns:
|
||||
Embedding array
|
||||
"""
|
||||
return self.embed_texts([text])[0]
|
||||
|
||||
@property
|
||||
def dimension(self) -> int:
|
||||
"""Get the dimension of the embeddings"""
|
||||
# Generate a test embedding to get dimensions
|
||||
test_embed = self.embed_text("test")
|
||||
return len(test_embed)
|
||||
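A short usage sketch for the wrapper above, assuming the optional fastembed package is installed; the model is downloaded on first use.

from crewai.knowledge.embedder.fastembed import FastEmbed

embedder = FastEmbed(model_name="BAAI/bge-small-en-v1.5")  # the default model
vectors = embedder.embed_texts(["CrewAI agents", "vector search"])
print(len(vectors))        # 2 embeddings
print(embedder.dimension)  # embedding size measured via a test embedding (384 for this model)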
54  src/crewai/knowledge/knowledge.py   Normal file
@@ -0,0 +1,54 @@
import os
from typing import Any, Dict, List, Optional

from pydantic import BaseModel, ConfigDict, Field

from crewai.knowledge.source.base_knowledge_source import BaseKnowledgeSource
from crewai.knowledge.storage.knowledge_storage import KnowledgeStorage
from crewai.utilities.constants import DEFAULT_SCORE_THRESHOLD
from crewai.utilities.logger import Logger

os.environ["TOKENIZERS_PARALLELISM"] = "false"  # removes logging from fastembed


class Knowledge(BaseModel):
    """
    Knowledge is a collection of sources plus the vector-store setup used to save and query relevant context.

    Attributes:
        sources: Knowledge sources to ingest.
        storage: Vector store used to persist and search the chunks.
        embedder_config: Optional embedder configuration forwarded to the storage.
    """

    model_config = ConfigDict(arbitrary_types_allowed=True)

    sources: List[BaseKnowledgeSource] = Field(default_factory=list)
    storage: KnowledgeStorage = Field(default_factory=KnowledgeStorage)
    embedder_config: Optional[Dict[str, Any]] = None

    def __init__(self, embedder_config: Optional[Dict[str, Any]] = None, **data):
        super().__init__(**data)
        self.storage = KnowledgeStorage(embedder_config=embedder_config or None)

        try:
            for source in self.sources:
                source.add()
        except Exception as e:
            Logger(verbose=True).log(
                "warning",
                f"Failed to init knowledge: {e}",
                color="yellow",
            )

    def query(
        self, query: List[str], limit: int = 3, preference: Optional[str] = None
    ) -> List[Dict[str, Any]]:
        """
        Query across all knowledge sources to find the most relevant information.
        Returns the `limit` most relevant chunks.
        """
        results = self.storage.search(
            query,
            limit,
            filter={"preference": preference} if preference else None,
            score_threshold=DEFAULT_SCORE_THRESHOLD,
        )
        return results
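A minimal direct-use sketch of the class above, assuming the underlying storage has a working embedder (by default the OpenAI embedding function, so OPENAI_API_KEY must be set). Query results are the stored chunks ranked by similarity.

from crewai.knowledge.knowledge import Knowledge
from crewai.knowledge.source.string_knowledge_source import StringKnowledgeSource

knowledge = Knowledge(
    sources=[StringKnowledgeSource(content="The support desk is open 9am-5pm UTC.")]
)
hits = knowledge.query(["When is support available?"], limit=3)
for hit in hits:
    print(hit["score"], hit["context"])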
0   src/crewai/knowledge/source/__init__.py   Normal file
36  src/crewai/knowledge/source/base_file_knowledge_source.py   Normal file
@@ -0,0 +1,36 @@
from pathlib import Path
from typing import Any, Dict, List, Union

from pydantic import Field

from crewai.knowledge.source.base_knowledge_source import BaseKnowledgeSource
from crewai.knowledge.storage.knowledge_storage import KnowledgeStorage


class BaseFileKnowledgeSource(BaseKnowledgeSource):
    """Base class for knowledge sources that load content from files."""

    file_path: Union[Path, List[Path]] = Field(...)
    content: Dict[Path, str] = Field(init=False, default_factory=dict)
    storage: KnowledgeStorage = Field(default_factory=KnowledgeStorage)

    def model_post_init(self, _):
        """Post-initialization method to load content."""
        self.content = self.load_content()

    def load_content(self) -> Dict[Path, str]:
        """Load and preprocess file content. Should be overridden by subclasses."""
        paths = [self.file_path] if isinstance(self.file_path, Path) else self.file_path

        for path in paths:
            if not path.exists():
                raise FileNotFoundError(f"File not found: {path}")
            if not path.is_file():
                raise ValueError(f"Path is not a file: {path}")
        return {}

    def save_documents(self, metadata: Dict[str, Any]):
        """Save the documents to the storage."""
        chunk_metadatas = [metadata.copy() for _ in self.chunks]
        self.storage.save(self.chunks, chunk_metadatas)
48  src/crewai/knowledge/source/base_knowledge_source.py   Normal file
@@ -0,0 +1,48 @@
from abc import ABC, abstractmethod
from typing import Any, Dict, List

import numpy as np
from pydantic import BaseModel, ConfigDict, Field

from crewai.knowledge.storage.knowledge_storage import KnowledgeStorage


class BaseKnowledgeSource(BaseModel, ABC):
    """Abstract base class for knowledge sources."""

    chunk_size: int = 4000
    chunk_overlap: int = 200
    chunks: List[str] = Field(default_factory=list)
    chunk_embeddings: List[np.ndarray] = Field(default_factory=list)

    model_config = ConfigDict(arbitrary_types_allowed=True)
    storage: KnowledgeStorage = Field(default_factory=KnowledgeStorage)
    metadata: Dict[str, Any] = Field(default_factory=dict)

    @abstractmethod
    def load_content(self) -> Dict[Any, str]:
        """Load and preprocess content from the source."""
        pass

    @abstractmethod
    def add(self) -> None:
        """Process content, chunk it, compute embeddings, and save them."""
        pass

    def get_embeddings(self) -> List[np.ndarray]:
        """Return the list of embeddings for the chunks."""
        return self.chunk_embeddings

    def _chunk_text(self, text: str) -> List[str]:
        """Utility method to split text into chunks."""
        return [
            text[i : i + self.chunk_size]
            for i in range(0, len(text), self.chunk_size - self.chunk_overlap)
        ]

    def save_documents(self, metadata: Dict[str, Any]):
        """
        Save the documents to the storage.
        This method should be called after the chunks and embeddings are generated.
        """
        self.storage.save(self.chunks, metadata)
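To illustrate the base class, a small in-memory source might look like the sketch below (hypothetical, not part of this changeset). Note that _chunk_text walks the text with a stride of chunk_size - chunk_overlap characters, i.e. 4000 - 200 = 3800 by default, so consecutive chunks share a 200-character overlap.

from typing import Any, Dict, List

from pydantic import Field

from crewai.knowledge.source.base_knowledge_source import BaseKnowledgeSource


class ListKnowledgeSource(BaseKnowledgeSource):
    """Illustrative source backed by an in-memory list of strings."""

    texts: List[str] = Field(default_factory=list)

    def load_content(self) -> Dict[Any, str]:
        return {i: text for i, text in enumerate(self.texts)}

    def add(self) -> None:
        for _, text in self.load_content().items():
            self.chunks.extend(self._chunk_text(text))
        self.save_documents(metadata=self.metadata)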
44
src/crewai/knowledge/source/csv_knowledge_source.py
Normal file
@@ -0,0 +1,44 @@
|
||||
import csv
|
||||
from typing import Dict, List
|
||||
from pathlib import Path
|
||||
|
||||
from crewai.knowledge.source.base_file_knowledge_source import BaseFileKnowledgeSource
|
||||
|
||||
|
||||
class CSVKnowledgeSource(BaseFileKnowledgeSource):
|
||||
"""A knowledge source that stores and queries CSV file content using embeddings."""
|
||||
|
||||
def load_content(self) -> Dict[Path, str]:
|
||||
"""Load and preprocess CSV file content."""
|
||||
super().load_content() # Validate the file path
|
||||
|
||||
file_path = (
|
||||
self.file_path[0] if isinstance(self.file_path, list) else self.file_path
|
||||
)
|
||||
file_path = Path(file_path) if isinstance(file_path, str) else file_path
|
||||
|
||||
with open(file_path, "r", encoding="utf-8") as csvfile:
|
||||
reader = csv.reader(csvfile)
|
||||
content = ""
|
||||
for row in reader:
|
||||
content += " ".join(row) + "\n"
|
||||
return {file_path: content}
|
||||
|
||||
def add(self) -> None:
|
||||
"""
|
||||
Add CSV file content to the knowledge source, chunk it, compute embeddings,
|
||||
and save the embeddings.
|
||||
"""
|
||||
content_str = (
|
||||
str(self.content) if isinstance(self.content, dict) else self.content
|
||||
)
|
||||
new_chunks = self._chunk_text(content_str)
|
||||
self.chunks.extend(new_chunks)
|
||||
self.save_documents(metadata=self.metadata)
|
||||
|
||||
def _chunk_text(self, text: str) -> List[str]:
|
||||
"""Utility method to split text into chunks."""
|
||||
return [
|
||||
text[i : i + self.chunk_size]
|
||||
for i in range(0, len(text), self.chunk_size - self.chunk_overlap)
|
||||
]
|
||||
56
src/crewai/knowledge/source/excel_knowledge_source.py
Normal file
@@ -0,0 +1,56 @@
|
||||
from typing import Dict, List
|
||||
from pathlib import Path
|
||||
from crewai.knowledge.source.base_file_knowledge_source import BaseFileKnowledgeSource
|
||||
|
||||
|
||||
class ExcelKnowledgeSource(BaseFileKnowledgeSource):
|
||||
"""A knowledge source that stores and queries Excel file content using embeddings."""
|
||||
|
||||
def load_content(self) -> Dict[Path, str]:
|
||||
"""Load and preprocess Excel file content."""
|
||||
super().load_content() # Validate the file path
|
||||
pd = self._import_dependencies()
|
||||
|
||||
if isinstance(self.file_path, list):
|
||||
file_path = self.file_path[0]
|
||||
else:
|
||||
file_path = self.file_path
|
||||
|
||||
df = pd.read_excel(file_path)
|
||||
content = df.to_csv(index=False)
|
||||
return {file_path: content}
|
||||
|
||||
def _import_dependencies(self):
|
||||
"""Dynamically import dependencies."""
|
||||
try:
|
||||
import openpyxl # noqa
|
||||
import pandas as pd
|
||||
|
||||
return pd
|
||||
except ImportError as e:
|
||||
missing_package = str(e).split()[-1]
|
||||
raise ImportError(
|
||||
f"{missing_package} is not installed. Please install it with: pip install {missing_package}"
|
||||
)
|
||||
|
||||
def add(self) -> None:
|
||||
"""
|
||||
Add Excel file content to the knowledge source, chunk it, compute embeddings,
|
||||
and save the embeddings.
|
||||
"""
|
||||
# Convert dictionary values to a single string if content is a dictionary
|
||||
if isinstance(self.content, dict):
|
||||
content_str = "\n".join(str(value) for value in self.content.values())
|
||||
else:
|
||||
content_str = str(self.content)
|
||||
|
||||
new_chunks = self._chunk_text(content_str)
|
||||
self.chunks.extend(new_chunks)
|
||||
self.save_documents(metadata=self.metadata)
|
||||
|
||||
def _chunk_text(self, text: str) -> List[str]:
|
||||
"""Utility method to split text into chunks."""
|
||||
return [
|
||||
text[i : i + self.chunk_size]
|
||||
for i in range(0, len(text), self.chunk_size - self.chunk_overlap)
|
||||
]
|
||||
54
src/crewai/knowledge/source/json_knowledge_source.py
Normal file
@@ -0,0 +1,54 @@
|
||||
import json
|
||||
from typing import Any, Dict, List
|
||||
from pathlib import Path
|
||||
|
||||
from crewai.knowledge.source.base_file_knowledge_source import BaseFileKnowledgeSource
|
||||
|
||||
|
||||
class JSONKnowledgeSource(BaseFileKnowledgeSource):
|
||||
"""A knowledge source that stores and queries JSON file content using embeddings."""
|
||||
|
||||
def load_content(self) -> Dict[Path, str]:
|
||||
"""Load and preprocess JSON file content."""
|
||||
super().load_content() # Validate the file path
|
||||
paths = [self.file_path] if isinstance(self.file_path, Path) else self.file_path
|
||||
|
||||
content: Dict[Path, str] = {}
|
||||
for path in paths:
|
||||
with open(path, "r", encoding="utf-8") as json_file:
|
||||
data = json.load(json_file)
|
||||
content[path] = self._json_to_text(data)
|
||||
return content
|
||||
|
||||
def _json_to_text(self, data: Any, level: int = 0) -> str:
|
||||
"""Recursively convert JSON data to a text representation."""
|
||||
text = ""
|
||||
indent = " " * level
|
||||
if isinstance(data, dict):
|
||||
for key, value in data.items():
|
||||
text += f"{indent}{key}: {self._json_to_text(value, level + 1)}\n"
|
||||
elif isinstance(data, list):
|
||||
for item in data:
|
||||
text += f"{indent}- {self._json_to_text(item, level + 1)}\n"
|
||||
else:
|
||||
text += f"{str(data)}"
|
||||
return text
|
||||
|
||||
def add(self) -> None:
|
||||
"""
|
||||
Add JSON file content to the knowledge source, chunk it, compute embeddings,
|
||||
and save the embeddings.
|
||||
"""
|
||||
content_str = (
|
||||
str(self.content) if isinstance(self.content, dict) else self.content
|
||||
)
|
||||
new_chunks = self._chunk_text(content_str)
|
||||
self.chunks.extend(new_chunks)
|
||||
self.save_documents(metadata=self.metadata)
|
||||
|
||||
def _chunk_text(self, text: str) -> List[str]:
|
||||
"""Utility method to split text into chunks."""
|
||||
return [
|
||||
text[i : i + self.chunk_size]
|
||||
for i in range(0, len(text), self.chunk_size - self.chunk_overlap)
|
||||
]
|
||||
54
src/crewai/knowledge/source/pdf_knowledge_source.py
Normal file
@@ -0,0 +1,54 @@
|
||||
from typing import List, Dict
|
||||
from pathlib import Path
|
||||
|
||||
from crewai.knowledge.source.base_file_knowledge_source import BaseFileKnowledgeSource
|
||||
|
||||
|
||||
class PDFKnowledgeSource(BaseFileKnowledgeSource):
|
||||
"""A knowledge source that stores and queries PDF file content using embeddings."""
|
||||
|
||||
def load_content(self) -> Dict[Path, str]:
|
||||
"""Load and preprocess PDF file content."""
|
||||
super().load_content() # Validate the file paths
|
||||
pdfplumber = self._import_pdfplumber()
|
||||
|
||||
paths = [self.file_path] if isinstance(self.file_path, Path) else self.file_path
|
||||
content = {}
|
||||
|
||||
for path in paths:
|
||||
text = ""
|
||||
with pdfplumber.open(path) as pdf:
|
||||
for page in pdf.pages:
|
||||
page_text = page.extract_text()
|
||||
if page_text:
|
||||
text += page_text + "\n"
|
||||
content[path] = text
|
||||
return content
|
||||
|
||||
def _import_pdfplumber(self):
|
||||
"""Dynamically import pdfplumber."""
|
||||
try:
|
||||
import pdfplumber
|
||||
|
||||
return pdfplumber
|
||||
except ImportError:
|
||||
raise ImportError(
|
||||
"pdfplumber is not installed. Please install it with: pip install pdfplumber"
|
||||
)
|
||||
|
||||
def add(self) -> None:
|
||||
"""
|
||||
Add PDF file content to the knowledge source, chunk it, compute embeddings,
|
||||
and save the embeddings.
|
||||
"""
|
||||
for _, text in self.content.items():
|
||||
new_chunks = self._chunk_text(text)
|
||||
self.chunks.extend(new_chunks)
|
||||
self.save_documents(metadata=self.metadata)
|
||||
|
||||
def _chunk_text(self, text: str) -> List[str]:
|
||||
"""Utility method to split text into chunks."""
|
||||
return [
|
||||
text[i : i + self.chunk_size]
|
||||
for i in range(0, len(text), self.chunk_size - self.chunk_overlap)
|
||||
]
|
||||
33
src/crewai/knowledge/source/string_knowledge_source.py
Normal file
@@ -0,0 +1,33 @@
|
||||
from typing import List
|
||||
|
||||
from pydantic import Field
|
||||
|
||||
from crewai.knowledge.source.base_knowledge_source import BaseKnowledgeSource
|
||||
|
||||
|
||||
class StringKnowledgeSource(BaseKnowledgeSource):
|
||||
"""A knowledge source that stores and queries plain text content using embeddings."""
|
||||
|
||||
content: str = Field(...)
|
||||
|
||||
def model_post_init(self, _):
|
||||
"""Post-initialization method to validate content."""
|
||||
self.load_content()
|
||||
|
||||
def load_content(self):
|
||||
"""Validate string content."""
|
||||
if not isinstance(self.content, str):
|
||||
raise ValueError("StringKnowledgeSource only accepts string content")
|
||||
|
||||
def add(self) -> None:
|
||||
"""Add string content to the knowledge source, chunk it, compute embeddings, and save them."""
|
||||
new_chunks = self._chunk_text(self.content)
|
||||
self.chunks.extend(new_chunks)
|
||||
self.save_documents(metadata=self.metadata)
|
||||
|
||||
def _chunk_text(self, text: str) -> List[str]:
|
||||
"""Utility method to split text into chunks."""
|
||||
return [
|
||||
text[i : i + self.chunk_size]
|
||||
for i in range(0, len(text), self.chunk_size - self.chunk_overlap)
|
||||
]
|
||||
35
src/crewai/knowledge/source/text_file_knowledge_source.py
Normal file
@@ -0,0 +1,35 @@
|
||||
from typing import Dict, List
|
||||
from pathlib import Path
|
||||
|
||||
from crewai.knowledge.source.base_file_knowledge_source import BaseFileKnowledgeSource
|
||||
|
||||
|
||||
class TextFileKnowledgeSource(BaseFileKnowledgeSource):
|
||||
"""A knowledge source that stores and queries text file content using embeddings."""
|
||||
|
||||
def load_content(self) -> Dict[Path, str]:
|
||||
"""Load and preprocess text file content."""
|
||||
super().load_content()
|
||||
paths = [self.file_path] if isinstance(self.file_path, Path) else self.file_path
|
||||
content = {}
|
||||
for path in paths:
|
||||
with path.open("r", encoding="utf-8") as f:
|
||||
content[path] = f.read() # type: ignore
|
||||
return content
|
||||
|
||||
def add(self) -> None:
|
||||
"""
|
||||
Add text file content to the knowledge source, chunk it, compute embeddings,
|
||||
and save the embeddings.
|
||||
"""
|
||||
for _, text in self.content.items():
|
||||
new_chunks = self._chunk_text(text)
|
||||
self.chunks.extend(new_chunks)
|
||||
self.save_documents(metadata=self.metadata)
|
||||
|
||||
def _chunk_text(self, text: str) -> List[str]:
|
||||
"""Utility method to split text into chunks."""
|
||||
return [
|
||||
text[i : i + self.chunk_size]
|
||||
for i in range(0, len(text), self.chunk_size - self.chunk_overlap)
|
||||
]
|
||||
0   src/crewai/knowledge/storage/__init__.py   Normal file
29  src/crewai/knowledge/storage/base_knowledge_storage.py   Normal file
@@ -0,0 +1,29 @@
from abc import ABC, abstractmethod
from typing import Any, Dict, List, Optional


class BaseKnowledgeStorage(ABC):
    """Abstract base class for knowledge storage implementations."""

    @abstractmethod
    def search(
        self,
        query: List[str],
        limit: int = 3,
        filter: Optional[dict] = None,
        score_threshold: float = 0.35,
    ) -> List[Dict[str, Any]]:
        """Search for documents in the knowledge base."""
        pass

    @abstractmethod
    def save(
        self, documents: List[str], metadata: Dict[str, Any] | List[Dict[str, Any]]
    ) -> None:
        """Save documents to the knowledge base."""
        pass

    @abstractmethod
    def reset(self) -> None:
        """Reset the knowledge base."""
        pass
132  src/crewai/knowledge/storage/knowledge_storage.py   Normal file
@@ -0,0 +1,132 @@
import contextlib
import hashlib
import io
import logging
import os
from typing import Any, Dict, List, Optional

import chromadb

from crewai.knowledge.storage.base_knowledge_storage import BaseKnowledgeStorage
from crewai.utilities import EmbeddingConfigurator
from crewai.utilities.paths import db_storage_path
|
||||
|
||||
@contextlib.contextmanager
|
||||
def suppress_logging(
|
||||
logger_name="chromadb.segment.impl.vector.local_persistent_hnsw",
|
||||
level=logging.ERROR,
|
||||
):
|
||||
logger = logging.getLogger(logger_name)
|
||||
original_level = logger.getEffectiveLevel()
|
||||
logger.setLevel(level)
|
||||
with (
|
||||
contextlib.redirect_stdout(io.StringIO()),
|
||||
contextlib.redirect_stderr(io.StringIO()),
|
||||
contextlib.suppress(UserWarning),
|
||||
):
|
||||
yield
|
||||
logger.setLevel(original_level)
|
||||
|
||||
|
||||
class KnowledgeStorage(BaseKnowledgeStorage):
|
||||
"""
|
||||
Extends Storage to handle embeddings for knowledge entries, improving
search efficiency.
|
||||
"""
|
||||
|
||||
collection: Optional[chromadb.Collection] = None
|
||||
|
||||
def __init__(self, embedder_config: Optional[Dict[str, Any]] = None):
|
||||
self._initialize_app(embedder_config or {})
|
||||
|
||||
def search(
|
||||
self,
|
||||
query: List[str],
|
||||
limit: int = 3,
|
||||
filter: Optional[dict] = None,
|
||||
score_threshold: float = 0.35,
|
||||
) -> List[Dict[str, Any]]:
|
||||
with suppress_logging():
|
||||
if self.collection:
|
||||
fetched = self.collection.query(
|
||||
query_texts=query,
|
||||
n_results=limit,
|
||||
where=filter,
|
||||
)
|
||||
results = []
|
||||
for i in range(len(fetched["ids"][0])): # type: ignore
|
||||
result = {
|
||||
"id": fetched["ids"][0][i], # type: ignore
|
||||
"metadata": fetched["metadatas"][0][i], # type: ignore
|
||||
"context": fetched["documents"][0][i], # type: ignore
|
||||
"score": fetched["distances"][0][i], # type: ignore
|
||||
}
|
||||
if result["score"] >= score_threshold: # type: ignore
|
||||
results.append(result)
|
||||
return results
|
||||
else:
|
||||
raise Exception("Collection not initialized")
|
||||
|
||||
def _initialize_app(self, embedder_config: Optional[Dict[str, Any]] = None):
|
||||
import chromadb
|
||||
from chromadb.config import Settings
|
||||
|
||||
self._set_embedder_config(embedder_config)
|
||||
|
||||
chroma_client = chromadb.PersistentClient(
|
||||
path=f"{db_storage_path()}/knowledge",
|
||||
settings=Settings(allow_reset=True),
|
||||
)
|
||||
|
||||
self.app = chroma_client
|
||||
|
||||
try:
|
||||
self.collection = self.app.get_or_create_collection(name="knowledge")
|
||||
except Exception:
|
||||
raise Exception("Failed to create or get collection")
|
||||
|
||||
def reset(self):
|
||||
if self.app:
|
||||
self.app.reset()
|
||||
|
||||
def save(
|
||||
self, documents: List[str], metadata: Dict[str, Any] | List[Dict[str, Any]]
|
||||
):
|
||||
if self.collection:
|
||||
metadatas = [metadata] if isinstance(metadata, dict) else metadata
|
||||
|
||||
ids = [
|
||||
hashlib.sha256(doc.encode("utf-8")).hexdigest() for doc in documents
|
||||
]
|
||||
|
||||
self.collection.upsert(
|
||||
documents=documents,
|
||||
metadatas=metadatas,
|
||||
ids=ids,
|
||||
)
|
||||
else:
|
||||
raise Exception("Collection not initialized")
|
||||
|
||||
def _create_default_embedding_function(self):
|
||||
from chromadb.utils.embedding_functions.openai_embedding_function import (
|
||||
OpenAIEmbeddingFunction,
|
||||
)
|
||||
|
||||
return OpenAIEmbeddingFunction(
|
||||
api_key=os.getenv("OPENAI_API_KEY"), model_name="text-embedding-3-small"
|
||||
)
|
||||
|
||||
def _set_embedder_config(
|
||||
self, embedder_config: Optional[Dict[str, Any]] = None
|
||||
) -> None:
|
||||
"""Set the embedding configuration for the knowledge storage.
|
||||
|
||||
Args:
|
||||
embedder_config (Optional[Dict[str, Any]]): Configuration dictionary for the embedder.
|
||||
If None or empty, defaults to the default embedding function.
|
||||
"""
|
||||
self.embedder_config = (
|
||||
EmbeddingConfigurator().configure_embedder(embedder_config)
|
||||
if embedder_config
|
||||
else self._create_default_embedding_function()
|
||||
)
|
||||
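For reference, a direct-use sketch of this storage class. It assumes OPENAI_API_KEY is set (for the default embedding function) and that ChromaDB can persist under db_storage_path()/knowledge; the document text and metadata are placeholders.

from crewai.knowledge.storage.knowledge_storage import KnowledgeStorage

storage = KnowledgeStorage()  # default OpenAI text-embedding-3-small embedder
storage.save(
    documents=["Invoices are issued on the first business day of each month."],
    metadata={"preference": "billing"},
)
results = storage.search(
    ["When are invoices issued?"], limit=3, filter={"preference": "billing"}
)
for result in results:
    print(result["score"], result["context"])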
@@ -1,7 +1,10 @@
|
||||
import io
|
||||
import logging
|
||||
import sys
|
||||
import warnings
|
||||
from contextlib import contextmanager
|
||||
from typing import Any, Dict, List, Optional, Union
|
||||
import logging
|
||||
import warnings
|
||||
|
||||
import litellm
|
||||
from litellm import get_supported_openai_params
|
||||
|
||||
@@ -9,9 +12,6 @@ from crewai.utilities.exceptions.context_window_exceeding_exception import (
|
||||
LLMContextLengthExceededException,
|
||||
)
|
||||
|
||||
import sys
|
||||
import io
|
||||
|
||||
|
||||
class FilteredStream(io.StringIO):
|
||||
def write(self, s):
|
||||
@@ -118,12 +118,12 @@ class LLM:
|
||||
|
||||
litellm.drop_params = True
|
||||
litellm.set_verbose = False
|
||||
litellm.callbacks = callbacks
|
||||
self.set_callbacks(callbacks)
|
||||
|
||||
def call(self, messages: List[Dict[str, str]], callbacks: List[Any] = []) -> str:
|
||||
with suppress_warnings():
|
||||
if callbacks and len(callbacks) > 0:
|
||||
litellm.callbacks = callbacks
|
||||
self.set_callbacks(callbacks)
|
||||
|
||||
try:
|
||||
params = {
|
||||
@@ -181,3 +181,15 @@ class LLM:
|
||||
def get_context_window_size(self) -> int:
|
||||
# Only using 75% of the context window size to avoid cutting the message in the middle
|
||||
return int(LLM_CONTEXT_WINDOW_SIZES.get(self.model, 8192) * 0.75)
|
||||
|
||||
def set_callbacks(self, callbacks: List[Any]):
|
||||
callback_types = [type(callback) for callback in callbacks]
|
||||
for callback in litellm.success_callback[:]:
|
||||
if type(callback) in callback_types:
|
||||
litellm.success_callback.remove(callback)
|
||||
|
||||
for callback in litellm._async_success_callback[:]:
|
||||
if type(callback) in callback_types:
|
||||
litellm._async_success_callback.remove(callback)
|
||||
|
||||
litellm.callbacks = callbacks
|
||||
|
||||
@@ -1,5 +1,6 @@
|
||||
from .entity.entity_memory import EntityMemory
|
||||
from .long_term.long_term_memory import LongTermMemory
|
||||
from .short_term.short_term_memory import ShortTermMemory
|
||||
from .user.user_memory import UserMemory
|
||||
|
||||
__all__ = ["EntityMemory", "LongTermMemory", "ShortTermMemory"]
|
||||
__all__ = ["UserMemory", "EntityMemory", "LongTermMemory", "ShortTermMemory"]
|
||||
|
||||
@@ -1,13 +1,25 @@
|
||||
from typing import Optional
|
||||
from typing import Optional, Dict, Any
|
||||
|
||||
from crewai.memory import EntityMemory, LongTermMemory, ShortTermMemory
|
||||
from crewai.memory import EntityMemory, LongTermMemory, ShortTermMemory, UserMemory
|
||||
|
||||
|
||||
class ContextualMemory:
|
||||
def __init__(self, stm: ShortTermMemory, ltm: LongTermMemory, em: EntityMemory):
|
||||
def __init__(
|
||||
self,
|
||||
memory_config: Optional[Dict[str, Any]],
|
||||
stm: ShortTermMemory,
|
||||
ltm: LongTermMemory,
|
||||
em: EntityMemory,
|
||||
um: UserMemory,
|
||||
):
|
||||
if memory_config is not None:
|
||||
self.memory_provider = memory_config.get("provider")
|
||||
else:
|
||||
self.memory_provider = None
|
||||
self.stm = stm
|
||||
self.ltm = ltm
|
||||
self.em = em
|
||||
self.um = um
|
||||
|
||||
def build_context_for_task(self, task, context) -> str:
|
||||
"""
|
||||
@@ -23,6 +35,8 @@ class ContextualMemory:
|
||||
context.append(self._fetch_ltm_context(task.description))
|
||||
context.append(self._fetch_stm_context(query))
|
||||
context.append(self._fetch_entity_context(query))
|
||||
if self.memory_provider == "mem0":
|
||||
context.append(self._fetch_user_context(query))
|
||||
return "\n".join(filter(None, context))
|
||||
|
||||
def _fetch_stm_context(self, query) -> str:
|
||||
@@ -32,9 +46,11 @@ class ContextualMemory:
|
||||
"""
|
||||
stm_results = self.stm.search(query)
|
||||
formatted_results = "\n".join(
|
||||
[f"- {result['context']}" for result in stm_results]
|
||||
[
|
||||
f"- {result['memory'] if self.memory_provider == 'mem0' else result['context']}"
|
||||
for result in stm_results
|
||||
]
|
||||
)
|
||||
print("formatted_results stm", formatted_results)
|
||||
return f"Recent Insights:\n{formatted_results}" if stm_results else ""
|
||||
|
||||
def _fetch_ltm_context(self, task) -> Optional[str]:
|
||||
@@ -54,8 +70,6 @@ class ContextualMemory:
|
||||
formatted_results = list(dict.fromkeys(formatted_results))
|
||||
formatted_results = "\n".join([f"- {result}" for result in formatted_results]) # type: ignore # Incompatible types in assignment (expression has type "str", variable has type "list[str]")
|
||||
|
||||
print("formatted_results ltm", formatted_results)
|
||||
|
||||
return f"Historical Data:\n{formatted_results}" if ltm_results else ""
|
||||
|
||||
def _fetch_entity_context(self, query) -> str:
|
||||
@@ -65,7 +79,26 @@ class ContextualMemory:
|
||||
"""
|
||||
em_results = self.em.search(query)
|
||||
formatted_results = "\n".join(
|
||||
[f"- {result['context']}" for result in em_results] # type: ignore # Invalid index type "str" for "str"; expected type "SupportsIndex | slice"
|
||||
[
|
||||
f"- {result['memory'] if self.memory_provider == 'mem0' else result['context']}"
|
||||
for result in em_results
|
||||
] # type: ignore # Invalid index type "str" for "str"; expected type "SupportsIndex | slice"
|
||||
)
|
||||
print("formatted_results em", formatted_results)
|
||||
return f"Entities:\n{formatted_results}" if em_results else ""
|
||||
|
||||
def _fetch_user_context(self, query: str) -> str:
|
||||
"""
|
||||
Fetches and formats relevant user information from User Memory.
|
||||
Args:
|
||||
query (str): The search query to find relevant user memories.
|
||||
Returns:
|
||||
str: Formatted user memories as bullet points, or an empty string if none found.
|
||||
"""
|
||||
user_memories = self.um.search(query)
|
||||
if not user_memories:
|
||||
return ""
|
||||
|
||||
formatted_memories = "\n".join(
|
||||
f"- {result['memory']}" for result in user_memories
|
||||
)
|
||||
return f"User memories/preferences:\n{formatted_memories}"
|
||||
|
||||
@@ -11,21 +11,43 @@ class EntityMemory(Memory):
|
||||
"""
|
||||
|
||||
def __init__(self, crew=None, embedder_config=None, storage=None):
|
||||
storage = (
|
||||
storage
|
||||
if storage
|
||||
else RAGStorage(
|
||||
type="entities",
|
||||
allow_reset=True,
|
||||
embedder_config=embedder_config,
|
||||
crew=crew,
|
||||
if hasattr(crew, "memory_config") and crew.memory_config is not None:
|
||||
self.memory_provider = crew.memory_config.get("provider")
|
||||
else:
|
||||
self.memory_provider = None
|
||||
|
||||
if self.memory_provider == "mem0":
|
||||
try:
|
||||
from crewai.memory.storage.mem0_storage import Mem0Storage
|
||||
except ImportError:
|
||||
raise ImportError(
|
||||
"Mem0 is not installed. Please install it with `pip install mem0ai`."
|
||||
)
|
||||
storage = Mem0Storage(type="entities", crew=crew)
|
||||
else:
|
||||
storage = (
|
||||
storage
|
||||
if storage
|
||||
else RAGStorage(
|
||||
type="entities",
|
||||
allow_reset=True,
|
||||
embedder_config=embedder_config,
|
||||
crew=crew,
|
||||
)
|
||||
)
|
||||
)
|
||||
super().__init__(storage)
|
||||
|
||||
def save(self, item: EntityMemoryItem) -> None: # type: ignore # BUG?: Signature of "save" incompatible with supertype "Memory"
|
||||
"""Saves an entity item into the SQLite storage."""
|
||||
data = f"{item.name}({item.type}): {item.description}"
|
||||
if self.memory_provider == "mem0":
|
||||
data = f"""
|
||||
Remember details about the following entity:
|
||||
Name: {item.name}
|
||||
Type: {item.type}
|
||||
Entity Description: {item.description}
|
||||
"""
|
||||
else:
|
||||
data = f"{item.name}({item.type}): {item.description}"
|
||||
super().save(data, item.metadata)
|
||||
|
||||
def reset(self) -> None:
|
||||
|
||||
@@ -23,5 +23,12 @@ class Memory:
|
||||
|
||||
self.storage.save(value, metadata)
|
||||
|
||||
def search(self, query: str) -> List[Dict[str, Any]]:
|
||||
return self.storage.search(query)
|
||||
def search(
|
||||
self,
|
||||
query: str,
|
||||
limit: int = 3,
|
||||
score_threshold: float = 0.35,
|
||||
) -> List[Any]:
|
||||
return self.storage.search(
|
||||
query=query, limit=limit, score_threshold=score_threshold
|
||||
)
|
||||
|
||||
@@ -14,13 +14,27 @@ class ShortTermMemory(Memory):
|
||||
"""
|
||||
|
||||
def __init__(self, crew=None, embedder_config=None, storage=None):
|
||||
storage = (
|
||||
storage
|
||||
if storage
|
||||
else RAGStorage(
|
||||
type="short_term", embedder_config=embedder_config, crew=crew
|
||||
if hasattr(crew, "memory_config") and crew.memory_config is not None:
|
||||
self.memory_provider = crew.memory_config.get("provider")
|
||||
else:
|
||||
self.memory_provider = None
|
||||
|
||||
if self.memory_provider == "mem0":
|
||||
try:
|
||||
from crewai.memory.storage.mem0_storage import Mem0Storage
|
||||
except ImportError:
|
||||
raise ImportError(
|
||||
"Mem0 is not installed. Please install it with `pip install mem0ai`."
|
||||
)
|
||||
storage = Mem0Storage(type="short_term", crew=crew)
|
||||
else:
|
||||
storage = (
|
||||
storage
|
||||
if storage
|
||||
else RAGStorage(
|
||||
type="short_term", embedder_config=embedder_config, crew=crew
|
||||
)
|
||||
)
|
||||
)
|
||||
super().__init__(storage)
|
||||
|
||||
def save(
|
||||
@@ -30,11 +44,20 @@ class ShortTermMemory(Memory):
|
||||
agent: Optional[str] = None,
|
||||
) -> None:
|
||||
item = ShortTermMemoryItem(data=value, metadata=metadata, agent=agent)
|
||||
if self.memory_provider == "mem0":
|
||||
item.data = f"Remember the following insights from Agent run: {item.data}"
|
||||
|
||||
super().save(value=item.data, metadata=item.metadata, agent=item.agent)
|
||||
|
||||
def search(self, query: str, score_threshold: float = 0.35):
|
||||
return self.storage.search(query=query, score_threshold=score_threshold) # type: ignore # BUG? The reference is to the parent class, but the parent class does not have this parameters
|
||||
def search(
|
||||
self,
|
||||
query: str,
|
||||
limit: int = 3,
|
||||
score_threshold: float = 0.35,
|
||||
):
|
||||
return self.storage.search(
|
||||
query=query, limit=limit, score_threshold=score_threshold
|
||||
) # type: ignore # BUG? The reference is to the parent class, but the parent class does not have this parameters
|
||||
|
||||
def reset(self) -> None:
|
||||
try:
|
||||
|
||||
@@ -7,8 +7,10 @@ class Storage:
|
||||
def save(self, value: Any, metadata: Dict[str, Any]) -> None:
|
||||
pass
|
||||
|
||||
def search(self, key: str) -> List[Dict[str, Any]]: # type: ignore
|
||||
pass
|
||||
def search(
|
||||
self, query: str, limit: int, score_threshold: float
|
||||
) -> Dict[str, Any] | List[Any]:
|
||||
return {}
|
||||
|
||||
def reset(self) -> None:
|
||||
pass
|
||||
|
||||
@@ -103,7 +103,7 @@ class KickoffTaskOutputsSQLiteStorage:
|
||||
else value
|
||||
)
|
||||
|
||||
query = f"UPDATE latest_kickoff_task_outputs SET {', '.join(fields)} WHERE task_index = ?"
|
||||
query = f"UPDATE latest_kickoff_task_outputs SET {', '.join(fields)} WHERE task_index = ?" # nosec
|
||||
values.append(task_index)
|
||||
|
||||
cursor.execute(query, tuple(values))
|
||||
|
||||
@@ -83,7 +83,7 @@ class LTMSQLiteStorage:
|
||||
WHERE task_description = ?
|
||||
ORDER BY datetime DESC, score ASC
|
||||
LIMIT {latest_n}
|
||||
""",
|
||||
""", # nosec
|
||||
(task_description,),
|
||||
)
|
||||
rows = cursor.fetchall()
|
||||
|
||||
104
src/crewai/memory/storage/mem0_storage.py
Normal file
@@ -0,0 +1,104 @@
|
||||
import os
|
||||
from typing import Any, Dict, List
|
||||
|
||||
from mem0 import MemoryClient
|
||||
from crewai.memory.storage.interface import Storage
|
||||
|
||||
|
||||
class Mem0Storage(Storage):
|
||||
"""
|
||||
Extends Storage to handle embedding and searching across entities using Mem0.
|
||||
"""
|
||||
|
||||
def __init__(self, type, crew=None):
|
||||
super().__init__()
|
||||
|
||||
if type not in ["user", "short_term", "long_term", "entities"]:
|
||||
raise ValueError("Invalid type for Mem0Storage. Must be 'user' or 'agent'.")
|
||||
|
||||
self.memory_type = type
|
||||
self.crew = crew
|
||||
self.memory_config = crew.memory_config
|
||||
|
||||
# User ID is required for user memory type "user" since it's used as a unique identifier for the user.
|
||||
user_id = self._get_user_id()
|
||||
if type == "user" and not user_id:
|
||||
raise ValueError("User ID is required for user memory type")
|
||||
|
||||
# API key in memory config overrides the environment variable
|
||||
mem0_api_key = self.memory_config.get("config", {}).get("api_key") or os.getenv(
|
||||
"MEM0_API_KEY"
|
||||
)
|
||||
self.memory = MemoryClient(api_key=mem0_api_key)
|
||||
|
||||
def _sanitize_role(self, role: str) -> str:
|
||||
"""
|
||||
Sanitizes agent roles to ensure valid directory names.
|
||||
"""
|
||||
return role.replace("\n", "").replace(" ", "_").replace("/", "_")
|
||||
|
||||
def save(self, value: Any, metadata: Dict[str, Any]) -> None:
|
||||
user_id = self._get_user_id()
|
||||
agent_name = self._get_agent_name()
|
||||
if self.memory_type == "user":
|
||||
self.memory.add(value, user_id=user_id, metadata={**metadata})
|
||||
elif self.memory_type == "short_term":
|
||||
agent_name = self._get_agent_name()
|
||||
self.memory.add(
|
||||
value, agent_id=agent_name, metadata={"type": "short_term", **metadata}
|
||||
)
|
||||
elif self.memory_type == "long_term":
|
||||
agent_name = self._get_agent_name()
|
||||
self.memory.add(
|
||||
value,
|
||||
agent_id=agent_name,
|
||||
infer=False,
|
||||
metadata={"type": "long_term", **metadata},
|
||||
)
|
||||
elif self.memory_type == "entities":
|
||||
entity_name = None
|
||||
self.memory.add(
|
||||
value, user_id=entity_name, metadata={"type": "entity", **metadata}
|
||||
)
|
||||
|
||||
def search(
|
||||
self,
|
||||
query: str,
|
||||
limit: int = 3,
|
||||
score_threshold: float = 0.35,
|
||||
) -> List[Any]:
|
||||
params = {"query": query, "limit": limit}
|
||||
if self.memory_type == "user":
|
||||
user_id = self._get_user_id()
|
||||
params["user_id"] = user_id
|
||||
elif self.memory_type == "short_term":
|
||||
agent_name = self._get_agent_name()
|
||||
params["agent_id"] = agent_name
|
||||
params["metadata"] = {"type": "short_term"}
|
||||
elif self.memory_type == "long_term":
|
||||
agent_name = self._get_agent_name()
|
||||
params["agent_id"] = agent_name
|
||||
params["metadata"] = {"type": "long_term"}
|
||||
elif self.memory_type == "entities":
|
||||
agent_name = self._get_agent_name()
|
||||
params["agent_id"] = agent_name
|
||||
params["metadata"] = {"type": "entity"}
|
||||
|
||||
# Discard the filters for now since we create the filters
|
||||
# automatically when the crew is created.
|
||||
results = self.memory.search(**params)
|
||||
return [r for r in results if r["score"] >= score_threshold]
|
||||
|
||||
def _get_user_id(self):
|
||||
if self.memory_type == "user":
|
||||
if hasattr(self, "memory_config") and self.memory_config is not None:
|
||||
return self.memory_config.get("config", {}).get("user_id")
|
||||
else:
|
||||
return None
|
||||
return None
|
||||
|
||||
def _get_agent_name(self):
|
||||
agents = self.crew.agents if self.crew else []
|
||||
agents = [self._sanitize_role(agent.role) for agent in agents]
|
||||
agents = "_".join(agents)
|
||||
return agents
|
||||
@@ -4,13 +4,12 @@ import logging
|
||||
import os
|
||||
import shutil
|
||||
import uuid
|
||||
from typing import Any, Dict, List, Optional, cast
|
||||
|
||||
from chromadb import Documents, EmbeddingFunction, Embeddings
|
||||
from typing import Any, Dict, List, Optional
|
||||
from chromadb.api import ClientAPI
|
||||
from chromadb.api.types import validate_embedding_function
|
||||
from crewai.memory.storage.base_rag_storage import BaseRAGStorage
|
||||
from crewai.utilities.paths import db_storage_path
|
||||
from crewai.utilities import EmbeddingConfigurator
|
||||
|
||||
|
||||
@contextlib.contextmanager
|
||||
@@ -51,117 +50,8 @@ class RAGStorage(BaseRAGStorage):
|
||||
self._initialize_app()
|
||||
|
||||
def _set_embedder_config(self):
|
||||
import chromadb.utils.embedding_functions as embedding_functions
|
||||
|
||||
if self.embedder_config is None:
|
||||
self.embedder_config = self._create_default_embedding_function()
|
||||
|
||||
if isinstance(self.embedder_config, dict):
|
||||
provider = self.embedder_config.get("provider")
|
||||
config = self.embedder_config.get("config", {})
|
||||
model_name = config.get("model")
|
||||
if provider == "openai":
|
||||
self.embedder_config = embedding_functions.OpenAIEmbeddingFunction(
|
||||
api_key=config.get("api_key") or os.getenv("OPENAI_API_KEY"),
|
||||
model_name=model_name,
|
||||
)
|
||||
elif provider == "azure":
|
||||
self.embedder_config = embedding_functions.OpenAIEmbeddingFunction(
|
||||
api_key=config.get("api_key"),
|
||||
api_base=config.get("api_base"),
|
||||
api_type=config.get("api_type", "azure"),
|
||||
api_version=config.get("api_version"),
|
||||
model_name=model_name,
|
||||
)
|
||||
elif provider == "ollama":
|
||||
from openai import OpenAI
|
||||
|
||||
class OllamaEmbeddingFunction(EmbeddingFunction):
|
||||
def __call__(self, input: Documents) -> Embeddings:
|
||||
client = OpenAI(
|
||||
base_url="http://localhost:11434/v1",
|
||||
api_key=config.get("api_key", "ollama"),
|
||||
)
|
||||
try:
|
||||
response = client.embeddings.create(
|
||||
input=input, model=model_name
|
||||
)
|
||||
embeddings = [item.embedding for item in response.data]
|
||||
return cast(Embeddings, embeddings)
|
||||
except Exception as e:
|
||||
raise e
|
||||
|
||||
self.embedder_config = OllamaEmbeddingFunction()
|
||||
elif provider == "vertexai":
|
||||
self.embedder_config = (
|
||||
embedding_functions.GoogleVertexEmbeddingFunction(
|
||||
model_name=model_name,
|
||||
api_key=config.get("api_key"),
|
||||
)
|
||||
)
|
||||
elif provider == "google":
|
||||
self.embedder_config = (
|
||||
embedding_functions.GoogleGenerativeAiEmbeddingFunction(
|
||||
model_name=model_name,
|
||||
api_key=config.get("api_key"),
|
||||
)
|
||||
)
|
||||
elif provider == "cohere":
|
||||
self.embedder_config = embedding_functions.CohereEmbeddingFunction(
|
||||
model_name=model_name,
|
||||
api_key=config.get("api_key"),
|
||||
)
|
||||
elif provider == "huggingface":
|
||||
self.embedder_config = embedding_functions.HuggingFaceEmbeddingServer(
|
||||
url=config.get("api_url"),
|
||||
)
|
||||
elif provider == "watson":
|
||||
try:
|
||||
import ibm_watsonx_ai.foundation_models as watson_models
|
||||
from ibm_watsonx_ai import Credentials
|
||||
from ibm_watsonx_ai.metanames import (
|
||||
EmbedTextParamsMetaNames as EmbedParams,
|
||||
)
|
||||
except ImportError as e:
|
||||
raise ImportError(
|
||||
"IBM Watson dependencies are not installed. Please install them to use Watson embedding."
|
||||
) from e
|
||||
|
||||
class WatsonEmbeddingFunction(EmbeddingFunction):
|
||||
def __call__(self, input: Documents) -> Embeddings:
|
||||
if isinstance(input, str):
|
||||
input = [input]
|
||||
|
||||
embed_params = {
|
||||
EmbedParams.TRUNCATE_INPUT_TOKENS: 3,
|
||||
EmbedParams.RETURN_OPTIONS: {"input_text": True},
|
||||
}
|
||||
|
||||
embedding = watson_models.Embeddings(
|
||||
model_id=config.get("model"),
|
||||
params=embed_params,
|
||||
credentials=Credentials(
|
||||
api_key=config.get("api_key"), url=config.get("api_url")
|
||||
),
|
||||
project_id=config.get("project_id"),
|
||||
)
|
||||
|
||||
try:
|
||||
embeddings = embedding.embed_documents(input)
|
||||
return cast(Embeddings, embeddings)
|
||||
|
||||
except Exception as e:
|
||||
print("Error during Watson embedding:", e)
|
||||
raise e
|
||||
|
||||
self.embedder_config = WatsonEmbeddingFunction()
|
||||
else:
|
||||
raise Exception(
|
||||
f"Unsupported embedding provider: {provider}, supported providers: [openai, azure, ollama, vertexai, google, cohere, huggingface, watson]"
|
||||
)
|
||||
else:
|
||||
validate_embedding_function(self.embedder_config)
|
||||
self.embedder_config = self.embedder_config
|
||||
configurator = EmbeddingConfigurator()
|
||||
self.embedder_config = configurator.configure_embedder(self.embedder_config)
|
||||
|
||||
def _initialize_app(self):
|
||||
import chromadb
|
||||
@@ -253,8 +143,10 @@ class RAGStorage(BaseRAGStorage):
|
||||
)
|
||||
|
||||
def _create_default_embedding_function(self):
|
||||
import chromadb.utils.embedding_functions as embedding_functions
|
||||
from chromadb.utils.embedding_functions.openai_embedding_function import (
|
||||
OpenAIEmbeddingFunction,
|
||||
)
|
||||
|
||||
return embedding_functions.OpenAIEmbeddingFunction(
|
||||
return OpenAIEmbeddingFunction(
|
||||
api_key=os.getenv("OPENAI_API_KEY"), model_name="text-embedding-3-small"
|
||||
)
|
||||
|
||||
0
src/crewai/memory/user/__init__.py
Normal file
45
src/crewai/memory/user/user_memory.py
Normal file
@@ -0,0 +1,45 @@
|
||||
from typing import Any, Dict, Optional
|
||||
|
||||
from crewai.memory.memory import Memory
|
||||
|
||||
|
||||
class UserMemory(Memory):
|
||||
"""
|
||||
UserMemory class for handling user memory storage and retrieval.
|
||||
Inherits from the Memory class and utilizes an instance of a class that
|
||||
adheres to the Storage for data storage, specifically working with
|
||||
MemoryItem instances.
|
||||
"""
|
||||
|
||||
def __init__(self, crew=None):
|
||||
try:
|
||||
from crewai.memory.storage.mem0_storage import Mem0Storage
|
||||
except ImportError:
|
||||
raise ImportError(
|
||||
"Mem0 is not installed. Please install it with `pip install mem0ai`."
|
||||
)
|
||||
storage = Mem0Storage(type="user", crew=crew)
|
||||
super().__init__(storage)
|
||||
|
||||
def save(
|
||||
self,
|
||||
value,
|
||||
metadata: Optional[Dict[str, Any]] = None,
|
||||
agent: Optional[str] = None,
|
||||
) -> None:
|
||||
# TODO: Change this function since we want to take care of the case where we save memories for the user
|
||||
data = f"Remember the details about the user: {value}"
|
||||
super().save(data, metadata)
|
||||
|
||||
def search(
|
||||
self,
|
||||
query: str,
|
||||
limit: int = 3,
|
||||
score_threshold: float = 0.35,
|
||||
):
|
||||
results = super().search(
|
||||
query=query,
|
||||
limit=limit,
|
||||
score_threshold=score_threshold,
|
||||
)
|
||||
return results
|
||||
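A minimal sketch of driving the new UserMemory class directly, assuming mem0ai is installed and whatever credentials Mem0Storage needs are already configured; outside of tests it would normally be constructed with the owning crew instead of None:

from crewai.memory.user.user_memory import UserMemory

user_memory = UserMemory(crew=None)

# save() prefixes the value with "Remember the details about the user:" before storing it.
user_memory.save("Brandon prefers short, bullet-point answers.")

# search() proxies to Memory.search with limit=3 and score_threshold=0.35 by default.
for match in user_memory.search(query="How does Brandon like his answers?"):
    print(match)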
8  src/crewai/memory/user/user_memory_item.py  Normal file
@@ -0,0 +1,8 @@
from typing import Any, Dict, Optional


class UserMemoryItem:
    def __init__(self, data: Any, user: str, metadata: Optional[Dict[str, Any]] = None):
        self.data = data
        self.user = user
        self.metadata = metadata if metadata is not None else {}
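UserMemoryItem is a plain container; the only behavior worth noting is that metadata defaults to an empty dict:

from crewai.memory.user.user_memory_item import UserMemoryItem

item = UserMemoryItem(data="Prefers metric units", user="brandon")
print(item.user, item.data, item.metadata)  # brandon Prefers metric units {}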
@@ -1,5 +1,7 @@
from .annotations import (
    after_kickoff,
    agent,
    before_kickoff,
    cache_handler,
    callback,
    crew,
@@ -26,4 +28,6 @@ __all__ = [
    "llm",
    "cache_handler",
    "pipeline",
    "before_kickoff",
    "after_kickoff",
]

@@ -5,6 +5,16 @@ from crewai import Crew
from crewai.project.utils import memoize


def before_kickoff(func):
    func.is_before_kickoff = True
    return func


def after_kickoff(func):
    func.is_after_kickoff = True
    return func


def task(func):
    func.is_task = True

@@ -99,6 +109,19 @@ def crew(func) -> Callable[..., Crew]:
            self.agents = instantiated_agents
            self.tasks = instantiated_tasks

            return func(self, *args, **kwargs)
            crew = func(self, *args, **kwargs)

        return wrapper
            def callback_wrapper(callback, instance):
                def wrapper(*args, **kwargs):
                    return callback(instance, *args, **kwargs)

                return wrapper

            for _, callback in self._before_kickoff.items():
                crew.before_kickoff_callbacks.append(callback_wrapper(callback, self))
            for _, callback in self._after_kickoff.items():
                crew.after_kickoff_callbacks.append(callback_wrapper(callback, self))

            return crew

        return memoize(wrapper)

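Taken together with the CrewBase changes below, the intended usage looks roughly like this sketch. The decorators only tag methods; the @crew wrapper is what registers them on the Crew's before_kickoff_callbacks and after_kickoff_callbacks. The exact arguments the hooks receive (an inputs dict before kickoff, the crew output after) are an assumption here, since the kickoff side is not part of this hunk:

from crewai import Agent, Crew, Task
from crewai.project import CrewBase, agent, task, crew, before_kickoff, after_kickoff


@CrewBase
class ResearchCrew:
    @before_kickoff
    def prepare_inputs(self, inputs):
        # Assumed signature: normalize or enrich the kickoff inputs.
        inputs.setdefault("topic", "LLMs")
        return inputs

    @after_kickoff
    def log_output(self, output):
        # Assumed signature: inspect or post-process the crew output.
        print("Crew finished:", output)
        return output

    @agent
    def researcher(self) -> Agent:
        return Agent(role="Researcher", goal="Research {topic}", backstory="Curious and thorough.")

    @task
    def research_task(self) -> Task:
        return Task(description="Research {topic}", expected_output="A short summary", agent=self.researcher())

    @crew
    def crew(self) -> Crew:
        return Crew(agents=self.agents, tasks=self.tasks)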
@@ -34,18 +34,39 @@ def CrewBase(cls: T) -> T:
            self.map_all_agent_variables()
            self.map_all_task_variables()

            # Preserve task and agent information
            self._original_tasks = {
            # Preserve all decorated functions
            self._original_functions = {
                name: method
                for name, method in cls.__dict__.items()
                if hasattr(method, "is_task") and method.is_task
            }
            self._original_agents = {
                name: method
                for name, method in cls.__dict__.items()
                if hasattr(method, "is_agent") and method.is_agent
                if any(
                    hasattr(method, attr)
                    for attr in [
                        "is_task",
                        "is_agent",
                        "is_before_kickoff",
                        "is_after_kickoff",
                        "is_kickoff",
                    ]
                )
            }

            # Store specific function types
            self._original_tasks = self._filter_functions(
                self._original_functions, "is_task"
            )
            self._original_agents = self._filter_functions(
                self._original_functions, "is_agent"
            )
            self._before_kickoff = self._filter_functions(
                self._original_functions, "is_before_kickoff"
            )
            self._after_kickoff = self._filter_functions(
                self._original_functions, "is_after_kickoff"
            )
            self._kickoff = self._filter_functions(
                self._original_functions, "is_kickoff"
            )

        @staticmethod
        def load_yaml(config_path: Path):
            try:

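The new wrapper relies on a _filter_functions helper whose body is not shown in this hunk; a plausible implementation, consistent with how it is called above, is a dict comprehension over the marker attributes (the standalone name filter_functions below is only for illustration):

from typing import Callable, Dict


def filter_functions(functions: Dict[str, Callable], attribute: str) -> Dict[str, Callable]:
    # Keep only the callables tagged with the given marker, e.g. "is_task" or "is_before_kickoff".
    return {name: method for name, method in functions.items() if getattr(method, attribute, False)}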
0  src/crewai/tools/cache_tools/__init__.py  Normal file
@@ -8,6 +8,7 @@ class UsageMetrics(BaseModel):
    Attributes:
        total_tokens: Total number of tokens used.
        prompt_tokens: Number of tokens used in prompts.
        cached_prompt_tokens: Number of cached prompt tokens used.
        completion_tokens: Number of tokens used in completions.
        successful_requests: Number of successful requests made.
    """
@@ -16,6 +17,9 @@ class UsageMetrics(BaseModel):
    prompt_tokens: int = Field(
        default=0, description="Number of tokens used in prompts."
    )
    cached_prompt_tokens: int = Field(
        default=0, description="Number of cached prompt tokens used."
    )
    completion_tokens: int = Field(
        default=0, description="Number of tokens used in completions."
    )
@@ -32,5 +36,6 @@ class UsageMetrics(BaseModel):
        """
        self.total_tokens += usage_metrics.total_tokens
        self.prompt_tokens += usage_metrics.prompt_tokens
        self.cached_prompt_tokens += usage_metrics.cached_prompt_tokens
        self.completion_tokens += usage_metrics.completion_tokens
        self.successful_requests += usage_metrics.successful_requests

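To illustrate the new cached_prompt_tokens field, here is a small sketch of accumulating one call's usage into a running total; the import path and the add_usage_metrics method name are assumptions inferred from the summing body shown above:

from crewai.types.usage_metrics import UsageMetrics  # assumed import path

totals = UsageMetrics()
one_call = UsageMetrics(
    total_tokens=776,
    prompt_tokens=237,
    cached_prompt_tokens=0,
    completion_tokens=539,
    successful_requests=1,
)

totals.add_usage_metrics(one_call)  # method name assumed from the body above
print(totals.total_tokens, totals.cached_prompt_tokens)  # 776 0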
@@ -10,6 +10,7 @@ from .rpm_controller import RPMController
from .exceptions.context_window_exceeding_exception import (
    LLMContextLengthExceededException,
)
from .embedding_configurator import EmbeddingConfigurator

__all__ = [
    "Converter",
@@ -23,4 +24,5 @@ __all__ = [
    "RPMController",
    "YamlParser",
    "LLMContextLengthExceededException",
    "EmbeddingConfigurator",
]

@@ -1,2 +1,3 @@
TRAINING_DATA_FILE = "training_data.pkl"
TRAINED_AGENTS_DATA_FILE = "trained_agents_data.pkl"
DEFAULT_SCORE_THRESHOLD = 0.35

183  src/crewai/utilities/embedding_configurator.py  Normal file
@@ -0,0 +1,183 @@
import os
from typing import Any, Dict, cast
from chromadb import EmbeddingFunction, Documents, Embeddings
from chromadb.api.types import validate_embedding_function


class EmbeddingConfigurator:
    def __init__(self):
        self.embedding_functions = {
            "openai": self._configure_openai,
            "azure": self._configure_azure,
            "ollama": self._configure_ollama,
            "vertexai": self._configure_vertexai,
            "google": self._configure_google,
            "cohere": self._configure_cohere,
            "bedrock": self._configure_bedrock,
            "huggingface": self._configure_huggingface,
            "watson": self._configure_watson,
        }

    def configure_embedder(
        self,
        embedder_config: Dict[str, Any] | None = None,
    ) -> EmbeddingFunction:
        """Configures and returns an embedding function based on the provided config."""
        if embedder_config is None:
            return self._create_default_embedding_function()

        provider = embedder_config.get("provider")
        config = embedder_config.get("config", {})
        model_name = config.get("model")

        if isinstance(provider, EmbeddingFunction):
            try:
                validate_embedding_function(provider)
                return provider
            except Exception as e:
                raise ValueError(f"Invalid custom embedding function: {str(e)}")

        if provider not in self.embedding_functions:
            raise Exception(
                f"Unsupported embedding provider: {provider}, supported providers: {list(self.embedding_functions.keys())}"
            )

        return self.embedding_functions[provider](config, model_name)

    @staticmethod
    def _create_default_embedding_function():
        from chromadb.utils.embedding_functions.openai_embedding_function import (
            OpenAIEmbeddingFunction,
        )

        return OpenAIEmbeddingFunction(
            api_key=os.getenv("OPENAI_API_KEY"), model_name="text-embedding-3-small"
        )

    @staticmethod
    def _configure_openai(config, model_name):
        from chromadb.utils.embedding_functions.openai_embedding_function import (
            OpenAIEmbeddingFunction,
        )

        return OpenAIEmbeddingFunction(
            api_key=config.get("api_key") or os.getenv("OPENAI_API_KEY"),
            model_name=model_name,
        )

    @staticmethod
    def _configure_azure(config, model_name):
        from chromadb.utils.embedding_functions.openai_embedding_function import (
            OpenAIEmbeddingFunction,
        )

        return OpenAIEmbeddingFunction(
            api_key=config.get("api_key"),
            api_base=config.get("api_base"),
            api_type=config.get("api_type", "azure"),
            api_version=config.get("api_version"),
            model_name=model_name,
        )

    @staticmethod
    def _configure_ollama(config, model_name):
        from chromadb.utils.embedding_functions.ollama_embedding_function import (
            OllamaEmbeddingFunction,
        )

        return OllamaEmbeddingFunction(
            url=config.get("url", "http://localhost:11434/api/embeddings"),
            model_name=model_name,
        )

    @staticmethod
    def _configure_vertexai(config, model_name):
        from chromadb.utils.embedding_functions.google_embedding_function import (
            GoogleVertexEmbeddingFunction,
        )

        return GoogleVertexEmbeddingFunction(
            model_name=model_name,
            api_key=config.get("api_key"),
        )

    @staticmethod
    def _configure_google(config, model_name):
        from chromadb.utils.embedding_functions.google_embedding_function import (
            GoogleGenerativeAiEmbeddingFunction,
        )

        return GoogleGenerativeAiEmbeddingFunction(
            model_name=model_name,
            api_key=config.get("api_key"),
        )

    @staticmethod
    def _configure_cohere(config, model_name):
        from chromadb.utils.embedding_functions.cohere_embedding_function import (
            CohereEmbeddingFunction,
        )

        return CohereEmbeddingFunction(
            model_name=model_name,
            api_key=config.get("api_key"),
        )

    @staticmethod
    def _configure_bedrock(config, model_name):
        from chromadb.utils.embedding_functions.amazon_bedrock_embedding_function import (
            AmazonBedrockEmbeddingFunction,
        )

        return AmazonBedrockEmbeddingFunction(
            session=config.get("session"),
        )

    @staticmethod
    def _configure_huggingface(config, model_name):
        from chromadb.utils.embedding_functions.huggingface_embedding_function import (
            HuggingFaceEmbeddingServer,
        )

        return HuggingFaceEmbeddingServer(
            url=config.get("api_url"),
        )

    @staticmethod
    def _configure_watson(config, model_name):
        try:
            import ibm_watsonx_ai.foundation_models as watson_models
            from ibm_watsonx_ai import Credentials
            from ibm_watsonx_ai.metanames import EmbedTextParamsMetaNames as EmbedParams
        except ImportError as e:
            raise ImportError(
                "IBM Watson dependencies are not installed. Please install them to use Watson embedding."
            ) from e

        class WatsonEmbeddingFunction(EmbeddingFunction):
            def __call__(self, input: Documents) -> Embeddings:
                if isinstance(input, str):
                    input = [input]

                embed_params = {
                    EmbedParams.TRUNCATE_INPUT_TOKENS: 3,
                    EmbedParams.RETURN_OPTIONS: {"input_text": True},
                }

                embedding = watson_models.Embeddings(
                    model_id=config.get("model"),
                    params=embed_params,
                    credentials=Credentials(
                        api_key=config.get("api_key"), url=config.get("api_url")
                    ),
                    project_id=config.get("project_id"),
                )

                try:
                    embeddings = embedding.embed_documents(input)
                    return cast(Embeddings, embeddings)
                except Exception as e:
                    print("Error during Watson embedding:", e)
                    raise e

        return WatsonEmbeddingFunction()
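A short sketch of the new entry point: with no config it falls back to OpenAI's text-embedding-3-small (OPENAI_API_KEY required), otherwise it dispatches on the provider key. The example below assumes a local Ollama server with the nomic-embed-text model pulled:

from crewai.utilities.embedding_configurator import EmbeddingConfigurator

configurator = EmbeddingConfigurator()

# Dispatches to _configure_ollama; url defaults to http://localhost:11434/api/embeddings.
ollama_embedder = configurator.configure_embedder(
    {"provider": "ollama", "config": {"model": "nomic-embed-text"}}
)

# The result is a chromadb EmbeddingFunction, so it can be called on a list of documents.
vectors = ollama_embedder(["CrewAI agents collaborate on tasks."])
print(len(vectors), len(vectors[0]))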
@@ -16,7 +16,11 @@ class FileHandler:

    def log(self, **kwargs):
        now = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
        message = f"{now}: " + ", ".join([f"{key}=\"{value}\"" for key, value in kwargs.items()]) + "\n"
        message = (
            f"{now}: "
            + ", ".join([f'{key}="{value}"' for key, value in kwargs.items()])
            + "\n"
        )
        with open(self._path, "a", encoding="utf-8") as file:
            file.write(message + "\n")

@@ -63,7 +67,7 @@ class PickleHandler:

        with open(self.file_path, "rb") as file:
            try:
                return pickle.load(file)
                return pickle.load(file)  # nosec
            except EOFError:
                return {}  # Return an empty dictionary if the file is empty or corrupted
            except Exception:

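The reformatted log() produces one entry per call in the form 2024-11-17 07:09:57: task="research", status="started"; note that log() already ends the message with a newline and write() appends another, so entries end up separated by a blank line. A usage sketch, assuming FileHandler is constructed with a target path:

from crewai.utilities.file_handler import FileHandler

handler = FileHandler("logs.txt")  # constructor argument assumed here
handler.log(task="research", status="started")
handler.log(task="research", status="completed")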
@@ -1,5 +1,5 @@
from litellm.integrations.custom_logger import CustomLogger

from litellm.types.utils import Usage
from crewai.agents.agent_builder.utilities.base_token_process import TokenProcess


@@ -11,8 +11,11 @@ class TokenCalcHandler(CustomLogger):
        if self.token_cost_process is None:
            return

        usage : Usage = response_obj["usage"]
        self.token_cost_process.sum_successful_requests(1)
        self.token_cost_process.sum_prompt_tokens(response_obj["usage"].prompt_tokens)
        self.token_cost_process.sum_completion_tokens(
            response_obj["usage"].completion_tokens
        )
        self.token_cost_process.sum_prompt_tokens(usage.prompt_tokens)
        self.token_cost_process.sum_completion_tokens(usage.completion_tokens)
        if usage.prompt_tokens_details:
            self.token_cost_process.sum_cached_prompt_tokens(
                usage.prompt_tokens_details.cached_tokens
            )

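The handler now reads everything from the typed litellm Usage object and only counts cached prompt tokens when prompt_tokens_details is present. A standalone sketch of that logic (the helper name is illustrative):

from litellm.types.utils import Usage


def summarize_usage(usage: Usage) -> dict:
    # Mirrors the handler above: cached tokens are optional on the usage payload.
    cached = 0
    if usage.prompt_tokens_details:
        cached = usage.prompt_tokens_details.cached_tokens or 0
    return {
        "prompt_tokens": usage.prompt_tokens,
        "completion_tokens": usage.completion_tokens,
        "cached_prompt_tokens": cached,
    }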
@@ -10,10 +10,11 @@ from crewai import Agent, Crew, Task
from crewai.agents.cache import CacheHandler
from crewai.agents.crew_agent_executor import CrewAgentExecutor
from crewai.agents.parser import AgentAction, CrewAgentParser, OutputParserException
from crewai.knowledge.source.string_knowledge_source import StringKnowledgeSource
from crewai.llm import LLM
from crewai.tools import tool
from crewai.tools.tool_calling import InstructorToolCalling
from crewai.tools.tool_usage import ToolUsage
from crewai.tools import tool
from crewai.tools.tool_usage_events import ToolUsageFinished
from crewai.utilities import RPMController
from crewai.utilities.events import Emitter
@@ -1574,3 +1575,42 @@ def test_agent_execute_task_with_ollama():
    result = agent.execute_task(task)
    assert len(result.split(".")) == 2
    assert "AI" in result or "artificial intelligence" in result.lower()


@pytest.mark.vcr(filter_headers=["authorization"])
def test_agent_with_knowledge_sources():
    # Create a knowledge source with some content
    content = "Brandon's favorite color is blue and he likes Mexican food."
    string_source = StringKnowledgeSource(
        content=content, metadata={"preference": "personal"}
    )

    with patch('crewai.knowledge.storage.knowledge_storage.KnowledgeStorage') as MockKnowledge:
        mock_knowledge_instance = MockKnowledge.return_value
        mock_knowledge_instance.sources = [string_source]
        mock_knowledge_instance.query.return_value = [{
            "content": content,
            "metadata": {"preference": "personal"}
        }]

        agent = Agent(
            role="Information Agent",
            goal="Provide information based on knowledge sources",
            backstory="You have access to specific knowledge sources.",
            llm=LLM(model="gpt-4o-mini"),
        )

        # Create a task that requires the agent to use the knowledge
        task = Task(
            description="What is Brandon's favorite color?",
            expected_output="Brandon's favorite color.",
            agent=agent,
        )

        crew = Crew(agents=[agent], tasks=[task])
        result = crew.kickoff()

        # Assert that the agent provides the correct information
        assert "blue" in result.raw.lower()

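Outside the mocked test above, the same flow looks roughly like the sketch below. Passing knowledge_sources directly to Agent is an assumption on this branch; the test sidesteps that wiring by patching KnowledgeStorage instead:

from crewai import Agent, Crew, Task
from crewai.llm import LLM
from crewai.knowledge.source.string_knowledge_source import StringKnowledgeSource

source = StringKnowledgeSource(
    content="Brandon's favorite color is blue and he likes Mexican food.",
    metadata={"preference": "personal"},
)

agent = Agent(
    role="Information Agent",
    goal="Provide information based on knowledge sources",
    backstory="You have access to specific knowledge sources.",
    llm=LLM(model="gpt-4o-mini"),
    knowledge_sources=[source],  # parameter assumed on this branch
)

task = Task(
    description="What is Brandon's favorite color?",
    expected_output="Brandon's favorite color.",
    agent=agent,
)

crew = Crew(agents=[agent], tasks=[task])
print(crew.kickoff().raw)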
449  tests/cassettes/test_after_crew_modification.yaml  Normal file
@@ -0,0 +1,449 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: !!binary |
|
||||
CuMOCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSug4KEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRKSDAoQK+dPhrB8w3HKFlxX60XzYRIIk5aB+A8oCWQqDENyZXcgQ3JlYXRlZDABObix
|
||||
K+HWrwgYQcBiMeHWrwgYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODAuMEoaCg5weXRob25fdmVy
|
||||
c2lvbhIICgYzLjExLjdKLgoIY3Jld19rZXkSIgogZjM0NmE5YWQ2ZDczMDYzZTA2NzdiMTdjZTlj
|
||||
NTAxNzdKMQoHY3Jld19pZBImCiQ3NjRjZWM1YS04NzkxLTRmN2MtOWY0MC1hNTMzMzJmOTk3YzBK
|
||||
HAoMY3Jld19wcm9jZXNzEgwKCnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdf
|
||||
bnVtYmVyX29mX3Rhc2tzEgIYAkobChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgCSqwFCgtjcmV3
|
||||
X2FnZW50cxKcBQqZBVt7ImtleSI6ICI3M2MzNDljOTNjMTYzYjVkNGRmOThhNjRmYWMxYzQzMCIs
|
||||
ICJpZCI6ICJjZDgwYjlhNy1hN2QzLTQzNTQtYjUyOC1jMzAyODA0MjA3YzgiLCAicm9sZSI6ICJ7
|
||||
dG9waWN9IFNlbmlvciBEYXRhIFJlc2VhcmNoZXJcbiIsICJ2ZXJib3NlPyI6IGZhbHNlLCAibWF4
|
||||
X2l0ZXIiOiAyMCwgIm1heF9ycG0iOiBudWxsLCAiZnVuY3Rpb25fY2FsbGluZ19sbG0iOiAiIiwg
|
||||
ImxsbSI6ICJncHQtNG8iLCAiZGVsZWdhdGlvbl9lbmFibGVkPyI6IGZhbHNlLCAiYWxsb3dfY29k
|
||||
ZV9leGVjdXRpb24/IjogZmFsc2UsICJtYXhfcmV0cnlfbGltaXQiOiAyLCAidG9vbHNfbmFtZXMi
|
||||
OiBbXX0sIHsia2V5IjogImJiMDY4Mzc3YzE2NDFiZTZkN2Q5N2E1MTY1OWRiNjEzIiwgImlkIjog
|
||||
ImJmZjc3YmUyLWU4MjQtNGEyOS1hZTFlLTQyMWFjMzc2MjY2YyIsICJyb2xlIjogInt0b3BpY30g
|
||||
UmVwb3J0aW5nIEFuYWx5c3RcbiIsICJ2ZXJib3NlPyI6IGZhbHNlLCAibWF4X2l0ZXIiOiAyMCwg
|
||||
Im1heF9ycG0iOiBudWxsLCAiZnVuY3Rpb25fY2FsbGluZ19sbG0iOiAiIiwgImxsbSI6ICJncHQt
|
||||
NG8iLCAiZGVsZWdhdGlvbl9lbmFibGVkPyI6IGZhbHNlLCAiYWxsb3dfY29kZV9leGVjdXRpb24/
|
||||
IjogZmFsc2UsICJtYXhfcmV0cnlfbGltaXQiOiAyLCAidG9vbHNfbmFtZXMiOiBbXX1dSpMECgpj
|
||||
cmV3X3Rhc2tzEoQECoEEW3sia2V5IjogIjZhZmM0YjM5NjI1OWZiYjc2ODFmNTZjNzc1NWNjOTM3
|
||||
IiwgImlkIjogIjRmNTFlYzM2LTVlMDctNGU4Ni1iYzIxLWU1MTQ0Mzg2YmIyYSIsICJhc3luY19l
|
||||
eGVjdXRpb24/IjogZmFsc2UsICJodW1hbl9pbnB1dD8iOiBmYWxzZSwgImFnZW50X3JvbGUiOiAi
|
||||
e3RvcGljfSBTZW5pb3IgRGF0YSBSZXNlYXJjaGVyXG4iLCAiYWdlbnRfa2V5IjogIjczYzM0OWM5
|
||||
M2MxNjNiNWQ0ZGY5OGE2NGZhYzFjNDMwIiwgInRvb2xzX25hbWVzIjogW119LCB7ImtleSI6ICJi
|
||||
MTdiMTg4ZGJmMTRmOTNhOThlNWI5NWFhZDM2NzU3NyIsICJpZCI6ICIwMGJmZDY5ZC03OWZiLTRj
|
||||
MjctYTM0Yi02NzBkZWJlMzU0NWYiLCAiYXN5bmNfZXhlY3V0aW9uPyI6IGZhbHNlLCAiaHVtYW5f
|
||||
aW5wdXQ/IjogZmFsc2UsICJhZ2VudF9yb2xlIjogInt0b3BpY30gUmVwb3J0aW5nIEFuYWx5c3Rc
|
||||
biIsICJhZ2VudF9rZXkiOiAiYmIwNjgzNzdjMTY0MWJlNmQ3ZDk3YTUxNjU5ZGI2MTMiLCAidG9v
|
||||
bHNfbmFtZXMiOiBbXX1degIYAYUBAAEAABKOAgoQsN5cQC9ZzBr2B0OKBR2WCxII3ULL7Wk965Yq
|
||||
DFRhc2sgQ3JlYXRlZDABOWB9ROHWrwgYQfg0ReHWrwgYSi4KCGNyZXdfa2V5EiIKIGYzNDZhOWFk
|
||||
NmQ3MzA2M2UwNjc3YjE3Y2U5YzUwMTc3SjEKB2NyZXdfaWQSJgokNzY0Y2VjNWEtODc5MS00Zjdj
|
||||
LTlmNDAtYTUzMzMyZjk5N2MwSi4KCHRhc2tfa2V5EiIKIDZhZmM0YjM5NjI1OWZiYjc2ODFmNTZj
|
||||
Nzc1NWNjOTM3SjEKB3Rhc2tfaWQSJgokNGY1MWVjMzYtNWUwNy00ZTg2LWJjMjEtZTUxNDQzODZi
|
||||
YjJhegIYAYUBAAEAAA==
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '1894'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.27.0
|
||||
method: POST
|
||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Sun, 17 Nov 2024 07:09:57 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are LLMs Senior Data Researcher\n.
|
||||
You''re a seasoned researcher with a knack for uncovering the latest developments
|
||||
in LLMs. Known for your ability to find the most relevant information and present
|
||||
it in a clear and concise manner.\n\nYour personal goal is: Uncover cutting-edge
|
||||
developments in LLMs\n\nTo give my best complete final answer to the task use
|
||||
the exact following format:\n\nThought: I now can give a great answer\nFinal
|
||||
Answer: Your final answer must be the great and the most complete as possible,
|
||||
it must be outcome described.\n\nI MUST use these formats, my job depends on
|
||||
it!"}, {"role": "user", "content": "\nCurrent Task: Conduct a thorough research
|
||||
about LLMs Make sure you find any interesting and relevant information given
|
||||
the current year is 2024.\n\n\nThis is the expect criteria for your final answer:
|
||||
A list with 10 bullet points of the most relevant information about LLMs\n\nyou
|
||||
MUST return the actual complete content as the final answer, not a summary.\n\nBegin!
|
||||
This is VERY important to you, use the tools available and give your best Final
|
||||
Answer, your job depends on it!\n\nThought:"}], "model": "gpt-4o", "stop": ["\nObservation:"],
|
||||
"stream": false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '1235'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=08pKRcLhS1PDw0mYfL2jz19ac6M.T31GoiMuI5DlX6w-1731827382-1.0.1.1-UfOLu3AaIUuXP1sGzdV6oggJ1q7iMTC46t08FDhYVrKcW5YmD4CbifudOJiSgx8h0JLTwZdgk.aG05S0eAO_PQ;
|
||||
_cfuvid=74kaPOoAcp8YRSA0XocQ1FFNksu9V0_KiWdQfo7wQuQ-1731827382509-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAA2RXwW4cOQ69z1cQffFMUG04iWeS8a2xmcn2wkYMr4MFdnNhS6wqTlRSjSh1pz0/
|
||||
vyBVbfdmL4atkijq8b1H+q8fAFbsVzewciMWN81hvfn8OP75+fDu3+Ufh08fNuHvzPt//sHxbvzX
|
||||
08dVpyfS7g9y5XTq0qVpDlQ4xfbZZcJCGvX1u7ev37959/bXa/swJU9Bjw1zWV+n9ZurN9frq/fr
|
||||
q1+Wg2NiR7K6gf/8AADwl/3UFKOnb6sbuOpOKxOJ4ECrm+dNAKucgq6sUISlYCyr7uWjS7FQtKwf
|
||||
x1SHsdzAFmI6gMMIA+8JEAZNHTDKgTLAl/g7Rwywsb9v4Ev8El9fwqtXn2aKm+2FwMf7x/U1PFAg
|
||||
FPKvXt1A+wR5WWo7OjiM7Eboay4jZeBpzmlPApbUt1IxQI2esmbtOQ6A0cNAkTIqruBwxh0HLkxy
|
||||
CdsCqe8pC3BUsPUedK5mdMcODH7eczl2FsalkTJFR8ARMsmcotjV04yZPJQEXATmTJ4ciaQsHUz4
|
||||
VdNgBQNIhGJhDFBSCtCnDLsqHHVdOiBfHRY7pvd52lNIM2W5VMDeKGAfUxoCKWA0cWS4UyYoXNsI
|
||||
yoIO2g4IWKMbNauRTpuNNh30yemlA6QIE+VBfw0Yh4oDfYfegcsIUw2Fp+QxfAff48jSglrp55z0
|
||||
2aCF6ACr59QewhMOJCCskTBSqhKOHVAcMbqGjgDOc2BnVdJyQM8UvEDgrwR7pTM8s1FeompGeoTj
|
||||
YCC9VZA2sYw5zewuBP4WsHpShNpvHeyO8LyhAxbwJDzEVsA5c8pc+IlAsKdytKuojOz0+SkK+4VL
|
||||
luXt7R1UFZCS6UJgVzkUDaSATxpmxDz1NUCqZa5L6jtGZc6caY+BYtFIhDkw5ZdKGLDSgacpRSl6
|
||||
qfIZZORerzhg9nLiIe8CwWYLnuaQjhPFYnBcKxx3VPBCTFBwn2ldMrI+9zFjlD7liTL8+On+8Sd4
|
||||
e3mlSOkBGFEfWHLy1ZGHT/eP+rmDXUKxTKjv2bFmbwEtuXnOCd1IAmXEAiGp/FUgtRhmhqEUURkH
|
||||
ggk5luXsyMMIM2VNCKMjpdcCAtA3R8HwLppzaFouKF8bnprrjlya1HrmNNeAGZoJmsjQoaeJnaJF
|
||||
mN0IvtJJr2mmuEZn1J1TYHc06H5W6G5PxTCdwQeWwmG5fuP3mqcoYg/kFAhcljTTlrk/P1HIjZH/
|
||||
rCQwotpkUIDOha45tRKCTBgC5Q6mlOkM7O/4oSImP5AGUc83yaZaIKaCu3A09HOa2BR/hm8H9G1e
|
||||
ZK4e0RBgU3dj/ZkiIfVK9eVhO62d7timx9O9htkvitnvHGn9WOOzf9zyxIU8fMCCC73G5MUq0+vm
|
||||
0jbbBXYiLCc8GhH3dDJ6r66Bu6DbXZWSJn5qCWqwyE4fcm4kJzT0hZHI2749SrHYQuVkY0sNVDlG
|
||||
J2WFZpBSq0pzzBMGMpNjDPxEHjj6KiWzOri5VaABg+EzkTffEHJq64bRO8Xot8VQdJPJUDtILKrg
|
||||
z2JupWZuiQhRVNXzELlnh7EATfOIwq32S6TNtmvYpTxgXFARIFEWsIyavdk6Zm9p9xknOqT8tRVC
|
||||
jaw8Z+IWBjiXaizN8o+XcKdUtH4XmQQwE6BPs9nBnNEVY6CBsNk+dwa1KeU1iwtJ6Hn9pZV0reIt
|
||||
56X5aekl1Xzi1nvFTS3hgYWUj/c5TXOB3+LAkShzHBS3jZyEsRiCyWcRpu+0SekpejllKJM2Qu3+
|
||||
AgguV2dN+qRXQ0jfOfGTnmgvWBx9IRDHfQo6irR2YngrS3p2wHGuZbl6scahsqfGp5LgqM3OTuog
|
||||
kUlqKNJBGauctckqSyWUgssYtkwpKbYRxTRBGYSyytKQ+1WR28ZCwzIEGU82dVCyk4cHQg2q6N1+
|
||||
Zy7cpgqr9DIicRzCEXZkfXsJSn6J+dAwSyENShBrqGmvL82EYV14ohcDO/PyDvacbXo7a/IcgaeJ
|
||||
suhISXHPOUVNeWGIXm6c0682ObUGo6OVDlMxsxtP7laF8oWo5VFmm+Es4Zex0TSx2T7fv2D3+krB
|
||||
e6ChBp3MjvDhxSfM+T+mPeWWFhxSDv6gr1W85pyGrJaqKSzd2sw3NBLlFtRkijnV6F+mif8d0kwL
|
||||
c+a9jaVCrubnofRsNHGUo7GRhM79TAB50lroRDNUzN7gODVhii7VjEMraEz7RhKOCsdzOVtTbLHn
|
||||
xBo1Ux/INW91tagLrK0VLWpb7o4eSqboW/8eqU12quCAeaD/a2g/qih+UiWmfplpdTYIPIxlKSdn
|
||||
GHI6tIz7UK2gdtF4ztMG4rlL6p7FqDCcD0vn/+Dk/wIAAP//jJdBDsIgEEX3nIKwdqO20csYgjBU
|
||||
IgKB6cJF726AprSxJm7nD5/5LMg80GMSma/caO1cnxZisn4I0d/TrC91bZxJD57v9i7TUUIfWFEn
|
||||
QumtkNm4gS1WvwSO/gkuG57Ol+rHGgs2ta/4RylDj8I24Xo+HnYMuQIUxqYV3DGZ1zPVjjYSLAv7
|
||||
SiCr2N/j7HnX6MYN/9g3QUoICIpncDJyG7m1Rcis/KtteeYyMEvvhPDi2rgBYoim4qoOvOul7jsF
|
||||
AhiZyAcAAP//AwCidyXxtw8AAA==
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8e3de5de7b6c6217-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Sun, 17 Nov 2024 07:10:00 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '6026'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '30000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '29999713'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_553f04a622d026a28dd3c5da55568fcd
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: !!binary |
|
||||
Cs4CCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSpQIKEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRKOAgoQIn+FuHJydyMnR3y/Qfb8GBII2zXFs4gynEgqDFRhc2sgQ3JlYXRlZDABOShs
|
||||
9ljYrwgYQQiP+FjYrwgYSi4KCGNyZXdfa2V5EiIKIGYzNDZhOWFkNmQ3MzA2M2UwNjc3YjE3Y2U5
|
||||
YzUwMTc3SjEKB2NyZXdfaWQSJgokNzY0Y2VjNWEtODc5MS00ZjdjLTlmNDAtYTUzMzMyZjk5N2Mw
|
||||
Si4KCHRhc2tfa2V5EiIKIGIxN2IxODhkYmYxNGY5M2E5OGU1Yjk1YWFkMzY3NTc3SjEKB3Rhc2tf
|
||||
aWQSJgokMDBiZmQ2OWQtNzlmYi00YzI3LWEzNGItNjcwZGViZTM1NDVmegIYAYUBAAEAAA==
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '337'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.27.0
|
||||
method: POST
|
||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Sun, 17 Nov 2024 07:10:01 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are LLMs Reporting Analyst\n.
|
||||
You''re a meticulous analyst with a keen eye for detail. You''re known for your
|
||||
ability to turn complex data into clear and concise reports, making it easy
|
||||
for others to understand and act on the information you provide.\nYour personal
|
||||
goal is: Create detailed reports based on LLMs data analysis and research findings\n\nTo
|
||||
give my best complete final answer to the task use the exact following format:\n\nThought:
|
||||
I now can give a great answer\nFinal Answer: Your final answer must be the great
|
||||
and the most complete as possible, it must be outcome described.\n\nI MUST use
|
||||
these formats, my job depends on it!"}, {"role": "user", "content": "\nCurrent
|
||||
Task: Review the context you got and expand each topic into a full section for
|
||||
a report. Make sure the report is detailed and contains any and all relevant
|
||||
information.\n\n\nThis is the expect criteria for your final answer: A fully
|
||||
fledge reports with the mains topics, each with a full section of information.
|
||||
Formatted as markdown without ''```''\n\nyou MUST return the actual complete
|
||||
content as the final answer, not a summary.\n\nThis is the context you''re working
|
||||
with:\n1. **OpenAI''s GPT-4 Released**: OpenAI released GPT-4, which further
|
||||
improves contextual understanding and generation capabilities. It offers increased
|
||||
accuracy, creativity, and coherence in responses compared to its predecessors,
|
||||
making it an essential tool for businesses, educators, and developers.\n\n2.
|
||||
**Google''s Gemini Model**: In 2024, Google launched the Gemini model, focusing
|
||||
on merging language understanding with multimodal capabilities. This model can
|
||||
process text, audio, and images simultaneously, enhancing its applications in
|
||||
fields like voice assistants and image captioning.\n\n3. **Anthropic''s Claude**:
|
||||
Claude, by Anthropic, is designed to prioritize safety and ethical considerations
|
||||
in LLM usage. It''s built to minimize harmful outputs and biases prevalent in
|
||||
earlier language models, demonstrating a shift towards responsible AI deployment.\n\n4.
|
||||
**Meta''s Open Pre-trained Transformer (OPT) 3.0**: Meta has introduced OPT
|
||||
3.0, boasting efficient training approaches that lower computational costs while
|
||||
maintaining high performance. The model excels in translation tasks and has
|
||||
become a popular choice for academic research due to its open-access policy.\n\n5.
|
||||
**Language Model Distillation Advances**: Recent advances in model distillation
|
||||
techniques have allowed developers to deploy smaller, more efficient language
|
||||
models on edge devices without notably compromising performance, expanding the
|
||||
accessibility and application of LLMs in mobile and IoT devices.\n\n6. **Fine-Tuning
|
||||
with Limited Data**: Methods for fine-tuning LLMs with limited data have improved,
|
||||
enabling customization for niche applications without the need for vast datasets.
|
||||
This development has opened doors to using LLMs in specialized industries, like
|
||||
legal and medical sectors.\n\n7. **Ethical and Transparent AI Use**: 2024 has
|
||||
seen a significant emphasis on ethical AI, with organizations establishing standardized
|
||||
frameworks for LLM transparency and accountability. More companies are adopting
|
||||
practices like AI model cards to disclose model capabilities, limitations, and
|
||||
data sources.\n\n8. **The Rise of Prompt Engineering**: As models become more
|
||||
advanced, prompt engineering has emerged as a crucial technique for optimizing
|
||||
model outputs. This involves designing specific input prompts that guide LLMs
|
||||
to yield desired results, thus enhancing usability in content creation and customer
|
||||
service.\n\n9. **Integration with Augmented Reality**: Language models in 2024
|
||||
are increasingly being integrated with AR technologies to provide real-time
|
||||
language translation, virtual assistants in immersive environments, and interactive
|
||||
educational tools, enriching the user''s experience with contextualized AI assistance.\n\n10.
|
||||
**Regulatory Developments**: Governments worldwide are progressing towards formalizing
|
||||
regulations around LLM usage, focusing on data privacy, security, and ethical
|
||||
concerns. These developments aim to safeguard users while encouraging innovation
|
||||
in AI technology.\n\nThese points reflect the cutting-edge advancements and
|
||||
trends in the field of large language models (LLMs) as of 2024, highlighting
|
||||
their growing influence and the increasing focus on ethical and practical deployment.\n\nBegin!
|
||||
This is VERY important to you, use the tools available and give your best Final
|
||||
Answer, your job depends on it!\n\nThought:"}], "model": "gpt-4o", "stop": ["\nObservation:"],
|
||||
"stream": false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '4624'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=08pKRcLhS1PDw0mYfL2jz19ac6M.T31GoiMuI5DlX6w-1731827382-1.0.1.1-UfOLu3AaIUuXP1sGzdV6oggJ1q7iMTC46t08FDhYVrKcW5YmD4CbifudOJiSgx8h0JLTwZdgk.aG05S0eAO_PQ;
|
||||
_cfuvid=74kaPOoAcp8YRSA0XocQ1FFNksu9V0_KiWdQfo7wQuQ-1731827382509-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAA3RXS28bRxK+51cUlIMTYEjIlvOAbso6zhIrI4KjXSyyuRS7izO16umeVHWTovPn
|
||||
F9U9Q5GG92JY0696fI/iX18BXLG/uoUrN2B24xRWd/98HA75/u//0u3z4V3+DftPP+82P/3+799/
|
||||
vrm56uxE2v6XXF5OrV0ap0CZU2zLTggz2a2vf7h5/eObH95eX9eFMXkKdqyf8uptWr25fvN2df3j
|
||||
6vr7+eCQ2JFe3cJ/vgIA+Kv+ayFGT89Xt1CvqV9GUsWerm5PmwCuJAX7coWqrBljvupeFl2KmWKN
|
||||
+nFIpR/yLWwgpgM4jNDzngCht9ABox5I/oh/xPccMcBd/fvWPnwNH2lKkiFFuPN7jI5GilmBI9yj
|
||||
9AT3GPuCPcEHy1bhm/v7D/otrMCyrVd8Da/X8OtE8W7zSuGXh8fVW/hIgVDJ24ZNrHu7eQ9wzJJ8
|
||||
ceTb5g5GlCeOPSAo95F37DBmoDgs8Vg4YQmklh0yuSGmkPrjGn4qHLxdkCJwVpiEPDlSTaLdHFHa
|
||||
7UgUhHYcyYPDCbccODPVZGs5n3PBACV6Eqt3vRKjByGdUlSCniIJGjTW8A86Ao+TpP2pZC4UTzAQ
|
||||
90MmewWdK4Lu2Nmi1IpAhRPvOR+7evmWciYBlwYSio4smjzQ6VEFzsvDpGt4HEgJ8LxZA+4JRvQ0
|
||||
54oROHrWiaLiNhDklAKgk6QKexRORUHJ5SS6hvdJYFuUI6nSqV5z9RVc0ZxGEusbCTrLHvIghrml
|
||||
AB6MOduUtaa0Z6mVPAFX17CJQL64WrzOUlKSPSmgHVnS8S3SXRKYSDRFDPyJPARCiRz7DpBrV9pV
|
||||
SWrvPO0ppMm+j0kIKPbY218zR1qK8y4ySFh9T6HfPWwAQ0gHrQ/XKyRti2bAaQrcYq4vRcxFMLxg
|
||||
cZJkOLPHMuqTrmdGvFnDLyn1gYwRNHLkxh9b/nyh4rmDgCW6gby90/giNAlp7TDCxCkSib2kmSbb
|
||||
NZaQeUzeKL25JMSxdqs3rMb+JdxLaB84D5dEyGnJCIwNHWDxnBpOecSeFJTtWYyUioZjtySxFcIn
|
||||
hUgH6CWVWNO421xU0LDLOvP3jOnhCFj6BmXrzK7EijIMnI+QdrA3HT1DU2dR7hsS8kBjy8SOtlSO
|
||||
lojJuNBA0c8cRpeX74GeQR1Fo4I1dp/CvsLHKjoFAo8ZDQ2jruHOe27hWL5NX9LBasSxVcWKaDuq
|
||||
BDWdsUJMSZXPRQZdBcsS49IwW7UQseQ0mtUswD3Tmw48jSlqnltKz5mimsqX3K4zQJBn7ICMqBk5
|
||||
Wklb804kNtZZMXdMwZ/gerOGu2icnti9UvhbwOLJ1tr/uhN5PGyPL1u7GVAuiaUATjizwwA68M6K
|
||||
fUDxVi5OtvKpghd3lI81KMpD3e1SVPZzojoDx9MU0nGs9K2wCdWOPnMB9F6qbNkljiSq4WVAGXcl
|
||||
QCp5KrMmbRmruh0GdkNTzC1RNOEKgWLfOlRDne9+Kb5WRu2SKzq7TBPxalmzTrPp7N2mQdEoMVbl
|
||||
EtYn7eaKglLjspAjbw02wTHcLqV4ydryOGc1m/Zv7Pi5JqHJnU7k2NAJQoH22PxSDR+5zgHmPKDF
|
||||
Daa2A2HIg0OhDgL1GGaEJJmSWYwRZCxxecIqRmL2wtEwVcE3R1uVpLaYFSYUHFOJeQHV2zV8oIyv
|
||||
tBo/PAitsmB130fBqEYvEvjm14fHb+FmfW3HlgMPj/bFcBRJW02NeLTbsWOrTr2pUpbykPxCozxg
|
||||
BjF3qHlMJWOjrtEHo1frf7jMZiJ8MruxeMyELmSKnp1NPRNKZlcCSqhMy5ZAwOaFWFs8WxQc2JNO
|
||||
QugBfaq60LiPnkZ2hhdCcQP4YsZc80oTxVVTh+YxSxC2YL4MFF0qUhXYpRBwa72y5l6MIDtJY8VT
|
||||
H9LWDHh5dO6pzRzN2GfpbFm+0tnw6lNVirwpWuzNqHMCz3sSJQgc+8Ka6411YKqgvHSq8xmuWtWi
|
||||
A6ZgzsKuQnC3scGvzX2L+//fGe8knXWYAYStJPQkIBh7Mq5MdTSxapv8zlNbLTTJSee+W382z8I7
|
||||
yybMnZwnYLXdH8kZziZJvcxa34Ly5ydqhPxnIRvCrEs2bvla2Usq62gyIx0M3A/heIbky5zV1IV8
|
||||
b+dNqL888PkEMeXPPLQaXhpZ6RzNNuwUnesVl7abGw4Wl03zLbet8cLqtkmPF48bxF4UxxTr0sdM
|
||||
gyZJO3P9c2dOk/2yMNQZM+2cEIZV5pG+OECZpcnssrSnyhohTUUcrVzzvqoeFPcsqbqbruE307WL
|
||||
KUZLffkLXbjQzqZrghN78ORYOcXVgsg2MczRclQb6ZvcxvrLAuXYnSTVjDum0abqPQ3swmzoFgU2
|
||||
X1h5YcvJLLoJfSvwDMzv1/CeI60eSzxNZvc8sg0D7zCjbXscLthu5bHfMqtczoSwFbp2tV4S5kvq
|
||||
QGMQLTEk90QWnI0YjG3cbgMCf8JFryI7m6deCjbbiN2ailH+z8JyOYrYI2Zxs3idmnIEGqd0MGLO
|
||||
XlW9kaMvmqW1rFWyGlKt3VKo+WeKsX5AqQJlfa3XWUsbkv4HAAD//4xZO4/jNhDu91cQWyWAYuSQ
|
||||
PdxtaSRXbBcsEKQKDFoaycRJokBSdlzsfw++mSEl2T4gpS2Jj+HMfA8u5ZOs631gJYEXXZApW+68
|
||||
WLKWEOZIFAa3RleFDvTYmEoX4Qw0RxuCwxbSdcLKwFpj9LVjxiYMNIOScJX9my6qMq2PSci7G0d/
|
||||
1oCi0a4CLIgP6F4FifMo0Nkx3xbC9Qtrp3UAcx592ZlvCs74kHF2sgHpsn8zf0XKiXQlG1hmsPjG
|
||||
PNu+vSJwBe73b9plaxApTejjNWtb7C6jLPOaMnktfM/WNfhBzolbnhd35nc/THZkPoxK8xeFUBY9
|
||||
wjU4KG2wA118+L5kTg62qRlrGLRi3ftI5f+lRVRSF5nhYHFcIHLUpec69C6G2WisGzDoEVaDSQHq
|
||||
kM98jsiJ43XV+GrI1Qc2QqTEW0FjERClfyeqdRl6/Otl3moPZorDdLLRCU7gYJh89WRFDXnVr67z
|
||||
Af0oUUx5/rPtXSOppm0XkaAxzkG0MxcQZBJOKuQBkeOUNnRPR5uXxPsqUPEO9PGt+TP4YUrm29g5
|
||||
Va3qBBUrRcrTRD+dOBalBNC7GEYwAC0DCMAOFDpk60ZuFBTmvEPCDMIx5ORVBjCJb4jTHAmV24Ib
|
||||
pznphLHSA2UTbXbN0leUxGRVoECK8dBtAsW5L50vlwmv+Ui1FxBJYUai216UI2fM2Jk55pLgTqza
|
||||
TySGtok7/bZGskrsCwxVDAxxnpIg+sZLWdk4ihH7Nzk88cYyVmpAFiOIsw+0xdbXNXXI5HDpRpUQ
|
||||
Ue4sSbNIzog7usBzpa1auODWXfJ8CmYkWjTqKzwk9TT8qCsX24Aa807sFhScXL35ACT4Y1s+DvKx
|
||||
+Wn//vNWluP8wMGpYUXfBj8mBoE7d2PHS7hhNyuZUD1wxSQNBgqMnmtao5ZLjgmeZvMM+e59H1eG
|
||||
AEkpXUcKvOojpQu07f6dh8FBLS2tBAZHGyCGOdZoRcERn/S2mxVnlNMHu9YNgFwqX9okFSegG9Os
|
||||
8pN3wnwUv3fm7xOlExuKm11pe4zGh62FYabeJjZjKmyp9sOReaAQnFv8z9IympO/aDWzI0jyQeM6
|
||||
l8R14DpjJ2LNKLOp/SuOtJt7+IxX84c4IPwK3tjHB/ySS0LqXcOwqMHKdP5MYVT6ISYZhrj40Deq
|
||||
5EEXHVQvUKLMvYI7iB/xPBbHI1DPHASYBxSbgjuz6RypnkMxmgUtciLo8MVFqMMMzsH0w7bUzVYA
|
||||
gJODj5dAiUQ+ozd4xbLF/VhRGymPrXkhLaZsaiNoGIRI5Dto4M2hdsFfJJ4ThbNFuVTK7jYMtQGl
|
||||
5yAtRPLerOBZpuAT1clM87FnEJAdVmay59zcL1ZacpwjcpGpPPNGwP/Fh3S65taVmw3Cx3h53Sni
|
||||
xXkYWC2kex0ngxEYE+tCqAyeuWRxEfmAZe5k9y5YzJbxzuyj5GBn2czygxtJRSCJgVV4w8Ztqh5Y
|
||||
K5o10v5vbkxQPG6cJdbMBPU4mqVM7vjt4wuIysSTnXLI2xnuBzbYxNpO9MgGW1+DBWrnaHELN859
|
||||
r/9/lHu13ndT8Meoz8v/rRtdPB1AW/2IO7SY/PTMTz+ejPmH7+/mzZXcswDiIfnvNGLAry8vMt7z
|
||||
cmO4PP306fNv+jj5ZPvVk9fXL9WDIQ8NARHj6hLwuba4Eli+XW4M2ZlfPXhabfx+QY/Gls27sfs/
|
||||
wy8P6pqmRM0Bl2yu3m56eS0Q7lR/9FoJNC/4OV5jouHQurGjMAUn15rtdHj5XLefXxqy9Pz08fQf
|
||||
AAAA//8DAMIhqlTfHQAA
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8e3de605dca06217-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Sun, 17 Nov 2024 07:10:10 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '9951'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '30000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '29998873'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 2ms
|
||||
x-request-id:
|
||||
- req_52a4f98f9c08fbaa1724634d237f3245
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
version: 1
|
||||
487  tests/cassettes/test_after_kickoff_modification.yaml  Normal file
@@ -0,0 +1,487 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: !!binary |
|
||||
CusOCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSwg4KEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRKaDAoQJ2RtlOW3xhPcNjmbKwSJaxIIMUF8zJjQkvQqDENyZXcgQ3JlYXRlZDABOThF
|
||||
x7PrrgkYQWiczLPrrgkYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODAuMEoaCg5weXRob25fdmVy
|
||||
c2lvbhIICgYzLjEyLjdKLgoIY3Jld19rZXkSIgogMWYxMjhiZGI3YmFhNGI2NzcxNGYxZGFlZGMy
|
||||
ZjNhYjZKMQoHY3Jld19pZBImCiQzNGJiYzZjYS03MmRiLTQwMzktODQzMy01NTFmOWNmNDM0YTdK
|
||||
HAoMY3Jld19wcm9jZXNzEgwKCnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdf
|
||||
bnVtYmVyX29mX3Rhc2tzEgIYAkobChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgCSrQFCgtjcmV3
|
||||
X2FnZW50cxKkBQqhBVt7ImtleSI6ICI3M2MzNDljOTNjMTYzYjVkNGRmOThhNjRmYWMxYzQzMCIs
|
||||
ICJpZCI6ICI4MjJkOGM2OC01NzlkLTQ4ZWUtOTBhMi1hNjJiNDgzY2JhNGUiLCAicm9sZSI6ICJ7
|
||||
dG9waWN9IFNlbmlvciBEYXRhIFJlc2VhcmNoZXJcbiIsICJ2ZXJib3NlPyI6IHRydWUsICJtYXhf
|
||||
aXRlciI6IDIwLCAibWF4X3JwbSI6IG51bGwsICJmdW5jdGlvbl9jYWxsaW5nX2xsbSI6ICIiLCAi
|
||||
bGxtIjogImdwdC00by1taW5pIiwgImRlbGVnYXRpb25fZW5hYmxlZD8iOiBmYWxzZSwgImFsbG93
|
||||
X2NvZGVfZXhlY3V0aW9uPyI6IGZhbHNlLCAibWF4X3JldHJ5X2xpbWl0IjogMiwgInRvb2xzX25h
|
||||
bWVzIjogW119LCB7ImtleSI6ICIxMDRmZTA2NTllMTBiNDI2Y2Y4OGYwMjRmYjU3MTU1MyIsICJp
|
||||
ZCI6ICI0YTY4NDQwZi0xMjRkLTQ3YmEtYWEzNy1hZTZmMTI2NzlkMmIiLCAicm9sZSI6ICJ7dG9w
|
||||
aWN9IFJlcG9ydGluZyBBbmFseXN0XG4iLCAidmVyYm9zZT8iOiB0cnVlLCAibWF4X2l0ZXIiOiAy
|
||||
MCwgIm1heF9ycG0iOiBudWxsLCAiZnVuY3Rpb25fY2FsbGluZ19sbG0iOiAiIiwgImxsbSI6ICJn
|
||||
cHQtNG8tbWluaSIsICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogZmFsc2UsICJhbGxvd19jb2RlX2V4
|
||||
ZWN1dGlvbj8iOiBmYWxzZSwgIm1heF9yZXRyeV9saW1pdCI6IDIsICJ0b29sc19uYW1lcyI6IFtd
|
||||
fV1KkwQKCmNyZXdfdGFza3MShAQKgQRbeyJrZXkiOiAiNmFmYzRiMzk2MjU5ZmJiNzY4MWY1NmM3
|
||||
NzU1Y2M5MzciLCAiaWQiOiAiODE2YzI1ZDgtNDg3NC00MmMxLWJmNzEtODc2OTcxZDNmYmExIiwg
|
||||
ImFzeW5jX2V4ZWN1dGlvbj8iOiBmYWxzZSwgImh1bWFuX2lucHV0PyI6IGZhbHNlLCAiYWdlbnRf
|
||||
cm9sZSI6ICJ7dG9waWN9IFNlbmlvciBEYXRhIFJlc2VhcmNoZXJcbiIsICJhZ2VudF9rZXkiOiAi
|
||||
NzNjMzQ5YzkzYzE2M2I1ZDRkZjk4YTY0ZmFjMWM0MzAiLCAidG9vbHNfbmFtZXMiOiBbXX0sIHsi
|
||||
a2V5IjogImIxN2IxODhkYmYxNGY5M2E5OGU1Yjk1YWFkMzY3NTc3IiwgImlkIjogIjM4YzU1NTI5
|
||||
LTc2ODAtNDc5OS1iODdiLTFmMDY2NjE5MGU2NyIsICJhc3luY19leGVjdXRpb24/IjogZmFsc2Us
|
||||
ICJodW1hbl9pbnB1dD8iOiBmYWxzZSwgImFnZW50X3JvbGUiOiAie3RvcGljfSBSZXBvcnRpbmcg
|
||||
QW5hbHlzdFxuIiwgImFnZW50X2tleSI6ICIxMDRmZTA2NTllMTBiNDI2Y2Y4OGYwMjRmYjU3MTU1
|
||||
MyIsICJ0b29sc19uYW1lcyI6IFtdfV16AhgBhQEAAQAAEo4CChCo3E4xT/U6O20NrD4/Zkt6EggD
|
||||
/w74tbrrOCoMVGFzayBDcmVhdGVkMAE5SPTas+uuCRhB6IDbs+uuCRhKLgoIY3Jld19rZXkSIgog
|
||||
MWYxMjhiZGI3YmFhNGI2NzcxNGYxZGFlZGMyZjNhYjZKMQoHY3Jld19pZBImCiQzNGJiYzZjYS03
|
||||
MmRiLTQwMzktODQzMy01NTFmOWNmNDM0YTdKLgoIdGFza19rZXkSIgogNmFmYzRiMzk2MjU5ZmJi
|
||||
NzY4MWY1NmM3NzU1Y2M5MzdKMQoHdGFza19pZBImCiQ4MTZjMjVkOC00ODc0LTQyYzEtYmY3MS04
|
||||
NzY5NzFkM2ZiYTF6AhgBhQEAAQAA
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '1902'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.27.0
|
||||
method: POST
|
||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Wed, 20 Nov 2024 13:04:24 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are Bicycles Senior Data
|
||||
Researcher\n. You''re a seasoned researcher with a knack for uncovering the
|
||||
latest developments in Bicycles. Known for your ability to find the most relevant
|
||||
information and present it in a clear and concise manner.\n\nYour personal goal
|
||||
is: Uncover cutting-edge developments in Bicycles\n\nTo give my best complete
|
||||
final answer to the task use the exact following format:\n\nThought: I now can
|
||||
give a great answer\nFinal Answer: Your final answer must be the great and the
|
||||
most complete as possible, it must be outcome described.\n\nI MUST use these
|
||||
formats, my job depends on it!"}, {"role": "user", "content": "\nCurrent Task:
|
||||
Conduct a thorough research about Bicycles Make sure you find any interesting
|
||||
and relevant information given the current year is 2024.\n\n\nThis is the expect
|
||||
criteria for your final answer: A list with 10 bullet points of the most relevant
|
||||
information about Bicycles\n\nyou MUST return the actual complete content as
|
||||
the final answer, not a summary.\n\nBegin! This is VERY important to you, use
|
||||
the tools available and give your best Final Answer, your job depends on it!\n\nThought:"}],
|
||||
"model": "gpt-4o-mini", "stop": ["\nObservation:"], "stream": false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '1260'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=CkK4UvBd9ukXvn50uJwGambJcz5zERAJfeXJ9xge6H4-1732107842-1.0.1.1-IOK2yVL3RlD75MgmnKzIEyE38HNknwn6I8BBJ1wjGz4jCTd0YWIBPnvUm9gB8D_zLlUA9G7p_wbrfyc4mO_Bmg;
|
||||
_cfuvid=MmeN9oHWrBLThkEJdaSFHBfWe95JvA8iFnnt7CC92tk-1732107842102-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- x64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
content: "{\n \"id\": \"chatcmpl-AVefTnyhy126z54bX4Wq0TjWFUGJI\",\n \"object\":
|
||||
\"chat.completion\",\n \"created\": 1732107859,\n \"model\": \"gpt-4o-mini-2024-07-18\",\n
|
||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
||||
\"assistant\",\n \"content\": \"I now can give a great answer. \\nFinal
|
||||
Answer: \\n\\n1. **E-Bike Boom**: Electric bikes (e-bikes) have seen a significant
|
||||
rise in popularity, with industry reports indicating a projected growth of 60%
|
||||
in sales compared to previous years. Many cities are paving bike lanes specifically
|
||||
designed for e-bikes to accommodate this surge.\\n\\n2. **Sustainability in
|
||||
Manufacturing**: Bicycle manufacturers are increasingly adopting sustainable
|
||||
practices, such as using recycled materials for frames and parts, and implementing
|
||||
environmentally friendly production processes. This shift is driven by consumer
|
||||
demand for greener products.\\n\\n3. **Smart Bicycles**: The integration of
|
||||
technology in bicycles has progressed with smart bikes featuring built-in GPS,
|
||||
automated gear shifting, and performance analytics. These innovations enhance
|
||||
the cycling experience and cater to data-driven enthusiasts.\\n\\n4. **Bike
|
||||
Sharing Programs**: Urban areas are continuing to expand bike-sharing programs,
|
||||
with some cities introducing electric bike options and introducing smartphone
|
||||
apps to streamline the renting process, increasing accessibility and convenience
|
||||
for riders.\\n\\n5. **Safety Innovations**: Advances in safety technology such
|
||||
as smart helmets that incorporate lights and indicators, anti-collision systems
|
||||
using sensor technology, and built-in communication systems to connect with
|
||||
smartphones are on the rise, aimed at reducing accidents.\\n\\n6. **Adventure
|
||||
Cycling Trends**: There is a growing popularity in adventure and gravel cycling,
|
||||
with more cyclists seeking off-road experiences. This has prompted manufacturers
|
||||
to develop dedicated bikes that cater to rugged terrains, with features such
|
||||
as wider tires and durable frames.\\n\\n7. **Customization and Personalization**:
|
||||
The market for customizable bicycles is expanding. Consumers are now able to
|
||||
choose colors, styles, and features that suit their personal preferences, leading
|
||||
to a more personalized cycling experience.\\n\\n8. **Communities and Events**:
|
||||
Cycling communities are thriving globally, with an increase in events such as
|
||||
group rides, competitive races, and festivals celebrating biking culture. This
|
||||
fosters social engagement and promotes cycling as a lifestyle.\\n\\n9. **Cargo
|
||||
Bikes for Urban Living**: The rise of cargo bikes, particularly in urban environments,
|
||||
allows for efficient transportation of goods, making them an appealing choice
|
||||
for small businesses and families. This trend is encouraged by city planners
|
||||
promoting cycling as an alternative to car deliveries.\\n\\n10. **Regulatory
|
||||
Changes**: Governments around the world are increasingly implementing policies
|
||||
to support cycling infrastructure, such as funding for bike lanes, subsidies
|
||||
for bicycle purchases, and stricter emissions standards for motor vehicles,
|
||||
making cycling a more attractive option for commuting.\\n\\nEach of these points
|
||||
represents the latest developments in the bicycle industry as we move through
|
||||
2024, highlighting advancements in technology, trends in user preferences, and
|
||||
a broader societal shift towards sustainability and health.\",\n \"refusal\":
|
||||
null\n },\n \"logprobs\": null,\n \"finish_reason\": \"stop\"\n
|
||||
\ }\n ],\n \"usage\": {\n \"prompt_tokens\": 237,\n \"completion_tokens\":
|
||||
539,\n \"total_tokens\": 776,\n \"prompt_tokens_details\": {\n \"cached_tokens\":
|
||||
0,\n \"audio_tokens\": 0\n },\n \"completion_tokens_details\": {\n
|
||||
\ \"reasoning_tokens\": 0,\n \"audio_tokens\": 0,\n \"accepted_prediction_tokens\":
|
||||
0,\n \"rejected_prediction_tokens\": 0\n }\n },\n \"system_fingerprint\":
|
||||
\"fp_0705bf87c0\"\n}\n"
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8e58a5276a096225-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Wed, 20 Nov 2024 13:04:26 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '7355'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-ratelimit-limit-requests:
|
||||
- '30000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '150000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '29999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '149999708'
|
||||
x-ratelimit-reset-requests:
|
||||
- 2ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_5536f2a242886d3949f0cdc1628b2996
|
||||
http_version: HTTP/1.1
|
||||
status_code: 200
|
||||
- request:
|
||||
body: !!binary |
|
||||
Cs4CCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSpQIKEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRKOAgoQpBIRwGH/fJtGJT1cIWsC5BIIM3YyJZEYUUgqDFRhc2sgQ3JlYXRlZDABOYgb
|
||||
lILtrgkYQZBnlYLtrgkYSi4KCGNyZXdfa2V5EiIKIDFmMTI4YmRiN2JhYTRiNjc3MTRmMWRhZWRj
|
||||
MmYzYWI2SjEKB2NyZXdfaWQSJgokMzRiYmM2Y2EtNzJkYi00MDM5LTg0MzMtNTUxZjljZjQzNGE3
|
||||
Si4KCHRhc2tfa2V5EiIKIGIxN2IxODhkYmYxNGY5M2E5OGU1Yjk1YWFkMzY3NTc3SjEKB3Rhc2tf
|
||||
aWQSJgokMzhjNTU1MjktNzY4MC00Nzk5LWI4N2ItMWYwNjY2MTkwZTY3egIYAYUBAAEAAA==
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '337'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.27.0
|
||||
method: POST
|
||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Wed, 20 Nov 2024 13:04:29 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are Bicycles Reporting
|
||||
Analyst\n. You''re a meticulous analyst with a keen eye for detail. You''re
|
||||
known for your ability to turn complex data into clear and concise reports,
|
||||
making it easy for others to understand and act on the information you provide.\n\nYour
|
||||
personal goal is: Create detailed reports based on Bicycles data analysis and
|
||||
research findings\n\nTo give my best complete final answer to the task use the
|
||||
exact following format:\n\nThought: I now can give a great answer\nFinal Answer:
|
||||
Your final answer must be the great and the most complete as possible, it must
|
||||
be outcome described.\n\nI MUST use these formats, my job depends on it!"},
|
||||
{"role": "user", "content": "\nCurrent Task: Review the context you got and
|
||||
expand each topic into a full section for a report. Make sure the report is
|
||||
detailed and contains any and all relevant information.\n\n\nThis is the expect
|
||||
criteria for your final answer: A fully fledge reports with the mains topics,
|
||||
each with a full section of information. Formatted as markdown without ''```''\n\nyou
|
||||
MUST return the actual complete content as the final answer, not a summary.\n\nThis
|
||||
is the context you''re working with:\n1. **E-Bike Boom**: Electric bikes (e-bikes)
|
||||
have seen a significant rise in popularity, with industry reports indicating
|
||||
a projected growth of 60% in sales compared to previous years. Many cities are
|
||||
paving bike lanes specifically designed for e-bikes to accommodate this surge.\n\n2.
|
||||
**Sustainability in Manufacturing**: Bicycle manufacturers are increasingly
|
||||
adopting sustainable practices, such as using recycled materials for frames
|
||||
and parts, and implementing environmentally friendly production processes. This
|
||||
shift is driven by consumer demand for greener products.\n\n3. **Smart Bicycles**:
|
||||
The integration of technology in bicycles has progressed with smart bikes featuring
|
||||
built-in GPS, automated gear shifting, and performance analytics. These innovations
|
||||
enhance the cycling experience and cater to data-driven enthusiasts.\n\n4. **Bike
|
||||
Sharing Programs**: Urban areas are continuing to expand bike-sharing programs,
|
||||
with some cities introducing electric bike options and introducing smartphone
|
||||
apps to streamline the renting process, increasing accessibility and convenience
|
||||
for riders.\n\n5. **Safety Innovations**: Advances in safety technology such
|
||||
as smart helmets that incorporate lights and indicators, anti-collision systems
|
||||
using sensor technology, and built-in communication systems to connect with
|
||||
smartphones are on the rise, aimed at reducing accidents.\n\n6. **Adventure
|
||||
Cycling Trends**: There is a growing popularity in adventure and gravel cycling,
|
||||
with more cyclists seeking off-road experiences. This has prompted manufacturers
|
||||
to develop dedicated bikes that cater to rugged terrains, with features such
|
||||
as wider tires and durable frames.\n\n7. **Customization and Personalization**:
|
||||
The market for customizable bicycles is expanding. Consumers are now able to
|
||||
choose colors, styles, and features that suit their personal preferences, leading
|
||||
to a more personalized cycling experience.\n\n8. **Communities and Events**:
|
||||
Cycling communities are thriving globally, with an increase in events such as
|
||||
group rides, competitive races, and festivals celebrating biking culture. This
|
||||
fosters social engagement and promotes cycling as a lifestyle.\n\n9. **Cargo
|
||||
Bikes for Urban Living**: The rise of cargo bikes, particularly in urban environments,
|
||||
allows for efficient transportation of goods, making them an appealing choice
|
||||
for small businesses and families. This trend is encouraged by city planners
|
||||
promoting cycling as an alternative to car deliveries.\n\n10. **Regulatory Changes**:
|
||||
Governments around the world are increasingly implementing policies to support
|
||||
cycling infrastructure, such as funding for bike lanes, subsidies for bicycle
|
||||
purchases, and stricter emissions standards for motor vehicles, making cycling
|
||||
a more attractive option for commuting.\n\nEach of these points represents the
|
||||
latest developments in the bicycle industry as we move through 2024, highlighting
|
||||
advancements in technology, trends in user preferences, and a broader societal
|
||||
shift towards sustainability and health.\n\nBegin! This is VERY important to
|
||||
you, use the tools available and give your best Final Answer, your job depends
|
||||
on it!\n\nThought:"}], "model": "gpt-4o-mini", "stop": ["\nObservation:"], "stream":
|
||||
false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '4457'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=CkK4UvBd9ukXvn50uJwGambJcz5zERAJfeXJ9xge6H4-1732107842-1.0.1.1-IOK2yVL3RlD75MgmnKzIEyE38HNknwn6I8BBJ1wjGz4jCTd0YWIBPnvUm9gB8D_zLlUA9G7p_wbrfyc4mO_Bmg;
|
||||
_cfuvid=MmeN9oHWrBLThkEJdaSFHBfWe95JvA8iFnnt7CC92tk-1732107842102-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- x64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
content: "{\n \"id\": \"chatcmpl-AVefbpMLcvelEguI3pyXOpfbaXLGG\",\n \"object\":
|
||||
\"chat.completion\",\n \"created\": 1732107867,\n \"model\": \"gpt-4o-mini-2024-07-18\",\n
|
||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
||||
\"assistant\",\n \"content\": \"I now can give a great answer \\nFinal
|
||||
Answer: \\n\\n# Comprehensive Report on the Latest Developments in the Bicycle
|
||||
Industry (2024)\\n\\n## 1. E-Bike Boom\\nThe popularity of electric bikes (e-bikes)
|
||||
has surged dramatically in recent years, with industry reports indicating a
|
||||
projected growth of 60% in sales compared to previous years. This growth can
|
||||
be attributed to increasing urbanization, the rising need for more sustainable
|
||||
modes of transport, and technological advancements that have made e-bikes more
|
||||
accessible and desirable. Cities worldwide are responding to this boom by developing
|
||||
dedicated bike lanes specifically designed for e-bikes, which not only promotes
|
||||
safety but also encourages more individuals to consider cycling as a primary
|
||||
mode of transportation.\\n\\n## 2. Sustainability in Manufacturing\\nIn line
|
||||
with global trends towards sustainability, bicycle manufacturers are increasingly
|
||||
adopting eco-friendlier practices. They are utilizing recycled materials for
|
||||
frames and components and implementing environmentally friendly production processes.
|
||||
This shift is not just a response to regulatory pressures but also driven by
|
||||
consumer demand for greener products. Companies that prioritize sustainability
|
||||
are seeing a competitive edge in an increasingly eco-conscious market, as consumers
|
||||
are more likely to align their purchases with their values regarding environmental
|
||||
responsibility.\\n\\n## 3. Smart Bicycles\\nThe integration of technology in
|
||||
bicycles has advanced significantly, resulting in the emergence of smart bikes.
|
||||
These bicycles often feature built-in GPS for navigation, automated gear shifting
|
||||
for smoother rides, and performance analytics that allow users to track their
|
||||
cycling metrics. Such innovations enhance the overall cycling experience and
|
||||
cater to performance-focused cyclists who seek data to optimize their rides.
|
||||
By merging cycling with technology, manufacturers are not only attracting tech
|
||||
enthusiasts but also making cycling more mainstream.\\n\\n## 4. Bike Sharing
|
||||
Programs\\nBike-sharing programs are rapidly expanding, particularly in urban
|
||||
areas. Many cities have started introducing electric bike options within these
|
||||
programs to meet the growing demand. The introduction of smartphone apps has
|
||||
streamlined the renting process, increasing accessibility and convenience for
|
||||
users. This trend not only promotes a healthier lifestyle but also reduces traffic
|
||||
congestion and environmental impact in densely populated areas, making cycling
|
||||
a more viable option for commuting.\\n\\n## 5. Safety Innovations\\nRecent advancements
|
||||
in safety technology are working towards making cycling safer. Innovations such
|
||||
as smart helmets equipped with lights and turn indicators, anti-collision systems
|
||||
utilizing sensor technology, and integrated communication systems linking bicycles
|
||||
with smartphones are increasingly gaining traction. These developments aim to
|
||||
minimize accidents and enhance the overall sense of security for cyclists, thereby
|
||||
encouraging more people to take up cycling as a daily activity.\\n\\n## 6. Adventure
|
||||
Cycling Trends\\nAdventure and gravel cycling are witnessing a renaissance,
|
||||
with many cyclists seeking off-road experiences that enable a connection with
|
||||
nature. This trend has led manufacturers to innovate by developing dedicated
|
||||
bikes suited for rugged terrains, characterized by features like wider tires
|
||||
and durable frames. As consumers become more adventurous in their hobbies, manufacturers
|
||||
are recognizing the need to cater to this niche market, fostering the growth
|
||||
of adventure cycling as a distinct segment in the industry.\\n\\n## 7. Customization
|
||||
and Personalization\\nThe demand for customizable bicycles is on the rise, allowing
|
||||
consumers to choose various aspects of their bikes, including colors, styles,
|
||||
and features. This trend towards personalization is enhancing the cycling experience,
|
||||
as riders can tailor their bicycles to their preferences. The flourishing market
|
||||
for custom bikes reflects a broader societal shift towards individuality and
|
||||
self-expression, as consumers are no longer content with one-size-fits-all solutions.\\n\\n##
|
||||
8. Communities and Events\\nCycling communities are thriving worldwide, reflected
|
||||
in an increase in events such as group rides, competitive races, and festivals
|
||||
celebrating biking culture. These gatherings not only foster a sense of camaraderie
|
||||
among cyclists but also promote cycling as a lifestyle choice to the wider community.
|
||||
The growth of these events is instrumental in building a culture around cycling,
|
||||
driving advocacy for cycling infrastructure and safety, and ultimately increasing
|
||||
the number of people who cycle.\\n\\n## 9. Cargo Bikes for Urban Living\\nThe
|
||||
rise of cargo bikes, especially in urban settings, represents an innovative
|
||||
solution for transporting goods efficiently while reducing reliance on motor
|
||||
vehicles. Such bikes serve as an appealing alternative for small businesses
|
||||
and families alike, allowing for easy deliveries and shopping. City planners
|
||||
are increasingly promoting cargo bikes within urban transport strategies, recognizing
|
||||
them as a sustainable option that aligns with broader goals for reducing carbon
|
||||
footprints and enhancing urban mobility.\\n\\n## 10. Regulatory Changes\\nGovernments
|
||||
around the globe are progressively enacting regulations to support and grow
|
||||
cycling infrastructure. Initiatives include funding for bike lanes, subsidies
|
||||
for bicycle purchases, and stricter emissions standards for cars. These regulatory
|
||||
changes are making cycling a more attractive option for commuting and are an
|
||||
acknowledgment of the role that cycling plays in reducing pollution and traffic
|
||||
congestion. Such policies are instrumental in fostering a cycling-friendly environment
|
||||
and encouraging more people to adopt biking as a daily mode of transportation.\\n\\nThis
|
||||
report highlights the most significant developments in the bicycle industry
|
||||
as we advance through 2024, showcasing the technological breakthroughs, shifts
|
||||
in user preferences, and an overarching movement toward sustainability and health.
|
||||
These trends are indicative of a vibrant cycling culture that continues to evolve
|
||||
to meet the needs of modern society.\",\n \"refusal\": null\n },\n
|
||||
\ \"logprobs\": null,\n \"finish_reason\": \"stop\"\n }\n ],\n
|
||||
\ \"usage\": {\n \"prompt_tokens\": 790,\n \"completion_tokens\": 1022,\n
|
||||
\ \"total_tokens\": 1812,\n \"prompt_tokens_details\": {\n \"cached_tokens\":
|
||||
0,\n \"audio_tokens\": 0\n },\n \"completion_tokens_details\": {\n
|
||||
\ \"reasoning_tokens\": 0,\n \"audio_tokens\": 0,\n \"accepted_prediction_tokens\":
|
||||
0,\n \"rejected_prediction_tokens\": 0\n }\n },\n \"system_fingerprint\":
|
||||
\"fp_0705bf87c0\"\n}\n"
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8e58a5580add6225-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Wed, 20 Nov 2024 13:04:46 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '18921'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-ratelimit-limit-requests:
|
||||
- '30000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '150000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '29999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '149998916'
|
||||
x-ratelimit-reset-requests:
|
||||
- 2ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_32b801874a2fed46b91251052364ec47
|
||||
http_version: HTTP/1.1
|
||||
status_code: 200
|
||||
version: 1
|
115
tests/cassettes/test_agent_with_knowledge_sources.yaml
Normal file
@@ -0,0 +1,115 @@
interactions:
- request:
    body: '{"messages": [{"role": "system", "content": "You are Information Agent.
      You have access to specific knowledge sources.\nYour personal goal is: Provide
      information based on knowledge sources\nTo give my best complete final answer
      to the task use the exact following format:\n\nThought: I now can give a great
      answer\nFinal Answer: Your final answer must be the great and the most complete
      as possible, it must be outcome described.\n\nI MUST use these formats, my job
      depends on it!"}, {"role": "user", "content": "\nCurrent Task: What is Brandon''s
      favorite color?\n\nThis is the expect criteria for your final answer: Brandon''s
      favorite color.\nyou MUST return the actual complete content as the final answer,
      not a summary.\n\nBegin! This is VERY important to you, use the tools available
      and give your best Final Answer, your job depends on it!\n\nThought:"}], "model":
      "gpt-4o-mini", "stop": ["\nObservation:"], "stream": false}'
    headers:
      accept:
      - application/json
      accept-encoding:
      - gzip, deflate
      connection:
      - keep-alive
      content-length:
      - '931'
      content-type:
      - application/json
      host:
      - api.openai.com
      user-agent:
      - OpenAI/Python 1.52.1
      x-stainless-arch:
      - arm64
      x-stainless-async:
      - 'false'
      x-stainless-lang:
      - python
      x-stainless-os:
      - MacOS
      x-stainless-package-version:
      - 1.52.1
      x-stainless-raw-response:
      - 'true'
      x-stainless-retry-count:
      - '0'
      x-stainless-runtime:
      - CPython
      x-stainless-runtime-version:
      - 3.11.9
    method: POST
    uri: https://api.openai.com/v1/chat/completions
  response:
    body:
      string: !!binary |
        H4sIAAAAAAAAA4xSQW7bMBC86xULXnqxAtmxI1e3FEWBtJekCXJpC4GmVhIdapcgqbhN4L8HlB1L
        QVOgFwGa2RnOLPmcAAhdiQKEamVQnTXp5f3d9lsdbndh++C+757Or6/bm6uvn59WH/FGzKKCN1tU
        4VV1prizBoNmOtDKoQwYXef5+SK7WK3ny4HouEITZY0N6ZLTTpNOF9limWZ5Ol8f1S1rhV4U8CMB
        AHgevjEnVfhbFJDNXpEOvZcNiuI0BCAcm4gI6b32QVIQs5FUTAFpiH4FxDtQkqDRjwgSmhgbJPkd
        OoCf9EWTNHA5/BfwyUmqmD54qOUjOx0QFBt2oD1sTI9n02Mc1r2XsSr1xhzx/Sm34cY63vgjf8Jr
        Tdq3pUPpmWJGH9iKgd0nAL+G/fRvKgvruLOhDPyAFA3nF/nBT4zXMmHXRzJwkGaKr2bv+JUVBqmN
        n2xYKKlarEbpeB2yrzRPiGTS+u8073kfmmtq/sd+JJRCG7AqrcNKq7eNxzGH8dX+a+y05SGw8H98
        wK6sNTXorNOHN1PbMsuz1aZe5yoTyT55AQAA//8DAPaYLdRBAwAA
    headers:
      CF-Cache-Status:
      - DYNAMIC
      CF-RAY:
      - 8e54a2a7d81467f7-SJC
      Connection:
      - keep-alive
      Content-Encoding:
      - gzip
      Content-Type:
      - application/json
      Date:
      - Wed, 20 Nov 2024 01:23:34 GMT
      Server:
      - cloudflare
      Set-Cookie:
      - __cf_bm=DoHo1Z11nN9bxkwZmJGnaxRhyrWE0UfyimYuUVRU6A4-1732065814-1.0.1.1-JVRvFrIJLHEq9OaFQS0qcgYcawE7t2XQ4Tpqd58n2Yfx3mvEqD34MJmooi1LtvdvjB2J8x1Rs.rCdXD.msLlKw;
        path=/; expires=Wed, 20-Nov-24 01:53:34 GMT; domain=.api.openai.com; HttpOnly;
        Secure; SameSite=None
      - _cfuvid=n3RrNhFMqC3HtJ7n3e3agyxnM1YOQ6eKESz_eeXLtZA-1732065814630-0.0.1.1-604800000;
        path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
      Transfer-Encoding:
      - chunked
      X-Content-Type-Options:
      - nosniff
      access-control-expose-headers:
      - X-Request-ID
      alt-svc:
      - h3=":443"; ma=86400
      openai-organization:
      - crewai-iuxna1
      openai-processing-ms:
      - '344'
      openai-version:
      - '2020-10-01'
      strict-transport-security:
      - max-age=31536000; includeSubDomains; preload
      x-ratelimit-limit-requests:
      - '30000'
      x-ratelimit-limit-tokens:
      - '150000000'
      x-ratelimit-remaining-requests:
      - '29999'
      x-ratelimit-remaining-tokens:
      - '149999790'
      x-ratelimit-reset-requests:
      - 2ms
      x-ratelimit-reset-tokens:
      - 0s
      x-request-id:
      - req_8f1622677c64913753a595f679596614
    status:
      code: 200
      message: OK
version: 1
445
tests/cassettes/test_before_crew_modification.yaml
Normal file
@@ -0,0 +1,445 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are Bicycles Senior Data
|
||||
Researcher\n. You''re a seasoned researcher with a knack for uncovering the
|
||||
latest developments in Bicycles. Known for your ability to find the most relevant
|
||||
information and present it in a clear and concise manner.\n\nYour personal goal
|
||||
is: Uncover cutting-edge developments in Bicycles\n\nTo give my best complete
|
||||
final answer to the task use the exact following format:\n\nThought: I now can
|
||||
give a great answer\nFinal Answer: Your final answer must be the great and the
|
||||
most complete as possible, it must be outcome described.\n\nI MUST use these
|
||||
formats, my job depends on it!"}, {"role": "user", "content": "\nCurrent Task:
|
||||
Conduct a thorough research about Bicycles Make sure you find any interesting
|
||||
and relevant information given the current year is 2024.\n\n\nThis is the expect
|
||||
criteria for your final answer: A list with 10 bullet points of the most relevant
|
||||
information about Bicycles\n\nyou MUST return the actual complete content as
|
||||
the final answer, not a summary.\n\nBegin! This is VERY important to you, use
|
||||
the tools available and give your best Final Answer, your job depends on it!\n\nThought:"}],
|
||||
"model": "gpt-4o", "stop": ["\nObservation:"], "stream": false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '1255'
|
||||
content-type:
|
||||
- application/json
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAA4xX247cyA19n68g+ikxugc9vuzMzpvttTdGMtjAHsPAxoHBrqKkypSKSpHqnvZi
|
||||
/z0gpb4MsgHy0mipqljk4eEh9dsFwCLFxS0sQoca+iGvXn/u8fuX9cPDX7tffu1vtlkffv5l/fnX
|
||||
j9uPN68WSzvBm39R0MOpy8D9kEkTl2k5VEIls3p1/eLqx/V6vX7lCz1HynasHXT1klfP189frtY3
|
||||
q/UP88GOUyBZ3MI/LgAAfvNfc7FEelzcwnp5eNOTCLa0uD1uAlhUzvZmgSJJFIsulqfFwEWpuNf3
|
||||
HY9tp7fwAQrvIGCBNm0JEFpzHbDIjirA1/I+Fczw2p9v4Wv5Wq4u4dmzd5mC1hTgTQr7kEngT+9W
|
||||
b9IDyZ/hJ+5TQSXQjuAO6wPp7bNn8KGAhbsEWm1sI5g/qYwEypAJI6QCgmarrbzTDtrMG8x5fwn3
|
||||
HaUKAw9jxpp0D0mgGSlThM3e78G4xRKop6JmZ4OqVPegFLrCmdv9ErhpqKbSQubSUoWKpSUBLBGk
|
||||
46pUIXRYW9uiqSdZQo8P9hS470e1fz1XAmqaFJLd5GdHUUwFN5ns5rFusACVbapczB25NNSeG2of
|
||||
ilJb0YgCu6QdfOqxKtwfnTScZkShxzI2GHSsVAWwmnUjlqTS5r09cB3YrJUWPvD9WayGKJXOAIFR
|
||||
qAI9DlQTlUCX8J7QbArk9EDw898/gVYMFucSmqSFRCCiImRuW39rUWLRtNKOGgXZi1IvEFOloHkP
|
||||
OZUHinapWDxDx4UmjzcUjAwtGBkjVs9xIeOWl8KEzQvD5tMZjHfH0O2sw5P+PZIYPO9OyGI2DgWq
|
||||
RaDDLcEwSkfR0jVgSSTmEkYe1GhNxW59YnmoHEjEMi1j6AAFRk05fbfFSp6HCD0q1YRZllApjsEZ
|
||||
gXXDBRpmHWoqKhbYUDmOwbI7YZZMFcxRP5FqMPYCBS7cm3dGgVScvps56anEUbTuHZeXE2cKb50y
|
||||
fsffUtvpjuwX7g6OGS73HUGkLWUevAi4MaAdC5akdB5GKiGP8SwO87atOHRUyKKUMU8h0aNW6smT
|
||||
fLrYtsexeq6aij2JlWiSp1UoMGDV5FHnPRTWFOhQJhUdR7PU81gs9eCysJyZ6/mh2nDtncdeagNR
|
||||
dGheGTRvR1Hu03e3aet3HB1iEyL4iSS1RZ5Ij3ZkdSSA5o6f00rFuLszdk7C5KXZz7biZAa0M13M
|
||||
mXdQU7SSVIYwe0Ae6qQlGIxTXC3DVq2TfiWZbwqWBj8c05aqkHFYxp4qFKI42Rgq96xWo5NUmV+Q
|
||||
U0Mwq+1mP/liKDVc5+srDRmPIihqosoN7DrONMXm4P3gCv44YBFTIm7gs4vW233ILialqShaRxcf
|
||||
A/DOZC8kddZyzXGXIs2atCVxgqfizqXSrhrTmugidW7pVGWRYjIg4hwZllmI7XElHXp5SujIZdjl
|
||||
LPBYsSWrn9MFJ2W2016ellI0hTZgW3OOi4d9bWH/hTBr57u/UM6udm+oUJPUqfLFcm/9x23usJJv
|
||||
4Qa608lZJpdTOxiIhzyhETpm8bqaoUSjWsSU90CPVEMSgsrmMXlXOxY8CBkEIGNt6dQJuTlctjJG
|
||||
lQmymQQTNSfl7fExORED1ph4izKpjZeNMa/VzoBJxdzazBE7LjeGy0fzjBt4i7VlP/XZlFD3xw4/
|
||||
i8xcP+XQjdzZSL3jwtXub/no4xJ2XbKcV+9E0xatWGTg6nlrmaMAb6lOXRiiTy6BXG4bmzMcSJAu
|
||||
NToXqjzpuptR0tS2KFtN7YEHF8zlUwU6dmchNaNT+D+6lHA/kCa1IehQBhbRO/fTYz/fckhvhwLU
|
||||
byoGihBTm6wnDRnVZMuCNzZtU9URM1RCB9RFYmz7KZuHt2ZDoJ0TZC3ZxwTs+YxOVLQbJaHoUSca
|
||||
EqtizIDaZTLRmDGeLNm2cHCdCwxjHVhmJbhaW/DvOYwCXCbd/IQN6R7O+o5F/3pSdm8Kh2Yl087z
|
||||
GevUWXwUgI5yTzpL6mZMWVemE1a1xRTAPDrME47LdE2c2s1B3Yp3nW2StHFOLqdi4yKmxLix3PZD
|
||||
5a33bXsHQmGsvvUwwx2y6m5P84cR/kjGy/NZuVIzCtqoXsac5/e/H4fvzO1QeSPz+vF9k0qS7pvV
|
||||
BRcbtEV5WPjq7xcA//Qhf3wyty9M6wf9pvxAxQw+f3E92VucPitOqy+vb+ZVZcV8Wri+erX8A4Pf
|
||||
IimmLGffCYuAoaN4Onr6qMAxJj5buDgL+7/d+SPbU+iptP+P+dNCCDQoxW9Dtc7wNOTTtkr22fW/
|
||||
tl3AfwAAAP//ggYz2MFKkGQVn5YJqkHBjTRQhKQVxJunGlkYG6eaplkqcdVyAQAAAP//AwB8fdED
|
||||
Ag4AAA==
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8e44d29989a61ab0-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Mon, 18 Nov 2024 03:20:10 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Set-Cookie:
|
||||
- __cf_bm=elWqsM.3Jt5.vyzDrpCmVftKrlxb0_fRVMZxBGUYfcE-1731900010-1.0.1.1-AxUZI4aRPPnqgUcewvytSN0TcEpcfBqYEZ.h2A96g3wUsy6Ui_pr4y81nyHf2Pcn1S3lz4zSmufsGDmnNKtHDQ;
|
||||
path=/; expires=Mon, 18-Nov-24 03:50:10 GMT; domain=.api.openai.com; HttpOnly;
|
||||
Secure; SameSite=None
|
||||
- _cfuvid=lzrs54cKet3l28qlaoF9_vtIs55.7H9Sbr6IhTssBmk-1731900010790-0.0.1.1-604800000;
|
||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '5249'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '30000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '29999708'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_a9781a68655042f161d8089cc3819728
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: !!binary |
|
||||
CuEOCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSuA4KEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRKQDAoQ2G6ncUKutPIsOmTplHC6bBIIclCMqGiNvUoqDENyZXcgQ3JlYXRlZDABOcBv
|
||||
KPXg8QgYQXAFLvXg8QgYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODAuMEoaCg5weXRob25fdmVy
|
||||
c2lvbhIICgYzLjExLjdKLgoIY3Jld19rZXkSIgogMWYxMjhiZGI3YmFhNGI2NzcxNGYxZGFlZGMy
|
||||
ZjNhYjZKMQoHY3Jld19pZBImCiQzNDEwYmI2Mi01NzYxLTRhMGQtOGY1Zi1hOTliZWY5NDYxM2VK
|
||||
HAoMY3Jld19wcm9jZXNzEgwKCnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdf
|
||||
bnVtYmVyX29mX3Rhc2tzEgIYAkobChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgCSqoFCgtjcmV3
|
||||
X2FnZW50cxKaBQqXBVt7ImtleSI6ICI3M2MzNDljOTNjMTYzYjVkNGRmOThhNjRmYWMxYzQzMCIs
|
||||
ICJpZCI6ICI1YzgyZGRkOS1kMTM3LTQ3MDMtODY0My1iNTFmZDBlMTUxMjkiLCAicm9sZSI6ICJ7
|
||||
dG9waWN9IFNlbmlvciBEYXRhIFJlc2VhcmNoZXJcbiIsICJ2ZXJib3NlPyI6IHRydWUsICJtYXhf
|
||||
aXRlciI6IDIwLCAibWF4X3JwbSI6IG51bGwsICJmdW5jdGlvbl9jYWxsaW5nX2xsbSI6ICIiLCAi
|
||||
bGxtIjogImdwdC00byIsICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogZmFsc2UsICJhbGxvd19jb2Rl
|
||||
X2V4ZWN1dGlvbj8iOiBmYWxzZSwgIm1heF9yZXRyeV9saW1pdCI6IDIsICJ0b29sc19uYW1lcyI6
|
||||
IFtdfSwgeyJrZXkiOiAiMTA0ZmUwNjU5ZTEwYjQyNmNmODhmMDI0ZmI1NzE1NTMiLCAiaWQiOiAi
|
||||
ODdlYmRiYTMtNDRmZS00ODBmLWI2MWQtMWYzZjIyMWE5MDE2IiwgInJvbGUiOiAie3RvcGljfSBS
|
||||
ZXBvcnRpbmcgQW5hbHlzdFxuIiwgInZlcmJvc2U/IjogdHJ1ZSwgIm1heF9pdGVyIjogMjAsICJt
|
||||
YXhfcnBtIjogbnVsbCwgImZ1bmN0aW9uX2NhbGxpbmdfbGxtIjogIiIsICJsbG0iOiAiZ3B0LTRv
|
||||
IiwgImRlbGVnYXRpb25fZW5hYmxlZD8iOiBmYWxzZSwgImFsbG93X2NvZGVfZXhlY3V0aW9uPyI6
|
||||
IGZhbHNlLCAibWF4X3JldHJ5X2xpbWl0IjogMiwgInRvb2xzX25hbWVzIjogW119XUqTBAoKY3Jl
|
||||
d190YXNrcxKEBAqBBFt7ImtleSI6ICI2YWZjNGIzOTYyNTlmYmI3NjgxZjU2Yzc3NTVjYzkzNyIs
|
||||
ICJpZCI6ICI2ZTIzZmMzMS02OGI2LTRjZTMtODZjNC0zMDcxZGUwZDdjMWIiLCAiYXN5bmNfZXhl
|
||||
Y3V0aW9uPyI6IGZhbHNlLCAiaHVtYW5faW5wdXQ/IjogZmFsc2UsICJhZ2VudF9yb2xlIjogInt0
|
||||
b3BpY30gU2VuaW9yIERhdGEgUmVzZWFyY2hlclxuIiwgImFnZW50X2tleSI6ICI3M2MzNDljOTNj
|
||||
MTYzYjVkNGRmOThhNjRmYWMxYzQzMCIsICJ0b29sc19uYW1lcyI6IFtdfSwgeyJrZXkiOiAiYjE3
|
||||
YjE4OGRiZjE0ZjkzYTk4ZTViOTVhYWQzNjc1NzciLCAiaWQiOiAiNzRhOWVhMjMtNzVmYy00NWFi
|
||||
LWIyMDAtMTllZTk0ZjU0Y2JkIiwgImFzeW5jX2V4ZWN1dGlvbj8iOiBmYWxzZSwgImh1bWFuX2lu
|
||||
cHV0PyI6IGZhbHNlLCAiYWdlbnRfcm9sZSI6ICJ7dG9waWN9IFJlcG9ydGluZyBBbmFseXN0XG4i
|
||||
LCAiYWdlbnRfa2V5IjogIjEwNGZlMDY1OWUxMGI0MjZjZjg4ZjAyNGZiNTcxNTUzIiwgInRvb2xz
|
||||
X25hbWVzIjogW119XXoCGAGFAQABAAASjgIKEDkgSkh9vBYObKyMriyidxwSCG3RsAoOYBU/KgxU
|
||||
YXNrIENyZWF0ZWQwATkgjkr14PEIGEHYFkv14PEIGEouCghjcmV3X2tleRIiCiAxZjEyOGJkYjdi
|
||||
YWE0YjY3NzE0ZjFkYWVkYzJmM2FiNkoxCgdjcmV3X2lkEiYKJDM0MTBiYjYyLTU3NjEtNGEwZC04
|
||||
ZjVmLWE5OWJlZjk0NjEzZUouCgh0YXNrX2tleRIiCiA2YWZjNGIzOTYyNTlmYmI3NjgxZjU2Yzc3
|
||||
NTVjYzkzN0oxCgd0YXNrX2lkEiYKJDZlMjNmYzMxLTY4YjYtNGNlMy04NmM0LTMwNzFkZTBkN2Mx
|
||||
YnoCGAGFAQABAAA=
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '1892'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.27.0
|
||||
method: POST
|
||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Mon, 18 Nov 2024 03:20:10 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: !!binary |
|
||||
Cs4CCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSpQIKEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRKOAgoQwB8k3adY9mK031pcBVZJKhII3fxizKFNiGkqDFRhc2sgQ3JlYXRlZDABOaj7
|
||||
K0Xi8QgYQUiCLUXi8QgYSi4KCGNyZXdfa2V5EiIKIDFmMTI4YmRiN2JhYTRiNjc3MTRmMWRhZWRj
|
||||
MmYzYWI2SjEKB2NyZXdfaWQSJgokMzQxMGJiNjItNTc2MS00YTBkLThmNWYtYTk5YmVmOTQ2MTNl
|
||||
Si4KCHRhc2tfa2V5EiIKIGIxN2IxODhkYmYxNGY5M2E5OGU1Yjk1YWFkMzY3NTc3SjEKB3Rhc2tf
|
||||
aWQSJgokNzRhOWVhMjMtNzVmYy00NWFiLWIyMDAtMTllZTk0ZjU0Y2JkegIYAYUBAAEAAA==
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '337'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.27.0
|
||||
method: POST
|
||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Mon, 18 Nov 2024 03:20:15 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are Bicycles Reporting
|
||||
Analyst\n. You''re a meticulous analyst with a keen eye for detail. You''re
|
||||
known for your ability to turn complex data into clear and concise reports,
|
||||
making it easy for others to understand and act on the information you provide.\n\nYour
|
||||
personal goal is: Create detailed reports based on Bicycles data analysis and
|
||||
research findings\n\nTo give my best complete final answer to the task use the
|
||||
exact following format:\n\nThought: I now can give a great answer\nFinal Answer:
|
||||
Your final answer must be the great and the most complete as possible, it must
|
||||
be outcome described.\n\nI MUST use these formats, my job depends on it!"},
|
||||
{"role": "user", "content": "\nCurrent Task: Review the context you got and
|
||||
expand each topic into a full section for a report. Make sure the report is
|
||||
detailed and contains any and all relevant information.\n\n\nThis is the expect
|
||||
criteria for your final answer: A fully fledge reports with the mains topics,
|
||||
each with a full section of information. Formatted as markdown without ''```''\n\nyou
|
||||
MUST return the actual complete content as the final answer, not a summary.\n\nThis
|
||||
is the context you''re working with:\n1. **Electric Bicycles (E-Bikes) Dominate
|
||||
the Market:** In 2024, e-bikes continue to lead in sales growth globally. Their
|
||||
popularity is fueled by the advancement in battery technology, offering longer
|
||||
ranges and shorter charging times, making commuting more efficient and sustainable
|
||||
in urban environments.\n\n2. **Integration with Smart Technology:** Bicycle
|
||||
manufacturers are increasingly incorporating IoT technology to enhance user
|
||||
experience. Features like GPS tracking, fitness data logging, and anti-theft
|
||||
systems directly linked to smartphones are becoming standard in newer models.\n\n3.
|
||||
**Sustainable Manufacturing Techniques:** Environmental concerns have pushed
|
||||
companies to adopt greener manufacturing processes, such as utilizing recycled
|
||||
materials, reducing carbon footprints in production, and implementing circular
|
||||
economies within the bicycle industry.\n\n4. **Innovations in Lightweight Materials:**
|
||||
The development of new composite materials, including carbon and graphene, results
|
||||
in extremely lightweight and durable frames. This advancement is particularly
|
||||
noticeable in racing and mountain bikes, enhancing performance and speed.\n\n5.
|
||||
**Customizable and Modular Bike Designs:** In 2024, there is a notable trend
|
||||
toward bikes with modular designs that allow riders to customize parts and accessories
|
||||
easily. This trend caters to diverse consumer needs and promotes longer bike
|
||||
life cycles by allowing for parts replacement instead of whole bikes.\n\n6.
|
||||
**Expansion of Urban Cycling Infrastructure:** More cities worldwide are investing
|
||||
in cycling-friendly infrastructure, such as dedicated bike lanes and bike-sharing
|
||||
schemes, to encourage eco-friendly commuting and reduce traffic congestion.\n\n7.
|
||||
**Health and Wellness Benefits:** With growing awareness of health and fitness,
|
||||
more people are choosing cycling as a daily exercise routine. The industry sees
|
||||
a surge in sales of fitness-oriented bicycles designed to maximize cardiovascular
|
||||
and strength training benefits.\n\n8. **Rise of Cargo and Utility Bicycles:**
|
||||
There is an increase in demand for cargo bicycles, which are used for transporting
|
||||
goods over short distances, reflecting a shift towards sustainable business
|
||||
delivery options, particularly in urban settings.\n\n9. **Competitive Cycling
|
||||
and Esports:** Competitive cycling has embraced digital platforms, with virtual
|
||||
reality and augmented reality races gaining traction among cycling enthusiasts
|
||||
and professional athletes for training and competition purposes.\n\n10. **Focus
|
||||
on Bike Safety Innovations:** Advances in bicycle safety technology, including
|
||||
smart helmets with built-in communication systems and advanced lighting for
|
||||
night visibility, are considerably improving rider security, making cycling
|
||||
a safer mode of transport.\n\nBegin! This is VERY important to you, use the
|
||||
tools available and give your best Final Answer, your job depends on it!\n\nThought:"}],
|
||||
"model": "gpt-4o", "stop": ["\nObservation:"], "stream": false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '4197'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=elWqsM.3Jt5.vyzDrpCmVftKrlxb0_fRVMZxBGUYfcE-1731900010-1.0.1.1-AxUZI4aRPPnqgUcewvytSN0TcEpcfBqYEZ.h2A96g3wUsy6Ui_pr4y81nyHf2Pcn1S3lz4zSmufsGDmnNKtHDQ;
|
||||
_cfuvid=lzrs54cKet3l28qlaoF9_vtIs55.7H9Sbr6IhTssBmk-1731900010790-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAA4RXTY/kxg29+1cQ44sNdDdmdm0nmdvueh1s4gGM3UkCJ75QVZTETH3IrFL3avzn
|
||||
A7Kk7mknQC4NtFRkkY+Pj9RvXwDcsL+5hxs3YnVxCvs3f4vdtz++G37++88fX//lp6f4/vM/n6eH
|
||||
+Ne75+8+3uzUInf/Jlc3q4PLcQpUOaf22glhJfV694fXd3+6vb29u7MXMXsKajZMdf9N3r+6ffXN
|
||||
/vaP+9vvVsMxs6Nycw//+gIA4Df71RCTp88393C7255EKgUHurk/HwK4kRz0yQ2WwqViqje7y0uX
|
||||
U6VkUT+OeR7Geg8fIOUTOEww8JEAYdDQAVM5kfySfkk/cMIAb+z/vT74Et7lOAmNlIqafKQpS4Wc
|
||||
4C27xQWCD8nPpcoCj0LJF+AEmqcZfwl3B3gfyFVht1kU+Or9/i0/Ufkavs+RE1aCOhI8oDxRVcMP
|
||||
zccOaLPtVtsduBxjTmGBp5RPCbAA7Tv1pq9S5TSbNxaYpxOKhyqo5cuyaGxDyB0GKGjOSg7suV84
|
||||
DauRt4iSIz2sUUWL6gCPI0GZZbAX7UqY8jQHFK6LgdoRYK3C3VzJQ81QeEjcs8NUAf1R3UZK1UDq
|
||||
sFaSBSq5MeWQh+UAD9mTbN6LFavLWCqEnAYSEEwDgcMJOw5cWVPAEPJJ4++zAH2ulDxZ0kcKWieE
|
||||
wmkIBG5EGegAb7xnJS+GsOwsRSE/O32kcdkxw4MjFRixGK4vsmpA4TQRhh1EfFrRi4BwZOwCASYP
|
||||
1PfsmFKFPJlzDXCWDpPVcK6chgO8KeAsE0DJc/IW0ClL8DAITlMgOHEdNYaBivlR51MOYdZ/LQP0
|
||||
6x25P6M3CRUDG6HMpSInC63kZnhQZse5khToKFHPFXrJsaFxgbDBgGtkDfhFYUh45MG4K6iptjgJ
|
||||
C1lR4DSyor5hZyBl8KRyUcivUFDkUjinclg75tUBPqRKg6AlZE4/RZQKj2em6NGt/yKmuUdXZ9FM
|
||||
RjwqQe0OLfsCFDtBy2fUNxfPuYdifi8M1AApjUZ/IwZ7UlJNJEzJUeuCuZAa6wGNVBJV/f84choK
|
||||
fPUhP34NXODEnsokhH7XsugJNcoCZXajNu6ff/qk4DkjUFlKpai1cNqCg5atYvLawpwg0YlkkwEw
|
||||
ZS0aDperpKwbSgu8aDohD9BzTVQKeKy4azeC5Lla+yQPMSeuWWAi6bPErfuFMOy1/OBZyNWwQB1F
|
||||
pXRtAYNvGnOicoAfZqkjScyi9W/N7gFT5X0dqa/nBK1EHVECT0cKeSK/g9z3JJr1ROgM3sjJQ7fA
|
||||
JPnIXl9xMpHHRHkuELJrKc+Tx0oFuAc8A8QFYj6ShyxQMU4k5K0KVsJCl6Kzw3AtTijG6VQUDL14
|
||||
E19FOiujEznVAk9HnV+argV4xDC33u/n5JrGaLNo45uUNnlTZ1zqRvjXB/j0oj8fznxWl8Z5/nWm
|
||||
oqffpyNLThqmavhm1VpSKCInbfdJOMvLi8+obLNqHVKHF7cpXzTzq+ZpupIGIJf3vTaBD1YSR6Vo
|
||||
5q27jL8vcohXOdRzDlBHrFpZjvxMQFfpcJzQ1QO8gZQrOzJPVYeqVrN171k7Kgd+PvexkOXnIWIl
|
||||
YQxtwqxZT5JXfd/9lxzhtfgLns4+9GyZ43QWywlT02kCDCVDn92sweS0qZo6dSidiX2uk7DyqVts
|
||||
AkR+NoKfo7ngaJxh3aoUC8M7kQzL/jJDJkGnqJT/NcBcTo4mUyEEx+J0KmvNUo6L0X4d5hcCFBiQ
|
||||
k6FgnhWc00hCreU0xJaqJ53i5I1MNoePSi2NuMG+8m8HfVMANY95zeJlgYXKlFPhdn6j/zeq9ykf
|
||||
rZhWtx95GOuJ9BcetoLq6Rfiy6vJkSC8OH6p/7YLmQgKHdepp7TbeNEyO8A/VJx/v57okpsLm0Zu
|
||||
er2VljsSA0An9EiJdheJWNXN5bhGth5VXQQ/S2P1qJPvqIJOfRbaROkSviI/oVS2UqryboLUkl7r
|
||||
ufZeu2BO2oDQcZsnNFgya1mvxH2dcRe5c8JVhfDQwCeBXlDnPkfVXwIkyX5JGNlpfXsOG2vXnoQy
|
||||
kQr5Ra4VQaps4Rq6FQcqRiP1QDb2tc2aHDZ3qBDMKkWBn9Z5W1QEBq1R8g3CJnhtBhfS2fgSOu1N
|
||||
SmUWupRFe8DG6TrXhyw6RDxF1KU993BEYX1USUR1dLdueGeh6CSjPy+huW+BG8vrOBfGUs9LzLcH
|
||||
eDeXmiM/n/fBh+ytLXX1h++Ne2dWL4RiigyFFFgYpC21Tf9q1k2+gPu9z7j6vKZ0+b+6btKl/nMC
|
||||
+3y7GnOm0rZJtGqRlFap7fomF5GorqsveD6SWCnIt0pOQj2Jbk0GbxNTknI4A7EGCz06LaiukoRl
|
||||
ub5GbbURVno4Fcws3GbPlE9ta7jsOxU5tKnH8iKjrPx0+i0CQr/OLBv5ryNdN6qGesr6oadT0IjQ
|
||||
vNQl0F6TccaWc1rQzbVRr8mfrXkv5/MB3i5AOh+35shJhX2eBkG/dpPQFHBtzCbtgXsqExoOlz2k
|
||||
nL90dm2AbXVQ/K1o60c6rS7/AwAA//+MWU1vGzcQvftXELq0AWShbuPGPTaBi+bqIOmpEKjd2V3W
|
||||
XHJBctdVAf/3Yj74IdsFehNEiuIMh2/ee2zClfaLv3jSMUGu2Z8P6v7vRbsoWf9K/PyTFPlnNwQd
|
||||
U1ippkSkCqIDZqY2Z75UtIuWFBCCoXqWc6XGiqqV5Q9pHuTMUq0bqh3iFSeifQbbHeKeSCjZl7nY
|
||||
l5wflDiM6+zaQ8xtkmdKhD30pkProtwfqx1WFgqo6zhpqq3YTTBnqlw4hQ6EslK7BkvnC7YJ4bSS
|
||||
bjOzpOj8GvQIlzSqqEAubWthM3gJOMCsqqrwO6jPjMb9i6hV1ANIT+YbIj2WUpn/Hf+IuhAm/B9w
|
||||
VM9+SVwukk4d5eoswcw6nImzEtKSGtwrC7oXSJxA2zQZCHte96Lb23NzSzgidgtSK/Y+HNTvtArt
|
||||
/Q+wloTKR9ajBI/cnQsg6icdgCb5QXbAjJs1jmzFuN5spl9zK600tQlUuyydLEGMNBSDig9CZyJe
|
||||
ntHM4CpfLh5OhhOqyguOlNXWsobFx0wJqojsSZCLkUI+jGT9zCrowjXpdOiN33RkSoeR5nZ4fVqN
|
||||
pbMQ+R65XSIQ61Ra5GXAObK9OgF+a/3TNRNvXns1iS4rBoIiHls2eRyIOgm1BVs+ir07pWfvRjmH
|
||||
5sCb/P8Hv6qie18ZAJpIaPpkQcqnIac8AzphAtkNmemsj4BMOIBxgw/tMX8XFaEfRa82k4jWZ+gl
|
||||
ujx5/G+0L3L16WUJXndTLtK7g3owTDs/6TB62sFXyXAuiNzJG/DryuRVJjcsMTaAWfQOBTGg5ycG
|
||||
wGSGxGyU24w6waQ3g6kRStAi7Al7Ogl9sNiPzxlkM78s/493IoKeLcRIqMoOAiFL8nJfURxjZmIj
|
||||
z7GEWgzTFg0Q5qSkNjPI4+TRe+RWyHPjhLZpTzat9Fk4v2a5pKD7ivINpEThsSnoLH/UBpOheGZ9
|
||||
RvfRuMZ0C0ReSTUhsf0oyZHoLdJvBkU+p5YrWOQVyi/ArgoVTWaobI+9lnl7jPGpYxGoiQUY+jys
|
||||
hNEv2yEHOHtB6mzLFWj8hTVnZtC5D+MW7inDVHLtlNoSmUqK8xVVb0aq/FZDeLaOeSVadTMhrSjT
|
||||
6DeoDht/96C+5WFgU+P7bw/vuOOsJDSgr0O/PrxrDBbJeFUGyNf1BqTOpGhYiuJyRTZgw7uv1Fqo
|
||||
WiMcdJqQ4QjCu5FP0zhlZmRkZoO9imbGnoObY/yvOCw8lzG3m7D/uvGFamXh4NfU8AdNot4PagRP
|
||||
ApAcJGtmk/TFbctpX6xOmHXZKHITSoNfMPmrY/5DiLv2tDWOhsCSw6Y7YhY+N7bwESyK952PnrSu
|
||||
7iZK8USaOBatCNrOTMVrzfD555q7+eGgfkNpwA8cj6C+MLNoFDpOlW9fWfpCjhpDtZXDGmUH5sMb
|
||||
l/ZIjmTL2F5qnyI2E1ri0dDGA5e2P0UIG2RbKFVPpbi6E9gZUlTI9pdF3D+FPTNdG7HgnREPUbzJ
|
||||
F68J1QBFgBSbJHu5JI8utCvMEEZw3Vnh5l56NcWyYJeDaGn1SckqaG3gp8l0U0sFEKZFjDvyOjZT
|
||||
aR7iEnnVzAQL3gfQj+LZcvUVTcMWQJYN+OPmarCYyT6XIdacXTrXKMlXbw/5EE9YHH3Aq4oxcbmQ
|
||||
SxVcLG8mRCiIr1WpLbyCHnOwhfW6IcqH9oEvwLBGje+LbrVWvn8uL4bWj0vwpyjj5fvBOBOnI0bg
|
||||
Hb4OxuSXHY0+Xyn1J71MrhePjTsE8yUdk38Ehwt++PGO19vVt9A6enNz+5MMJ5+0bUbu7m72byx5
|
||||
7AF1a2yeN3ed7ibo62/rWyiihG8GrprAX2/orbU5eOPG/7N8HejQZIT+uAQUTpdB12kB/iJ7/O1p
|
||||
JdG04R2X+3Ew+L5HjRSPZFiO72+74fZ9Dxp2V89X/wIAAP//AwC/JrUuuR4AAA==
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8e44d2bc2bec1ab0-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Mon, 18 Nov 2024 03:20:25 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '13936'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '30000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '29998979'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 2ms
|
||||
x-request-id:
|
||||
- req_602e9ec1c4bc0da2fdb284f809c50872
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
version: 1
|
||||
438
tests/cassettes/test_before_crew_with_none_input.yaml
Normal file
@@ -0,0 +1,438 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: !!binary |
|
||||
CuMOCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSug4KEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRKSDAoQf/zeqxfqyNP5BgW6rZrC0BIIiXyYjb3bUBcqDENyZXcgQ3JlYXRlZDABOXha
|
||||
vrnarwgYQcCbxrnarwgYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODAuMEoaCg5weXRob25fdmVy
|
||||
c2lvbhIICgYzLjExLjdKLgoIY3Jld19rZXkSIgogZjM0NmE5YWQ2ZDczMDYzZTA2NzdiMTdjZTlj
|
||||
NTAxNzdKMQoHY3Jld19pZBImCiQ2Yzg5NDczNy0zNWJjLTRhZDEtYjE2Ni1hZTY3ODhhMTA4YWZK
|
||||
HAoMY3Jld19wcm9jZXNzEgwKCnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdf
|
||||
bnVtYmVyX29mX3Rhc2tzEgIYAkobChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgCSqwFCgtjcmV3
|
||||
X2FnZW50cxKcBQqZBVt7ImtleSI6ICI3M2MzNDljOTNjMTYzYjVkNGRmOThhNjRmYWMxYzQzMCIs
|
||||
ICJpZCI6ICIzNDQ2YWRlOS05YWM0LTQ1NTUtOTlkNS0zYWM0MzdhMmMxNmUiLCAicm9sZSI6ICJ7
|
||||
dG9waWN9IFNlbmlvciBEYXRhIFJlc2VhcmNoZXJcbiIsICJ2ZXJib3NlPyI6IGZhbHNlLCAibWF4
|
||||
X2l0ZXIiOiAyMCwgIm1heF9ycG0iOiBudWxsLCAiZnVuY3Rpb25fY2FsbGluZ19sbG0iOiAiIiwg
|
||||
ImxsbSI6ICJncHQtNG8iLCAiZGVsZWdhdGlvbl9lbmFibGVkPyI6IGZhbHNlLCAiYWxsb3dfY29k
|
||||
ZV9leGVjdXRpb24/IjogZmFsc2UsICJtYXhfcmV0cnlfbGltaXQiOiAyLCAidG9vbHNfbmFtZXMi
|
||||
OiBbXX0sIHsia2V5IjogImJiMDY4Mzc3YzE2NDFiZTZkN2Q5N2E1MTY1OWRiNjEzIiwgImlkIjog
|
||||
IjExMzVjODkzLTRlZGUtNDRiNC1hMjZmLTIxYWUxNzA0ZDRlZCIsICJyb2xlIjogInt0b3BpY30g
|
||||
UmVwb3J0aW5nIEFuYWx5c3RcbiIsICJ2ZXJib3NlPyI6IGZhbHNlLCAibWF4X2l0ZXIiOiAyMCwg
|
||||
Im1heF9ycG0iOiBudWxsLCAiZnVuY3Rpb25fY2FsbGluZ19sbG0iOiAiIiwgImxsbSI6ICJncHQt
|
||||
NG8iLCAiZGVsZWdhdGlvbl9lbmFibGVkPyI6IGZhbHNlLCAiYWxsb3dfY29kZV9leGVjdXRpb24/
|
||||
IjogZmFsc2UsICJtYXhfcmV0cnlfbGltaXQiOiAyLCAidG9vbHNfbmFtZXMiOiBbXX1dSpMECgpj
|
||||
cmV3X3Rhc2tzEoQECoEEW3sia2V5IjogIjZhZmM0YjM5NjI1OWZiYjc2ODFmNTZjNzc1NWNjOTM3
|
||||
IiwgImlkIjogImIxZjQ5ODJiLTRjZGItNDk1MC04ZmNjLWMwZDcxNzRhYzY0NiIsICJhc3luY19l
|
||||
eGVjdXRpb24/IjogZmFsc2UsICJodW1hbl9pbnB1dD8iOiBmYWxzZSwgImFnZW50X3JvbGUiOiAi
|
||||
e3RvcGljfSBTZW5pb3IgRGF0YSBSZXNlYXJjaGVyXG4iLCAiYWdlbnRfa2V5IjogIjczYzM0OWM5
|
||||
M2MxNjNiNWQ0ZGY5OGE2NGZhYzFjNDMwIiwgInRvb2xzX25hbWVzIjogW119LCB7ImtleSI6ICJi
|
||||
MTdiMTg4ZGJmMTRmOTNhOThlNWI5NWFhZDM2NzU3NyIsICJpZCI6ICIyY2VkNGVhNC01YjcwLTRh
|
||||
MDctOTEyOS00MzQ2ZDQ1OWM4NjIiLCAiYXN5bmNfZXhlY3V0aW9uPyI6IGZhbHNlLCAiaHVtYW5f
|
||||
aW5wdXQ/IjogZmFsc2UsICJhZ2VudF9yb2xlIjogInt0b3BpY30gUmVwb3J0aW5nIEFuYWx5c3Rc
|
||||
biIsICJhZ2VudF9rZXkiOiAiYmIwNjgzNzdjMTY0MWJlNmQ3ZDk3YTUxNjU5ZGI2MTMiLCAidG9v
|
||||
bHNfbmFtZXMiOiBbXX1degIYAYUBAAEAABKOAgoQOaRyuH2UERJ3sHC1ImhOgxIIq8DZc4P2KYMq
|
||||
DFRhc2sgQ3JlYXRlZDABOTA127narwgYQVjV27narwgYSi4KCGNyZXdfa2V5EiIKIGYzNDZhOWFk
|
||||
NmQ3MzA2M2UwNjc3YjE3Y2U5YzUwMTc3SjEKB2NyZXdfaWQSJgokNmM4OTQ3MzctMzViYy00YWQx
|
||||
LWIxNjYtYWU2Nzg4YTEwOGFmSi4KCHRhc2tfa2V5EiIKIDZhZmM0YjM5NjI1OWZiYjc2ODFmNTZj
|
||||
Nzc1NWNjOTM3SjEKB3Rhc2tfaWQSJgokYjFmNDk4MmItNGNkYi00OTUwLThmY2MtYzBkNzE3NGFj
|
||||
NjQ2egIYAYUBAAEAAA==
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '1894'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.27.0
|
||||
method: POST
|
||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Sun, 17 Nov 2024 07:10:11 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are {topic} Senior Data
|
||||
Researcher\n. You''re a seasoned researcher with a knack for uncovering the
|
||||
latest developments in {topic}. Known for your ability to find the most relevant
|
||||
information and present it in a clear and concise manner.\n\nYour personal goal
|
||||
is: Uncover cutting-edge developments in {topic}\n\nTo give my best complete
|
||||
final answer to the task use the exact following format:\n\nThought: I now can
|
||||
give a great answer\nFinal Answer: Your final answer must be the great and the
|
||||
most complete as possible, it must be outcome described.\n\nI MUST use these
|
||||
formats, my job depends on it!"}, {"role": "user", "content": "\nCurrent Task:
|
||||
Conduct a thorough research about {topic} Make sure you find any interesting
|
||||
and relevant information given the current year is 2024.\n\n\nThis is the expect
|
||||
criteria for your final answer: A list with 10 bullet points of the most relevant
|
||||
information about {topic}\n\nyou MUST return the actual complete content as
|
||||
the final answer, not a summary.\n\nBegin! This is VERY important to you, use
|
||||
the tools available and give your best Final Answer, your job depends on it!\n\nThought:"}],
|
||||
"model": "gpt-4o", "stop": ["\nObservation:"], "stream": false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '1250'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=08pKRcLhS1PDw0mYfL2jz19ac6M.T31GoiMuI5DlX6w-1731827382-1.0.1.1-UfOLu3AaIUuXP1sGzdV6oggJ1q7iMTC46t08FDhYVrKcW5YmD4CbifudOJiSgx8h0JLTwZdgk.aG05S0eAO_PQ;
|
||||
_cfuvid=74kaPOoAcp8YRSA0XocQ1FFNksu9V0_KiWdQfo7wQuQ-1731827382509-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAA4RXTW8cRw69+1cQcwlgzAiS5a/oJnu1Wi3irON4N4fNwmBXc7q5rmZ1iuwejYL8
|
||||
9wWrejSjJMBeNFAXi83H9/jRvz4DWHG7uoJV6NHCMMbN9T8/8+vvb374+hAe9MPlntvbf5x/N739
|
||||
9GAPf1+t/UZq/kvBDrfOQhrGSMZJ6nHIhEbu9eLN5cXbF29eXpyXgyG1FP1aN9rmZdq8OH/xcnP+
|
||||
dnP+ernYJw6kqyv49zMAgF/LXw9RWrpfXUFxU54MpIodra4ejQBWOUV/skJVVkOx1fp4GJIYSYn6
|
||||
7psBDPUrtbBj6yGTEubQs3SAoCMF3nIASyOHaoGwTWFSSAKThDRTdtswmbF0G2o7gpZmimkcSEwB
|
||||
pYVMkWYUA5ZtygN6gmCbMjjsM7jlmQSsJ6D7kYLV87QFhJYMOVILkdX80cU5NFOMZDAmFtM13H0T
|
||||
I5DolAkswZjTzC0BgpORqSdRngk80plp5078VRGN1CNS7nrTM/iODAby+4EO6ejQ+grQrwgFz3Xe
|
||||
P8GhRuOm2W/89+xn+Vk+92nqeruCO5C0g4ACnUeA0LkcAEV3lN3yrywY4br8fwX+5OIMnj9/lwm/
|
||||
Wp/dDbDADxOKTQO8T8M4eZqfP7+Cd/uSvTVgO6MEqtlmgV8W63Cwhh5nAk+iJRhSJueDsYkEv0wN
|
||||
W4WsytKtYcT5AHeH+0LSmDEYB4yA4xg5FNjlVSHvR0tdxrHfr6GK/x50r0YDKA9TrLbrooI0Gg/8
|
||||
ULM25tREGrRk7IWjvr6DOzHqcjVggb8RRusDZnLA19lci4wRWIxi5I4kELiaWSZSh2cZRZ0b6B8v
|
||||
ryuX13eAsUuZrR+0MFNygSFMGY3iHlrGTpLnAVpWQiVdw5ip5VDyOKIxiUGaLKSBFlgjZU2CkR9K
|
||||
4pxi5wLGiKJgPQrQTBka2qZMBe+l4/1EQrvCwo1Q7vZwJ5LmmjDH+7kn2BPWIoEeFZRIQLkTz0Mp
|
||||
p8HVfqQ+P7qk6tIo9JJi6pj0DL6ne9t0frQoN0XMMKJQrHW6Y2nBptywkAJmqimireedxNYQCdsC
|
||||
M8GOW9IxE7aArXObZCn2dgrUVpE5QaXWVTnCdqJYGX/pGXh1Czf3I4oerr6+hb8ce8chCV1MDUbI
|
||||
KcY0lSbw6vaIbA+ZMPSkIJ6sYwdeL9HUhuaaSf4CT2RIw+Daac/gc8962rBcmANrVROL5eRoYJIx
|
||||
U6CWxKiFFg0XqVH2LkltTWBIIhSMZ7Z9wfmqKLtWaKHoHadj6I7wloQ21HJR2Clfa9Ap9IAK7z/d
|
||||
/fjx07rW8VLupZgR1LAj2PWUCZxZ4+DaTbmlrKX1NIW/EpRr3OpEWoOOmL/6O8n6UtwtNd4Sl6x1
|
||||
Xrsp7x2ScrtIppL32kH9OGIg5y+mRU4fOJJaEnqUrxYblnZSy3vA0DPNhakdDI/WtTxL+1fDJrL2
|
||||
hYm09doaUPyfOAlmaLwmS4RuvuWsBgOKUAsDq5a+ZAk+YNY1tLnMlWYPTfJe7hNAiudStplnNAKl
|
||||
YMl1EyM2C5SC8k1BOakhS6mp6y5zmKL5oPlYe2KFeueVp2MSLRMoRB7cc+hRuvITI0lXGT26wxN3
|
||||
48HdGlhCnEqNzZRr291iHspALmFT4Fowx/vrUqsdstQW5M6SLIUT92toUtIisG1K7sNF/bRgixJk
|
||||
5pxKirzJDiMGK6l466l4v28oK4Ups+3hRvrHqeM5+Mk5ZPGFZ2mgHbsX611wuq5dTIkUcmomtT/M
|
||||
rfDE/9NKOCbl+m6z8Fo9+4ZA4VjvVONaapSkjChHOpD1qT30Qkf1raP67FU8prwsHcce7KA+Tk30
|
||||
7eepTXlNpGCZA8zUc4h02o8OA8m1QHOKMx0VfugojxvOZEnSkCaF8eRlbMsUrWL/fZ+/+Rc0aEZ5
|
||||
DzFJR95uav/pMXeeJpZtRrU8BddHxXtx7oA/VHbrbIXrHfrQ0MLh9aHVLjx6uLCooY5TwIN9mQRL
|
||||
k6zGLWwnaavI8rHvHoqVT2Z72p7ogzKOvFQ1jmMZmfa4xh1mtO8nXjVPw9FpdF6WtYuUni6HoNMw
|
||||
YOYHejI20eMtQZz0/ZJnMcp1Kywxd1VoZfllWfatnrs++s542JPGnLZpqjRhqAzNmNlJ3TLFVs9O
|
||||
t+9M20nRl3+ZYlye//a4zsfU+W6ky/nj8y0La//FQ0/iq7taGlfl9LdnAP8pnw3Tky+Blc+y0b5Y
|
||||
+kriDl9cvqz+VscxeTx9df5qObVkGI8Hby6/Xf+Jwy91OdeTL49V8FHcHq8eP1NwajmdHDw7gf3H
|
||||
cP7Md4XO0v1/9/8DAAD//0KWSE5OLShJTYmHNeSQvYxQVpQK6sjhUgYPZrCDlSB5Mz4tMy89taig
|
||||
KBPSl0oriDdPNbIwNk41TbNU4qrlAgAAAP//AwCpko/aVA4AAA==
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8e3de645a8666217-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Sun, 17 Nov 2024 07:10:16 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '5537'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '30000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '29999711'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_220e7945d04e84ab7b58c252c98630b5
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: !!binary |
|
||||
Cs4CCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSpQIKEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRKOAgoQrqG+rs9H9Iyyqr2ZU1qS4RIIWspPh5zdoVMqDFRhc2sgQ3JlYXRlZDABOeiB
|
||||
tw7crwgYQZgvuQ7crwgYSi4KCGNyZXdfa2V5EiIKIGYzNDZhOWFkNmQ3MzA2M2UwNjc3YjE3Y2U5
|
||||
YzUwMTc3SjEKB2NyZXdfaWQSJgokNmM4OTQ3MzctMzViYy00YWQxLWIxNjYtYWU2Nzg4YTEwOGFm
|
||||
Si4KCHRhc2tfa2V5EiIKIGIxN2IxODhkYmYxNGY5M2E5OGU1Yjk1YWFkMzY3NTc3SjEKB3Rhc2tf
|
||||
aWQSJgokMmNlZDRlYTQtNWI3MC00YTA3LTkxMjktNDM0NmQ0NTljODYyegIYAYUBAAEAAA==
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '337'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.27.0
|
||||
method: POST
|
||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Sun, 17 Nov 2024 07:10:22 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are {topic} Reporting
|
||||
Analyst\n. You''re a meticulous analyst with a keen eye for detail. You''re
|
||||
known for your ability to turn complex data into clear and concise reports,
|
||||
making it easy for others to understand and act on the information you provide.\nYour
|
||||
personal goal is: Create detailed reports based on {topic} data analysis and
|
||||
research findings\n\nTo give my best complete final answer to the task use the
|
||||
exact following format:\n\nThought: I now can give a great answer\nFinal Answer:
|
||||
Your final answer must be the great and the most complete as possible, it must
|
||||
be outcome described.\n\nI MUST use these formats, my job depends on it!"},
|
||||
{"role": "user", "content": "\nCurrent Task: Review the context you got and
|
||||
expand each topic into a full section for a report. Make sure the report is
|
||||
detailed and contains any and all relevant information.\n\n\nThis is the expect
|
||||
criteria for your final answer: A fully fledge reports with the mains topics,
|
||||
each with a full section of information. Formatted as markdown without ''```''\n\nyou
|
||||
MUST return the actual complete content as the final answer, not a summary.\n\nThis
|
||||
is the context you''re working with:\n1. **Breakthrough in Quantum Computing**:
|
||||
By 2024, advancements in quantum computing have led to more reliable qubit processing,
|
||||
paving the way for practical applications in cryptography, complex system simulations,
|
||||
and optimization problems.\n\n2. **AI Integration in Healthcare**: Artificial
|
||||
intelligence continues to transform healthcare, with AI algorithms now more
|
||||
accurately diagnosing diseases, predicting patient outcomes, and personalizing
|
||||
treatment plans than ever before.\n\n3. **Renewable Energy Innovations**: The
|
||||
year 2024 has seen significant improvements in renewable energy technologies.
|
||||
Next-generation solar panels and wind turbines are more efficient, leading to
|
||||
widespread adoption and reduced reliance on fossil fuels.\n\n4. **5G Expansion
|
||||
and 6G Development**: The global rollout of 5G technology reaches near completion,
|
||||
and research into 6G has commenced. This development promises to introduce unprecedented
|
||||
data transfer speeds and connectivity.\n\n5. **Advances in Biotechnology**:
|
||||
Gene-editing technologies, such as CRISPR, have advanced to a stage where genetic
|
||||
disorders can be effectively treated, sparking ethical debates and regulatory
|
||||
considerations.\n\n6. **Space Exploration Milestones**: The space industry achieves
|
||||
new milestones with the establishment of permanent lunar bases and the first
|
||||
manned missions to Mars, driven by both government and private sector collaboration.\n\n7.
|
||||
**Sustainable Agriculture Practices**: In response to climate change challenges,
|
||||
sustainable agriculture practices, including vertical farming and precision
|
||||
agriculture, are gaining traction globally, boosting food production and reducing
|
||||
environmental impact.\n\n8. **Cybersecurity Enhancements**: With increasing
|
||||
digital threats, 2024 sees robust advancements in cybersecurity technologies,
|
||||
including AI-driven threat detection, and enhanced data encryption methodologies.\n\n9.
|
||||
**Transportation Innovation**: Public transportation and electric vehicle technology
|
||||
continue to evolve with the introduction of autonomous public transit systems
|
||||
and improvements in EV battery longevity and charging infrastructures.\n\n10.
|
||||
**Mental Health Awareness**: A global increase in mental health awareness leads
|
||||
to increased funding for research and the integration of digital therapies and
|
||||
apps that provide more accessible mental health support.\n\nThese bullet points
|
||||
summarize significant areas of development and interest in the given topic in
|
||||
2024, highlighting the profound impacts in various fields.\n\nBegin! This is
|
||||
VERY important to you, use the tools available and give your best Final Answer,
|
||||
your job depends on it!\n\nThought:"}], "model": "gpt-4o", "stop": ["\nObservation:"],
|
||||
"stream": false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '3935'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=08pKRcLhS1PDw0mYfL2jz19ac6M.T31GoiMuI5DlX6w-1731827382-1.0.1.1-UfOLu3AaIUuXP1sGzdV6oggJ1q7iMTC46t08FDhYVrKcW5YmD4CbifudOJiSgx8h0JLTwZdgk.aG05S0eAO_PQ;
|
||||
_cfuvid=74kaPOoAcp8YRSA0XocQ1FFNksu9V0_KiWdQfo7wQuQ-1731827382509-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAA3RXS28cxxG++1cU6OuSkBRJVnijFFumBQQKRcOH6FLTXTtTUU93q6pmlyNf8jfy
|
||||
9/JLguqZfQnIhVhuv+r1PfbPHwCuOF7dwlUY0MJY0/Xd74/87tPHr/KwHV59CMF+e1nf/P7ht+Hx
|
||||
17/a1cZPlO5fFOxw6iaUsSYyLnlZDkJo5Lc+/+kvz9+8+Onl89dtYSyRkh/rq12/LNcvnr14ef3s
|
||||
zfWz1+vBoXAgvbqFf/4AAPBn++sh5khPV7fwbHP4ZiRV7Onq9rgJ4EpK8m+uUJXVMC/hrouhZKPc
|
||||
on4cytQPdgv3kMseAmboeUeA0HvogFn3JACf8y+cMcFd+//2c/6cf4R3ZaxCA2X1Iw9UixiUDB9o
|
||||
hkcKQy6p9BwwAeYInwJTNt5ygL/RjlKpI2VT4AyefbvyR3grhF9sEA/Ll/4xYbZpbG9NxrkH3/h2
|
||||
bmc2oNxnvxKzAcYd5kDHW7+uR8Px6IA7go4oA4aBaUdxAxXFOEwJJc1+ygYCFEIoW/g6dWxQpQRS
|
||||
9QuEEmPHiW2+gceBFbrzeAdUqLij2G7Z4wzbIu1zFQy2lKLWxAF9RpYnzoMkUcAgRRWUdiSYIJYR
|
||||
OesN3GcIMlcrvWAd5o3fq3SZNWWdhIDy4N9FoNyO+Fsj2VCigg1oEMqUItTiY8CY0gxCu5Im38nf
|
||||
CCIaglKYhG2GboYRv3j+bIBpLGrAYyXZcZkUrMCAoa2jGY3V9ObYt1PxMWkBegqUWneUxylhW1lQ
|
||||
8wQ6q9GoG9gPHIY1SBPMui0yAuc4qQmTgk5hAC/2gDJioKnVVtucjWgk7P+pT1wgj/9sTlqycQr+
|
||||
srfGeKR2MLS88q4k7yBnoKdKwktlfUM8je0N/DKJDSRjEdpAqcYjf1u6WqV0icZjJx0Eahx0A1vO
|
||||
3phNu66jueTYcNdRpi0bbKWM67wcZ64WR+DZqBzLpCaEY+Ls20olae8vsfJYpex8gbZbboXwwqGB
|
||||
BkykUIVa+9IMU0Yz5IxdopsVh3f3cJ+N+uVOr8avhMmGgEINgnfSsMyYgLNRSty3YnN7xHMIRcjj
|
||||
xlMPj5cNp8tQfdMC5z3b4E9j6ouwDaO28ixo9WSmXIUCRcpGETCESTAccRsZ+1yU242RlVBJHagO
|
||||
lLVsoJNUVIVhGjFDwLoA2qvTzYAZ0/zNn9qhWgOCkil8nTh8SfM6B0ahTW71gZfsA71j5S6Ro8FD
|
||||
WW6nmTaQCGMbtgLUWKZNqRfHc2E9xk26Tj4rBOGFL5xBQsmRl94m/kJO04FkGVqUyGWH2ijsLOmP
|
||||
QpGDOTPf3UPTG3Vig5pwbniEyrtimMDFwku4LUIB9ZCZ0zWUyUIZSTcQZGrd9oCqkOPcLz/rpJqg
|
||||
Ud/mLEcQ0jJJIMCUysJ4N3AXl0ycdDZwd//ff/9HYaVUr1AlUV91FvLxNkebx5z1QGt6DK692s1g
|
||||
yKlIY/izaJyffHeHShHa3EXecZwwQU+ZjMMGEm9JbU4rJinvWEr2RzFBE8onc6qdvMcu5w2Tp8ga
|
||||
unwE/fQhLkVj3Trjl3wA1ANl2jvC4OdM0s9wn3PZrYj1PY4AGFG+KOAZHaPMMBNKq7scL6HlEjvK
|
||||
rPf87/Rk157ailotPhMVs/e+Kz7QJ2k40MIM3jVvcMk7ktb9Np065cT94Jzoo5somHDwPnmu35dk
|
||||
eWsNC2NZVGdfJMU9R2r6VVEwJUqbg2o1JdizXzdJx5lOCc2LWgvplGzh43XPqmHe4xbnIRNbAVFb
|
||||
gcoWDpU4kigapPahPamVKB7pgc/aEUrybHlHaW5DINxN1rCNF5ajCcmB0/pUOkeTewRnwuKIUuUE
|
||||
24mS3sAfHEmrEMZTgcp21fHzTjoBnEqbZpeDSilR9GnPxV1WOm++E8Xa2UWuuskWwe2l7JsKXAz2
|
||||
iDl604+6d+0aTMaecybVw9S+eg8/P1XM6rH65tfvz91bG9zHgZYJ3bP52dYggkwo16shpmNxSkpl
|
||||
Mk/71fuzZm+AXH480i2qkTRRkUy2tulEmosHS64vObcu+Uj6u83LNJw43UmGyD3bpefSjVc2TY2R
|
||||
78tju1dHFIM22py3gmoyBZvEMfVASihOyo6C15dBY/JuuoJGkj3OGxi4uQX7TqmaoVpkkOQ8pYsk
|
||||
zsXoBv4YKMN2WtxZY8S48QBYmzMJfq8V0Kk24007ykuFVnDF7/L+f3ZvtWDjlA++9GIWPcrtYnZO
|
||||
TfIGd5gd2jZce6+WHwGRfLSa/m4nr+AFrpp48TiStM07FpsaYrDR/+K3qPGCvzolE7weuB+uI205
|
||||
N+GA9dfL6n4490fLcsYpb7mcMYlvuM+rx+gulkb84n7yDNNuMaPzIWmlsJTL0U2ZrsnFy9M/L1DL
|
||||
6t3D/aePDwcyuTDlLn/7o9aPJbaXloqsLGUcjr5144R2ZJ+mNP7kYVtkLRL9h8KZg4tEI0WYctve
|
||||
fBx8co98YW9KpQxThUx7b6BgbcYZqrPUyQQdqEONBciGxi6RukYYobj1aI5zsThrXEvHOi6Ue85E
|
||||
rsbfq/1CdOf1b1dfxJip+V5DoxanUO8/E4rMsBUcaV9cIJ2IYxRSPU21W950Kmxe41scwQa0BCYn
|
||||
g1zEzfMabuT/AQAA//+MWbFu5DYQ7f0VhKsEWC9ygH1BSuPgIoWBQxJcFywoaSQxpkiFpORscf8e
|
||||
vBlS4q7tIOUuJZKaGb5573FFNaZzqaLfZ90SQM/63EWfjaWYPBoPnpEHqHogkG5H3vurGgktc1O1
|
||||
B+7ogts4MxSRHRNHRk/fg+9M2uEHaArvG8/ZxekAqtpryAXmMuBsoK1NZIwrZ1Iar7DbelulFrta
|
||||
bDPhzk2VufnUIEGL9Dzd+KIhGkqvEMqDXylw21CRP1wPRUqA8ASzIlOR2uQBDylj17MPhDcPQoZN
|
||||
TD5wjTtHnZpMjLxK8upZh6gCzYEit3A1ebfkNmVJz6wWOJPxOvAHxRqHody4VduF+0LcjQbjoqQD
|
||||
m828dwh+cR3qKIeRYSovQSlZktxIIn6jTn212lE6qlIhSyxqST0OAeYBT/FVNH4uk8fJdDFlMZel
|
||||
XDuC/riB2UlrDY48/nQDHbhcVCREtnRLmuZRs5xxKlar6mrVuay6Exm0N1LM5VhB6DBtoGX+XgqB
|
||||
mvQ/ZmJ4W5Jhwp0zXHF3HJSoXslaltuMYUwFqh3wZAuKF8rN5WbHQoqBIfmij0m1wc/qbMh2rHUs
|
||||
qck4DL3lKL33aQ4GWpuhbPvQnQBlu6P3vrtwKxrvs4oJvhC0DdZGsjMwmVaD8OsOAEBXi5tp1i3q
|
||||
JgI34OGJNk1BF0Tb4pqdlQIgX84NJsy7eapZGddF3GhJGoHWjKnJuIX5JbH98B/GFtNe3bFgay+W
|
||||
umhK3cJ6CEV1vJAaxqnHX++6AKKXd1AErXfS4Ik78h0bI4Hi7B1zOl9hLa98B9ugfYmHj+yVdbFg
|
||||
4MUwe/rQliq73jAN35RtKD3A/0qKJgqDcA8OG5/+IsQj+AfrUeN2o2Gz0oBOMQu8zGBE4uR1SkJA
|
||||
0ES2FXZW0SKOnS6MDRUBcVPS/gfeBRGTpfeQC0e+HK0ISEl+bmqcaak1BoSGRKtb7Tq0kl0LNj6N
|
||||
al4aa1rZuEmbx8FfmgWbWmk0LSyfH56+xR8ZJkBm96OBPrAk7/yERvDBlOSC4S63hEaDaGbN3pyV
|
||||
73vu9zs15/U3WZaCXskqUTybvcdVApGv8SDiMFBMRWeITqPSKY7qi3ftEgKhwg7q6VsdQzG7YJ8g
|
||||
buay2hv2aM7KYoG1yNd21FJNV4T/UIiFwIcwLGXNZFLlrmU79G2ItdTVuoehxu2s+YopPGmIpVJA
|
||||
z4I8YrOpx1cdWIkJaADzwH3SeK4beqEZ0ASYsagsw31AZTATCqR0mfLCkso6kxjzRen0i+NRkfmo
|
||||
xkq36jlt5vUOZCCThRRcrTrPW2tK3tuYm3YW77plvleiVdk/1xNlnXORoDovUNIVOtdOkJTwUT3G
|
||||
qzkbHYJh052FJQqSrwTExiE36IFynCVK8HA99Kmcyrwn4E7VP0SCykJonXf5EMOWvfBUAanBTLB4
|
||||
8oZ4vzvRPWZ9HUm90PmKygm4rTpwifbSVEtBsCKNbMFe+q8rwTWX+qilbDbLhRgLWYacxR0EFwPI
|
||||
7aVQyMdIGE3mFK/GWhVHPRNXjpCrY335FKhfosbdl1uszf9/326zrB/gn8c8vv0P+RfHE5LgHW6u
|
||||
YvLzLY9+v1HqT741Wy4uwm7n4Kc5nZJ/IYcJP/9yL/Pd7vd0++innz5/ysMJtmg18vPDw+GdKU8d
|
||||
wXmM1dXbbQsp0O3v7vd0eumMrwZuqg9/u6H35paPN274P9PvA21Lc6LuNGdHuP7o/bFAf3FXe/+x
|
||||
LdC84Vs5UafeuIECkzSkpJ9P9w9t/3Dfkabbm+83/wIAAP//AwCS6QzxVR0AAA==
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8e3de6697f976217-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Sun, 17 Nov 2024 07:10:27 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '10658'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '30000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '29999045'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 1ms
|
||||
x-request-id:
|
||||
- req_f0af67637da5bc0e6b11fc3e5db59f62
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
version: 1
|
||||
500
tests/cassettes/test_before_kickoff_modification.yaml
Normal file
@@ -0,0 +1,500 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: !!binary |
|
||||
CusOCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSwg4KEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRKaDAoQFHOMv8VK3fCTALziX07PIRIIN6Cmi+pyjGkqDENyZXcgQ3JlYXRlZDABORgw
|
||||
kr/lrgkYQWDinL/lrgkYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODAuMEoaCg5weXRob25fdmVy
|
||||
c2lvbhIICgYzLjEyLjdKLgoIY3Jld19rZXkSIgogMWYxMjhiZGI3YmFhNGI2NzcxNGYxZGFlZGMy
|
||||
ZjNhYjZKMQoHY3Jld19pZBImCiQ5MWYxYTY2OC05Y2MwLTQxODctYWZmOS03NzJkNzZlMzg3NDlK
|
||||
HAoMY3Jld19wcm9jZXNzEgwKCnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdf
|
||||
bnVtYmVyX29mX3Rhc2tzEgIYAkobChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgCSrQFCgtjcmV3
|
||||
X2FnZW50cxKkBQqhBVt7ImtleSI6ICI3M2MzNDljOTNjMTYzYjVkNGRmOThhNjRmYWMxYzQzMCIs
|
||||
ICJpZCI6ICIxNDFhOGY2NS0zODRjLTQxMDMtODgwZS02ODMzNTQ0NmVkN2YiLCAicm9sZSI6ICJ7
|
||||
dG9waWN9IFNlbmlvciBEYXRhIFJlc2VhcmNoZXJcbiIsICJ2ZXJib3NlPyI6IHRydWUsICJtYXhf
|
||||
aXRlciI6IDIwLCAibWF4X3JwbSI6IG51bGwsICJmdW5jdGlvbl9jYWxsaW5nX2xsbSI6ICIiLCAi
|
||||
bGxtIjogImdwdC00by1taW5pIiwgImRlbGVnYXRpb25fZW5hYmxlZD8iOiBmYWxzZSwgImFsbG93
|
||||
X2NvZGVfZXhlY3V0aW9uPyI6IGZhbHNlLCAibWF4X3JldHJ5X2xpbWl0IjogMiwgInRvb2xzX25h
|
||||
bWVzIjogW119LCB7ImtleSI6ICIxMDRmZTA2NTllMTBiNDI2Y2Y4OGYwMjRmYjU3MTU1MyIsICJp
|
||||
ZCI6ICI5YWFkMWUxMi00MTgxLTQ5NTctYmNlNS01ZWNhODg2YjMxYWYiLCAicm9sZSI6ICJ7dG9w
|
||||
aWN9IFJlcG9ydGluZyBBbmFseXN0XG4iLCAidmVyYm9zZT8iOiB0cnVlLCAibWF4X2l0ZXIiOiAy
|
||||
MCwgIm1heF9ycG0iOiBudWxsLCAiZnVuY3Rpb25fY2FsbGluZ19sbG0iOiAiIiwgImxsbSI6ICJn
|
||||
cHQtNG8tbWluaSIsICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogZmFsc2UsICJhbGxvd19jb2RlX2V4
|
||||
ZWN1dGlvbj8iOiBmYWxzZSwgIm1heF9yZXRyeV9saW1pdCI6IDIsICJ0b29sc19uYW1lcyI6IFtd
|
||||
fV1KkwQKCmNyZXdfdGFza3MShAQKgQRbeyJrZXkiOiAiNmFmYzRiMzk2MjU5ZmJiNzY4MWY1NmM3
|
||||
NzU1Y2M5MzciLCAiaWQiOiAiNTI5YmU1NTMtM2Y3Mi00YTU2LWFhNWItYWE0ZTZmMzhlOWJhIiwg
|
||||
ImFzeW5jX2V4ZWN1dGlvbj8iOiBmYWxzZSwgImh1bWFuX2lucHV0PyI6IGZhbHNlLCAiYWdlbnRf
|
||||
cm9sZSI6ICJ7dG9waWN9IFNlbmlvciBEYXRhIFJlc2VhcmNoZXJcbiIsICJhZ2VudF9rZXkiOiAi
|
||||
NzNjMzQ5YzkzYzE2M2I1ZDRkZjk4YTY0ZmFjMWM0MzAiLCAidG9vbHNfbmFtZXMiOiBbXX0sIHsi
|
||||
a2V5IjogImIxN2IxODhkYmYxNGY5M2E5OGU1Yjk1YWFkMzY3NTc3IiwgImlkIjogImI2NzQyNmI0
|
||||
LTM2NTAtNDY5MS1iYTU4LWYwZTRmOWM0NTk3YyIsICJhc3luY19leGVjdXRpb24/IjogZmFsc2Us
|
||||
ICJodW1hbl9pbnB1dD8iOiBmYWxzZSwgImFnZW50X3JvbGUiOiAie3RvcGljfSBSZXBvcnRpbmcg
|
||||
QW5hbHlzdFxuIiwgImFnZW50X2tleSI6ICIxMDRmZTA2NTllMTBiNDI2Y2Y4OGYwMjRmYjU3MTU1
|
||||
MyIsICJ0b29sc19uYW1lcyI6IFtdfV16AhgBhQEAAQAAEo4CChBM7T06NWnnx9b1Sl8dbVH+Eghz
|
||||
9rR/8DUNEioMVGFzayBDcmVhdGVkMAE5yJqzv+WuCRhBqEa0v+WuCRhKLgoIY3Jld19rZXkSIgog
|
||||
MWYxMjhiZGI3YmFhNGI2NzcxNGYxZGFlZGMyZjNhYjZKMQoHY3Jld19pZBImCiQ5MWYxYTY2OC05
|
||||
Y2MwLTQxODctYWZmOS03NzJkNzZlMzg3NDlKLgoIdGFza19rZXkSIgogNmFmYzRiMzk2MjU5ZmJi
|
||||
NzY4MWY1NmM3NzU1Y2M5MzdKMQoHdGFza19pZBImCiQ1MjliZTU1My0zZjcyLTRhNTYtYWE1Yi1h
|
||||
YTRlNmYzOGU5YmF6AhgBhQEAAQAA
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '1902'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.27.0
|
||||
method: POST
|
||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Wed, 20 Nov 2024 13:03:59 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are Bicycles Senior Data
|
||||
Researcher\n. You''re a seasoned researcher with a knack for uncovering the
|
||||
latest developments in Bicycles. Known for your ability to find the most relevant
|
||||
information and present it in a clear and concise manner.\n\nYour personal goal
|
||||
is: Uncover cutting-edge developments in Bicycles\n\nTo give my best complete
|
||||
final answer to the task use the exact following format:\n\nThought: I now can
|
||||
give a great answer\nFinal Answer: Your final answer must be the great and the
|
||||
most complete as possible, it must be outcome described.\n\nI MUST use these
|
||||
formats, my job depends on it!"}, {"role": "user", "content": "\nCurrent Task:
|
||||
Conduct a thorough research about Bicycles Make sure you find any interesting
|
||||
and relevant information given the current year is 2024.\n\n\nThis is the expect
|
||||
criteria for your final answer: A list with 10 bullet points of the most relevant
|
||||
information about Bicycles\n\nyou MUST return the actual complete content as
|
||||
the final answer, not a summary.\n\nBegin! This is VERY important to you, use
|
||||
the tools available and give your best Final Answer, your job depends on it!\n\nThought:"}],
|
||||
"model": "gpt-4o-mini", "stop": ["\nObservation:"], "stream": false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '1260'
|
||||
content-type:
|
||||
- application/json
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- x64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
content: "{\n \"id\": \"chatcmpl-AVef48hbtmEEfHJzc9KI6SOG72L6j\",\n \"object\":
|
||||
\"chat.completion\",\n \"created\": 1732107834,\n \"model\": \"gpt-4o-mini-2024-07-18\",\n
|
||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
||||
\"assistant\",\n \"content\": \"I now can give a great answer. \\nFinal
|
||||
Answer: \\n\\n1. **E-Bike Market Growth**: The e-bike market has experienced
|
||||
unprecedented growth, with sales increasing by over 45% in 2023 compared to
|
||||
the previous year, driven by rising fuel prices and increased urbanization.
|
||||
For 2024, predictions suggest this trend will continue as more consumers seek
|
||||
sustainable transportation options.\\n\\n2. **Smart Technology Integration**:
|
||||
Bicycle manufacturers are increasingly integrating smart technology into their
|
||||
models. Features like GPS navigation, smartphone connectivity, anti-theft alarms,
|
||||
and fitness tracking are becoming standard, enhancing the cycling experience
|
||||
while providing riders with valuable data.\\n\\n3. **Sustainable Materials**:
|
||||
Many companies are now focusing on using sustainable and eco-friendly materials
|
||||
for bicycle production, with significant advancements in recycled aluminum and
|
||||
carbon fiber technologies. This approach not only reduces environmental impact
|
||||
but also appeals to eco-conscious consumers.\\n\\n4. **Urban Infrastructure
|
||||
Improvements**: Cities worldwide are investing heavily in improving cycling
|
||||
infrastructure, including the addition of dedicated bike lanes, bike-sharing
|
||||
programs, and parking facilities, aiming to promote cycling as a primary mode
|
||||
of transport and improve safety for cyclists.\\n\\n5. **Global Cycling Tourism
|
||||
Increase**: Cycling tourism has seen a surge in popularity, with destinations
|
||||
specifically catering to cyclists emerging across Europe, North America, and
|
||||
Asia. This trend encourages eco-friendly travel options and boosts local economies,
|
||||
offering curated cycling paths and accommodations.\\n\\n6. **Bike Repair & Maintenance
|
||||
Innovations**: Innovative solutions like mobile bike repair services and self-service
|
||||
bike repair stations are becoming more common, addressing the maintenance needs
|
||||
of cyclists and reducing barriers to cycling.\\n\\n7. **Safety Innovations**:
|
||||
The development of safety features such as automatic lights that respond to
|
||||
ambient light, integrated turn signals in helmets, and advanced brake systems
|
||||
have become essential selling points for new bikes, increasing rider visibility
|
||||
and safety.\\n\\n8. **Performance Enhancements**: Advances in bike design and
|
||||
materials, such as lightweight titanium and carbon fiber frames, have enhanced
|
||||
performance for competitive cyclists. Additionally, innovations in gear shifting
|
||||
and suspension systems are improving efficiency and comfort.\\n\\n9. **Inclusivity
|
||||
in Cycling**: An increasing number of brands are focusing on inclusivity, producing
|
||||
step-through frames and bikes tailored for various body types and abilities,
|
||||
thus promoting cycling for people of all ages and physical conditions.\\n\\n10.
|
||||
**Data Analytics for Cycling Trends**: The use of data analytics to study cycling
|
||||
patterns has increased, helping cities and businesses understand cycling behaviors
|
||||
and improve services. Insights gathered are being used to optimize bike-sharing
|
||||
programs and enhance cycling infrastructure strategically.\\n\\nThis comprehensive
|
||||
understanding highlights the diverse and exciting developments in the bicycle
|
||||
industry, reflective of the shifting trends and technological advancements as
|
||||
we move through 2024.\",\n \"refusal\": null\n },\n \"logprobs\":
|
||||
null,\n \"finish_reason\": \"stop\"\n }\n ],\n \"usage\": {\n \"prompt_tokens\":
|
||||
237,\n \"completion_tokens\": 540,\n \"total_tokens\": 777,\n \"prompt_tokens_details\":
|
||||
{\n \"cached_tokens\": 0,\n \"audio_tokens\": 0\n },\n \"completion_tokens_details\":
|
||||
{\n \"reasoning_tokens\": 0,\n \"audio_tokens\": 0,\n \"accepted_prediction_tokens\":
|
||||
0,\n \"rejected_prediction_tokens\": 0\n }\n },\n \"system_fingerprint\":
|
||||
\"fp_0705bf87c0\"\n}\n"
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8e58a48a783d6225-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Wed, 20 Nov 2024 13:04:02 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Set-Cookie:
|
||||
- __cf_bm=CkK4UvBd9ukXvn50uJwGambJcz5zERAJfeXJ9xge6H4-1732107842-1.0.1.1-IOK2yVL3RlD75MgmnKzIEyE38HNknwn6I8BBJ1wjGz4jCTd0YWIBPnvUm9gB8D_zLlUA9G7p_wbrfyc4mO_Bmg;
|
||||
path=/; expires=Wed, 20-Nov-24 13:34:02 GMT; domain=.api.openai.com; HttpOnly;
|
||||
Secure; SameSite=None
|
||||
- _cfuvid=MmeN9oHWrBLThkEJdaSFHBfWe95JvA8iFnnt7CC92tk-1732107842102-0.0.1.1-604800000;
|
||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '7649'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-ratelimit-limit-requests:
|
||||
- '30000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '150000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '29999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '149999708'
|
||||
x-ratelimit-reset-requests:
|
||||
- 2ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_60a333db2dbe3378c077ae0b2af16f8e
|
||||
http_version: HTTP/1.1
|
||||
status_code: 200
|
||||
- request:
|
||||
body: !!binary |
|
||||
Cs4CCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSpQIKEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRKOAgoQU4pBe1pQxsUBVChkPK41ghII8dnGjmMshHkqDFRhc2sgQ3JlYXRlZDABOQiW
|
||||
uMjnrgkYQYjOucjnrgkYSi4KCGNyZXdfa2V5EiIKIDFmMTI4YmRiN2JhYTRiNjc3MTRmMWRhZWRj
|
||||
MmYzYWI2SjEKB2NyZXdfaWQSJgokOTFmMWE2NjgtOWNjMC00MTg3LWFmZjktNzcyZDc2ZTM4NzQ5
|
||||
Si4KCHRhc2tfa2V5EiIKIGIxN2IxODhkYmYxNGY5M2E5OGU1Yjk1YWFkMzY3NTc3SjEKB3Rhc2tf
|
||||
aWQSJgokYjY3NDI2YjQtMzY1MC00NjkxLWJhNTgtZjBlNGY5YzQ1OTdjegIYAYUBAAEAAA==
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '337'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.27.0
|
||||
method: POST
|
||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Wed, 20 Nov 2024 13:04:04 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are Bicycles Reporting
|
||||
Analyst\n. You''re a meticulous analyst with a keen eye for detail. You''re
|
||||
known for your ability to turn complex data into clear and concise reports,
|
||||
making it easy for others to understand and act on the information you provide.\n\nYour
|
||||
personal goal is: Create detailed reports based on Bicycles data analysis and
|
||||
research findings\n\nTo give my best complete final answer to the task use the
|
||||
exact following format:\n\nThought: I now can give a great answer\nFinal Answer:
|
||||
Your final answer must be the great and the most complete as possible, it must
|
||||
be outcome described.\n\nI MUST use these formats, my job depends on it!"},
|
||||
{"role": "user", "content": "\nCurrent Task: Review the context you got and
|
||||
expand each topic into a full section for a report. Make sure the report is
|
||||
detailed and contains any and all relevant information.\n\n\nThis is the expect
|
||||
criteria for your final answer: A fully fledge reports with the mains topics,
|
||||
each with a full section of information. Formatted as markdown without ''```''\n\nyou
|
||||
MUST return the actual complete content as the final answer, not a summary.\n\nThis
|
||||
is the context you''re working with:\n1. **E-Bike Market Growth**: The e-bike
|
||||
market has experienced unprecedented growth, with sales increasing by over 45%
|
||||
in 2023 compared to the previous year, driven by rising fuel prices and increased
|
||||
urbanization. For 2024, predictions suggest this trend will continue as more
|
||||
consumers seek sustainable transportation options.\n\n2. **Smart Technology
|
||||
Integration**: Bicycle manufacturers are increasingly integrating smart technology
|
||||
into their models. Features like GPS navigation, smartphone connectivity, anti-theft
|
||||
alarms, and fitness tracking are becoming standard, enhancing the cycling experience
|
||||
while providing riders with valuable data.\n\n3. **Sustainable Materials**:
|
||||
Many companies are now focusing on using sustainable and eco-friendly materials
|
||||
for bicycle production, with significant advancements in recycled aluminum and
|
||||
carbon fiber technologies. This approach not only reduces environmental impact
|
||||
but also appeals to eco-conscious consumers.\n\n4. **Urban Infrastructure Improvements**:
|
||||
Cities worldwide are investing heavily in improving cycling infrastructure,
|
||||
including the addition of dedicated bike lanes, bike-sharing programs, and parking
|
||||
facilities, aiming to promote cycling as a primary mode of transport and improve
|
||||
safety for cyclists.\n\n5. **Global Cycling Tourism Increase**: Cycling tourism
|
||||
has seen a surge in popularity, with destinations specifically catering to cyclists
|
||||
emerging across Europe, North America, and Asia. This trend encourages eco-friendly
|
||||
travel options and boosts local economies, offering curated cycling paths and
|
||||
accommodations.\n\n6. **Bike Repair & Maintenance Innovations**: Innovative
|
||||
solutions like mobile bike repair services and self-service bike repair stations
|
||||
are becoming more common, addressing the maintenance needs of cyclists and reducing
|
||||
barriers to cycling.\n\n7. **Safety Innovations**: The development of safety
|
||||
features such as automatic lights that respond to ambient light, integrated
|
||||
turn signals in helmets, and advanced brake systems have become essential selling
|
||||
points for new bikes, increasing rider visibility and safety.\n\n8. **Performance
|
||||
Enhancements**: Advances in bike design and materials, such as lightweight titanium
|
||||
and carbon fiber frames, have enhanced performance for competitive cyclists.
|
||||
Additionally, innovations in gear shifting and suspension systems are improving
|
||||
efficiency and comfort.\n\n9. **Inclusivity in Cycling**: An increasing number
|
||||
of brands are focusing on inclusivity, producing step-through frames and bikes
|
||||
tailored for various body types and abilities, thus promoting cycling for people
|
||||
of all ages and physical conditions.\n\n10. **Data Analytics for Cycling Trends**:
|
||||
The use of data analytics to study cycling patterns has increased, helping cities
|
||||
and businesses understand cycling behaviors and improve services. Insights gathered
|
||||
are being used to optimize bike-sharing programs and enhance cycling infrastructure
|
||||
strategically.\n\nThis comprehensive understanding highlights the diverse and
|
||||
exciting developments in the bicycle industry, reflective of the shifting trends
|
||||
and technological advancements as we move through 2024.\n\nBegin! This is VERY
|
||||
important to you, use the tools available and give your best Final Answer, your
|
||||
job depends on it!\n\nThought:"}], "model": "gpt-4o-mini", "stop": ["\nObservation:"],
|
||||
"stream": false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '4587'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=CkK4UvBd9ukXvn50uJwGambJcz5zERAJfeXJ9xge6H4-1732107842-1.0.1.1-IOK2yVL3RlD75MgmnKzIEyE38HNknwn6I8BBJ1wjGz4jCTd0YWIBPnvUm9gB8D_zLlUA9G7p_wbrfyc4mO_Bmg;
|
||||
_cfuvid=MmeN9oHWrBLThkEJdaSFHBfWe95JvA8iFnnt7CC92tk-1732107842102-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- x64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
content: "{\n \"id\": \"chatcmpl-AVefC4hfHvHYaSnPpfpnDBIn5IOgg\",\n \"object\":
|
||||
\"chat.completion\",\n \"created\": 1732107842,\n \"model\": \"gpt-4o-mini-2024-07-18\",\n
|
||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
||||
\"assistant\",\n \"content\": \"I now can give a great answer \\nFinal
|
||||
Answer: \\n\\n# Detailed Report on Current Bicycle Industry Trends\\n\\n## 1.
|
||||
E-Bike Market Growth\\nThe e-bike market has experienced unprecedented growth
|
||||
in 2023, with sales skyrocketing by over 45% compared to the previous year.
|
||||
The surge in popularity is largely attributed to rising fuel prices that compel
|
||||
consumers to explore alternative modes of transportation, as well as increased
|
||||
urbanization that pushes individuals towards more sustainable commuting options.
|
||||
In 2024, forecasts indicate that this robust growth is likely to persist as
|
||||
more consumers prioritize eco-friendly transportation solutions. Factors driving
|
||||
this momentum include government incentives for electric vehicle purchases,
|
||||
improved battery technology providing longer ranges, and the appeal of e-bikes
|
||||
as a viable solution for first-and-last-mile connectivity in urban environments.\\n\\n##
|
||||
2. Smart Technology Integration\\nThe bicycle manufacturing industry is witnessing
|
||||
an increasing trend towards integrating smart technology into their products.
|
||||
Modern bicycles now come equipped with features such as GPS navigation systems,
|
||||
Bluetooth connectivity, anti-theft alarms, and fitness tracking capabilities.
|
||||
These enhancements not only enrich the cycling experience by providing cyclists
|
||||
with valuable data\u2014such as speed, distance traveled, and route optimization\u2014but
|
||||
also position cycling as a technologically advanced means of transport. Such
|
||||
innovations cater particularly to tech-savvy consumers looking for a comprehensive
|
||||
solution that addresses both utility and convenience.\\n\\n## 3. Sustainable
|
||||
Materials\\nIn response to growing environmental concerns, many bicycle manufacturers
|
||||
are now focusing on the use of sustainable and eco-friendly materials in their
|
||||
production processes. Innovations in recycled aluminum production and advancements
|
||||
in carbon fiber manufacturing are leading the way to minimize the ecological
|
||||
footprint of bicycles. The shift to sustainable materials not only attracts
|
||||
eco-conscious consumers but also aligns with the broader movement towards sustainability
|
||||
within various industries. This commitment to responsible sourcing and production
|
||||
practices is intended to resonate with consumers increasingly prioritizing sustainability
|
||||
in their purchasing decisions.\\n\\n## 4. Urban Infrastructure Improvements\\nCities
|
||||
across the globe are investing significantly to enhance cycling infrastructure,
|
||||
which includes creating dedicated bike lanes, establishing bike-sharing programs,
|
||||
and increasing the availability of secure bike parking facilities. The aim of
|
||||
these investment strategies is to promote cycling as a primary mode of transportation,
|
||||
thereby alleviating traffic congestion and reducing urban air pollution. These
|
||||
improvements not only make cycling safer and more appealing but also encourage
|
||||
a cultural shift towards embracing cycling as a sustainable form of transport,
|
||||
contributing to healthier urban populations.\\n\\n## 5. Global Cycling Tourism
|
||||
Increase\\nCycling tourism has emerged as a rapidly growing sector, with numerous
|
||||
destinations catering specifically to the needs of cyclists. Regions in Europe,
|
||||
North America, and Asia have begun to promote curated cycling paths and accommodations
|
||||
that enhance the travel experience for biking enthusiasts. This trend encourages
|
||||
eco-friendly travel options and provides a substantial boost to local economies
|
||||
reliant on tourism. With more travelers seeking unique and sustainable adventure
|
||||
experiences, cycling tourism is cementing its place as a desirable and responsible
|
||||
leisure activity.\\n\\n## 6. Bike Repair & Maintenance Innovations\\nAs cycling
|
||||
becomes more popular, addressing the maintenance needs of bicycles is critical.
|
||||
The advent of innovative solutions such as mobile bike repair services and self-service
|
||||
repair stations is helping cyclists maintain their bikes more conveniently.
|
||||
These services remove barriers to cycling by providing quick access to repair
|
||||
assistance, thus ensuring that cyclists can get back on the road with minimal
|
||||
downtime. Additionally, the proliferation of these services reflects an increasingly
|
||||
proactive approach to bicycle maintenance within the industry.\\n\\n## 7. Safety
|
||||
Innovations\\nSafety remains a paramount concern for cyclists, prompting the
|
||||
development of several innovative features that enhance visibility and rider
|
||||
protection. New safety technologies include automatic lights that adjust to
|
||||
ambient lighting, integrated turn signals built into helmets, and advanced braking
|
||||
systems that improve stopping power. These innovations not only elevate the
|
||||
overall safety of new bicycles but also serve as essential selling points for
|
||||
manufacturers, helping to reassure potential buyers about the security of their
|
||||
cycling experiences.\\n\\n## 8. Performance Enhancements\\nContinual advancements
|
||||
in bike design and materials have significantly improved performance for competitive
|
||||
cyclists. The adoption of lightweight materials like titanium and carbon fiber
|
||||
frames enhances speed and maneuverability. Moreover, state-of-the-art gear shifting
|
||||
mechanisms and suspension systems are optimizing cycling efficiency and rider
|
||||
comfort. These innovations cater to both amateur and professional cyclists alike,
|
||||
emphasizing the drive for enhanced performance in the marketplace.\\n\\n## 9.
|
||||
Inclusivity in Cycling\\nThe bicycle industry is progressively recognizing the
|
||||
importance of inclusivity by producing more diverse models catering to a wide
|
||||
range of body types and abilities. This includes step-through frames designed
|
||||
for easier mounting and dismounting as well as specialized bikes accommodating
|
||||
unique ergonomic needs. By promoting cycling as an approachable and accessible
|
||||
activity for individuals of various ages and physical conditions, brands are
|
||||
broadening their market reach and fostering a more inclusive cycling community.\\n\\n##
|
||||
10. Data Analytics for Cycling Trends\\nThe utilization of data analytics in
|
||||
cycling is on the rise, as cities and businesses increasingly turn to data-driven
|
||||
insights to understand cyclist behaviors and optimize offerings. Analytics are
|
||||
being harnessed to fine-tune bike-sharing programs, enhancing user experience
|
||||
through informed decision-making. This strategic approach not only aids in the
|
||||
identification of high-demand cycling routes but also informs infrastructure
|
||||
investments, ensuring that cycling continues to become a more viable and attractive
|
||||
option for urban transport.\\n\\nIn summary, these trends reflect a dynamic
|
||||
and rapidly evolving bicycle industry characterized by technological advancements,
|
||||
sustainability efforts, and a commitment to inclusivity. As we advance through
|
||||
2024, these developments will shape the future of cycling, making it not just
|
||||
a mode of transport but a lifestyle choice that emphasizes health, environment,
|
||||
and community engagement.\",\n \"refusal\": null\n },\n \"logprobs\":
|
||||
null,\n \"finish_reason\": \"stop\"\n }\n ],\n \"usage\": {\n \"prompt_tokens\":
|
||||
791,\n \"completion_tokens\": 1102,\n \"total_tokens\": 1893,\n \"prompt_tokens_details\":
|
||||
{\n \"cached_tokens\": 0,\n \"audio_tokens\": 0\n },\n \"completion_tokens_details\":
|
||||
{\n \"reasoning_tokens\": 0,\n \"audio_tokens\": 0,\n \"accepted_prediction_tokens\":
|
||||
0,\n \"rejected_prediction_tokens\": 0\n }\n },\n \"system_fingerprint\":
|
||||
\"fp_0705bf87c0\"\n}\n"
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8e58a4be3c906225-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Wed, 20 Nov 2024 13:04:18 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '16287'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-ratelimit-limit-requests:
|
||||
- '30000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '150000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '29999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '149998883'
|
||||
x-ratelimit-reset-requests:
|
||||
- 2ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_bb43402829dc4dc60bf6f4b76a72e6c9
|
||||
http_version: HTTP/1.1
|
||||
status_code: 200
|
||||
version: 1
|
||||
492
tests/cassettes/test_before_kickoff_with_none_input.yaml
Normal file
@@ -0,0 +1,492 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: !!binary |
|
||||
CusOCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSwg4KEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRKaDAoQ4G43ZjKxBKDC/tbsjP4YXxIINS4tBd9tcREqDENyZXcgQ3JlYXRlZDABOQB0
|
||||
FQ7yrgkYQdg+GA7yrgkYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODAuMEoaCg5weXRob25fdmVy
|
||||
c2lvbhIICgYzLjEyLjdKLgoIY3Jld19rZXkSIgogMWYxMjhiZGI3YmFhNGI2NzcxNGYxZGFlZGMy
|
||||
ZjNhYjZKMQoHY3Jld19pZBImCiQzNTE4YjRjNS0xYTM5LTRkYjEtODEwMy03MzllNjQ5YzAwZDhK
|
||||
HAoMY3Jld19wcm9jZXNzEgwKCnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdf
|
||||
bnVtYmVyX29mX3Rhc2tzEgIYAkobChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgCSrQFCgtjcmV3
|
||||
X2FnZW50cxKkBQqhBVt7ImtleSI6ICI3M2MzNDljOTNjMTYzYjVkNGRmOThhNjRmYWMxYzQzMCIs
|
||||
ICJpZCI6ICIyZmFkNjUwMC0wYTk1LTRmMTMtYjk5YS0zMTE1YzRkOTM3ODgiLCAicm9sZSI6ICJ7
|
||||
dG9waWN9IFNlbmlvciBEYXRhIFJlc2VhcmNoZXJcbiIsICJ2ZXJib3NlPyI6IHRydWUsICJtYXhf
|
||||
aXRlciI6IDIwLCAibWF4X3JwbSI6IG51bGwsICJmdW5jdGlvbl9jYWxsaW5nX2xsbSI6ICIiLCAi
|
||||
bGxtIjogImdwdC00by1taW5pIiwgImRlbGVnYXRpb25fZW5hYmxlZD8iOiBmYWxzZSwgImFsbG93
|
||||
X2NvZGVfZXhlY3V0aW9uPyI6IGZhbHNlLCAibWF4X3JldHJ5X2xpbWl0IjogMiwgInRvb2xzX25h
|
||||
bWVzIjogW119LCB7ImtleSI6ICIxMDRmZTA2NTllMTBiNDI2Y2Y4OGYwMjRmYjU3MTU1MyIsICJp
|
||||
ZCI6ICIxYTQ0MjFiOC1lZWMzLTQ1ZjItODY1NS01NDcyMWIyOTk5NDciLCAicm9sZSI6ICJ7dG9w
|
||||
aWN9IFJlcG9ydGluZyBBbmFseXN0XG4iLCAidmVyYm9zZT8iOiB0cnVlLCAibWF4X2l0ZXIiOiAy
|
||||
MCwgIm1heF9ycG0iOiBudWxsLCAiZnVuY3Rpb25fY2FsbGluZ19sbG0iOiAiIiwgImxsbSI6ICJn
|
||||
cHQtNG8tbWluaSIsICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogZmFsc2UsICJhbGxvd19jb2RlX2V4
|
||||
ZWN1dGlvbj8iOiBmYWxzZSwgIm1heF9yZXRyeV9saW1pdCI6IDIsICJ0b29sc19uYW1lcyI6IFtd
|
||||
fV1KkwQKCmNyZXdfdGFza3MShAQKgQRbeyJrZXkiOiAiNmFmYzRiMzk2MjU5ZmJiNzY4MWY1NmM3
|
||||
NzU1Y2M5MzciLCAiaWQiOiAiMmY2ODFlY2YtNmY0Yy00NzlhLWE0ZWEtY2Y0ZTVmNGM2ZWFlIiwg
|
||||
ImFzeW5jX2V4ZWN1dGlvbj8iOiBmYWxzZSwgImh1bWFuX2lucHV0PyI6IGZhbHNlLCAiYWdlbnRf
|
||||
cm9sZSI6ICJ7dG9waWN9IFNlbmlvciBEYXRhIFJlc2VhcmNoZXJcbiIsICJhZ2VudF9rZXkiOiAi
|
||||
NzNjMzQ5YzkzYzE2M2I1ZDRkZjk4YTY0ZmFjMWM0MzAiLCAidG9vbHNfbmFtZXMiOiBbXX0sIHsi
|
||||
a2V5IjogImIxN2IxODhkYmYxNGY5M2E5OGU1Yjk1YWFkMzY3NTc3IiwgImlkIjogIjgwM2Q5YWYy
|
||||
LTdhYjAtNDYzNy1iMWJjLTkxNDJmMWJkMDM0YSIsICJhc3luY19leGVjdXRpb24/IjogZmFsc2Us
|
||||
ICJodW1hbl9pbnB1dD8iOiBmYWxzZSwgImFnZW50X3JvbGUiOiAie3RvcGljfSBSZXBvcnRpbmcg
|
||||
QW5hbHlzdFxuIiwgImFnZW50X2tleSI6ICIxMDRmZTA2NTllMTBiNDI2Y2Y4OGYwMjRmYjU3MTU1
|
||||
MyIsICJ0b29sc19uYW1lcyI6IFtdfV16AhgBhQEAAQAAEo4CChCkKf4+mBo3buykKHqmcwYdEgit
|
||||
HkuXVEC4UCoMVGFzayBDcmVhdGVkMAE5uJAnDvKuCRhBcBkoDvKuCRhKLgoIY3Jld19rZXkSIgog
|
||||
MWYxMjhiZGI3YmFhNGI2NzcxNGYxZGFlZGMyZjNhYjZKMQoHY3Jld19pZBImCiQzNTE4YjRjNS0x
|
||||
YTM5LTRkYjEtODEwMy03MzllNjQ5YzAwZDhKLgoIdGFza19rZXkSIgogNmFmYzRiMzk2MjU5ZmJi
|
||||
NzY4MWY1NmM3NzU1Y2M5MzdKMQoHdGFza19pZBImCiQyZjY4MWVjZi02ZjRjLTQ3OWEtYTRlYS1j
|
||||
ZjRlNWY0YzZlYWV6AhgBhQEAAQAA
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '1902'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.27.0
|
||||
method: POST
|
||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Wed, 20 Nov 2024 13:04:49 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are {topic} Senior Data
|
||||
Researcher\n. You''re a seasoned researcher with a knack for uncovering the
|
||||
latest developments in {topic}. Known for your ability to find the most relevant
|
||||
information and present it in a clear and concise manner.\n\nYour personal goal
|
||||
is: Uncover cutting-edge developments in {topic}\n\nTo give my best complete
|
||||
final answer to the task use the exact following format:\n\nThought: I now can
|
||||
give a great answer\nFinal Answer: Your final answer must be the great and the
|
||||
most complete as possible, it must be outcome described.\n\nI MUST use these
|
||||
formats, my job depends on it!"}, {"role": "user", "content": "\nCurrent Task:
|
||||
Conduct a thorough research about {topic} Make sure you find any interesting
|
||||
and relevant information given the current year is 2024.\n\n\nThis is the expect
|
||||
criteria for your final answer: A list with 10 bullet points of the most relevant
|
||||
information about {topic}\n\nyou MUST return the actual complete content as
|
||||
the final answer, not a summary.\n\nBegin! This is VERY important to you, use
|
||||
the tools available and give your best Final Answer, your job depends on it!\n\nThought:"}],
|
||||
"model": "gpt-4o-mini", "stop": ["\nObservation:"], "stream": false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '1255'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=CkK4UvBd9ukXvn50uJwGambJcz5zERAJfeXJ9xge6H4-1732107842-1.0.1.1-IOK2yVL3RlD75MgmnKzIEyE38HNknwn6I8BBJ1wjGz4jCTd0YWIBPnvUm9gB8D_zLlUA9G7p_wbrfyc4mO_Bmg;
|
||||
_cfuvid=MmeN9oHWrBLThkEJdaSFHBfWe95JvA8iFnnt7CC92tk-1732107842102-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- x64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
content: "{\n \"id\": \"chatcmpl-AVefuCEPMJPCqhgvBPhOk55hlNQ0m\",\n \"object\":
|
||||
\"chat.completion\",\n \"created\": 1732107886,\n \"model\": \"gpt-4o-mini-2024-07-18\",\n
|
||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
||||
\"assistant\",\n \"content\": \"I now can give a great answer. \\nFinal
|
||||
Answer: \\n\\n1. **Artificial Intelligence Advancements**: In 2024, AI has made
|
||||
significant strides in natural language processing and computer vision, with
|
||||
models achieving near-human-level understanding and interpretation capabilities.
|
||||
This has led to more sophisticated AI applications across industries.\\n\\n2.
|
||||
**Quantum Computing Progress**: Quantum computers are now capable of surpassing
|
||||
traditional computing power for specific tasks, with breakthroughs in error
|
||||
correction and qubit coherence. This achievement is paving the way for real-world
|
||||
applications in cryptography and complex problem-solving.\\n\\n3. **Sustainable
|
||||
Energy Technologies**: The shift toward renewable energy sources has accelerated,
|
||||
with innovations in solar panel efficiency and the rise of hydrogen fuel cells
|
||||
gaining traction as viable alternatives for energy storage and transportation.\\n\\n4.
|
||||
**Augmented Reality Enhancements**: In 2024, augmented reality (AR) technologies
|
||||
are being integrated into everyday applications, from retail to education, providing
|
||||
immersive experiences that enhance learning and consumer engagement.\\n\\n5.
|
||||
**5G Expansion and 6G Development**: The rollout of 5G continues to expand globally,
|
||||
while foundational work on 6G is underway, promising enhanced connectivity speeds,
|
||||
low latency, and the potential for new applications like smart cities and automated
|
||||
industries.\\n\\n6. **Data Privacy Regulations**: As data breaches become increasingly
|
||||
sophisticated, worldwide regulations around data privacy have tightened, with
|
||||
new laws being implemented to protect consumer information and corporate accountability
|
||||
in data handling.\\n\\n7. **Biotechnology Breakthroughs**: Advances in gene
|
||||
editing technologies, particularly CRISPR, have progressed rapidly, making personalized
|
||||
medicine and agricultural improvements more feasible, aiming to address genetic
|
||||
diseases and food security.\\n\\n8. **Blockchain Applications**: Beyond cryptocurrencies,
|
||||
blockchain technology is being applied in supply chain management and digital
|
||||
identity verification, offering transparent and secure methods for transactions
|
||||
and record-keeping.\\n\\n9. **Mental Health Technology**: The integration of
|
||||
technology in mental health care is expanding, with virtual reality and AI-driven
|
||||
apps providing new therapeutic options for patients, drastically improving accessibility
|
||||
and treatment personalization.\\n\\n10. **Transportation Innovations**: Electric
|
||||
vehicles (EVs) have seen increased adoption due to advancements in battery technology,
|
||||
while autonomous vehicles are becoming more prevalent, with pilot programs indicating
|
||||
potential for widespread urban deployment by 2025.\",\n \"refusal\":
|
||||
null\n },\n \"logprobs\": null,\n \"finish_reason\": \"stop\"\n
|
||||
\ }\n ],\n \"usage\": {\n \"prompt_tokens\": 234,\n \"completion_tokens\":
|
||||
457,\n \"total_tokens\": 691,\n \"prompt_tokens_details\": {\n \"cached_tokens\":
|
||||
0,\n \"audio_tokens\": 0\n },\n \"completion_tokens_details\": {\n
|
||||
\ \"reasoning_tokens\": 0,\n \"audio_tokens\": 0,\n \"accepted_prediction_tokens\":
|
||||
0,\n \"rejected_prediction_tokens\": 0\n }\n },\n \"system_fingerprint\":
|
||||
\"fp_0705bf87c0\"\n}\n"
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8e58a5d1ef736225-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Wed, 20 Nov 2024 13:04:50 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '3814'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-ratelimit-limit-requests:
|
||||
- '30000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '150000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '29999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '149999710'
|
||||
x-ratelimit-reset-requests:
|
||||
- 2ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_0e0bf8c81c9997414688b5188337104b
|
||||
http_version: HTTP/1.1
|
||||
status_code: 200
|
||||
- request:
|
||||
body: !!binary |
|
||||
Cs4CCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSpQIKEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRKOAgoQT0LRe4bJ4FgqPQObXTZKYRIIpR3A/gdzzPQqDFRhc2sgQ3JlYXRlZDABOfCJ
|
||||
CAXzrgkYQcAICgXzrgkYSi4KCGNyZXdfa2V5EiIKIDFmMTI4YmRiN2JhYTRiNjc3MTRmMWRhZWRj
|
||||
MmYzYWI2SjEKB2NyZXdfaWQSJgokMzUxOGI0YzUtMWEzOS00ZGIxLTgxMDMtNzM5ZTY0OWMwMGQ4
|
||||
Si4KCHRhc2tfa2V5EiIKIGIxN2IxODhkYmYxNGY5M2E5OGU1Yjk1YWFkMzY3NTc3SjEKB3Rhc2tf
|
||||
aWQSJgokODAzZDlhZjItN2FiMC00NjM3LWIxYmMtOTE0MmYxYmQwMzRhegIYAYUBAAEAAA==
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '337'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.27.0
|
||||
method: POST
|
||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Wed, 20 Nov 2024 13:04:54 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are {topic} Reporting
|
||||
Analyst\n. You''re a meticulous analyst with a keen eye for detail. You''re
|
||||
known for your ability to turn complex data into clear and concise reports,
|
||||
making it easy for others to understand and act on the information you provide.\n\nYour
|
||||
personal goal is: Create detailed reports based on {topic} data analysis and
|
||||
research findings\n\nTo give my best complete final answer to the task use the
|
||||
exact following format:\n\nThought: I now can give a great answer\nFinal Answer:
|
||||
Your final answer must be the great and the most complete as possible, it must
|
||||
be outcome described.\n\nI MUST use these formats, my job depends on it!"},
|
||||
{"role": "user", "content": "\nCurrent Task: Review the context you got and
|
||||
expand each topic into a full section for a report. Make sure the report is
|
||||
detailed and contains any and all relevant information.\n\n\nThis is the expect
|
||||
criteria for your final answer: A fully fledge reports with the mains topics,
|
||||
each with a full section of information. Formatted as markdown without ''```''\n\nyou
|
||||
MUST return the actual complete content as the final answer, not a summary.\n\nThis
|
||||
is the context you''re working with:\n1. **Artificial Intelligence Advancements**:
|
||||
In 2024, AI has made significant strides in natural language processing and
|
||||
computer vision, with models achieving near-human-level understanding and interpretation
|
||||
capabilities. This has led to more sophisticated AI applications across industries.\n\n2.
|
||||
**Quantum Computing Progress**: Quantum computers are now capable of surpassing
|
||||
traditional computing power for specific tasks, with breakthroughs in error
|
||||
correction and qubit coherence. This achievement is paving the way for real-world
|
||||
applications in cryptography and complex problem-solving.\n\n3. **Sustainable
|
||||
Energy Technologies**: The shift toward renewable energy sources has accelerated,
|
||||
with innovations in solar panel efficiency and the rise of hydrogen fuel cells
|
||||
gaining traction as viable alternatives for energy storage and transportation.\n\n4.
|
||||
**Augmented Reality Enhancements**: In 2024, augmented reality (AR) technologies
|
||||
are being integrated into everyday applications, from retail to education, providing
|
||||
immersive experiences that enhance learning and consumer engagement.\n\n5. **5G
|
||||
Expansion and 6G Development**: The rollout of 5G continues to expand globally,
|
||||
while foundational work on 6G is underway, promising enhanced connectivity speeds,
|
||||
low latency, and the potential for new applications like smart cities and automated
|
||||
industries.\n\n6. **Data Privacy Regulations**: As data breaches become increasingly
|
||||
sophisticated, worldwide regulations around data privacy have tightened, with
|
||||
new laws being implemented to protect consumer information and corporate accountability
|
||||
in data handling.\n\n7. **Biotechnology Breakthroughs**: Advances in gene editing
|
||||
technologies, particularly CRISPR, have progressed rapidly, making personalized
|
||||
medicine and agricultural improvements more feasible, aiming to address genetic
|
||||
diseases and food security.\n\n8. **Blockchain Applications**: Beyond cryptocurrencies,
|
||||
blockchain technology is being applied in supply chain management and digital
|
||||
identity verification, offering transparent and secure methods for transactions
|
||||
and record-keeping.\n\n9. **Mental Health Technology**: The integration of technology
|
||||
in mental health care is expanding, with virtual reality and AI-driven apps
|
||||
providing new therapeutic options for patients, drastically improving accessibility
|
||||
and treatment personalization.\n\n10. **Transportation Innovations**: Electric
|
||||
vehicles (EVs) have seen increased adoption due to advancements in battery technology,
|
||||
while autonomous vehicles are becoming more prevalent, with pilot programs indicating
|
||||
potential for widespread urban deployment by 2025.\n\nBegin! This is VERY important
|
||||
to you, use the tools available and give your best Final Answer, your job depends
|
||||
on it!\n\nThought:"}], "model": "gpt-4o-mini", "stop": ["\nObservation:"], "stream":
|
||||
false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '4065'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=CkK4UvBd9ukXvn50uJwGambJcz5zERAJfeXJ9xge6H4-1732107842-1.0.1.1-IOK2yVL3RlD75MgmnKzIEyE38HNknwn6I8BBJ1wjGz4jCTd0YWIBPnvUm9gB8D_zLlUA9G7p_wbrfyc4mO_Bmg;
|
||||
_cfuvid=MmeN9oHWrBLThkEJdaSFHBfWe95JvA8iFnnt7CC92tk-1732107842102-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- x64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
content: "{\n \"id\": \"chatcmpl-AVefym1A3aTi6N7szB8ei85GCHkyG\",\n \"object\":
|
||||
\"chat.completion\",\n \"created\": 1732107890,\n \"model\": \"gpt-4o-mini-2024-07-18\",\n
|
||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
||||
\"assistant\",\n \"content\": \"I now can give a great answer \\nFinal
|
||||
Answer: \\n\\n# Comprehensive Report on Key Technology Trends in 2024\\n\\n##
|
||||
1. Artificial Intelligence Advancements\\nIn 2024, artificial intelligence (AI)
|
||||
has undergone remarkable advancements, particularly in the fields of natural
|
||||
language processing (NLP) and computer vision. AI models are now achieving near-human-level
|
||||
understanding and interpretation capabilities, enabling more nuanced interactions
|
||||
between humans and machines. This progression has spurred the development of
|
||||
sophisticated AI applications across various sectors, from healthcare, where
|
||||
AI can analyze medical images and assist in diagnostic processes, to finance,
|
||||
where predictive analytics enhances decision-making and risk management. With
|
||||
AI being employed in customer service chatbots and personal assistants, the
|
||||
technology's integration into daily operations significantly improves productivity
|
||||
and user experience.\\n\\n## 2. Quantum Computing Progress\\nThe capabilities
|
||||
of quantum computers have expanded significantly, showcasing their potential
|
||||
to exceed traditional computing power for specific tasks. In 2024, key breakthroughs
|
||||
have been made in areas such as error correction and qubit coherence, addressing
|
||||
longstanding challenges in the field. These advancements are not only enhancing
|
||||
the performance of quantum systems but are also paving the way for practical
|
||||
applications in cryptography, where quantum encryption could revolutionize data
|
||||
security protocols, and in complex problem-solving scenarios across scientific
|
||||
research and logistics. As quantum technology matures, it holds the promise
|
||||
to solve problems that are currently intractable for classical computers.\\n\\n##
|
||||
3. Sustainable Energy Technologies\\nThe global shift towards sustainable energy
|
||||
sources has gained remarkable momentum in 2024, driven by innovations in solar
|
||||
panel efficiency and the adoption of hydrogen fuel cells. Advances in photovoltaic
|
||||
technology have led to the development of more efficient solar panels capable
|
||||
of capturing a higher percentage of sunlight, reducing reliance on fossil fuels.
|
||||
Simultaneously, hydrogen fuel cells are emerging as a viable alternative for
|
||||
energy storage and transportation, particularly in heavy-duty vehicles and public
|
||||
transport systems. This transformation towards greener energy solutions is critical
|
||||
in combating climate change while fostering economic growth through new job
|
||||
creation in the clean technology sector.\\n\\n## 4. Augmented Reality Enhancements\\nAugmented
|
||||
reality (AR) technologies are becoming increasingly integrated into everyday
|
||||
applications as of 2024, providing immersive experiences across various industries,
|
||||
including retail and education. In retail, AR is enhancing consumer engagement
|
||||
by allowing customers to visualize products in a real-world context before making
|
||||
a purchase. In education, AR is facilitating interactive learning experiences,
|
||||
enabling students to engage with complex subjects through visual simulations
|
||||
and augmented textbooks. These advancements not only improve user engagement
|
||||
but also foster greater understanding and retention of information.\\n\\n##
|
||||
5. 5G Expansion and 6G Development\\nThe global rollout of 5G technology continues
|
||||
at a rapid pace, significantly enhancing connectivity speeds and reducing latency.
|
||||
As 2024 progresses, foundational work on the next-generation 6G networks is
|
||||
also underway, promising even greater improvements in connectivity and the potential
|
||||
for groundbreaking applications. These advancements are facilitating the emergence
|
||||
of smart cities, automated industries, and enhanced telecommunications services.
|
||||
The increased bandwidth provided by 5G and the anticipation surrounding 6G enable
|
||||
new possibilities in mobile communications, Internet of Things (IoT) implementations,
|
||||
and real-time data processing.\\n\\n## 6. Data Privacy Regulations\\nAs data
|
||||
breaches become increasingly sophisticated and pervasive, regulations surrounding
|
||||
data privacy have intensified globally in 2024. Many countries have enacted
|
||||
new laws aimed at protecting consumer information and holding businesses accountable
|
||||
for their data handling practices. This regulatory environment requires organizations
|
||||
to implement robust data protection measures, maintain transparency, and establish
|
||||
trust with consumers. The emphasis on data privacy not only safeguards individuals'
|
||||
personal information but also promotes ethical practices within the tech industry,
|
||||
thereby fostering greater public confidence in emerging digital services.\\n\\n##
|
||||
7. Biotechnology Breakthroughs\\nThe biotechnology sector has experienced significant
|
||||
developments in 2024, particularly regarding gene editing technologies such
|
||||
as CRISPR. These advances enable researchers to make precise modifications to
|
||||
genetic material, paving the way for personalized medicine that targets genetic
|
||||
diseases at the source. Agricultural improvements are also on the horizon as
|
||||
genetically modified crops become more resilient to climate change and pests,
|
||||
addressing global food security challenges. With ongoing research and clinical
|
||||
trials, the potential applications of biotechnology in healthcare and agriculture
|
||||
present transformative opportunities to improve quality of life and sustainability.\\n\\n##
|
||||
8. Blockchain Applications\\nBeyond its initial use in cryptocurrencies, blockchain
|
||||
technology is finding diverse applications in areas such as supply chain management
|
||||
and digital identity verification. In 2024, businesses leverage blockchain's
|
||||
inherent transparency and security to streamline operations, enhance traceability,
|
||||
and foster trust among stakeholders. For example, in supply chain management,
|
||||
blockchain allows for real-time tracking of products, enabling greater accountability
|
||||
and efficient inventory management. Similarly, digital identity verification
|
||||
is becoming more secure through decentralized systems, reducing the risk of
|
||||
identity theft and fraud, thereby enhancing overall trust in digital transactions.\\n\\n##
|
||||
9. Mental Health Technology\\nIn recent years, technology's role in mental health
|
||||
care has expanded dramatically, with innovative solutions such as virtual reality
|
||||
(VR) therapy and AI-driven mental health applications emerging in 2024. These
|
||||
technologies offer patients new therapeutic options that enhance accessibility
|
||||
and treatment personalization. VR environments can simulate therapeutic situations
|
||||
for exposure therapy while AI algorithms tailor mental health interventions
|
||||
according to individual needs, improving overall efficacy. This integration
|
||||
of technology into mental health care has the potential to bridge gaps in traditional
|
||||
therapy access, providing vital support to those in need.\\n\\n## 10. Transportation
|
||||
Innovations\\nTransportation has seen significant innovations as of 2024, driven
|
||||
primarily by advancements in electric vehicles (EVs) and autonomous technology.
|
||||
Increasing adoption of EVs is correlated with enhanced battery technologies
|
||||
that extend range and reduce charging time, making them more appealing to consumers.
|
||||
Simultaneously, pilot programs for autonomous vehicles are indicating promising
|
||||
results for safe integration into urban environments. These developments present
|
||||
the opportunity for reduced traffic congestion, lower emissions, and enhanced
|
||||
mobility solutions, fundamentally reshaping the landscape of transportation
|
||||
in cities worldwide by 2025. \\n\\nThrough these comprehensive explorations
|
||||
of emerging technologies, it is clear that 2024 marks a pivotal year for innovation,
|
||||
shaping future directions in various industries and enhancing societal progress.\",\n
|
||||
\ \"refusal\": null\n },\n \"logprobs\": null,\n \"finish_reason\":
|
||||
\"stop\"\n }\n ],\n \"usage\": {\n \"prompt_tokens\": 708,\n \"completion_tokens\":
|
||||
1223,\n \"total_tokens\": 1931,\n \"prompt_tokens_details\": {\n \"cached_tokens\":
|
||||
0,\n \"audio_tokens\": 0\n },\n \"completion_tokens_details\": {\n
|
||||
\ \"reasoning_tokens\": 0,\n \"audio_tokens\": 0,\n \"accepted_prediction_tokens\":
|
||||
0,\n \"rejected_prediction_tokens\": 0\n }\n },\n \"system_fingerprint\":
|
||||
\"fp_0705bf87c0\"\n}\n"
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8e58a5ec0f936225-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Wed, 20 Nov 2024 13:05:05 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '15043'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-ratelimit-limit-requests:
|
||||
- '30000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '150000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '29999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '149999013'
|
||||
x-ratelimit-reset-requests:
|
||||
- 2ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_4bd436f5144121694f8df654ed8514ea
|
||||
http_version: HTTP/1.1
|
||||
status_code: 200
|
||||
version: 1
|
||||
@@ -10,7 +10,8 @@ interactions:
|
||||
criteria for your final answer: 1 bullet point about dog that''s under 15 words.\nyou
|
||||
MUST return the actual complete content as the final answer, not a summary.\n\nBegin!
|
||||
This is VERY important to you, use the tools available and give your best Final
|
||||
Answer, your job depends on it!\n\nThought:"}], "model": "gpt-4o"}'
|
||||
Answer, your job depends on it!\n\nThought:"}], "model": "gpt-4o-mini", "stop":
|
||||
["\nObservation:"], "stream": false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
@@ -19,49 +20,50 @@ interactions:
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '869'
|
||||
- '919'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=9.8sBYBkvBR8R1K_bVF7xgU..80XKlEIg3N2OBbTSCU-1727214102-1.0.1.1-.qiTLXbPamYUMSuyNsOEB9jhGu.jOifujOrx9E2JZvStbIZ9RTIiE44xKKNfLPxQkOi6qAT3h6htK8lPDGV_5g;
|
||||
_cfuvid=lbRdAddVWV6W3f5Dm9SaOPWDUOxqtZBSPr_fTW26nEA-1727213194587-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.47.0
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
- x64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
- Linux
|
||||
x-stainless-package-version:
|
||||
- 1.47.0
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.7
|
||||
- 3.11.9
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
content: "{\n \"id\": \"chatcmpl-AB7auGDrAVE0iXSBBhySZp3xE8gvP\",\n \"object\":
|
||||
\"chat.completion\",\n \"created\": 1727214164,\n \"model\": \"gpt-4o-2024-05-13\",\n
|
||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
||||
\"assistant\",\n \"content\": \"I now can give a great answer\\nFinal
|
||||
Answer: Dogs are unparalleled in loyalty and companionship to humans.\",\n \"refusal\":
|
||||
null\n },\n \"logprobs\": null,\n \"finish_reason\": \"stop\"\n
|
||||
\ }\n ],\n \"usage\": {\n \"prompt_tokens\": 175,\n \"completion_tokens\":
|
||||
21,\n \"total_tokens\": 196,\n \"completion_tokens_details\": {\n \"reasoning_tokens\":
|
||||
0\n }\n },\n \"system_fingerprint\": \"fp_e375328146\"\n}\n"
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAA4xSy27bMBC86ysWPEuB7ciV7VuAIkUObQ+59QFhTa0kttQuS9Jx08D/XkhyLAVJ
|
||||
gV4EaGdnMLPDpwRAmUrtQOkWo+6czW7u41q2t3+cvCvuPvxafSG+58XHTzXlxWeV9gzZ/yAdn1lX
|
||||
WjpnKRrhEdaeMFKvuiyul3m+uV7lA9BJRbanNS5muWSdYZOtFqs8WxTZcnNmt2I0BbWDrwkAwNPw
|
||||
7X1yRb/VDhbp86SjELAhtbssASgvtp8oDMGEiBxVOoFaOBIP1u+A5QgaGRrzQIDQ9LYBORzJA3zj
|
||||
W8No4Wb438F7aQKgJ7DyiBb6zMhGOKRA3CJrww10xBEttIQ2toBcgTyQR2vhSNZmezLcXM39eKoP
|
||||
Afub8MHa8/x0CWilcV724Yxf5rVhE9rSEwbhPkyI4tSAnhKA78MhDy9uo5yXzsUyyk/iMHSzHvXU
|
||||
1N+Ejo0BqCgR7Yy13aZv6JUVRTQ2zKpQGnVL1USdesNDZWQGJLPUr928pT0mN9z8j/wEaE0uUlU6
|
||||
T5XRLxNPa5765/2vtcuVB8MqPIZIXVkbbsg7b8bHVbuyXm9xs8xXRa2SU/IXAAD//wMAq2ZCBWoD
|
||||
AAA=
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8c85f22ddda01cf3-GRU
|
||||
- 8e19bf36db158761-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
@@ -69,19 +71,27 @@ interactions:
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Tue, 24 Sep 2024 21:42:44 GMT
|
||||
- Tue, 12 Nov 2024 21:52:04 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Set-Cookie:
|
||||
- __cf_bm=MkvcnvacGpTyn.y0OkFRoFXuAwg4oxjMhViZJTt9mw0-1731448324-1.0.1.1-oekkH_B0xOoPnIFw15LpqFCkZ2cu7VBTJVLDGylan4I67NjX.tlPvOiX9kvtP5Acewi28IE2IwlwtrZWzCH3vw;
|
||||
path=/; expires=Tue, 12-Nov-24 22:22:04 GMT; domain=.api.openai.com; HttpOnly;
|
||||
Secure; SameSite=None
|
||||
- _cfuvid=4.17346mfw5npZfYNbCx3Vj1VAVPy.tH0Jm2gkTteJ8-1731448324998-0.0.1.1-604800000;
|
||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
- user-tqfegqsiobpvvjmn0giaipdq
|
||||
openai-processing-ms:
|
||||
- '349'
|
||||
- '601'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
@@ -89,19 +99,20 @@ interactions:
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '30000000'
|
||||
- '200000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '29999792'
|
||||
- '199793'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
- 8.64s
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
- 62ms
|
||||
x-request-id:
|
||||
- req_4c8cd76fdfba7b65e5ce85397b33c22b
|
||||
http_version: HTTP/1.1
|
||||
status_code: 200
|
||||
- req_77fb166b4e272bfd45c37c08d2b93b0c
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are cat Researcher. You
|
||||
have a lot of experience with cat.\nYour personal goal is: Express hot takes
|
||||
@@ -113,7 +124,8 @@ interactions:
|
||||
criteria for your final answer: 1 bullet point about cat that''s under 15 words.\nyou
|
||||
MUST return the actual complete content as the final answer, not a summary.\n\nBegin!
|
||||
This is VERY important to you, use the tools available and give your best Final
|
||||
Answer, your job depends on it!\n\nThought:"}], "model": "gpt-4o"}'
|
||||
Answer, your job depends on it!\n\nThought:"}], "model": "gpt-4o-mini", "stop":
|
||||
["\nObservation:"], "stream": false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
@@ -122,49 +134,53 @@ interactions:
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '869'
|
||||
- '919'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=9.8sBYBkvBR8R1K_bVF7xgU..80XKlEIg3N2OBbTSCU-1727214102-1.0.1.1-.qiTLXbPamYUMSuyNsOEB9jhGu.jOifujOrx9E2JZvStbIZ9RTIiE44xKKNfLPxQkOi6qAT3h6htK8lPDGV_5g;
|
||||
_cfuvid=lbRdAddVWV6W3f5Dm9SaOPWDUOxqtZBSPr_fTW26nEA-1727213194587-0.0.1.1-604800000
|
||||
- __cf_bm=MkvcnvacGpTyn.y0OkFRoFXuAwg4oxjMhViZJTt9mw0-1731448324-1.0.1.1-oekkH_B0xOoPnIFw15LpqFCkZ2cu7VBTJVLDGylan4I67NjX.tlPvOiX9kvtP5Acewi28IE2IwlwtrZWzCH3vw;
|
||||
_cfuvid=4.17346mfw5npZfYNbCx3Vj1VAVPy.tH0Jm2gkTteJ8-1731448324998-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.47.0
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
- x64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
- Linux
|
||||
x-stainless-package-version:
|
||||
- 1.47.0
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.7
|
||||
- 3.11.9
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
content: "{\n \"id\": \"chatcmpl-AB7auNbAqjT3rgBX92rhxBLuhaLBj\",\n \"object\":
|
||||
\"chat.completion\",\n \"created\": 1727214164,\n \"model\": \"gpt-4o-2024-05-13\",\n
|
||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
||||
\"assistant\",\n \"content\": \"Thought: I now can give a great answer\\nFinal
|
||||
Answer: Cats are highly independent, agile, and intuitive creatures beloved
|
||||
by millions worldwide.\",\n \"refusal\": null\n },\n \"logprobs\":
|
||||
null,\n \"finish_reason\": \"stop\"\n }\n ],\n \"usage\": {\n \"prompt_tokens\":
|
||||
175,\n \"completion_tokens\": 28,\n \"total_tokens\": 203,\n \"completion_tokens_details\":
|
||||
{\n \"reasoning_tokens\": 0\n }\n },\n \"system_fingerprint\": \"fp_e375328146\"\n}\n"
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAA4xSy27bMBC86ysWPFuB7MhN6ltQIGmBnlL00BcEmlxJ21JLhlzFLQL/eyH5IRlt
|
||||
gV4EaGZnMLPLlwxAkVUbUKbVYrrg8rsPsg4P+Orxs9XvPz0U8eP966dS6sdo3wa1GBR++x2NnFRX
|
||||
xnfBoZDnA20iasHBdXlzvSzL2+vVeiQ6b9ENsiZIXvq8I6Z8VazKvLjJl7dHdevJYFIb+JIBALyM
|
||||
3yEnW/ypNlAsTkiHKekG1eY8BKCidwOidEqURLOoxUQaz4I8Rn8H7HdgNENDzwgamiE2aE47jABf
|
||||
+Z5YO7gb/zfwRksCHRGGGAHZIg/D1GmXFiBtpGfiBjyDtEgR/I5BMHYJNFvomZ56hIAxedaOhDBd
|
||||
zYNFrPukh+Vw79wR35+bOt+E6LfpyJ/xmphSW0XUyfPQKokPamT3GcC3caP9xZJUiL4LUon/gZzG
|
||||
I60Pfmo65MSuTqR40W6GF8c7XPpVFkWTS7ObKKNNi3aSTgfUvSU/I7JZ6z/T/M370Jy4+R/7iTAG
|
||||
g6CtQkRL5rLxNBZxeOf/GjtveQys0q8k2FU1cYMxRDq8sjpUxVYXdrkq66XK9tlvAAAA//8DAIjK
|
||||
KzJzAwAA
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8c85f2321c1c1cf3-GRU
|
||||
- 8e19bf3fae118761-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
@@ -172,7 +188,7 @@ interactions:
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Tue, 24 Sep 2024 21:42:45 GMT
|
||||
- Tue, 12 Nov 2024 21:52:05 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
@@ -181,10 +197,12 @@ interactions:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
- user-tqfegqsiobpvvjmn0giaipdq
|
||||
openai-processing-ms:
|
||||
- '430'
|
||||
- '464'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
@@ -192,19 +210,20 @@ interactions:
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '30000000'
|
||||
- '200000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
- '9998'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '29999792'
|
||||
- '199792'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
- 16.369s
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
- 62ms
|
||||
x-request-id:
|
||||
- req_ace859b7d9e83d9fa7753ce23bb03716
|
||||
http_version: HTTP/1.1
|
||||
status_code: 200
|
||||
- req_91706b23d0ef23458ba63ec18304cd28
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are apple Researcher.
|
||||
You have a lot of experience with apple.\nYour personal goal is: Express hot
|
||||
@@ -217,7 +236,7 @@ interactions:
|
||||
under 15 words.\nyou MUST return the actual complete content as the final answer,
|
||||
not a summary.\n\nBegin! This is VERY important to you, use the tools available
|
||||
and give your best Final Answer, your job depends on it!\n\nThought:"}], "model":
|
||||
"gpt-4o"}'
|
||||
"gpt-4o-mini", "stop": ["\nObservation:"], "stream": false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
@@ -226,49 +245,53 @@ interactions:
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '879'
|
||||
- '929'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=9.8sBYBkvBR8R1K_bVF7xgU..80XKlEIg3N2OBbTSCU-1727214102-1.0.1.1-.qiTLXbPamYUMSuyNsOEB9jhGu.jOifujOrx9E2JZvStbIZ9RTIiE44xKKNfLPxQkOi6qAT3h6htK8lPDGV_5g;
|
||||
_cfuvid=lbRdAddVWV6W3f5Dm9SaOPWDUOxqtZBSPr_fTW26nEA-1727213194587-0.0.1.1-604800000
|
||||
- __cf_bm=MkvcnvacGpTyn.y0OkFRoFXuAwg4oxjMhViZJTt9mw0-1731448324-1.0.1.1-oekkH_B0xOoPnIFw15LpqFCkZ2cu7VBTJVLDGylan4I67NjX.tlPvOiX9kvtP5Acewi28IE2IwlwtrZWzCH3vw;
|
||||
_cfuvid=4.17346mfw5npZfYNbCx3Vj1VAVPy.tH0Jm2gkTteJ8-1731448324998-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.47.0
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
- x64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
- Linux
|
||||
x-stainless-package-version:
|
||||
- 1.47.0
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.7
|
||||
- 3.11.9
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
content: "{\n \"id\": \"chatcmpl-AB7avZ0yqY18ukQS7SnLkZydsx72b\",\n \"object\":
|
||||
\"chat.completion\",\n \"created\": 1727214165,\n \"model\": \"gpt-4o-2024-05-13\",\n
|
||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
||||
\"assistant\",\n \"content\": \"I now can give a great answer.\\n\\nFinal
|
||||
Answer: Apples are incredibly versatile, nutritious, and a staple in diets globally.\",\n
|
||||
\ \"refusal\": null\n },\n \"logprobs\": null,\n \"finish_reason\":
|
||||
\"stop\"\n }\n ],\n \"usage\": {\n \"prompt_tokens\": 175,\n \"completion_tokens\":
|
||||
25,\n \"total_tokens\": 200,\n \"completion_tokens_details\": {\n \"reasoning_tokens\":
|
||||
0\n }\n },\n \"system_fingerprint\": \"fp_a5d11b2ef2\"\n}\n"
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAA4xSPW/bMBDd9SsOXLpIgeTITarNS4t26JJubSHQ5IliSh1ZHv0RBP7vhSTHctAU
|
||||
6CJQ7909vHd3zxmAsFo0IFQvkxqCKzYPaf1b2/hhW+8PR9N9Kh9W5Zdhjebr4zeRjx1++4gqvXTd
|
||||
KD8Eh8l6mmkVUSYcVau726qu729X7ydi8Brd2GZCKmpfDJZssSpXdVHeFdX9ubv3ViGLBr5nAADP
|
||||
03f0SRqPooEyf0EGZJYGRXMpAhDRuxERktlykpREvpDKU0KarH8G8gdQksDYPYIEM9oGSXzACPCD
|
||||
PlqSDjbTfwObEBy+Y0Dl+YkTDmApoYkyIUMvoz7IiDmw79L8kqSBMe7HMMAoB4fM7ikHpF6SsmRg
|
||||
xxgBjwGjRVJ4c+00YrdjOU6Lds6d8dMluvMmRL/lM3/BO0uW+zaiZE9jTE4+iIk9ZQA/pxHvXk1N
|
||||
hOiHkNrkfyHxtLX1rCeWzS7svEsAkXyS7govq/wNvVZjktbx1ZKEkqpHvbQuG5U7bf0VkV2l/tvN
|
||||
W9pzckvmf+QXQikMCXUbImqrXideyiKOh/+vssuUJ8NiPpO2s2Qwhmjns+tCW25lqatV3VUiO2V/
|
||||
AAAA//8DAPtpFJCEAwAA
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8c85f2369a761cf3-GRU
|
||||
- 8e19bf447ba48761-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
@@ -276,7 +299,7 @@ interactions:
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Tue, 24 Sep 2024 21:42:46 GMT
|
||||
- Tue, 12 Nov 2024 21:52:06 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
@@ -285,10 +308,12 @@ interactions:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
- user-tqfegqsiobpvvjmn0giaipdq
|
||||
openai-processing-ms:
|
||||
- '389'
|
||||
- '655'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
@@ -296,17 +321,18 @@ interactions:
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '30000000'
|
||||
- '200000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
- '9997'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '29999791'
|
||||
- '199791'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
- 24.239s
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
- 62ms
|
||||
x-request-id:
|
||||
- req_0167388f0a7a7f1a1026409834ceb914
|
||||
http_version: HTTP/1.1
|
||||
status_code: 200
|
||||
- req_a228208b0e965ecee334a6947d6c9e7c
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
version: 1
|
||||
|
||||
232
tests/cassettes/test_kickoff_for_each_error_handling.yaml
Normal file
@@ -0,0 +1,232 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: !!binary |
|
||||
Cv1YCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkS1FgKEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRLADQoQ5TzgW9QzcBbzMl1hJozLcxIIl3adf7U81wwqDENyZXcgQ3JlYXRlZDABOaAJ
|
||||
txhffgkYQfiVuRhffgkYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODAuMEoaCg5weXRob25fdmVy
|
||||
c2lvbhIICgYzLjEyLjVKLgoIY3Jld19rZXkSIgogM2Y4ZDVjM2FiODgyZDY4NjlkOTNjYjgxZjBl
|
||||
MmVkNGFKMQoHY3Jld19pZBImCiRjYjRiY2Q1Zi0xYWJkLTQyYmYtOGQ1OC02ZmEzMDU3ZDFjOTZK
|
||||
HAoMY3Jld19wcm9jZXNzEgwKCnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdf
|
||||
bnVtYmVyX29mX3Rhc2tzEgIYA0obChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgCSpIFCgtjcmV3
|
||||
X2FnZW50cxKCBQr/BFt7ImtleSI6ICI4YmQyMTM5YjU5NzUxODE1MDZlNDFmZDljNDU2M2Q3NSIs
|
||||
ICJpZCI6ICI1ZThjNTM1MS1jNWVlLTRhZGUtODY5MC1kM2RhOWI1NzI5YzciLCAicm9sZSI6ICJS
|
||||
ZXNlYXJjaGVyIiwgInZlcmJvc2U/IjogZmFsc2UsICJtYXhfaXRlciI6IDIwLCAibWF4X3JwbSI6
|
||||
IG51bGwsICJmdW5jdGlvbl9jYWxsaW5nX2xsbSI6ICIiLCAibGxtIjogImdwdC00by1taW5pIiwg
|
||||
ImRlbGVnYXRpb25fZW5hYmxlZD8iOiBmYWxzZSwgImFsbG93X2NvZGVfZXhlY3V0aW9uPyI6IGZh
|
||||
bHNlLCAibWF4X3JldHJ5X2xpbWl0IjogMiwgInRvb2xzX25hbWVzIjogW119LCB7ImtleSI6ICI5
|
||||
YTUwMTVlZjQ4OTVkYzYyNzhkNTQ4MThiYTQ0NmFmNyIsICJpZCI6ICJhMTcwODczOC0yYWE2LTRk
|
||||
ZmYtODFlNy00OGFkMDNjNWFjY2QiLCAicm9sZSI6ICJTZW5pb3IgV3JpdGVyIiwgInZlcmJvc2U/
|
||||
IjogZmFsc2UsICJtYXhfaXRlciI6IDIwLCAibWF4X3JwbSI6IG51bGwsICJmdW5jdGlvbl9jYWxs
|
||||
aW5nX2xsbSI6ICIiLCAibGxtIjogImdwdC00by1taW5pIiwgImRlbGVnYXRpb25fZW5hYmxlZD8i
|
||||
OiBmYWxzZSwgImFsbG93X2NvZGVfZXhlY3V0aW9uPyI6IGZhbHNlLCAibWF4X3JldHJ5X2xpbWl0
|
||||
IjogMiwgInRvb2xzX25hbWVzIjogW119XUrbBQoKY3Jld190YXNrcxLMBQrJBVt7ImtleSI6ICI2
|
||||
Nzg0OWZmNzE3ZGJhZGFiYTFiOTVkNWYyZGZjZWVhMSIsICJpZCI6ICIyNzkxNTMxMy0wNDBhLTRk
|
||||
ZWItOTVkMy1mNWVmMzg2Mjk3NTEiLCAiYXN5bmNfZXhlY3V0aW9uPyI6IHRydWUsICJodW1hbl9p
|
||||
bnB1dD8iOiBmYWxzZSwgImFnZW50X3JvbGUiOiAiUmVzZWFyY2hlciIsICJhZ2VudF9rZXkiOiAi
|
||||
OGJkMjEzOWI1OTc1MTgxNTA2ZTQxZmQ5YzQ1NjNkNzUiLCAidG9vbHNfbmFtZXMiOiBbXX0sIHsi
|
||||
a2V5IjogImZjNTZkZWEzOGM5OTc0YjZmNTVhMmUyOGMxNDk5ODg2IiwgImlkIjogIjc3NzQ3MmVl
|
||||
LWYzNzAtNDQyZS05NWMyLWVlMGVkYzZiMTgyZiIsICJhc3luY19leGVjdXRpb24/IjogZmFsc2Us
|
||||
ICJodW1hbl9pbnB1dD8iOiBmYWxzZSwgImFnZW50X3JvbGUiOiAiUmVzZWFyY2hlciIsICJhZ2Vu
|
||||
dF9rZXkiOiAiOGJkMjEzOWI1OTc1MTgxNTA2ZTQxZmQ5YzQ1NjNkNzUiLCAidG9vbHNfbmFtZXMi
|
||||
OiBbXX0sIHsia2V5IjogIjk0YTgyNmMxOTMwNTU5Njg2YmFmYjQwOWVlODM4NzZmIiwgImlkIjog
|
||||
ImM4OWEzODA2LTg5MDItNGQ2My1iYzA0LTdjMzRhZTJmM2UxNyIsICJhc3luY19leGVjdXRpb24/
|
||||
IjogZmFsc2UsICJodW1hbl9pbnB1dD8iOiBmYWxzZSwgImFnZW50X3JvbGUiOiAiU2VuaW9yIFdy
|
||||
aXRlciIsICJhZ2VudF9rZXkiOiAiOWE1MDE1ZWY0ODk1ZGM2Mjc4ZDU0ODE4YmE0NDZhZjciLCAi
|
||||
dG9vbHNfbmFtZXMiOiBbXX1degIYAYUBAAEAABKOAgoQSqupTllrk2mxgu2AqenZUhIIspWxig2+
|
||||
1M0qDFRhc2sgQ3JlYXRlZDABOcCj1BhffgkYQfhq1RhffgkYSi4KCGNyZXdfa2V5EiIKIDNmOGQ1
|
||||
YzNhYjg4MmQ2ODY5ZDkzY2I4MWYwZTJlZDRhSjEKB2NyZXdfaWQSJgokY2I0YmNkNWYtMWFiZC00
|
||||
MmJmLThkNTgtNmZhMzA1N2QxYzk2Si4KCHRhc2tfa2V5EiIKIDY3ODQ5ZmY3MTdkYmFkYWJhMWI5
|
||||
NWQ1ZjJkZmNlZWExSjEKB3Rhc2tfaWQSJgokMjc5MTUzMTMtMDQwYS00ZGViLTk1ZDMtZjVlZjM4
|
||||
NjI5NzUxegIYAYUBAAEAABKOAgoQ3dJesXQA5ISCqVgmwvBMgRIIdrWBiVQuihcqDFRhc2sgQ3Jl
|
||||
YXRlZDABOdjAch1ffgkYQVh8cx1ffgkYSi4KCGNyZXdfa2V5EiIKIDNmOGQ1YzNhYjg4MmQ2ODY5
|
||||
ZDkzY2I4MWYwZTJlZDRhSjEKB2NyZXdfaWQSJgokY2I0YmNkNWYtMWFiZC00MmJmLThkNTgtNmZh
|
||||
MzA1N2QxYzk2Si4KCHRhc2tfa2V5EiIKIGZjNTZkZWEzOGM5OTc0YjZmNTVhMmUyOGMxNDk5ODg2
|
||||
SjEKB3Rhc2tfaWQSJgokNzc3NDcyZWUtZjM3MC00NDJlLTk1YzItZWUwZWRjNmIxODJmegIYAYUB
|
||||
AAEAABKOAgoQCBmV+4VbArZNiL5MaefbahII1fRxaC46KKgqDFRhc2sgQ3JlYXRlZDABOaDs4SNf
|
||||
fgkYQai74iNffgkYSi4KCGNyZXdfa2V5EiIKIDNmOGQ1YzNhYjg4MmQ2ODY5ZDkzY2I4MWYwZTJl
|
||||
ZDRhSjEKB2NyZXdfaWQSJgokY2I0YmNkNWYtMWFiZC00MmJmLThkNTgtNmZhMzA1N2QxYzk2Si4K
|
||||
CHRhc2tfa2V5EiIKIDk0YTgyNmMxOTMwNTU5Njg2YmFmYjQwOWVlODM4NzZmSjEKB3Rhc2tfaWQS
|
||||
JgokYzg5YTM4MDYtODkwMi00ZDYzLWJjMDQtN2MzNGFlMmYzZTE3egIYAYUBAAEAABKiBwoQhITI
|
||||
U8q3JLgneRv1MZQY8RIIF2CpEmiZsP4qDENyZXcgQ3JlYXRlZDABOZDBCytffgkYQTDFDStffgkY
|
||||
ShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODAuMEoaCg5weXRob25fdmVyc2lvbhIICgYzLjEyLjVK
|
||||
LgoIY3Jld19rZXkSIgogYTljYzVkNDMzOTViMjFiMTgxYzgwYmQ0MzUxY2NlYzhKMQoHY3Jld19p
|
||||
ZBImCiQ2MTMwMWVmYS0yOGQ4LTQyNTItOWVjNi1iM2JmZDcyMWM0MzVKHAoMY3Jld19wcm9jZXNz
|
||||
EgwKCnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdfbnVtYmVyX29mX3Rhc2tz
|
||||
EgIYAUobChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgBStECCgtjcmV3X2FnZW50cxLBAgq+Alt7
|
||||
ImtleSI6ICI4YmQyMTM5YjU5NzUxODE1MDZlNDFmZDljNDU2M2Q3NSIsICJpZCI6ICI3NWRjOTUw
|
||||
OS02MjQ4LTQ0YWQtYTExZC1iZjdlZWVhOWI0NTQiLCAicm9sZSI6ICJSZXNlYXJjaGVyIiwgInZl
|
||||
cmJvc2U/IjogZmFsc2UsICJtYXhfaXRlciI6IDIwLCAibWF4X3JwbSI6IG51bGwsICJmdW5jdGlv
|
||||
bl9jYWxsaW5nX2xsbSI6ICIiLCAibGxtIjogImdwdC00by1taW5pIiwgImRlbGVnYXRpb25fZW5h
|
||||
YmxlZD8iOiBmYWxzZSwgImFsbG93X2NvZGVfZXhlY3V0aW9uPyI6IGZhbHNlLCAibWF4X3JldHJ5
|
||||
X2xpbWl0IjogMiwgInRvb2xzX25hbWVzIjogW119XUr+AQoKY3Jld190YXNrcxLvAQrsAVt7Imtl
|
||||
eSI6ICJlOWU2YjcyYWFjMzI2NDU5ZGQ3MDY4ZjBiMTcxN2MxYyIsICJpZCI6ICIxOTBlMGQ1Zi0y
|
||||
NDg1LTQ3N2ItYWIxNC1kMTlmNDE5YTFlYjQiLCAiYXN5bmNfZXhlY3V0aW9uPyI6IHRydWUsICJo
|
||||
dW1hbl9pbnB1dD8iOiBmYWxzZSwgImFnZW50X3JvbGUiOiAiUmVzZWFyY2hlciIsICJhZ2VudF9r
|
||||
ZXkiOiAiOGJkMjEzOWI1OTc1MTgxNTA2ZTQxZmQ5YzQ1NjNkNzUiLCAidG9vbHNfbmFtZXMiOiBb
|
||||
XX1degIYAYUBAAEAABKOAgoQxgDNe1lQGKnixKPk3O1TDBIISyqKkjcA7OYqDFRhc2sgQ3JlYXRl
|
||||
ZDABOfCYJCtffgkYQZAlJStffgkYSi4KCGNyZXdfa2V5EiIKIGE5Y2M1ZDQzMzk1YjIxYjE4MWM4
|
||||
MGJkNDM1MWNjZWM4SjEKB2NyZXdfaWQSJgokNjEzMDFlZmEtMjhkOC00MjUyLTllYzYtYjNiZmQ3
|
||||
MjFjNDM1Si4KCHRhc2tfa2V5EiIKIGU5ZTZiNzJhYWMzMjY0NTlkZDcwNjhmMGIxNzE3YzFjSjEK
|
||||
B3Rhc2tfaWQSJgokMTkwZTBkNWYtMjQ4NS00NzdiLWFiMTQtZDE5ZjQxOWExZWI0egIYAYUBAAEA
|
||||
ABK/DQoQwwR54Z8nOGgj2VSb63WRwhIIonLT+7Mwj00qDENyZXcgQ3JlYXRlZDABObDfvzhffgkY
|
||||
QfBvwjhffgkYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODAuMEoaCg5weXRob25fdmVyc2lvbhII
|
||||
CgYzLjEyLjVKLgoIY3Jld19rZXkSIgogNjZhOTYwZGM2OWZmZjU3OGIyNmM2MWQ0ZjdjNWE5ZmVK
|
||||
MQoHY3Jld19pZBImCiQxNThhMTkzOS01OWUzLTRlODgtYTRkYi04M2IzN2U5MjgxZWVKHAoMY3Jl
|
||||
d19wcm9jZXNzEgwKCnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdfbnVtYmVy
|
||||
X29mX3Rhc2tzEgIYA0obChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgCSpIFCgtjcmV3X2FnZW50
|
||||
cxKCBQr/BFt7ImtleSI6ICI4YmQyMTM5YjU5NzUxODE1MDZlNDFmZDljNDU2M2Q3NSIsICJpZCI6
|
||||
ICI1ZThjNTM1MS1jNWVlLTRhZGUtODY5MC1kM2RhOWI1NzI5YzciLCAicm9sZSI6ICJSZXNlYXJj
|
||||
aGVyIiwgInZlcmJvc2U/IjogZmFsc2UsICJtYXhfaXRlciI6IDIwLCAibWF4X3JwbSI6IG51bGws
|
||||
ICJmdW5jdGlvbl9jYWxsaW5nX2xsbSI6ICIiLCAibGxtIjogImdwdC00by1taW5pIiwgImRlbGVn
|
||||
YXRpb25fZW5hYmxlZD8iOiBmYWxzZSwgImFsbG93X2NvZGVfZXhlY3V0aW9uPyI6IGZhbHNlLCAi
|
||||
bWF4X3JldHJ5X2xpbWl0IjogMiwgInRvb2xzX25hbWVzIjogW119LCB7ImtleSI6ICI5YTUwMTVl
|
||||
ZjQ4OTVkYzYyNzhkNTQ4MThiYTQ0NmFmNyIsICJpZCI6ICJhMTcwODczOC0yYWE2LTRkZmYtODFl
|
||||
Ny00OGFkMDNjNWFjY2QiLCAicm9sZSI6ICJTZW5pb3IgV3JpdGVyIiwgInZlcmJvc2U/IjogZmFs
|
||||
c2UsICJtYXhfaXRlciI6IDIwLCAibWF4X3JwbSI6IG51bGwsICJmdW5jdGlvbl9jYWxsaW5nX2xs
|
||||
bSI6ICIiLCAibGxtIjogImdwdC00by1taW5pIiwgImRlbGVnYXRpb25fZW5hYmxlZD8iOiBmYWxz
|
||||
ZSwgImFsbG93X2NvZGVfZXhlY3V0aW9uPyI6IGZhbHNlLCAibWF4X3JldHJ5X2xpbWl0IjogMiwg
|
||||
InRvb2xzX25hbWVzIjogW119XUraBQoKY3Jld190YXNrcxLLBQrIBVt7ImtleSI6ICI5NDRhZWYw
|
||||
YmFjODQwZjFjMjdiZDgzYTkzN2JjMzYxYiIsICJpZCI6ICIzN2FkNzI5MC04Yjg5LTRjNWEtYmNl
|
||||
Zi03YzY0ZWJhMWM5NjciLCAiYXN5bmNfZXhlY3V0aW9uPyI6IHRydWUsICJodW1hbl9pbnB1dD8i
|
||||
OiBmYWxzZSwgImFnZW50X3JvbGUiOiAiUmVzZWFyY2hlciIsICJhZ2VudF9rZXkiOiAiOGJkMjEz
|
||||
OWI1OTc1MTgxNTA2ZTQxZmQ5YzQ1NjNkNzUiLCAidG9vbHNfbmFtZXMiOiBbXX0sIHsia2V5Ijog
|
||||
ImZjNTZkZWEzOGM5OTc0YjZmNTVhMmUyOGMxNDk5ODg2IiwgImlkIjogIjZhMmViMGY2LTgwZTIt
|
||||
NDkxOC05Zjk3LWVhNDY3OTNkMjI2YyIsICJhc3luY19leGVjdXRpb24/IjogdHJ1ZSwgImh1bWFu
|
||||
X2lucHV0PyI6IGZhbHNlLCAiYWdlbnRfcm9sZSI6ICJSZXNlYXJjaGVyIiwgImFnZW50X2tleSI6
|
||||
ICI4YmQyMTM5YjU5NzUxODE1MDZlNDFmZDljNDU2M2Q3NSIsICJ0b29sc19uYW1lcyI6IFtdfSwg
|
||||
eyJrZXkiOiAiOTRhODI2YzE5MzA1NTk2ODZiYWZiNDA5ZWU4Mzg3NmYiLCAiaWQiOiAiZGQ2Yzkz
|
||||
NzAtOGYwNC00ZDFmLThjODMtMmFiM2IyYzIwYWI3IiwgImFzeW5jX2V4ZWN1dGlvbj8iOiBmYWxz
|
||||
ZSwgImh1bWFuX2lucHV0PyI6IGZhbHNlLCAiYWdlbnRfcm9sZSI6ICJTZW5pb3IgV3JpdGVyIiwg
|
||||
ImFnZW50X2tleSI6ICI5YTUwMTVlZjQ4OTVkYzYyNzhkNTQ4MThiYTQ0NmFmNyIsICJ0b29sc19u
|
||||
YW1lcyI6IFtdfV16AhgBhQEAAQAAErMHChBV+1WNQzpVlY6l4C/mUgHzEgi3vWQXjOQJ5CoMQ3Jl
|
||||
dyBDcmVhdGVkMAE5sH1kPF9+CRhBaH1mPF9+CRhKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC44MC4w
|
||||
ShoKDnB5dGhvbl92ZXJzaW9uEggKBjMuMTIuNUouCghjcmV3X2tleRIiCiBlZTY3NDVkN2M4YWU4
|
||||
MmUwMGRmOTRkZTBmN2Y4NzExOEoxCgdjcmV3X2lkEiYKJDAwOThmODNmLTdkNTAtNGI2Mi1hYmIy
|
||||
LTJlNTc0N2ZlMWE4OUocCgxjcmV3X3Byb2Nlc3MSDAoKc2VxdWVudGlhbEoRCgtjcmV3X21lbW9y
|
||||
eRICEABKGgoUY3Jld19udW1iZXJfb2ZfdGFza3MSAhgBShsKFWNyZXdfbnVtYmVyX29mX2FnZW50
|
||||
cxICGAFK2QIKC2NyZXdfYWdlbnRzEskCCsYCW3sia2V5IjogImYzMzg2ZjZkOGRhNzVhYTQxNmE2
|
||||
ZTMxMDA1M2Y3Njk4IiwgImlkIjogIjEzODI4ZDViLWIyOWMtNDllMy05NWVhLTkyOGQ2ZmZhY2I0
|
||||
NSIsICJyb2xlIjogInt0b3BpY30gUmVzZWFyY2hlciIsICJ2ZXJib3NlPyI6IGZhbHNlLCAibWF4
|
||||
X2l0ZXIiOiAyMCwgIm1heF9ycG0iOiBudWxsLCAiZnVuY3Rpb25fY2FsbGluZ19sbG0iOiAiIiwg
|
||||
ImxsbSI6ICJncHQtNG8tbWluaSIsICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogZmFsc2UsICJhbGxv
|
||||
d19jb2RlX2V4ZWN1dGlvbj8iOiBmYWxzZSwgIm1heF9yZXRyeV9saW1pdCI6IDIsICJ0b29sc19u
|
||||
YW1lcyI6IFtdfV1KhwIKCmNyZXdfdGFza3MS+AEK9QFbeyJrZXkiOiAiMDZhNzMyMjBmNDE0OGE0
|
||||
YmJkNWJhY2IwZDBiNDRmY2UiLCAiaWQiOiAiNWM4MjM1ZmYtNWVjNy00NzFhLWI4NWEtNWFkZjk3
|
||||
YzJkYzI3IiwgImFzeW5jX2V4ZWN1dGlvbj8iOiBmYWxzZSwgImh1bWFuX2lucHV0PyI6IGZhbHNl
|
||||
LCAiYWdlbnRfcm9sZSI6ICJ7dG9waWN9IFJlc2VhcmNoZXIiLCAiYWdlbnRfa2V5IjogImYzMzg2
|
||||
ZjZkOGRhNzVhYTQxNmE2ZTMxMDA1M2Y3Njk4IiwgInRvb2xzX25hbWVzIjogW119XXoCGAGFAQAB
|
||||
AAASjgIKEHOZ/a+LQwTLSkMO0sPwg9gSCL1SwQw1+0iKKgxUYXNrIENyZWF0ZWQwATl4kX48X34J
|
||||
GEFIFn88X34JGEouCghjcmV3X2tleRIiCiBlZTY3NDVkN2M4YWU4MmUwMGRmOTRkZTBmN2Y4NzEx
|
||||
OEoxCgdjcmV3X2lkEiYKJDAwOThmODNmLTdkNTAtNGI2Mi1hYmIyLTJlNTc0N2ZlMWE4OUouCgh0
|
||||
YXNrX2tleRIiCiAwNmE3MzIyMGY0MTQ4YTRiYmQ1YmFjYjBkMGI0NGZjZUoxCgd0YXNrX2lkEiYK
|
||||
JDVjODIzNWZmLTVlYzctNDcxYS1iODVhLTVhZGY5N2MyZGMyN3oCGAGFAQABAAASswcKEHZQCRd7
|
||||
z4ZBCh4+06qs1r4SCNzrNsw+dn2zKgxDcmV3IENyZWF0ZWQwATlgWdtDX34JGEGIcN1DX34JGEoa
|
||||
Cg5jcmV3YWlfdmVyc2lvbhIICgYwLjgwLjBKGgoOcHl0aG9uX3ZlcnNpb24SCAoGMy4xMi41Si4K
|
||||
CGNyZXdfa2V5EiIKIGVlNjc0NWQ3YzhhZTgyZTAwZGY5NGRlMGY3Zjg3MTE4SjEKB2NyZXdfaWQS
|
||||
JgokZTgzMTdjNzEtNmZiZS00MjI5LWE3MzctYTkxM2I0ZmU0ZTU0ShwKDGNyZXdfcHJvY2VzcxIM
|
||||
CgpzZXF1ZW50aWFsShEKC2NyZXdfbWVtb3J5EgIQAEoaChRjcmV3X251bWJlcl9vZl90YXNrcxIC
|
||||
GAFKGwoVY3Jld19udW1iZXJfb2ZfYWdlbnRzEgIYAUrZAgoLY3Jld19hZ2VudHMSyQIKxgJbeyJr
|
||||
ZXkiOiAiZjMzODZmNmQ4ZGE3NWFhNDE2YTZlMzEwMDUzZjc2OTgiLCAiaWQiOiAiNjAwMzU5OTYt
|
||||
ZWU1ZS00YmZhLThmODctMGM1ZTY0OTBlMmE4IiwgInJvbGUiOiAie3RvcGljfSBSZXNlYXJjaGVy
|
||||
IiwgInZlcmJvc2U/IjogZmFsc2UsICJtYXhfaXRlciI6IDIwLCAibWF4X3JwbSI6IG51bGwsICJm
|
||||
dW5jdGlvbl9jYWxsaW5nX2xsbSI6ICIiLCAibGxtIjogImdwdC00by1taW5pIiwgImRlbGVnYXRp
|
||||
b25fZW5hYmxlZD8iOiBmYWxzZSwgImFsbG93X2NvZGVfZXhlY3V0aW9uPyI6IGZhbHNlLCAibWF4
|
||||
X3JldHJ5X2xpbWl0IjogMiwgInRvb2xzX25hbWVzIjogW119XUqHAgoKY3Jld190YXNrcxL4AQr1
|
||||
AVt7ImtleSI6ICIwNmE3MzIyMGY0MTQ4YTRiYmQ1YmFjYjBkMGI0NGZjZSIsICJpZCI6ICI4MDc3
|
||||
MDhjNS0yN2RkLTQ4ZDEtYTU0ZC1lZTRkNTZmMzBiZTQiLCAiYXN5bmNfZXhlY3V0aW9uPyI6IGZh
|
||||
bHNlLCAiaHVtYW5faW5wdXQ/IjogZmFsc2UsICJhZ2VudF9yb2xlIjogInt0b3BpY30gUmVzZWFy
|
||||
Y2hlciIsICJhZ2VudF9rZXkiOiAiZjMzODZmNmQ4ZGE3NWFhNDE2YTZlMzEwMDUzZjc2OTgiLCAi
|
||||
dG9vbHNfbmFtZXMiOiBbXX1degIYAYUBAAEAABKOAgoQtqHg5uy2kZsnJlJTYgmZoxIIlgUHkQ7m
|
||||
LugqDFRhc2sgQ3JlYXRlZDABOTi470NffgkYQQg98ENffgkYSi4KCGNyZXdfa2V5EiIKIGVlNjc0
|
||||
NWQ3YzhhZTgyZTAwZGY5NGRlMGY3Zjg3MTE4SjEKB2NyZXdfaWQSJgokZTgzMTdjNzEtNmZiZS00
|
||||
MjI5LWE3MzctYTkxM2I0ZmU0ZTU0Si4KCHRhc2tfa2V5EiIKIDA2YTczMjIwZjQxNDhhNGJiZDVi
|
||||
YWNiMGQwYjQ0ZmNlSjEKB3Rhc2tfaWQSJgokODA3NzA4YzUtMjdkZC00OGQxLWE1NGQtZWU0ZDU2
|
||||
ZjMwYmU0egIYAYUBAAEAABKzBwoQpfKhpM9cCoiT5Mun1aoNQhII4HhX0QHHc/0qDENyZXcgQ3Jl
|
||||
YXRlZDABORiH20lffgkYQcix3UlffgkYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODAuMEoaCg5w
|
||||
eXRob25fdmVyc2lvbhIICgYzLjEyLjVKLgoIY3Jld19rZXkSIgogZWU2NzQ1ZDdjOGFlODJlMDBk
|
||||
Zjk0ZGUwZjdmODcxMThKMQoHY3Jld19pZBImCiRmZDk2MmQwMi0wNGY0LTQ3NDUtODc5YS02NTFm
|
||||
MzFmMmZhOTZKHAoMY3Jld19wcm9jZXNzEgwKCnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAA
|
||||
ShoKFGNyZXdfbnVtYmVyX29mX3Rhc2tzEgIYAUobChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgB
|
||||
StkCCgtjcmV3X2FnZW50cxLJAgrGAlt7ImtleSI6ICJmMzM4NmY2ZDhkYTc1YWE0MTZhNmUzMTAw
|
||||
NTNmNzY5OCIsICJpZCI6ICIzNmZhMTEyZS02ZDVlLTRhMzgtODk0Yy01M2M5YjAzNTI5ODUiLCAi
|
||||
cm9sZSI6ICJ7dG9waWN9IFJlc2VhcmNoZXIiLCAidmVyYm9zZT8iOiBmYWxzZSwgIm1heF9pdGVy
|
||||
IjogMjAsICJtYXhfcnBtIjogbnVsbCwgImZ1bmN0aW9uX2NhbGxpbmdfbGxtIjogIiIsICJsbG0i
|
||||
OiAiZ3B0LTRvLW1pbmkiLCAiZGVsZWdhdGlvbl9lbmFibGVkPyI6IGZhbHNlLCAiYWxsb3dfY29k
|
||||
ZV9leGVjdXRpb24/IjogZmFsc2UsICJtYXhfcmV0cnlfbGltaXQiOiAyLCAidG9vbHNfbmFtZXMi
|
||||
OiBbXX1dSocCCgpjcmV3X3Rhc2tzEvgBCvUBW3sia2V5IjogIjA2YTczMjIwZjQxNDhhNGJiZDVi
|
||||
YWNiMGQwYjQ0ZmNlIiwgImlkIjogIjY3NTE3ZjY1LThhYzMtNDIyZi1hMmJhLTM4NDcyZDRkYmZl
|
||||
NSIsICJhc3luY19leGVjdXRpb24/IjogZmFsc2UsICJodW1hbl9pbnB1dD8iOiBmYWxzZSwgImFn
|
||||
ZW50X3JvbGUiOiAie3RvcGljfSBSZXNlYXJjaGVyIiwgImFnZW50X2tleSI6ICJmMzM4NmY2ZDhk
|
||||
YTc1YWE0MTZhNmUzMTAwNTNmNzY5OCIsICJ0b29sc19uYW1lcyI6IFtdfV16AhgBhQEAAQAAEo4C
|
||||
ChAzGUzQMDZOgJ090im3887lEgik7+/nVnqntioMVGFzayBDcmVhdGVkMAE5UO7rSV9+CRhBWEDs
|
||||
SV9+CRhKLgoIY3Jld19rZXkSIgogZWU2NzQ1ZDdjOGFlODJlMDBkZjk0ZGUwZjdmODcxMThKMQoH
|
||||
Y3Jld19pZBImCiRmZDk2MmQwMi0wNGY0LTQ3NDUtODc5YS02NTFmMzFmMmZhOTZKLgoIdGFza19r
|
||||
ZXkSIgogMDZhNzMyMjBmNDE0OGE0YmJkNWJhY2IwZDBiNDRmY2VKMQoHdGFza19pZBImCiQ2NzUx
|
||||
N2Y2NS04YWMzLTQyMmYtYTJiYS0zODQ3MmQ0ZGJmZTV6AhgBhQEAAQAAErMHChCB1TPvVbWX62DF
|
||||
102NfOHLEghdZ/LjI40W8SoMQ3JldyBDcmVhdGVkMAE5sJDUT19+CRhBEHXWT19+CRhKGgoOY3Jl
|
||||
d2FpX3ZlcnNpb24SCAoGMC44MC4wShoKDnB5dGhvbl92ZXJzaW9uEggKBjMuMTIuNUouCghjcmV3
|
||||
X2tleRIiCiBlZTY3NDVkN2M4YWU4MmUwMGRmOTRkZTBmN2Y4NzExOEoxCgdjcmV3X2lkEiYKJDUx
|
||||
YmI1NGQ0LWM4MTAtNDA0Yy04MTQzLWVmNTgwMTlhN2Q2OEocCgxjcmV3X3Byb2Nlc3MSDAoKc2Vx
|
||||
dWVudGlhbEoRCgtjcmV3X21lbW9yeRICEABKGgoUY3Jld19udW1iZXJfb2ZfdGFza3MSAhgBShsK
|
||||
FWNyZXdfbnVtYmVyX29mX2FnZW50cxICGAFK2QIKC2NyZXdfYWdlbnRzEskCCsYCW3sia2V5Ijog
|
||||
ImYzMzg2ZjZkOGRhNzVhYTQxNmE2ZTMxMDA1M2Y3Njk4IiwgImlkIjogIjRlNTExYTRhLTM1Yzkt
|
||||
NDA0NC1iMzBlLWM4OGZjZTJiMzc5YiIsICJyb2xlIjogInt0b3BpY30gUmVzZWFyY2hlciIsICJ2
|
||||
ZXJib3NlPyI6IGZhbHNlLCAibWF4X2l0ZXIiOiAyMCwgIm1heF9ycG0iOiBudWxsLCAiZnVuY3Rp
|
||||
b25fY2FsbGluZ19sbG0iOiAiIiwgImxsbSI6ICJncHQtNG8tbWluaSIsICJkZWxlZ2F0aW9uX2Vu
|
||||
YWJsZWQ/IjogZmFsc2UsICJhbGxvd19jb2RlX2V4ZWN1dGlvbj8iOiBmYWxzZSwgIm1heF9yZXRy
|
||||
eV9saW1pdCI6IDIsICJ0b29sc19uYW1lcyI6IFtdfV1KhwIKCmNyZXdfdGFza3MS+AEK9QFbeyJr
|
||||
ZXkiOiAiMDZhNzMyMjBmNDE0OGE0YmJkNWJhY2IwZDBiNDRmY2UiLCAiaWQiOiAiOTIzMWJmZjIt
|
||||
ODFmOS00MjU4LTgyMDktMjkzMjUyOWI1ZjlmIiwgImFzeW5jX2V4ZWN1dGlvbj8iOiBmYWxzZSwg
|
||||
Imh1bWFuX2lucHV0PyI6IGZhbHNlLCAiYWdlbnRfcm9sZSI6ICJ7dG9waWN9IFJlc2VhcmNoZXIi
|
||||
LCAiYWdlbnRfa2V5IjogImYzMzg2ZjZkOGRhNzVhYTQxNmE2ZTMxMDA1M2Y3Njk4IiwgInRvb2xz
|
||||
X25hbWVzIjogW119XXoCGAGFAQABAAASjgIKECTO19lNFYzBivlrqiZfSxASCIAH8VhjiPfQKgxU
|
||||
YXNrIENyZWF0ZWQwATnIr+NPX34JGEHQAeRPX34JGEouCghjcmV3X2tleRIiCiBlZTY3NDVkN2M4
|
||||
YWU4MmUwMGRmOTRkZTBmN2Y4NzExOEoxCgdjcmV3X2lkEiYKJDUxYmI1NGQ0LWM4MTAtNDA0Yy04
|
||||
MTQzLWVmNTgwMTlhN2Q2OEouCgh0YXNrX2tleRIiCiAwNmE3MzIyMGY0MTQ4YTRiYmQ1YmFjYjBk
|
||||
MGI0NGZjZUoxCgd0YXNrX2lkEiYKJDkyMzFiZmYyLTgxZjktNDI1OC04MjA5LTI5MzI1MjliNWY5
|
||||
ZnoCGAGFAQABAAASswcKEHGb7KITfOkYfQT7CRjWfUcSCIn6YlQJ1QVbKgxDcmV3IENyZWF0ZWQw
|
||||
ATmYH/BYX34JGEEgJ/JYX34JGEoaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjgwLjBKGgoOcHl0aG9u
|
||||
X3ZlcnNpb24SCAoGMy4xMi41Si4KCGNyZXdfa2V5EiIKIGVlNjc0NWQ3YzhhZTgyZTAwZGY5NGRl
|
||||
MGY3Zjg3MTE4SjEKB2NyZXdfaWQSJgokZDc5Y2UyMWUtYmU1Ny00NTdiLWExMzEtNjZkMDFmZjQx
|
||||
ZTI2ShwKDGNyZXdfcHJvY2VzcxIMCgpzZXF1ZW50aWFsShEKC2NyZXdfbWVtb3J5EgIQAEoaChRj
|
||||
cmV3X251bWJlcl9vZl90YXNrcxICGAFKGwoVY3Jld19udW1iZXJfb2ZfYWdlbnRzEgIYAUrZAgoL
|
||||
Y3Jld19hZ2VudHMSyQIKxgJbeyJrZXkiOiAiZjMzODZmNmQ4ZGE3NWFhNDE2YTZlMzEwMDUzZjc2
|
||||
OTgiLCAiaWQiOiAiNzRhNDUxNzgtNmExOS00N2RjLThlZjktZDdhZmQ5YzUwMDQ0IiwgInJvbGUi
|
||||
OiAie3RvcGljfSBSZXNlYXJjaGVyIiwgInZlcmJvc2U/IjogZmFsc2UsICJtYXhfaXRlciI6IDIw
|
||||
LCAibWF4X3JwbSI6IG51bGwsICJmdW5jdGlvbl9jYWxsaW5nX2xsbSI6ICIiLCAibGxtIjogImdw
|
||||
dC00by1taW5pIiwgImRlbGVnYXRpb25fZW5hYmxlZD8iOiBmYWxzZSwgImFsbG93X2NvZGVfZXhl
|
||||
Y3V0aW9uPyI6IGZhbHNlLCAibWF4X3JldHJ5X2xpbWl0IjogMiwgInRvb2xzX25hbWVzIjogW119
|
||||
XUqHAgoKY3Jld190YXNrcxL4AQr1AVt7ImtleSI6ICIwNmE3MzIyMGY0MTQ4YTRiYmQ1YmFjYjBk
|
||||
MGI0NGZjZSIsICJpZCI6ICJjZWZiYjE1ZS01Y2M4LTQwZTctYTViMS03ODkzYjJlZGFkYmQiLCAi
|
||||
YXN5bmNfZXhlY3V0aW9uPyI6IGZhbHNlLCAiaHVtYW5faW5wdXQ/IjogZmFsc2UsICJhZ2VudF9y
|
||||
b2xlIjogInt0b3BpY30gUmVzZWFyY2hlciIsICJhZ2VudF9rZXkiOiAiZjMzODZmNmQ4ZGE3NWFh
|
||||
NDE2YTZlMzEwMDUzZjc2OTgiLCAidG9vbHNfbmFtZXMiOiBbXX1degIYAYUBAAEAAA==
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '11392'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.27.0
|
||||
method: POST
|
||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Tue, 19 Nov 2024 22:14:39 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
version: 1
|
||||
205
tests/cassettes/test_llm_callback_replacement.yaml
Normal file
@@ -0,0 +1,205 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: '{"messages": [{"role": "user", "content": "Hello, world!"}], "model": "gpt-4o-mini",
|
||||
"stream": false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '101'
|
||||
content-type:
|
||||
- application/json
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- x64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- Linux
|
||||
x-stainless-package-version:
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.9
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAA4xSwWrcMBS8+ytedY6LvWvYZi8lpZSkBJLSQiChGK307FUi66nSc9Ml7L8H2e56
|
||||
l7bQiw8zb8Yzg14yAGG0WINQW8mq8za/+Oqv5MUmXv+8+/Hl3uO3j59u1efreHO+/PAszpKCNo+o
|
||||
+LfqraLOW2RDbqRVQMmYXMvVsqyWy1VVDERHGm2StZ7zivLOOJMvikWVF6u8fDept2QURrGGhwwA
|
||||
4GX4ppxO4y+xhsFrQDqMUbYo1ocjABHIJkTIGE1k6ViczaQix+iG6JdoLb2BS3oGJR1cwSiAHfXA
|
||||
pOXu/bEwYNNHmcK73toJ3x+SWGp9oE2c+APeGGfitg4oI7n018jkxcDuM4DvQ+P+pITwgTrPNdMT
|
||||
umRYlqOdmHeeyfOJY2JpZ3gxjXRqVmtkaWw8GkwoqbaoZ+W8ruy1oSMiO6r8Z5a/eY+1jWv/x34m
|
||||
lELPqGsfUBt12nc+C5ge4b/ODhMPgUXcRcauboxrMfhgxifQ+LrYyEKXi6opRbbPXgEAAP//AwAM
|
||||
DMWoEAMAAA==
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8e185b2c1b790303-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Tue, 12 Nov 2024 17:49:00 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Set-Cookie:
|
||||
- __cf_bm=l.QrRLcNZkML_KSfxjir6YCV35B8GNTitBTNh7cPGc4-1731433740-1.0.1.1-j1ejlmykyoI8yk6i6pQjtPoovGzfxI2f5vG6u0EqodQMjCvhbHfNyN_wmYkeT._BMvFi.zDQ8m_PqEHr8tSdEQ;
|
||||
path=/; expires=Tue, 12-Nov-24 18:19:00 GMT; domain=.api.openai.com; HttpOnly;
|
||||
Secure; SameSite=None
|
||||
- _cfuvid=jcCDyMK__Fd0V5DMeqt9yXdlKc7Hsw87a1K01pZu9l0-1731433740848-0.0.1.1-604800000;
|
||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- user-tqfegqsiobpvvjmn0giaipdq
|
||||
openai-processing-ms:
|
||||
- '322'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '200000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '199978'
|
||||
x-ratelimit-reset-requests:
|
||||
- 8.64s
|
||||
x-ratelimit-reset-tokens:
|
||||
- 6ms
|
||||
x-request-id:
|
||||
- req_037288753767e763a51a04eae757ca84
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "user", "content": "Hello, world from another agent!"}],
|
||||
"model": "gpt-4o-mini", "stream": false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '120'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=l.QrRLcNZkML_KSfxjir6YCV35B8GNTitBTNh7cPGc4-1731433740-1.0.1.1-j1ejlmykyoI8yk6i6pQjtPoovGzfxI2f5vG6u0EqodQMjCvhbHfNyN_wmYkeT._BMvFi.zDQ8m_PqEHr8tSdEQ;
|
||||
_cfuvid=jcCDyMK__Fd0V5DMeqt9yXdlKc7Hsw87a1K01pZu9l0-1731433740848-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- x64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- Linux
|
||||
x-stainless-package-version:
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.9
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAA4xSy27bMBC86yu2PFuBZAt14UvRU5MA7aVAEKAIBJpcSUwoLkuu6jiB/z3QI5aM
|
||||
tkAvPMzsDGZ2+ZoACKPFDoRqJKvW2/TLD3+z//oiD8dfL7d339zvW125x9zX90/3mVj1Cto/ouJ3
|
||||
1ZWi1ltkQ26kVUDJ2Lvm201ebDbbIh+IljTaXlZ7TgtKW+NMus7WRZpt0/zTpG7IKIxiBz8TAIDX
|
||||
4e1zOo3PYgfZ6h1pMUZZo9idhwBEINsjQsZoIkvHYjWTihyjG6Jfo7X0Ab4bhcAEipxDxXAw3IB0
|
||||
xA0GkDU6voJrOoCSDm5gNIUjdcCk5fHz0jxg1UXZF3SdtRN+Oqe1VPtA+zjxZ7wyzsSmDCgjuT5Z
|
||||
ZPJiYE8JwMOwle6iqPCBWs8l0xO63jAvRjsx32JBfpxIJpZ2xjfTJi/dSo0sjY2LrQolVYN6Vs4n
|
||||
kJ02tCCSRec/w/zNe+xtXP0/9jOhFHpGXfqA2qjLwvNYwP6n/mvsvOMhsIjHyNiWlXE1Bh/M+E8q
|
||||
X2Z7mel8XVS5SE7JGwAAAP//AwA/cK4yNQMAAA==
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8e185b31398a0303-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Tue, 12 Nov 2024 17:49:02 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- user-tqfegqsiobpvvjmn0giaipdq
|
||||
openai-processing-ms:
|
||||
- '889'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '200000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9998'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '199975'
|
||||
x-ratelimit-reset-requests:
|
||||
- 16.489s
|
||||
x-ratelimit-reset-tokens:
|
||||
- 7ms
|
||||
x-request-id:
|
||||
- req_bde3810b36a4859688e53d1df64bdd20
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
version: 1
|
||||
565
tests/cassettes/test_multiple_before_after_crew.yaml
Normal file
@@ -0,0 +1,565 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: !!binary |
|
||||
CoBACiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkS1z8KEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRKQDAoQsz1SskG+EjJPD44lvCSFCxIIlcKcY48gxQAqDENyZXcgQ3JlYXRlZDABORht
|
||||
kkfU8QgYQZgEmUfU8QgYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODAuMEoaCg5weXRob25fdmVy
|
||||
c2lvbhIICgYzLjExLjdKLgoIY3Jld19rZXkSIgogMWYxMjhiZGI3YmFhNGI2NzcxNGYxZGFlZGMy
|
||||
ZjNhYjZKMQoHY3Jld19pZBImCiQ0MzA0ODRhNS0zODM3LTRkZDktOTBmYS1kMTg3NDM0MjRmZDRK
|
||||
HAoMY3Jld19wcm9jZXNzEgwKCnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdf
|
||||
bnVtYmVyX29mX3Rhc2tzEgIYAkobChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgCSqoFCgtjcmV3
|
||||
X2FnZW50cxKaBQqXBVt7ImtleSI6ICI3M2MzNDljOTNjMTYzYjVkNGRmOThhNjRmYWMxYzQzMCIs
|
||||
ICJpZCI6ICI0NWU2MTcyOC04MDQzLTQ4ZTUtYjY1YS1mZjAxM2E5OGIwZjMiLCAicm9sZSI6ICJ7
|
||||
dG9waWN9IFNlbmlvciBEYXRhIFJlc2VhcmNoZXJcbiIsICJ2ZXJib3NlPyI6IHRydWUsICJtYXhf
|
||||
aXRlciI6IDIwLCAibWF4X3JwbSI6IG51bGwsICJmdW5jdGlvbl9jYWxsaW5nX2xsbSI6ICIiLCAi
|
||||
bGxtIjogImdwdC00byIsICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogZmFsc2UsICJhbGxvd19jb2Rl
|
||||
X2V4ZWN1dGlvbj8iOiBmYWxzZSwgIm1heF9yZXRyeV9saW1pdCI6IDIsICJ0b29sc19uYW1lcyI6
|
||||
IFtdfSwgeyJrZXkiOiAiMTA0ZmUwNjU5ZTEwYjQyNmNmODhmMDI0ZmI1NzE1NTMiLCAiaWQiOiAi
|
||||
MTgyOGQ3NTktYzgzMS00YTBhLTk5YmQtNzU4OWM3ZGMzNjM1IiwgInJvbGUiOiAie3RvcGljfSBS
|
||||
ZXBvcnRpbmcgQW5hbHlzdFxuIiwgInZlcmJvc2U/IjogdHJ1ZSwgIm1heF9pdGVyIjogMjAsICJt
|
||||
YXhfcnBtIjogbnVsbCwgImZ1bmN0aW9uX2NhbGxpbmdfbGxtIjogIiIsICJsbG0iOiAiZ3B0LTRv
|
||||
IiwgImRlbGVnYXRpb25fZW5hYmxlZD8iOiBmYWxzZSwgImFsbG93X2NvZGVfZXhlY3V0aW9uPyI6
|
||||
IGZhbHNlLCAibWF4X3JldHJ5X2xpbWl0IjogMiwgInRvb2xzX25hbWVzIjogW119XUqTBAoKY3Jl
|
||||
d190YXNrcxKEBAqBBFt7ImtleSI6ICI2YWZjNGIzOTYyNTlmYmI3NjgxZjU2Yzc3NTVjYzkzNyIs
|
||||
ICJpZCI6ICIxZWQ2YmQ5Yy0wNTFhLTRhYjUtYjQ5NC01NzI1NDY1OWIyODQiLCAiYXN5bmNfZXhl
|
||||
Y3V0aW9uPyI6IGZhbHNlLCAiaHVtYW5faW5wdXQ/IjogZmFsc2UsICJhZ2VudF9yb2xlIjogInt0
|
||||
b3BpY30gU2VuaW9yIERhdGEgUmVzZWFyY2hlclxuIiwgImFnZW50X2tleSI6ICI3M2MzNDljOTNj
|
||||
MTYzYjVkNGRmOThhNjRmYWMxYzQzMCIsICJ0b29sc19uYW1lcyI6IFtdfSwgeyJrZXkiOiAiYjE3
|
||||
YjE4OGRiZjE0ZjkzYTk4ZTViOTVhYWQzNjc1NzciLCAiaWQiOiAiZWE2MmJjM2EtMGYzNi00NGFl
|
||||
LWJiYTktOWJlY2RmNTFlNGE4IiwgImFzeW5jX2V4ZWN1dGlvbj8iOiBmYWxzZSwgImh1bWFuX2lu
|
||||
cHV0PyI6IGZhbHNlLCAiYWdlbnRfcm9sZSI6ICJ7dG9waWN9IFJlcG9ydGluZyBBbmFseXN0XG4i
|
||||
LCAiYWdlbnRfa2V5IjogIjEwNGZlMDY1OWUxMGI0MjZjZjg4ZjAyNGZiNTcxNTUzIiwgInRvb2xz
|
||||
X25hbWVzIjogW119XXoCGAGFAQABAAASjgIKEJohcMRpBfZqBQCPSM3m2SQSCIXDnmVDTos7KgxU
|
||||
YXNrIENyZWF0ZWQwATnAMqxH1PEIGEHAr6xH1PEIGEouCghjcmV3X2tleRIiCiAxZjEyOGJkYjdi
|
||||
YWE0YjY3NzE0ZjFkYWVkYzJmM2FiNkoxCgdjcmV3X2lkEiYKJDQzMDQ4NGE1LTM4MzctNGRkOS05
|
||||
MGZhLWQxODc0MzQyNGZkNEouCgh0YXNrX2tleRIiCiA2YWZjNGIzOTYyNTlmYmI3NjgxZjU2Yzc3
|
||||
NTVjYzkzN0oxCgd0YXNrX2lkEiYKJDFlZDZiZDljLTA1MWEtNGFiNS1iNDk0LTU3MjU0NjU5YjI4
|
||||
NHoCGAGFAQABAAASjgIKEFme5tfbl7IKlOtxKXOBwbkSCEkXsxaQXDs+KgxUYXNrIENyZWF0ZWQw
|
||||
ATloDuBJ1PEIGEGABOFJ1PEIGEouCghjcmV3X2tleRIiCiAxZjEyOGJkYjdiYWE0YjY3NzE0ZjFk
|
||||
YWVkYzJmM2FiNkoxCgdjcmV3X2lkEiYKJDQzMDQ4NGE1LTM4MzctNGRkOS05MGZhLWQxODc0MzQy
|
||||
NGZkNEouCgh0YXNrX2tleRIiCiBiMTdiMTg4ZGJmMTRmOTNhOThlNWI5NWFhZDM2NzU3N0oxCgd0
|
||||
YXNrX2lkEiYKJGVhNjJiYzNhLTBmMzYtNDRhZS1iYmE5LTliZWNkZjUxZTRhOHoCGAGFAQABAAAS
|
||||
kAwKEFJJx8FrKG3eBx+YFpoVWbsSCPJRoDF/VPvQKgxDcmV3IENyZWF0ZWQwATnYWWZO1PEIGEGw
|
||||
JGlO1PEIGEoaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjgwLjBKGgoOcHl0aG9uX3ZlcnNpb24SCAoG
|
||||
My4xMS43Si4KCGNyZXdfa2V5EiIKIDFmMTI4YmRiN2JhYTRiNjc3MTRmMWRhZWRjMmYzYWI2SjEK
|
||||
B2NyZXdfaWQSJgokYmViNmFlOTEtYTdjMS00YWVmLTg0ZjUtMzNhZjUwYTc3NjVhShwKDGNyZXdf
|
||||
cHJvY2VzcxIMCgpzZXF1ZW50aWFsShEKC2NyZXdfbWVtb3J5EgIQAEoaChRjcmV3X251bWJlcl9v
|
||||
Zl90YXNrcxICGAJKGwoVY3Jld19udW1iZXJfb2ZfYWdlbnRzEgIYAkqqBQoLY3Jld19hZ2VudHMS
|
||||
mgUKlwVbeyJrZXkiOiAiNzNjMzQ5YzkzYzE2M2I1ZDRkZjk4YTY0ZmFjMWM0MzAiLCAiaWQiOiAi
|
||||
MTk5NmFkYzctODQzNy00ODM3LThhYWYtNzQ1NWFlNjU1MzNkIiwgInJvbGUiOiAie3RvcGljfSBT
|
||||
ZW5pb3IgRGF0YSBSZXNlYXJjaGVyXG4iLCAidmVyYm9zZT8iOiB0cnVlLCAibWF4X2l0ZXIiOiAy
|
||||
MCwgIm1heF9ycG0iOiBudWxsLCAiZnVuY3Rpb25fY2FsbGluZ19sbG0iOiAiIiwgImxsbSI6ICJn
|
||||
cHQtNG8iLCAiZGVsZWdhdGlvbl9lbmFibGVkPyI6IGZhbHNlLCAiYWxsb3dfY29kZV9leGVjdXRp
|
||||
b24/IjogZmFsc2UsICJtYXhfcmV0cnlfbGltaXQiOiAyLCAidG9vbHNfbmFtZXMiOiBbXX0sIHsi
|
||||
a2V5IjogIjEwNGZlMDY1OWUxMGI0MjZjZjg4ZjAyNGZiNTcxNTUzIiwgImlkIjogIjhiYTA2NGM3
|
||||
LWQ2NTItNGFmMi1hYjQ0LWJhNDA2NDRkMDJlMSIsICJyb2xlIjogInt0b3BpY30gUmVwb3J0aW5n
|
||||
IEFuYWx5c3RcbiIsICJ2ZXJib3NlPyI6IHRydWUsICJtYXhfaXRlciI6IDIwLCAibWF4X3JwbSI6
|
||||
IG51bGwsICJmdW5jdGlvbl9jYWxsaW5nX2xsbSI6ICIiLCAibGxtIjogImdwdC00byIsICJkZWxl
|
||||
Z2F0aW9uX2VuYWJsZWQ/IjogZmFsc2UsICJhbGxvd19jb2RlX2V4ZWN1dGlvbj8iOiBmYWxzZSwg
|
||||
Im1heF9yZXRyeV9saW1pdCI6IDIsICJ0b29sc19uYW1lcyI6IFtdfV1KkwQKCmNyZXdfdGFza3MS
|
||||
hAQKgQRbeyJrZXkiOiAiNmFmYzRiMzk2MjU5ZmJiNzY4MWY1NmM3NzU1Y2M5MzciLCAiaWQiOiAi
|
||||
YTQzMTgxZjUtYzkwYi00ZDk0LWI1NTAtMTI2ZmNjM2Y4NjU3IiwgImFzeW5jX2V4ZWN1dGlvbj8i
|
||||
OiBmYWxzZSwgImh1bWFuX2lucHV0PyI6IGZhbHNlLCAiYWdlbnRfcm9sZSI6ICJ7dG9waWN9IFNl
|
||||
bmlvciBEYXRhIFJlc2VhcmNoZXJcbiIsICJhZ2VudF9rZXkiOiAiNzNjMzQ5YzkzYzE2M2I1ZDRk
|
||||
Zjk4YTY0ZmFjMWM0MzAiLCAidG9vbHNfbmFtZXMiOiBbXX0sIHsia2V5IjogImIxN2IxODhkYmYx
|
||||
NGY5M2E5OGU1Yjk1YWFkMzY3NTc3IiwgImlkIjogImE3NzczMTA2LTEwNDEtNDI2ZS05NjVhLTRj
|
||||
YzExYjNkMjhiNyIsICJhc3luY19leGVjdXRpb24/IjogZmFsc2UsICJodW1hbl9pbnB1dD8iOiBm
|
||||
YWxzZSwgImFnZW50X3JvbGUiOiAie3RvcGljfSBSZXBvcnRpbmcgQW5hbHlzdFxuIiwgImFnZW50
|
||||
X2tleSI6ICIxMDRmZTA2NTllMTBiNDI2Y2Y4OGYwMjRmYjU3MTU1MyIsICJ0b29sc19uYW1lcyI6
|
||||
IFtdfV16AhgBhQEAAQAAEo4CChCfS8p2nUmN4gbwYTXM780iEgiD3bJ3GYKbCyoMVGFzayBDcmVh
|
||||
dGVkMAE5oK91TtTxCBhBoCx2TtTxCBhKLgoIY3Jld19rZXkSIgogMWYxMjhiZGI3YmFhNGI2Nzcx
|
||||
NGYxZGFlZGMyZjNhYjZKMQoHY3Jld19pZBImCiRiZWI2YWU5MS1hN2MxLTRhZWYtODRmNS0zM2Fm
|
||||
NTBhNzc2NWFKLgoIdGFza19rZXkSIgogNmFmYzRiMzk2MjU5ZmJiNzY4MWY1NmM3NzU1Y2M5MzdK
|
||||
MQoHdGFza19pZBImCiRhNDMxODFmNS1jOTBiLTRkOTQtYjU1MC0xMjZmY2MzZjg2NTd6AhgBhQEA
|
||||
AQAAEo4CChD6YglTfqRETeD91myfQQucEgjVnKDpP/acdSoMVGFzayBDcmVhdGVkMAE54FB+UNTx
|
||||
CBhBOOl+UNTxCBhKLgoIY3Jld19rZXkSIgogMWYxMjhiZGI3YmFhNGI2NzcxNGYxZGFlZGMyZjNh
|
||||
YjZKMQoHY3Jld19pZBImCiRiZWI2YWU5MS1hN2MxLTRhZWYtODRmNS0zM2FmNTBhNzc2NWFKLgoI
|
||||
dGFza19rZXkSIgogYjE3YjE4OGRiZjE0ZjkzYTk4ZTViOTVhYWQzNjc1NzdKMQoHdGFza19pZBIm
|
||||
CiRhNzc3MzEwNi0xMDQxLTQyNmUtOTY1YS00Y2MxMWIzZDI4Yjd6AhgBhQEAAQAAEpAMChD6odHS
|
||||
2zFS6x5ow3o0vrPMEgjfWriALnu/VyoMQ3JldyBDcmVhdGVkMAE5gE7+UNTxCBhB6IQAUdTxCBhK
|
||||
GgoOY3Jld2FpX3ZlcnNpb24SCAoGMC44MC4wShoKDnB5dGhvbl92ZXJzaW9uEggKBjMuMTEuN0ou
|
||||
CghjcmV3X2tleRIiCiAxZjEyOGJkYjdiYWE0YjY3NzE0ZjFkYWVkYzJmM2FiNkoxCgdjcmV3X2lk
|
||||
EiYKJDAxN2I1M2M0LWU0N2EtNDRjMy1hZTZhLWE4NDc2N2Q3MDEwMkocCgxjcmV3X3Byb2Nlc3MS
|
||||
DAoKc2VxdWVudGlhbEoRCgtjcmV3X21lbW9yeRICEABKGgoUY3Jld19udW1iZXJfb2ZfdGFza3MS
|
||||
AhgCShsKFWNyZXdfbnVtYmVyX29mX2FnZW50cxICGAJKqgUKC2NyZXdfYWdlbnRzEpoFCpcFW3si
|
||||
a2V5IjogIjczYzM0OWM5M2MxNjNiNWQ0ZGY5OGE2NGZhYzFjNDMwIiwgImlkIjogImE1YjNmZjcz
|
||||
LTVjMmQtNDI5Ny05M2ExLTY2NDMyOWJmNGNiYiIsICJyb2xlIjogInt0b3BpY30gU2VuaW9yIERh
|
||||
dGEgUmVzZWFyY2hlclxuIiwgInZlcmJvc2U/IjogdHJ1ZSwgIm1heF9pdGVyIjogMjAsICJtYXhf
|
||||
cnBtIjogbnVsbCwgImZ1bmN0aW9uX2NhbGxpbmdfbGxtIjogIiIsICJsbG0iOiAiZ3B0LTRvIiwg
|
||||
ImRlbGVnYXRpb25fZW5hYmxlZD8iOiBmYWxzZSwgImFsbG93X2NvZGVfZXhlY3V0aW9uPyI6IGZh
|
||||
bHNlLCAibWF4X3JldHJ5X2xpbWl0IjogMiwgInRvb2xzX25hbWVzIjogW119LCB7ImtleSI6ICIx
|
||||
MDRmZTA2NTllMTBiNDI2Y2Y4OGYwMjRmYjU3MTU1MyIsICJpZCI6ICJiMTU4Y2RjNS1iYzZmLTQ3
|
||||
YTktYjE2NS1kNDQ2YzFmODhkNDUiLCAicm9sZSI6ICJ7dG9waWN9IFJlcG9ydGluZyBBbmFseXN0
|
||||
XG4iLCAidmVyYm9zZT8iOiB0cnVlLCAibWF4X2l0ZXIiOiAyMCwgIm1heF9ycG0iOiBudWxsLCAi
|
||||
ZnVuY3Rpb25fY2FsbGluZ19sbG0iOiAiIiwgImxsbSI6ICJncHQtNG8iLCAiZGVsZWdhdGlvbl9l
|
||||
bmFibGVkPyI6IGZhbHNlLCAiYWxsb3dfY29kZV9leGVjdXRpb24/IjogZmFsc2UsICJtYXhfcmV0
|
||||
cnlfbGltaXQiOiAyLCAidG9vbHNfbmFtZXMiOiBbXX1dSpMECgpjcmV3X3Rhc2tzEoQECoEEW3si
|
||||
a2V5IjogIjZhZmM0YjM5NjI1OWZiYjc2ODFmNTZjNzc1NWNjOTM3IiwgImlkIjogImY0ZjZiZWI0
|
||||
LWRhOGQtNDU1MC05ZWVhLThlOWM5NzY1NWNlNCIsICJhc3luY19leGVjdXRpb24/IjogZmFsc2Us
|
||||
ICJodW1hbl9pbnB1dD8iOiBmYWxzZSwgImFnZW50X3JvbGUiOiAie3RvcGljfSBTZW5pb3IgRGF0
|
||||
YSBSZXNlYXJjaGVyXG4iLCAiYWdlbnRfa2V5IjogIjczYzM0OWM5M2MxNjNiNWQ0ZGY5OGE2NGZh
|
||||
YzFjNDMwIiwgInRvb2xzX25hbWVzIjogW119LCB7ImtleSI6ICJiMTdiMTg4ZGJmMTRmOTNhOThl
|
||||
NWI5NWFhZDM2NzU3NyIsICJpZCI6ICJlMzA0ZWI4ZC1jYTViLTRmYmMtYmRiNS1mODZlNjE3ZWJl
|
||||
ZDYiLCAiYXN5bmNfZXhlY3V0aW9uPyI6IGZhbHNlLCAiaHVtYW5faW5wdXQ/IjogZmFsc2UsICJh
|
||||
Z2VudF9yb2xlIjogInt0b3BpY30gUmVwb3J0aW5nIEFuYWx5c3RcbiIsICJhZ2VudF9rZXkiOiAi
|
||||
MTA0ZmUwNjU5ZTEwYjQyNmNmODhmMDI0ZmI1NzE1NTMiLCAidG9vbHNfbmFtZXMiOiBbXX1degIY
|
||||
AYUBAAEAABKOAgoQ4nBhY/C/2pvPVCF+gQM4/xIIZfcNXrNnmOMqDFRhc2sgQ3JlYXRlZDABORiU
|
||||
EVHU8QgYQUgJElHU8QgYSi4KCGNyZXdfa2V5EiIKIDFmMTI4YmRiN2JhYTRiNjc3MTRmMWRhZWRj
|
||||
MmYzYWI2SjEKB2NyZXdfaWQSJgokMDE3YjUzYzQtZTQ3YS00NGMzLWFlNmEtYTg0NzY3ZDcwMTAy
|
||||
Si4KCHRhc2tfa2V5EiIKIDZhZmM0YjM5NjI1OWZiYjc2ODFmNTZjNzc1NWNjOTM3SjEKB3Rhc2tf
|
||||
aWQSJgokZjRmNmJlYjQtZGE4ZC00NTUwLTllZWEtOGU5Yzk3NjU1Y2U0egIYAYUBAAEAABKOAgoQ
|
||||
/NE8mExW1Yc4m8BzpCjK4BII6h0SL5AwtM8qDFRhc2sgQ3JlYXRlZDABObAHR1HU8QgYQeB8R1HU
|
||||
8QgYSi4KCGNyZXdfa2V5EiIKIDFmMTI4YmRiN2JhYTRiNjc3MTRmMWRhZWRjMmYzYWI2SjEKB2Ny
|
||||
ZXdfaWQSJgokMDE3YjUzYzQtZTQ3YS00NGMzLWFlNmEtYTg0NzY3ZDcwMTAySi4KCHRhc2tfa2V5
|
||||
EiIKIGIxN2IxODhkYmYxNGY5M2E5OGU1Yjk1YWFkMzY3NTc3SjEKB3Rhc2tfaWQSJgokZTMwNGVi
|
||||
OGQtY2E1Yi00ZmJjLWJkYjUtZjg2ZTYxN2ViZWQ2egIYAYUBAAEAABKQDAoQ3X0ZU0/DPkeXYe62
|
||||
8W/vqRII03hxvkN2cu4qDENyZXcgQ3JlYXRlZDABOUDsalnU8QgYQZAmbVnU8QgYShoKDmNyZXdh
|
||||
aV92ZXJzaW9uEggKBjAuODAuMEoaCg5weXRob25fdmVyc2lvbhIICgYzLjExLjdKLgoIY3Jld19r
|
||||
ZXkSIgogMWYxMjhiZGI3YmFhNGI2NzcxNGYxZGFlZGMyZjNhYjZKMQoHY3Jld19pZBImCiQ5NjNm
|
||||
ODBkMy1jYTJkLTQ4MDktODRmMC1hNjgxNzEzM2YzOWFKHAoMY3Jld19wcm9jZXNzEgwKCnNlcXVl
|
||||
bnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdfbnVtYmVyX29mX3Rhc2tzEgIYAkobChVj
|
||||
cmV3X251bWJlcl9vZl9hZ2VudHMSAhgCSqoFCgtjcmV3X2FnZW50cxKaBQqXBVt7ImtleSI6ICI3
|
||||
M2MzNDljOTNjMTYzYjVkNGRmOThhNjRmYWMxYzQzMCIsICJpZCI6ICJmNDFlMTI2YS0wZTNhLTRm
|
||||
OWEtODdhNi05MzQ3ZGU5YTUxODQiLCAicm9sZSI6ICJ7dG9waWN9IFNlbmlvciBEYXRhIFJlc2Vh
|
||||
cmNoZXJcbiIsICJ2ZXJib3NlPyI6IHRydWUsICJtYXhfaXRlciI6IDIwLCAibWF4X3JwbSI6IG51
|
||||
bGwsICJmdW5jdGlvbl9jYWxsaW5nX2xsbSI6ICIiLCAibGxtIjogImdwdC00byIsICJkZWxlZ2F0
|
||||
aW9uX2VuYWJsZWQ/IjogZmFsc2UsICJhbGxvd19jb2RlX2V4ZWN1dGlvbj8iOiBmYWxzZSwgIm1h
|
||||
eF9yZXRyeV9saW1pdCI6IDIsICJ0b29sc19uYW1lcyI6IFtdfSwgeyJrZXkiOiAiMTA0ZmUwNjU5
|
||||
ZTEwYjQyNmNmODhmMDI0ZmI1NzE1NTMiLCAiaWQiOiAiMGZkMzJmYzAtNGY4Mi00OWNiLWI1MDct
|
||||
MWZmZTU5YzBlNWRmIiwgInJvbGUiOiAie3RvcGljfSBSZXBvcnRpbmcgQW5hbHlzdFxuIiwgInZl
|
||||
cmJvc2U/IjogdHJ1ZSwgIm1heF9pdGVyIjogMjAsICJtYXhfcnBtIjogbnVsbCwgImZ1bmN0aW9u
|
||||
X2NhbGxpbmdfbGxtIjogIiIsICJsbG0iOiAiZ3B0LTRvIiwgImRlbGVnYXRpb25fZW5hYmxlZD8i
|
||||
OiBmYWxzZSwgImFsbG93X2NvZGVfZXhlY3V0aW9uPyI6IGZhbHNlLCAibWF4X3JldHJ5X2xpbWl0
|
||||
IjogMiwgInRvb2xzX25hbWVzIjogW119XUqTBAoKY3Jld190YXNrcxKEBAqBBFt7ImtleSI6ICI2
|
||||
YWZjNGIzOTYyNTlmYmI3NjgxZjU2Yzc3NTVjYzkzNyIsICJpZCI6ICI1YjlkYTE0Ni0xOGQ1LTQ1
|
||||
NjctODYwNi1hMTU3MGU3YzQyYTgiLCAiYXN5bmNfZXhlY3V0aW9uPyI6IGZhbHNlLCAiaHVtYW5f
|
||||
aW5wdXQ/IjogZmFsc2UsICJhZ2VudF9yb2xlIjogInt0b3BpY30gU2VuaW9yIERhdGEgUmVzZWFy
|
||||
Y2hlclxuIiwgImFnZW50X2tleSI6ICI3M2MzNDljOTNjMTYzYjVkNGRmOThhNjRmYWMxYzQzMCIs
|
||||
ICJ0b29sc19uYW1lcyI6IFtdfSwgeyJrZXkiOiAiYjE3YjE4OGRiZjE0ZjkzYTk4ZTViOTVhYWQz
|
||||
Njc1NzciLCAiaWQiOiAiZDI1ZjVlZmQtZjMzNC00YjRjLWE1NjktNTI2ZjAyZGY3MDAzIiwgImFz
|
||||
eW5jX2V4ZWN1dGlvbj8iOiBmYWxzZSwgImh1bWFuX2lucHV0PyI6IGZhbHNlLCAiYWdlbnRfcm9s
|
||||
ZSI6ICJ7dG9waWN9IFJlcG9ydGluZyBBbmFseXN0XG4iLCAiYWdlbnRfa2V5IjogIjEwNGZlMDY1
|
||||
OWUxMGI0MjZjZjg4ZjAyNGZiNTcxNTUzIiwgInRvb2xzX25hbWVzIjogW119XXoCGAGFAQABAAAS
|
||||
jgIKEDpNX1u5dj3lqgySCZra2sISCEMmeziSVN2NKgxUYXNrIENyZWF0ZWQwATmoV3lZ1PEIGEHQ
|
||||
93lZ1PEIGEouCghjcmV3X2tleRIiCiAxZjEyOGJkYjdiYWE0YjY3NzE0ZjFkYWVkYzJmM2FiNkox
|
||||
CgdjcmV3X2lkEiYKJDk2M2Y4MGQzLWNhMmQtNDgwOS04NGYwLWE2ODE3MTMzZjM5YUouCgh0YXNr
|
||||
X2tleRIiCiA2YWZjNGIzOTYyNTlmYmI3NjgxZjU2Yzc3NTVjYzkzN0oxCgd0YXNrX2lkEiYKJDVi
|
||||
OWRhMTQ2LTE4ZDUtNDU2Ny04NjA2LWExNTcwZTdjNDJhOHoCGAGFAQABAAA=
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '8195'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.27.0
|
||||
method: POST
|
||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Mon, 18 Nov 2024 03:19:16 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are plants Senior Data
|
||||
Researcher\n. You''re a seasoned researcher with a knack for uncovering the
|
||||
latest developments in plants. Known for your ability to find the most relevant
|
||||
information and present it in a clear and concise manner.\n\nYour personal goal
|
||||
is: Uncover cutting-edge developments in plants\n\nTo give my best complete
|
||||
final answer to the task use the exact following format:\n\nThought: I now can
|
||||
give a great answer\nFinal Answer: Your final answer must be the great and the
|
||||
most complete as possible, it must be outcome described.\n\nI MUST use these
|
||||
formats, my job depends on it!"}, {"role": "user", "content": "\nCurrent Task:
|
||||
Conduct a thorough research about plants Make sure you find any interesting
|
||||
and relevant information given the current year is 2024.\n\n\nThis is the expect
|
||||
criteria for your final answer: A list with 10 bullet points of the most relevant
|
||||
information about plants\n\nyou MUST return the actual complete content as the
|
||||
final answer, not a summary.\n\nBegin! This is VERY important to you, use the
|
||||
tools available and give your best Final Answer, your job depends on it!\n\nThought:"}],
|
||||
"model": "gpt-4o", "stop": ["\nObservation:"], "stream": false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '1245'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- _cfuvid=74kaPOoAcp8YRSA0XocQ1FFNksu9V0_KiWdQfo7wQuQ-1731827382509-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAA3xXXY/dNg59z68g7ssCwZ3BzDTJJPOWBskiQD8GM9kN0HYR0BJtsyOLriTbuS36
|
||||
3wtSvl/dxb5c4FqiRB6dcyj98Qxgw35zBxvXY3HDGC7e/mvA26Upu8/fP/w4v8dFuh8fH68/h1e7
|
||||
7qfNViOk+ZVc2UddOhnGQIUl1mGXCAvpqte331y/fvPmzctrGxjEU9CwbiwXL+Ti5urmxcXV64ur
|
||||
V2tgL+wob+7g52cAAH/Yr6YYPX3d3MHVdv9loJyxo83dYRLAJknQLxvMmXPBWDbb46CTWCha1p96
|
||||
mbq+3MFHiLKAwwgdzwQInaYOGPNCCT5wxABv7c8d/BJ/ideX8Pz5vykVdhjgA6aBYwdv/YzR0UCx
|
||||
5OfP7+BjBK1sC/N+ZrvO5AyZChSBRLOESTHj3wl6WWAh6JIsMAaMJUOzgwG/8sC/a2Ae0RFg9JAo
|
||||
y5QcAbUtO6bodpfwAy1QyPVRgnRMGTAEWaCVBFNqMAJ2id0UypQIFi49DBx5wADkLEST5GFEV7Yw
|
||||
Zd2x3/kko0R22fZFOvwtAroWz1gI2kS5hzGJnxwBR3BcdkBx5iTRILlU5G4UuXcPHx/vH2y5e60S
|
||||
/kmRCjtDbR08lLGDHjNgxdZD5i5yyw5jCbstUMQmaJ5jIseZoKMoAwF5LoZ0XIG8hE89r4BkQ6T0
|
||||
BJ5mCjJqfiAtuCRjhtLr2SeCQRIp0JVEWq/nTJgpb2GkXPLWanCBB4XA9Rg72sIoSjDGEHbAUVVg
|
||||
SO6YgreAVsRDJjclxWiRFPzCngygbxSgb1k8z5Syjr+TmCnNqCRRgN63raRS4a9DVGuE5jSsx5mg
|
||||
Q47koSR0Gr6th86RC2PhmRQJVw9aImQiDw3Gp3rUjRSMRokOk6doGFI2ylkGyINmMSaqWSiiXT1K
|
||||
OCYSyalI085QPyGaIhuUucYXDW6V3XoOiqQmdcIfDFquHqvEyqUXxiVMjUR4pN8myiUZSvBAmTC5
|
||||
XuEyhuV/gJqC8bIGOBxNBZyhISM6pkg5k6/HTm1LTiFSlhlqaV20QkZeEaPYY3QarunXpS9yzYWS
|
||||
fseGAxfVorRQElHFNgsHKH1SA1LFJZnJr8c4Y2KyEJ0Z9GfAiJ05C4x2lo4qBi8Vg/t+VyTRQJ5r
|
||||
/Z9OPKA6UZTZDhymbAiv7qIcCoQRphFGCWEq5M9Eu4WnKEsEzDD+bZftmTAvD6BTyqaeVVsmzrpb
|
||||
HsmZdlUZpCdMlJSeAthkSQ30hPMOBioYavVSekpQ5CvHDG2SwYA29HR4wUJpC8h+Ffs5YRLlIpUT
|
||||
htYrRetxF0tvHP1WCb5TgMwbaKBkrGtNqRxnCSqRRJ7MdvaVqND2JB6TKL2pYmkdDyIt0E7RJIfr
|
||||
6efJ9Ypiw9JOFNRBekwDOpqsN6xeUuudMUzYBAJtqTJFv7cvjoWS5+x4DBxVUziOSdD1ICPFbFvj
|
||||
THGianKW70WDytejpRoYtyaf6l0XD6sWC7xTD1RMPivpz70NRjGvODFhJTFhpdJJd9meHv+JrWqP
|
||||
VTWpo3qgr0UJBQuh1X0U+AEvbxIp1TeDiM8qWYQiI4yJRS300vqs2leGJhE+rcLKlRGVZ39LZECO
|
||||
BbU9WMsqPKtXTdGTdYZMkEuinCVVob1WtN6GDunQ7/eqkmh41UFl/tH0ww7MKjiaqmXSlPyUS1JO
|
||||
+IkUttITp5M+bk2rlyLZmJq5SiHhyN7uBqW/PN1c56Ntvr9faBJ6z0ncTNYGi+yJd6hXBXzuhZWB
|
||||
ecqKi9EPf5twf11oiXyF4o1C8d6zzqj9+x7dE6p0qpZIjS6qrhdMPh+XVCu08hL5ya16yirFBXMh
|
||||
M5RQDYHP6zvh8Vabu+497nfd9yXtf9Ql9JZ8Xi9W1Y0GfDKXJ/QLWg61C2NLZbc9JmSJVCTUlKux
|
||||
YO0Jp9AcNgcMhVI0dzV4rq8Unw+i5qOmuHcguE+i92Ujy3eYOrrIDoPeMFqbXGcdmmuiSkdNtwvS
|
||||
mGta0x8aLODPojTfHhsuWCBIPrTqcd3TnKNLKuWSsIoMg5l7IN/tb4PiKcVji6k3VK2AtHHnXS40
|
||||
qHUlGaTQ2YWjYjZw4c4MYzWOepvMl6dX8ETtlFFfAHEKYf3+5+FOH6QbkzR5HT98bzly7r+otCTq
|
||||
/T0XGTc2+uczgP/Y22E6ew5sNNOxfCnyRFEXvLm5rettjq+V4+jLm/VlsSlSMBwHbl/sw84W/OKp
|
||||
IId88vzYOHQ9+WPo8a2Ck2c5GXh2UvZ/p/O/1q6lc+z+3/J/AQAA///ClEhOTi0oSU2JLyhKTclM
|
||||
RvUyQllRKiil4FIGD2awg5UgSSE+LTMvPbWooCgT0qFKK4g3MU1OMzVJSU1MVeKq5QIAAAD//wMA
|
||||
IimVsFkOAAA=
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8e44d146eed6a423-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Mon, 18 Nov 2024 03:19:25 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Set-Cookie:
|
||||
- __cf_bm=TExeV_B53ShoY.Ag2_Czvi.2L9gx.ekuTvv6twEsyZs-1731899965-1.0.1.1-TI1CwjC1TYPFLagqlZnBGPwghLqfQ14IMBF7MxpAfc1ZVDU6ahhzFq9sUtZDpajNRPyefUF9MUCXzF8vfGAyPw;
|
||||
path=/; expires=Mon, 18-Nov-24 03:49:25 GMT; domain=.api.openai.com; HttpOnly;
|
||||
Secure; SameSite=None
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '13765'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '30000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '29999712'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_2d538d2fe02ebd3efaa4c15c43b6f88a
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: !!binary |
|
||||
Cs4CCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSpQIKEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRKOAgoQGHdygqOzofI91Tjg07fdjRIIXwWg5RZ9w0oqDFRhc2sgQ3JlYXRlZDABOej4
|
||||
TaXX8QgYQcgbUKXX8QgYSi4KCGNyZXdfa2V5EiIKIDFmMTI4YmRiN2JhYTRiNjc3MTRmMWRhZWRj
|
||||
MmYzYWI2SjEKB2NyZXdfaWQSJgokOTYzZjgwZDMtY2EyZC00ODA5LTg0ZjAtYTY4MTcxMzNmMzlh
|
||||
Si4KCHRhc2tfa2V5EiIKIGIxN2IxODhkYmYxNGY5M2E5OGU1Yjk1YWFkMzY3NTc3SjEKB3Rhc2tf
|
||||
aWQSJgokZDI1ZjVlZmQtZjMzNC00YjRjLWE1NjktNTI2ZjAyZGY3MDAzegIYAYUBAAEAAA==
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '337'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.27.0
|
||||
method: POST
|
||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Mon, 18 Nov 2024 03:19:26 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are plants Reporting Analyst\n.
|
||||
You''re a meticulous analyst with a keen eye for detail. You''re known for your
|
||||
ability to turn complex data into clear and concise reports, making it easy
|
||||
for others to understand and act on the information you provide.\n\nYour personal
|
||||
goal is: Create detailed reports based on plants data analysis and research
|
||||
findings\n\nTo give my best complete final answer to the task use the exact
|
||||
following format:\n\nThought: I now can give a great answer\nFinal Answer: Your
|
||||
final answer must be the great and the most complete as possible, it must be
|
||||
outcome described.\n\nI MUST use these formats, my job depends on it!"}, {"role":
|
||||
"user", "content": "\nCurrent Task: Review the context you got and expand each
|
||||
topic into a full section for a report. Make sure the report is detailed and
|
||||
contains any and all relevant information.\n\n\nThis is the expect criteria
|
||||
for your final answer: A fully fledge reports with the mains topics, each with
|
||||
a full section of information. Formatted as markdown without ''```''\n\nyou
|
||||
MUST return the actual complete content as the final answer, not a summary.\n\nThis
|
||||
is the context you''re working with:\n1. **Vertical Farming Advancements**:
|
||||
In 2024, vertical farming is set to revolutionize how we grow plants by maximizing
|
||||
space and resource efficiency. New technologies allow for urban agriculture
|
||||
with minimal ecological impact, using hydroponics and aeroponics to cultivate
|
||||
fresh produce in city environments.\n\n2. **CRISPR and Plant Genetics**: CRISPR
|
||||
technology has advanced significantly, enabling precise genome editing in plants.
|
||||
This allows for the development of crops that are more resistant to diseases,
|
||||
pests, and climate change, potentially increasing yield and food security worldwide.\n\n3.
|
||||
**Biodiversity Conservation**: Efforts to conserve plant biodiversity have gained
|
||||
traction, with initiatives focusing on seed banks and botanical gardens. These
|
||||
efforts aim to preserve the genetic diversity necessary for ecological resilience
|
||||
in the face of changing environmental conditions.\n\n4. **Carbon Sequestration
|
||||
Research**: Plants'' role in carbon capture is being harnessed more effectively,
|
||||
with research focused on enhancing the carbon-sequestering abilities of trees
|
||||
and soil through improved plant varieties and land management practices.\n\n5.
|
||||
**Phytoremediation Technologies**: Innovative use of plants to clean up polluted
|
||||
environments, known as phytoremediation, has advanced. Researchers are developing
|
||||
plants specifically engineered to absorb heavy metals and other toxins from
|
||||
the soil and water, aiding in environmental restoration.\n\n6. **Synthetic Botany**:
|
||||
This emerging field involves redesigning plant biological processes to create
|
||||
new functionalities such as biofuels, pharmaceuticals, and other valuable compounds.
|
||||
This interdisciplinary approach opens new avenues for plant-based technology.\n\n7.
|
||||
**Climate-Resilient Crops**: With climate change posing significant threats
|
||||
to agriculture, developing crops that can withstand extreme weather conditions
|
||||
such as drought and floods is a top priority. 2024 sees breakthroughs in engineering
|
||||
crops that maintain productivity under these stressors.\n\n8. **Algae Farming
|
||||
Innovations**: Algae are increasingly used in various industries due to their
|
||||
efficiency in photosynthesis and rapid growth. Innovations in algae farming
|
||||
are contributing to biofuel production, carbon capture, and sustainable aquaculture
|
||||
feeds.\n\n9. **Edible Plant Packaging**: The trend towards sustainability in
|
||||
reducing plastic waste has led to innovations in plant-based, edible packaging.
|
||||
These biodegradable solutions are making headway in food safety, reducing waste,
|
||||
and providing a more sustainable packaging alternative.\n\n10. **Forest Restoration
|
||||
Projects**: Large-scale reforestation efforts are underway globally to combat
|
||||
deforestation and habitat loss. These projects integrate traditional knowledge
|
||||
with modern practices to restore ecosystems, promote biodiversity, and mitigate
|
||||
climate impacts.\n\nBegin! This is VERY important to you, use the tools available
|
||||
and give your best Final Answer, your job depends on it!\n\nThought:"}], "model":
|
||||
"gpt-4o", "stop": ["\nObservation:"], "stream": false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '4283'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- _cfuvid=74kaPOoAcp8YRSA0XocQ1FFNksu9V0_KiWdQfo7wQuQ-1731827382509-0.0.1.1-604800000;
|
||||
__cf_bm=TExeV_B53ShoY.Ag2_Czvi.2L9gx.ekuTvv6twEsyZs-1731899965-1.0.1.1-TI1CwjC1TYPFLagqlZnBGPwghLqfQ14IMBF7MxpAfc1ZVDU6ahhzFq9sUtZDpajNRPyefUF9MUCXzF8vfGAyPw
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.11.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAA4xa244cN5J911cEel6rC7qNbemtpbW9gnd3BMk7LzMDgcWMzKSbSeaQzCqlBwbm
|
||||
N+b35ksWJ0jmpaoN7IshdyaZwbicOHFY/3hGdGeau7d0p3uV9DDa+4f/HdR/jQ8P7/7j54+v3v30
|
||||
+JP5/HLw89D9Z/vnx7sDVvjTL6xTXXXUfhgtJ+NdfqwDq8TY9cW3r1589+bNm2/+KA8G37DFsm5M
|
||||
96/9/cvnL1/fP//u/vk3ZWHvjeZ495b+8oyI6B/yX5joGv5695aeH+pfBo5RdXz3dnmJ6C54i7/c
|
||||
qRhNTMqlu8P6UHuX2InVP/d+6vr0lj6Q8xfSylFnzkyKOphOysULh7+6v7ofjFOWHuT/3+IPf6BP
|
||||
PPqQyDv6aJVL9ME5f1Y4fSTlGvrenU3wbmCXlKUPziSjkjlzJBxX9vgDvTjSnzkko5WlH1QYjOvo
|
||||
oTkrpxkLI1774GTFgc71zba8aSKN3kRuKHlKQbnY+jDQFE7KkeqC0ZNNU2A6zeTHZAbzK5bFUWkW
|
||||
G41DjCL+GDj6KWgmblujDTs9H+nn3sTNRsrSwKn3WHj2Fmfpgr9g+QgfRDJusdLOFJPSj9yQVTOH
|
||||
eCDfJnZkXOIuIDPwT08+9RwopjBpGBspTronFSk+zlEHNXKI5ANNOKiKpGhUcHxbDmqVa6JWo3Ed
|
||||
LGZKrHvnre9msnzmoDqO1M9N8KN3RscDXXqj+989xMWk3k+JojeWJvHOYBzj+G5KwbDDMztJrA/i
|
||||
SMXbzTlw3UsFlu1xbEJUTMBZBhMT8Zohu2/KKSKT2iQCtUoba5JKTKlnQkBMTjd4og0cexqDbybN
|
||||
splxpE2atx/JJ8NZzKAssYaPJKHMMCqdjsi2n3imEztujYRT26nJXwzcTLp+T5IN+Z8t0D6mnPU8
|
||||
mBiLX+JiECrrVB2hrY8ckLLauzgNHOKRfpgC0mDwgZ9IdGWtv0RqfaCZVbgPfnLNzgcNx9EkJm3N
|
||||
AB9p7xpTAsQuTgHb4Hsmovqp9b6hOI2jnY9L6Z5lXQreUpxj4mGX0GJNpJ7tCOMDd5NV6ycPZE3X
|
||||
p5wPS54gAW08oLxgLYoWrkiJA82GbZPd9mT1PSDX88Kni58HDp34B2/qMGmj7JKaAgpIlV5Zyw5F
|
||||
MHrU0GnOlWN+zb6DAYvfeuU6PhZ4enmk958+fP74Sd7JOPcjO05GCzSh2tQ4WqOXTCzvb2rQuFwM
|
||||
1JWV1OfU6AJH2BNN50xrtHLJzgiXOlkpx8DaRMY6PzAxAuo6HItdj8ogHfwIDxkLl3EBrM23S+JE
|
||||
uDQZJGnyNPjGtPNqlB+4lMbk8E1u2AGdlNZTUHo+UOtjYkkhOLRBTP0odetbMSJS6tEwAhNSWGyS
|
||||
zoPvNSayihwPNHJMFTGaM4f4VMLW8hczi2fhs3Ouw9EjgRHo5CuAc86lXIHIdi4JznoKQIGLD7a5
|
||||
mCZH9jZGqAQf4B5UuhpNc33KgmdyTJSzatQopzurYPwUt0CDJEwSXKk+xKr4zgRSJ+DYjKXwObzU
|
||||
UBOkEwvOt9b7RsD8ockuQTM5lMy6z27hpvgdtgg0Ze845kZwQvc8SL3A50abhnOltagja36VhiTV
|
||||
bk5TzSuJXZxiUgZJyPveNwalE4hJybOcn8h742pyE7vOOM7JMgY/mMiRToHVY+rlkNWHfrINtZNr
|
||||
VHaZnUvxyTkuaqbLCvMlngZnjaN3kWFuZ/1J2U2B17p9daR3xjcGKQZfv8eKkKESr3zftjnaGYM5
|
||||
nEvDotN2mTQvZZx0QA8zpwHFoHNT9A4uwOpaGdUJ6xaONRhamCUom5ZzU7XAsxhLaudmyUEh9SQf
|
||||
q3vwqX2qbStnS7MWGoGUOCn3mBPg5JNyYkKnQsMu4uQzWIU5e+wH9ggDklSh3rgO6Ay/iZs/r9sW
|
||||
L/DOB6jqgLP4dqmR7OM4sja8bUxYp87K2Focvi0+XPerLSK3wXbKxC4wNyXVuqAGKaDAMfmQLR6D
|
||||
B0ePR3p3c26EV9noa+c4UI2limTNOdPC0UeTfDAcFxjYxBce9W2b8x1lqHLBkhdAmRCPYrKA13Sy
|
||||
RleEK+7MLTWqlrtJhYYUci4mWWB93Hx38VwmcLmGBBlbsAJWuhc72hZnRoKy9rWVC+qupLjWyusj
|
||||
vVfh5B195r9PHFPx3CeOrILu8don1oDBUP5Evel66fYlpPHf//zXTf5oNabCO2T/W7jBAQeTTLd0
|
||||
z6s2TLVQlSC6FgqMU20xtWx/H7P5ORQ5kUrQUuCCfutCIbeBpZeAK2RwAhMM/sxNcTgSl2UbrAbT
|
||||
pkE51Qkp3QOiMNZxDF7pHu+bQRBVfcXYUcBZVSD1QAUQkRzISpZP0YcTqTT4OPYcjKb3f3pJTfbi
|
||||
2Pvk4+xQlyYbhDxnMgkJm70g9Mr4QcXyBvi0UBU0OwMPwayT9znBIldreKW7kS1rCZIAxlJj2xzM
|
||||
hAGJUL8sRdchHOPqfZmyMBbDYeL3TXf5XX9mzxmHN9eK5hy0npVN/ZNDwpJgnBlqM0mstw4txlZe
|
||||
qizIl+vycUpD4bU9qKZBI/8dfvjHI33sZ5g2cGNyFv9cWYVh4Yg3L0jsyKycu2ZNmbUq0/CkLSsn
|
||||
h1KDcTIxbseZwwrx8AuceQHokpL6z0x39NZOSXa05pHhvfOMMVbZHJ3kvxq0jlLjO39KF9gzyjI0
|
||||
7vt8tlgyoy2zb8OgtXk0L3ntA5lh8CehHyX5ehWGdrIUpxOYkAa4YRDi00zKNGVg2De8Db7n1IaJ
|
||||
ZTAXLJZpwMd0n4FQfOya/S52phYjChqrsomDy8HISkLlXTkE01i2jzKULDC4cHhQ2+swp00eCH6d
|
||||
ODeU1hS3VBYPaFjnnmzq11FiI2wUnAjYWPNg4aHF75kC5i9zrYsNfY3Ue9sUTadwMulIgftcqUpK
|
||||
M6cKiD84dZ3L0PvQU9ZGgoiWKjSiXOCzpSK+OdJngSi0bGm5cx2UlmmtFa7uW4rLm0JK5lWRCJzz
|
||||
Z4Wdk1mo0wpYqBER2cjxBWRS57Ct2LOZzipthf4SGhO1Ga1xoGYyHmngHj/KpsV5N6S/zvMn49tJ
|
||||
8GNEAivNkwynZbbJis5Z2UlgDrogBvZ4oFPwqmFXO1fUfuSlvd+flGhZS7GJU9/NZWBbfYHqPXlr
|
||||
4iCfG1XqL2rOU7ZkJkoA9hbRqwxOYnzttCfjlS4jfzEPBQq2BW2KANeB7SyeXatzl3iNd//+579S
|
||||
7Wt2rg6qylkFtkxUAp/LaA6TKic0rpki1ALMIwjmdZfYVKcEp/UxGksSgDpKbtjA1dy26bk4fW7V
|
||||
myFymyBP5iRK96xiOqyLLIQ11RQWsxnfUSXOD0bf4s0iKtVK+fZI73NTuf9UZoFE7zHT4YWH65Yj
|
||||
5W9c6d8QMsBWxYOph9tyXW5Ez2XmPTw5tZdvh+XbeZ5E2Cgh94PxmJ2P9DmrBy18shvkTFZm89iv
|
||||
HgWsE2yLBoETiMtdYhlWRwmpb9fZV8DlawKA0YWVlM460iwtbpmQZYjFiBxBniwvRKGMAZLjZxhe
|
||||
gCfy0myxYW1gC61Bgq88T4RW/HmZYzYyRK7NzKKkm1Yl4UqSSd5yyNiO7mc8NqqSQM7AzmPMi3mp
|
||||
KBYxiW9WZQ7mVF1PGEMVq+1MHNAAdQ1jRf3fiyoipIMpCpoP67Qh/GiTNeLenXYCCXCygAWYF7hb
|
||||
JP489In49siQu4EcWfESeaQUyFUiZ7F1qYPvjvRgO8WL/r+5SJBKkIcb1a9O5CpV8t5M4mgotmth
|
||||
w+wnICZzXKwvIL7BxUNlh3l04YzmOyz6+6Sqp1pmIQRiHxwsU9G8tPJEQMpb0i7XLMFf6lx/qLWT
|
||||
eh4k0A1DQdp8ts6+4rOHK4amdg7a0Q4ZDZeRZqsXFz5zqAMKlm40tN0dyHUFyVnZjiW8G/WpuK/1
|
||||
Po3BZKC5dXNuWsGfTbNDd/Eobaf8jbsPK50PbI0UFwpyw9VaE/uBVb4/oAGnbZVmMJopLbpCdpdI
|
||||
FjJz7RyNhikHWBSGmjvXrULOYNWcledSWXX0BcnyeYAibYKerAoHeKK0+Nwm5loCb470fSOQmSXm
|
||||
j0o/KlClSp0Gf84TUvIXKARbk8f68nonIwKzzSRzD31bonGAorzb4UifgbbrjiaKIIbrqgYf2wRh
|
||||
tAq6Il1UTBk16iR/I1BlXgm2LoF5SlgtfmmNHTZ06xLUmMGodm/Ac2ZXnXSAhe43JrCGQjFalWsj
|
||||
q6DenTNIwI5sckae76/OTs7jMtPOJZ9jOdlpSlkkqm0mFnhULSe54nONCgCC/1YNUxv8sEhfpUlE
|
||||
VheGv4PRfFiPULQkDiYPgnn0iU+HtFwhlSujhpT1dWrdkLvrgXZJlJL+cmEtRCr3Tq2qru+RlPfY
|
||||
Xgtmbu6m5JajKaXh25u0EelSBpYqGWS+t7vXKLBPvb9s8HiliGsS1qvh50f6wWMAgRS1CHofi6CH
|
||||
t37M47rZiJ4IulWh4/uolSBnK3uUm54symg/nBQ8v3vmIKWdMAxl1S17aC3lvTS8uTNaJayKllV1
|
||||
vL7x3cHVRgx+dP5iuem43FD6hoOrtzbgXTtlpKoheUD7ZTqzywq5OGsd1I5ZursVQ7OUJvo1UlCy
|
||||
PyPwRpqtQtn22IcrEW1zhDp8rYwtaxGtsVVRzM1vqxWJv+ZcX+w6qEfWa9G0h6Fop9vbp514ZM5s
|
||||
TS8wsNz7FA6IU17lfsG2fT7wRl6MzA5mn00W1YfRu1UEYWqhdy7a7BWnkTFEusWGo5VLpazDdP7M
|
||||
oVxD4+3/+fFP8epaSmamkH9W0XBSxub7qKqEm6tfWGzzfrlkrLPs7QCyCm2HVcBdNNRdC9sOeXI7
|
||||
IIxOlDChxjf3LpBF0q0ex0K1BdbmRXnIly5Vp9uDBMapTYy3rUIO9OT9yfbXLYHbKSr8uMZN1pa/
|
||||
/7b8XMb6bgz+FMvz5e+tcSb2XxA77/DTmJj8eCdPf3tG9Df5Wc60+6XNHc4zpi/JP7LDht9+823e
|
||||
7279IdD69MXr16/K4wSZfH3y8uWL8nue/ZZfSg5sfttzpyEuN+va9YdAamqM3zx4tjn4rUFP7Z0P
|
||||
b1z3/9l+faA1j4mbL2Pgxuj9odfXAv8ikuTTry2OFoPvMn59aY3rOAiRREja8cuL169Op+9ev3mp
|
||||
75799uz/AAAA//8DAD3tznO2JQAA
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8e44d19eed83a423-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Mon, 18 Nov 2024 03:19:42 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '17464'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '30000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '29998958'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 2ms
|
||||
x-request-id:
|
||||
- req_9b9ed646842761c5b091045f5e85bfd3
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
version: 1
|
||||
495
tests/cassettes/test_multiple_before_after_kickoff.yaml
Normal file
@@ -0,0 +1,495 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: !!binary |
|
||||
CusOCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSwg4KEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRKaDAoQoIrabVYsFbbHfYiDTst34xIIG0YGNNs8p2gqDENyZXcgQ3JlYXRlZDABOYhW
|
||||
grn2rgkYQUBQhbn2rgkYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODAuMEoaCg5weXRob25fdmVy
|
||||
c2lvbhIICgYzLjEyLjdKLgoIY3Jld19rZXkSIgogMWYxMjhiZGI3YmFhNGI2NzcxNGYxZGFlZGMy
|
||||
ZjNhYjZKMQoHY3Jld19pZBImCiQzM2Q1NTk3MS1iZmI2LTQ5MTgtODNhZC1iZWMxZmEyYzc0NjhK
|
||||
HAoMY3Jld19wcm9jZXNzEgwKCnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdf
|
||||
bnVtYmVyX29mX3Rhc2tzEgIYAkobChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgCSrQFCgtjcmV3
|
||||
X2FnZW50cxKkBQqhBVt7ImtleSI6ICI3M2MzNDljOTNjMTYzYjVkNGRmOThhNjRmYWMxYzQzMCIs
|
||||
ICJpZCI6ICIwZGZjYzg3MS01ZGI5LTRkYjItOWIyNy0xN2I0MmIyZmZiMTAiLCAicm9sZSI6ICJ7
|
||||
dG9waWN9IFNlbmlvciBEYXRhIFJlc2VhcmNoZXJcbiIsICJ2ZXJib3NlPyI6IHRydWUsICJtYXhf
|
||||
aXRlciI6IDIwLCAibWF4X3JwbSI6IG51bGwsICJmdW5jdGlvbl9jYWxsaW5nX2xsbSI6ICIiLCAi
|
||||
bGxtIjogImdwdC00by1taW5pIiwgImRlbGVnYXRpb25fZW5hYmxlZD8iOiBmYWxzZSwgImFsbG93
|
||||
X2NvZGVfZXhlY3V0aW9uPyI6IGZhbHNlLCAibWF4X3JldHJ5X2xpbWl0IjogMiwgInRvb2xzX25h
|
||||
bWVzIjogW119LCB7ImtleSI6ICIxMDRmZTA2NTllMTBiNDI2Y2Y4OGYwMjRmYjU3MTU1MyIsICJp
|
||||
ZCI6ICJiZjFkODdkZC0zZmUyLTRjYTctOTI1My0xYTQyYTljNWE5NjYiLCAicm9sZSI6ICJ7dG9w
|
||||
aWN9IFJlcG9ydGluZyBBbmFseXN0XG4iLCAidmVyYm9zZT8iOiB0cnVlLCAibWF4X2l0ZXIiOiAy
|
||||
MCwgIm1heF9ycG0iOiBudWxsLCAiZnVuY3Rpb25fY2FsbGluZ19sbG0iOiAiIiwgImxsbSI6ICJn
|
||||
cHQtNG8tbWluaSIsICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogZmFsc2UsICJhbGxvd19jb2RlX2V4
|
||||
ZWN1dGlvbj8iOiBmYWxzZSwgIm1heF9yZXRyeV9saW1pdCI6IDIsICJ0b29sc19uYW1lcyI6IFtd
|
||||
fV1KkwQKCmNyZXdfdGFza3MShAQKgQRbeyJrZXkiOiAiNmFmYzRiMzk2MjU5ZmJiNzY4MWY1NmM3
|
||||
NzU1Y2M5MzciLCAiaWQiOiAiNjhhZmY3NzctODEwYy00N2Q0LTlmMjItMjBlY2VhY2Y3ZTFhIiwg
|
||||
ImFzeW5jX2V4ZWN1dGlvbj8iOiBmYWxzZSwgImh1bWFuX2lucHV0PyI6IGZhbHNlLCAiYWdlbnRf
|
||||
cm9sZSI6ICJ7dG9waWN9IFNlbmlvciBEYXRhIFJlc2VhcmNoZXJcbiIsICJhZ2VudF9rZXkiOiAi
|
||||
NzNjMzQ5YzkzYzE2M2I1ZDRkZjk4YTY0ZmFjMWM0MzAiLCAidG9vbHNfbmFtZXMiOiBbXX0sIHsi
|
||||
a2V5IjogImIxN2IxODhkYmYxNGY5M2E5OGU1Yjk1YWFkMzY3NTc3IiwgImlkIjogImNlZmFlNzU1
|
||||
LTMzNzctNGE3OS1hNGMyLTZkMDk5Yzk0YmRlYiIsICJhc3luY19leGVjdXRpb24/IjogZmFsc2Us
|
||||
ICJodW1hbl9pbnB1dD8iOiBmYWxzZSwgImFnZW50X3JvbGUiOiAie3RvcGljfSBSZXBvcnRpbmcg
|
||||
QW5hbHlzdFxuIiwgImFnZW50X2tleSI6ICIxMDRmZTA2NTllMTBiNDI2Y2Y4OGYwMjRmYjU3MTU1
|
||||
MyIsICJ0b29sc19uYW1lcyI6IFtdfV16AhgBhQEAAQAAEo4CChAk4SmgCGgI1cLD7bspORIREgiA
|
||||
ME8JXP1gfioMVGFzayBDcmVhdGVkMAE56EuWufauCRhBWOCWufauCRhKLgoIY3Jld19rZXkSIgog
|
||||
MWYxMjhiZGI3YmFhNGI2NzcxNGYxZGFlZGMyZjNhYjZKMQoHY3Jld19pZBImCiQzM2Q1NTk3MS1i
|
||||
ZmI2LTQ5MTgtODNhZC1iZWMxZmEyYzc0NjhKLgoIdGFza19rZXkSIgogNmFmYzRiMzk2MjU5ZmJi
|
||||
NzY4MWY1NmM3NzU1Y2M5MzdKMQoHdGFza19pZBImCiQ2OGFmZjc3Ny04MTBjLTQ3ZDQtOWYyMi0y
|
||||
MGVjZWFjZjdlMWF6AhgBhQEAAQAA
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '1902'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.27.0
|
||||
method: POST
|
||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Wed, 20 Nov 2024 13:05:09 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are plants Senior Data
|
||||
Researcher\n. You''re a seasoned researcher with a knack for uncovering the
|
||||
latest developments in plants. Known for your ability to find the most relevant
|
||||
information and present it in a clear and concise manner.\n\nYour personal goal
|
||||
is: Uncover cutting-edge developments in plants\n\nTo give my best complete
|
||||
final answer to the task use the exact following format:\n\nThought: I now can
|
||||
give a great answer\nFinal Answer: Your final answer must be the great and the
|
||||
most complete as possible, it must be outcome described.\n\nI MUST use these
|
||||
formats, my job depends on it!"}, {"role": "user", "content": "\nCurrent Task:
|
||||
Conduct a thorough research about plants Make sure you find any interesting
|
||||
and relevant information given the current year is 2024.\n\n\nThis is the expect
|
||||
criteria for your final answer: A list with 10 bullet points of the most relevant
|
||||
information about plants\n\nyou MUST return the actual complete content as the
|
||||
final answer, not a summary.\n\nBegin! This is VERY important to you, use the
|
||||
tools available and give your best Final Answer, your job depends on it!\n\nThought:"}],
|
||||
"model": "gpt-4o-mini", "stop": ["\nObservation:"], "stream": false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '1250'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=CkK4UvBd9ukXvn50uJwGambJcz5zERAJfeXJ9xge6H4-1732107842-1.0.1.1-IOK2yVL3RlD75MgmnKzIEyE38HNknwn6I8BBJ1wjGz4jCTd0YWIBPnvUm9gB8D_zLlUA9G7p_wbrfyc4mO_Bmg;
|
||||
_cfuvid=MmeN9oHWrBLThkEJdaSFHBfWe95JvA8iFnnt7CC92tk-1732107842102-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- x64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
content: "{\n \"id\": \"chatcmpl-AVegEGAMTASvlfwjAjy5PsqGwtN6X\",\n \"object\":
|
||||
\"chat.completion\",\n \"created\": 1732107906,\n \"model\": \"gpt-4o-mini-2024-07-18\",\n
|
||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
||||
\"assistant\",\n \"content\": \"I now can give a great answer \\nFinal
|
||||
Answer: \\n1. **Plant-Based Plastics**: 2024 has seen significant advancements
|
||||
in bioplastics derived from plants such as corn and sugarcane, which are being
|
||||
used as sustainable alternatives to petroleum-based plastics. These innovations
|
||||
aim to reduce plastic waste and reliance on fossil fuels.\\n\\n2. **Gene Editing
|
||||
Breakthroughs**: CRISPR technology continues to revolutionize agricultural practices.
|
||||
Researchers have developed genetically modified crops that are more resistant
|
||||
to pests and diseases, thereby decreasing the need for chemical pesticides and
|
||||
increasing food security.\\n\\n3. **Vertical Farming Expansion**: Urban agriculture
|
||||
has gained momentum with vertical farming techniques that utilize less water
|
||||
and space. In 2024, cities worldwide are adopting these innovative systems to
|
||||
meet the growing food demand while reducing transportation emissions.\\n\\n4.
|
||||
**Climate-Resilient Plants**: Scientists are identifying and breeding plant
|
||||
varieties that can withstand extreme weather conditions such as droughts, floods,
|
||||
and high temperatures, helping farmers adapt to climate change and ensuring
|
||||
crop yields remain stable.\\n\\n5. **Edible Vaccines in Plants**: Research into
|
||||
producing vaccines in plants has progressed, with trials showing that certain
|
||||
plants can be engineered to express antigens that could serve as edible vaccines,
|
||||
offering a low-cost and easy delivery method for immunizations.\\n\\n6. **Fungi
|
||||
and Plant Collaborations**: The study of mycorrhizal fungi and their symbiotic
|
||||
relationships with plants has gained traction. Research indicates that these
|
||||
fungi enhance nutrient uptake and enhance plant resilience to environmental
|
||||
stressors, making them pivotal in sustainable agriculture.\\n\\n7. **Biophilic
|
||||
Design**: The trend of integrating nature into architecture and urban planning
|
||||
has expanded, with an emphasis on incorporating plants into building designs
|
||||
to improve air quality, enhance mental well-being, and reduce overall energy
|
||||
consumption.\\n\\n8. **Deciduous Trees and Urban Cooling**: Research shows that
|
||||
strategically planted deciduous trees can significantly lower urban temperatures,
|
||||
reduce energy consumption for cooling, and improve urban biodiversity, highlighting
|
||||
the importance of green infrastructure.\\n\\n9. **Carnivorous Plant Studies**:
|
||||
The understanding of carnivorous plants and their unique adaptations has advanced,
|
||||
with findings suggesting potential applications in pest control and sustainable
|
||||
agriculture due to their natural predatory methods.\\n\\n10. **Smart Agriculture
|
||||
Technology**: The integration of IoT and AI in agriculture is revolutionizing
|
||||
plant cultivation. Farmers now use sensors and data analytics to monitor plant
|
||||
health in real-time, optimize water usage, and increase crop yields sustainably.\\n\\nThese
|
||||
insights underline the ongoing research and innovations in the plant science
|
||||
field that can lead to more sustainable and resilient agricultural practices
|
||||
while addressing critical global challenges.\",\n \"refusal\": null\n
|
||||
\ },\n \"logprobs\": null,\n \"finish_reason\": \"stop\"\n }\n
|
||||
\ ],\n \"usage\": {\n \"prompt_tokens\": 227,\n \"completion_tokens\":
|
||||
529,\n \"total_tokens\": 756,\n \"prompt_tokens_details\": {\n \"cached_tokens\":
|
||||
0,\n \"audio_tokens\": 0\n },\n \"completion_tokens_details\": {\n
|
||||
\ \"reasoning_tokens\": 0,\n \"audio_tokens\": 0,\n \"accepted_prediction_tokens\":
|
||||
0,\n \"rejected_prediction_tokens\": 0\n }\n },\n \"system_fingerprint\":
|
||||
\"fp_0705bf87c0\"\n}\n"
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8e58a64f3a306225-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Wed, 20 Nov 2024 13:05:13 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '7289'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-ratelimit-limit-requests:
|
||||
- '30000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '150000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '29999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '149999711'
|
||||
x-ratelimit-reset-requests:
|
||||
- 2ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_250abe3944c3c859e59c2c976b0a1248
|
||||
http_version: HTTP/1.1
|
||||
status_code: 200
|
||||
- request:
|
||||
body: !!binary |
|
||||
Cs4CCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSpQIKEgoQY3Jld2FpLnRl
|
||||
bGVtZXRyeRKOAgoQDrK/EDJCqiT+GyzEBtuJzBIIDx9rCaKAVKMqDFRhc2sgQ3JlYXRlZDABOTCH
|
||||
rYH4rgkYQdCQroH4rgkYSi4KCGNyZXdfa2V5EiIKIDFmMTI4YmRiN2JhYTRiNjc3MTRmMWRhZWRj
|
||||
MmYzYWI2SjEKB2NyZXdfaWQSJgokMzNkNTU5NzEtYmZiNi00OTE4LTgzYWQtYmVjMWZhMmM3NDY4
|
||||
Si4KCHRhc2tfa2V5EiIKIGIxN2IxODhkYmYxNGY5M2E5OGU1Yjk1YWFkMzY3NTc3SjEKB3Rhc2tf
|
||||
aWQSJgokY2VmYWU3NTUtMzM3Ny00YTc5LWE0YzItNmQwOTljOTRiZGViegIYAYUBAAEAAA==
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '337'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
User-Agent:
|
||||
- OTel-OTLP-Exporter-Python/1.27.0
|
||||
method: POST
|
||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
||||
response:
|
||||
body:
|
||||
string: "\n\0"
|
||||
headers:
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/x-protobuf
|
||||
Date:
|
||||
- Wed, 20 Nov 2024 13:05:19 GMT
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are plants Reporting Analyst\n.
|
||||
You''re a meticulous analyst with a keen eye for detail. You''re known for your
|
||||
ability to turn complex data into clear and concise reports, making it easy
|
||||
for others to understand and act on the information you provide.\n\nYour personal
|
||||
goal is: Create detailed reports based on plants data analysis and research
|
||||
findings\n\nTo give my best complete final answer to the task use the exact
|
||||
following format:\n\nThought: I now can give a great answer\nFinal Answer: Your
|
||||
final answer must be the great and the most complete as possible, it must be
|
||||
outcome described.\n\nI MUST use these formats, my job depends on it!"}, {"role":
|
||||
"user", "content": "\nCurrent Task: Review the context you got and expand each
|
||||
topic into a full section for a report. Make sure the report is detailed and
|
||||
contains any and all relevant information.\n\n\nThis is the expect criteria
|
||||
for your final answer: A fully fledge reports with the mains topics, each with
|
||||
a full section of information. Formatted as markdown without ''```''\n\nyou
|
||||
MUST return the actual complete content as the final answer, not a summary.\n\nThis
|
||||
is the context you''re working with:\n1. **Plant-Based Plastics**: 2024 has
|
||||
seen significant advancements in bioplastics derived from plants such as corn
|
||||
and sugarcane, which are being used as sustainable alternatives to petroleum-based
|
||||
plastics. These innovations aim to reduce plastic waste and reliance on fossil
|
||||
fuels.\n\n2. **Gene Editing Breakthroughs**: CRISPR technology continues to
|
||||
revolutionize agricultural practices. Researchers have developed genetically
|
||||
modified crops that are more resistant to pests and diseases, thereby decreasing
|
||||
the need for chemical pesticides and increasing food security.\n\n3. **Vertical
|
||||
Farming Expansion**: Urban agriculture has gained momentum with vertical farming
|
||||
techniques that utilize less water and space. In 2024, cities worldwide are
|
||||
adopting these innovative systems to meet the growing food demand while reducing
|
||||
transportation emissions.\n\n4. **Climate-Resilient Plants**: Scientists are
|
||||
identifying and breeding plant varieties that can withstand extreme weather
|
||||
conditions such as droughts, floods, and high temperatures, helping farmers
|
||||
adapt to climate change and ensuring crop yields remain stable.\n\n5. **Edible
|
||||
Vaccines in Plants**: Research into producing vaccines in plants has progressed,
|
||||
with trials showing that certain plants can be engineered to express antigens
|
||||
that could serve as edible vaccines, offering a low-cost and easy delivery method
|
||||
for immunizations.\n\n6. **Fungi and Plant Collaborations**: The study of mycorrhizal
|
||||
fungi and their symbiotic relationships with plants has gained traction. Research
|
||||
indicates that these fungi enhance nutrient uptake and enhance plant resilience
|
||||
to environmental stressors, making them pivotal in sustainable agriculture.\n\n7.
|
||||
**Biophilic Design**: The trend of integrating nature into architecture and
|
||||
urban planning has expanded, with an emphasis on incorporating plants into building
|
||||
designs to improve air quality, enhance mental well-being, and reduce overall
|
||||
energy consumption.\n\n8. **Deciduous Trees and Urban Cooling**: Research shows
|
||||
that strategically planted deciduous trees can significantly lower urban temperatures,
|
||||
reduce energy consumption for cooling, and improve urban biodiversity, highlighting
|
||||
the importance of green infrastructure.\n\n9. **Carnivorous Plant Studies**:
|
||||
The understanding of carnivorous plants and their unique adaptations has advanced,
|
||||
with findings suggesting potential applications in pest control and sustainable
|
||||
agriculture due to their natural predatory methods.\n\n10. **Smart Agriculture
|
||||
Technology**: The integration of IoT and AI in agriculture is revolutionizing
|
||||
plant cultivation. Farmers now use sensors and data analytics to monitor plant
|
||||
health in real-time, optimize water usage, and increase crop yields sustainably.\n\nThese
|
||||
insights underline the ongoing research and innovations in the plant science
|
||||
field that can lead to more sustainable and resilient agricultural practices
|
||||
while addressing critical global challenges.\n\nBegin! This is VERY important
|
||||
to you, use the tools available and give your best Final Answer, your job depends
|
||||
on it!\n\nThought:"}], "model": "gpt-4o-mini", "stop": ["\nObservation:"], "stream":
|
||||
false}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '4383'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=CkK4UvBd9ukXvn50uJwGambJcz5zERAJfeXJ9xge6H4-1732107842-1.0.1.1-IOK2yVL3RlD75MgmnKzIEyE38HNknwn6I8BBJ1wjGz4jCTd0YWIBPnvUm9gB8D_zLlUA9G7p_wbrfyc4mO_Bmg;
|
||||
_cfuvid=MmeN9oHWrBLThkEJdaSFHBfWe95JvA8iFnnt7CC92tk-1732107842102-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.52.1
|
||||
x-stainless-arch:
|
||||
- x64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.52.1
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.7
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
content: "{\n \"id\": \"chatcmpl-AVegMn5Ai2jlIz10QbT0vQQ6TPvD0\",\n \"object\":
|
||||
\"chat.completion\",\n \"created\": 1732107914,\n \"model\": \"gpt-4o-mini-2024-07-18\",\n
|
||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
||||
\"assistant\",\n \"content\": \"I now can give a great answer \\nFinal
|
||||
Answer: \\n\\n# Comprehensive Report on Advancements in Plant Science (2024)\\n\\n##
|
||||
1. Plant-Based Plastics\\n2024 has witnessed remarkable progress in the development
|
||||
of plant-based plastics, particularly those derived from renewable resources
|
||||
such as corn and sugarcane. Unlike traditional petroleum-based plastics, these
|
||||
bioplastics are designed to be more environmentally friendly, addressing pressing
|
||||
concerns regarding plastic waste and the continuous reliance on fossil fuels.
|
||||
The production processes have been optimized to ensure lower carbon footprints
|
||||
and reduced greenhouse gas emissions. Moreover, major manufacturers are beginning
|
||||
to scale up production of these materials, integrating them into a wide array
|
||||
of products, from packaging to consumer goods, thereby offering sustainable
|
||||
alternatives and contributing positively to circular economy initiatives.\\n\\n##
|
||||
2. Gene Editing Breakthroughs\\nThe introduction of CRISPR technology has emerged
|
||||
as a game-changer in agricultural methodologies. In 2024, cutting-edge research
|
||||
has led to the development of genetically modified crops that exhibit enhanced
|
||||
resistance to pests and diseases. This advancement not only supports increased
|
||||
food production but also significantly reduces dependency on chemical pesticides,
|
||||
which can have detrimental effects on both environmental health and biodiversity.
|
||||
By facilitating the creation of hardier crops, these innovations play a crucial
|
||||
role in bolstering food security, enabling farmers to achieve higher yields
|
||||
even under challenging agricultural conditions.\\n\\n## 3. Vertical Farming
|
||||
Expansion\\nVertical farming techniques have surged in popularity as urban agriculture
|
||||
continues to evolve in 2024. These innovative systems are engineered to optimize
|
||||
space and resource usage, utilizing cutting-edge hydroponic and aeroponic methods
|
||||
to grow food in controlled environments. Cities around the globe are implementing
|
||||
vertical farms to tackle the increasing demand for fresh produce while simultaneously
|
||||
minimizing transportation emissions and enhancing food security in urban settings.
|
||||
By using significantly less water and land compared to traditional farming,
|
||||
vertical farming represents a sustainable solution to urban food production
|
||||
challenges in the face of rapid urbanization.\\n\\n## 4. Climate-Resilient Plants\\nIn
|
||||
response to the unpredictable impacts of climate change, researchers are focused
|
||||
on identifying and breeding plant varieties capable of enduring extreme weather
|
||||
conditions, such as droughts, floods, and extreme temperatures. This initiative
|
||||
aims to empower farmers with viable options to adapt to shifting climate patterns,
|
||||
ensuring stable crop yields despite adverse environmental conditions. The development
|
||||
of these climate-resilient plants is vital for maintaining food supply chains
|
||||
and safeguarding agricultural biodiversity, ultimately contributing to the long-term
|
||||
sustainability of farming practices.\\n\\n## 5. Edible Vaccines in Plants\\nInnovative
|
||||
research in the area of edible vaccines is making significant strides in 2024.
|
||||
Scientists are exploring the potential of genetically engineered plants to express
|
||||
specific antigens that can serve as immunizations. These advancements hold promising
|
||||
implications for public health, particularly in developing regions where traditional
|
||||
vaccine delivery systems may be less feasible. Edible vaccines offer a cost-effective,
|
||||
accessible, and non-invasive way to promote immunity against various diseases,
|
||||
highlighting the intersection of agriculture and health science in addressing
|
||||
global health challenges.\\n\\n## 6. Fungi and Plant Collaborations\\nThe symbiotic
|
||||
relationships between mycorrhizal fungi and plants have gained increasing attention
|
||||
in research circles. Studies conducted in 2024 reveal that these fungi enhance
|
||||
nutrient uptake, improve soil health, and bolster plant resilience when faced
|
||||
with environmental stressors, such as drought and soil degradation. The incorporation
|
||||
of these beneficial fungi into sustainable agricultural practices not only promotes
|
||||
plant health but also contributes to organic farming resiliency, highlighting
|
||||
the importance of understanding and utilizing biological partnerships in cultivating
|
||||
healthy crops.\\n\\n## 7. Biophilic Design\\nThe concept of biophilic design
|
||||
continues to gain traction in architecture and urban planning throughout 2024,
|
||||
underscoring the importance of integrating nature into built environments. This
|
||||
approach encourages the incorporation of plants and green spaces into architectural
|
||||
designs, enhancing air quality and promoting mental well-being. By reducing
|
||||
reliance on artificial heating and cooling systems, biophilic design offers
|
||||
opportunities to lower overall energy consumption. As cities recognize the benefits
|
||||
of urban greenery, significant investments are being made to create healthier,
|
||||
more sustainable urban ecosystems.\\n\\n## 8. Deciduous Trees and Urban Cooling\\nResearch
|
||||
conducted this year has demonstrated that strategic planting of deciduous trees
|
||||
in urban areas can lead to significant temperature reductions. These trees act
|
||||
as natural air conditioners by providing shade and releasing moisture. The benefits
|
||||
extend beyond cooling cities\u2014for instance, they contribute to lower energy
|
||||
consumption for air conditioning, enhance local biodiversity, and improve overall
|
||||
urban livability. Understanding the role of green infrastructure in combating
|
||||
urban heat islands is essential for developing climate-responsive cities.\\n\\n##
|
||||
9. Carnivorous Plant Studies\\nRecent advancements in the study of carnivorous
|
||||
plants reveal exciting potential applications in sustainable pest control and
|
||||
agriculture. By understanding their unique adaptations and natural predatory
|
||||
methods, researchers envision the use of these plants as biological pest management
|
||||
tools in crop production. In 2024, interest in harnessing the ecological roles
|
||||
of these plants may lead to more sustainable farming practices, alleviating
|
||||
the need for chemical pesticides and promoting healthier ecosystems.\\n\\n##
|
||||
10. Smart Agriculture Technology\\nThe integration of Internet of Things (IoT)
|
||||
and Artificial Intelligence (AI) into agriculture has transformed how farming
|
||||
is conducted in 2024. Smart agriculture technologies enable farmers to monitor
|
||||
plant health, soil conditions, and microclimates through advanced sensors and
|
||||
data analytics. This real-time data helps optimize water usage, elevate crop
|
||||
yields, and foster sustainable farming practices. As farmers harness these technological
|
||||
advancements, the sector moves toward a more efficient and environmentally conscious
|
||||
model of production.\\n\\nIn conclusion, these insights reflect ongoing research
|
||||
and innovation in the field of plant science, laying the groundwork for more
|
||||
sustainable and resilient agricultural practices. As the world faces critical
|
||||
global challenges, these advancements illustrate the potential to leverage plant
|
||||
science in creating solutions that address both environmental concerns and food
|
||||
security.\",\n \"refusal\": null\n },\n \"logprobs\": null,\n
|
||||
\ \"finish_reason\": \"stop\"\n }\n ],\n \"usage\": {\n \"prompt_tokens\":
|
||||
777,\n \"completion_tokens\": 1163,\n \"total_tokens\": 1940,\n \"prompt_tokens_details\":
|
||||
{\n \"cached_tokens\": 0,\n \"audio_tokens\": 0\n },\n \"completion_tokens_details\":
|
||||
{\n \"reasoning_tokens\": 0,\n \"audio_tokens\": 0,\n \"accepted_prediction_tokens\":
|
||||
0,\n \"rejected_prediction_tokens\": 0\n }\n },\n \"system_fingerprint\":
|
||||
\"fp_0705bf87c0\"\n}\n"
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8e58a67f18856225-GRU
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Wed, 20 Nov 2024 13:05:35 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '20960'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-ratelimit-limit-requests:
|
||||
- '30000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '150000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '29999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '149998934'
|
||||
x-ratelimit-reset-requests:
|
||||
- 2ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_1a11ba8b9c0cb1803e99cd95fa1fb890
|
||||
http_version: HTTP/1.1
|
||||
status_code: 200
|
||||
version: 1
|
||||
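The cassette above backs a test that registers more than one before-kickoff and after-kickoff hook on the same crew. Purely as an illustrative sketch (not part of this diff, and assuming the hook decorators exported by `crewai.project`; all class and method names below are placeholders), such a crew class looks roughly like this:

```python
from crewai.project import CrewBase, after_kickoff, before_kickoff


@CrewBase
class HookedCrew:
    """Sketch of a crew that registers multiple kickoff hooks."""

    @before_kickoff
    def inject_year(self, inputs):
        # Runs before kickoff; may mutate the inputs dict that tasks interpolate.
        inputs["current_year"] = 2024
        return inputs

    @before_kickoff
    def validate_topic(self, inputs):
        # A second before-kickoff hook, as exercised by the test recorded above.
        assert "topic" in inputs
        return inputs

    @after_kickoff
    def log_output(self, result):
        # Runs after kickoff; receives the crew output and may post-process it.
        print(f"Crew produced {len(result.raw)} characters")
        return result
```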
21
tests/config/agents.yaml
Normal file
@@ -0,0 +1,21 @@
researcher:
  role: >
    {topic} Senior Data Researcher
  goal: >
    Uncover cutting-edge developments in {topic}
  backstory: >
    You're a seasoned researcher with a knack for uncovering the latest
    developments in {topic}. Known for your ability to find the most relevant
    information and present it in a clear and concise manner.
  verbose: true

reporting_analyst:
  role: >
    {topic} Reporting Analyst
  goal: >
    Create detailed reports based on {topic} data analysis and research findings
  backstory: >
    You're a meticulous analyst with a keen eye for detail. You're known for
    your ability to turn complex data into clear and concise reports, making
    it easy for others to understand and act on the information you provide.
  verbose: true
17
tests/config/tasks.yaml
Normal file
@@ -0,0 +1,17 @@
research_task:
  description: >
    Conduct a thorough research about {topic}
    Make sure you find any interesting and relevant information given
    the current year is 2024.
  expected_output: >
    A list with 10 bullet points of the most relevant information about {topic}
  agent: researcher

reporting_task:
  description: >
    Review the context you got and expand each topic into a full section for a report.
    Make sure the report is detailed and contains any and all relevant information.
  expected_output: >
    A fully fledge reports with the mains topics, each with a full section of information.
    Formatted as markdown without '```'
  agent: reporting_analyst
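These two config files match the layout the crewai project scaffolding reads. As an illustrative sketch only (the class and method names here are assumptions, not part of the diff), a `@CrewBase` class would typically wire them up like this:

```python
from crewai import Agent, Crew, Process, Task
from crewai.project import CrewBase, agent, crew, task


@CrewBase
class TopicCrew:
    """Hypothetical crew consuming the agents.yaml and tasks.yaml above."""

    agents_config = "config/agents.yaml"
    tasks_config = "config/tasks.yaml"

    @agent
    def researcher(self) -> Agent:
        # {topic} placeholders in the YAML are filled from the kickoff inputs.
        return Agent(config=self.agents_config["researcher"])

    @agent
    def reporting_analyst(self) -> Agent:
        return Agent(config=self.agents_config["reporting_analyst"])

    @task
    def research_task(self) -> Task:
        return Task(config=self.tasks_config["research_task"])

    @task
    def reporting_task(self) -> Task:
        return Task(config=self.tasks_config["reporting_task"])

    @crew
    def crew(self) -> Crew:
        return Crew(agents=self.agents, tasks=self.tasks, process=Process.sequential)
```

Kicking this off with `TopicCrew().crew().kickoff(inputs={"topic": "plants"})` is the kind of run recorded in the cassettes above.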
@@ -564,6 +564,7 @@ def test_crew_kickoff_usage_metrics():
     assert result.token_usage.prompt_tokens > 0
     assert result.token_usage.completion_tokens > 0
     assert result.token_usage.successful_requests > 0
+    assert result.token_usage.cached_prompt_tokens == 0


 def test_agents_rpm_is_never_set_if_crew_max_RPM_is_not_set():
@@ -1280,10 +1281,11 @@ def test_agent_usage_metrics_are_captured_for_hierarchical_process():
     assert result.raw == "Howdy!"

     assert result.token_usage == UsageMetrics(
-        total_tokens=2626,
-        prompt_tokens=2482,
-        completion_tokens=144,
-        successful_requests=5,
+        total_tokens=1673,
+        prompt_tokens=1562,
+        completion_tokens=111,
+        successful_requests=3,
+        cached_prompt_tokens=0
     )


@@ -1777,26 +1779,22 @@ def test_crew_train_success(
         ]
     )

-    crew_training_handler.assert_has_calls(
-        [
-            mock.call("training_data.pkl"),
-            mock.call().load(),
-            mock.call("trained_agents_data.pkl"),
-            mock.call().save_trained_data(
-                agent_id="Researcher",
-                trained_data=task_evaluator().evaluate_training_data().model_dump(),
-            ),
-            mock.call("trained_agents_data.pkl"),
-            mock.call().save_trained_data(
-                agent_id="Senior Writer",
-                trained_data=task_evaluator().evaluate_training_data().model_dump(),
-            ),
-            mock.call(),
-            mock.call().load(),
-            mock.call(),
-            mock.call().load(),
-        ]
-    )
+    crew_training_handler.assert_any_call("training_data.pkl")
+    crew_training_handler().load.assert_called()
+
+    crew_training_handler.assert_any_call("trained_agents_data.pkl")
+    crew_training_handler().load.assert_called()
+
+    crew_training_handler().save_trained_data.assert_has_calls([
+        mock.call(
+            agent_id="Researcher",
+            trained_data=task_evaluator().evaluate_training_data().model_dump(),
+        ),
+        mock.call(
+            agent_id="Senior Writer",
+            trained_data=task_evaluator().evaluate_training_data().model_dump(),
+        )
+    ])


 def test_crew_train_error():
264
tests/flow_test.py
Normal file
@@ -0,0 +1,264 @@
"""Test Flow creation and execution basic functionality."""

import asyncio

import pytest
from crewai.flow.flow import Flow, and_, listen, or_, router, start


def test_simple_sequential_flow():
    """Test a simple flow with two steps called sequentially."""
    execution_order = []

    class SimpleFlow(Flow):
        @start()
        def step_1(self):
            execution_order.append("step_1")

        @listen(step_1)
        def step_2(self):
            execution_order.append("step_2")

    flow = SimpleFlow()
    flow.kickoff()

    assert execution_order == ["step_1", "step_2"]


def test_flow_with_multiple_starts():
    """Test a flow with multiple start methods."""
    execution_order = []

    class MultiStartFlow(Flow):
        @start()
        def step_a(self):
            execution_order.append("step_a")

        @start()
        def step_b(self):
            execution_order.append("step_b")

        @listen(step_a)
        def step_c(self):
            execution_order.append("step_c")

        @listen(step_b)
        def step_d(self):
            execution_order.append("step_d")

    flow = MultiStartFlow()
    flow.kickoff()

    assert "step_a" in execution_order
    assert "step_b" in execution_order
    assert "step_c" in execution_order
    assert "step_d" in execution_order
    assert execution_order.index("step_c") > execution_order.index("step_a")
    assert execution_order.index("step_d") > execution_order.index("step_b")


def test_cyclic_flow():
    """Test a cyclic flow that runs a finite number of iterations."""
    execution_order = []

    class CyclicFlow(Flow):
        iteration = 0
        max_iterations = 3

        @start("loop")
        def step_1(self):
            if self.iteration >= self.max_iterations:
                return  # Do not proceed further
            execution_order.append(f"step_1_{self.iteration}")

        @listen(step_1)
        def step_2(self):
            execution_order.append(f"step_2_{self.iteration}")

        @router(step_2)
        def step_3(self):
            execution_order.append(f"step_3_{self.iteration}")
            self.iteration += 1
            if self.iteration < self.max_iterations:
                return "loop"

            return "exit"

    flow = CyclicFlow()
    flow.kickoff()

    expected_order = []
    for i in range(flow.max_iterations):
        expected_order.extend([f"step_1_{i}", f"step_2_{i}", f"step_3_{i}"])

    assert execution_order == expected_order


def test_flow_with_and_condition():
    """Test a flow where a step waits for multiple other steps to complete."""
    execution_order = []

    class AndConditionFlow(Flow):
        @start()
        def step_1(self):
            execution_order.append("step_1")

        @start()
        def step_2(self):
            execution_order.append("step_2")

        @listen(and_(step_1, step_2))
        def step_3(self):
            execution_order.append("step_3")

    flow = AndConditionFlow()
    flow.kickoff()

    assert "step_1" in execution_order
    assert "step_2" in execution_order
    assert execution_order[-1] == "step_3"
    assert execution_order.index("step_3") > execution_order.index("step_1")
    assert execution_order.index("step_3") > execution_order.index("step_2")


def test_flow_with_or_condition():
    """Test a flow where a step is triggered when any of multiple steps complete."""
    execution_order = []

    class OrConditionFlow(Flow):
        @start()
        def step_a(self):
            execution_order.append("step_a")

        @start()
        def step_b(self):
            execution_order.append("step_b")

        @listen(or_(step_a, step_b))
        def step_c(self):
            execution_order.append("step_c")

    flow = OrConditionFlow()
    flow.kickoff()

    assert "step_a" in execution_order or "step_b" in execution_order
    assert "step_c" in execution_order
    assert execution_order.index("step_c") > min(
        execution_order.index("step_a"), execution_order.index("step_b")
    )


def test_flow_with_router():
    """Test a flow that uses a router method to determine the next step."""
    execution_order = []

    class RouterFlow(Flow):
        @start()
        def start_method(self):
            execution_order.append("start_method")

        @router(start_method)
        def router(self):
            execution_order.append("router")
            # Ensure the condition is set to True to follow the "step_if_true" path
            condition = True
            return "step_if_true" if condition else "step_if_false"

        @listen("step_if_true")
        def truthy(self):
            execution_order.append("step_if_true")

        @listen("step_if_false")
        def falsy(self):
            execution_order.append("step_if_false")

    flow = RouterFlow()
    flow.kickoff()

    assert execution_order == ["start_method", "router", "step_if_true"]


def test_async_flow():
    """Test an asynchronous flow."""
    execution_order = []

    class AsyncFlow(Flow):
        @start()
        async def step_1(self):
            execution_order.append("step_1")
            await asyncio.sleep(0.1)

        @listen(step_1)
        async def step_2(self):
            execution_order.append("step_2")
            await asyncio.sleep(0.1)

    flow = AsyncFlow()
    asyncio.run(flow.kickoff_async())

    assert execution_order == ["step_1", "step_2"]


def test_flow_with_exceptions():
    """Test flow behavior when exceptions occur in steps."""
    execution_order = []

    class ExceptionFlow(Flow):
        @start()
        def step_1(self):
            execution_order.append("step_1")
            raise ValueError("An error occurred in step_1")

        @listen(step_1)
        def step_2(self):
            execution_order.append("step_2")

    flow = ExceptionFlow()

    with pytest.raises(ValueError):
        flow.kickoff()

    # Ensure step_2 did not execute
    assert execution_order == ["step_1"]


def test_flow_restart():
    """Test restarting a flow after it has completed."""
    execution_order = []

    class RestartableFlow(Flow):
        @start()
        def step_1(self):
            execution_order.append("step_1")

        @listen(step_1)
        def step_2(self):
            execution_order.append("step_2")

    flow = RestartableFlow()
    flow.kickoff()
    flow.kickoff()  # Restart the flow

    assert execution_order == ["step_1", "step_2", "step_1", "step_2"]


def test_flow_with_custom_state():
    """Test a flow that maintains and modifies internal state."""

    class StateFlow(Flow):
        def __init__(self):
            super().__init__()
            self.counter = 0

        @start()
        def step_1(self):
            self.counter += 1

        @listen(step_1)
        def step_2(self):
            self.counter *= 2
            assert self.counter == 2

    flow = StateFlow()
    flow.kickoff()
    assert flow.counter == 2
0    tests/knowledge/__init__.py    Normal file
BIN    tests/knowledge/crewai_quickstart.pdf    Normal file
Binary file not shown.
545    tests/knowledge/knowledge_test.py    Normal file
@@ -0,0 +1,545 @@
"""Test Knowledge creation and querying functionality."""

from pathlib import Path
from unittest.mock import patch

from crewai.knowledge.source.csv_knowledge_source import CSVKnowledgeSource
from crewai.knowledge.source.excel_knowledge_source import ExcelKnowledgeSource
from crewai.knowledge.source.json_knowledge_source import JSONKnowledgeSource
from crewai.knowledge.source.pdf_knowledge_source import PDFKnowledgeSource
from crewai.knowledge.source.string_knowledge_source import StringKnowledgeSource
from crewai.knowledge.source.text_file_knowledge_source import TextFileKnowledgeSource

import pytest


@pytest.fixture(autouse=True)
def mock_vector_db():
    """Mock vector database operations."""
    with patch("crewai.knowledge.storage.knowledge_storage.KnowledgeStorage") as mock:
        # Mock the query method to return a predefined response
        instance = mock.return_value
        instance.query.return_value = [
            {
                "context": "Brandon's favorite color is blue and he likes Mexican food.",
                "score": 0.9,
            }
        ]
        instance.reset.return_value = None
        yield instance


@pytest.fixture(autouse=True)
def reset_knowledge_storage(mock_vector_db):
    """Fixture to reset knowledge storage before each test."""
    yield


def test_single_short_string(mock_vector_db):
    # Create a knowledge base with a single short string
    content = "Brandon's favorite color is blue and he likes Mexican food."
    string_source = StringKnowledgeSource(
        content=content, metadata={"preference": "personal"}
    )
    mock_vector_db.sources = [string_source]
    mock_vector_db.query.return_value = [{"context": content, "score": 0.9}]
    # Perform a query
    query = "What is Brandon's favorite color?"
    results = mock_vector_db.query(query)

    # Assert that the results contain the expected information
    assert any("blue" in result["context"].lower() for result in results)
    # Verify the mock was called
    mock_vector_db.query.assert_called_once()


# @pytest.mark.vcr(filter_headers=["authorization"])
|
||||
def test_single_2k_character_string(mock_vector_db):
|
||||
# Create a 2k character string with various facts about Brandon
|
||||
content = (
|
||||
"Brandon is a software engineer who lives in San Francisco. "
|
||||
"He enjoys hiking and often visits the trails in the Bay Area. "
|
||||
"Brandon has a pet dog named Max, who is a golden retriever. "
|
||||
"He loves reading science fiction books, and his favorite author is Isaac Asimov. "
|
||||
"Brandon's favorite movie is Inception, and he enjoys watching it with his friends. "
|
||||
"He is also a fan of Mexican cuisine, especially tacos and burritos. "
|
||||
"Brandon plays the guitar and often performs at local open mic nights. "
|
||||
"He is learning French and plans to visit Paris next year. "
|
||||
"Brandon is passionate about technology and often attends tech meetups in the city. "
|
||||
"He is also interested in AI and machine learning, and he is currently working on a project related to natural language processing. "
|
||||
"Brandon's favorite color is blue, and he often wears blue shirts. "
|
||||
"He enjoys cooking and often tries new recipes on weekends. "
|
||||
"Brandon is a morning person and likes to start his day with a run in the park. "
|
||||
"He is also a coffee enthusiast and enjoys trying different coffee blends. "
|
||||
"Brandon is a member of a local book club and enjoys discussing books with fellow members. "
|
||||
"He is also a fan of board games and often hosts game nights at his place. "
|
||||
"Brandon is an advocate for environmental conservation and volunteers for local clean-up drives. "
|
||||
"He is also a mentor for aspiring software developers and enjoys sharing his knowledge with others. "
|
||||
"Brandon's favorite sport is basketball, and he often plays with his friends on weekends. "
|
||||
"He is also a fan of the Golden State Warriors and enjoys watching their games. "
|
||||
)
|
||||
string_source = StringKnowledgeSource(
|
||||
content=content, metadata={"preference": "personal"}
|
||||
)
|
||||
mock_vector_db.sources = [string_source]
|
||||
mock_vector_db.query.return_value = [{"context": content, "score": 0.9}]
|
||||
|
||||
# Perform a query
|
||||
query = "What is Brandon's favorite movie?"
|
||||
results = mock_vector_db.query(query)
|
||||
|
||||
# Assert that the results contain the expected information
|
||||
assert any("inception" in result["context"].lower() for result in results)
|
||||
mock_vector_db.query.assert_called_once()
|
||||
|
||||
|
||||
def test_multiple_short_strings(mock_vector_db):
    # Create multiple short string sources
    contents = [
        "Brandon loves hiking.",
        "Brandon has a dog named Max.",
        "Brandon enjoys painting landscapes.",
    ]
    string_sources = [
        StringKnowledgeSource(content=content, metadata={"preference": "personal"})
        for content in contents
    ]

    # Mock the vector db query response
    mock_vector_db.query.return_value = [
        {"context": "Brandon has a dog named Max.", "score": 0.9}
    ]

    mock_vector_db.sources = string_sources

    # Perform a query
    query = "What is the name of Brandon's pet?"
    results = mock_vector_db.query(query)

    # Assert that the correct information is retrieved
    assert any("max" in result["context"].lower() for result in results)
    # Verify the mock was called
    mock_vector_db.query.assert_called_once()


def test_multiple_2k_character_strings(mock_vector_db):
|
||||
# Create multiple 2k character strings with various facts about Brandon
|
||||
contents = [
|
||||
(
|
||||
"Brandon is a software engineer who lives in San Francisco. "
|
||||
"He enjoys hiking and often visits the trails in the Bay Area. "
|
||||
"Brandon has a pet dog named Max, who is a golden retriever. "
|
||||
"He loves reading science fiction books, and his favorite author is Isaac Asimov. "
|
||||
"Brandon's favorite movie is Inception, and he enjoys watching it with his friends. "
|
||||
"He is also a fan of Mexican cuisine, especially tacos and burritos. "
|
||||
"Brandon plays the guitar and often performs at local open mic nights. "
|
||||
"He is learning French and plans to visit Paris next year. "
|
||||
"Brandon is passionate about technology and often attends tech meetups in the city. "
|
||||
"He is also interested in AI and machine learning, and he is currently working on a project related to natural language processing. "
|
||||
"Brandon's favorite color is blue, and he often wears blue shirts. "
|
||||
"He enjoys cooking and often tries new recipes on weekends. "
|
||||
"Brandon is a morning person and likes to start his day with a run in the park. "
|
||||
"He is also a coffee enthusiast and enjoys trying different coffee blends. "
|
||||
"Brandon is a member of a local book club and enjoys discussing books with fellow members. "
|
||||
"He is also a fan of board games and often hosts game nights at his place. "
|
||||
"Brandon is an advocate for environmental conservation and volunteers for local clean-up drives. "
|
||||
"He is also a mentor for aspiring software developers and enjoys sharing his knowledge with others. "
|
||||
"Brandon's favorite sport is basketball, and he often plays with his friends on weekends. "
|
||||
"He is also a fan of the Golden State Warriors and enjoys watching their games. "
|
||||
)
|
||||
* 2, # Repeat to ensure it's 2k characters
|
||||
(
|
||||
"Brandon loves traveling and has visited over 20 countries. "
|
||||
"He is fluent in Spanish and often practices with his friends. "
|
||||
"Brandon's favorite city is Barcelona, where he enjoys the architecture and culture. "
|
||||
"He is a foodie and loves trying new cuisines, with a particular fondness for sushi. "
|
||||
"Brandon is an avid cyclist and participates in local cycling events. "
|
||||
"He is also a photographer and enjoys capturing landscapes and cityscapes. "
|
||||
"Brandon is a tech enthusiast and follows the latest trends in gadgets and software. "
|
||||
"He is also a fan of virtual reality and owns a VR headset. "
|
||||
"Brandon's favorite book is 'The Hitchhiker's Guide to the Galaxy'. "
|
||||
"He enjoys watching documentaries and learning about history and science. "
|
||||
"Brandon is a coffee lover and has a collection of coffee mugs from different countries. "
|
||||
"He is also a fan of jazz music and often attends live performances. "
|
||||
"Brandon is a member of a local running club and participates in marathons. "
|
||||
"He is also a volunteer at a local animal shelter and helps with dog walking. "
|
||||
"Brandon's favorite holiday is Christmas, and he enjoys decorating his home. "
|
||||
"He is also a fan of classic movies and has a collection of DVDs. "
|
||||
"Brandon is a mentor for young professionals and enjoys giving career advice. "
|
||||
"He is also a fan of puzzles and enjoys solving them in his free time. "
|
||||
"Brandon's favorite sport is soccer, and he often plays with his friends. "
|
||||
"He is also a fan of FC Barcelona and enjoys watching their matches. "
|
||||
)
|
||||
* 2, # Repeat to ensure it's 2k characters
|
||||
]
|
||||
string_sources = [
|
||||
StringKnowledgeSource(content=content, metadata={"preference": "personal"})
|
||||
for content in contents
|
||||
]
|
||||
|
||||
mock_vector_db.sources = string_sources
|
||||
mock_vector_db.query.return_value = [{"context": contents[1], "score": 0.9}]
|
||||
|
||||
# Perform a query
|
||||
query = "What is Brandon's favorite book?"
|
||||
results = mock_vector_db.query(query)
|
||||
|
||||
# Assert that the correct information is retrieved
|
||||
assert any(
|
||||
"the hitchhiker's guide to the galaxy" in result["context"].lower()
|
||||
for result in results
|
||||
)
|
||||
mock_vector_db.query.assert_called_once()
|
||||
|
||||
|
||||
def test_single_short_file(mock_vector_db, tmpdir):
|
||||
# Create a single short text file
|
||||
content = "Brandon's favorite sport is basketball."
|
||||
file_path = Path(tmpdir.join("short_file.txt"))
|
||||
with open(file_path, "w") as f:
|
||||
f.write(content)
|
||||
|
||||
file_source = TextFileKnowledgeSource(
|
||||
file_path=file_path, metadata={"preference": "personal"}
|
||||
)
|
||||
mock_vector_db.sources = [file_source]
|
||||
mock_vector_db.query.return_value = [{"context": content, "score": 0.9}]
|
||||
# Perform a query
|
||||
query = "What sport does Brandon like?"
|
||||
results = mock_vector_db.query(query)
|
||||
|
||||
# Assert that the results contain the expected information
|
||||
assert any("basketball" in result["context"].lower() for result in results)
|
||||
mock_vector_db.query.assert_called_once()
|
||||
|
||||
|
||||
def test_single_2k_character_file(mock_vector_db, tmpdir):
|
||||
# Create a single 2k character text file with various facts about Brandon
|
||||
content = (
|
||||
"Brandon is a software engineer who lives in San Francisco. "
|
||||
"He enjoys hiking and often visits the trails in the Bay Area. "
|
||||
"Brandon has a pet dog named Max, who is a golden retriever. "
|
||||
"He loves reading science fiction books, and his favorite author is Isaac Asimov. "
|
||||
"Brandon's favorite movie is Inception, and he enjoys watching it with his friends. "
|
||||
"He is also a fan of Mexican cuisine, especially tacos and burritos. "
|
||||
"Brandon plays the guitar and often performs at local open mic nights. "
|
||||
"He is learning French and plans to visit Paris next year. "
|
||||
"Brandon is passionate about technology and often attends tech meetups in the city. "
|
||||
"He is also interested in AI and machine learning, and he is currently working on a project related to natural language processing. "
|
||||
"Brandon's favorite color is blue, and he often wears blue shirts. "
|
||||
"He enjoys cooking and often tries new recipes on weekends. "
|
||||
"Brandon is a morning person and likes to start his day with a run in the park. "
|
||||
"He is also a coffee enthusiast and enjoys trying different coffee blends. "
|
||||
"Brandon is a member of a local book club and enjoys discussing books with fellow members. "
|
||||
"He is also a fan of board games and often hosts game nights at his place. "
|
||||
"Brandon is an advocate for environmental conservation and volunteers for local clean-up drives. "
|
||||
"He is also a mentor for aspiring software developers and enjoys sharing his knowledge with others. "
|
||||
"Brandon's favorite sport is basketball, and he often plays with his friends on weekends. "
|
||||
"He is also a fan of the Golden State Warriors and enjoys watching their games. "
|
||||
) * 2 # Repeat to ensure it's 2k characters
|
||||
file_path = Path(tmpdir.join("long_file.txt"))
|
||||
with open(file_path, "w") as f:
|
||||
f.write(content)
|
||||
|
||||
file_source = TextFileKnowledgeSource(
|
||||
file_path=file_path, metadata={"preference": "personal"}
|
||||
)
|
||||
mock_vector_db.sources = [file_source]
|
||||
mock_vector_db.query.return_value = [{"context": content, "score": 0.9}]
|
||||
# Perform a query
|
||||
query = "What is Brandon's favorite movie?"
|
||||
results = mock_vector_db.query(query)
|
||||
|
||||
# Assert that the results contain the expected information
|
||||
assert any("inception" in result["context"].lower() for result in results)
|
||||
mock_vector_db.query.assert_called_once()
|
||||
|
||||
|
||||
def test_multiple_short_files(mock_vector_db, tmpdir):
|
||||
# Create multiple short text files
|
||||
contents = [
|
||||
{
|
||||
"content": "Brandon works as a software engineer.",
|
||||
"metadata": {"category": "profession", "source": "occupation"},
|
||||
},
|
||||
{
|
||||
"content": "Brandon lives in New York.",
|
||||
"metadata": {"category": "city", "source": "personal"},
|
||||
},
|
||||
{
|
||||
"content": "Brandon enjoys cooking Italian food.",
|
||||
"metadata": {"category": "hobby", "source": "personal"},
|
||||
},
|
||||
]
|
||||
file_paths = []
|
||||
for i, item in enumerate(contents):
|
||||
file_path = Path(tmpdir.join(f"file_{i}.txt"))
|
||||
with open(file_path, "w") as f:
|
||||
f.write(item["content"])
|
||||
file_paths.append((file_path, item["metadata"]))
|
||||
|
||||
file_sources = [
|
||||
TextFileKnowledgeSource(file_path=path, metadata=metadata)
|
||||
for path, metadata in file_paths
|
||||
]
|
||||
mock_vector_db.sources = file_sources
|
||||
mock_vector_db.query.return_value = [
|
||||
{"context": "Brandon lives in New York.", "score": 0.9}
|
||||
]
|
||||
# Perform a query
|
||||
query = "What city does he reside in?"
|
||||
results = mock_vector_db.query(query)
|
||||
# Assert that the correct information is retrieved
|
||||
assert any("new york" in result["context"].lower() for result in results)
|
||||
mock_vector_db.query.assert_called_once()
|
||||
|
||||
|
||||
def test_multiple_2k_character_files(mock_vector_db, tmpdir):
|
||||
# Create multiple 2k character text files with various facts about Brandon
|
||||
contents = [
|
||||
(
|
||||
"Brandon loves traveling and has visited over 20 countries. "
|
||||
"He is fluent in Spanish and often practices with his friends. "
|
||||
"Brandon's favorite city is Barcelona, where he enjoys the architecture and culture. "
|
||||
"He is a foodie and loves trying new cuisines, with a particular fondness for sushi. "
|
||||
"Brandon is an avid cyclist and participates in local cycling events. "
|
||||
"He is also a photographer and enjoys capturing landscapes and cityscapes. "
|
||||
"Brandon is a tech enthusiast and follows the latest trends in gadgets and software. "
|
||||
"He is also a fan of virtual reality and owns a VR headset. "
|
||||
"Brandon's favorite book is 'The Hitchhiker's Guide to the Galaxy'. "
|
||||
"He enjoys watching documentaries and learning about history and science. "
|
||||
"Brandon is a coffee lover and has a collection of coffee mugs from different countries. "
|
||||
"He is also a fan of jazz music and often attends live performances. "
|
||||
"Brandon is a member of a local running club and participates in marathons. "
|
||||
"He is also a volunteer at a local animal shelter and helps with dog walking. "
|
||||
"Brandon's favorite holiday is Christmas, and he enjoys decorating his home. "
|
||||
"He is also a fan of classic movies and has a collection of DVDs. "
|
||||
"Brandon is a mentor for young professionals and enjoys giving career advice. "
|
||||
"He is also a fan of puzzles and enjoys solving them in his free time. "
|
||||
"Brandon's favorite sport is soccer, and he often plays with his friends. "
|
||||
"He is also a fan of FC Barcelona and enjoys watching their matches. "
|
||||
)
|
||||
* 2, # Repeat to ensure it's 2k characters
|
||||
(
|
||||
"Brandon is a software engineer who lives in San Francisco. "
|
||||
"He enjoys hiking and often visits the trails in the Bay Area. "
|
||||
"Brandon has a pet dog named Max, who is a golden retriever. "
|
||||
"He loves reading science fiction books, and his favorite author is Isaac Asimov. "
|
||||
"Brandon's favorite movie is Inception, and he enjoys watching it with his friends. "
|
||||
"He is also a fan of Mexican cuisine, especially tacos and burritos. "
|
||||
"Brandon plays the guitar and often performs at local open mic nights. "
|
||||
"He is learning French and plans to visit Paris next year. "
|
||||
"Brandon is passionate about technology and often attends tech meetups in the city. "
|
||||
"He is also interested in AI and machine learning, and he is currently working on a project related to natural language processing. "
|
||||
"Brandon's favorite color is blue, and he often wears blue shirts. "
|
||||
"He enjoys cooking and often tries new recipes on weekends. "
|
||||
"Brandon is a morning person and likes to start his day with a run in the park. "
|
||||
"He is also a coffee enthusiast and enjoys trying different coffee blends. "
|
||||
"Brandon is a member of a local book club and enjoys discussing books with fellow members. "
|
||||
"He is also a fan of board games and often hosts game nights at his place. "
|
||||
"Brandon is an advocate for environmental conservation and volunteers for local clean-up drives. "
|
||||
"He is also a mentor for aspiring software developers and enjoys sharing his knowledge with others. "
|
||||
"Brandon's favorite sport is basketball, and he often plays with his friends on weekends. "
|
||||
"He is also a fan of the Golden State Warriors and enjoys watching their games. "
|
||||
)
|
||||
* 2, # Repeat to ensure it's 2k characters
|
||||
]
|
||||
file_paths = []
|
||||
for i, content in enumerate(contents):
|
||||
file_path = Path(tmpdir.join(f"long_file_{i}.txt"))
|
||||
with open(file_path, "w") as f:
|
||||
f.write(content)
|
||||
file_paths.append(file_path)
|
||||
|
||||
file_sources = [
|
||||
TextFileKnowledgeSource(file_path=path, metadata={"preference": "personal"})
|
||||
for path in file_paths
|
||||
]
|
||||
mock_vector_db.sources = file_sources
|
||||
mock_vector_db.query.return_value = [
|
||||
{
|
||||
"context": "Brandon's favorite book is 'The Hitchhiker's Guide to the Galaxy'.",
|
||||
"score": 0.9,
|
||||
}
|
||||
]
|
||||
# Perform a query
|
||||
query = "What is Brandon's favorite book?"
|
||||
results = mock_vector_db.query(query)
|
||||
|
||||
# Assert that the correct information is retrieved
|
||||
assert any(
|
||||
"the hitchhiker's guide to the galaxy" in result["context"].lower()
|
||||
for result in results
|
||||
)
|
||||
mock_vector_db.query.assert_called_once()
|
||||
|
||||
|
||||
@pytest.mark.vcr(filter_headers=["authorization"])
|
||||
def test_hybrid_string_and_files(mock_vector_db, tmpdir):
|
||||
# Create string sources
|
||||
string_contents = [
|
||||
"Brandon is learning French.",
|
||||
"Brandon visited Paris last summer.",
|
||||
]
|
||||
string_sources = [
|
||||
StringKnowledgeSource(content=content, metadata={"preference": "personal"})
|
||||
for content in string_contents
|
||||
]
|
||||
|
||||
# Create file sources
|
||||
file_contents = [
|
||||
"Brandon prefers tea over coffee.",
|
||||
"Brandon's favorite book is 'The Alchemist'.",
|
||||
]
|
||||
file_paths = []
|
||||
for i, content in enumerate(file_contents):
|
||||
file_path = Path(tmpdir.join(f"file_{i}.txt"))
|
||||
with open(file_path, "w") as f:
|
||||
f.write(content)
|
||||
file_paths.append(file_path)
|
||||
|
||||
file_sources = [
|
||||
TextFileKnowledgeSource(file_path=path, metadata={"preference": "personal"})
|
||||
for path in file_paths
|
||||
]
|
||||
|
||||
# Combine string and file sources
|
||||
mock_vector_db.sources = string_sources + file_sources
|
||||
mock_vector_db.query.return_value = [{"context": file_contents[1], "score": 0.9}]
|
||||
|
||||
# Perform a query
|
||||
query = "What is Brandon's favorite book?"
|
||||
results = mock_vector_db.query(query)
|
||||
|
||||
# Assert that the correct information is retrieved
|
||||
assert any("the alchemist" in result["context"].lower() for result in results)
|
||||
mock_vector_db.query.assert_called_once()
|
||||
|
||||
|
||||
def test_pdf_knowledge_source(mock_vector_db):
|
||||
# Get the directory of the current file
|
||||
current_dir = Path(__file__).parent
|
||||
# Construct the path to the PDF file
|
||||
pdf_path = current_dir / "crewai_quickstart.pdf"
|
||||
|
||||
# Create a PDFKnowledgeSource
|
||||
pdf_source = PDFKnowledgeSource(
|
||||
file_path=pdf_path, metadata={"preference": "personal"}
|
||||
)
|
||||
mock_vector_db.sources = [pdf_source]
|
||||
mock_vector_db.query.return_value = [
|
||||
{"context": "crewai create crew latest-ai-development", "score": 0.9}
|
||||
]
|
||||
|
||||
# Perform a query
|
||||
query = "How do you create a crew?"
|
||||
results = mock_vector_db.query(query)
|
||||
|
||||
# Assert that the correct information is retrieved
|
||||
assert any(
|
||||
"crewai create crew latest-ai-development" in result["context"].lower()
|
||||
for result in results
|
||||
)
|
||||
mock_vector_db.query.assert_called_once()
|
||||
|
||||
|
||||
@pytest.mark.vcr(filter_headers=["authorization"])
|
||||
def test_csv_knowledge_source(mock_vector_db, tmpdir):
|
||||
"""Test CSVKnowledgeSource with a simple CSV file."""
|
||||
|
||||
# Create a CSV file with sample data
|
||||
csv_content = [
|
||||
["Name", "Age", "City"],
|
||||
["Brandon", "30", "New York"],
|
||||
["Alice", "25", "Los Angeles"],
|
||||
["Bob", "35", "Chicago"],
|
||||
]
|
||||
csv_path = Path(tmpdir.join("data.csv"))
|
||||
with open(csv_path, "w", encoding="utf-8") as f:
|
||||
for row in csv_content:
|
||||
f.write(",".join(row) + "\n")
|
||||
|
||||
# Create a CSVKnowledgeSource
|
||||
csv_source = CSVKnowledgeSource(
|
||||
file_path=csv_path, metadata={"preference": "personal"}
|
||||
)
|
||||
mock_vector_db.sources = [csv_source]
|
||||
mock_vector_db.query.return_value = [
|
||||
{"context": "Brandon is 30 years old.", "score": 0.9}
|
||||
]
|
||||
|
||||
# Perform a query
|
||||
query = "How old is Brandon?"
|
||||
results = mock_vector_db.query(query)
|
||||
|
||||
# Assert that the correct information is retrieved
|
||||
assert any("30" in result["context"] for result in results)
|
||||
mock_vector_db.query.assert_called_once()
|
||||
|
||||
|
||||
def test_json_knowledge_source(mock_vector_db, tmpdir):
|
||||
"""Test JSONKnowledgeSource with a simple JSON file."""
|
||||
|
||||
# Create a JSON file with sample data
|
||||
json_data = {
|
||||
"people": [
|
||||
{"name": "Brandon", "age": 30, "city": "New York"},
|
||||
{"name": "Alice", "age": 25, "city": "Los Angeles"},
|
||||
{"name": "Bob", "age": 35, "city": "Chicago"},
|
||||
]
|
||||
}
|
||||
json_path = Path(tmpdir.join("data.json"))
|
||||
with open(json_path, "w", encoding="utf-8") as f:
|
||||
import json
|
||||
|
||||
json.dump(json_data, f)
|
||||
|
||||
# Create a JSONKnowledgeSource
|
||||
json_source = JSONKnowledgeSource(
|
||||
file_path=json_path, metadata={"preference": "personal"}
|
||||
)
|
||||
mock_vector_db.sources = [json_source]
|
||||
mock_vector_db.query.return_value = [
|
||||
{"context": "Alice lives in Los Angeles.", "score": 0.9}
|
||||
]
|
||||
|
||||
# Perform a query
|
||||
query = "Where does Alice reside?"
|
||||
results = mock_vector_db.query(query)
|
||||
|
||||
# Assert that the correct information is retrieved
|
||||
assert any("los angeles" in result["context"].lower() for result in results)
|
||||
mock_vector_db.query.assert_called_once()
|
||||
|
||||
|
||||
def test_excel_knowledge_source(mock_vector_db, tmpdir):
|
||||
"""Test ExcelKnowledgeSource with a simple Excel file."""
|
||||
|
||||
# Create an Excel file with sample data
|
||||
import pandas as pd
|
||||
|
||||
excel_data = {
|
||||
"Name": ["Brandon", "Alice", "Bob"],
|
||||
"Age": [30, 25, 35],
|
||||
"City": ["New York", "Los Angeles", "Chicago"],
|
||||
}
|
||||
df = pd.DataFrame(excel_data)
|
||||
excel_path = Path(tmpdir.join("data.xlsx"))
|
||||
df.to_excel(excel_path, index=False)
|
||||
|
||||
# Create an ExcelKnowledgeSource
|
||||
excel_source = ExcelKnowledgeSource(
|
||||
file_path=excel_path, metadata={"preference": "personal"}
|
||||
)
|
||||
mock_vector_db.sources = [excel_source]
|
||||
mock_vector_db.query.return_value = [
|
||||
{"context": "Brandon is 30 years old.", "score": 0.9}
|
||||
]
|
||||
|
||||
# Perform a query
|
||||
query = "What is Brandon's age?"
|
||||
results = mock_vector_db.query(query)
|
||||
|
||||
# Assert that the correct information is retrieved
|
||||
assert any("30" in result["context"] for result in results)
|
||||
mock_vector_db.query.assert_called_once()
|
||||
30    tests/llm_test.py    Normal file
@@ -0,0 +1,30 @@
import pytest

from crewai.agents.agent_builder.utilities.base_token_process import TokenProcess
from crewai.llm import LLM
from crewai.utilities.token_counter_callback import TokenCalcHandler


@pytest.mark.vcr(filter_headers=["authorization"])
def test_llm_callback_replacement():
    llm = LLM(model="gpt-4o-mini")

    calc_handler_1 = TokenCalcHandler(token_cost_process=TokenProcess())
    calc_handler_2 = TokenCalcHandler(token_cost_process=TokenProcess())

    llm.call(
        messages=[{"role": "user", "content": "Hello, world!"}],
        callbacks=[calc_handler_1],
    )
    usage_metrics_1 = calc_handler_1.token_cost_process.get_summary()

    llm.call(
        messages=[{"role": "user", "content": "Hello, world from another agent!"}],
        callbacks=[calc_handler_2],
    )
    usage_metrics_2 = calc_handler_2.token_cost_process.get_summary()

    # The first handler should not have been updated
    assert usage_metrics_1.successful_requests == 1
    assert usage_metrics_2.successful_requests == 1
    assert usage_metrics_1 == calc_handler_1.token_cost_process.get_summary()
270    tests/memory/cassettes/test_save_and_search_with_provider.yaml    Normal file
@@ -0,0 +1,270 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: ''
|
||||
headers:
|
||||
accept:
|
||||
- '*/*'
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
host:
|
||||
- api.mem0.ai
|
||||
user-agent:
|
||||
- python-httpx/0.27.0
|
||||
method: GET
|
||||
uri: https://api.mem0.ai/v1/memories/?user_id=test
|
||||
response:
|
||||
body:
|
||||
string: '[]'
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8b477138bad847b9-BOM
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Sat, 17 Aug 2024 06:00:11 GMT
|
||||
NEL:
|
||||
- '{"success_fraction":0,"report_to":"cf-nel","max_age":604800}'
|
||||
Report-To:
|
||||
- '{"endpoints":[{"url":"https:\/\/a.nel.cloudflare.com\/report\/v4?s=uuyH2foMJVDpV%2FH52g1q%2FnvXKe3dBKVzvsK0mqmSNezkiszNR9OgrEJfVqmkX%2FlPFRP2sH4zrOuzGo6k%2FjzsjYJczqSWJUZHN2pPujiwnr1E9W%2BdLGKmG6%2FqPrGYAy2SBRWkkJVWsTO3OQ%3D%3D"}],"group":"cf-nel","max_age":604800}'
|
||||
Server:
|
||||
- cloudflare
|
||||
allow:
|
||||
- GET, POST, DELETE, OPTIONS
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
cross-origin-opener-policy:
|
||||
- same-origin
|
||||
referrer-policy:
|
||||
- same-origin
|
||||
vary:
|
||||
- Accept, origin, Cookie
|
||||
x-content-type-options:
|
||||
- nosniff
|
||||
x-frame-options:
|
||||
- DENY
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"batch": [{"properties": {"python_version": "3.12.4 (v3.12.4:8e8a4baf65,
|
||||
Jun 6 2024, 17:33:18) [Clang 13.0.0 (clang-1300.0.29.30)]", "os": "darwin",
|
||||
"os_version": "Darwin Kernel Version 23.4.0: Wed Feb 21 21:44:54 PST 2024; root:xnu-10063.101.15~2/RELEASE_ARM64_T6030",
|
||||
"os_release": "23.4.0", "processor": "arm", "machine": "arm64", "function":
|
||||
"mem0.client.main.MemoryClient", "$lib": "posthog-python", "$lib_version": "3.5.0",
|
||||
"$geoip_disable": true}, "timestamp": "2024-08-17T06:00:11.526640+00:00", "context":
|
||||
{}, "distinct_id": "fd411bd3-99a2-42d6-acd7-9fca8ad09580", "event": "client.init"}],
|
||||
"historical_migration": false, "sentAt": "2024-08-17T06:00:11.701621+00:00",
|
||||
"api_key": "phc_hgJkUVJFYtmaJqrvf6CYN67TIQ8yhXAkWzUn9AMU4yX"}'
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '740'
|
||||
Content-Type:
|
||||
- application/json
|
||||
User-Agent:
|
||||
- posthog-python/3.5.0
|
||||
method: POST
|
||||
uri: https://us.i.posthog.com/batch/
|
||||
response:
|
||||
body:
|
||||
string: '{"status":"Ok"}'
|
||||
headers:
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '15'
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Sat, 17 Aug 2024 06:00:12 GMT
|
||||
access-control-allow-credentials:
|
||||
- 'true'
|
||||
server:
|
||||
- envoy
|
||||
vary:
|
||||
- origin, access-control-request-method, access-control-request-headers
|
||||
x-envoy-upstream-service-time:
|
||||
- '69'
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "user", "content": "Remember the following insights
|
||||
from Agent run: test value with provider"}], "metadata": {"task": "test_task_provider",
|
||||
"agent": "test_agent_provider"}, "app_id": "Researcher"}'
|
||||
headers:
|
||||
accept:
|
||||
- '*/*'
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '219'
|
||||
content-type:
|
||||
- application/json
|
||||
host:
|
||||
- api.mem0.ai
|
||||
user-agent:
|
||||
- python-httpx/0.27.0
|
||||
method: POST
|
||||
uri: https://api.mem0.ai/v1/memories/
|
||||
response:
|
||||
body:
|
||||
string: '{"message":"ok"}'
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8b477140282547b9-BOM
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '16'
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Sat, 17 Aug 2024 06:00:13 GMT
|
||||
NEL:
|
||||
- '{"success_fraction":0,"report_to":"cf-nel","max_age":604800}'
|
||||
Report-To:
|
||||
- '{"endpoints":[{"url":"https:\/\/a.nel.cloudflare.com\/report\/v4?s=FRjJKSk3YxVj03wA7S05H8ts35KnWfqS3wb6Rfy4kVZ4BgXfw7nJbm92wI6vEv5fWcAcHVnOlkJDggs11B01BMuB2k3a9RqlBi0dJNiMuk%2Bgm5xE%2BODMPWJctYNRwQMjNVbteUpS%2Fad8YA%3D%3D"}],"group":"cf-nel","max_age":604800}'
|
||||
Server:
|
||||
- cloudflare
|
||||
allow:
|
||||
- GET, POST, DELETE, OPTIONS
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
cross-origin-opener-policy:
|
||||
- same-origin
|
||||
referrer-policy:
|
||||
- same-origin
|
||||
vary:
|
||||
- Accept, origin, Cookie
|
||||
x-content-type-options:
|
||||
- nosniff
|
||||
x-frame-options:
|
||||
- DENY
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"query": "test value with provider", "limit": 3, "app_id": "Researcher"}'
|
||||
headers:
|
||||
accept:
|
||||
- '*/*'
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '73'
|
||||
content-type:
|
||||
- application/json
|
||||
host:
|
||||
- api.mem0.ai
|
||||
user-agent:
|
||||
- python-httpx/0.27.0
|
||||
method: POST
|
||||
uri: https://api.mem0.ai/v1/memories/search/
|
||||
response:
|
||||
body:
|
||||
string: '[]'
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 8b47714d083b47b9-BOM
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '2'
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Sat, 17 Aug 2024 06:00:14 GMT
|
||||
NEL:
|
||||
- '{"success_fraction":0,"report_to":"cf-nel","max_age":604800}'
|
||||
Report-To:
|
||||
- '{"endpoints":[{"url":"https:\/\/a.nel.cloudflare.com\/report\/v4?s=2DRWL1cdKdMvnE8vx1fPUGeTITOgSGl3N5g84PS6w30GRqpfz79BtSx6REhpnOiFV8kM6KGqln0iCZ5yoHc2jBVVJXhPJhQ5t0uerD9JFnkphjISrJOU1MJjZWneT9PlNABddxvVNCmluA%3D%3D"}],"group":"cf-nel","max_age":604800}'
|
||||
Server:
|
||||
- cloudflare
|
||||
allow:
|
||||
- POST, OPTIONS
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
cross-origin-opener-policy:
|
||||
- same-origin
|
||||
referrer-policy:
|
||||
- same-origin
|
||||
vary:
|
||||
- Accept, origin, Cookie
|
||||
x-content-type-options:
|
||||
- nosniff
|
||||
x-frame-options:
|
||||
- DENY
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"batch": [{"properties": {"python_version": "3.12.4 (v3.12.4:8e8a4baf65,
|
||||
Jun 6 2024, 17:33:18) [Clang 13.0.0 (clang-1300.0.29.30)]", "os": "darwin",
|
||||
"os_version": "Darwin Kernel Version 23.4.0: Wed Feb 21 21:44:54 PST 2024; root:xnu-10063.101.15~2/RELEASE_ARM64_T6030",
|
||||
"os_release": "23.4.0", "processor": "arm", "machine": "arm64", "function":
|
||||
"mem0.client.main.MemoryClient", "$lib": "posthog-python", "$lib_version": "3.5.0",
|
||||
"$geoip_disable": true}, "timestamp": "2024-08-17T06:00:13.593952+00:00", "context":
|
||||
{}, "distinct_id": "fd411bd3-99a2-42d6-acd7-9fca8ad09580", "event": "client.add"}],
|
||||
"historical_migration": false, "sentAt": "2024-08-17T06:00:13.858277+00:00",
|
||||
"api_key": "phc_hgJkUVJFYtmaJqrvf6CYN67TIQ8yhXAkWzUn9AMU4yX"}'
|
||||
headers:
|
||||
Accept:
|
||||
- '*/*'
|
||||
Accept-Encoding:
|
||||
- gzip, deflate
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '739'
|
||||
Content-Type:
|
||||
- application/json
|
||||
User-Agent:
|
||||
- posthog-python/3.5.0
|
||||
method: POST
|
||||
uri: https://us.i.posthog.com/batch/
|
||||
response:
|
||||
body:
|
||||
string: '{"status":"Ok"}'
|
||||
headers:
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Length:
|
||||
- '15'
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Sat, 17 Aug 2024 06:00:13 GMT
|
||||
access-control-allow-credentials:
|
||||
- 'true'
|
||||
server:
|
||||
- envoy
|
||||
vary:
|
||||
- origin, access-control-request-method, access-control-request-headers
|
||||
x-envoy-upstream-service-time:
|
||||
- '33'
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
version: 1
|
||||
@@ -38,6 +38,7 @@ def mock_crew_factory():

    crew = MockCrew()
    crew.name = name
    crew.knowledge = None

    task_output = TaskOutput(
        description="Test task", raw="Task output", agent="Test Agent"
@@ -67,6 +68,7 @@ def mock_crew_factory():
    crew.process = Process.sequential
    crew.config = None
    crew.cache = True
    crew.embedder = None

    # Add non-empty agents and tasks
    mock_agent = MagicMock(spec=Agent)

@@ -1,5 +1,8 @@
import pytest

from crewai.agent import Agent
from crewai.project import agent, task
from crewai.crew import Crew
from crewai.project import CrewBase, after_kickoff, agent, before_kickoff, crew, task
from crewai.task import Task


@@ -23,6 +26,43 @@ class SimpleCrew:
    )


@CrewBase
class TestCrew:
    agents_config = "config/agents.yaml"
    tasks_config = "config/tasks.yaml"

    @agent
    def researcher(self):
        return Agent(config=self.agents_config["researcher"])

    @agent
    def reporting_analyst(self):
        return Agent(config=self.agents_config["reporting_analyst"])

    @task
    def research_task(self):
        return Task(config=self.tasks_config["research_task"])

    @task
    def reporting_task(self):
        return Task(config=self.tasks_config["reporting_task"])

    @before_kickoff
    def modify_inputs(self, inputs):
        if inputs:
            inputs["topic"] = "Bicycles"
        return inputs

    @after_kickoff
    def modify_outputs(self, outputs):
        outputs.raw = outputs.raw + " post processed"
        return outputs

    @crew
    def crew(self):
        return Crew(agents=self.agents, tasks=self.tasks, verbose=True)


def test_agent_memoization():
    crew = SimpleCrew()
    first_call_result = crew.simple_agent()
@@ -43,6 +83,16 @@ def test_task_memoization():
    ), "Task memoization is not working as expected"


def test_crew_memoization():
    crew = TestCrew()
    first_call_result = crew.crew()
    second_call_result = crew.crew()

    assert (
        first_call_result is second_call_result
    ), "Crew references should point to the same object"


def test_task_name():
    simple_task = SimpleCrew().simple_task()
    assert (
@@ -53,3 +103,84 @@ def test_task_name():
    assert (
        custom_named_task.name == "Custom"
    ), "Custom task name is not being set as expected"


@pytest.mark.vcr(filter_headers=["authorization"])
def test_before_kickoff_modification():
    crew = TestCrew()
    inputs = {"topic": "LLMs"}
    result = crew.crew().kickoff(inputs=inputs)
    assert "bicycles" in result.raw, "Before kickoff function did not modify inputs"


@pytest.mark.vcr(filter_headers=["authorization"])
def test_after_kickoff_modification():
    crew = TestCrew()
    # Assuming the crew execution returns a dict
    result = crew.crew().kickoff({"topic": "LLMs"})

    assert (
        "post processed" in result.raw
    ), "After kickoff function did not modify outputs"


@pytest.mark.vcr(filter_headers=["authorization"])
def test_before_kickoff_with_none_input():
    crew = TestCrew()
    crew.crew().kickoff(None)
    # Test should pass without raising exceptions


@pytest.mark.vcr(filter_headers=["authorization"])
def test_multiple_before_after_kickoff():
    @CrewBase
    class MultipleHooksCrew:
        agents_config = "config/agents.yaml"
        tasks_config = "config/tasks.yaml"

        @agent
        def researcher(self):
            return Agent(config=self.agents_config["researcher"])

        @agent
        def reporting_analyst(self):
            return Agent(config=self.agents_config["reporting_analyst"])

        @task
        def research_task(self):
            return Task(config=self.tasks_config["research_task"])

        @task
        def reporting_task(self):
            return Task(config=self.tasks_config["reporting_task"])

        @before_kickoff
        def first_before(self, inputs):
            inputs["topic"] = "Bicycles"
            return inputs

        @before_kickoff
        def second_before(self, inputs):
            inputs["topic"] = "plants"
            return inputs

        @after_kickoff
        def first_after(self, outputs):
            outputs.raw = outputs.raw + " processed first"
            return outputs

        @after_kickoff
        def second_after(self, outputs):
            outputs.raw = outputs.raw + " processed second"
            return outputs

        @crew
        def crew(self):
            return Crew(agents=self.agents, tasks=self.tasks, verbose=True)

    crew = MultipleHooksCrew()
    result = crew.crew().kickoff({"topic": "LLMs"})

    assert "plants" in result.raw, "First before_kickoff not executed"
    assert "processed first" in result.raw, "First after_kickoff not executed"
    assert "processed second" in result.raw, "Second after_kickoff not executed"

526    uv.lock    generated
@@ -490,28 +490,32 @@ wheels = [
|
||||
|
||||
[[package]]
|
||||
name = "chroma-hnswlib"
|
||||
version = "0.7.3"
|
||||
version = "0.7.6"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "numpy" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/c0/59/1224cbae62c7b84c84088cdf6c106b9b2b893783c000d22c442a1672bc75/chroma-hnswlib-0.7.3.tar.gz", hash = "sha256:b6137bedde49fffda6af93b0297fe00429fc61e5a072b1ed9377f909ed95a932", size = 31876 }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/73/09/10d57569e399ce9cbc5eee2134996581c957f63a9addfa6ca657daf006b8/chroma_hnswlib-0.7.6.tar.gz", hash = "sha256:4dce282543039681160259d29fcde6151cc9106c6461e0485f57cdccd83059b7", size = 32256 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/1a/36/d1069ffa520efcf93f6d81b527e3c7311e12363742fdc786cbdaea3ab02e/chroma_hnswlib-0.7.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:59d6a7c6f863c67aeb23e79a64001d537060b6995c3eca9a06e349ff7b0998ca", size = 219588 },
|
||||
{ url = "https://files.pythonhosted.org/packages/c3/e8/263d331f5ce29367f6f8854cd7fa1f54fce72ab4f92ab957525ef9165a9c/chroma_hnswlib-0.7.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d71a3f4f232f537b6152947006bd32bc1629a8686df22fd97777b70f416c127a", size = 197094 },
|
||||
{ url = "https://files.pythonhosted.org/packages/a9/72/a9b61ae00d490c26359a8e10f3974c0d38065b894e6a2573ec6a7597f8e3/chroma_hnswlib-0.7.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1c92dc1ebe062188e53970ba13f6b07e0ae32e64c9770eb7f7ffa83f149d4210", size = 2315620 },
|
||||
{ url = "https://files.pythonhosted.org/packages/2f/48/f7609a3cb15a24c5d8ec18911ce10ac94144e9a89584f0a86bf9871b024c/chroma_hnswlib-0.7.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:49da700a6656fed8753f68d44b8cc8ae46efc99fc8a22a6d970dc1697f49b403", size = 2350956 },
|
||||
{ url = "https://files.pythonhosted.org/packages/cc/3d/ca311b8f79744db3f4faad8fd9140af80d34c94829d3ed1726c98cf4a611/chroma_hnswlib-0.7.3-cp310-cp310-win_amd64.whl", hash = "sha256:108bc4c293d819b56476d8f7865803cb03afd6ca128a2a04d678fffc139af029", size = 150598 },
|
||||
{ url = "https://files.pythonhosted.org/packages/94/3f/844393b0d2ea1072b7704d6eff5c595e05ae8b831b96340cdb76b2fe995c/chroma_hnswlib-0.7.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:11e7ca93fb8192214ac2b9c0943641ac0daf8f9d4591bb7b73be808a83835667", size = 221219 },
|
||||
{ url = "https://files.pythonhosted.org/packages/11/7a/673ccb9bb2faf9cf655d9040e970c02a96645966e06837fde7d10edf242a/chroma_hnswlib-0.7.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:6f552e4d23edc06cdeb553cdc757d2fe190cdeb10d43093d6a3319f8d4bf1c6b", size = 198652 },
|
||||
{ url = "https://files.pythonhosted.org/packages/ba/f4/c81a40da5473d5d80fc9d0c5bd5b1cb64e530a6ea941c69f195fe81c488c/chroma_hnswlib-0.7.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f96f4d5699e486eb1fb95849fe35ab79ab0901265805be7e60f4eaa83ce263ec", size = 2332260 },
|
||||
{ url = "https://files.pythonhosted.org/packages/48/0e/068b658a547d6090b969014146321e28dae1411da54b76d081e51a2af22b/chroma_hnswlib-0.7.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:368e57fe9ebae05ee5844840fa588028a023d1182b0cfdb1d13f607c9ea05756", size = 2367211 },
|
||||
{ url = "https://files.pythonhosted.org/packages/d2/32/a91850c7aa8a34f61838913155103808fe90da6f1ea4302731b59e9ba6f2/chroma_hnswlib-0.7.3-cp311-cp311-win_amd64.whl", hash = "sha256:b7dca27b8896b494456db0fd705b689ac6b73af78e186eb6a42fea2de4f71c6f", size = 151574 },
|
||||
{ url = "https://files.pythonhosted.org/packages/a8/74/b9dde05ea8685d2f8c4681b517e61c7887e974f6272bb24ebc8f2105875b/chroma_hnswlib-0.7.6-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:f35192fbbeadc8c0633f0a69c3d3e9f1a4eab3a46b65458bbcbcabdd9e895c36", size = 195821 },
|
||||
{ url = "https://files.pythonhosted.org/packages/fd/58/101bfa6bc41bc6cc55fbb5103c75462a7bf882e1704256eb4934df85b6a8/chroma_hnswlib-0.7.6-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:6f007b608c96362b8f0c8b6b2ac94f67f83fcbabd857c378ae82007ec92f4d82", size = 183854 },
|
||||
{ url = "https://files.pythonhosted.org/packages/17/ff/95d49bb5ce134f10d6aa08d5f3bec624eaff945f0b17d8c3fce888b9a54a/chroma_hnswlib-0.7.6-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:456fd88fa0d14e6b385358515aef69fc89b3c2191706fd9aee62087b62aad09c", size = 2358774 },
|
||||
{ url = "https://files.pythonhosted.org/packages/3a/6d/27826180a54df80dbba8a4f338b022ba21c0c8af96fd08ff8510626dee8f/chroma_hnswlib-0.7.6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5dfaae825499c2beaa3b75a12d7ec713b64226df72a5c4097203e3ed532680da", size = 2392739 },
|
||||
{ url = "https://files.pythonhosted.org/packages/d6/63/ee3e8b7a8f931918755faacf783093b61f32f59042769d9db615999c3de0/chroma_hnswlib-0.7.6-cp310-cp310-win_amd64.whl", hash = "sha256:2487201982241fb1581be26524145092c95902cb09fc2646ccfbc407de3328ec", size = 150955 },
|
||||
{ url = "https://files.pythonhosted.org/packages/f5/af/d15fdfed2a204c0f9467ad35084fbac894c755820b203e62f5dcba2d41f1/chroma_hnswlib-0.7.6-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:81181d54a2b1e4727369486a631f977ffc53c5533d26e3d366dda243fb0998ca", size = 196911 },
|
||||
{ url = "https://files.pythonhosted.org/packages/0d/19/aa6f2139f1ff7ad23a690ebf2a511b2594ab359915d7979f76f3213e46c4/chroma_hnswlib-0.7.6-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:4b4ab4e11f1083dd0a11ee4f0e0b183ca9f0f2ed63ededba1935b13ce2b3606f", size = 185000 },
|
||||
{ url = "https://files.pythonhosted.org/packages/79/b1/1b269c750e985ec7d40b9bbe7d66d0a890e420525187786718e7f6b07913/chroma_hnswlib-0.7.6-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:53db45cd9173d95b4b0bdccb4dbff4c54a42b51420599c32267f3abbeb795170", size = 2377289 },
|
||||
{ url = "https://files.pythonhosted.org/packages/c7/2d/d5663e134436e5933bc63516a20b5edc08b4c1b1588b9680908a5f1afd04/chroma_hnswlib-0.7.6-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5c093f07a010b499c00a15bc9376036ee4800d335360570b14f7fe92badcdcf9", size = 2411755 },
|
||||
{ url = "https://files.pythonhosted.org/packages/3e/79/1bce519cf186112d6d5ce2985392a89528c6e1e9332d680bf752694a4cdf/chroma_hnswlib-0.7.6-cp311-cp311-win_amd64.whl", hash = "sha256:0540b0ac96e47d0aa39e88ea4714358ae05d64bbe6bf33c52f316c664190a6a3", size = 151888 },
|
||||
{ url = "https://files.pythonhosted.org/packages/93/ac/782b8d72de1c57b64fdf5cb94711540db99a92768d93d973174c62d45eb8/chroma_hnswlib-0.7.6-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:e87e9b616c281bfbe748d01705817c71211613c3b063021f7ed5e47173556cb7", size = 197804 },
|
||||
{ url = "https://files.pythonhosted.org/packages/32/4e/fd9ce0764228e9a98f6ff46af05e92804090b5557035968c5b4198bc7af9/chroma_hnswlib-0.7.6-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ec5ca25bc7b66d2ecbf14502b5729cde25f70945d22f2aaf523c2d747ea68912", size = 185421 },
|
||||
{ url = "https://files.pythonhosted.org/packages/d9/3d/b59a8dedebd82545d873235ef2d06f95be244dfece7ee4a1a6044f080b18/chroma_hnswlib-0.7.6-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:305ae491de9d5f3c51e8bd52d84fdf2545a4a2bc7af49765cda286b7bb30b1d4", size = 2389672 },
|
||||
{ url = "https://files.pythonhosted.org/packages/74/1e/80a033ea4466338824974a34f418e7b034a7748bf906f56466f5caa434b0/chroma_hnswlib-0.7.6-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:822ede968d25a2c88823ca078a58f92c9b5c4142e38c7c8b4c48178894a0a3c5", size = 2436986 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "chromadb"
|
||||
version = "0.4.24"
|
||||
version = "0.5.18"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "bcrypt" },
|
||||
@@ -519,6 +523,7 @@ dependencies = [
|
||||
{ name = "chroma-hnswlib" },
|
||||
{ name = "fastapi" },
|
||||
{ name = "grpcio" },
|
||||
{ name = "httpx" },
|
||||
{ name = "importlib-resources" },
|
||||
{ name = "kubernetes" },
|
||||
{ name = "mmh3" },
|
||||
@@ -531,11 +536,10 @@ dependencies = [
|
||||
{ name = "orjson" },
|
||||
{ name = "overrides" },
|
||||
{ name = "posthog" },
|
||||
{ name = "pulsar-client" },
|
||||
{ name = "pydantic" },
|
||||
{ name = "pypika" },
|
||||
{ name = "pyyaml" },
|
||||
{ name = "requests" },
|
||||
{ name = "rich" },
|
||||
{ name = "tenacity" },
|
||||
{ name = "tokenizers" },
|
||||
{ name = "tqdm" },
|
||||
@@ -543,9 +547,9 @@ dependencies = [
|
||||
{ name = "typing-extensions" },
|
||||
{ name = "uvicorn", extra = ["standard"] },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/47/6b/a5465827d8017b658d18ad1e63d2dc31109dec717c6bd068e82485186f4b/chromadb-0.4.24.tar.gz", hash = "sha256:a5c80b4e4ad9b236ed2d4899a5b9e8002b489293f2881cb2cadab5b199ee1c72", size = 13667084 }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/15/95/d1a3f14c864e37d009606b82bd837090902b5e5a8e892fcab07eeaec0438/chromadb-0.5.18.tar.gz", hash = "sha256:cfbb3e5aeeb1dd532b47d80ed9185e8a9886c09af41c8e6123edf94395d76aec", size = 33620708 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/cc/63/b7d76109331318423f9cfb89bd89c99e19f5d0b47a5105439a629224d297/chromadb-0.4.24-py3-none-any.whl", hash = "sha256:3a08e237a4ad28b5d176685bd22429a03717fe09d35022fb230d516108da01da", size = 525452 },
|
||||
{ url = "https://files.pythonhosted.org/packages/82/85/4d2f8b9202153105ad4514ae09e9fe6f3b353a45e44e0ef7eca03dd8b9dc/chromadb-0.5.18-py3-none-any.whl", hash = "sha256:9dd3827b5e04b4ff0a5ea0df28a78bac88a09f45be37fcd7fe20f879b57c43cf", size = 615499 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
@@ -604,7 +608,7 @@ wheels = [
|
||||
|
||||
[[package]]
|
||||
name = "crewai"
|
||||
version = "0.76.9"
|
||||
version = "0.80.0"
|
||||
source = { editable = "." }
|
||||
dependencies = [
|
||||
{ name = "appdirs" },
|
||||
@@ -634,6 +638,21 @@ dependencies = [
|
||||
agentops = [
|
||||
{ name = "agentops" },
|
||||
]
|
||||
fastembed = [
|
||||
{ name = "fastembed" },
|
||||
]
|
||||
openpyxl = [
|
||||
{ name = "openpyxl" },
|
||||
]
|
||||
pandas = [
|
||||
{ name = "pandas" },
|
||||
]
|
||||
pdfplumber = [
|
||||
{ name = "pdfplumber" },
|
||||
]
|
||||
mem0 = [
|
||||
{ name = "mem0ai" },
|
||||
]
|
||||
tools = [
|
||||
{ name = "crewai-tools" },
|
||||
]
|
||||
@@ -663,19 +682,24 @@ requires-dist = [
|
||||
{ name = "agentops", marker = "extra == 'agentops'", specifier = ">=0.3.0" },
|
||||
{ name = "appdirs", specifier = ">=1.4.4" },
|
||||
{ name = "auth0-python", specifier = ">=4.7.1" },
|
||||
{ name = "chromadb", specifier = ">=0.4.24" },
|
||||
{ name = "chromadb", specifier = ">=0.5.18" },
|
||||
{ name = "click", specifier = ">=8.1.7" },
|
||||
{ name = "crewai-tools", specifier = ">=0.13.4" },
|
||||
{ name = "crewai-tools", marker = "extra == 'tools'", specifier = ">=0.13.4" },
|
||||
{ name = "crewai-tools", specifier = ">=0.14.0" },
|
||||
{ name = "crewai-tools", marker = "extra == 'tools'", specifier = ">=0.14.0" },
|
||||
{ name = "fastembed", marker = "extra == 'fastembed'", specifier = ">=0.4.1" },
|
||||
{ name = "instructor", specifier = ">=1.3.3" },
|
||||
{ name = "json-repair", specifier = ">=0.25.2" },
|
||||
{ name = "jsonref", specifier = ">=1.1.0" },
|
||||
{ name = "langchain", specifier = ">=0.2.16" },
|
||||
{ name = "litellm", specifier = ">=1.44.22" },
|
||||
{ name = "mem0ai", marker = "extra == 'mem0'", specifier = ">=0.1.29" },
|
||||
{ name = "openai", specifier = ">=1.13.3" },
|
||||
{ name = "openpyxl", marker = "extra == 'openpyxl'", specifier = ">=3.1.5" },
|
||||
{ name = "opentelemetry-api", specifier = ">=1.22.0" },
|
||||
{ name = "opentelemetry-exporter-otlp-proto-http", specifier = ">=1.22.0" },
|
||||
{ name = "opentelemetry-sdk", specifier = ">=1.22.0" },
|
||||
{ name = "pandas", marker = "extra == 'pandas'", specifier = ">=2.2.3" },
|
||||
{ name = "pdfplumber", marker = "extra == 'pdfplumber'", specifier = ">=0.11.4" },
|
||||
{ name = "pydantic", specifier = ">=2.4.2" },
|
||||
{ name = "python-dotenv", specifier = ">=1.0.0" },
|
||||
{ name = "pyvis", specifier = ">=0.3.2" },
|
||||
@@ -688,7 +712,7 @@ requires-dist = [
[package.metadata.requires-dev]
dev = [
{ name = "cairosvg", specifier = ">=2.7.1" },
{ name = "crewai-tools", specifier = ">=0.13.4" },
{ name = "crewai-tools", specifier = ">=0.14.0" },
{ name = "mkdocs", specifier = ">=1.4.3" },
{ name = "mkdocs-material", specifier = ">=9.5.7" },
{ name = "mkdocs-material-extensions", specifier = ">=1.3.1" },
@@ -707,7 +731,7 @@ dev = [

[[package]]
name = "crewai-tools"
version = "0.13.4"
version = "0.14.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "beautifulsoup4" },
@@ -725,9 +749,9 @@ dependencies = [
{ name = "requests" },
{ name = "selenium" },
]
sdist = { url = "https://files.pythonhosted.org/packages/64/bd/eff7b633a0b28ff4ed115adde1499e3dcc683e4f0b5c378a4c6f5c0c1bf6/crewai_tools-0.13.4.tar.gz", hash = "sha256:b6ac527633b7018471d892c21ac96bc961a86b6626d996b1ed7d53cd481d4505", size = 816588 }
sdist = { url = "https://files.pythonhosted.org/packages/9b/6d/4fa91b481b120f83bb58f365203d8aa8564e8ced1035d79f8aedb7d71e2f/crewai_tools-0.14.0.tar.gz", hash = "sha256:510f3a194bcda4fdae4314bd775521964b5f229ddbe451e5d9e0216cae57f4e3", size = 815892 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/6c/40/93cd347d854059cf5e54a81b70f896deea7ad1f03e9c024549eb323c4da5/crewai_tools-0.13.4-py3-none-any.whl", hash = "sha256:eda78fe3c4df57676259d8dd6b2610fa31f89b90909512f15893adb57fb9e825", size = 463703 },
{ url = "https://files.pythonhosted.org/packages/c8/ed/9f4e64e1507062957b0118085332d38b621c1000874baef2d1c4069bfd97/crewai_tools-0.14.0-py3-none-any.whl", hash = "sha256:0a804a828c29869c3af3253f4fc4c3967a3f80f06dab22e9bbe9526608a31564", size = 462980 },
]

[[package]]
@@ -889,7 +913,7 @@ wheels = [

[[package]]
name = "embedchain"
version = "0.1.123"
version = "0.1.125"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "alembic" },
@@ -914,11 +938,21 @@ dependencies = [
{ name = "sqlalchemy" },
{ name = "tiktoken" },
]
sdist = { url = "https://files.pythonhosted.org/packages/5d/6a/955b5a72fa6727db203c4d46ae0e30ac47f4f50389f663cd5ea157b0d819/embedchain-0.1.123.tar.gz", hash = "sha256:aecaf81c21de05b5cdb649b6cde95ef68ffa759c69c54f6ff2eaa667f2ad0580", size = 124797 }
sdist = { url = "https://files.pythonhosted.org/packages/6c/ea/eedb6016719f94fe4bd4c5aa44cc5463d85494bbd0864cc465e4317d4987/embedchain-0.1.125.tar.gz", hash = "sha256:15a6d368b48ba33feb93b237caa54f6e9078537c02a49c1373e59cc32627a138", size = 125176 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a7/51/0c78d26da4afbe68370306669556b274f1021cac02f3155d8da2be407763/embedchain-0.1.123-py3-none-any.whl", hash = "sha256:1210e993b6364d7c702b6bd44b053fc244dd77f2a65ea4b90b62709114ea6c25", size = 210909 },
{ url = "https://files.pythonhosted.org/packages/52/82/3d0355c22bc68cfbb8fbcf670da4c01b31bd7eb516974a08cf7533e89887/embedchain-0.1.125-py3-none-any.whl", hash = "sha256:f87b49732dc192c6b61221830f29e59cf2aff26d8f5d69df81f6f6cf482715c2", size = 211367 },
]

[[package]]
name = "et-xmlfile"
version = "2.0.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/d3/38/af70d7ab1ae9d4da450eeec1fa3918940a5fafb9055e934af8d6eb0c2313/et_xmlfile-2.0.0.tar.gz", hash = "sha256:dab3f4764309081ce75662649be815c4c9081e88f0837825f90fd28317d4da54", size = 17234 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c1/8b/5fe2cc11fee489817272089c4203e679c63b570a5aaeb18d852ae3cbba6a/et_xmlfile-2.0.0-py3-none-any.whl", hash = "sha256:7a91720bc756843502c3b7504c77b8fe44217c85c537d85037f0f536151b2caa", size = 18059 },
]


[[package]]
name = "exceptiongroup"
version = "1.2.2"
@@ -977,6 +1011,28 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/c8/0c/92b468e4649e61eaa2d93a92e19a5b57a0f6cecaa236c53a76f3f72a4696/fastavro-1.9.7-cp312-cp312-win_amd64.whl", hash = "sha256:b6b2ccdc78f6afc18c52e403ee68c00478da12142815c1bd8a00973138a166d0", size = 487778 },
]

[[package]]
name = "fastembed"
version = "0.4.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "huggingface-hub" },
{ name = "loguru" },
{ name = "mmh3" },
{ name = "numpy" },
{ name = "onnx" },
{ name = "onnxruntime" },
{ name = "pillow" },
{ name = "py-rust-stemmers" },
{ name = "requests" },
{ name = "tokenizers" },
{ name = "tqdm" },
]
sdist = { url = "https://files.pythonhosted.org/packages/0c/75/0883d15b54016fa99a32cc333182bf862025bf0983daac417a1beabb53bf/fastembed-0.4.1.tar.gz", hash = "sha256:d5dcfffc3554dca48caf16eec35e38c20544c58e396a5d215f238d40c8442718", size = 40461 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/19/ae/1303f005be08ff31686a30421121680b864cc6d82f7cd82cee74a6ff9416/fastembed-0.4.1-py3-none-any.whl", hash = "sha256:f75f02468aafa8de474844f9fbaa89683a3dcfd76521fa83cfc3efc885db61f3", size = 65123 },
]

[[package]]
name = "filelock"
version = "3.16.1"
@@ -2034,6 +2090,19 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/22/f3/89a4d65d1b9286eb5ac6a6e92dd93523d92f3142a832e60c00d5cad64176/litellm-1.50.2-py3-none-any.whl", hash = "sha256:99cac60c78037946ab809b7cfbbadad53507bb2db8ae39391b4be215a0869fdd", size = 6318265 },
]

[[package]]
name = "loguru"
version = "0.7.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "colorama", marker = "sys_platform == 'win32'" },
{ name = "win32-setctime", marker = "sys_platform == 'win32'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/9e/30/d87a423766b24db416a46e9335b9602b054a72b96a88a241f2b09b560fa8/loguru-0.7.2.tar.gz", hash = "sha256:e671a53522515f34fd406340ee968cb9ecafbc4b36c679da03c18fd8d0bd51ac", size = 145103 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/03/0a/4f6fed21aa246c6b49b561ca55facacc2a44b87d65b8b92362a8e99ba202/loguru-0.7.2-py3-none-any.whl", hash = "sha256:003d71e3d3ed35f0f8984898359d65b79e5b21943f78af86aa5491210429b8eb", size = 62549 },
]

[[package]]
name = "mako"
version = "1.3.6"
@@ -2160,7 +2229,7 @@ wheels = [

[[package]]
name = "mem0ai"
version = "0.1.22"
version = "0.1.29"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "openai" },
@@ -2170,9 +2239,9 @@ dependencies = [
{ name = "qdrant-client" },
{ name = "sqlalchemy" },
]
sdist = { url = "https://files.pythonhosted.org/packages/7f/b4/64c6f7d9684bd1f9b46d251abfc7d5b2cc8371d70f1f9eec097f9872c719/mem0ai-0.1.22.tar.gz", hash = "sha256:d01aa028763719bd0ede2de4602121a7c3bf023f46112cd50cc9169140e11be2", size = 53117 }
sdist = { url = "https://files.pythonhosted.org/packages/a9/bf/152718f9da3844dd24d4c45850b2e719798b5ce9389adf4ec873ee8905ca/mem0ai-0.1.29.tar.gz", hash = "sha256:42adefb7a9b241be03fbcabadf5328abf91b4ac390bc97e5966e55e3cac192c5", size = 55201 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/2b/27/3ef75abb28bf8b46c2cc34730f6be733ef2584652474216215019ee036a2/mem0ai-0.1.22-py3-none-any.whl", hash = "sha256:c783e15131c16a0d91e44e30195c1eeae9c36468de40006d5e42cf4516059855", size = 75695 },
{ url = "https://files.pythonhosted.org/packages/65/9b/755be84f669415b3b513cfd935e768c4c84ac5c1ab6ff6ac2dab990a261a/mem0ai-0.1.29-py3-none-any.whl", hash = "sha256:07bbfd4238d0d7da65d5e4cf75a217eeb5b2829834e399074b05bb046730a57f", size = 79558 },
]

[[package]]
@@ -2302,74 +2371,58 @@ wheels = [

[[package]]
name = "mmh3"
version = "5.0.1"
version = "4.1.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/e2/08/04ad6419f072ea3f51f9a0f429dd30f5f0a0b02ead7ca11a831117b6f9e8/mmh3-5.0.1.tar.gz", hash = "sha256:7dab080061aeb31a6069a181f27c473a1f67933854e36a3464931f2716508896", size = 32008 }
sdist = { url = "https://files.pythonhosted.org/packages/63/96/aa247e82878b123468f0079ce2ac77e948315bab91ce45d2934a62e0af95/mmh3-4.1.0.tar.gz", hash = "sha256:a1cf25348b9acd229dda464a094d6170f47d2850a1fcb762a3b6172d2ce6ca4a", size = 26357 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/fa/b9/9a91b0a0e330557cdbf51fc43ca0ba306633f2ec6d2b15e871e288592a32/mmh3-5.0.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:f0a4b4bf05778ed77d820d6e7d0e9bd6beb0c01af10e1ce9233f5d2f814fcafa", size = 52867 },
|
||||
{ url = "https://files.pythonhosted.org/packages/da/28/6b37f0d6707872764e1af49f327b0940b6a3ad995d91b3839b90ba35f559/mmh3-5.0.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:ac7a391039aeab95810c2d020b69a94eb6b4b37d4e2374831e92db3a0cdf71c6", size = 38352 },
|
||||
{ url = "https://files.pythonhosted.org/packages/76/84/a98f59a620b522f218876a0630b02fc345ecf078f6393595756ddb3aa0b5/mmh3-5.0.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:3a2583b5521ca49756d8d8bceba80627a9cc295f255dcab4e3df7ccc2f09679a", size = 38214 },
|
||||
{ url = "https://files.pythonhosted.org/packages/35/cb/4980c7eb6cd31f49d1913a4066562bc9e0af28526750f1232be9688a9cd4/mmh3-5.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:081a8423fe53c1ac94f87165f3e4c500125d343410c1a0c5f1703e898a3ef038", size = 93502 },
|
||||
{ url = "https://files.pythonhosted.org/packages/65/f3/29726296fadeaf06134a6978f7c453dfa562cf2f0f1faf9ae28b9b8ef76e/mmh3-5.0.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b8b4d72713799755dc8954a7d36d5c20a6c8de7b233c82404d122c7c7c1707cc", size = 98394 },
|
||||
{ url = "https://files.pythonhosted.org/packages/35/fd/e181f4f4b250f7b63ee27a7d65e5e290a3ea0e26cc633f4bfd906f04558b/mmh3-5.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:389a6fd51efc76d3182d36ec306448559c1244f11227d2bb771bdd0e6cc91321", size = 98052 },
|
||||
{ url = "https://files.pythonhosted.org/packages/61/5c/8a5d838da3eb3fb91035ef5eaaea469abab4e8e3fae55607c27a1a07d162/mmh3-5.0.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:39f4128edaa074bff721b1d31a72508cba4d2887ee7867f22082e1fe9d4edea0", size = 86320 },
|
||||
{ url = "https://files.pythonhosted.org/packages/10/80/3f33a8f4de12cea322607da1a84d001513affb741b3c3cc1277ecb85d34b/mmh3-5.0.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1d5d23a94d91aabba3386b3769048d5f4210fdfef80393fece2f34ba5a7b466c", size = 93232 },
|
||||
{ url = "https://files.pythonhosted.org/packages/9e/1c/d0ce5f498493be4de2e7e7596e1cbf63315a4c0bb8bb94e3c37c4fad965d/mmh3-5.0.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:16347d038361f8b8f24fd2b7ef378c9b68ddee9f7706e46269b6e0d322814713", size = 93590 },
|
||||
{ url = "https://files.pythonhosted.org/packages/d9/66/770b5ad35b5a2eb7965f3fcaeaa76148e59543575d2e27b80690c1b0795c/mmh3-5.0.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:6e299408565af7d61f2d20a5ffdd77cf2ed902460fe4e6726839d59ba4b72316", size = 88433 },
|
||||
{ url = "https://files.pythonhosted.org/packages/14/58/e0d258b18749d8640233976493716a40aa27352dcb1cea941836357dac24/mmh3-5.0.1-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:42050af21ddfc5445ee5a66e73a8fc758c71790305e3ee9e4a85a8e69e810f94", size = 99339 },
|
||||
{ url = "https://files.pythonhosted.org/packages/38/26/7267146122deb584cf377975b994d80c6d72c4c8d0e8eedff4d0cc5cd4c8/mmh3-5.0.1-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:2ae9b1f5ef27ec54659920f0404b7ceb39966e28867c461bfe83a05e8d18ddb0", size = 93944 },
|
||||
{ url = "https://files.pythonhosted.org/packages/8d/6b/df60b14a2dd383d8848f6f35496c86c7003be3ffb236789e98d002c542c6/mmh3-5.0.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:50c2495a02045f3047d71d4ae9cdd7a15efc0bcbb7ff17a18346834a8e2d1d19", size = 92798 },
|
||||
{ url = "https://files.pythonhosted.org/packages/0a/3f/d5fecf13915163a15b449e5cc89232a4df90e836ecad1c38121318119d27/mmh3-5.0.1-cp310-cp310-win32.whl", hash = "sha256:c028fa77cddf351ca13b4a56d43c1775652cde0764cadb39120b68f02a23ecf6", size = 39185 },
|
||||
{ url = "https://files.pythonhosted.org/packages/74/8e/4bb5ade332a87de633cda21dae09d6002d69601f2b93e9f40302ab2d9acf/mmh3-5.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:c5e741e421ec14400c4aae30890515c201f518403bdef29ae1e00d375bb4bbb5", size = 39766 },
|
||||
{ url = "https://files.pythonhosted.org/packages/16/2b/cd5cfa4d7ad40a37655af491f9270909d63fc27bcf0558ec36000ee5347f/mmh3-5.0.1-cp310-cp310-win_arm64.whl", hash = "sha256:b17156d56fabc73dbf41bca677ceb6faed435cc8544f6566d72ea77d8a17e9d0", size = 36540 },
|
||||
{ url = "https://files.pythonhosted.org/packages/fb/8a/f3b9cf8b7110fef0f130158d7602af6f5b09f2cf568130814b7c92e2507b/mmh3-5.0.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9a6d5a9b1b923f1643559ba1fc0bf7a5076c90cbb558878d3bf3641ce458f25d", size = 52867 },
|
||||
{ url = "https://files.pythonhosted.org/packages/bf/06/f466e0da3c5bd6fbb1e047f70fd4e9e9563d0268aa56de511f363478dbf2/mmh3-5.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3349b968be555f7334bbcce839da98f50e1e80b1c615d8e2aa847ea4a964a012", size = 38349 },
|
||||
{ url = "https://files.pythonhosted.org/packages/13/f0/2d3daca276a4673f82af859e4b0b18befd4e6e54f1017ba48ea9735b2f1b/mmh3-5.0.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1bd3c94b110e55db02ab9b605029f48a2f7f677c6e58c09d44e42402d438b7e1", size = 38211 },
|
||||
{ url = "https://files.pythonhosted.org/packages/e3/56/a2d203ca97702d4e045ac1a46a608393da1a1dddb24f81de664dae940518/mmh3-5.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d47ba84d48608f79adbb10bb09986b6dc33eeda5c2d1bd75d00820081b73bde9", size = 95104 },
|
||||
{ url = "https://files.pythonhosted.org/packages/ec/45/c7c8ae64e3ae024776a0ce5377c16c6741a3359f3e9505fc35fc5012beb2/mmh3-5.0.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c0217987a8b8525c8d9170f66d036dec4ab45cfbd53d47e8d76125791ceb155e", size = 100049 },
|
||||
{ url = "https://files.pythonhosted.org/packages/d5/74/681113776fe406c09870ab2152ffbd214a15bbc8f1d1da9ad73ce594b878/mmh3-5.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2797063a34e78d1b61639a98b0edec1c856fa86ab80c7ec859f1796d10ba429", size = 99671 },
|
||||
{ url = "https://files.pythonhosted.org/packages/bf/4f/dbb8be18ce9b6ff8df14bc14348c0404b3091fb51df9c673ebfcf5877db3/mmh3-5.0.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8bba16340adcbd47853a2fbe5afdb397549e8f2e79324ff1dced69a3f8afe7c3", size = 87549 },
|
||||
{ url = "https://files.pythonhosted.org/packages/5f/82/274d646f3f604c35b7e3d4eb7f3ff08b3bdc6a2c87d797709bb6f084a611/mmh3-5.0.1-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:282797957c9f60b51b9d768a602c25f579420cc9af46feb77d457a27823d270a", size = 94780 },
|
||||
{ url = "https://files.pythonhosted.org/packages/c9/a1/f094ca8b8fb5e2ac53201070bda42b0fee80ceb92c153eb99a1453e3aed3/mmh3-5.0.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:e4fb670c29e63f954f9e7a2cdcd57b36a854c2538f579ef62681ccbaa1de2b69", size = 90430 },
|
||||
{ url = "https://files.pythonhosted.org/packages/d9/23/4732ba68c6ef7242b69bb53b9e1bcb2ef065d68ed85fd26e829fb911ab5a/mmh3-5.0.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:8ee7d85438dc6aff328e19ab052086a3c29e8a9b632998a49e5c4b0034e9e8d6", size = 89451 },
|
||||
{ url = "https://files.pythonhosted.org/packages/3c/c5/daea5d534fcf20b2399c2a7b1cd00a8d29d4d474247c15c2c94548a1a272/mmh3-5.0.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:b7fb5db231f3092444bc13901e6a8d299667126b00636ffbad4a7b45e1051e2f", size = 94703 },
|
||||
{ url = "https://files.pythonhosted.org/packages/5e/4a/34d5691e7be7c63c34181387bc69bdcc0005ca93c8b562d68cb5775e0e78/mmh3-5.0.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:c100dd441703da5ec136b1d9003ed4a041d8a1136234c9acd887499796df6ad8", size = 91054 },
|
||||
{ url = "https://files.pythonhosted.org/packages/5c/3a/ab31bb5e9e1a19a4a997593cbe6ce56710308218ff36c7f76d40ff9c8d2e/mmh3-5.0.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:71f3b765138260fd7a7a2dba0ea5727dabcd18c1f80323c9cfef97a7e86e01d0", size = 89571 },
|
||||
{ url = "https://files.pythonhosted.org/packages/0b/79/b986bb067dbfcba6879afe6e723aad1bd53f223450532dd9a4606d0af389/mmh3-5.0.1-cp311-cp311-win32.whl", hash = "sha256:9a76518336247fd17689ce3ae5b16883fd86a490947d46a0193d47fb913e26e3", size = 39187 },
|
||||
{ url = "https://files.pythonhosted.org/packages/48/69/97029eda3df0f84edde16a496a2e71bac508fc5d1f0a31e163da071e2670/mmh3-5.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:336bc4df2e44271f1c302d289cc3d78bd52d3eed8d306c7e4bff8361a12bf148", size = 39766 },
|
||||
{ url = "https://files.pythonhosted.org/packages/c7/51/538f2b8412303281d8ce2a9a5c4ea84ff81f06de98af0b7c72059727a3bb/mmh3-5.0.1-cp311-cp311-win_arm64.whl", hash = "sha256:af6522722fbbc5999aa66f7244d0986767a46f1fb05accc5200f75b72428a508", size = 36540 },
|
||||
{ url = "https://files.pythonhosted.org/packages/75/c7/5b52d0882e7c0dccfaf8786a648e2b26c5307c594abe5cbe98c092607c97/mmh3-5.0.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:f2730bb263ed9c388e8860438b057a53e3cc701134a6ea140f90443c4c11aa40", size = 52907 },
|
||||
{ url = "https://files.pythonhosted.org/packages/01/b5/9609fa353c27188292748db033323c206f3fc6fbfa124bccf6a42af0da08/mmh3-5.0.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:6246927bc293f6d56724536400b85fb85f5be26101fa77d5f97dd5e2a4c69bf2", size = 38389 },
|
||||
{ url = "https://files.pythonhosted.org/packages/33/99/49bf3c86244857b3b250c2f54aff22a5a78ef12258af556fa39bb1e80699/mmh3-5.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:fbca322519a6e6e25b6abf43e940e1667cf8ea12510e07fb4919b48a0cd1c411", size = 38204 },
|
||||
{ url = "https://files.pythonhosted.org/packages/f8/04/8860cab35b48aaefe40cf88344437e79ddc93cf7ff745dacd1cd56a2be1e/mmh3-5.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eae8c19903ed8a1724ad9e67e86f15d198a7a1271a4f9be83d47e38f312ed672", size = 95091 },
|
||||
{ url = "https://files.pythonhosted.org/packages/fa/e9/4ac56001a5bab6d26aa3dfabeddea6d7f037fd2972c76803259f51a5af75/mmh3-5.0.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a09fd6cc72c07c0c07c3357714234b646d78052487c4a3bd5f7f6e08408cff60", size = 100055 },
|
||||
{ url = "https://files.pythonhosted.org/packages/18/e8/7d5fd73f559c423ed5b72f940130c27803a406ee0ffc32ef5422f733df67/mmh3-5.0.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2ff8551fee7ae3b11c5d986b6347ade0dccaadd4670ffdb2b944dee120ffcc84", size = 99764 },
|
||||
{ url = "https://files.pythonhosted.org/packages/54/d8/c0d89da6c729feec997a9b3b68698894cef12359ade0da95eba9e03b1d5d/mmh3-5.0.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e39694c73a5a20c8bf36dfd8676ed351e5234d55751ba4f7562d85449b21ef3f", size = 87650 },
|
||||
{ url = "https://files.pythonhosted.org/packages/dd/41/ec0ee3fd5124c83cb767dcea8569bb326f8981cc88c991e3e4e948a31e24/mmh3-5.0.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eba6001989a92f72a89c7cf382fda831678bd780707a66b4f8ca90239fdf2123", size = 94976 },
|
||||
{ url = "https://files.pythonhosted.org/packages/8e/fa/e8059199fe6fbb2fd6494302904cb1209b2f8b6899d58059858a280e89a5/mmh3-5.0.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:0771f90c9911811cc606a5c7b7b58f33501c9ee896ed68a6ac22c7d55878ecc0", size = 90485 },
|
||||
{ url = "https://files.pythonhosted.org/packages/3a/a0/eb9da5f93dea3f44b8e970f013279d1543ab210ccf63bb030830968682aa/mmh3-5.0.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:09b31ed0c0c0920363e96641fac4efde65b1ab62b8df86293142f35a254e72b4", size = 89554 },
|
||||
{ url = "https://files.pythonhosted.org/packages/e7/e8/5803181eac4e015b4caf307af22fea74292dca48e580d93afe402dcdc138/mmh3-5.0.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:5cf4a8deda0235312db12075331cb417c4ba163770edfe789bde71d08a24b692", size = 94872 },
|
||||
{ url = "https://files.pythonhosted.org/packages/ed/f9/4d55063f9dcaed41524f078a85989efdf1d335159af5e70af29942ebae67/mmh3-5.0.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:41f7090a95185ef20ac018581a99337f0cbc84a2135171ee3290a9c0d9519585", size = 91326 },
|
||||
{ url = "https://files.pythonhosted.org/packages/80/75/0a5acab5291480acd939db80e94448ac937fc7fbfddc0a67b3e721ebfc9c/mmh3-5.0.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b97b5b368fb7ff22194ec5854f5b12d8de9ab67a0f304728c7f16e5d12135b76", size = 89810 },
|
||||
{ url = "https://files.pythonhosted.org/packages/9b/fd/eb1a3573cda74d4c2381d10ded62c128e869954ced1881c15e2bcd97a48f/mmh3-5.0.1-cp312-cp312-win32.whl", hash = "sha256:842516acf04da546f94fad52db125ee619ccbdcada179da51c326a22c4578cb9", size = 39206 },
|
||||
{ url = "https://files.pythonhosted.org/packages/66/e8/542ed252924002b84c43a68a080cfd4facbea0d5df361e4f59637638d3c7/mmh3-5.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:d963be0dbfd9fca209c17172f6110787ebf78934af25e3694fe2ba40e55c1e2b", size = 39799 },
|
||||
{ url = "https://files.pythonhosted.org/packages/bd/25/ff2cd36c82a23afa57a05cdb52ab467a911fb12c055c8a8238c0d426cbf0/mmh3-5.0.1-cp312-cp312-win_arm64.whl", hash = "sha256:a5da292ceeed8ce8e32b68847261a462d30fd7b478c3f55daae841404f433c15", size = 36537 },
|
||||
{ url = "https://files.pythonhosted.org/packages/09/e0/fb19c46265c18311b422ba5ce3e18046ad45c48cfb213fd6dbec23ae6b51/mmh3-5.0.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:673e3f1c8d4231d6fb0271484ee34cb7146a6499fc0df80788adb56fd76842da", size = 52909 },
|
||||
{ url = "https://files.pythonhosted.org/packages/c3/94/54fc591e7a24c7ce2c531ecfc5715cff932f9d320c2936550cc33d67304d/mmh3-5.0.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f795a306bd16a52ad578b663462cc8e95500b3925d64118ae63453485d67282b", size = 38396 },
|
||||
{ url = "https://files.pythonhosted.org/packages/1f/9a/142bcc9d0d28fc8ae45bbfb83926adc069f984cdf3495a71534cc22b8e27/mmh3-5.0.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:5ed57a5e28e502a1d60436cc25c76c3a5ba57545f250f2969af231dc1221e0a5", size = 38207 },
|
||||
{ url = "https://files.pythonhosted.org/packages/f8/5b/f1c9110aa70321bb1ee713f17851b9534586c63bc25e0110e4fc03ae2450/mmh3-5.0.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:632c28e7612e909dbb6cbe2fe496201ada4695b7715584005689c5dc038e59ad", size = 94988 },
|
||||
{ url = "https://files.pythonhosted.org/packages/87/e5/4dc67e7e0e716c641ab0a5875a659e37258417439590feff5c3bd3ff4538/mmh3-5.0.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:53fd6bd525a5985e391c43384672d9d6b317fcb36726447347c7fc75bfed34ec", size = 99969 },
|
||||
{ url = "https://files.pythonhosted.org/packages/ac/68/d148327337687c53f04ad9ceaedfa9ad155ee0111d0cb06220f044d66720/mmh3-5.0.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dceacf6b0b961a0e499836af3aa62d60633265607aef551b2a3e3c48cdaa5edd", size = 99662 },
|
||||
{ url = "https://files.pythonhosted.org/packages/13/79/782adb6df6397947c1097b1e94b7f8d95629a4a73df05cf7207bd5148c1f/mmh3-5.0.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8f0738d478fdfb5d920f6aff5452c78f2c35b0eff72caa2a97dfe38e82f93da2", size = 87606 },
|
||||
{ url = "https://files.pythonhosted.org/packages/f2/c2/0404383281df049d0e4ccf07fabd659fc1f3da834df6708d934116cbf45d/mmh3-5.0.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8e70285e7391ab88b872e5bef632bad16b9d99a6d3ca0590656a4753d55988af", size = 94836 },
|
||||
{ url = "https://files.pythonhosted.org/packages/c8/33/fda67c5f28e4c2131891cf8cbc3513cfc55881e3cfe26e49328e38ffacb3/mmh3-5.0.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:27e5fc6360aa6b828546a4318da1a7da6bf6e5474ccb053c3a6aa8ef19ff97bd", size = 90492 },
|
||||
{ url = "https://files.pythonhosted.org/packages/64/2f/0ed38aefe2a87f30bb1b12e5b75dc69fcffdc16def40d1752d6fc7cbbf96/mmh3-5.0.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:7989530c3c1e2c17bf5a0ec2bba09fd19819078ba90beedabb1c3885f5040b0d", size = 89594 },
|
||||
{ url = "https://files.pythonhosted.org/packages/95/ab/6e7a5e765fc78e3dbd0a04a04cfdf72e91eb8e31976228e69d82c741a5b4/mmh3-5.0.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:cdad7bee649950da7ecd3cbbbd12fb81f1161072ecbdb5acfa0018338c5cb9cf", size = 94929 },
|
||||
{ url = "https://files.pythonhosted.org/packages/74/51/f748f00c072006f4a093d9b08853a0e2e3cd5aeaa91343d4e2d942851978/mmh3-5.0.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:e143b8f184c1bb58cecd85ab4a4fd6dc65a2d71aee74157392c3fddac2a4a331", size = 91317 },
|
||||
{ url = "https://files.pythonhosted.org/packages/df/a1/21ee8017a7feb0270c49f756ff56da9f99bd150dcfe3b3f6f0d4b243423d/mmh3-5.0.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:e5eb12e886f3646dd636f16b76eb23fc0c27e8ff3c1ae73d4391e50ef60b40f6", size = 89861 },
|
||||
{ url = "https://files.pythonhosted.org/packages/c2/d2/46a6d070de4659bdf91cd6a62d659f8cc547dadee52b6d02bcbacb3262ed/mmh3-5.0.1-cp313-cp313-win32.whl", hash = "sha256:16e6dddfa98e1c2d021268e72c78951234186deb4df6630e984ac82df63d0a5d", size = 39201 },
|
||||
{ url = "https://files.pythonhosted.org/packages/ed/07/316c062f09019b99b248a4183c5333f8eeebe638345484774908a8f2c9c0/mmh3-5.0.1-cp313-cp313-win_amd64.whl", hash = "sha256:d3ffb792d70b8c4a2382af3598dad6ae0c5bd9cee5b7ffcc99aa2f5fd2c1bf70", size = 39807 },
|
||||
{ url = "https://files.pythonhosted.org/packages/9d/d3/f7e6d7d062b8d7072c3989a528d9d47486ee5d5ae75250f6e26b4976d098/mmh3-5.0.1-cp313-cp313-win_arm64.whl", hash = "sha256:122fa9ec148383f9124292962bda745f192b47bfd470b2af5fe7bb3982b17896", size = 36539 },
|
||||
{ url = "https://files.pythonhosted.org/packages/ef/5a/8609dc74421858f7e94a89dc69221ab9b2c14d0d63a139b46ec190eedc44/mmh3-4.1.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:be5ac76a8b0cd8095784e51e4c1c9c318c19edcd1709a06eb14979c8d850c31a", size = 39433 },
|
||||
{ url = "https://files.pythonhosted.org/packages/93/6c/e7a0f07c7082c76964b1ff46aa852f36e2ec6a9c3530dec0afa0b3162fc2/mmh3-4.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:98a49121afdfab67cd80e912b36404139d7deceb6773a83620137aaa0da5714c", size = 29280 },
|
||||
{ url = "https://files.pythonhosted.org/packages/76/84/60ca728ec7d7e1779a98000d64941c6221786124b4f07bf105a627055890/mmh3-4.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5259ac0535874366e7d1a5423ef746e0d36a9e3c14509ce6511614bdc5a7ef5b", size = 30130 },
|
||||
{ url = "https://files.pythonhosted.org/packages/2a/22/f2ec190b491f712d9ef5ea6252204b6f05255ac9af54a7b505adc3128aed/mmh3-4.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c5950827ca0453a2be357696da509ab39646044e3fa15cad364eb65d78797437", size = 68837 },
|
||||
{ url = "https://files.pythonhosted.org/packages/ae/b9/c1e8065671e1d2f4e280c9c57389e74964f4a5792cac26717ad592002c7d/mmh3-4.1.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1dd0f652ae99585b9dd26de458e5f08571522f0402155809fd1dc8852a613a39", size = 72275 },
|
||||
{ url = "https://files.pythonhosted.org/packages/6b/18/92bbdb102ab2b4e80084e927187d871758280eb067c649693e42bfc6d0d1/mmh3-4.1.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:99d25548070942fab1e4a6f04d1626d67e66d0b81ed6571ecfca511f3edf07e6", size = 70919 },
|
||||
{ url = "https://files.pythonhosted.org/packages/e2/cd/391ce1d1bb559871a5d3a6bbb30b82bf51d3e3b42c4e8589cccb201953da/mmh3-4.1.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:53db8d9bad3cb66c8f35cbc894f336273f63489ce4ac416634932e3cbe79eb5b", size = 65885 },
|
||||
{ url = "https://files.pythonhosted.org/packages/03/87/4b01a43336bd506478850d1bc3d180648b2d26b4acf1fc4bf1df72bf562f/mmh3-4.1.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:75da0f615eb55295a437264cc0b736753f830b09d102aa4c2a7d719bc445ec05", size = 67610 },
|
||||
{ url = "https://files.pythonhosted.org/packages/e8/12/b464149a1b7181c7ce431ebf3d24fa994863f2f1abc75b78d202dde966e0/mmh3-4.1.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:b926b07fd678ea84b3a2afc1fa22ce50aeb627839c44382f3d0291e945621e1a", size = 74888 },
|
||||
{ url = "https://files.pythonhosted.org/packages/fc/3e/f4eb45a23fc17b970394c1fe74eba157514577ae2d63757684241651d754/mmh3-4.1.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:c5b053334f9b0af8559d6da9dc72cef0a65b325ebb3e630c680012323c950bb6", size = 72969 },
|
||||
{ url = "https://files.pythonhosted.org/packages/c0/3b/83934fd9494371357da0ca026d55ad427c199d611b97b6ffeecacfd8e720/mmh3-4.1.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:5bf33dc43cd6de2cb86e0aa73a1cc6530f557854bbbe5d59f41ef6de2e353d7b", size = 80338 },
|
||||
{ url = "https://files.pythonhosted.org/packages/b6/c4/5bcd709ea7269173d7e925402f05e05cf12194ef53cc9912a5ad166f8ded/mmh3-4.1.0-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:fa7eacd2b830727ba3dd65a365bed8a5c992ecd0c8348cf39a05cc77d22f4970", size = 76580 },
|
||||
{ url = "https://files.pythonhosted.org/packages/da/6a/4c0680d64475e551d7f4cc78bf0fd247c711ed2717f6bb311934993d1e69/mmh3-4.1.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:42dfd6742b9e3eec599f85270617debfa0bbb913c545bb980c8a4fa7b2d047da", size = 75325 },
|
||||
{ url = "https://files.pythonhosted.org/packages/70/bc/e2ed99e580b3dd121f6462147bd5f521c57b3c81c692aa2d416b0678c89f/mmh3-4.1.0-cp310-cp310-win32.whl", hash = "sha256:2974ad343f0d39dcc88e93ee6afa96cedc35a9883bc067febd7ff736e207fa47", size = 31235 },
|
||||
{ url = "https://files.pythonhosted.org/packages/73/2b/3aec865da7feb52830782d9fb7c54115cc18815680c244301adf9080622f/mmh3-4.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:74699a8984ded645c1a24d6078351a056f5a5f1fe5838870412a68ac5e28d865", size = 31271 },
|
||||
{ url = "https://files.pythonhosted.org/packages/17/2a/925439189ccf562bdcb839aed6263d718359f0c376d673beb3b83d3864ac/mmh3-4.1.0-cp310-cp310-win_arm64.whl", hash = "sha256:f0dc874cedc23d46fc488a987faa6ad08ffa79e44fb08e3cd4d4cf2877c00a00", size = 30147 },
|
||||
{ url = "https://files.pythonhosted.org/packages/2e/d6/86beea107e7e9700df9522466346c23a2f54faa81337c86fd17002aa95a6/mmh3-4.1.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:3280a463855b0eae64b681cd5b9ddd9464b73f81151e87bb7c91a811d25619e6", size = 39427 },
|
||||
{ url = "https://files.pythonhosted.org/packages/1c/08/65fa5489044e2afc304e8540c6c607d5d7b136ddc5cd8315c13de0adc34c/mmh3-4.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:97ac57c6c3301769e757d444fa7c973ceb002cb66534b39cbab5e38de61cd896", size = 29281 },
|
||||
{ url = "https://files.pythonhosted.org/packages/b3/aa/98511d3ea3f6ba958136d913be3be3c1009be935a20ecc7b2763f0a605b6/mmh3-4.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a7b6502cdb4dbd880244818ab363c8770a48cdccecf6d729ade0241b736b5ec0", size = 30130 },
|
||||
{ url = "https://files.pythonhosted.org/packages/3c/b7/1a93f81643435b0e57f1046c4ffe46f0214693eaede0d9b0a1a236776e70/mmh3-4.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:52ba2da04671a9621580ddabf72f06f0e72c1c9c3b7b608849b58b11080d8f14", size = 69072 },
|
||||
{ url = "https://files.pythonhosted.org/packages/45/9e/2ff70246aefd9cf146bc6a420c28ed475a0d1a325f31ee203be02f9215d4/mmh3-4.1.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5a5fef4c4ecc782e6e43fbeab09cff1bac82c998a1773d3a5ee6a3605cde343e", size = 72470 },
|
||||
{ url = "https://files.pythonhosted.org/packages/dc/cb/57bc1fdbdbe6837aebfca982494e23e2498ee2a89585c9054713b22e4167/mmh3-4.1.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5135358a7e00991f73b88cdc8eda5203bf9de22120d10a834c5761dbeb07dd13", size = 71251 },
|
||||
{ url = "https://files.pythonhosted.org/packages/4d/c2/46d7d2721b69fbdfd30231309e6395f62ff6744e5c00dd8113b9faa06fba/mmh3-4.1.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cff9ae76a54f7c6fe0167c9c4028c12c1f6de52d68a31d11b6790bb2ae685560", size = 66035 },
|
||||
{ url = "https://files.pythonhosted.org/packages/6f/a4/7ba4bcc838818bcf018e26d118d5ddb605c23c4fad040dc4d811f1cfcb04/mmh3-4.1.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f6f02576a4d106d7830ca90278868bf0983554dd69183b7bbe09f2fcd51cf54f", size = 67844 },
|
||||
{ url = "https://files.pythonhosted.org/packages/71/ed/8e80d1038e7bb15eaf739711d1fc36f2341acb6b1b95fa77003f2799c91e/mmh3-4.1.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:073d57425a23721730d3ff5485e2da489dd3c90b04e86243dd7211f889898106", size = 76724 },
|
||||
{ url = "https://files.pythonhosted.org/packages/1c/22/a6a70ca81f0ce8fe2f3a68d89c1184c2d2d0fbe0ee305da50e972c5ff9fa/mmh3-4.1.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:71e32ddec7f573a1a0feb8d2cf2af474c50ec21e7a8263026e8d3b4b629805db", size = 75004 },
|
||||
{ url = "https://files.pythonhosted.org/packages/73/20/abe50b605760f1f5b6e0b436c650649e69ca478d0f41b154f300367c09e4/mmh3-4.1.0-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:7cbb20b29d57e76a58b40fd8b13a9130db495a12d678d651b459bf61c0714cea", size = 82230 },
|
||||
{ url = "https://files.pythonhosted.org/packages/45/80/a1fc99d3ee50b573df0bfbb1ad518463af78d2ebca44bfca3b3f9473d651/mmh3-4.1.0-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:a42ad267e131d7847076bb7e31050f6c4378cd38e8f1bf7a0edd32f30224d5c9", size = 78679 },
|
||||
{ url = "https://files.pythonhosted.org/packages/9e/51/6c9ee2ddf3b386f45ff83b6926a5e826635757d91dab04cbf16eee05f9a7/mmh3-4.1.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:4a013979fc9390abadc445ea2527426a0e7a4495c19b74589204f9b71bcaafeb", size = 77382 },
|
||||
{ url = "https://files.pythonhosted.org/packages/ee/fa/4b377f244c27fac5f0343cc4dc0d2eb0a08049afc8d5322d07be7461a768/mmh3-4.1.0-cp311-cp311-win32.whl", hash = "sha256:1d3b1cdad7c71b7b88966301789a478af142bddcb3a2bee563f7a7d40519a00f", size = 31232 },
|
||||
{ url = "https://files.pythonhosted.org/packages/d1/b0/500ef56c29b276d796bfdb47c16d34fa18a68945e4d730a6fa7d483583ed/mmh3-4.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:0dc6dc32eb03727467da8e17deffe004fbb65e8b5ee2b502d36250d7a3f4e2ec", size = 31276 },
|
||||
{ url = "https://files.pythonhosted.org/packages/cc/84/94795e6e710c3861f8f355a12be9c9f4b8433a538c983e75bd4c00496a8a/mmh3-4.1.0-cp311-cp311-win_arm64.whl", hash = "sha256:9ae3a5c1b32dda121c7dc26f9597ef7b01b4c56a98319a7fe86c35b8bc459ae6", size = 30142 },
|
||||
{ url = "https://files.pythonhosted.org/packages/18/45/b4d41e86b00eed8c500adbe0007129861710e181c7f49c507ef6beae9496/mmh3-4.1.0-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0033d60c7939168ef65ddc396611077a7268bde024f2c23bdc283a19123f9e9c", size = 39495 },
|
||||
{ url = "https://files.pythonhosted.org/packages/a6/d4/f041b8704cb8d1aad3717105daa582e29818b78a540622dfed84cd00d88f/mmh3-4.1.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:d6af3e2287644b2b08b5924ed3a88c97b87b44ad08e79ca9f93d3470a54a41c5", size = 29334 },
|
||||
{ url = "https://files.pythonhosted.org/packages/cb/bb/8f75378e1a83b323f9ed06248333c383e7dac614c2f95e1419965cb91693/mmh3-4.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:d82eb4defa245e02bb0b0dc4f1e7ee284f8d212633389c91f7fba99ba993f0a2", size = 30144 },
|
||||
{ url = "https://files.pythonhosted.org/packages/3e/50/5e36c1945bd83e780a37361fc1999fc4c5a59ecc10a373557fdf0e58eb1f/mmh3-4.1.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ba245e94b8d54765e14c2d7b6214e832557e7856d5183bc522e17884cab2f45d", size = 69094 },
|
||||
{ url = "https://files.pythonhosted.org/packages/70/c7/6ae37e7519a938226469476b84bcea2650e2a2cc7a848e6a206ea98ecee3/mmh3-4.1.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bb04e2feeabaad6231e89cd43b3d01a4403579aa792c9ab6fdeef45cc58d4ec0", size = 72611 },
|
||||
{ url = "https://files.pythonhosted.org/packages/5e/47/6613f69f57f1e5045e66b22fae9c2fb39ef754c455805d3917f6073e316e/mmh3-4.1.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1e3b1a27def545ce11e36158ba5d5390cdbc300cfe456a942cc89d649cf7e3b2", size = 71462 },
|
||||
{ url = "https://files.pythonhosted.org/packages/e0/0a/e423db18ce7b479c4b96381a112b443f0985c611de420f95c58a9f934080/mmh3-4.1.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ce0ab79ff736d7044e5e9b3bfe73958a55f79a4ae672e6213e92492ad5e734d5", size = 66165 },
|
||||
{ url = "https://files.pythonhosted.org/packages/4c/7b/bfeb68bee5bddc8baf7ef630b93edc0a533202d84eb076dbb6c77e7e5fd5/mmh3-4.1.0-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3b02268be6e0a8eeb8a924d7db85f28e47344f35c438c1e149878bb1c47b1cd3", size = 68088 },
|
||||
{ url = "https://files.pythonhosted.org/packages/d4/a6/b82e30143997c05776887f5177f724e3b714aa7e7346fbe2ec70f52abcd0/mmh3-4.1.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:deb887f5fcdaf57cf646b1e062d56b06ef2f23421c80885fce18b37143cba828", size = 76241 },
|
||||
{ url = "https://files.pythonhosted.org/packages/6c/60/a3d5872cf7610fcb13e36c472476020c5cf217b23c092bad452eb7784407/mmh3-4.1.0-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:99dd564e9e2b512eb117bd0cbf0f79a50c45d961c2a02402787d581cec5448d5", size = 74538 },
|
||||
{ url = "https://files.pythonhosted.org/packages/f6/d5/742173a94c78f4edab71c04097f6f9150c47f8fd034d592f5f34a9444719/mmh3-4.1.0-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:08373082dfaa38fe97aa78753d1efd21a1969e51079056ff552e687764eafdfe", size = 81793 },
|
||||
{ url = "https://files.pythonhosted.org/packages/d0/7a/a1db0efe7c67b761d83be3d50e35ef26628ef56b3b8bc776d07412ee8b16/mmh3-4.1.0-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:54b9c6a2ea571b714e4fe28d3e4e2db37abfd03c787a58074ea21ee9a8fd1740", size = 78217 },
|
||||
{ url = "https://files.pythonhosted.org/packages/b3/78/1ff8da7c859cd09704e2f500588d171eda9688fcf6f29e028ef261262a16/mmh3-4.1.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:a7b1edf24c69e3513f879722b97ca85e52f9032f24a52284746877f6a7304086", size = 77052 },
|
||||
{ url = "https://files.pythonhosted.org/packages/ed/c7/cf16ace81fc9fbe54a75c914306252af26c6ea485366bb3b579bf6e3dbb8/mmh3-4.1.0-cp312-cp312-win32.whl", hash = "sha256:411da64b951f635e1e2284b71d81a5a83580cea24994b328f8910d40bed67276", size = 31277 },
|
||||
{ url = "https://files.pythonhosted.org/packages/d2/0b/b3b1637dca9414451edf287fd91e667e7231d5ffd7498137fe011951fc0a/mmh3-4.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:bebc3ecb6ba18292e3d40c8712482b4477abd6981c2ebf0e60869bd90f8ac3a9", size = 31318 },
|
||||
{ url = "https://files.pythonhosted.org/packages/dd/6c/c0f06040c58112ccbd0df989055ede98f7c1a1f392dc6a3fc63ec6c124ec/mmh3-4.1.0-cp312-cp312-win_arm64.whl", hash = "sha256:168473dd608ade6a8d2ba069600b35199a9af837d96177d3088ca91f2b3798e3", size = 30147 },
|
||||
]

[[package]]
@@ -2564,6 +2617,33 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/7e/80/cab10959dc1faead58dc8384a781dfbf93cb4d33d50988f7a69f1b7c9bbe/oauthlib-3.2.2-py3-none-any.whl", hash = "sha256:8139f29aac13e25d502680e9e19963e83f16838d48a0d71c287fe40e7067fbca", size = 151688 },
]

[[package]]
name = "onnx"
version = "1.17.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "numpy" },
{ name = "protobuf" },
]
sdist = { url = "https://files.pythonhosted.org/packages/9a/54/0e385c26bf230d223810a9c7d06628d954008a5e5e4b73ee26ef02327282/onnx-1.17.0.tar.gz", hash = "sha256:48ca1a91ff73c1d5e3ea2eef20ae5d0e709bb8a2355ed798ffc2169753013fd3", size = 12165120 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/2e/29/57053ba7787788ac75efb095cfc1ae290436b6d3a26754693cd7ed1b4fac/onnx-1.17.0-cp310-cp310-macosx_12_0_universal2.whl", hash = "sha256:38b5df0eb22012198cdcee527cc5f917f09cce1f88a69248aaca22bd78a7f023", size = 16645616 },
|
||||
{ url = "https://files.pythonhosted.org/packages/75/0d/831807a18db2a5e8f7813848c59272b904a4ef3939fe4d1288cbce9ea735/onnx-1.17.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d545335cb49d4d8c47cc803d3a805deb7ad5d9094dc67657d66e568610a36d7d", size = 15908420 },
|
||||
{ url = "https://files.pythonhosted.org/packages/dd/5b/c4f95dbe652d14aeba9afaceb177e9ffc48ac3c03048dd3f872f26f07e34/onnx-1.17.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3193a3672fc60f1a18c0f4c93ac81b761bc72fd8a6c2035fa79ff5969f07713e", size = 16046244 },
|
||||
{ url = "https://files.pythonhosted.org/packages/08/a9/c1f218085043dccc6311460239e253fa6957cf12ee4b0a56b82014938d0b/onnx-1.17.0-cp310-cp310-win32.whl", hash = "sha256:0141c2ce806c474b667b7e4499164227ef594584da432fd5613ec17c1855e311", size = 14423516 },
|
||||
{ url = "https://files.pythonhosted.org/packages/0e/d3/d26ebf590a65686dde6b27fef32493026c5be9e42083340d947395f93405/onnx-1.17.0-cp310-cp310-win_amd64.whl", hash = "sha256:dfd777d95c158437fda6b34758f0877d15b89cbe9ff45affbedc519b35345cf9", size = 14528496 },
|
||||
{ url = "https://files.pythonhosted.org/packages/e5/a9/8d1b1d53aec70df53e0f57e9f9fcf47004276539e29230c3d5f1f50719ba/onnx-1.17.0-cp311-cp311-macosx_12_0_universal2.whl", hash = "sha256:d6fc3a03fc0129b8b6ac03f03bc894431ffd77c7d79ec023d0afd667b4d35869", size = 16647991 },
|
||||
{ url = "https://files.pythonhosted.org/packages/7b/e3/cc80110e5996ca61878f7b4c73c7a286cd88918ff35eacb60dc75ab11ef5/onnx-1.17.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f01a4b63d4e1d8ec3e2f069e7b798b2955810aa434f7361f01bc8ca08d69cce4", size = 15908949 },
|
||||
{ url = "https://files.pythonhosted.org/packages/b1/2f/91092557ed478e323a2b4471e2081fdf88d1dd52ae988ceaf7db4e4506ff/onnx-1.17.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4a183c6178be001bf398260e5ac2c927dc43e7746e8638d6c05c20e321f8c949", size = 16048190 },
|
||||
{ url = "https://files.pythonhosted.org/packages/ac/59/9ea23fc22d0bb853133f363e6248e31bcbc6c1c90543a3938c00412ac02a/onnx-1.17.0-cp311-cp311-win32.whl", hash = "sha256:081ec43a8b950171767d99075b6b92553901fa429d4bc5eb3ad66b36ef5dbe3a", size = 14424299 },
|
||||
{ url = "https://files.pythonhosted.org/packages/51/a5/19b0dfcb567b62e7adf1a21b08b23224f0c2d13842aee4d0abc6f07f9cf5/onnx-1.17.0-cp311-cp311-win_amd64.whl", hash = "sha256:95c03e38671785036bb704c30cd2e150825f6ab4763df3a4f1d249da48525957", size = 14529142 },
|
||||
{ url = "https://files.pythonhosted.org/packages/b4/dd/c416a11a28847fafb0db1bf43381979a0f522eb9107b831058fde012dd56/onnx-1.17.0-cp312-cp312-macosx_12_0_universal2.whl", hash = "sha256:0e906e6a83437de05f8139ea7eaf366bf287f44ae5cc44b2850a30e296421f2f", size = 16651271 },
|
||||
{ url = "https://files.pythonhosted.org/packages/f0/6c/f040652277f514ecd81b7251841f96caa5538365af7df07f86c6018cda2b/onnx-1.17.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3d955ba2939878a520a97614bcf2e79c1df71b29203e8ced478fa78c9a9c63c2", size = 15907522 },
|
||||
{ url = "https://files.pythonhosted.org/packages/3d/7c/67f4952d1b56b3f74a154b97d0dd0630d525923b354db117d04823b8b49b/onnx-1.17.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4f3fb5cc4e2898ac5312a7dc03a65133dd2abf9a5e520e69afb880a7251ec97a", size = 16046307 },
|
||||
{ url = "https://files.pythonhosted.org/packages/ae/20/6da11042d2ab870dfb4ce4a6b52354d7651b6b4112038b6d2229ab9904c4/onnx-1.17.0-cp312-cp312-win32.whl", hash = "sha256:317870fca3349d19325a4b7d1b5628f6de3811e9710b1e3665c68b073d0e68d7", size = 14424235 },
|
||||
{ url = "https://files.pythonhosted.org/packages/35/55/c4d11bee1fdb0c4bd84b4e3562ff811a19b63266816870ae1f95567aa6e1/onnx-1.17.0-cp312-cp312-win_amd64.whl", hash = "sha256:659b8232d627a5460d74fd3c96947ae83db6d03f035ac633e20cd69cfa029227", size = 14530453 },
|
||||
]

[[package]]
name = "onnxruntime"
version = "1.19.2"
@@ -2613,6 +2693,18 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/ad/31/28a83e124e9f9dd04c83b5aeb6f8b1770f45addde4dd3d34d9a9091590ad/openai-1.52.1-py3-none-any.whl", hash = "sha256:f23e83df5ba04ee0e82c8562571e8cb596cd88f9a84ab783e6c6259e5ffbfb4a", size = 386945 },
]

[[package]]
name = "openpyxl"
version = "3.1.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "et-xmlfile" },
]
sdist = { url = "https://files.pythonhosted.org/packages/3d/f9/88d94a75de065ea32619465d2f77b29a0469500e99012523b91cc4141cd1/openpyxl-3.1.5.tar.gz", hash = "sha256:cf0e3cf56142039133628b5acffe8ef0c12bc902d2aadd3e0fe5878dc08d1050", size = 186464 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c0/da/977ded879c29cbd04de313843e76868e6e13408a94ed6b987245dc7c8506/openpyxl-3.1.5-py2.py3-none-any.whl", hash = "sha256:5282c12b107bffeef825f4617dc029afaf41d0ea60823bbb665ef3079dc79de2", size = 250910 },
]

[[package]]
name = "opentelemetry-api"
version = "1.27.0"
@@ -2927,6 +3019,33 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08", size = 31191 },
]

[[package]]
name = "pdfminer-six"
version = "20231228"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "charset-normalizer" },
{ name = "cryptography" },
]
sdist = { url = "https://files.pythonhosted.org/packages/31/b1/a43e3bd872ded4deea4f8efc7aff1703fca8c5455d0c06e20506a06a44ff/pdfminer.six-20231228.tar.gz", hash = "sha256:6004da3ad1a7a4d45930cb950393df89b068e73be365a6ff64a838d37bcb08c4", size = 7362505 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/eb/9c/e46fe7502b32d7db6af6e36a9105abb93301fa1ec475b5ddcba8b35ae23a/pdfminer.six-20231228-py3-none-any.whl", hash = "sha256:e8d3c3310e6fbc1fe414090123ab01351634b4ecb021232206c4c9a8ca3e3b8f", size = 5614515 },
]

[[package]]
name = "pdfplumber"
version = "0.11.4"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pdfminer-six" },
{ name = "pillow" },
{ name = "pypdfium2" },
]
sdist = { url = "https://files.pythonhosted.org/packages/ca/f0/457bda3629dfa5b01c645519fe30230e1739751f6645e23fca2dabf6c2e5/pdfplumber-0.11.4.tar.gz", hash = "sha256:147b55cde2351fcb9523b46b09cc771eea3602faecfb60d463c6bf951694fbe8", size = 113305 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d0/87/415cb472981a8d2e36beeeadf074ebb686cc2bfe8d18de973232da291bd5/pdfplumber-0.11.4-py3-none-any.whl", hash = "sha256:6150f0678c7aaba974ac09839c17475d6c0c4d126b5f92cb85154885f31c6d73", size = 59182 },
]

[[package]]
name = "pexpect"
version = "4.9.0"
@@ -2941,69 +3060,61 @@ wheels = [

[[package]]
name = "pillow"
version = "11.0.0"
version = "10.4.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a5/26/0d95c04c868f6bdb0c447e3ee2de5564411845e36a858cfd63766bc7b563/pillow-11.0.0.tar.gz", hash = "sha256:72bacbaf24ac003fea9bff9837d1eedb6088758d41e100c1552930151f677739", size = 46737780 }
sdist = { url = "https://files.pythonhosted.org/packages/cd/74/ad3d526f3bf7b6d3f408b73fde271ec69dfac8b81341a318ce825f2b3812/pillow-10.4.0.tar.gz", hash = "sha256:166c1cd4d24309b30d61f79f4a9114b7b2313d7450912277855ff5dfd7cd4a06", size = 46555059 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/98/fb/a6ce6836bd7fd93fbf9144bf54789e02babc27403b50a9e1583ee877d6da/pillow-11.0.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:6619654954dc4936fcff82db8eb6401d3159ec6be81e33c6000dfd76ae189947", size = 3154708 },
|
||||
{ url = "https://files.pythonhosted.org/packages/6a/1d/1f51e6e912d8ff316bb3935a8cda617c801783e0b998bf7a894e91d3bd4c/pillow-11.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:b3c5ac4bed7519088103d9450a1107f76308ecf91d6dabc8a33a2fcfb18d0fba", size = 2979223 },
|
||||
{ url = "https://files.pythonhosted.org/packages/90/83/e2077b0192ca8a9ef794dbb74700c7e48384706467067976c2a95a0f40a1/pillow-11.0.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a65149d8ada1055029fcb665452b2814fe7d7082fcb0c5bed6db851cb69b2086", size = 4183167 },
|
||||
{ url = "https://files.pythonhosted.org/packages/0e/74/467af0146970a98349cdf39e9b79a6cc8a2e7558f2c01c28a7b6b85c5bda/pillow-11.0.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:88a58d8ac0cc0e7f3a014509f0455248a76629ca9b604eca7dc5927cc593c5e9", size = 4283912 },
|
||||
{ url = "https://files.pythonhosted.org/packages/85/b1/d95d4f7ca3a6c1ae120959605875a31a3c209c4e50f0029dc1a87566cf46/pillow-11.0.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:c26845094b1af3c91852745ae78e3ea47abf3dbcd1cf962f16b9a5fbe3ee8488", size = 4195815 },
|
||||
{ url = "https://files.pythonhosted.org/packages/41/c3/94f33af0762ed76b5a237c5797e088aa57f2b7fa8ee7932d399087be66a8/pillow-11.0.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:1a61b54f87ab5786b8479f81c4b11f4d61702830354520837f8cc791ebba0f5f", size = 4366117 },
|
||||
{ url = "https://files.pythonhosted.org/packages/ba/3c/443e7ef01f597497268899e1cca95c0de947c9bbf77a8f18b3c126681e5d/pillow-11.0.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:674629ff60030d144b7bca2b8330225a9b11c482ed408813924619c6f302fdbb", size = 4278607 },
|
||||
{ url = "https://files.pythonhosted.org/packages/26/95/1495304448b0081e60c0c5d63f928ef48bb290acee7385804426fa395a21/pillow-11.0.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:598b4e238f13276e0008299bd2482003f48158e2b11826862b1eb2ad7c768b97", size = 4410685 },
|
||||
{ url = "https://files.pythonhosted.org/packages/45/da/861e1df971ef0de9870720cb309ca4d553b26a9483ec9be3a7bf1de4a095/pillow-11.0.0-cp310-cp310-win32.whl", hash = "sha256:9a0f748eaa434a41fccf8e1ee7a3eed68af1b690e75328fd7a60af123c193b50", size = 2249185 },
|
||||
{ url = "https://files.pythonhosted.org/packages/d5/4e/78f7c5202ea2a772a5ab05069c1b82503e6353cd79c7e474d4945f4b82c3/pillow-11.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:a5629742881bcbc1f42e840af185fd4d83a5edeb96475a575f4da50d6ede337c", size = 2566726 },
|
||||
{ url = "https://files.pythonhosted.org/packages/77/e4/6e84eada35cbcc646fc1870f72ccfd4afacb0fae0c37ffbffe7f5dc24bf1/pillow-11.0.0-cp310-cp310-win_arm64.whl", hash = "sha256:ee217c198f2e41f184f3869f3e485557296d505b5195c513b2bfe0062dc537f1", size = 2254585 },
|
||||
{ url = "https://files.pythonhosted.org/packages/f0/eb/f7e21b113dd48a9c97d364e0915b3988c6a0b6207652f5a92372871b7aa4/pillow-11.0.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:1c1d72714f429a521d8d2d018badc42414c3077eb187a59579f28e4270b4b0fc", size = 3154705 },
|
||||
{ url = "https://files.pythonhosted.org/packages/25/b3/2b54a1d541accebe6bd8b1358b34ceb2c509f51cb7dcda8687362490da5b/pillow-11.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:499c3a1b0d6fc8213519e193796eb1a86a1be4b1877d678b30f83fd979811d1a", size = 2979222 },
|
||||
{ url = "https://files.pythonhosted.org/packages/20/12/1a41eddad8265c5c19dda8fb6c269ce15ee25e0b9f8f26286e6202df6693/pillow-11.0.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c8b2351c85d855293a299038e1f89db92a2f35e8d2f783489c6f0b2b5f3fe8a3", size = 4190220 },
|
||||
{ url = "https://files.pythonhosted.org/packages/a9/9b/8a8c4d07d77447b7457164b861d18f5a31ae6418ef5c07f6f878fa09039a/pillow-11.0.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6f4dba50cfa56f910241eb7f883c20f1e7b1d8f7d91c750cd0b318bad443f4d5", size = 4291399 },
|
||||
{ url = "https://files.pythonhosted.org/packages/fc/e4/130c5fab4a54d3991129800dd2801feeb4b118d7630148cd67f0e6269d4c/pillow-11.0.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:5ddbfd761ee00c12ee1be86c9c0683ecf5bb14c9772ddbd782085779a63dd55b", size = 4202709 },
|
||||
{ url = "https://files.pythonhosted.org/packages/39/63/b3fc299528d7df1f678b0666002b37affe6b8751225c3d9c12cf530e73ed/pillow-11.0.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:45c566eb10b8967d71bf1ab8e4a525e5a93519e29ea071459ce517f6b903d7fa", size = 4372556 },
|
||||
{ url = "https://files.pythonhosted.org/packages/c6/a6/694122c55b855b586c26c694937d36bb8d3b09c735ff41b2f315c6e66a10/pillow-11.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:b4fd7bd29610a83a8c9b564d457cf5bd92b4e11e79a4ee4716a63c959699b306", size = 4287187 },
|
||||
{ url = "https://files.pythonhosted.org/packages/ba/a9/f9d763e2671a8acd53d29b1e284ca298bc10a595527f6be30233cdb9659d/pillow-11.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:cb929ca942d0ec4fac404cbf520ee6cac37bf35be479b970c4ffadf2b6a1cad9", size = 4418468 },
|
||||
{ url = "https://files.pythonhosted.org/packages/6e/0e/b5cbad2621377f11313a94aeb44ca55a9639adabcaaa073597a1925f8c26/pillow-11.0.0-cp311-cp311-win32.whl", hash = "sha256:006bcdd307cc47ba43e924099a038cbf9591062e6c50e570819743f5607404f5", size = 2249249 },
|
||||
{ url = "https://files.pythonhosted.org/packages/dc/83/1470c220a4ff06cd75fc609068f6605e567ea51df70557555c2ab6516b2c/pillow-11.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:52a2d8323a465f84faaba5236567d212c3668f2ab53e1c74c15583cf507a0291", size = 2566769 },
|
||||
{ url = "https://files.pythonhosted.org/packages/52/98/def78c3a23acee2bcdb2e52005fb2810ed54305602ec1bfcfab2bda6f49f/pillow-11.0.0-cp311-cp311-win_arm64.whl", hash = "sha256:16095692a253047fe3ec028e951fa4221a1f3ed3d80c397e83541a3037ff67c9", size = 2254611 },
|
||||
{ url = "https://files.pythonhosted.org/packages/1c/a3/26e606ff0b2daaf120543e537311fa3ae2eb6bf061490e4fea51771540be/pillow-11.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:d2c0a187a92a1cb5ef2c8ed5412dd8d4334272617f532d4ad4de31e0495bd923", size = 3147642 },
|
||||
{ url = "https://files.pythonhosted.org/packages/4f/d5/1caabedd8863526a6cfa44ee7a833bd97f945dc1d56824d6d76e11731939/pillow-11.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:084a07ef0821cfe4858fe86652fffac8e187b6ae677e9906e192aafcc1b69903", size = 2978999 },
|
||||
{ url = "https://files.pythonhosted.org/packages/d9/ff/5a45000826a1aa1ac6874b3ec5a856474821a1b59d838c4f6ce2ee518fe9/pillow-11.0.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8069c5179902dcdce0be9bfc8235347fdbac249d23bd90514b7a47a72d9fecf4", size = 4196794 },
|
||||
{ url = "https://files.pythonhosted.org/packages/9d/21/84c9f287d17180f26263b5f5c8fb201de0f88b1afddf8a2597a5c9fe787f/pillow-11.0.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f02541ef64077f22bf4924f225c0fd1248c168f86e4b7abdedd87d6ebaceab0f", size = 4300762 },
|
||||
{ url = "https://files.pythonhosted.org/packages/84/39/63fb87cd07cc541438b448b1fed467c4d687ad18aa786a7f8e67b255d1aa/pillow-11.0.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:fcb4621042ac4b7865c179bb972ed0da0218a076dc1820ffc48b1d74c1e37fe9", size = 4210468 },
|
||||
{ url = "https://files.pythonhosted.org/packages/7f/42/6e0f2c2d5c60f499aa29be14f860dd4539de322cd8fb84ee01553493fb4d/pillow-11.0.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:00177a63030d612148e659b55ba99527803288cea7c75fb05766ab7981a8c1b7", size = 4381824 },
|
||||
{ url = "https://files.pythonhosted.org/packages/31/69/1ef0fb9d2f8d2d114db982b78ca4eeb9db9a29f7477821e160b8c1253f67/pillow-11.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:8853a3bf12afddfdf15f57c4b02d7ded92c7a75a5d7331d19f4f9572a89c17e6", size = 4296436 },
|
||||
{ url = "https://files.pythonhosted.org/packages/44/ea/dad2818c675c44f6012289a7c4f46068c548768bc6c7f4e8c4ae5bbbc811/pillow-11.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3107c66e43bda25359d5ef446f59c497de2b5ed4c7fdba0894f8d6cf3822dafc", size = 4429714 },
|
||||
{ url = "https://files.pythonhosted.org/packages/af/3a/da80224a6eb15bba7a0dcb2346e2b686bb9bf98378c0b4353cd88e62b171/pillow-11.0.0-cp312-cp312-win32.whl", hash = "sha256:86510e3f5eca0ab87429dd77fafc04693195eec7fd6a137c389c3eeb4cfb77c6", size = 2249631 },
|
||||
{ url = "https://files.pythonhosted.org/packages/57/97/73f756c338c1d86bb802ee88c3cab015ad7ce4b838f8a24f16b676b1ac7c/pillow-11.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:8ec4a89295cd6cd4d1058a5e6aec6bf51e0eaaf9714774e1bfac7cfc9051db47", size = 2567533 },
|
||||
{ url = "https://files.pythonhosted.org/packages/0b/30/2b61876e2722374558b871dfbfcbe4e406626d63f4f6ed92e9c8e24cac37/pillow-11.0.0-cp312-cp312-win_arm64.whl", hash = "sha256:27a7860107500d813fcd203b4ea19b04babe79448268403172782754870dac25", size = 2254890 },
|
||||
{ url = "https://files.pythonhosted.org/packages/63/24/e2e15e392d00fcf4215907465d8ec2a2f23bcec1481a8ebe4ae760459995/pillow-11.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:bcd1fb5bb7b07f64c15618c89efcc2cfa3e95f0e3bcdbaf4642509de1942a699", size = 3147300 },
{ url = "https://files.pythonhosted.org/packages/43/72/92ad4afaa2afc233dc44184adff289c2e77e8cd916b3ddb72ac69495bda3/pillow-11.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0e038b0745997c7dcaae350d35859c9715c71e92ffb7e0f4a8e8a16732150f38", size = 2978742 },
{ url = "https://files.pythonhosted.org/packages/9e/da/c8d69c5bc85d72a8523fe862f05ababdc52c0a755cfe3d362656bb86552b/pillow-11.0.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ae08bd8ffc41aebf578c2af2f9d8749d91f448b3bfd41d7d9ff573d74f2a6b2", size = 4194349 },
{ url = "https://files.pythonhosted.org/packages/cd/e8/686d0caeed6b998351d57796496a70185376ed9c8ec7d99e1d19ad591fc6/pillow-11.0.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d69bfd8ec3219ae71bcde1f942b728903cad25fafe3100ba2258b973bd2bc1b2", size = 4298714 },
{ url = "https://files.pythonhosted.org/packages/ec/da/430015cec620d622f06854be67fd2f6721f52fc17fca8ac34b32e2d60739/pillow-11.0.0-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:61b887f9ddba63ddf62fd02a3ba7add935d053b6dd7d58998c630e6dbade8527", size = 4208514 },
{ url = "https://files.pythonhosted.org/packages/44/ae/7e4f6662a9b1cb5f92b9cc9cab8321c381ffbee309210940e57432a4063a/pillow-11.0.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:c6a660307ca9d4867caa8d9ca2c2658ab685de83792d1876274991adec7b93fa", size = 4380055 },
{ url = "https://files.pythonhosted.org/packages/74/d5/1a807779ac8a0eeed57f2b92a3c32ea1b696e6140c15bd42eaf908a261cd/pillow-11.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:73e3a0200cdda995c7e43dd47436c1548f87a30bb27fb871f352a22ab8dcf45f", size = 4296751 },
{ url = "https://files.pythonhosted.org/packages/38/8c/5fa3385163ee7080bc13026d59656267daaaaf3c728c233d530e2c2757c8/pillow-11.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:fba162b8872d30fea8c52b258a542c5dfd7b235fb5cb352240c8d63b414013eb", size = 4430378 },
{ url = "https://files.pythonhosted.org/packages/ca/1d/ad9c14811133977ff87035bf426875b93097fb50af747793f013979facdb/pillow-11.0.0-cp313-cp313-win32.whl", hash = "sha256:f1b82c27e89fffc6da125d5eb0ca6e68017faf5efc078128cfaa42cf5cb38798", size = 2249588 },
{ url = "https://files.pythonhosted.org/packages/fb/01/3755ba287dac715e6afdb333cb1f6d69740a7475220b4637b5ce3d78cec2/pillow-11.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:8ba470552b48e5835f1d23ecb936bb7f71d206f9dfeee64245f30c3270b994de", size = 2567509 },
{ url = "https://files.pythonhosted.org/packages/c0/98/2c7d727079b6be1aba82d195767d35fcc2d32204c7a5820f822df5330152/pillow-11.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:846e193e103b41e984ac921b335df59195356ce3f71dcfd155aa79c603873b84", size = 2254791 },
{ url = "https://files.pythonhosted.org/packages/eb/38/998b04cc6f474e78b563716b20eecf42a2fa16a84589d23c8898e64b0ffd/pillow-11.0.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:4ad70c4214f67d7466bea6a08061eba35c01b1b89eaa098040a35272a8efb22b", size = 3150854 },
{ url = "https://files.pythonhosted.org/packages/13/8e/be23a96292113c6cb26b2aa3c8b3681ec62b44ed5c2bd0b258bd59503d3c/pillow-11.0.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:6ec0d5af64f2e3d64a165f490d96368bb5dea8b8f9ad04487f9ab60dc4bb6003", size = 2982369 },
{ url = "https://files.pythonhosted.org/packages/97/8a/3db4eaabb7a2ae8203cd3a332a005e4aba00067fc514aaaf3e9721be31f1/pillow-11.0.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c809a70e43c7977c4a42aefd62f0131823ebf7dd73556fa5d5950f5b354087e2", size = 4333703 },
{ url = "https://files.pythonhosted.org/packages/28/ac/629ffc84ff67b9228fe87a97272ab125bbd4dc462745f35f192d37b822f1/pillow-11.0.0-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:4b60c9520f7207aaf2e1d94de026682fc227806c6e1f55bba7606d1c94dd623a", size = 4412550 },
{ url = "https://files.pythonhosted.org/packages/d6/07/a505921d36bb2df6868806eaf56ef58699c16c388e378b0dcdb6e5b2fb36/pillow-11.0.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:1e2688958a840c822279fda0086fec1fdab2f95bf2b717b66871c4ad9859d7e8", size = 4461038 },
{ url = "https://files.pythonhosted.org/packages/d6/b9/fb620dd47fc7cc9678af8f8bd8c772034ca4977237049287e99dda360b66/pillow-11.0.0-cp313-cp313t-win32.whl", hash = "sha256:607bbe123c74e272e381a8d1957083a9463401f7bd01287f50521ecb05a313f8", size = 2253197 },
{ url = "https://files.pythonhosted.org/packages/df/86/25dde85c06c89d7fc5db17940f07aae0a56ac69aa9ccb5eb0f09798862a8/pillow-11.0.0-cp313-cp313t-win_amd64.whl", hash = "sha256:5c39ed17edea3bc69c743a8dd3e9853b7509625c2462532e62baa0732163a904", size = 2572169 },
{ url = "https://files.pythonhosted.org/packages/51/85/9c33f2517add612e17f3381aee7c4072779130c634921a756c97bc29fb49/pillow-11.0.0-cp313-cp313t-win_arm64.whl", hash = "sha256:75acbbeb05b86bc53cbe7b7e6fe00fbcf82ad7c684b3ad82e3d711da9ba287d3", size = 2256828 },
{ url = "https://files.pythonhosted.org/packages/36/57/42a4dd825eab762ba9e690d696d894ba366e06791936056e26e099398cda/pillow-11.0.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:1187739620f2b365de756ce086fdb3604573337cc28a0d3ac4a01ab6b2d2a6d2", size = 3119239 },
{ url = "https://files.pythonhosted.org/packages/98/f7/25f9f9e368226a1d6cf3507081a1a7944eddd3ca7821023377043f5a83c8/pillow-11.0.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:fbbcb7b57dc9c794843e3d1258c0fbf0f48656d46ffe9e09b63bbd6e8cd5d0a2", size = 2950803 },
{ url = "https://files.pythonhosted.org/packages/59/01/98ead48a6c2e31e6185d4c16c978a67fe3ccb5da5c2ff2ba8475379bb693/pillow-11.0.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5d203af30149ae339ad1b4f710d9844ed8796e97fda23ffbc4cc472968a47d0b", size = 3281098 },
{ url = "https://files.pythonhosted.org/packages/51/c0/570255b2866a0e4d500a14f950803a2ec273bac7badc43320120b9262450/pillow-11.0.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:21a0d3b115009ebb8ac3d2ebec5c2982cc693da935f4ab7bb5c8ebe2f47d36f2", size = 3323665 },
{ url = "https://files.pythonhosted.org/packages/0e/75/689b4ec0483c42bfc7d1aacd32ade7a226db4f4fac57c6fdcdf90c0731e3/pillow-11.0.0-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:73853108f56df97baf2bb8b522f3578221e56f646ba345a372c78326710d3830", size = 3310533 },
{ url = "https://files.pythonhosted.org/packages/3d/30/38bd6149cf53da1db4bad304c543ade775d225961c4310f30425995cb9ec/pillow-11.0.0-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:e58876c91f97b0952eb766123bfef372792ab3f4e3e1f1a2267834c2ab131734", size = 3414886 },
{ url = "https://files.pythonhosted.org/packages/ec/3d/c32a51d848401bd94cabb8767a39621496491ee7cd5199856b77da9b18ad/pillow-11.0.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:224aaa38177597bb179f3ec87eeefcce8e4f85e608025e9cfac60de237ba6316", size = 2567508 },
{ url = "https://files.pythonhosted.org/packages/0e/69/a31cccd538ca0b5272be2a38347f8839b97a14be104ea08b0db92f749c74/pillow-10.4.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:4d9667937cfa347525b319ae34375c37b9ee6b525440f3ef48542fcf66f2731e", size = 3509271 },
{ url = "https://files.pythonhosted.org/packages/9a/9e/4143b907be8ea0bce215f2ae4f7480027473f8b61fcedfda9d851082a5d2/pillow-10.4.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:543f3dc61c18dafb755773efc89aae60d06b6596a63914107f75459cf984164d", size = 3375658 },
{ url = "https://files.pythonhosted.org/packages/8a/25/1fc45761955f9359b1169aa75e241551e74ac01a09f487adaaf4c3472d11/pillow-10.4.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7928ecbf1ece13956b95d9cbcfc77137652b02763ba384d9ab508099a2eca856", size = 4332075 },
{ url = "https://files.pythonhosted.org/packages/5e/dd/425b95d0151e1d6c951f45051112394f130df3da67363b6bc75dc4c27aba/pillow-10.4.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e4d49b85c4348ea0b31ea63bc75a9f3857869174e2bf17e7aba02945cd218e6f", size = 4444808 },
{ url = "https://files.pythonhosted.org/packages/b1/84/9a15cc5726cbbfe7f9f90bfb11f5d028586595907cd093815ca6644932e3/pillow-10.4.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:6c762a5b0997f5659a5ef2266abc1d8851ad7749ad9a6a5506eb23d314e4f46b", size = 4356290 },
{ url = "https://files.pythonhosted.org/packages/b5/5b/6651c288b08df3b8c1e2f8c1152201e0b25d240e22ddade0f1e242fc9fa0/pillow-10.4.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:a985e028fc183bf12a77a8bbf36318db4238a3ded7fa9df1b9a133f1cb79f8fc", size = 4525163 },
{ url = "https://files.pythonhosted.org/packages/07/8b/34854bf11a83c248505c8cb0fcf8d3d0b459a2246c8809b967963b6b12ae/pillow-10.4.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:812f7342b0eee081eaec84d91423d1b4650bb9828eb53d8511bcef8ce5aecf1e", size = 4463100 },
{ url = "https://files.pythonhosted.org/packages/78/63/0632aee4e82476d9cbe5200c0cdf9ba41ee04ed77887432845264d81116d/pillow-10.4.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:ac1452d2fbe4978c2eec89fb5a23b8387aba707ac72810d9490118817d9c0b46", size = 4592880 },
{ url = "https://files.pythonhosted.org/packages/df/56/b8663d7520671b4398b9d97e1ed9f583d4afcbefbda3c6188325e8c297bd/pillow-10.4.0-cp310-cp310-win32.whl", hash = "sha256:bcd5e41a859bf2e84fdc42f4edb7d9aba0a13d29a2abadccafad99de3feff984", size = 2235218 },
{ url = "https://files.pythonhosted.org/packages/f4/72/0203e94a91ddb4a9d5238434ae6c1ca10e610e8487036132ea9bf806ca2a/pillow-10.4.0-cp310-cp310-win_amd64.whl", hash = "sha256:ecd85a8d3e79cd7158dec1c9e5808e821feea088e2f69a974db5edf84dc53141", size = 2554487 },
{ url = "https://files.pythonhosted.org/packages/bd/52/7e7e93d7a6e4290543f17dc6f7d3af4bd0b3dd9926e2e8a35ac2282bc5f4/pillow-10.4.0-cp310-cp310-win_arm64.whl", hash = "sha256:ff337c552345e95702c5fde3158acb0625111017d0e5f24bf3acdb9cc16b90d1", size = 2243219 },
{ url = "https://files.pythonhosted.org/packages/a7/62/c9449f9c3043c37f73e7487ec4ef0c03eb9c9afc91a92b977a67b3c0bbc5/pillow-10.4.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:0a9ec697746f268507404647e531e92889890a087e03681a3606d9b920fbee3c", size = 3509265 },
{ url = "https://files.pythonhosted.org/packages/f4/5f/491dafc7bbf5a3cc1845dc0430872e8096eb9e2b6f8161509d124594ec2d/pillow-10.4.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:dfe91cb65544a1321e631e696759491ae04a2ea11d36715eca01ce07284738be", size = 3375655 },
{ url = "https://files.pythonhosted.org/packages/73/d5/c4011a76f4207a3c151134cd22a1415741e42fa5ddecec7c0182887deb3d/pillow-10.4.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5dc6761a6efc781e6a1544206f22c80c3af4c8cf461206d46a1e6006e4429ff3", size = 4340304 },
{ url = "https://files.pythonhosted.org/packages/ac/10/c67e20445a707f7a610699bba4fe050583b688d8cd2d202572b257f46600/pillow-10.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5e84b6cc6a4a3d76c153a6b19270b3526a5a8ed6b09501d3af891daa2a9de7d6", size = 4452804 },
{ url = "https://files.pythonhosted.org/packages/a9/83/6523837906d1da2b269dee787e31df3b0acb12e3d08f024965a3e7f64665/pillow-10.4.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:bbc527b519bd3aa9d7f429d152fea69f9ad37c95f0b02aebddff592688998abe", size = 4365126 },
{ url = "https://files.pythonhosted.org/packages/ba/e5/8c68ff608a4203085158cff5cc2a3c534ec384536d9438c405ed6370d080/pillow-10.4.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:76a911dfe51a36041f2e756b00f96ed84677cdeb75d25c767f296c1c1eda1319", size = 4533541 },
{ url = "https://files.pythonhosted.org/packages/f4/7c/01b8dbdca5bc6785573f4cee96e2358b0918b7b2c7b60d8b6f3abf87a070/pillow-10.4.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:59291fb29317122398786c2d44427bbd1a6d7ff54017075b22be9d21aa59bd8d", size = 4471616 },
{ url = "https://files.pythonhosted.org/packages/c8/57/2899b82394a35a0fbfd352e290945440e3b3785655a03365c0ca8279f351/pillow-10.4.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:416d3a5d0e8cfe4f27f574362435bc9bae57f679a7158e0096ad2beb427b8696", size = 4600802 },
{ url = "https://files.pythonhosted.org/packages/4d/d7/a44f193d4c26e58ee5d2d9db3d4854b2cfb5b5e08d360a5e03fe987c0086/pillow-10.4.0-cp311-cp311-win32.whl", hash = "sha256:7086cc1d5eebb91ad24ded9f58bec6c688e9f0ed7eb3dbbf1e4800280a896496", size = 2235213 },
{ url = "https://files.pythonhosted.org/packages/c1/d0/5866318eec2b801cdb8c82abf190c8343d8a1cd8bf5a0c17444a6f268291/pillow-10.4.0-cp311-cp311-win_amd64.whl", hash = "sha256:cbed61494057c0f83b83eb3a310f0bf774b09513307c434d4366ed64f4128a91", size = 2554498 },
{ url = "https://files.pythonhosted.org/packages/d4/c8/310ac16ac2b97e902d9eb438688de0d961660a87703ad1561fd3dfbd2aa0/pillow-10.4.0-cp311-cp311-win_arm64.whl", hash = "sha256:f5f0c3e969c8f12dd2bb7e0b15d5c468b51e5017e01e2e867335c81903046a22", size = 2243219 },
{ url = "https://files.pythonhosted.org/packages/05/cb/0353013dc30c02a8be34eb91d25e4e4cf594b59e5a55ea1128fde1e5f8ea/pillow-10.4.0-cp312-cp312-macosx_10_10_x86_64.whl", hash = "sha256:673655af3eadf4df6b5457033f086e90299fdd7a47983a13827acf7459c15d94", size = 3509350 },
{ url = "https://files.pythonhosted.org/packages/e7/cf/5c558a0f247e0bf9cec92bff9b46ae6474dd736f6d906315e60e4075f737/pillow-10.4.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:866b6942a92f56300012f5fbac71f2d610312ee65e22f1aa2609e491284e5597", size = 3374980 },
{ url = "https://files.pythonhosted.org/packages/84/48/6e394b86369a4eb68b8a1382c78dc092245af517385c086c5094e3b34428/pillow-10.4.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:29dbdc4207642ea6aad70fbde1a9338753d33fb23ed6956e706936706f52dd80", size = 4343799 },
{ url = "https://files.pythonhosted.org/packages/3b/f3/a8c6c11fa84b59b9df0cd5694492da8c039a24cd159f0f6918690105c3be/pillow-10.4.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bf2342ac639c4cf38799a44950bbc2dfcb685f052b9e262f446482afaf4bffca", size = 4459973 },
{ url = "https://files.pythonhosted.org/packages/7d/1b/c14b4197b80150fb64453585247e6fb2e1d93761fa0fa9cf63b102fde822/pillow-10.4.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:f5b92f4d70791b4a67157321c4e8225d60b119c5cc9aee8ecf153aace4aad4ef", size = 4370054 },
{ url = "https://files.pythonhosted.org/packages/55/77/40daddf677897a923d5d33329acd52a2144d54a9644f2a5422c028c6bf2d/pillow-10.4.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:86dcb5a1eb778d8b25659d5e4341269e8590ad6b4e8b44d9f4b07f8d136c414a", size = 4539484 },
{ url = "https://files.pythonhosted.org/packages/40/54/90de3e4256b1207300fb2b1d7168dd912a2fb4b2401e439ba23c2b2cabde/pillow-10.4.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:780c072c2e11c9b2c7ca37f9a2ee8ba66f44367ac3e5c7832afcfe5104fd6d1b", size = 4477375 },
{ url = "https://files.pythonhosted.org/packages/13/24/1bfba52f44193860918ff7c93d03d95e3f8748ca1de3ceaf11157a14cf16/pillow-10.4.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:37fb69d905be665f68f28a8bba3c6d3223c8efe1edf14cc4cfa06c241f8c81d9", size = 4608773 },
{ url = "https://files.pythonhosted.org/packages/55/04/5e6de6e6120451ec0c24516c41dbaf80cce1b6451f96561235ef2429da2e/pillow-10.4.0-cp312-cp312-win32.whl", hash = "sha256:7dfecdbad5c301d7b5bde160150b4db4c659cee2b69589705b6f8a0c509d9f42", size = 2235690 },
{ url = "https://files.pythonhosted.org/packages/74/0a/d4ce3c44bca8635bd29a2eab5aa181b654a734a29b263ca8efe013beea98/pillow-10.4.0-cp312-cp312-win_amd64.whl", hash = "sha256:1d846aea995ad352d4bdcc847535bd56e0fd88d36829d2c90be880ef1ee4668a", size = 2554951 },
{ url = "https://files.pythonhosted.org/packages/b5/ca/184349ee40f2e92439be9b3502ae6cfc43ac4b50bc4fc6b3de7957563894/pillow-10.4.0-cp312-cp312-win_arm64.whl", hash = "sha256:e553cad5179a66ba15bb18b353a19020e73a7921296a7979c4a2b7f6a5cd57f9", size = 2243427 },
{ url = "https://files.pythonhosted.org/packages/c3/00/706cebe7c2c12a6318aabe5d354836f54adff7156fd9e1bd6c89f4ba0e98/pillow-10.4.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8bc1a764ed8c957a2e9cacf97c8b2b053b70307cf2996aafd70e91a082e70df3", size = 3525685 },
{ url = "https://files.pythonhosted.org/packages/cf/76/f658cbfa49405e5ecbfb9ba42d07074ad9792031267e782d409fd8fe7c69/pillow-10.4.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:6209bb41dc692ddfee4942517c19ee81b86c864b626dbfca272ec0f7cff5d9fb", size = 3374883 },
{ url = "https://files.pythonhosted.org/packages/46/2b/99c28c4379a85e65378211971c0b430d9c7234b1ec4d59b2668f6299e011/pillow-10.4.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bee197b30783295d2eb680b311af15a20a8b24024a19c3a26431ff83eb8d1f70", size = 4339837 },
{ url = "https://files.pythonhosted.org/packages/f1/74/b1ec314f624c0c43711fdf0d8076f82d9d802afd58f1d62c2a86878e8615/pillow-10.4.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1ef61f5dd14c300786318482456481463b9d6b91ebe5ef12f405afbba77ed0be", size = 4455562 },
{ url = "https://files.pythonhosted.org/packages/4a/2a/4b04157cb7b9c74372fa867096a1607e6fedad93a44deeff553ccd307868/pillow-10.4.0-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:297e388da6e248c98bc4a02e018966af0c5f92dfacf5a5ca22fa01cb3179bca0", size = 4366761 },
{ url = "https://files.pythonhosted.org/packages/ac/7b/8f1d815c1a6a268fe90481232c98dd0e5fa8c75e341a75f060037bd5ceae/pillow-10.4.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:e4db64794ccdf6cb83a59d73405f63adbe2a1887012e308828596100a0b2f6cc", size = 4536767 },
{ url = "https://files.pythonhosted.org/packages/e5/77/05fa64d1f45d12c22c314e7b97398ffb28ef2813a485465017b7978b3ce7/pillow-10.4.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:bd2880a07482090a3bcb01f4265f1936a903d70bc740bfcb1fd4e8a2ffe5cf5a", size = 4477989 },
{ url = "https://files.pythonhosted.org/packages/12/63/b0397cfc2caae05c3fb2f4ed1b4fc4fc878f0243510a7a6034ca59726494/pillow-10.4.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4b35b21b819ac1dbd1233317adeecd63495f6babf21b7b2512d244ff6c6ce309", size = 4610255 },
{ url = "https://files.pythonhosted.org/packages/7b/f9/cfaa5082ca9bc4a6de66ffe1c12c2d90bf09c309a5f52b27759a596900e7/pillow-10.4.0-cp313-cp313-win32.whl", hash = "sha256:551d3fd6e9dc15e4c1eb6fc4ba2b39c0c7933fa113b220057a34f4bb3268a060", size = 2235603 },
{ url = "https://files.pythonhosted.org/packages/01/6a/30ff0eef6e0c0e71e55ded56a38d4859bf9d3634a94a88743897b5f96936/pillow-10.4.0-cp313-cp313-win_amd64.whl", hash = "sha256:030abdbe43ee02e0de642aee345efa443740aa4d828bfe8e2eb11922ea6a21ea", size = 2554972 },
{ url = "https://files.pythonhosted.org/packages/48/2c/2e0a52890f269435eee38b21c8218e102c621fe8d8df8b9dd06fabf879ba/pillow-10.4.0-cp313-cp313-win_arm64.whl", hash = "sha256:5b001114dd152cfd6b23befeb28d7aee43553e2402c9f159807bf55f33af8a8d", size = 2243375 },
{ url = "https://files.pythonhosted.org/packages/38/30/095d4f55f3a053392f75e2eae45eba3228452783bab3d9a920b951ac495c/pillow-10.4.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:5b4815f2e65b30f5fbae9dfffa8636d992d49705723fe86a3661806e069352d4", size = 3493889 },
{ url = "https://files.pythonhosted.org/packages/f3/e8/4ff79788803a5fcd5dc35efdc9386af153569853767bff74540725b45863/pillow-10.4.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:8f0aef4ef59694b12cadee839e2ba6afeab89c0f39a3adc02ed51d109117b8da", size = 3346160 },
{ url = "https://files.pythonhosted.org/packages/d7/ac/4184edd511b14f760c73f5bb8a5d6fd85c591c8aff7c2229677a355c4179/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9f4727572e2918acaa9077c919cbbeb73bd2b3ebcfe033b72f858fc9fbef0026", size = 3435020 },
{ url = "https://files.pythonhosted.org/packages/da/21/1749cd09160149c0a246a81d646e05f35041619ce76f6493d6a96e8d1103/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ff25afb18123cea58a591ea0244b92eb1e61a1fd497bf6d6384f09bc3262ec3e", size = 3490539 },
{ url = "https://files.pythonhosted.org/packages/b6/f5/f71fe1888b96083b3f6dfa0709101f61fc9e972c0c8d04e9d93ccef2a045/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:dc3e2db6ba09ffd7d02ae9141cfa0ae23393ee7687248d46a7507b75d610f4f5", size = 3476125 },
{ url = "https://files.pythonhosted.org/packages/96/b9/c0362c54290a31866c3526848583a2f45a535aa9d725fd31e25d318c805f/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:02a2be69f9c9b8c1e97cf2713e789d4e398c751ecfd9967c18d0ce304efbf885", size = 3579373 },
{ url = "https://files.pythonhosted.org/packages/52/3b/ce7a01026a7cf46e5452afa86f97a5e88ca97f562cafa76570178ab56d8d/pillow-10.4.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:0755ffd4a0c6f267cccbae2e9903d95477ca2f77c4fcf3a3a09570001856c8a5", size = 2554661 },
]
[[package]]
@@ -3202,34 +3313,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/22/a6/858897256d0deac81a172289110f31629fc4cee19b6f01283303e18c8db3/ptyprocess-0.7.0-py2.py3-none-any.whl", hash = "sha256:4b41f3967fce3af57cc7e94b888626c18bf37a083e3651ca8feeb66d492fef35", size = 13993 },
]
[[package]]
name = "pulsar-client"
version = "3.5.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "certifi" },
]
wheels = [
{ url = "https://files.pythonhosted.org/packages/e0/aa/eb3b04be87b961324e49748f3a715a12127d45d76258150bfa61b2a002d8/pulsar_client-3.5.0-cp310-cp310-macosx_10_15_universal2.whl", hash = "sha256:c18552edb2f785de85280fe624bc507467152bff810fc81d7660fa2dfa861f38", size = 10953552 },
{ url = "https://files.pythonhosted.org/packages/cc/20/d59bf89ccdda45edd89f5b54bd1e93605ebe5ad3cb73f4f4f5e8eca8f9e6/pulsar_client-3.5.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:18d438e456c146f01be41ef146f649dedc8f7bc714d9eaef94cff2e34099812b", size = 5190714 },
{ url = "https://files.pythonhosted.org/packages/1a/02/ca7e96b97d564d0375b8e3de65f95ac86c8502c40f6ff750e9d145709d9a/pulsar_client-3.5.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:18a26a0719841103c7a89eb1492c4a8fedf89adaa386375baecbb4fa2707e88f", size = 5429820 },
{ url = "https://files.pythonhosted.org/packages/47/f3/682670cdc951b830cd3d8d1287521997327254e59508772664aaa656e246/pulsar_client-3.5.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:ab0e1605dc5f44a126163fd06cd0a768494ad05123f6e0de89a2c71d6e2d2319", size = 5710427 },
{ url = "https://files.pythonhosted.org/packages/bc/00/119cd039286dfc1c91a5580963e9ba79204cd4717b16b7a6fdc57d1c1673/pulsar_client-3.5.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:cdef720891b97656fdce3bf5913ea7729b2156b84ba64314f432c1e72c6117fa", size = 5916490 },
{ url = "https://files.pythonhosted.org/packages/0a/cc/d606b483dbb263cbaf7fc7c3d2ec4032628cf3324266cf9a4ccdb2a73076/pulsar_client-3.5.0-cp310-cp310-win_amd64.whl", hash = "sha256:a42544e38773191fe550644a90e8050579476bb2dcf17ac69a4aed62a6cb70e7", size = 3305387 },
{ url = "https://files.pythonhosted.org/packages/0d/2e/aec6886a6d67f09230476182399b7fad694fbcbbaf004ce914725d4eddd9/pulsar_client-3.5.0-cp311-cp311-macosx_10_15_universal2.whl", hash = "sha256:fd94432ea5d398ea78f8f2e09a217ec5058d26330c137a22690478c031e116da", size = 10954116 },
{ url = "https://files.pythonhosted.org/packages/43/06/b98df9300f60e5fad3396f843dd633c31176a495a2d60ba111c99511658a/pulsar_client-3.5.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d6252ae462e07ece4071213fdd9c76eab82ca522a749f2dc678037d4cbacd40b", size = 5189618 },
{ url = "https://files.pythonhosted.org/packages/72/05/c9aef7da7802a03c0b65ffe8f00a24289ff992f99ed5d5d1fd0ed63d9cf6/pulsar_client-3.5.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:03b4d440b2d74323784328b082872ee2f206c440b5d224d7941eb3c083ec06c6", size = 5429329 },
{ url = "https://files.pythonhosted.org/packages/06/96/9acfe6f1d827cdd53b8460b04c63b4081333ef64a49a2f425419f1eb6b6b/pulsar_client-3.5.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:f60af840b8d64a2fac5a0c1ce6ae0ddffec5f42267c6ded2c5e74bad8345f2a1", size = 5710106 },
{ url = "https://files.pythonhosted.org/packages/e1/7b/877a06eff5c9ac828cdb75e378ee29b0adac9328da9ee173eaf7076d8c56/pulsar_client-3.5.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:2277a447c3b7f6571cb1eb9fc5c25da3fdd43d0b2fb91cf52054adfadc7d6842", size = 5916541 },
{ url = "https://files.pythonhosted.org/packages/fb/62/ed1da1ef72c95ba6a830e43995550ed0a1d26c223fb4b036ac6cd028c2ed/pulsar_client-3.5.0-cp311-cp311-win_amd64.whl", hash = "sha256:f20f3e9dd50db2a37059abccad42078b7a4754b8bc1d3ae6502e71c1ad2209f0", size = 3305485 },
{ url = "https://files.pythonhosted.org/packages/81/19/4b145766df706aa5e09f60bbf5f87b934e6ac950fddd18f4acd520c465b9/pulsar_client-3.5.0-cp312-cp312-macosx_10_15_universal2.whl", hash = "sha256:d61f663d85308e12f44033ba95af88730f581a7e8da44f7a5c080a3aaea4878d", size = 10967548 },
{ url = "https://files.pythonhosted.org/packages/bf/bd/9bc05ee861b46884554a4c61f96edb9602de131dd07982c27920e554ab5b/pulsar_client-3.5.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2a1ba0be25b6f747bcb28102b7d906ec1de48dc9f1a2d9eacdcc6f44ab2c9e17", size = 5189598 },
{ url = "https://files.pythonhosted.org/packages/76/00/379bedfa6f1c810553996a4cb0984fa2e2c89afc5953df0936e1c9636003/pulsar_client-3.5.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a181e3e60ac39df72ccb3c415d7aeac61ad0286497a6e02739a560d5af28393a", size = 5430145 },
{ url = "https://files.pythonhosted.org/packages/88/c8/8a37d75aa9132a69a28061c9e5f4b516328a1968b58bbae018f431c6d3d4/pulsar_client-3.5.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:3c72895ff7f51347e4f78b0375b2213fa70dd4790bbb78177b4002846f1fd290", size = 5708960 },
{ url = "https://files.pythonhosted.org/packages/6e/9a/abd98661e3f7ae3a8e1d3fb0fc7eba1a30005391ebd575ab06a66021256c/pulsar_client-3.5.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:547dba1b185a17eba915e51d0a3aca27c80747b6187e5cd7a71a3ca33921decc", size = 5915227 },
{ url = "https://files.pythonhosted.org/packages/a2/51/db376181d05716de595515fac736e3d06e96d3345ba0e31c0a90c352eae1/pulsar_client-3.5.0-cp312-cp312-win_amd64.whl", hash = "sha256:443b786eed96bc86d2297a6a42e79f39d1abf217ec603e0bd303f3488c0234af", size = 3306515 },
]
[[package]]
name = "pure-eval"
version = "0.2.3"
@@ -3248,6 +3331,48 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/f6/f0/10642828a8dfb741e5f3fbaac830550a518a775c7fff6f04a007259b0548/py-1.11.0-py2.py3-none-any.whl", hash = "sha256:607c53218732647dff4acdfcd50cb62615cedf612e72d1724fb1a0cc6405b378", size = 98708 },
]
[[package]]
name = "py-rust-stemmers"
version = "0.1.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f4/8a/c7481c6e324da825f13bafb362dbca47dbf8a7dd1a3a3502f47cdb05bfa9/py_rust_stemmers-0.1.3.tar.gz", hash = "sha256:ad796d47874181a25addb505a04245e34620bd7a0c5055671f52d9ce993253e2", size = 8676 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/3e/ed/4c85aa5f2046f7c34db174b89f92d24daaa347a149343f43614a6329c006/py_rust_stemmers-0.1.3-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:8b4861673bc690a5830a5d84d61c64a95ede86f79c9952df66e99e0559fe8264", size = 287578 },
{ url = "https://files.pythonhosted.org/packages/72/7c/b3df3222e375cb838572952217cedf3d7925f85f3449c3c87142417e9fab/py_rust_stemmers-0.1.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:b0d2108c758e8081064cbbb7fc70d3cdfd32e0cccf7d051c1d888d16c91c1e78", size = 273908 },
{ url = "https://files.pythonhosted.org/packages/48/d2/2c422476a6e21d9adbf4355b306269ac396eaa853efc896afdb2c628a334/py_rust_stemmers-0.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fdf43a726b81dd5439a98973200546660e10379e805bb6fd6366dbd8d0857666", size = 309863 },
{ url = "https://files.pythonhosted.org/packages/ff/4f/42cd09a77639f3b0b2d662cbbc19248355ce40ba69eaac796007aae37b7e/py_rust_stemmers-0.1.3-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:03acb3d89f8090f67698d2c64172492618585927dfb56d0b5f6070ff54269940", size = 313215 },
{ url = "https://files.pythonhosted.org/packages/8a/2c/39bfcdf674c799cb486fd1f10a9ce1599030884b47f2819aabb39db0398a/py_rust_stemmers-0.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b3f8cd1139a641ed53e9a1d7f25ae9cf3757cae96a2b0ce0d9399332ec8b148f", size = 323524 },
{ url = "https://files.pythonhosted.org/packages/95/b4/38e66537da1864538912aae92f8285badf8201bccdddfdbe06c3c27e99ac/py_rust_stemmers-0.1.3-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:0a5906aa2eec31f647b94d6cc9b2b065bf77ca31be095fcbb1b412ba42f0e473", size = 323903 },
{ url = "https://files.pythonhosted.org/packages/78/a5/7f219ff3547bfc1337b00761c6cd857fe51b90014b9d51aeba325e33d548/py_rust_stemmers-0.1.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:b89fe8e55201604e89bdbd7559b19337ef9ae703a5545878d37664507c1067e9", size = 485483 },
{ url = "https://files.pythonhosted.org/packages/66/59/43c89cb1388a9c508d28868ce04900d0f3b4457a74b1c61411c9306a3aa4/py_rust_stemmers-0.1.3-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:0d43981b272c73709d3885ed096a332b2a160db2317fbe16cc9ef3b1d974d39a", size = 567275 },
{ url = "https://files.pythonhosted.org/packages/7d/3a/08722448c51e7b926b8f40a55f363e92236a89b761e89e5ee76b0e11baa8/py_rust_stemmers-0.1.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:1b379c3901a87ee63d7cbb01a68ece78af7040e0c3e3d52fe7b108bfa399feb2", size = 488902 },
{ url = "https://files.pythonhosted.org/packages/c3/74/41efa33c0eb008eb2b1337f40021debf487e8cea5dbe4af97241a43d54b7/py_rust_stemmers-0.1.3-cp310-none-win_amd64.whl", hash = "sha256:0f571ee0f2a4b2314d4cd8ef26af83e1fd24ea3e3ff97407d536184167f05957", size = 208973 },
{ url = "https://files.pythonhosted.org/packages/da/3b/f61826b786ed06f195c80b542abe082dcdd1747341c1194f6f782d566a02/py_rust_stemmers-0.1.3-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:2d8b8e6b6d5839a168dae510a00ff4662c7d0a22d12f24fe81caa0ac59265711", size = 287577 },
{ url = "https://files.pythonhosted.org/packages/59/fd/322bf0dbc142ae71516c06c2026f4ac0a4685f108a873935581b7eef3d9d/py_rust_stemmers-0.1.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:02b347ab8fe686a88aef0432060471d501b37a6b9a868e7c50bffcd382269cf2", size = 273910 },
{ url = "https://files.pythonhosted.org/packages/10/34/02aa64046e4a21b1dd5f7d602fb33b1c79bd0dd57c8ebfe5897efcf62ac3/py_rust_stemmers-0.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d4a65b429eb1282934a1cc3c1b2698ae32a6dc00d6be00dd747e688c642eb110", size = 309863 },
{ url = "https://files.pythonhosted.org/packages/10/a4/f4fd2afc713b0497b76023c6e491f356962213bd518f148cbd28b7144e78/py_rust_stemmers-0.1.3-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:9fbbb37e0df579859b42b3f850aa08fe829d190d32c6338349eccb0e762b74c6", size = 313218 },
{ url = "https://files.pythonhosted.org/packages/98/78/f64e096df43d730fb5f6e2201e6d6ca05ed18e94946f11cdeddd0205f099/py_rust_stemmers-0.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d6f9790fe1e9962787817b1894486df7e0b5fc59e4adad423e189530530fae11", size = 323525 },
{ url = "https://files.pythonhosted.org/packages/21/38/09beb9ca8ec3af8dbfd441f77fc003472ca900f678d1eb25839db08df691/py_rust_stemmers-0.1.3-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:fd5d7388f807f584b4c55bfbe608ef40cff0024c1dc54de95d28265395065d02", size = 323903 },
{ url = "https://files.pythonhosted.org/packages/fc/63/08af5678a0cb0f6c5a462def7aec0c32f3742574ee36ddd660103d13bc86/py_rust_stemmers-0.1.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:72a7b810d8d376c03f0ccebe146f04cbf4c6c97bd74e489b0ddf1342eb40970c", size = 485484 },
{ url = "https://files.pythonhosted.org/packages/33/a7/740b8dd06cb48ed397d65cabda9d38c2c310869c3bf51b0e0a347cb7fc8f/py_rust_stemmers-0.1.3-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:658784c0072f7aae67c726be9acac40dd27b29416356c63a3a760a9499a93513", size = 567275 },
{ url = "https://files.pythonhosted.org/packages/6e/75/e785900047b4fc5773d0bea37c565825df26de81f25ab2d341ecaa2f55f5/py_rust_stemmers-0.1.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:e6afcd19da56d4182eecb43bdb6c5b9686370063f2538df877fc23f1d16f909e", size = 488906 },
{ url = "https://files.pythonhosted.org/packages/5b/ee/86ee4eb3188f45cf0831318dab9afddc231ae71b8fecc0dbbc79eb885ded/py_rust_stemmers-0.1.3-cp311-none-win_amd64.whl", hash = "sha256:47211ac6252eb484f5067d30b1812667936deffcef89b4b0acd2efe881a99aed", size = 208976 },
{ url = "https://files.pythonhosted.org/packages/cc/08/f9c9ef78c7dca7a69c451b1df754195e02a3a1e7a450becdce687102aae7/py_rust_stemmers-0.1.3-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a36bfbd9219a55bdf5aa9c5d74b8a3741cb092495190ca18551dc39f57272d57", size = 287577 },
{ url = "https://files.pythonhosted.org/packages/50/3a/5c518bc2761f8a873b1ec9333f7f74a8f58e7e8b39d5de065038427b114b/py_rust_stemmers-0.1.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ca1ab04ff2fa15a1d0685007293ffdf4679dcfdc02fc5b36c1af0111670908a1", size = 273906 },
{ url = "https://files.pythonhosted.org/packages/b4/ae/3cae1a65a99687e4bf830ab733b3adde13e458a7908b6826dd9025c8c5c3/py_rust_stemmers-0.1.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ccaa08251b9cb421429976d56365ddf9db63b5a8ac4e7817723fb0b62adf8b19", size = 309864 },
{ url = "https://files.pythonhosted.org/packages/a9/f2/b4167a4a64b0bade1695b32e4bd13ca752085d43559670fd7173cfb59b9e/py_rust_stemmers-0.1.3-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6262b40f989c0b0bcb3eaef5511268ba63703428c4ab1aa9353a58c8572735b7", size = 313217 },
{ url = "https://files.pythonhosted.org/packages/54/ff/f27e0762a74668bf520525d7bad8daa4dd621ef5b3155c464c5bd8a7dd3f/py_rust_stemmers-0.1.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a073701b492ef900cee5185961c23006ba13fa6126cf716f241c929adbdfad6e", size = 323525 },
{ url = "https://files.pythonhosted.org/packages/d3/f2/2f4599ef5481be24378a23f93af405b4ca968450873d48d0a56ba925d7b5/py_rust_stemmers-0.1.3-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:39c75f10da70380076b68398d84cdc42b42966180bdb8216b81d21a824278b50", size = 323903 },
{ url = "https://files.pythonhosted.org/packages/dd/84/1aea103917659abc12456ce061621557eed0a44e174270908e3fb28f2cc3/py_rust_stemmers-0.1.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:34f7d92abc85f0f0b1fa407410b3f2daaf2c36b8277a2ffff2ff0beb2f2acc2f", size = 485487 },
{ url = "https://files.pythonhosted.org/packages/bd/67/16d48e7f02b285b39028aa47f847b3a279c903bc5cd49c8012ea90255317/py_rust_stemmers-0.1.3-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:fbb9f7933239a57d1d9c0fcdfbe0c5283a081e9e64ddc48ed878783be3d52b2b", size = 567278 },
{ url = "https://files.pythonhosted.org/packages/ad/1c/cb8cc9680f8aa04f96cb5c814887b3bb8d23a2e9abf460ef861ae16bfe50/py_rust_stemmers-0.1.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:921803a6f8259f10bf348ac0e32a767c28ab587c9ad5c3b1ee593a4bbbe98d39", size = 488907 },
{ url = "https://files.pythonhosted.org/packages/cd/29/88217de06239e3e526fa6286a11e3662d94acb0be4216c1310301a252dab/py_rust_stemmers-0.1.3-cp312-none-win_amd64.whl", hash = "sha256:576206b540575e81bb84a0f620b7a8529f5e89b0b2ec7d4487f3183789dd5cfd", size = 208980 },
{ url = "https://files.pythonhosted.org/packages/f1/45/e1ec9e76b4462e70fa42f6ac8be9f1bfe6565c1c260b9e5824e772157edf/py_rust_stemmers-0.1.3-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:59eacf7687738b20886a7c0ceeae999d501902b4e6234cf11eecd2f45f2c26bb", size = 288041 },
{ url = "https://files.pythonhosted.org/packages/4a/5b/eb594ca68715c23dd3b8f52dd700c10cbdd8133faaaf19886962c8f97c90/py_rust_stemmers-0.1.3-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:e39d5d273e13aec2f07a2c3ea0050b3bf3aaa7b6e9f6bef3d4e728ab49979ae8", size = 274089 },
{ url = "https://files.pythonhosted.org/packages/79/55/b62b14cdeb7268a818f21e4c8cfd543261c563dc9bd89ba7116293ce3008/py_rust_stemmers-0.1.3-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f95b25138431c4a457d684c49c6de5ff0c1852cf1cb3657e187ea63610fc7c21", size = 310373 },
{ url = "https://files.pythonhosted.org/packages/a4/71/f0b7131505013eaaa4fbfcd821b30b36431d01b7fe96951d84721cdb4ef8/py_rust_stemmers-0.1.3-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1cc9df57dff15d12d7fec65a541af6fdcefd40ea5f7ebd48ad5202a1b9a56f89", size = 324052 },
]
[[package]]
name = "pyarrow"
version = "17.0.0"
@@ -3464,6 +3589,26 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/48/8f/9bbf22ba6a00001a45dbc54337e5bbbd43e7d8f34c8158c92cddc45736af/pypdf-5.0.1-py3-none-any.whl", hash = "sha256:ff8a32da6c7a63fea9c32fa4dd837cdd0db7966adf6c14f043e3f12592e992db", size = 294470 },
]
[[package]]
name = "pypdfium2"
version = "4.30.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a1/14/838b3ba247a0ba92e4df5d23f2bea9478edcfd72b78a39d6ca36ccd84ad2/pypdfium2-4.30.0.tar.gz", hash = "sha256:48b5b7e5566665bc1015b9d69c1ebabe21f6aee468b509531c3c8318eeee2e16", size = 140239 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c7/9a/c8ff5cc352c1b60b0b97642ae734f51edbab6e28b45b4fcdfe5306ee3c83/pypdfium2-4.30.0-py3-none-macosx_10_13_x86_64.whl", hash = "sha256:b33ceded0b6ff5b2b93bc1fe0ad4b71aa6b7e7bd5875f1ca0cdfb6ba6ac01aab", size = 2837254 },
{ url = "https://files.pythonhosted.org/packages/21/8b/27d4d5409f3c76b985f4ee4afe147b606594411e15ac4dc1c3363c9a9810/pypdfium2-4.30.0-py3-none-macosx_11_0_arm64.whl", hash = "sha256:4e55689f4b06e2d2406203e771f78789bd4f190731b5d57383d05cf611d829de", size = 2707624 },
{ url = "https://files.pythonhosted.org/packages/11/63/28a73ca17c24b41a205d658e177d68e198d7dde65a8c99c821d231b6ee3d/pypdfium2-4.30.0-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e6e50f5ce7f65a40a33d7c9edc39f23140c57e37144c2d6d9e9262a2a854854", size = 2793126 },
{ url = "https://files.pythonhosted.org/packages/d1/96/53b3ebf0955edbd02ac6da16a818ecc65c939e98fdeb4e0958362bd385c8/pypdfium2-4.30.0-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3d0dd3ecaffd0b6dbda3da663220e705cb563918249bda26058c6036752ba3a2", size = 2591077 },
{ url = "https://files.pythonhosted.org/packages/ec/ee/0394e56e7cab8b5b21f744d988400948ef71a9a892cbeb0b200d324ab2c7/pypdfium2-4.30.0-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cc3bf29b0db8c76cdfaac1ec1cde8edf211a7de7390fbf8934ad2aa9b4d6dfad", size = 2864431 },
{ url = "https://files.pythonhosted.org/packages/65/cd/3f1edf20a0ef4a212a5e20a5900e64942c5a374473671ac0780eaa08ea80/pypdfium2-4.30.0-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f1f78d2189e0ddf9ac2b7a9b9bd4f0c66f54d1389ff6c17e9fd9dc034d06eb3f", size = 2812008 },
{ url = "https://files.pythonhosted.org/packages/c8/91/2d517db61845698f41a2a974de90762e50faeb529201c6b3574935969045/pypdfium2-4.30.0-py3-none-musllinux_1_1_aarch64.whl", hash = "sha256:5eda3641a2da7a7a0b2f4dbd71d706401a656fea521b6b6faa0675b15d31a163", size = 6181543 },
{ url = "https://files.pythonhosted.org/packages/ba/c4/ed1315143a7a84b2c7616569dfb472473968d628f17c231c39e29ae9d780/pypdfium2-4.30.0-py3-none-musllinux_1_1_i686.whl", hash = "sha256:0dfa61421b5eb68e1188b0b2231e7ba35735aef2d867d86e48ee6cab6975195e", size = 6175911 },
{ url = "https://files.pythonhosted.org/packages/7a/c4/9e62d03f414e0e3051c56d5943c3bf42aa9608ede4e19dc96438364e9e03/pypdfium2-4.30.0-py3-none-musllinux_1_1_x86_64.whl", hash = "sha256:f33bd79e7a09d5f7acca3b0b69ff6c8a488869a7fab48fdf400fec6e20b9c8be", size = 6267430 },
{ url = "https://files.pythonhosted.org/packages/90/47/eda4904f715fb98561e34012826e883816945934a851745570521ec89520/pypdfium2-4.30.0-py3-none-win32.whl", hash = "sha256:ee2410f15d576d976c2ab2558c93d392a25fb9f6635e8dd0a8a3a5241b275e0e", size = 2775951 },
{ url = "https://files.pythonhosted.org/packages/25/bd/56d9ec6b9f0fc4e0d95288759f3179f0fcd34b1a1526b75673d2f6d5196f/pypdfium2-4.30.0-py3-none-win_amd64.whl", hash = "sha256:90dbb2ac07be53219f56be09961eb95cf2473f834d01a42d901d13ccfad64b4c", size = 2892098 },
{ url = "https://files.pythonhosted.org/packages/be/7a/097801205b991bc3115e8af1edb850d30aeaf0118520b016354cf5ccd3f6/pypdfium2-4.30.0-py3-none-win_arm64.whl", hash = "sha256:119b2969a6d6b1e8d55e99caaf05290294f2d0fe49c12a3f17102d01c441bd29", size = 2752118 },
]
[[package]]
name = "pypika"
version = "0.48.9"
@@ -4763,6 +4908,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/56/27/96a5cd2626d11c8280656c6c71d8ab50fe006490ef9971ccd154e0c42cd2/websockets-13.1-py3-none-any.whl", hash = "sha256:a9a396a6ad26130cdae92ae10c36af09d9bfe6cafe69670fd3b6da9b07b4044f", size = 152134 },
]
[[package]]
name = "win32-setctime"
version = "1.1.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/6b/dd/f95a13d2b235a28d613ba23ebad55191514550debb968b46aab99f2e3a30/win32_setctime-1.1.0.tar.gz", hash = "sha256:15cf5750465118d6929ae4de4eb46e8edae9a5634350c01ba582df868e932cb2", size = 3676 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/0a/e6/a7d828fef907843b2a5773ebff47fb79ac0c1c88d60c0ca9530ee941e248/win32_setctime-1.1.0-py3-none-any.whl", hash = "sha256:231db239e959c2fe7eb1d7dc129f11172354f98361c4fa2d6d2d7e278baa8aad", size = 3604 },
]
[[package]]
name = "wrapt"
version = "1.16.0"