Compare commits

...

10 Commits

Author SHA1 Message Date
Devin AI
f1015827e4 Fix: Update import sorting and implement abstract methods
Co-Authored-By: Joe Moura <joao@crewai.com>
2025-04-24 10:31:59 +00:00
Devin AI
f2c9c5581c Fix: Update tests to work with lazy ChromaDB imports
Co-Authored-By: Joe Moura <joao@crewai.com>
2025-04-24 10:22:18 +00:00
Devin AI
4434481f03 Fix: Make ChromaDB imports lazy to prevent SQLite3 version check on import (fixes #2682)
Co-Authored-By: Joe Moura <joao@crewai.com>
2025-04-24 10:18:03 +00:00
Kunal Lunia
685d20f46c added gpt-4.1 models and gemini-2.0 and 2.5 pro models (#2609)
* added gpt4.1 models and gemini 2.0 and 2.5 models

* added flash model

* Updated test fun to all models

* Added Gemma3 test cases and passed all google test case

* added gemini 2.5 flash

* test: add missing cassettes

* test: ignore authorization key from gemini/gemma3 request

---------

Co-authored-by: Lucas Gomide <lucaslg200@gmail.com>
Co-authored-by: Lorenze Jay <63378463+lorenzejay@users.noreply.github.com>
2025-04-23 11:20:32 -07:00
Lucas Gomide
9ebf3aa043 docs(CodeInterpreterTool): update docs (#2675) 2025-04-23 10:27:25 -07:00
Tony Kipkemboi
2e4c97661a Add enterprise deployment documentation to CLI docs (#2670)
2025-04-22 13:27:58 -07:00
Tony Kipkemboi
16eb4df556 docs: update docs.json with contextual options, SEO, and 404 redirect (#2654)
* docs: 0.114.0 release notes, navigation restructure, new guides, deploy video, and cleanup

- Add v0.114.0 release notes with highlights image and doc links
- Restructure docs navigation (Strategy group, Releases tab, navbar links)
- Update quickstart with deployment video and clearer instructions
- Add/rename guides (Custom Manager Agent, Custom LLM)
- Remove legacy concept/tool docs
- Add new images and tool docs
- Minor formatting and content improvements throughout

* docs: update docs.json with contextual options, SEO indexing, and 404 redirect settings
2025-04-22 09:52:27 -07:00
Vini Brasil
3d9000495c Change CLI tool publish message (#2662) 2025-04-22 13:09:30 -03:00
Tony Kipkemboi
6d0039b117 docs: 0.114.0 release notes, navigation restructure, new guides, deploy video, and cleanup (#2653)
- Add v0.114.0 release notes with highlights image and doc links
- Restructure docs navigation (Strategy group, Releases tab, navbar links)
- Update quickstart with deployment video and clearer instructions
- Add/rename guides (Custom Manager Agent, Custom LLM)
- Remove legacy concept/tool docs
- Add new images and tool docs
- Minor formatting and content improvements throughout
2025-04-21 19:18:21 -04:00
Lorenze Jay
311a078ca6 Enhance knowledge management in CrewAI (#2637)
* Enhance knowledge management in CrewAI

- Added `KnowledgeConfig` class to configure knowledge retrieval parameters such as `limit` and `score_threshold`.
- Updated `Agent` and `Crew` classes to utilize the new knowledge configuration for querying knowledge sources.
- Enhanced documentation to clarify the addition of knowledge sources at both agent and crew levels.
- Introduced new tips in documentation to guide users on knowledge source management and configuration.

* Refactor knowledge configuration parameters in CrewAI

- Renamed `limit` to `results_limit` in `KnowledgeConfig`, `query_knowledge`, and `query` methods for consistency and clarity.
- Updated related documentation to reflect the new parameter name, ensuring users understand the configuration options for knowledge retrieval.

* Refactor agent tests to utilize mock knowledge storage

- Updated test cases in `agent_test.py` to use `KnowledgeStorage` for mocking knowledge sources, enhancing test reliability and clarity.
- Renamed `limit` to `results_limit` in `KnowledgeConfig` for consistency with recent changes.
- Ensured that knowledge queries are properly mocked to return expected results during tests.

* Add VCR support for agent tests with query limits and score thresholds

- Introduced `@pytest.mark.vcr` decorator in `agent_test.py` for tests involving knowledge sources, ensuring consistent recording of HTTP interactions.
- Added new YAML cassette files for `test_agent_with_knowledge_sources_with_query_limit_and_score_threshold` and `test_agent_with_knowledge_sources_with_query_limit_and_score_threshold_default`, capturing the expected API responses for these tests.
- Enhanced test reliability by utilizing VCR to manage external API calls during testing.

* Update documentation to format parameter names in code style

- Changed the formatting of `results_limit` and `score_threshold` in the documentation to use code style for better clarity and emphasis.
- Ensured consistency in documentation presentation to enhance user understanding of configuration options.

* Enhance KnowledgeConfig with field descriptions

- Updated `results_limit` and `score_threshold` in `KnowledgeConfig` to use Pydantic's `Field` for improved documentation and clarity.
- Added descriptions to both parameters to provide better context for their usage in knowledge retrieval configuration.

* docstrings added
2025-04-18 18:33:04 -07:00
44 changed files with 2582 additions and 1830 deletions

View File

@@ -4,6 +4,36 @@ description: View the latest updates and changes to CrewAI
icon: timeline
---
<Update label="2025-04-07" description="v0.114.0">
## Release Highlights
<Frame>
<img src="/images/v01140.png" />
</Frame>
**New Features & Enhancements**
- Agents as an atomic unit. (`Agent(...).kickoff()`)
- Support for [Custom LLM implementations](https://docs.crewai.com/guides/advanced/custom-llm).
- Integrated External Memory and [Opik observability](https://docs.crewai.com/how-to/opik-observability).
- Enhanced YAML extraction.
- Multimodal agent validation.
- Added secure fingerprints for agents and crews.
**Core Improvements & Fixes**
- Improved serialization, agent copying, and Python compatibility.
- Added wildcard support to `emit()`.
- Added support for additional router calls and context window adjustments.
- Fixed typing issues, validation, and import statements.
- Improved method performance.
- Enhanced agent task handling, event emissions, and memory management.
- Fixed CLI issues, conditional tasks, cloning behavior, and tool outputs.
**Documentation & Guides**
- Improved documentation structure, theme, and organization.
- Added guides for Local NVIDIA NIM with WSL2, W&B Weave, and Arize Phoenix.
- Updated tool configuration examples, prompts, and observability docs.
- Guide on using singular agents within Flows.
</Update>
<Update label="2025-03-17" description="v0.108.0">
**Features**
- Converted tabs to spaces in `crew.py` template

View File

@@ -179,7 +179,78 @@ def crew(self) -> Crew:
```
</Note>
### 10. API Keys
### 10. Deploy
Deploy the crew or flow to [CrewAI Enterprise](https://app.crewai.com).
- **Authentication**: You need to be authenticated to deploy to CrewAI Enterprise.
```shell Terminal
crewai signup
```
If you already have an account, you can login with:
```shell Terminal
crewai login
```
- **Create a deployment**: Once you are authenticated, you can create a deployment for your crew or flow from the root of your local project.
```shell Terminal
crewai deploy create
```
- Reads your local project configuration.
- Prompts you to confirm the environment variables (like `OPENAI_API_KEY`, `SERPER_API_KEY`) found locally. These will be securely stored with the deployment on the Enterprise platform. Ensure your sensitive keys are correctly configured locally (e.g., in a `.env` file) before running this.
- Links the deployment to the corresponding remote GitHub repository (it usually detects this automatically).
- **Deploy the Crew**: Once you are authenticated, you can deploy your crew or flow to CrewAI Enterprise.
```shell Terminal
crewai deploy push
```
- Initiates the deployment process on the CrewAI Enterprise platform.
- Upon successful initiation, it will output a `Deployment created successfully!` message along with the Deployment Name and a unique Deployment ID (UUID).
- **Deployment Status**: You can check the status of your deployment with:
```shell Terminal
crewai deploy status
```
This fetches the latest deployment status of your most recent deployment attempt (e.g., `Building Images for Crew`, `Deploy Enqueued`, `Online`).
- **Deployment Logs**: You can check the logs of your deployment with:
```shell Terminal
crewai deploy logs
```
This streams the deployment logs to your terminal.
- **List deployments**: You can list all your deployments with:
```shell Terminal
crewai deploy list
```
This lists all your deployments.
- **Delete a deployment**: You can delete a deployment with:
```shell Terminal
crewai deploy remove
```
This deletes the deployment from the CrewAI Enterprise platform.
- **Help Command**: You can get help for the deploy CLI with:
```shell Terminal
crewai deploy --help
```
This shows the help message for the CrewAI Deploy CLI.
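Taken together, a typical first deployment, sketched using only the commands above, looks like this:
```shell Terminal
# Sketch of a typical first-deployment flow; run from your project root.
crewai login            # or `crewai signup` if you don't have an account yet
crewai deploy create    # registers the deployment and confirms your env vars
crewai deploy push      # starts the build and deployment on the platform
crewai deploy status    # check progress until the deployment reports Online
crewai deploy logs      # stream logs if anything needs debugging
```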
Watch this video tutorial for a step-by-step demonstration of deploying your crew to [CrewAI Enterprise](http://app.crewai.com) using the CLI.
<iframe
width="100%"
height="400"
src="https://www.youtube.com/embed/3EqSV-CYDZA"
title="CrewAI Deployment Guide"
frameborder="0"
style={{ borderRadius: '10px' }}
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allowfullscreen
></iframe>
### 11. API Keys
When running the `crewai create crew` command, the CLI will first show you the top 5 most common LLM providers and ask you to select one.

View File

@@ -42,6 +42,16 @@ CrewAI supports various types of knowledge sources out of the box:
| `collection_name` | **str** | No | Name of the collection where the knowledge will be stored. Used to identify different sets of knowledge. Defaults to "knowledge" if not provided. |
| `storage` | **Optional[KnowledgeStorage]** | No | Custom storage configuration for managing how the knowledge is stored and retrieved. If not provided, a default storage will be created. |
<Tip>
Unlike retrieval from a vector database using a tool, agents preloaded with knowledge will not need a retrieval persona or task.
Simply add the relevant knowledge sources your agent or crew needs to function.
Knowledge sources can be added at the agent or crew level.
Crew level knowledge sources will be used by **all agents** in the crew.
Agent level knowledge sources will be used by the **specific agent** that is preloaded with the knowledge.
</Tip>
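As a quick illustration, here is a minimal sketch of attaching the same source at both levels; the crew-level `knowledge_sources` parameter is assumed to mirror the agent-level one, so adapt it to your own sources and tasks.
```python Code
# Minimal sketch: the same source attached at both levels.
# The crew-level `knowledge_sources` parameter is assumed to mirror the
# agent-level one; adapt to your own sources and tasks.
from crewai import Agent, Crew, Task
from crewai.knowledge.source.string_knowledge_source import StringKnowledgeSource

policy = StringKnowledgeSource(content="Refunds are processed within 5 business days.")

support_agent = Agent(
    role="Support Agent",
    goal="Answer customer questions accurately",
    backstory="You answer questions using company policy.",
    knowledge_sources=[policy],  # agent level: used only by this agent
)

answer_task = Task(
    description="How long do refunds take?",
    expected_output="A one-sentence answer.",
    agent=support_agent,
)

crew = Crew(
    agents=[support_agent],
    tasks=[answer_task],
    knowledge_sources=[policy],  # crew level: used by all agents in the crew
)
```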
## Quickstart Example
<Tip>
@@ -146,6 +156,26 @@ result = crew.kickoff(
)
```
## Knowledge Configuration
You can configure how knowledge is retrieved for the crew or agent.
```python Code
from crewai.knowledge.knowledge_config import KnowledgeConfig
knowledge_config = KnowledgeConfig(results_limit=10, score_threshold=0.5)
agent = Agent(
...
knowledge_config=knowledge_config
)
```
<Tip>
`results_limit`: the number of relevant documents to return. Default is 3.
`score_threshold`: the minimum score for a document to be considered relevant. Default is 0.35.
</Tip>
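For a sense of what these settings control, here is a minimal sketch that queries crew-level knowledge directly through `Crew.query_knowledge` (added in this change); it assumes a `crew` that was built with knowledge sources.
```python Code
# Sketch only: assumes `crew` was constructed with knowledge_sources.
snippets = crew.query_knowledge(
    ["How long do refunds take?"],
    results_limit=10,     # return up to 10 chunks instead of the default 3
    score_threshold=0.5,  # keep only chunks scoring at least 0.5 (default 0.35)
)
```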
## More Examples
Here are examples of how to use different types of knowledge sources:

View File

@@ -1,71 +0,0 @@
---
title: Using LlamaIndex Tools
description: Learn how to integrate LlamaIndex tools with CrewAI agents to enhance search-based queries and more.
icon: toolbox
---
## Using LlamaIndex Tools
<Info>
CrewAI seamlessly integrates with LlamaIndex's comprehensive toolkit for RAG (Retrieval-Augmented Generation) and agentic pipelines, enabling advanced search-based queries and more.
</Info>
Here are the available built-in tools offered by LlamaIndex.
```python Code
from crewai import Agent
from crewai_tools import LlamaIndexTool
# Example 1: Initialize from FunctionTool
from llama_index.core.tools import FunctionTool
your_python_function = lambda ...: ...
og_tool = FunctionTool.from_defaults(
your_python_function,
name="<name>",
description='<description>'
)
tool = LlamaIndexTool.from_tool(og_tool)
# Example 2: Initialize from LlamaHub Tools
from llama_index.tools.wolfram_alpha import WolframAlphaToolSpec
wolfram_spec = WolframAlphaToolSpec(app_id="<app_id>")
wolfram_tools = wolfram_spec.to_tool_list()
tools = [LlamaIndexTool.from_tool(t) for t in wolfram_tools]
# Example 3: Initialize Tool from a LlamaIndex Query Engine
query_engine = index.as_query_engine()
query_tool = LlamaIndexTool.from_query_engine(
query_engine,
name="Uber 2019 10K Query Tool",
description="Use this tool to lookup the 2019 Uber 10K Annual Report"
)
# Create and assign the tools to an agent
agent = Agent(
role='Research Analyst',
goal='Provide up-to-date market analysis',
backstory='An expert analyst with a keen eye for market trends.',
tools=[tool, *tools, query_tool]
)
# rest of the code ...
```
## Steps to Get Started
To effectively use the LlamaIndexTool, follow these steps:
<Steps>
<Step title="Package Installation">
Make sure that `crewai[tools]` package is installed in your Python environment:
<CodeGroup>
```shell Terminal
pip install 'crewai[tools]'
```
</CodeGroup>
</Step>
<Step title="Install and Use LlamaIndex">
Follow the LlamaIndex documentation [LlamaIndex Documentation](https://docs.llamaindex.ai/) to set up a RAG/agent pipeline.
</Step>
</Steps>

View File

@@ -8,25 +8,27 @@
"dark": "#C94C3C"
},
"favicon": "favicon.svg",
"contextual": {
"options": ["copy", "view", "chatgpt", "claude"]
},
"navigation": {
"tabs": [
{
"tab": "Get Started",
"tab": "Documentation",
"groups": [
{
"group": "Get Started",
"pages": [
"introduction",
"installation",
"quickstart",
"changelog"
"quickstart"
]
},
{
"group": "Guides",
"pages": [
{
"group": "Concepts",
"group": "Strategy",
"pages": [
"guides/concepts/evaluating-use-cases"
]
@@ -79,41 +81,6 @@
"concepts/event-listener"
]
},
{
"group": "How to Guides",
"pages": [
"how-to/create-custom-tools",
"how-to/sequential-process",
"how-to/hierarchical-process",
"how-to/custom-manager-agent",
"how-to/llm-connections",
"how-to/customizing-agents",
"how-to/multimodal-agents",
"how-to/coding-agents",
"how-to/force-tool-output-as-result",
"how-to/human-input-on-execution",
"how-to/kickoff-async",
"how-to/kickoff-for-each",
"how-to/replay-tasks-from-latest-crew-kickoff",
"how-to/conditional-tasks",
"how-to/langchain-tools",
"how-to/llamaindex-tools"
]
},
{
"group": "Agent Monitoring & Observability",
"pages": [
"how-to/agentops-observability",
"how-to/arize-phoenix-observability",
"how-to/langfuse-observability",
"how-to/langtrace-observability",
"how-to/mlflow-observability",
"how-to/openlit-observability",
"how-to/opik-observability",
"how-to/portkey-observability",
"how-to/weave-integration"
]
},
{
"group": "Tools",
"pages": [
@@ -141,6 +108,7 @@
"tools/hyperbrowserloadtool",
"tools/linkupsearchtool",
"tools/llamaindextool",
"tools/langchaintool",
"tools/serperdevtool",
"tools/s3readertool",
"tools/s3writertool",
@@ -170,6 +138,40 @@
"tools/youtubevideosearchtool"
]
},
{
"group": "Agent Monitoring & Observability",
"pages": [
"how-to/agentops-observability",
"how-to/arize-phoenix-observability",
"how-to/langfuse-observability",
"how-to/langtrace-observability",
"how-to/mlflow-observability",
"how-to/openlit-observability",
"how-to/opik-observability",
"how-to/portkey-observability",
"how-to/weave-integration"
]
},
{
"group": "Learn",
"pages": [
"how-to/conditional-tasks",
"how-to/coding-agents",
"how-to/create-custom-tools",
"how-to/custom-llm",
"how-to/custom-manager-agent",
"how-to/customizing-agents",
"how-to/force-tool-output-as-result",
"how-to/hierarchical-process",
"how-to/human-input-on-execution",
"how-to/kickoff-async",
"how-to/kickoff-for-each",
"how-to/llm-connections",
"how-to/multimodal-agents",
"how-to/replay-tasks-from-latest-crew-kickoff",
"how-to/sequential-process"
]
},
{
"group": "Telemetry",
"pages": [
@@ -188,19 +190,35 @@
]
}
]
},
{
"tab": "Releases",
"groups": [
{
"group": "Releases",
"pages": [
"changelog"
]
}
]
}
],
"global": {
"anchors": [
{
"anchor": "Community",
"anchor": "Website",
"href": "https://crewai.com",
"icon": "globe"
},
{
"anchor": "Forum",
"href": "https://community.crewai.com",
"icon": "discourse"
},
{
"anchor": "Tutorials",
"href": "https://www.youtube.com/@crewAIInc",
"icon": "youtube"
"anchor": "Get Help",
"href": "mailto:support@crewai.com",
"icon": "headset"
}
]
}
@@ -214,6 +232,12 @@
"strict": false
},
"navbar": {
"links": [
{
"label": "Start Free Trial",
"href": "https://app.crewai.com"
}
],
"primary": {
"type": "github",
"href": "https://github.com/crewAIInc/crewAI"
@@ -223,7 +247,12 @@
"prompt": "Search CrewAI docs"
},
"seo": {
"indexing": "navigable"
"indexing": "all"
},
"errors": {
"404": {
"redirect": true
}
},
"footer": {
"socials": {

View File

@@ -1,9 +1,13 @@
# Custom LLM Implementations
---
title: Custom LLM Implementation
description: Learn how to create custom LLM implementations in CrewAI.
icon: code
---
## Custom LLM Implementations
CrewAI now supports custom LLM implementations through the `BaseLLM` abstract base class. This allows you to create your own LLM implementations that don't rely on litellm's authentication mechanism.
## Using Custom LLM Implementations
To create a custom LLM implementation, you need to:
1. Inherit from the `BaseLLM` abstract base class
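For orientation, here is a minimal sketch of such a subclass. The import path and method names below are assumptions for illustration; check the `BaseLLM` source for the exact abstract methods and signatures you must implement.
```python Code
# Illustrative sketch only -- the import path and required methods are
# assumptions; consult the BaseLLM source for the exact interface.
from typing import Any, Dict, List, Union

from crewai.llms.base_llm import BaseLLM  # assumed location of BaseLLM


class StubLLM(BaseLLM):
    """A custom LLM that answers from a local stub instead of litellm."""

    def __init__(self, model: str = "stub-model"):
        super().__init__(model=model)

    def call(self, messages: Union[str, List[Dict[str, str]]], **kwargs: Any) -> str:
        # Forward the conversation to your own backend here and return the
        # model's reply as plain text.
        return "stubbed response"

    def supports_function_calling(self) -> bool:
        return False

    def supports_stop_words(self) -> bool:
        return False

    def get_context_window_size(self) -> int:
        return 8192
```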

View File

@@ -1,5 +1,5 @@
---
title: Create Your Own Manager Agent
title: Custom Manager Agent
description: Learn how to set a custom agent as the manager in CrewAI, providing more control over task management and coordination.
icon: user-shield
---

View File

@@ -20,10 +20,8 @@ Here's an example of how to replay from a task:
To use the replay feature, follow these steps:
<Steps>
<Step title="Open your terminal or command prompt.">
</Step>
<Step title="Navigate to the directory where your CrewAI project is located.">
</Step>
<Step title="Open your terminal or command prompt."></Step>
<Step title="Navigate to the directory where your CrewAI project is located."></Step>
<Step title="Run the following commands:">
To view the latest kickoff task_ids use:

BIN docs/images/v01140.png (new file, 2.4 MiB)

View File

@@ -336,9 +336,22 @@ email_summarizer_task:
- research_task
```
## Deploying Your Project
## Deploying Your Crew
The easiest way to deploy your crew is through [CrewAI Enterprise](http://app.crewai.com), where you can deploy your crew in a few clicks.
The easiest way to deploy your crew to production is through [CrewAI Enterprise](http://app.crewai.com).
Watch this video tutorial for a step-by-step demonstration of deploying your crew to [CrewAI Enterprise](http://app.crewai.com) using the CLI.
<iframe
width="100%"
height="400"
src="https://www.youtube.com/embed/3EqSV-CYDZA"
title="CrewAI Deployment Guide"
frameborder="0"
style={{ borderRadius: '10px' }}
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allowfullscreen
></iframe>
<CardGroup cols={2}>
<Card

View File

@@ -8,11 +8,29 @@ icon: code-simple
## Description
The `CodeInterpreterTool` enables CrewAI agents to execute Python 3 code that they generate autonomously. The code is run in a secure, isolated Docker container, ensuring safety regardless of the content. This functionality is particularly valuable as it allows agents to create code, execute it, obtain the results, and utilize that information to inform subsequent decisions and actions.
The `CodeInterpreterTool` enables CrewAI agents to execute Python 3 code that they generate autonomously. This functionality is particularly valuable as it allows agents to create code, execute it, obtain the results, and utilize that information to inform subsequent decisions and actions.
## Requirements
There are several ways to use this tool:
### Docker Container (Recommended)
This is the primary option. The code runs in a secure, isolated Docker container, ensuring safety regardless of its content.
Make sure Docker is installed and running on your system. If you don't have it, you can install it from [here](https://docs.docker.com/get-docker/).
### Sandbox Environment
If Docker is unavailable — either not installed or not accessible for any reason — the code will be executed in a restricted Python environment called a sandbox.
This environment is very limited, with strict restrictions on many modules and built-in functions.
### Unsafe Execution
**NOT RECOMMENDED FOR PRODUCTION**
This mode allows execution of any Python code, including dangerous calls to modules such as `sys` and `os`. [Check out](/tools/codeinterpretertool#enabling-unsafe-mode) how to enable this mode.
## Logging
The `CodeInterpreterTool` logs the selected execution strategy to STDOUT.
- Docker must be installed and running on your system. If you don't have it, you can install it from [here](https://docs.docker.com/get-docker/).
## Installation
@@ -74,18 +92,32 @@ programmer_agent = Agent(
)
```
### Enabling `unsafe_mode`
```python Code
from crewai_tools import CodeInterpreterTool
code = """
import os
os.system("ls -la")
"""
CodeInterpreterTool(unsafe_mode=True).run(code=code)
```
## Parameters
The `CodeInterpreterTool` accepts the following parameters during initialization:
- **user_dockerfile_path**: Optional. Path to a custom Dockerfile to use for the code interpreter container.
- **user_docker_base_url**: Optional. URL to the Docker daemon to use for running the container.
- **unsafe_mode**: Optional. Whether to run code directly on the host machine instead of in a Docker container. Default is `False`. Use with caution!
- **unsafe_mode**: Optional. Whether to run code directly on the host machine instead of in a Docker container or sandbox. Default is `False`. Use with caution!
- **default_image_tag**: Optional. Default Docker image tag. Default is `code-interpreter:latest`
When using the tool with an agent, the agent will need to provide:
- **code**: Required. The Python 3 code to execute.
- **libraries_used**: Required. A list of libraries used in the code that need to be installed.
- **libraries_used**: Optional. A list of libraries used in the code that need to be installed. Default is `[]`
## Agent Integration Example
@@ -152,7 +184,7 @@ class CodeInterpreterTool(BaseTool):
if self.unsafe_mode:
return self.run_code_unsafe(code, libraries_used)
else:
return self.run_code_in_docker(code, libraries_used)
return self.run_code_safety(code, libraries_used)
```
The tool performs the following steps:
@@ -168,8 +200,9 @@ The tool performs the following steps:
By default, the `CodeInterpreterTool` runs code in an isolated Docker container, which provides a layer of security. However, there are still some security considerations to keep in mind:
1. The Docker container has access to the current working directory, so sensitive files could potentially be accessed.
2. The `unsafe_mode` parameter allows code to be executed directly on the host machine, which should only be used in trusted environments.
3. Be cautious when allowing agents to install arbitrary libraries, as they could potentially include malicious code.
2. If the Docker container is unavailable and the code needs to run safely, it will be executed in a sandbox environment. For security reasons, installing arbitrary libraries is not allowed.
3. The `unsafe_mode` parameter allows code to be executed directly on the host machine, which should only be used in trusted environments.
4. Be cautious when allowing agents to install arbitrary libraries, as they could potentially include malicious code.
## Conclusion

View File

@@ -1,10 +1,10 @@
---
title: Using LangChain Tools
description: Learn how to integrate LangChain tools with CrewAI agents to enhance search-based queries and more.
title: LangChain Tool
description: The `LangChainTool` is a wrapper for LangChain tools and query engines.
icon: link
---
## Using LangChain Tools
## `LangChainTool`
<Info>
CrewAI seamlessly integrates with LangChain's comprehensive [list of tools](https://python.langchain.com/docs/integrations/tools/), all of which can be used with CrewAI.

View File

@@ -114,6 +114,14 @@ class Agent(BaseAgent):
default=None,
description="Embedder configuration for the agent.",
)
agent_knowledge_context: Optional[str] = Field(
default=None,
description="Knowledge context for the agent.",
)
crew_knowledge_context: Optional[str] = Field(
default=None,
description="Knowledge context for the crew.",
)
@model_validator(mode="after")
def post_init_setup(self):
@@ -234,22 +242,30 @@ class Agent(BaseAgent):
memory = contextual_memory.build_context_for_task(task, context)
if memory.strip() != "":
task_prompt += self.i18n.slice("memory").format(memory=memory)
knowledge_config = (
self.knowledge_config.model_dump() if self.knowledge_config else {}
)
if self.knowledge:
agent_knowledge_snippets = self.knowledge.query([task.prompt()])
agent_knowledge_snippets = self.knowledge.query(
[task.prompt()], **knowledge_config
)
if agent_knowledge_snippets:
agent_knowledge_context = extract_knowledge_context(
self.agent_knowledge_context = extract_knowledge_context(
agent_knowledge_snippets
)
if agent_knowledge_context:
task_prompt += agent_knowledge_context
if self.agent_knowledge_context:
task_prompt += self.agent_knowledge_context
if self.crew:
knowledge_snippets = self.crew.query_knowledge([task.prompt()])
knowledge_snippets = self.crew.query_knowledge(
[task.prompt()], **knowledge_config
)
if knowledge_snippets:
crew_knowledge_context = extract_knowledge_context(knowledge_snippets)
if crew_knowledge_context:
task_prompt += crew_knowledge_context
self.crew_knowledge_context = extract_knowledge_context(
knowledge_snippets
)
if self.crew_knowledge_context:
task_prompt += self.crew_knowledge_context
tools = tools or self.tools or []
self.create_agent_executor(tools=tools, task=task)

View File

@@ -19,6 +19,7 @@ from crewai.agents.agent_builder.utilities.base_token_process import TokenProces
from crewai.agents.cache.cache_handler import CacheHandler
from crewai.agents.tools_handler import ToolsHandler
from crewai.knowledge.knowledge import Knowledge
from crewai.knowledge.knowledge_config import KnowledgeConfig
from crewai.knowledge.source.base_knowledge_source import BaseKnowledgeSource
from crewai.security.security_config import SecurityConfig
from crewai.tools.base_tool import BaseTool, Tool
@@ -155,6 +156,10 @@ class BaseAgent(ABC, BaseModel):
adapted_agent: bool = Field(
default=False, description="Whether the agent is adapted"
)
knowledge_config: Optional[KnowledgeConfig] = Field(
default=None,
description="Knowledge configuration for the agent such as limits and threshold",
)
@model_validator(mode="before")
@classmethod

View File

@@ -122,7 +122,16 @@ PROVIDERS = [
]
MODELS = {
"openai": ["gpt-4", "gpt-4o", "gpt-4o-mini", "o1-mini", "o1-preview"],
"openai": [
"gpt-4",
"gpt-4.1",
"gpt-4.1-mini-2025-04-14",
"gpt-4.1-nano-2025-04-14",
"gpt-4o",
"gpt-4o-mini",
"o1-mini",
"o1-preview",
],
"anthropic": [
"claude-3-5-sonnet-20240620",
"claude-3-sonnet-20240229",
@@ -132,8 +141,17 @@ MODELS = {
"gemini": [
"gemini/gemini-1.5-flash",
"gemini/gemini-1.5-pro",
"gemini/gemini-2.0-flash-lite-001",
"gemini/gemini-2.0-flash-001",
"gemini/gemini-2.0-flash-thinking-exp-01-21",
"gemini/gemini-2.5-flash-preview-04-17",
"gemini/gemini-2.5-pro-exp-03-25",
"gemini/gemini-gemma-2-9b-it",
"gemini/gemini-gemma-2-27b-it",
"gemini/gemma-3-1b-it",
"gemini/gemma-3-4b-it",
"gemini/gemma-3-12b-it",
"gemini/gemma-3-27b-it",
],
"nvidia_nim": [
"nvidia_nim/nvidia/mistral-nemo-minitron-8b-8k-instruct",

View File

@@ -117,7 +117,9 @@ class ToolCommand(BaseCommand, PlusAPIMixin):
published_handle = publish_response.json()["handle"]
console.print(
f"Successfully published {published_handle} ({project_version}).\nInstall it in other projects with crewai tool install {published_handle}",
f"Successfully published `{published_handle}` ({project_version}).\n\n"
+ "⚠️ Security checks are running in the background. Your tool will be available once these are complete.\n"
+ f"You can monitor the status or access your tool here:\nhttps://app.crewai.com/crewai_plus/tools/{published_handle}",
style="bold green",
)
@@ -153,8 +155,12 @@ class ToolCommand(BaseCommand, PlusAPIMixin):
login_response_json = login_response.json()
settings = Settings()
settings.tool_repository_username = login_response_json["credential"]["username"]
settings.tool_repository_password = login_response_json["credential"]["password"]
settings.tool_repository_username = login_response_json["credential"][
"username"
]
settings.tool_repository_password = login_response_json["credential"][
"password"
]
settings.dump()
console.print(
@@ -179,7 +185,7 @@ class ToolCommand(BaseCommand, PlusAPIMixin):
capture_output=False,
env=self._build_env_with_credentials(repository_handle),
text=True,
check=True
check=True,
)
if add_package_result.stderr:
@@ -204,7 +210,11 @@ class ToolCommand(BaseCommand, PlusAPIMixin):
settings = Settings()
env = os.environ.copy()
env[f"UV_INDEX_{repository_handle}_USERNAME"] = str(settings.tool_repository_username or "")
env[f"UV_INDEX_{repository_handle}_PASSWORD"] = str(settings.tool_repository_password or "")
env[f"UV_INDEX_{repository_handle}_USERNAME"] = str(
settings.tool_repository_username or ""
)
env[f"UV_INDEX_{repository_handle}_PASSWORD"] = str(
settings.tool_repository_password or ""
)
return env

View File

@@ -304,9 +304,7 @@ class Crew(BaseModel):
"""Initialize private memory attributes."""
self._external_memory = (
# External memory doesn't support a default value since it was designed to be managed entirely externally
self.external_memory.set_crew(self)
if self.external_memory
else None
self.external_memory.set_crew(self) if self.external_memory else None
)
self._long_term_memory = self.long_term_memory
@@ -1136,9 +1134,13 @@ class Crew(BaseModel):
result = self._execute_tasks(self.tasks, start_index, True)
return result
def query_knowledge(self, query: List[str]) -> Union[List[Dict[str, Any]], None]:
def query_knowledge(
self, query: List[str], results_limit: int = 3, score_threshold: float = 0.35
) -> Union[List[Dict[str, Any]], None]:
if self.knowledge:
return self.knowledge.query(query)
return self.knowledge.query(
query, results_limit=results_limit, score_threshold=score_threshold
)
return None
def fetch_inputs(self) -> Set[str]:
@@ -1220,9 +1222,13 @@ class Crew(BaseModel):
copied_data = self.model_dump(exclude=exclude)
copied_data = {k: v for k, v in copied_data.items() if v is not None}
if self.short_term_memory:
copied_data["short_term_memory"] = self.short_term_memory.model_copy(deep=True)
copied_data["short_term_memory"] = self.short_term_memory.model_copy(
deep=True
)
if self.long_term_memory:
copied_data["long_term_memory"] = self.long_term_memory.model_copy(deep=True)
copied_data["long_term_memory"] = self.long_term_memory.model_copy(
deep=True
)
if self.entity_memory:
copied_data["entity_memory"] = self.entity_memory.model_copy(deep=True)
if self.external_memory:
@@ -1230,7 +1236,6 @@ class Crew(BaseModel):
if self.user_memory:
copied_data["user_memory"] = self.user_memory.model_copy(deep=True)
copied_data.pop("agents", None)
copied_data.pop("tasks", None)
@@ -1403,7 +1408,10 @@ class Crew(BaseModel):
"short": (getattr(self, "_short_term_memory", None), "short term"),
"entity": (getattr(self, "_entity_memory", None), "entity"),
"knowledge": (getattr(self, "knowledge", None), "knowledge"),
"kickoff_outputs": (getattr(self, "_task_output_handler", None), "task output"),
"kickoff_outputs": (
getattr(self, "_task_output_handler", None),
"task output",
),
"external": (getattr(self, "_external_memory", None), "external"),
}

View File

@@ -43,7 +43,9 @@ class Knowledge(BaseModel):
self.storage.initialize_knowledge_storage()
self._add_sources()
def query(self, query: List[str], limit: int = 3) -> List[Dict[str, Any]]:
def query(
self, query: List[str], results_limit: int = 3, score_threshold: float = 0.35
) -> List[Dict[str, Any]]:
"""
Query across all knowledge sources to find the most relevant information.
Returns the top_k most relevant chunks.
@@ -56,7 +58,8 @@ class Knowledge(BaseModel):
results = self.storage.search(
query,
limit,
limit=results_limit,
score_threshold=score_threshold,
)
return results

View File

@@ -0,0 +1,16 @@
from pydantic import BaseModel, Field
class KnowledgeConfig(BaseModel):
"""Configuration for knowledge retrieval.
Args:
results_limit (int): The number of relevant documents to return.
score_threshold (float): The minimum score for a document to be considered relevant.
"""
results_limit: int = Field(default=3, description="The number of results to return")
score_threshold: float = Field(
default=0.35,
description="The minimum score for a result to be considered relevant",
)

View File

@@ -4,13 +4,15 @@ import io
import logging
import os
import shutil
from typing import Any, Dict, List, Optional, Union, cast
from typing import TYPE_CHECKING, Any, Dict, List, Optional, Union, cast
import chromadb
import chromadb.errors
from chromadb.api import ClientAPI
from chromadb.api.types import OneOrMany
from chromadb.config import Settings
# Type checking imports that don't cause runtime imports
if TYPE_CHECKING:
import chromadb
import chromadb.errors
from chromadb.api import ClientAPI
from chromadb.api.types import OneOrMany
from chromadb.config import Settings
from crewai.knowledge.storage.base_knowledge_storage import BaseKnowledgeStorage
from crewai.utilities import EmbeddingConfigurator
@@ -43,9 +45,9 @@ class KnowledgeStorage(BaseKnowledgeStorage):
search efficiency.
"""
collection: Optional[chromadb.Collection] = None
collection: Optional[Any] = None
collection_name: Optional[str] = "knowledge"
app: Optional[ClientAPI] = None
app: Optional[Any] = None
def __init__(
self,
@@ -84,6 +86,10 @@ class KnowledgeStorage(BaseKnowledgeStorage):
raise Exception("Collection not initialized")
def initialize_knowledge_storage(self):
# Import chromadb here to avoid importing at module level
import chromadb
from chromadb.config import Settings
base_path = os.path.join(db_storage_path(), "knowledge")
chroma_client = chromadb.PersistentClient(
path=base_path,
@@ -109,6 +115,10 @@ class KnowledgeStorage(BaseKnowledgeStorage):
raise Exception("Failed to create or get collection")
def reset(self):
# Import chromadb here to avoid importing at module level
import chromadb
from chromadb.config import Settings
base_path = os.path.join(db_storage_path(), KNOWLEDGE_DIRECTORY)
if not self.app:
self.app = chromadb.PersistentClient(
@@ -126,6 +136,11 @@ class KnowledgeStorage(BaseKnowledgeStorage):
documents: List[str],
metadata: Optional[Union[Dict[str, Any], List[Dict[str, Any]]]] = None,
):
# Import chromadb here to avoid importing at module level
import chromadb
import chromadb.errors
from chromadb.api.types import OneOrMany
if not self.collection:
raise Exception("Collection not initialized")
@@ -156,7 +171,7 @@ class KnowledgeStorage(BaseKnowledgeStorage):
filtered_ids.append(doc_id)
# If we have no metadata at all, set it to None
final_metadata: Optional[OneOrMany[chromadb.Metadata]] = (
final_metadata: Optional[OneOrMany['chromadb.Metadata']] = (
None if all(m is None for m in filtered_metadata) else filtered_metadata
)

View File

@@ -81,14 +81,26 @@ LLM_CONTEXT_WINDOW_SIZES = {
"gpt-4o": 128000,
"gpt-4o-mini": 128000,
"gpt-4-turbo": 128000,
"gpt-4.1": 1047576, # Based on official docs
"gpt-4.1-mini-2025-04-14": 1047576,
"gpt-4.1-nano-2025-04-14": 1047576,
"o1-preview": 128000,
"o1-mini": 128000,
"o3-mini": 200000, # Based on official o3-mini specifications
# gemini
"gemini-2.0-flash": 1048576,
"gemini-2.0-flash-thinking-exp-01-21": 32768,
"gemini-2.0-flash-lite-001": 1048576,
"gemini-2.0-flash-001": 1048576,
"gemini-2.5-flash-preview-04-17": 1048576,
"gemini-2.5-pro-exp-03-25": 1048576,
"gemini-1.5-pro": 2097152,
"gemini-1.5-flash": 1048576,
"gemini-1.5-flash-8b": 1048576,
"gemini/gemma-3-1b-it": 32000,
"gemini/gemma-3-4b-it": 128000,
"gemini/gemma-3-12b-it": 128000,
"gemini/gemma-3-27b-it": 128000,
# deepseek
"deepseek-chat": 128000,
# groq

View File

@@ -4,13 +4,18 @@ import logging
import os
import shutil
import uuid
from typing import Any, Dict, List, Optional
from typing import TYPE_CHECKING, Any, Dict, List, Optional
from chromadb.api import ClientAPI
# Type checking imports that don't cause runtime imports
if TYPE_CHECKING:
import chromadb
from chromadb.api import ClientAPI
from chromadb.config import Settings
from crewai.memory.storage.base_rag_storage import BaseRAGStorage
from crewai.utilities import EmbeddingConfigurator
from crewai.utilities.constants import MAX_FILE_NAME_LENGTH
from crewai.utilities.chromadb import sanitize_collection_name
from crewai.utilities.logger import Logger
from crewai.utilities.paths import db_storage_path
@@ -37,76 +42,45 @@ class RAGStorage(BaseRAGStorage):
search efficiency.
"""
app: ClientAPI | None = None
collection: Optional[Any] = None
collection_name: Optional[str] = "memory"
app: Optional[Any] = None
def __init__(
self, type, allow_reset=True, embedder_config=None, crew=None, path=None
self,
type: str = "memory",
allow_reset: bool = True,
embedder_config: Optional[Dict[str, Any]] = None,
crew: Any = None,
collection_name: Optional[str] = None,
):
super().__init__(type, allow_reset, embedder_config, crew)
agents = crew.agents if crew else []
agents = [self._sanitize_role(agent.role) for agent in agents]
agents = "_".join(agents)
self.agents = agents
self.storage_file_name = self._build_storage_file_name(type, agents)
self.collection_name = collection_name or type
self._set_embedder_config(embedder_config)
self.type = type
def save(
self,
value: Any,
metadata: Dict[str, Any],
) -> None:
with suppress_logging():
if not self.collection:
self._initialize_app()
self.allow_reset = allow_reset
self.path = path
self._initialize_app()
if isinstance(value, list):
documents = value
metadatas = [metadata] * len(value) if metadata else None
ids = [str(uuid.uuid4()) for _ in range(len(documents))]
else:
documents = [value]
metadatas = [metadata] if metadata else None
ids = [str(uuid.uuid4())]
def _set_embedder_config(self):
configurator = EmbeddingConfigurator()
self.embedder_config = configurator.configure_embedder(self.embedder_config)
def _initialize_app(self):
import chromadb
from chromadb.config import Settings
self._set_embedder_config()
chroma_client = chromadb.PersistentClient(
path=self.path if self.path else self.storage_file_name,
settings=Settings(allow_reset=self.allow_reset),
)
self.app = chroma_client
try:
self.collection = self.app.get_collection(
name=self.type, embedding_function=self.embedder_config
self.collection.add(
documents=documents,
metadatas=metadatas,
ids=ids,
)
except Exception:
self.collection = self.app.create_collection(
name=self.type, embedding_function=self.embedder_config
)
def _sanitize_role(self, role: str) -> str:
"""
Sanitizes agent roles to ensure valid directory names.
"""
return role.replace("\n", "").replace(" ", "_").replace("/", "_")
def _build_storage_file_name(self, type: str, file_name: str) -> str:
"""
Ensures file name does not exceed max allowed by OS
"""
base_path = f"{db_storage_path()}/{type}"
if len(file_name) > MAX_FILE_NAME_LENGTH:
logging.warning(
f"Trimming file name from {len(file_name)} to {MAX_FILE_NAME_LENGTH} characters."
)
file_name = file_name[:MAX_FILE_NAME_LENGTH]
return f"{base_path}/{file_name}"
def save(self, value: Any, metadata: Dict[str, Any]) -> None:
if not hasattr(self, "app") or not hasattr(self, "collection"):
self._initialize_app()
try:
self._generate_embedding(value, metadata)
except Exception as e:
logging.error(f"Error during {self.type} save: {str(e)}")
def search(
self,
@@ -115,54 +89,96 @@ class RAGStorage(BaseRAGStorage):
filter: Optional[dict] = None,
score_threshold: float = 0.35,
) -> List[Any]:
if not hasattr(self, "app"):
self._initialize_app()
with suppress_logging():
if not hasattr(self, "collection") or not self.collection:
self._initialize_app()
try:
with suppress_logging():
response = self.collection.query(query_texts=query, n_results=limit)
if isinstance(query, str):
query = [query]
fetched = self.collection.query(
query_texts=query,
n_results=limit,
where=filter,
)
results = []
for i in range(len(response["ids"][0])):
for i in range(len(fetched["ids"][0])): # type: ignore
result = {
"id": response["ids"][0][i],
"metadata": response["metadatas"][0][i],
"context": response["documents"][0][i],
"score": response["distances"][0][i],
"id": fetched["ids"][0][i], # type: ignore
"metadata": fetched["metadatas"][0][i], # type: ignore
"context": fetched["documents"][0][i], # type: ignore
"score": fetched["distances"][0][i], # type: ignore
}
if result["score"] >= score_threshold:
results.append(result)
return results
except Exception as e:
logging.error(f"Error during {self.type} search: {str(e)}")
return []
def _generate_embedding(self, text: str, metadata: Dict[str, Any]) -> None: # type: ignore
if not hasattr(self, "app") or not hasattr(self, "collection"):
def _initialize_app(self):
# Import chromadb here to avoid importing at module level
import chromadb
from chromadb.config import Settings
base_path = os.path.join(db_storage_path(), "memory")
chroma_client = chromadb.PersistentClient(
path=base_path,
settings=Settings(allow_reset=self.allow_reset),
)
self.app = chroma_client
try:
collection_name = (
f"memory_{self.collection_name}"
if self.collection_name
else "memory"
)
if self.app:
self.collection = self.app.get_or_create_collection(
name=sanitize_collection_name(collection_name),
embedding_function=self.embedder,
)
else:
raise Exception("Vector Database Client not initialized")
except Exception:
raise Exception("Failed to create or get collection")
def initialize_rag_storage(self):
self._initialize_app()
def reset(self) -> None:
# Import chromadb here to avoid importing at module level
import chromadb
from chromadb.config import Settings
base_path = os.path.join(db_storage_path(), "memory")
if not self.app:
self.app = chromadb.PersistentClient(
path=base_path,
settings=Settings(allow_reset=True),
)
self.app.reset()
shutil.rmtree(base_path)
self.app = None
self.collection = None
def _generate_embedding(
self, text: str, metadata: Optional[Dict[str, Any]] = None
) -> Any:
if not hasattr(self, "collection") or not self.collection:
self._initialize_app()
id = str(uuid.uuid4())
self.collection.add(
documents=[text],
metadatas=[metadata or {}],
ids=[str(uuid.uuid4())],
ids=[id],
)
return id
def reset(self) -> None:
try:
if self.app:
self.app.reset()
shutil.rmtree(f"{db_storage_path()}/{self.type}")
self.app = None
self.collection = None
except Exception as e:
if "attempt to write a readonly database" in str(e):
# Ignore this specific error
pass
else:
raise Exception(
f"An error occurred while resetting the {self.type} memory: {e}"
)
def _sanitize_role(self, role: str) -> str:
"""Sanitize role name for use in file names."""
return role.lower().replace(" ", "_").replace("\n", "").replace("/", "_")
def _create_default_embedding_function(self):
from chromadb.utils.embedding_functions.openai_embedding_function import (
@@ -172,3 +188,20 @@ class RAGStorage(BaseRAGStorage):
return OpenAIEmbeddingFunction(
api_key=os.getenv("OPENAI_API_KEY"), model_name="text-embedding-3-small"
)
def _set_embedder_config(self, embedder_config: Optional[Dict[str, Any]] = None) -> None:
"""Set the embedding configuration for the RAG storage.
Args:
embedder_config (Optional[Dict[str, Any]]): Configuration dictionary for the embedder.
If None or empty, defaults to the default embedding function.
"""
self.embedder = (
EmbeddingConfigurator().configure_embedder(embedder_config)
if embedder_config
else self._create_default_embedding_function()
)
def _build_storage_file_name(self, role_name: str) -> str:
"""Build storage file name from role name."""
return f"{self._sanitize_role(role_name)}_memory"

View File

@@ -1,8 +1,10 @@
import os
from typing import Any, Dict, Optional, cast
from typing import TYPE_CHECKING, Any, Dict, Optional, cast
from chromadb import Documents, EmbeddingFunction, Embeddings
from chromadb.api.types import validate_embedding_function
# Type checking imports that don't cause runtime imports
if TYPE_CHECKING:
from chromadb import Documents, EmbeddingFunction, Embeddings
from chromadb.api.types import EmbeddingFunction
class EmbeddingConfigurator:
@@ -24,8 +26,12 @@ class EmbeddingConfigurator:
def configure_embedder(
self,
embedder_config: Optional[Dict[str, Any]] = None,
) -> EmbeddingFunction:
) -> 'EmbeddingFunction':
"""Configures and returns an embedding function based on the provided config."""
# Import here to avoid importing chromadb at module level
from chromadb import EmbeddingFunction
from chromadb.api.types import validate_embedding_function
if embedder_config is None:
return self._create_default_embedding_function()
@@ -182,6 +188,9 @@ class EmbeddingConfigurator:
"IBM Watson dependencies are not installed. Please install them to use Watson embedding."
) from e
# Import chromadb types here to avoid importing at module level
from chromadb import Documents, EmbeddingFunction, Embeddings
class WatsonEmbeddingFunction(EmbeddingFunction):
def __call__(self, input: Documents) -> Embeddings:
if isinstance(input, str):
@@ -212,6 +221,10 @@ class EmbeddingConfigurator:
@staticmethod
def _configure_custom(config):
# Import here to avoid importing chromadb at module level
from chromadb import EmbeddingFunction
from chromadb.api.types import validate_embedding_function
custom_embedder = config.get("embedder")
if isinstance(custom_embedder, EmbeddingFunction):
try:

View File

@@ -10,6 +10,8 @@ from crewai import Agent, Crew, Task
from crewai.agents.cache import CacheHandler
from crewai.agents.crew_agent_executor import AgentFinish, CrewAgentExecutor
from crewai.agents.parser import CrewAgentParser, OutputParserException
from crewai.knowledge.knowledge import Knowledge
from crewai.knowledge.knowledge_config import KnowledgeConfig
from crewai.knowledge.source.base_knowledge_source import BaseKnowledgeSource
from crewai.knowledge.source.string_knowledge_source import StringKnowledgeSource
from crewai.llm import LLM
@@ -259,7 +261,9 @@ def test_cache_hitting():
def handle_tool_end(source, event):
received_events.append(event)
with (patch.object(CacheHandler, "read") as read,):
with (
patch.object(CacheHandler, "read") as read,
):
read.return_value = "0"
task = Task(
description="What is 2 times 6? Ignore correctness and just return the result of the multiplication tool, you must use the tool.",
@@ -1611,6 +1615,78 @@ def test_agent_with_knowledge_sources():
assert "red" in result.raw.lower()
@pytest.mark.vcr(filter_headers=["authorization"])
def test_agent_with_knowledge_sources_with_query_limit_and_score_threshold():
content = "Brandon's favorite color is red and he likes Mexican food."
string_source = StringKnowledgeSource(content=content)
knowledge_config = KnowledgeConfig(results_limit=10, score_threshold=0.5)
with patch(
"crewai.knowledge.storage.knowledge_storage.KnowledgeStorage"
) as MockKnowledge:
mock_knowledge_instance = MockKnowledge.return_value
mock_knowledge_instance.sources = [string_source]
mock_knowledge_instance.query.return_value = [{"content": content}]
with patch.object(Knowledge, "query") as mock_knowledge_query:
agent = Agent(
role="Information Agent",
goal="Provide information based on knowledge sources",
backstory="You have access to specific knowledge sources.",
llm=LLM(model="gpt-4o-mini"),
knowledge_sources=[string_source],
knowledge_config=knowledge_config,
)
task = Task(
description="What is Brandon's favorite color?",
expected_output="Brandon's favorite color.",
agent=agent,
)
crew = Crew(agents=[agent], tasks=[task])
crew.kickoff()
assert agent.knowledge is not None
mock_knowledge_query.assert_called_once_with(
[task.prompt()],
**knowledge_config.model_dump(),
)
@pytest.mark.vcr(filter_headers=["authorization"])
def test_agent_with_knowledge_sources_with_query_limit_and_score_threshold_default():
content = "Brandon's favorite color is red and he likes Mexican food."
string_source = StringKnowledgeSource(content=content)
knowledge_config = KnowledgeConfig()
with patch(
"crewai.knowledge.storage.knowledge_storage.KnowledgeStorage"
) as MockKnowledge:
mock_knowledge_instance = MockKnowledge.return_value
mock_knowledge_instance.sources = [string_source]
mock_knowledge_instance.query.return_value = [{"content": content}]
with patch.object(Knowledge, "query") as mock_knowledge_query:
string_source = StringKnowledgeSource(content=content)
knowledge_config = KnowledgeConfig()
agent = Agent(
role="Information Agent",
goal="Provide information based on knowledge sources",
backstory="You have access to specific knowledge sources.",
llm=LLM(model="gpt-4o-mini"),
knowledge_sources=[string_source],
knowledge_config=knowledge_config,
)
task = Task(
description="What is Brandon's favorite color?",
expected_output="Brandon's favorite color.",
agent=agent,
)
crew = Crew(agents=[agent], tasks=[task])
crew.kickoff()
assert agent.knowledge is not None
mock_knowledge_query.assert_called_once_with(
[task.prompt()],
**knowledge_config.model_dump(),
)
@pytest.mark.vcr(filter_headers=["authorization"])
def test_agent_with_knowledge_sources_extensive_role():
content = "Brandon's favorite color is red and he likes Mexican food."

View File

@@ -0,0 +1,330 @@
interactions:
- request:
body: '{"input": ["Brandon''s favorite color is red and he likes Mexican food."],
"model": "text-embedding-3-small", "encoding_format": "base64"}'
headers:
accept:
- application/json
accept-encoding:
- gzip, deflate, zstd
connection:
- keep-alive
content-length:
- '137'
content-type:
- application/json
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.68.2
x-stainless-arch:
- arm64
x-stainless-async:
- 'false'
x-stainless-lang:
- python
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.68.2
x-stainless-read-timeout:
- '600'
x-stainless-retry-count:
- '0'
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
- 3.12.9
method: POST
uri: https://api.openai.com/v1/embeddings
response:
body:
string: !!binary |
H4sIAAAAAAAAA1SaWw+yPtfmz59Pced/yrwR2bV9zhAQ2UkRFHEymYAiOxHZtEDfvN99ovdkNicm
YiNpu1bXdf1W//Nff/7802V1fp/++feff17VOP3z377PHumU/vPvP//9X3/+/Pnzn7/P/29k3mb5
41G9i9/w34/V+5Ev//z7D/9/nvzfQf/+889JZZQeb+UOCJHjaQqtRQfv1mXUaf0OTfTuHjx+CvAU
CWC/KEik3pNeZScAwr5Zzkgne4Gqd6jX2+oDW3BxGx8nMkfrxaq0GM2PNaV5G6QZM0P1joRJl32W
BVuXtTPPo02jZhRX8gjWdj7MgDz2D+wRexhoUHsaTN9P0RfLw5itFrmbCgzCHVFOdx/wQte1qJvK
FxH556YeT0pqoJ0RTNhqPwiskhTe0T7qIpzrwS4arGS24D4uc6y90d4VpMq+wy7hntS8mG297p2B
QNrwZ5pV1p4RZ6vPEHEPDtuWVA0L/w451CylgPUZBhFvXaQWWqXwwKobdNmavrcCyvDk4aRDezby
2c5Hym33obtXgLKlioYGKUfZovrOBtkSu6cOQbHw6POcJoyE+vmMxtegUst+HNh2ZImhpFTNsJ4V
73p1XlWO8NsSscGvH7asThqCqtoe6anKt+6SsI0K91Ef0UtGS5dIB8DD3QV2vtwaybB8lksCj5K7
pTjWccSrU5igG+hfVOuSRl+4sPGhtJ4Qtfr4w4g0EQcaRfCgnrKxXVFD8hnSZ6jTk5PN9ax4JxU5
T/5MsnTwskU6ARNOHnCJYHpsoPbgFfDzsRYaLe9dPbfLoiDTls800h+6vjruXYDSGiGs3fJFXw5Z
2APLfh+ok3y2YDvrfQvLm3/x5fPervlV0QUUF5c33aFAzQRua0JwHeCKb5/FGITQVzsgc1KIg6vA
Mmo+Xg5CiY/80LSf2UqSFwdOKDHoMVyBvk6jq0DzTRTs1uUHLFui9ciAyYGq8fHOhDqzzoCpgkE9
Ta/q9aW/VnRO/IR6zaVjs8DvQ9Q/byvOji+YzUV7ztGHji4+b4t7LZqHuwO9+NRR9aZ0YFFWKQF8
oXu+5D2rgfee2RlmG3/CO+mFXbq0SgJ35cGm+JxKjEnnY6vsjlOI9+aSsPUqdSNMj3FBD3qdMHHA
UgCXpjj46xaeI6GBnAZtrfZJOLMKLKg6VrDCz5jaF6K6/LjWGpwt36FHS430JTwlHtrvYIL3q03d
JUkWDWVaEtKLvglq9sj4HiRDHtFo3yjuyGe2D/nPY8ER4xwmtq/AQtvoptF7rdiAX9jdgRtn7+Bn
tJEZJc2GgD4yNZz3YBjWu3ziAaFrhXenYwImk8wNpOrj/M13t+Y7wz/D3/57Lz0b2K1kHixsVceP
ztUy8VYyH90e/AbvXU+sGZ8aDnKhc6LWu9nVPJp3CYqztfBbYCn6DDf9CG16+VBf9VswK6seI7h7
nqirt5POrDOr4LC/ExoMssgmx6EzlCfv+s2fnS5e3vIM9VeX0YDfaUxs04EHhXb/4KPzJoC1ac2j
jeltac5xgc5gd++B3PYI22Bmw1g9KgFZx+cdGynQMv7ilz7yDuFKkP351LMfTiYsJQCpLhIpWguN
76FsiVd6cLRJX4LU5qEFOUQdt3OBGCfJHX7PD/x7H+9L1xThHJQYF+HiUrWYe6DvogzvTlKvM6OP
EijuT5i01wcF7Pg6QvgK4pZ6vlzo82dzCmBAD4zuU7MfFvypHbRLREZku91ma5+PBfjGjz8e7mbN
sFByKDkcInrs5Iv+oVDhoTppJTYI17LVHD4qvBBo4NB9lmyRDuYMt0bqkQWbJZteYsaBUpZNvFf7
Su+cVQ5gr6YI+7u3A0Sev83KJowLXyzuwGV7N5jR68Zc6iWPoB71O63gwiqEtU9sR3wq2yG0nPZE
ja7fuKuzmQ10PnUctriPysROHCtogQ3D+JPs3O1a1AGULo+SauHJ0XlpX8ZI350yeqCtmjFn4T0w
SHyDz/xqg+2xnU3UZWJP7VURXHaDlgBHg8P00NhvfTy2rgqPRM1pvKIjmPwodNDU8neq2ydTZ2O7
C1F5qFKKX8PFHRZvXGFV81eaBMou4wcOjzLwbk/qpxe1/uyMoEAmCc4Yl4GQrZzstsgq+Qe9D4cZ
LFkbJCg6NjG9Wy8pWzj/3cBY7VSaymuf0eK2jnBXmRE+bZkJlqdp+NAf6gFbh9saPQ/d7Q6siTtg
HG1ubDtLnA8/1fuK/TpfXVbO7gijGR7pQ3e2EVPA2qJ170v+FosWWEPf6iBRgI6dku7Z0ianM2Jl
scWq4R/Z9qkd2l98YU3isD7z7UuBtw4V9Hbef+pZu2wEEE/3DCdFdXDF6j7NMOmftg99uXDFME1T
MCX6SA/OvAGTcm4tlJ1uFoHzsdQZvxHu8BQ+eurtkiXrcuobMLzW2Tf+TCZka+8or+FpYqypDhBU
5eahY3a54ftLBzULDTuF0f3t4X2N7+4QGrsUifHG8We0fdXzbCoOQHjzJlwbKFk/umYDMQsqmunt
0RUDj2ko+/QP6ofSW19xoDVQUHHuw289YJf3MiM+vwQ+f5Xf9Xw3OwIT3eNwcu59sDruWUBjOQ/f
/FPd+V0FDZKlWPrG71Nf1UJvYOyh2R+rUxQxL88D0KnUoeq8sfTV3+cBME6Pid5uPnDXZBOk6Fyf
ZRxe02wQd6LdweOknzHu4zYj42WV0P5QddSbjzv3qydMEMutia3sHgOmk0CDnZ5gjO1HEwm9kUsg
7R47cu6j7bBWwVlDfsy9sNHXls5jXV+R6sYt3hdSyJZQdAo4dvGeWtHJGeYUpzEUkpzhg3UT6l5c
+hVAw4rxJeUfmejpQYAe7+6CdUM4ZGPt2wW4RvRBTtK2rCdD9yE8lUGK3fdJjNa++ljoKDsbsnkN
F53pZyeF72U806h3OnfVpwqiOx8+fbEH2F1lv65QNhUrPp6FYGChsUuQ44QG+XyWpu72qOHhmi4W
tqSLrI8lfOXwu/4EWcF+WPwK3OHNF2y8++UP90gq6LuCgdXyqkXisl5N+DS3D38cDjOb2LrpQYap
R+1XvGckOz8lKB7gGVtNyA9MLeYO1uSe/I1/sm/kM4Q0FPAh4LfR3JwXH+7ulYbNfa6yZdqPBZyK
k0b4Q5zrM4hUAscBB/7mUXRgtsU2hJvLcaDhV691b0uDCJ2G2lcS2YqEX/1Q93NMFi0wIuZTJYQf
73yhd1Tm7Lee8Fffn2G0YyR5OTMEiiz81cNzNXkWKOjJwLio9IHXmm6Gh06JCVceLPbyo9BCttTY
9EGHhz562zGGhnn1iOQLA1jj65BCw5NaelNnkTGzLGcknMDNp+c0ASztKgWN+mZPeLtvXbYTdx0q
J5Zh4/OqgNhc5xWl92tMHca9I4L7xEPcqzDopdvfwGLnxxGQ15nhOMWSPj60xoROc8uwNRhwWC9Z
pwFXwzufhMctYKdJilFrhyN1JbQOS+ExCV5KMaA4JUUtkkYkcHekId1J23KYPlc8Q6F+nbHpiAYT
/H0cgqbY7nF2/SzDeidTAcXus6exvKsznvVVBWzcaVitLu+azP2OQ3nQ3vFR35rDSuswAN/zi6bc
0XZnTnvwEOXrlSyXThyW4v7wIS5eDwJvAj98/VGOXpVWUvVkDBnLrZhAr4MTTQ63NXv1JTThZz8D
ejZOlc7sgFNgs9SCr4xwBMOK3AQw8bDHTnhQ3fmcOxVQRVvyW+cdZrNw23RQTsuQXvdvEnXCTezg
V/9jXyyFYU7eUwC+649x0A/Z2F6bEH52UovvrviKmGQ0AZQjJPn0W8+WROs0eNPGJ/7uV0ZO4baH
7MnLOEorLZtruvGgbGw0uie3sl5W89Er6fFc0KvCDpHYEHuGj6zoaWC/FzDSOMqha60ZvcX3ldEw
TRN48Z83ImSSP7ztfeijDzftyLwXm2hpa0GFtVFb1Lrtp3q2XC+AN5+38a1Gu2HN1spCfpC5fu1F
Klubq05A9zIdvFPfE2P35xIgpQptvAOew8b+Up1RoEUJWVq31L/+xwKdOjk+SIcxG7/+BT6FmMM7
3f/UyzbaqlAe8RG7toOG1RtTC87BpyXKc5gzej51K5DTOiTrz3+MrtlC7Xzf0MygOViFneEgTWMG
EU/PdqDQxyr4xruvnP2rK1YCSGDPVkbk1dOjNS83IXynuUV2TjYPrBJYgooP4b/1XAbjwB0IspIX
wvjrJxfjcW+g/RjI3/zeEsG1ABUqTNAeopppx5CgQx2bRA53Z33xo9SCX/9CQ6fh9NkQtg5U+qnx
pcVadfbYvVV4lOwtdV3dHOayuvLwr5442DAah3uoAilsYuqm2Y0xv0g4aDTWilWeXob+5yerbRHS
3OMXtvRjl8DoNu588P0/oVlOZyTprMWHT5jUc1xsW8QB80z3NYb6Kk2tBQzz4vmgrZuaeGNooe/+
4Hs7hvo69zsICkuT8eGCz7qwSS0Ic1F442P08CIWJ0n+mx/OT3iumXB8KNAk4Zna25HLuqYJKyV/
uRXWb29uIHxnqYg7jDmOv/p5xYHTgs/HWbB52PYubV2nAr1xCjEOoxIsFSYQds1B9cEwCvqH65Cn
tLkAqblbeMZW2ZxBe3R3hKvzVV/VxvLh6SjVON9fpYg8thcffuOF7le+yJazUUL4He+DMDjWq9AK
Ofz6bep/kBmtgfoJwM+P//QSAcPowcpUF2oNZ6zz2+nSwmePI+p847G/eKIJvzzG59zsOKzFsYOQ
K7dbrOW2Fq3skRH48rueujh/RYufcwJMQ2nxV0n9AJ7TLgI8vh8G9oZr/eUxZwOs9v74t77MQWMb
cFrhAf/4AlnDdwJrkifY9rpZnwu1smBwMix6KRxVF+JFVmFgCasv3MpXtBYPJ4Bf/uR/84P1nlZ7
sDrZIba//n972BYB7ByXYrtGZT13ecYB4zNp1N33VT3xaOBgkjCf2l//skRPysHPTmmpcxzLaN4e
hgA+5DjA9qrE7rLDyAdAgwrWpdsH9N94Adc0EPGVxRMYXwKRBN1PMVXdwIrGSpzv6PIIbGorh4e+
3MP1DrlPevOVIi7ZOp24EH6su4rdp+nqc7o1eoBs74AzdXPK1s/q3OGzmAm9G45c08tbXuGrkUey
kblRX81LoMIg2kRkMa4gm8N+nuFq+hrWw/RabzNPWxEHjDPeqWKRfeshgXypxv6WB9eMNRFvIWJ0
2pff2GB94EpBT7uoaGRvJDAE6NSDn5/An6TUZ/JZUmgZqkj9n59Mt0YHvTeqMW5QFTFOnnv0KE3g
S9JFdsku6EOIL3dINet4ZswwPsbPj1Kz6gX2V/898tsWm5F20lnwVjsISTQS5uAh+/kLkBi+hfeL
a4P1Jdkqsh8f8ounaO2yJYeeax+wh5ZTti2ESoDILDVqA61gLdjLCrw/1AjnXaNFQmqqFpJkofzq
rXMtPLJDAwf7qpNNGJVs1Squh0shSr5eTWRgnZYoP/5Hf/V9eegnBzk93JJlSSx3+3EH86/+TJxs
0sePw2k/v4F97TW7rDrZBuTX8UT98HgBf+P9wlcXapy2a0Q9PQhhnM0FtnxhYKstqykiUS8RepR0
ff6ej+jnv9WrEGWdwB9D+OODNnu9o+X1MM8wf9kVVu3rJluaUuNQuz1wVP2spfv14xU0p8ih2lfv
j/1pGuH7cYmw7WIXTOZ+10FWVlsif/VmZa3JqHz1IE6Q2mdTzXcK+PmPmzpfmdjbaovW07mmh1sf
slwB+wKi+pLjQ35p3JHgYYXvQr3R5+x8shliwwBBWkzU6U6gZkpshYB1Jv7LE8lbEO+wjo869SWv
05dVkRQYO+aLajei12xrEg+ixEM07tddLcZJkENjy084v6V3MJfVk4dKTxu6S/U4E19GKoDlnYXY
GODodibRq59/95tOWQZi9887hMcW+BsBnrKuv1Qxep42nE8bpGX0OSUdsvqT9NX7N3196dMK3jGT
8Y8v0vvF4+DcKcDnC/rK2NVYC9gmUYXNuW3qsTnLPuT4QvnL36gZWjkgVnHF8VDw0ZJ06Rn89OXh
y5/nYotm8KwH2R+Ss66z80hWeLZvnb9aj0O09pJigVbYW2TzfG/qL//t4fvVfDCWudFlfc8F4Fuv
/RmUj+FvfP/48zGTXH3eo0aA3SFofHFNunruPlao6PcbwCY7qpFgrQGBJ0f18cXakujzOy/uZPTp
MVVeTOQbKf/xJh8Yu1GfnFtC/uphR65EMCpVpaLwg2O6D8I3G7n7cIfh5xhT9cu/tk2pwb/n704o
5ZrlUOb++m9POIvscxpzD/C6N/scM69s9kdlhPFbcrGfjL07J+uphfkwQiIAS3HZrAQxOoXPHtuq
ttdZ6ZxS2NCtgs3XfaOz1H+EEE/jiUZZsNX71wOkcHFaAWP+1X39b5LA5elCbHCtNyyf10xgfUQT
tsL2FgkuYHf441Xu+3TNVuUSeSDMjS3+zg9M9ypa4Tk71PjohoW7dNtUhd4hWKm1nwFgr7zI4U9f
eN/6Roqu5cBV0yvy+vLHJXx/PAhvpMV6L6VsNHZlBcUYOfiIPM9d3pakKi+xWagfd+eBedIgKXUV
E+zue60WY1LeUX1fH4Trmiqa72OmAtDGM7WGjZqtvixX8DQQ++cHojZLzhWyLmSlh52oAvLVI1Ai
RPXnznZqwQTOGZJBOn737wAYy9UZXomI/K6Sg4wqh2sMPvsV+Eyyxay5pHsJSmEb00MPqD5yZnqH
h13n+gVYAp0gq+GQ8nzr2KrkOXsqr7KFocZdsEfXSm+8/XGEsekH2NIClQnNoUrhV/9hz8nL7Dt/
Dfn87OIo0P1sCuaUh6Of8Pi51FM04scpgYdOinE4LZa7ts1gQOsyrth5C3B4r9E5h29gyT48qaK+
wh0KfvoIm6P7AsupWCowjKNMjU451bM1cw388lYfda4W8T//8tO3qL40NZ3C4QzltkMkXa5DvV7e
xgjD7WVHrbCVo/Vz5Xl4dYMdPnz546/fBNVrfiLsNDZsLTTY/+XBzlu418365DyABK32pQDPOuOC
HUGPdRWwffOBvn75E7D3fuRvP2TOft+V3/ltPOOdvrinZw7H10elKh7e+qwr8vyXJ+jLuxx+fAea
B9X20bc/Jy7r0wT1W+KoSUI4rHC3DWBQrTr2PmQEM51LE3KPgFDHPmn1ll7VCpW9EhDu64f7Je16
CPJDSv/y09/+QXIa8U14V+53fh3gdq5B/Uxn2boeNPJ3v3+8aL7HjgbsVvH9zVqGNVPNWkWFC3yM
maeDWZ3SBB57y8YPN1RdsT4ad3iuY5kweZ3YlHnajMjj8PDrxlwZKVgWKN/8JLydsmGcH64FvvyG
2l7yqIXJPDXw16+5uJ445C03E7Sl1egPYVHojNJGQy3sK7LVYkVfvL0yg/FWX+h+cT9gDo6z+ePZ
1PsQD/A7fbgrWtH09FLWjT7LmRKD24frCT9fB33clpICnVug4PjC37J5rG0Cw3MXUHVBU0buZKqg
9pAbIn/95dZpUAKNIvz2x6gwEJl/8Art04akzwyB1VkOFij8JvKRKZbRXDMmwFLv9vj2kMdo2Vyu
IXwKZw5rEoEDO5eage7lOyVojsmwGn53/ulzwmXRC3Q/fbbi7o3d8fqu57MUWCiWGxMb57cyrPHE
J3Cjdi+q7kUjEz+N3/x4Az7qzjZjiHox9Iz3RL/nq1v20tGAuXUx8SGWnJpwsttA8VE1VK0uh0E0
9WsFke0fiJArn+jbLwgB7qwdVvFwcLfpLr//rV93NfV0QZ96CF3zEGMHZCoQxtkq4MtZL1g7GLo7
f/UUcJos80EkGy7jnrWKmizkCbt4G7AU94uHCpBSAvNyD2axGTiYB82dgAFag6iatQY3jZZh/Cgs
9u0H8gicu+2X56g1PxmbHF6r4ogvQjOBbhaDFZqmusEnutHB+uuH3o2Vo4ar9+58wKqCEsOzsMtX
jM03q+pQrfYMHxVniZaW0zuknfPNr75m7CxGBrSLM4+vn2pwl3Qrj8BiRYKdahrc6ZhgCwqaUmNL
/zRg+PYf//qLxxBw0fzrXzqiUdCQShyjXz+tpM6QUe/L9+kkXToI1nuGL1qc6uu9VgVk1G6HHRO/
f/mz/vgNdpX247L+eWrR1WogVpOyAV9+3EKBozw2/L0BmKy+O0iiTsJJHL0yui0YD6XLsyTAdh71
Mj9mH/7zuxXwX//68+d//G4YtN0jf30vBkz5Mv3H/7kq8B/if4xt+nr9vYZAxrTI//n3/76B8M9n
6NrP9D+nrsnf4z///rMV/t41+GfqpvT1/z7/1/dV//Wv/wUAAP//AwBcfFVx4CAAAA==
headers:
CF-RAY:
- 931fcf607c16eb34-SJC
Connection:
- keep-alive
Content-Encoding:
- gzip
Content-Type:
- application/json
Date:
- Thu, 17 Apr 2025 23:47:53 GMT
Server:
- cloudflare
Set-Cookie:
- __cf_bm=CncSMPCav.9EJL3emmM0sTqugx5GN6_Oy8JPFBssVho-1744933673-1.0.1.1-Q1XMvHbQdrfEWkkCYeeNHwFdZ1NpjAGJ_0jOUYIk_APelFe7nCanjW_xlOj12b3JQql9.iWQDiHvCeYJDTWkdxnNiMQOEiFMYHX5YZXUuJs;
path=/; expires=Fri, 18-Apr-25 00:17:53 GMT; domain=.api.openai.com; HttpOnly;
Secure; SameSite=None
- _cfuvid=unfPTYCpF5COtm5PuZDuaJqlhefP0iibfjsXHc9lKq0-1744933673515-0.0.1.1-604800000;
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
Transfer-Encoding:
- chunked
X-Content-Type-Options:
- nosniff
access-control-allow-origin:
- '*'
access-control-expose-headers:
- X-Request-ID
alt-svc:
- h3=":443"; ma=86400
cf-cache-status:
- DYNAMIC
openai-model:
- text-embedding-3-small
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '75'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=31536000; includeSubDomains; preload
via:
- envoy-router-8687b6cbdb-4qpmr
x-envoy-upstream-service-time:
- '46'
x-ratelimit-limit-requests:
- '10000'
x-ratelimit-limit-tokens:
- '10000000'
x-ratelimit-remaining-requests:
- '9999'
x-ratelimit-remaining-tokens:
- '9999986'
x-ratelimit-reset-requests:
- 6ms
x-ratelimit-reset-tokens:
- 0s
x-request-id:
- req_b8c884a7fe2bd4732903ecbdc632576d
status:
code: 200
message: OK
- request:
body: '{"messages": [{"role": "system", "content": "You are Information Agent.
You have access to specific knowledge sources.\nYour personal goal is: Provide
information based on knowledge sources\nTo give my best complete final answer
to the task respond using the exact following format:\n\nThought: I now can
give a great answer\nFinal Answer: Your final answer must be the great and the
most complete as possible, it must be outcome described.\n\nI MUST use these
formats, my job depends on it!"}, {"role": "user", "content": "\nCurrent Task:
What is Brandon''s favorite color?\n\nThis is the expected criteria for your
final answer: Brandon''s favorite color.\nyou MUST return the actual complete
content as the final answer, not a summary.\n\nBegin! This is VERY important
to you, use the tools available and give your best Final Answer, your job depends
on it!\n\nThought:"}], "model": "gpt-4o-mini", "stop": ["\nObservation:"]}'
headers:
accept:
- application/json
accept-encoding:
- gzip, deflate, zstd
connection:
- keep-alive
content-length:
- '926'
content-type:
- application/json
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.68.2
x-stainless-arch:
- arm64
x-stainless-async:
- 'false'
x-stainless-lang:
- python
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.68.2
x-stainless-raw-response:
- 'true'
x-stainless-read-timeout:
- '600.0'
x-stainless-retry-count:
- '0'
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
- 3.12.9
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
body:
string: !!binary |
H4sIAAAAAAAAAwAAAP//jJNNi9swEIbv+RWDLr0ki/NBvm5NYUsplFK29NAuZiKNnWlkjVaSk02X
/PdiJxtn2y30YrCeecfvvCM/9QAUG7UEpTeYdOXtYPXp7vbLw7evm4ePmObvt/jrsVgVn9/5dbhj
1W8Usv5JOj2rbrRU3lJicSesA2GiputwNpksxuPpbNyCSgzZRlb6NJjIoGLHg1E2mgyy2WA4P6s3
wpqiWsL3HgDAU/tsfDpDj2oJWf/5pKIYsSS1vBQBqCC2OVEYI8eELql+B7W4RK61/gGc7EGjg5J3
BAhlYxvQxT0FgB/ulh1aeNu+L2EV0BlxbyIUuJPAiUCLlQAcwUkCX68ta3sAI7quyCUywA72bMge
AHfIFteWYOtkb8mUBFHqoCneXPsLVNQRm4xcbe0VQOckYZNxm8z9mRwvWVgpfZB1/EOqCnYcN3kg
jOKauWMSr1p67AHct5nXL2JUPkjlU55kS+3nhtP5qZ/qVt3R0fQMkyS0V6rFpP9Kv9xQQrbxamtK
o96Q6aTdirE2LFegdzX1325e632anF35P+07oDX5RCb3gQzrlxN3ZYGaP+FfZZeUW8MqUtixpjwx
hWYThgqs7el+qniIiaq8YFdS8IFPl7TweTZejOajUbbIVO/Y+w0AAP//AwA4a1/QsgMAAA==
headers:
CF-RAY:
- 931fcf649bdbed40-SJC
Connection:
- keep-alive
Content-Encoding:
- gzip
Content-Type:
- application/json
Date:
- Thu, 17 Apr 2025 23:47:54 GMT
Server:
- cloudflare
Set-Cookie:
- __cf_bm=8vySwO0xqpm0u93_1_rXQwTeEIWa2ei3_CD5sAdoo3o-1744933674-1.0.1.1-iqZDpH5poOUp4Rcnhfrb0N2Z0c2662RBiPEcx_gefNW.m3tBA3qyFa8tmFv7PitH8u9vyYK7jxUwy4lPiSi830QWNbTMgCMTbrJ7iaUV7hY;
path=/; expires=Fri, 18-Apr-25 00:17:54 GMT; domain=.api.openai.com; HttpOnly;
Secure; SameSite=None
- _cfuvid=IXAyT8eWpFERM53ngcYNmaqhocfGbOHWSoe7SFNdoGI-1744933674288-0.0.1.1-604800000;
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
Transfer-Encoding:
- chunked
X-Content-Type-Options:
- nosniff
access-control-expose-headers:
- X-Request-ID
alt-svc:
- h3=":443"; ma=86400
cf-cache-status:
- DYNAMIC
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '489'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=31536000; includeSubDomains; preload
x-ratelimit-limit-requests:
- '30000'
x-ratelimit-limit-tokens:
- '150000000'
x-ratelimit-remaining-requests:
- '29999'
x-ratelimit-remaining-tokens:
- '149999801'
x-ratelimit-reset-requests:
- 2ms
x-ratelimit-reset-tokens:
- 0s
x-request-id:
- req_151f9d0b786f2022f249ee20ea108b43
status:
code: 200
message: OK
version: 1


@@ -0,0 +1,330 @@
interactions:
- request:
body: '{"input": ["Brandon''s favorite color is red and he likes Mexican food."],
"model": "text-embedding-3-small", "encoding_format": "base64"}'
headers:
accept:
- application/json
accept-encoding:
- gzip, deflate, zstd
connection:
- keep-alive
content-length:
- '137'
content-type:
- application/json
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.68.2
x-stainless-arch:
- arm64
x-stainless-async:
- 'false'
x-stainless-lang:
- python
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.68.2
x-stainless-read-timeout:
- '600'
x-stainless-retry-count:
- '0'
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
- 3.12.9
method: POST
uri: https://api.openai.com/v1/embeddings
response:
body:
string: !!binary |
H4sIAAAAAAAAA1SaWw+yPtfmz59Pced/yrwR2bV9zhAQ2UkRFHEymYAiOxHZtEDfvN99ovdkNicm
YiNpu1bXdf1W//Nff/7802V1fp/++feff17VOP3z377PHumU/vPvP//9X3/+/Pnzn7/P/29k3mb5
41G9i9/w34/V+5Ev//z7D/9/nvzfQf/+889JZZQeb+UOCJHjaQqtRQfv1mXUaf0OTfTuHjx+CvAU
CWC/KEik3pNeZScAwr5Zzkgne4Gqd6jX2+oDW3BxGx8nMkfrxaq0GM2PNaV5G6QZM0P1joRJl32W
BVuXtTPPo02jZhRX8gjWdj7MgDz2D+wRexhoUHsaTN9P0RfLw5itFrmbCgzCHVFOdx/wQte1qJvK
FxH556YeT0pqoJ0RTNhqPwiskhTe0T7qIpzrwS4arGS24D4uc6y90d4VpMq+wy7hntS8mG297p2B
QNrwZ5pV1p4RZ6vPEHEPDtuWVA0L/w451CylgPUZBhFvXaQWWqXwwKobdNmavrcCyvDk4aRDezby
2c5Hym33obtXgLKlioYGKUfZovrOBtkSu6cOQbHw6POcJoyE+vmMxtegUst+HNh2ZImhpFTNsJ4V
73p1XlWO8NsSscGvH7asThqCqtoe6anKt+6SsI0K91Ef0UtGS5dIB8DD3QV2vtwaybB8lksCj5K7
pTjWccSrU5igG+hfVOuSRl+4sPGhtJ4Qtfr4w4g0EQcaRfCgnrKxXVFD8hnSZ6jTk5PN9ax4JxU5
T/5MsnTwskU6ARNOHnCJYHpsoPbgFfDzsRYaLe9dPbfLoiDTls800h+6vjruXYDSGiGs3fJFXw5Z
2APLfh+ok3y2YDvrfQvLm3/x5fPervlV0QUUF5c33aFAzQRua0JwHeCKb5/FGITQVzsgc1KIg6vA
Mmo+Xg5CiY/80LSf2UqSFwdOKDHoMVyBvk6jq0DzTRTs1uUHLFui9ciAyYGq8fHOhDqzzoCpgkE9
Ta/q9aW/VnRO/IR6zaVjs8DvQ9Q/byvOji+YzUV7ztGHji4+b4t7LZqHuwO9+NRR9aZ0YFFWKQF8
oXu+5D2rgfee2RlmG3/CO+mFXbq0SgJ35cGm+JxKjEnnY6vsjlOI9+aSsPUqdSNMj3FBD3qdMHHA
UgCXpjj46xaeI6GBnAZtrfZJOLMKLKg6VrDCz5jaF6K6/LjWGpwt36FHS430JTwlHtrvYIL3q03d
JUkWDWVaEtKLvglq9sj4HiRDHtFo3yjuyGe2D/nPY8ER4xwmtq/AQtvoptF7rdiAX9jdgRtn7+Bn
tJEZJc2GgD4yNZz3YBjWu3ziAaFrhXenYwImk8wNpOrj/M13t+Y7wz/D3/57Lz0b2K1kHixsVceP
ztUy8VYyH90e/AbvXU+sGZ8aDnKhc6LWu9nVPJp3CYqztfBbYCn6DDf9CG16+VBf9VswK6seI7h7
nqirt5POrDOr4LC/ExoMssgmx6EzlCfv+s2fnS5e3vIM9VeX0YDfaUxs04EHhXb/4KPzJoC1ac2j
jeltac5xgc5gd++B3PYI22Bmw1g9KgFZx+cdGynQMv7ilz7yDuFKkP351LMfTiYsJQCpLhIpWguN
76FsiVd6cLRJX4LU5qEFOUQdt3OBGCfJHX7PD/x7H+9L1xThHJQYF+HiUrWYe6DvogzvTlKvM6OP
EijuT5i01wcF7Pg6QvgK4pZ6vlzo82dzCmBAD4zuU7MfFvypHbRLREZku91ma5+PBfjGjz8e7mbN
sFByKDkcInrs5Iv+oVDhoTppJTYI17LVHD4qvBBo4NB9lmyRDuYMt0bqkQWbJZteYsaBUpZNvFf7
Su+cVQ5gr6YI+7u3A0Sev83KJowLXyzuwGV7N5jR68Zc6iWPoB71O63gwiqEtU9sR3wq2yG0nPZE
ja7fuKuzmQ10PnUctriPysROHCtogQ3D+JPs3O1a1AGULo+SauHJ0XlpX8ZI350yeqCtmjFn4T0w
SHyDz/xqg+2xnU3UZWJP7VURXHaDlgBHg8P00NhvfTy2rgqPRM1pvKIjmPwodNDU8neq2ydTZ2O7
C1F5qFKKX8PFHRZvXGFV81eaBMou4wcOjzLwbk/qpxe1/uyMoEAmCc4Yl4GQrZzstsgq+Qe9D4cZ
LFkbJCg6NjG9Wy8pWzj/3cBY7VSaymuf0eK2jnBXmRE+bZkJlqdp+NAf6gFbh9saPQ/d7Q6siTtg
HG1ubDtLnA8/1fuK/TpfXVbO7gijGR7pQ3e2EVPA2qJ170v+FosWWEPf6iBRgI6dku7Z0ianM2Jl
scWq4R/Z9qkd2l98YU3isD7z7UuBtw4V9Hbef+pZu2wEEE/3DCdFdXDF6j7NMOmftg99uXDFME1T
MCX6SA/OvAGTcm4tlJ1uFoHzsdQZvxHu8BQ+eurtkiXrcuobMLzW2Tf+TCZka+8or+FpYqypDhBU
5eahY3a54ftLBzULDTuF0f3t4X2N7+4QGrsUifHG8We0fdXzbCoOQHjzJlwbKFk/umYDMQsqmunt
0RUDj2ko+/QP6ofSW19xoDVQUHHuw289YJf3MiM+vwQ+f5Xf9Xw3OwIT3eNwcu59sDruWUBjOQ/f
/FPd+V0FDZKlWPrG71Nf1UJvYOyh2R+rUxQxL88D0KnUoeq8sfTV3+cBME6Pid5uPnDXZBOk6Fyf
ZRxe02wQd6LdweOknzHu4zYj42WV0P5QddSbjzv3qydMEMutia3sHgOmk0CDnZ5gjO1HEwm9kUsg
7R47cu6j7bBWwVlDfsy9sNHXls5jXV+R6sYt3hdSyJZQdAo4dvGeWtHJGeYUpzEUkpzhg3UT6l5c
+hVAw4rxJeUfmejpQYAe7+6CdUM4ZGPt2wW4RvRBTtK2rCdD9yE8lUGK3fdJjNa++ljoKDsbsnkN
F53pZyeF72U806h3OnfVpwqiOx8+fbEH2F1lv65QNhUrPp6FYGChsUuQ44QG+XyWpu72qOHhmi4W
tqSLrI8lfOXwu/4EWcF+WPwK3OHNF2y8++UP90gq6LuCgdXyqkXisl5N+DS3D38cDjOb2LrpQYap
R+1XvGckOz8lKB7gGVtNyA9MLeYO1uSe/I1/sm/kM4Q0FPAh4LfR3JwXH+7ulYbNfa6yZdqPBZyK
k0b4Q5zrM4hUAscBB/7mUXRgtsU2hJvLcaDhV691b0uDCJ2G2lcS2YqEX/1Q93NMFi0wIuZTJYQf
73yhd1Tm7Lee8Fffn2G0YyR5OTMEiiz81cNzNXkWKOjJwLio9IHXmm6Gh06JCVceLPbyo9BCttTY
9EGHhz562zGGhnn1iOQLA1jj65BCw5NaelNnkTGzLGcknMDNp+c0ASztKgWN+mZPeLtvXbYTdx0q
J5Zh4/OqgNhc5xWl92tMHca9I4L7xEPcqzDopdvfwGLnxxGQ15nhOMWSPj60xoROc8uwNRhwWC9Z
pwFXwzufhMctYKdJilFrhyN1JbQOS+ExCV5KMaA4JUUtkkYkcHekId1J23KYPlc8Q6F+nbHpiAYT
/H0cgqbY7nF2/SzDeidTAcXus6exvKsznvVVBWzcaVitLu+azP2OQ3nQ3vFR35rDSuswAN/zi6bc
0XZnTnvwEOXrlSyXThyW4v7wIS5eDwJvAj98/VGOXpVWUvVkDBnLrZhAr4MTTQ63NXv1JTThZz8D
ejZOlc7sgFNgs9SCr4xwBMOK3AQw8bDHTnhQ3fmcOxVQRVvyW+cdZrNw23RQTsuQXvdvEnXCTezg
V/9jXyyFYU7eUwC+649x0A/Z2F6bEH52UovvrviKmGQ0AZQjJPn0W8+WROs0eNPGJ/7uV0ZO4baH
7MnLOEorLZtruvGgbGw0uie3sl5W89Er6fFc0KvCDpHYEHuGj6zoaWC/FzDSOMqha60ZvcX3ldEw
TRN48Z83ImSSP7ztfeijDzftyLwXm2hpa0GFtVFb1Lrtp3q2XC+AN5+38a1Gu2HN1spCfpC5fu1F
Klubq05A9zIdvFPfE2P35xIgpQptvAOew8b+Up1RoEUJWVq31L/+xwKdOjk+SIcxG7/+BT6FmMM7
3f/UyzbaqlAe8RG7toOG1RtTC87BpyXKc5gzej51K5DTOiTrz3+MrtlC7Xzf0MygOViFneEgTWMG
EU/PdqDQxyr4xruvnP2rK1YCSGDPVkbk1dOjNS83IXynuUV2TjYPrBJYgooP4b/1XAbjwB0IspIX
wvjrJxfjcW+g/RjI3/zeEsG1ABUqTNAeopppx5CgQx2bRA53Z33xo9SCX/9CQ6fh9NkQtg5U+qnx
pcVadfbYvVV4lOwtdV3dHOayuvLwr5442DAah3uoAilsYuqm2Y0xv0g4aDTWilWeXob+5yerbRHS
3OMXtvRjl8DoNu588P0/oVlOZyTprMWHT5jUc1xsW8QB80z3NYb6Kk2tBQzz4vmgrZuaeGNooe/+
4Hs7hvo69zsICkuT8eGCz7qwSS0Ic1F442P08CIWJ0n+mx/OT3iumXB8KNAk4Zna25HLuqYJKyV/
uRXWb29uIHxnqYg7jDmOv/p5xYHTgs/HWbB52PYubV2nAr1xCjEOoxIsFSYQds1B9cEwCvqH65Cn
tLkAqblbeMZW2ZxBe3R3hKvzVV/VxvLh6SjVON9fpYg8thcffuOF7le+yJazUUL4He+DMDjWq9AK
Ofz6bep/kBmtgfoJwM+P//QSAcPowcpUF2oNZ6zz2+nSwmePI+p847G/eKIJvzzG59zsOKzFsYOQ
K7dbrOW2Fq3skRH48rueujh/RYufcwJMQ2nxV0n9AJ7TLgI8vh8G9oZr/eUxZwOs9v74t77MQWMb
cFrhAf/4AlnDdwJrkifY9rpZnwu1smBwMix6KRxVF+JFVmFgCasv3MpXtBYPJ4Bf/uR/84P1nlZ7
sDrZIba//n972BYB7ByXYrtGZT13ecYB4zNp1N33VT3xaOBgkjCf2l//skRPysHPTmmpcxzLaN4e
hgA+5DjA9qrE7rLDyAdAgwrWpdsH9N94Adc0EPGVxRMYXwKRBN1PMVXdwIrGSpzv6PIIbGorh4e+
3MP1DrlPevOVIi7ZOp24EH6su4rdp+nqc7o1eoBs74AzdXPK1s/q3OGzmAm9G45c08tbXuGrkUey
kblRX81LoMIg2kRkMa4gm8N+nuFq+hrWw/RabzNPWxEHjDPeqWKRfeshgXypxv6WB9eMNRFvIWJ0
2pff2GB94EpBT7uoaGRvJDAE6NSDn5/An6TUZ/JZUmgZqkj9n59Mt0YHvTeqMW5QFTFOnnv0KE3g
S9JFdsku6EOIL3dINet4ZswwPsbPj1Kz6gX2V/898tsWm5F20lnwVjsISTQS5uAh+/kLkBi+hfeL
a4P1Jdkqsh8f8ounaO2yJYeeax+wh5ZTti2ESoDILDVqA61gLdjLCrw/1AjnXaNFQmqqFpJkofzq
rXMtPLJDAwf7qpNNGJVs1Squh0shSr5eTWRgnZYoP/5Hf/V9eegnBzk93JJlSSx3+3EH86/+TJxs
0sePw2k/v4F97TW7rDrZBuTX8UT98HgBf+P9wlcXapy2a0Q9PQhhnM0FtnxhYKstqykiUS8RepR0
ff6ej+jnv9WrEGWdwB9D+OODNnu9o+X1MM8wf9kVVu3rJluaUuNQuz1wVP2spfv14xU0p8ih2lfv
j/1pGuH7cYmw7WIXTOZ+10FWVlsif/VmZa3JqHz1IE6Q2mdTzXcK+PmPmzpfmdjbaovW07mmh1sf
slwB+wKi+pLjQ35p3JHgYYXvQr3R5+x8shliwwBBWkzU6U6gZkpshYB1Jv7LE8lbEO+wjo869SWv
05dVkRQYO+aLajei12xrEg+ixEM07tddLcZJkENjy084v6V3MJfVk4dKTxu6S/U4E19GKoDlnYXY
GODodibRq59/95tOWQZi9887hMcW+BsBnrKuv1Qxep42nE8bpGX0OSUdsvqT9NX7N3196dMK3jGT
8Y8v0vvF4+DcKcDnC/rK2NVYC9gmUYXNuW3qsTnLPuT4QvnL36gZWjkgVnHF8VDw0ZJ06Rn89OXh
y5/nYotm8KwH2R+Ss66z80hWeLZvnb9aj0O09pJigVbYW2TzfG/qL//t4fvVfDCWudFlfc8F4Fuv
/RmUj+FvfP/48zGTXH3eo0aA3SFofHFNunruPlao6PcbwCY7qpFgrQGBJ0f18cXakujzOy/uZPTp
MVVeTOQbKf/xJh8Yu1GfnFtC/uphR65EMCpVpaLwg2O6D8I3G7n7cIfh5xhT9cu/tk2pwb/n704o
5ZrlUOb++m9POIvscxpzD/C6N/scM69s9kdlhPFbcrGfjL07J+uphfkwQiIAS3HZrAQxOoXPHtuq
ttdZ6ZxS2NCtgs3XfaOz1H+EEE/jiUZZsNX71wOkcHFaAWP+1X39b5LA5elCbHCtNyyf10xgfUQT
tsL2FgkuYHf441Xu+3TNVuUSeSDMjS3+zg9M9ypa4Tk71PjohoW7dNtUhd4hWKm1nwFgr7zI4U9f
eN/6Roqu5cBV0yvy+vLHJXx/PAhvpMV6L6VsNHZlBcUYOfiIPM9d3pakKi+xWagfd+eBedIgKXUV
E+zue60WY1LeUX1fH4Trmiqa72OmAtDGM7WGjZqtvixX8DQQ++cHojZLzhWyLmSlh52oAvLVI1Ai
RPXnznZqwQTOGZJBOn737wAYy9UZXomI/K6Sg4wqh2sMPvsV+Eyyxay5pHsJSmEb00MPqD5yZnqH
h13n+gVYAp0gq+GQ8nzr2KrkOXsqr7KFocZdsEfXSm+8/XGEsekH2NIClQnNoUrhV/9hz8nL7Dt/
Dfn87OIo0P1sCuaUh6Of8Pi51FM04scpgYdOinE4LZa7ts1gQOsyrth5C3B4r9E5h29gyT48qaK+
wh0KfvoIm6P7AsupWCowjKNMjU451bM1cw388lYfda4W8T//8tO3qL40NZ3C4QzltkMkXa5DvV7e
xgjD7WVHrbCVo/Vz5Xl4dYMdPnz546/fBNVrfiLsNDZsLTTY/+XBzlu418365DyABK32pQDPOuOC
HUGPdRWwffOBvn75E7D3fuRvP2TOft+V3/ltPOOdvrinZw7H10elKh7e+qwr8vyXJ+jLuxx+fAea
B9X20bc/Jy7r0wT1W+KoSUI4rHC3DWBQrTr2PmQEM51LE3KPgFDHPmn1ll7VCpW9EhDu64f7Je16
CPJDSv/y09/+QXIa8U14V+53fh3gdq5B/Uxn2boeNPJ3v3+8aL7HjgbsVvH9zVqGNVPNWkWFC3yM
maeDWZ3SBB57y8YPN1RdsT4ad3iuY5kweZ3YlHnajMjj8PDrxlwZKVgWKN/8JLydsmGcH64FvvyG
2l7yqIXJPDXw16+5uJ445C03E7Sl1egPYVHojNJGQy3sK7LVYkVfvL0yg/FWX+h+cT9gDo6z+ePZ
1PsQD/A7fbgrWtH09FLWjT7LmRKD24frCT9fB33clpICnVug4PjC37J5rG0Cw3MXUHVBU0buZKqg
9pAbIn/95dZpUAKNIvz2x6gwEJl/8Art04akzwyB1VkOFij8JvKRKZbRXDMmwFLv9vj2kMdo2Vyu
IXwKZw5rEoEDO5eage7lOyVojsmwGn53/ulzwmXRC3Q/fbbi7o3d8fqu57MUWCiWGxMb57cyrPHE
J3Cjdi+q7kUjEz+N3/x4Az7qzjZjiHox9Iz3RL/nq1v20tGAuXUx8SGWnJpwsttA8VE1VK0uh0E0
9WsFke0fiJArn+jbLwgB7qwdVvFwcLfpLr//rV93NfV0QZ96CF3zEGMHZCoQxtkq4MtZL1g7GLo7
f/UUcJos80EkGy7jnrWKmizkCbt4G7AU94uHCpBSAvNyD2axGTiYB82dgAFag6iatQY3jZZh/Cgs
9u0H8gicu+2X56g1PxmbHF6r4ogvQjOBbhaDFZqmusEnutHB+uuH3o2Vo4ar9+58wKqCEsOzsMtX
jM03q+pQrfYMHxVniZaW0zuknfPNr75m7CxGBrSLM4+vn2pwl3Qrj8BiRYKdahrc6ZhgCwqaUmNL
/zRg+PYf//qLxxBw0fzrXzqiUdCQShyjXz+tpM6QUe/L9+kkXToI1nuGL1qc6uu9VgVk1G6HHRO/
f/mz/vgNdpX247L+eWrR1WogVpOyAV9+3EKBozw2/L0BmKy+O0iiTsJJHL0yui0YD6XLsyTAdh71
Mj9mH/7zuxXwX//68+d//G4YtN0jf30vBkz5Mv3H/7kq8B/if4xt+nr9vYZAxrTI//n3/76B8M9n
6NrP9D+nrsnf4z///rMV/t41+GfqpvT1/z7/1/dV//Wv/wUAAP//AwBcfFVx4CAAAA==
headers:
CF-RAY:
- 931fceef786ded38-SJC
Connection:
- keep-alive
Content-Encoding:
- gzip
Content-Type:
- application/json
Date:
- Thu, 17 Apr 2025 23:47:35 GMT
Server:
- cloudflare
Set-Cookie:
- __cf_bm=fj4RMXSXRDQjE2CFC6CGC3dVcJ8cl2Cbu8alijwMHA8-1744933655-1.0.1.1-M3c3AI4XQa.0GJoanNACuOm2aEL4xjqHR1grxIP3olFvq3e0eFHwQTvCF20YwR_OLiMJUH87eNUwgziawMccsxjR9OVZyDr5._5Wts6CrqA;
path=/; expires=Fri, 18-Apr-25 00:17:35 GMT; domain=.api.openai.com; HttpOnly;
Secure; SameSite=None
- _cfuvid=MSkpJsQZtdyIGvrl2mIwy0a_We8H6CIrS7etFgRBl2Y-1744933655703-0.0.1.1-604800000;
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
Transfer-Encoding:
- chunked
X-Content-Type-Options:
- nosniff
access-control-allow-origin:
- '*'
access-control-expose-headers:
- X-Request-ID
alt-svc:
- h3=":443"; ma=86400
cf-cache-status:
- DYNAMIC
openai-model:
- text-embedding-3-small
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '140'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=31536000; includeSubDomains; preload
via:
- envoy-router-84959bbcd5-rzqvq
x-envoy-upstream-service-time:
- '110'
x-ratelimit-limit-requests:
- '10000'
x-ratelimit-limit-tokens:
- '10000000'
x-ratelimit-remaining-requests:
- '9999'
x-ratelimit-remaining-tokens:
- '9999986'
x-ratelimit-reset-requests:
- 6ms
x-ratelimit-reset-tokens:
- 0s
x-request-id:
- req_dd3ef61c4765b46ed7db80ddfe261f41
status:
code: 200
message: OK
- request:
body: '{"messages": [{"role": "system", "content": "You are Information Agent.
You have access to specific knowledge sources.\nYour personal goal is: Provide
information based on knowledge sources\nTo give my best complete final answer
to the task respond using the exact following format:\n\nThought: I now can
give a great answer\nFinal Answer: Your final answer must be the great and the
most complete as possible, it must be outcome described.\n\nI MUST use these
formats, my job depends on it!"}, {"role": "user", "content": "\nCurrent Task:
What is Brandon''s favorite color?\n\nThis is the expected criteria for your
final answer: Brandon''s favorite color.\nyou MUST return the actual complete
content as the final answer, not a summary.\n\nBegin! This is VERY important
to you, use the tools available and give your best Final Answer, your job depends
on it!\n\nThought:"}], "model": "gpt-4o-mini", "stop": ["\nObservation:"]}'
headers:
accept:
- application/json
accept-encoding:
- gzip, deflate, zstd
connection:
- keep-alive
content-length:
- '926'
content-type:
- application/json
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.68.2
x-stainless-arch:
- arm64
x-stainless-async:
- 'false'
x-stainless-lang:
- python
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.68.2
x-stainless-raw-response:
- 'true'
x-stainless-read-timeout:
- '600.0'
x-stainless-retry-count:
- '0'
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
- 3.12.9
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
body:
string: !!binary |
H4sIAAAAAAAAAwAAAP//jFLBbtswDL37KwhdeokLx0mbOLd2WIAC63YZdthWGIpMO9xkUZDkpEWR
fx/kpLG7dUAvBszHR733yOcEQFAlViDUVgbVWp3efv66LvKPRd7Q/f57vtaf7m6ePjze774sv23E
JDJ48wtVeGFdKm6txkBsjrByKAPGqdPFfF7MZtdXVz3QcoU60hob0jmnLRlK8yyfp9kinS5P7C2T
Qi9W8CMBAHjuv1GnqfBRrCCbvFRa9F42KFbnJgDhWMeKkN6TD9IEMRlAxSag6aXfgeE9KGmgoR2C
hCbKBmn8Hh3AT7MmIzXc9P8ruHXSVGwuPNRyx44CgmLNDsjDRnd4OX7GYd15Ga2aTusRII3hIGNU
vcGHE3I4W9LcWMcb/xdV1GTIb0uH0rOJ8n1gK3r0kAA89NF1r9IQ1nFrQxn4N/bPTa+Xx3li2NgI
LU5g4CD1qL5cTN6YV1YYJGk/Cl8oqbZYDdRhU7KriEdAMnL9r5q3Zh+dk2neM34AlEIbsCqtw4rU
a8dDm8N40P9rO6fcCxYe3Y4UloHQxU1UWMtOH89M+CcfsC1rMg066+h4a7Uts1mRL/M8KzKRHJI/
AAAA//8DALRhJdF5AwAA
headers:
CF-Cache-Status:
- DYNAMIC
CF-RAY:
- 931fcef51a67f947-SJC
Connection:
- keep-alive
Content-Encoding:
- gzip
Content-Type:
- application/json
Date:
- Thu, 17 Apr 2025 23:47:36 GMT
Server:
- cloudflare
Set-Cookie:
- __cf_bm=7agwu5JV1OJvEFNhvfvqdgWf.HoMyIni9D85soRl3WE-1744933656-1.0.1.1-dKUwAZnjjuuiswFKWGsxpwHNBJUpjhYlZvfZpyNQIejxEJrXMCppgPvtQ9wa4SKezLmKqftvn_H.bAx_AEFJD2EWm5V6R_uK8.odneErR6A;
path=/; expires=Fri, 18-Apr-25 00:17:36 GMT; domain=.api.openai.com; HttpOnly;
Secure; SameSite=None
- _cfuvid=LdTrzwZYrB6ZyQLY7NdaaHVpDVFvIjYm3arSpNy87wU-1744933656504-0.0.1.1-604800000;
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
Transfer-Encoding:
- chunked
X-Content-Type-Options:
- nosniff
access-control-expose-headers:
- X-Request-ID
alt-svc:
- h3=":443"; ma=86400
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '540'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=31536000; includeSubDomains; preload
x-ratelimit-limit-requests:
- '30000'
x-ratelimit-limit-tokens:
- '150000000'
x-ratelimit-remaining-requests:
- '29999'
x-ratelimit-remaining-tokens:
- '149999802'
x-ratelimit-reset-requests:
- 2ms
x-ratelimit-reset-tokens:
- 0s
x-request-id:
- req_8837be6510731522fd5ac4b75c11d486
status:
code: 200
message: OK
version: 1


@@ -0,0 +1,59 @@
interactions:
- request:
body: '{"contents": [{"role": "user", "parts": [{"text": "What is the capital
of France?"}]}], "generationConfig": {"stop_sequences": []}}'
headers:
accept:
- '*/*'
accept-encoding:
- gzip, deflate
connection:
- keep-alive
content-length:
- '131'
content-type:
- application/json
host:
- generativelanguage.googleapis.com
user-agent:
- litellm/1.60.2
method: POST
uri: https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash-001:generateContent
response:
body:
string: !!binary |
H4sIAAAAAAAC/62RTU+EMBCG7/0VTY9kIQUT1vXqx0njRokxUQ8jDNAILaFdoyH8dwssbNGrTdo0
807nnT7TEUpZCjITGRjU7IK+2Ail3XgOmpIGpbHCHLLBBlpzyp1W59xtisGv4RFLSqQpNMJARVVO
b1qQKVKhqeftoRXa84JXyZy3/XJ/25wcW1XhUK5WGVZzej8nsFxIocsHBK3kkPaY3O/ZosJncauK
plXvQ9M+D3gUxiE/tzs6i+JtyHdkth5N2UFDgXdowFKB5e/Mlqgbk6gPlJfqMFLZTi4Ow5W8O8pG
WQArJYw3f4rqK2spKhetQ93+HSphvkes188Jc/iYVU8zH+Jg/N3hP3nt1l7kOJVpUE/YajFNpMDa
zsiPAu7nFejS5zxkpCc/6so6tIECAAA=
headers:
Alt-Svc:
- h3=":443"; ma=2592000,h3-29=":443"; ma=2592000
Content-Encoding:
- gzip
Content-Type:
- application/json; charset=UTF-8
Date:
- Tue, 22 Apr 2025 14:25:05 GMT
Server:
- scaffolding on HTTPServer2
Server-Timing:
- gfet4t7; dur=1219
Transfer-Encoding:
- chunked
Vary:
- Origin
- X-Origin
- Referer
X-Content-Type-Options:
- nosniff
X-Frame-Options:
- SAMEORIGIN
X-XSS-Protection:
- '0'
status:
code: 200
message: OK
version: 1


@@ -0,0 +1,59 @@
interactions:
- request:
body: '{"contents": [{"role": "user", "parts": [{"text": "What is the capital
of France?"}]}], "generationConfig": {"stop_sequences": []}}'
headers:
accept:
- '*/*'
accept-encoding:
- gzip, deflate
connection:
- keep-alive
content-length:
- '131'
content-type:
- application/json
host:
- generativelanguage.googleapis.com
user-agent:
- litellm/1.60.2
method: POST
uri: https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash-lite-001:generateContent
response:
body:
string: !!binary |
H4sIAAAAAAAC/61RTU+EMBC98yuanhdSiMjq1Y+Txo0SY6J7GGGARmhJ2zUawn+3wMIWvdpDM5n3
Zt7Mm84jhGYgcp6DQU0vyavNENKN/4BJYVAYC8wpm2xBmRN3ep0TW4rBr6GIphWSDFpuoCayILcK
RIaEa7IDxXXwJqhT1y/xfnNSU7LGoVUjc6xnej8TaMEF19UjgpZioD2lDzu6oPBZ3smyVfJ9GNhn
AQuTMIpjFl/EZyxKwuTcm6VHUXrQUOI9GrCOwLI3tS2a1qTyA8WVPIyOJJOK498K3h5hI+3yKySM
N3+a6msryWvXVsdxuzvU3HyPlt68pNTxx6xmmv3xHBt/T/hPWtu1lne8ynSoZ1SaTxcpsbE38qOA
+UUNuvJtd/QZC6nXez+t5iqFggIAAA==
headers:
Alt-Svc:
- h3=":443"; ma=2592000,h3-29=":443"; ma=2592000
Content-Encoding:
- gzip
Content-Type:
- application/json; charset=UTF-8
Date:
- Tue, 22 Apr 2025 14:25:06 GMT
Server:
- scaffolding on HTTPServer2
Server-Timing:
- gfet4t7; dur=1090
Transfer-Encoding:
- chunked
Vary:
- Origin
- X-Origin
- Referer
X-Content-Type-Options:
- nosniff
X-Frame-Options:
- SAMEORIGIN
X-XSS-Protection:
- '0'
status:
code: 200
message: OK
version: 1


@@ -0,0 +1,58 @@
interactions:
- request:
body: '{"contents": [{"role": "user", "parts": [{"text": "What is the capital
of France?"}]}], "generationConfig": {"stop_sequences": []}}'
headers:
accept:
- '*/*'
accept-encoding:
- gzip, deflate
connection:
- keep-alive
content-length:
- '131'
content-type:
- application/json
host:
- generativelanguage.googleapis.com
user-agent:
- litellm/1.60.2
method: POST
uri: https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash-thinking-exp-01-21:generateContent
response:
body:
string: !!binary |
H4sIAAAAAAAC/22QQWuEMBCF7/6KkKNsFt1DKb2221vp0koplD0M66jDxkSSWbCI/71Rq+vS5pCE
eW9meF8XCSFPYHLKgdHLB/EVKkJ04z1o1jAaDsJcCsUGHF+90+lW/2BhbIcmmVUoTtAQgxa2EM8O
zAkFeRHHB3Dk43grV5398j9urvuc1TgMq22Oerb3s0EWZMhXbwjemsH2nr0e5KKSybEN5SSaF4yj
5cVDiS/IEJLDkk82ztYNZ/aM5tFexuT306wVp39ltiHkjZLebf4M9U9hJek1vhXZkBA08feIbv+Z
yRUFvlk6UxjfY/TLY0L0gc7TxKLEOtBRu22iCg2+UlyROZMpFbaNSlK1S2XURz/Fy1P+CAIAAA==
headers:
Alt-Svc:
- h3=":443"; ma=2592000,h3-29=":443"; ma=2592000
Content-Encoding:
- gzip
Content-Type:
- application/json; charset=UTF-8
Date:
- Tue, 22 Apr 2025 14:25:04 GMT
Server:
- scaffolding on HTTPServer2
Server-Timing:
- gfet4t7; dur=764
Transfer-Encoding:
- chunked
Vary:
- Origin
- X-Origin
- Referer
X-Content-Type-Options:
- nosniff
X-Frame-Options:
- SAMEORIGIN
X-XSS-Protection:
- '0'
status:
code: 200
message: OK
version: 1


@@ -0,0 +1,59 @@
interactions:
- request:
body: '{"contents": [{"role": "user", "parts": [{"text": "What is the capital
of France?"}]}], "generationConfig": {"stop_sequences": []}}'
headers:
accept:
- '*/*'
accept-encoding:
- gzip, deflate
connection:
- keep-alive
content-length:
- '131'
content-type:
- application/json
host:
- generativelanguage.googleapis.com
user-agent:
- litellm/1.60.2
method: POST
uri: https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash-preview-04-17:generateContent
response:
body:
string: !!binary |
H4sIAAAAAAAC/2WQT0+EMBDF7/0UTY9k2ewaDerVPzfjRokxMR4mMEBjaUk76BrCd7fAslvcHppm
5s2bvl/HOBcZ6FzmQOjELf/wFc678R56RhNq8o255IsNWDppp9MFby8h3A9DIq2QZ9BIAsVNwR8t
6Ay5dDyKdmCli6K1CCb74/tzddpnjcLBrDY5qlnezwJRSC1d9YLgjB5kr+nzThy7Uue49+UNmxeM
1qJ1UOITEvjkcMwnGmvqhlLzhfrOtGPy68kr4LRoJzeHPhmfcjmZrM5c3b3fKVXIL0DrI4KS9Duy
e3hPRYCBFtYzBhbQElSZtqzo3we37IBrIviG1skJVYm1hxdfrK/iQoGr4sbit8SfeHMZbxPBevYH
O2bXiSICAAA=
headers:
Alt-Svc:
- h3=":443"; ma=2592000,h3-29=":443"; ma=2592000
Content-Encoding:
- gzip
Content-Type:
- application/json; charset=UTF-8
Date:
- Tue, 22 Apr 2025 14:25:28 GMT
Server:
- scaffolding on HTTPServer2
Server-Timing:
- gfet4t7; dur=20971
Transfer-Encoding:
- chunked
Vary:
- Origin
- X-Origin
- Referer
X-Content-Type-Options:
- nosniff
X-Frame-Options:
- SAMEORIGIN
X-XSS-Protection:
- '0'
status:
code: 200
message: OK
version: 1


@@ -0,0 +1,59 @@
interactions:
- request:
body: '{"contents": [{"role": "user", "parts": [{"text": "What is the capital
of France?"}]}], "generationConfig": {"stop_sequences": []}}'
headers:
accept:
- '*/*'
accept-encoding:
- gzip, deflate
connection:
- keep-alive
content-length:
- '131'
content-type:
- application/json
host:
- generativelanguage.googleapis.com
user-agent:
- litellm/1.60.2
method: POST
uri: https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-pro-exp-03-25:generateContent
response:
body:
string: !!binary |
H4sIAAAAAAAC/12QT2uEMBDF7/kUIUepi2tZaHttu7fSpZVSKD0MOquhmkgygkX87o26cWM9SJj3
5s/7DYxzkYMqZAGEVjzwL1fhfJj/k6YVoSIn+JIrtmDo6l2+IXg7C2E/NYmsQp5DKwlqrs/8aEDl
yKXlUXQCI20U7UTQOa7v75vrPqNrnIY1usDa20dvEGeppK3eEKxWk+09ez2JVZWqwN6VE+YXzKNF
Z6HEFyRwyWHNJ1qjm5Yy/YPqUXdz8rtlVsBpI++T5GIg7WL+03xzMNc+ua2yDgkGcF1IqCX9zvSe
PzMRgKDNWR4EC3gJqnRXVrQ98T5lF2ALww80Vi6wSmwcvjjdHWJ3Yox9Gye3cXoQbGR/TedYqx4C
AAA=
headers:
Alt-Svc:
- h3=":443"; ma=2592000,h3-29=":443"; ma=2592000
Content-Encoding:
- gzip
Content-Type:
- application/json; charset=UTF-8
Date:
- Tue, 22 Apr 2025 14:25:30 GMT
Server:
- scaffolding on HTTPServer2
Server-Timing:
- gfet4t7; dur=2418
Transfer-Encoding:
- chunked
Vary:
- Origin
- X-Origin
- Referer
X-Content-Type-Options:
- nosniff
X-Frame-Options:
- SAMEORIGIN
X-XSS-Protection:
- '0'
status:
code: 200
message: OK
version: 1


@@ -0,0 +1,59 @@
interactions:
- request:
body: '{"contents": [{"role": "user", "parts": [{"text": "What is the capital
of France?"}]}], "generationConfig": {"stop_sequences": []}}'
headers:
accept:
- '*/*'
accept-encoding:
- gzip, deflate
connection:
- keep-alive
content-length:
- '131'
content-type:
- application/json
host:
- generativelanguage.googleapis.com
user-agent:
- litellm/1.60.2
method: POST
uri: https://generativelanguage.googleapis.com/v1beta/models/gemma-3-12b-it:generateContent
response:
body:
string: !!binary |
H4sIAAAAAAAC/2WRTWvDMAyG7/kVwpdBSEvX7jB23QfsMFa2MAZbD2qipGaOFWwFWkr/+5ykaVPq
gGP0SvLrR/sIQGVoc52jkFcP8BMiAPtubzW2QlaCMIRCsEYn59x+7UfnkCK0bYtUuiHIsNaCBriA
F4c2I9Ae4niJTvs4nv7a/nuVGw9oPIOEIoOuJC+QadmBtkNlsAoIpeF1aJgFZ+SgYAfBUQIF+o1m
m0CJXhxbrnZJV5E1RhpHUzUyeTidV8n5aY4Ntb4rzskM6YchQRXaar/5IPRs27TP9H2pTqq2OW1D
eBYNF3StVeOxpDcSDJDxhFLVjqtaUv4j+8hNB/m+7zUayYW8mB914QD0QrqbJVdd/VO4U5vxqEZT
DE9EE+h2Y3r+TtUIg1yYGjB0/1V0BNIz+iLndQ+jpKrCyWJyO19PtKjoEP0DlZtdIF8CAAA=
headers:
Alt-Svc:
- h3=":443"; ma=2592000,h3-29=":443"; ma=2592000
Content-Encoding:
- gzip
Content-Type:
- application/json; charset=UTF-8
Date:
- Tue, 22 Apr 2025 14:25:39 GMT
Server:
- scaffolding on HTTPServer2
Server-Timing:
- gfet4t7; dur=3835
Transfer-Encoding:
- chunked
Vary:
- Origin
- X-Origin
- Referer
X-Content-Type-Options:
- nosniff
X-Frame-Options:
- SAMEORIGIN
X-XSS-Protection:
- '0'
status:
code: 200
message: OK
version: 1


@@ -0,0 +1,60 @@
interactions:
- request:
body: '{"contents": [{"role": "user", "parts": [{"text": "What is the capital
of France?"}]}], "generationConfig": {"stop_sequences": []}}'
headers:
accept:
- '*/*'
accept-encoding:
- gzip, deflate
connection:
- keep-alive
content-length:
- '131'
content-type:
- application/json
host:
- generativelanguage.googleapis.com
user-agent:
- litellm/1.60.2
method: POST
uri: https://generativelanguage.googleapis.com/v1beta/models/gemma-3-1b-it:generateContent
response:
body:
string: !!binary |
H4sIAAAAAAAC/2VRy07DQAy85yusPUZtBSoIxJWHxAFRQYSQKAc3cVqL7DrKuqKlqsRv8Ht8CZuk
aVOxh314xuP1eBMBmBRdxhkqeXMFbyECsGn2GhOn5DQAXSgES6z0wG3XpncPFKVVnWSSBUGKJSsW
IDncVehSAvYQxxOs2MfxCKZu6u719/vHA0Iq1ooDyz6UTqlUDi9doELDr1M1aMbiinXcSQ9gtlTg
VqOGJc855VAztAZWvMInZ1SsoaJU5o6/KANxNDK9X2/39/fBoddKCqobsRLyO/q2I5icHfvFE6EX
V9Oek8eJ2aPsMlqF8EnUFWikzdLjnB5IMbiOe29NWYktNZEPcteybFy/bLV6MzqCxxc7XCXYcASd
nQ/+qfqbUJOL/ux6Yw0tYsG6buZ2+5qYng169KnOhuZ8j3aGtB69UOW5NWNO1uJwPDydDVlNtI3+
AD6XWQdvAgAA
headers:
Alt-Svc:
- h3=":443"; ma=2592000,h3-29=":443"; ma=2592000
Content-Encoding:
- gzip
Content-Type:
- application/json; charset=UTF-8
Date:
- Tue, 22 Apr 2025 14:25:32 GMT
Server:
- scaffolding on HTTPServer2
Server-Timing:
- gfet4t7; dur=1535
Transfer-Encoding:
- chunked
Vary:
- Origin
- X-Origin
- Referer
X-Content-Type-Options:
- nosniff
X-Frame-Options:
- SAMEORIGIN
X-XSS-Protection:
- '0'
status:
code: 200
message: OK
version: 1


@@ -0,0 +1,60 @@
interactions:
- request:
body: '{"contents": [{"role": "user", "parts": [{"text": "What is the capital
of France?"}]}], "generationConfig": {"stop_sequences": []}}'
headers:
accept:
- '*/*'
accept-encoding:
- gzip, deflate
connection:
- keep-alive
content-length:
- '131'
content-type:
- application/json
host:
- generativelanguage.googleapis.com
user-agent:
- litellm/1.60.2
method: POST
uri: https://generativelanguage.googleapis.com/v1beta/models/gemma-3-27b-it:generateContent
response:
body:
string: !!binary |
H4sIAAAAAAAC/2VRXUvDMBR976+45EUo3RDnUHwTnSA4HFpEcHuI7e16aZqU5NZNxv67abtuHTbQ
hHPu5zm7AEAkUqeUSkYn7uDLIwC79t9wRjNq9kQPebCSlk+x3bcbvH0I47ZJEnGOkMiKWCowGTxZ
qRMEchCGC2nJheEYlnqpn/nCQaHNRkNmLJDvSwkoP1kpbeFAUYHAvtiMsgwVxGaDNmqRF1P/WIR5
7bAuI/ApLXxvE0gRYkumrHL0hIMNKtXcxA4y6XIyOoKkJkcau8ykVlxbHDczNUcM1tof36voJIY1
CptNS5Oi6sP3fYDISJPL31A6o5uw9/h1IY4s6RS3Hr4M+gZtaVE7ucY5svS2yKP4orJ+F45NgfrB
1K0tt12tgYln9PX0wLPxFpxR00n0r6p79D1JDc0d+O5XlIr4tzV29hmLgQx8NlQvQ3uvgoMgnUYf
aB11YqyxLOVoMrq6+R4Ri2Af/AEDrXcbkQIAAA==
headers:
Alt-Svc:
- h3=":443"; ma=2592000,h3-29=":443"; ma=2592000
Content-Encoding:
- gzip
Content-Type:
- application/json; charset=UTF-8
Date:
- Tue, 22 Apr 2025 14:25:41 GMT
Server:
- scaffolding on HTTPServer2
Server-Timing:
- gfet4t7; dur=2447
Transfer-Encoding:
- chunked
Vary:
- Origin
- X-Origin
- Referer
X-Content-Type-Options:
- nosniff
X-Frame-Options:
- SAMEORIGIN
X-XSS-Protection:
- '0'
status:
code: 200
message: OK
version: 1


@@ -0,0 +1,60 @@
interactions:
- request:
body: '{"contents": [{"role": "user", "parts": [{"text": "What is the capital
of France?"}]}], "generationConfig": {"stop_sequences": []}}'
headers:
accept:
- '*/*'
accept-encoding:
- gzip, deflate
connection:
- keep-alive
content-length:
- '131'
content-type:
- application/json
host:
- generativelanguage.googleapis.com
user-agent:
- litellm/1.60.2
method: POST
uri: https://generativelanguage.googleapis.com/v1beta/models/gemma-3-4b-it:generateContent
response:
body:
string: !!binary |
H4sIAAAAAAAC/2WRzUrDQBDH73mKYY+lLUKLiBcPfoAHsWhQwXqYJtNk6WYn7E6woRQ8+wR68t18
Ah/BbWraFPeQXeb/z3z8ZhUBqARtqlMU8uoUnkMEYNV8NxpbIStBaEMhWKKTvXd7Vp13sAgtNz+p
OCdIsNSCBngOVw5tQqA99HoTdNr3ekOY2qm9lu+3Tw8ImeFZ8CahKDmYs4NQrA9z9Llm24cMvTi2
XNQQ2oakMlI5GsLP18d7k+mRK5NCzRUYvSAQhoXl12CuJdc2g4IdAc64Emg6OFOdzte790t/P69j
Q5thCk7JtPZ1a1BzbbXP7wg9243tPr6dqJ2qbUrLED6K2gJNalV5zOiGBAN53PFVpeOilJgXZM+5
asifbHN19nQgj1pdOFA+kMbH/X9Z/UWoqU13f53VhhHRaKmb3V0+xaqDQQ6aajE090v0B2TL6IGc
11sYGRUFDkaD8WygRUXr6BcxmBLccwIAAA==
headers:
Alt-Svc:
- h3=":443"; ma=2592000,h3-29=":443"; ma=2592000
Content-Encoding:
- gzip
Content-Type:
- application/json; charset=UTF-8
Date:
- Tue, 22 Apr 2025 14:25:35 GMT
Server:
- scaffolding on HTTPServer2
Server-Timing:
- gfet4t7; dur=2349
Transfer-Encoding:
- chunked
Vary:
- Origin
- X-Origin
- Referer
X-Content-Type-Options:
- nosniff
X-Frame-Options:
- SAMEORIGIN
X-XSS-Protection:
- '0'
status:
code: 200
message: OK
version: 1


@@ -0,0 +1,101 @@
interactions:
- request:
body: '{"messages": [{"role": "user", "content": "What is the capital of France?"}],
"model": "gpt-4.1-mini-2025-04-14", "stop": []}'
headers:
accept:
- application/json
accept-encoding:
- gzip, deflate
connection:
- keep-alive
content-length:
- '125'
content-type:
- application/json
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.68.2
x-stainless-arch:
- arm64
x-stainless-async:
- 'false'
x-stainless-lang:
- python
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.68.2
x-stainless-raw-response:
- 'true'
x-stainless-read-timeout:
- '600.0'
x-stainless-retry-count:
- '0'
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
- 3.11.12
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
body:
string: !!binary |
H4sIAAAAAAAAAwAAAP//jJJPb9swDMXv/hSCznEQe+7S5LgAGzDs0P3paSgMRqJtZrIkSPTQoch3
H2Snsbt1wC4+8MdHv0fxKRNCkpZ7IVUHrHpv8nd3t4eIH77w4ePX+/b+8XP56bTdHE7dgb2Sq6Rw
xxMqflatleu9QSZnJ6wCAmOaWmyrmzflrqrejqB3Gk2StZ7zal3kPVnKy015k2+qvKgu8s6Rwij3
4nsmhBBP4zcZtRof5V5sVs+VHmOEFuX+2iSEDM6kioQYKTJYlqsZKmcZ7ej9W4dCgScGI1wj3gew
CgVFcQeB4nqpCtgMEZJ1OxizAGCtY0jRR78PF3K+OjSu9cEd4x9S2ZCl2NUBITqb3ER2Xo70nAnx
MG5ieBFO+uB6zzW7Hzj+rqimcXJ+gBneXhg7BjOXy3L1yrBaIwOZuFikVKA61LNy3joMmtwCZIvI
f3t5bfYUm2z7P+NnoBR6Rl37gJrUy7xzW8B0nf9qu654NCwjhp+ksGbCkJ5BYwODmU5Gxl+Rsa8b
si0GH2i6m8bX291xuztiVTQyO2e/AQAA//8DAP7WRo9GAwAA
headers:
CF-RAY:
- 93458dcf6d0ef53b-GRU
Connection:
- keep-alive
Content-Encoding:
- gzip
Content-Type:
- application/json
Date:
- Tue, 22 Apr 2025 13:44:07 GMT
Server:
- cloudflare
Transfer-Encoding:
- chunked
X-Content-Type-Options:
- nosniff
access-control-expose-headers:
- X-Request-ID
alt-svc:
- h3=":443"; ma=86400
cf-cache-status:
- DYNAMIC
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '1391'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=31536000; includeSubDomains; preload
x-ratelimit-limit-requests:
- '30000'
x-ratelimit-limit-tokens:
- '150000000'
x-ratelimit-remaining-requests:
- '29999'
x-ratelimit-remaining-tokens:
- '149999989'
x-ratelimit-reset-requests:
- 2ms
x-ratelimit-reset-tokens:
- 0s
x-request-id:
- req_74408ec81f430b6e9795cf5332262b8d
status:
code: 200
message: OK
version: 1


@@ -0,0 +1,101 @@
interactions:
- request:
body: '{"messages": [{"role": "user", "content": "What is the capital of France?"}],
"model": "gpt-4.1-nano-2025-04-14", "stop": []}'
headers:
accept:
- application/json
accept-encoding:
- gzip, deflate
connection:
- keep-alive
content-length:
- '125'
content-type:
- application/json
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.68.2
x-stainless-arch:
- arm64
x-stainless-async:
- 'false'
x-stainless-lang:
- python
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.68.2
x-stainless-raw-response:
- 'true'
x-stainless-read-timeout:
- '600.0'
x-stainless-retry-count:
- '0'
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
- 3.11.12
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
body:
string: !!binary |
H4sIAAAAAAAAAwAAAP//jJJPb9swDMXv/hSCznEQuwrq5bgNBbZTh7aHYigMRqJtbbKkSXT/oMh3
L2Snsdt1wC4+8MdHv0fxOWOMa8V3jMsOSPbe5J8vqy/D96K/+PPjSn/Fh5szMewfb/eiar/d8lVS
uP0vlPSqWkvXe4OknZ2wDAiEaWpxLrZn5SchqhH0TqFJstZTLtZFbsG6vNyU23wj8kIc5Z3TEiPf
sZ8ZY4w9j99k1Cp85Du2Wb1WeowRWuS7UxNjPDiTKhxi1JHAEl/NUDpLaEfv1x0yCV4TGOYadhHA
SmQ6sksIOq6XqoDNECFZt4MxCwDWOoIUffR7dySHk0PjWh/cPr6T8kZbHbs6IERnk5tIzvORHjLG
7sZNDG/CcR9c76km9xvH3xViGsfnB5hhdWTkCMxcLsvVB8NqhQTaxMUiuQTZoZqV89ZhUNotQLaI
/LeXj2ZPsbVt/2f8DKRET6hqH1Bp+Tbv3BYwXee/2k4rHg3ziOFeS6xJY0jPoLCBwUwnw+NTJOzr
RtsWgw96upvG14gKq2ajxJZnh+wFAAD//wMATWCJPkYDAAA=
headers:
CF-RAY:
- 93458dd9aebff53b-GRU
Connection:
- keep-alive
Content-Encoding:
- gzip
Content-Type:
- application/json
Date:
- Tue, 22 Apr 2025 13:44:08 GMT
Server:
- cloudflare
Transfer-Encoding:
- chunked
X-Content-Type-Options:
- nosniff
access-control-expose-headers:
- X-Request-ID
alt-svc:
- h3=":443"; ma=86400
cf-cache-status:
- DYNAMIC
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '134'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=31536000; includeSubDomains; preload
x-ratelimit-limit-requests:
- '30000'
x-ratelimit-limit-tokens:
- '150000000'
x-ratelimit-remaining-requests:
- '29999'
x-ratelimit-remaining-tokens:
- '149999990'
x-ratelimit-reset-requests:
- 2ms
x-ratelimit-reset-tokens:
- 0s
x-request-id:
- req_0ba738662e063fb55ea01793aafdcecc
status:
code: 200
message: OK
version: 1


@@ -0,0 +1,101 @@
interactions:
- request:
body: '{"messages": [{"role": "user", "content": "What is the capital of France?"}],
"model": "gpt-4.1", "stop": []}'
headers:
accept:
- application/json
accept-encoding:
- gzip, deflate
connection:
- keep-alive
content-length:
- '109'
content-type:
- application/json
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.68.2
x-stainless-arch:
- arm64
x-stainless-async:
- 'false'
x-stainless-lang:
- python
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.68.2
x-stainless-raw-response:
- 'true'
x-stainless-read-timeout:
- '600.0'
x-stainless-retry-count:
- '0'
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
- 3.11.12
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
body:
string: !!binary |
H4sIAAAAAAAAA4xSwYrbMBC9+yvEHEMcHMeh3dx2SwttoYTSW1nMRB7b2pUlIY27XZb8e5GdxM5u
C734MG/e83tP85IIAaqCnQDZIsvO6fRu//6Db26fnvTtx8/5t7v95rmT3y1+pc0XDcvIsIcHknxm
raTtnCZW1oyw9IRMUXX9rthu8pui2A5AZyvSkdY4TovVOs2zfJtmRbouTszWKkkBduJnIoQQL8M3
ejQV/YadyJbnSUchYEOwuywJAd7qOAEMQQVGw7CcQGkNkxls/2hJSHSKUQtbi08ejSShglgs9uhV
WCxWc6anug8YnZte6xmAxljGmHzwfH9CjheX2jbO20N4RYVaGRXa0hMGa6KjwNbBgB4TIe6HNvqr
gOC87RyXbB9p+N26GOVg6n8GnpoCtox6mudn0pVaWRGj0mHWJkiULVUTc6oe+0rZGZDMMr818zft
Mbcyzf/IT4CU5Jiq0nmqlLwOPK15itf5r7VLx4NhCOR/KUklK/LxHSqqsdfj3UB4DkxdWSvTkHde
jcdTuxJllh2yLMskJMfkDwAAAP//AwC4aq9JRgMAAA==
headers:
CF-RAY:
- 93458dcc1b9df53b-GRU
Connection:
- keep-alive
Content-Encoding:
- gzip
Content-Type:
- application/json
Date:
- Tue, 22 Apr 2025 13:44:06 GMT
Server:
- cloudflare
Transfer-Encoding:
- chunked
X-Content-Type-Options:
- nosniff
access-control-expose-headers:
- X-Request-ID
alt-svc:
- h3=":443"; ma=86400
cf-cache-status:
- DYNAMIC
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '296'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=31536000; includeSubDomains; preload
x-ratelimit-limit-requests:
- '10000'
x-ratelimit-limit-tokens:
- '30000000'
x-ratelimit-remaining-requests:
- '9999'
x-ratelimit-remaining-tokens:
- '29999989'
x-ratelimit-reset-requests:
- 6ms
x-ratelimit-reset-tokens:
- 0s
x-request-id:
- req_7b4e1628a0608e9547e71e7857a29b58
status:
code: 200
message: OK
version: 1

File diff suppressed because it is too large.


@@ -256,6 +256,52 @@ def test_validate_call_params_no_response_format():
    llm._validate_call_params()


@pytest.mark.vcr(filter_headers=["authorization"], filter_query_parameters=["key"])
@pytest.mark.parametrize(
    "model",
    [
        "gemini/gemini-2.0-flash-thinking-exp-01-21",
        "gemini/gemini-2.0-flash-001",
        "gemini/gemini-2.0-flash-lite-001",
        "gemini/gemini-2.5-flash-preview-04-17",
        "gemini/gemini-2.5-pro-exp-03-25",
    ],
)
def test_gemini_models(model):
    llm = LLM(model=model)
    result = llm.call("What is the capital of France?")
    assert isinstance(result, str)
    assert "Paris" in result


@pytest.mark.vcr(filter_headers=["authorization"], filter_query_parameters=["key"])
@pytest.mark.parametrize(
    "model",
    [
        "gemini/gemma-3-1b-it",
        "gemini/gemma-3-4b-it",
        "gemini/gemma-3-12b-it",
        "gemini/gemma-3-27b-it",
    ],
)
def test_gemma3(model):
    llm = LLM(model=model)
    result = llm.call("What is the capital of France?")
    assert isinstance(result, str)
    assert "Paris" in result


@pytest.mark.vcr(filter_headers=["authorization"])
@pytest.mark.parametrize(
    "model", ["gpt-4.1", "gpt-4.1-mini-2025-04-14", "gpt-4.1-nano-2025-04-14"]
)
def test_gpt_4_1(model):
    llm = LLM(model=model)
    result = llm.call("What is the capital of France?")
    assert isinstance(result, str)
    assert "Paris" in result


@pytest.mark.vcr(filter_headers=["authorization"])
def test_o3_mini_reasoning_effort_high():
    llm = LLM(

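The `filter_headers` and `filter_query_parameters` arguments on the markers above are what keep credentials out of the cassettes recorded earlier in this diff: the OpenAI calls carry a bearer token in the `authorization` header, while the Gemini/Gemma endpoints pass the API key as a `?key=` query parameter. pytest-recording also accepts the same options project-wide through its conventional `vcr_config` fixture; a minimal sketch, assuming that plugin (this fixture is an illustration, not part of the changeset):

```python
import pytest


@pytest.fixture(scope="module")
def vcr_config():
    # Same scrubbing as the per-test markers above, applied to every cassette
    # recorded in this module: drop bearer tokens and the Gemini ?key= parameter.
    return {
        "filter_headers": ["authorization"],
        "filter_query_parameters": ["key"],
    }
```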

@@ -0,0 +1,49 @@
import importlib
import sys
from unittest.mock import MagicMock, patch

import pytest


class TestEmbeddingConfiguratorImports:
    """Test that ChromaDB is not imported at module level."""

    def test_no_chromadb_import_at_module_level(self):
        """Test that chromadb is not imported when the module is imported."""
        for module_name in list(sys.modules.keys()):
            if module_name.startswith('crewai.utilities.embedding_configurator'):
                del sys.modules[module_name]

        mock_chromadb = MagicMock()
        chromadb_imported = [False]

        def mock_import(name, *args, **kwargs):
            if name == 'chromadb':
                chromadb_imported[0] = True
                return mock_chromadb
            return importlib.__import__(name, *args, **kwargs)

        with patch('builtins.__import__', side_effect=mock_import):
            from crewai.utilities import embedding_configurator

        assert not chromadb_imported[0], "chromadb was imported at module level"

    def test_chromadb_import_in_configure_embedder(self):
        """Test that chromadb is imported when configure_embedder is called."""
        for module_name in list(sys.modules.keys()):
            if module_name.startswith('crewai.utilities.embedding_configurator'):
                del sys.modules[module_name]

        from crewai.utilities.embedding_configurator import EmbeddingConfigurator

        mock_chromadb = MagicMock()

        def mock_import(name, *args, **kwargs):
            if name == 'chromadb':
                raise ImportError("Mock import error for chromadb")
            return importlib.__import__(name, *args, **kwargs)

        with patch('builtins.__import__', side_effect=mock_import):
            with pytest.raises(ImportError, match="Mock import error for chromadb"):
                EmbeddingConfigurator().configure_embedder()

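For context, a minimal sketch of the lazy-import pattern these two tests exercise. This is an illustration only, not the actual crewai source; the `embedder_config` parameter and the `DefaultEmbeddingFunction` fallback are assumptions. The point is that `import chromadb` happens inside `configure_embedder`, so merely importing the module can no longer trigger chromadb's SQLite3 version check:

```python
class EmbeddingConfigurator:
    """Sketch of a configurator whose chromadb dependency is resolved lazily."""

    def configure_embedder(self, embedder_config=None):
        # Deferred import: evaluated (or mocked, as in the tests above) only
        # when an embedder is actually requested, never at module import time.
        import chromadb  # assumed lazy import, matching what the tests intercept
        from chromadb.utils import embedding_functions

        # Hypothetical default; a real configurator would dispatch on embedder_config.
        return embedding_functions.DefaultEmbeddingFunction()
```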

@@ -29,7 +29,7 @@ def mock_knowledge_source():
"""
return StringKnowledgeSource(content=content)
@patch('crewai.knowledge.storage.knowledge_storage.chromadb')
@patch('chromadb.PersistentClient')
def test_knowledge_included_in_planning(mock_chroma):
"""Test that verifies knowledge sources are properly included in planning."""
# Mock ChromaDB collection


@@ -100,7 +100,7 @@ class InternalCrewPlanner:
        # Knowledge field should not be present when empty
        assert '"agent_knowledge"' not in tasks_summary

    @patch('crewai.knowledge.storage.knowledge_storage.chromadb')
    @patch('chromadb.PersistentClient')
    def test_create_tasks_summary_with_knowledge_and_tools(self, mock_chroma):
        """Test task summary generation with both knowledge and tools present."""
        # Mock ChromaDB collection

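The two one-line hunks above presumably swap the patch target for the same reason: once the chromadb import is deferred, `knowledge_storage` no longer exposes a module-level `chromadb` attribute to patch, so the tests patch the library class directly. A hedged sketch of that approach (the test name and the `/tmp/kb` path are illustrative):

```python
from unittest.mock import MagicMock, patch


@patch("chromadb.PersistentClient")  # patch the library class, not a module attribute
def test_storage_with_mocked_chromadb(mock_persistent_client):
    mock_persistent_client.return_value.get_or_create_collection.return_value = MagicMock()

    import chromadb  # a lazy import inside the storage code resolves to the patched class

    client = chromadb.PersistentClient(path="/tmp/kb")
    assert client is mock_persistent_client.return_value
```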
uv.lock (generated, 438 lines changed)

@@ -1,32 +1,18 @@
version = 1
revision = 1
requires-python = ">=3.10, <3.13"
resolution-markers = [
"python_full_version < '3.11' and platform_python_implementation == 'PyPy' and sys_platform == 'darwin'",
"python_full_version < '3.11' and platform_python_implementation != 'PyPy' and sys_platform == 'darwin'",
"python_version < '0'",
"python_full_version < '3.11' and platform_machine == 'aarch64' and platform_python_implementation == 'PyPy' and sys_platform == 'linux'",
"python_full_version < '3.11' and platform_machine == 'aarch64' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'",
"(python_full_version < '3.11' and platform_machine != 'aarch64' and platform_python_implementation == 'PyPy' and sys_platform == 'linux') or (python_full_version < '3.11' and platform_python_implementation == 'PyPy' and sys_platform != 'darwin' and sys_platform != 'linux')",
"(python_full_version < '3.11' and platform_machine != 'aarch64' and platform_python_implementation != 'PyPy' and sys_platform == 'linux') or (python_full_version < '3.11' and platform_python_implementation != 'PyPy' and sys_platform != 'darwin' and sys_platform != 'linux')",
"python_full_version == '3.11.*' and platform_python_implementation == 'PyPy' and sys_platform == 'darwin'",
"python_full_version == '3.11.*' and platform_python_implementation != 'PyPy' and sys_platform == 'darwin'",
"python_full_version == '3.11.*' and platform_machine == 'aarch64' and platform_python_implementation == 'PyPy' and sys_platform == 'linux'",
"python_full_version == '3.11.*' and platform_machine == 'aarch64' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'",
"(python_full_version == '3.11.*' and platform_machine != 'aarch64' and platform_python_implementation == 'PyPy' and sys_platform == 'linux') or (python_full_version == '3.11.*' and platform_python_implementation == 'PyPy' and sys_platform != 'darwin' and sys_platform != 'linux')",
"(python_full_version == '3.11.*' and platform_machine != 'aarch64' and platform_python_implementation != 'PyPy' and sys_platform == 'linux') or (python_full_version == '3.11.*' and platform_python_implementation != 'PyPy' and sys_platform != 'darwin' and sys_platform != 'linux')",
"python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_python_implementation == 'PyPy' and sys_platform == 'darwin'",
"python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'darwin'",
"python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_machine == 'aarch64' and platform_python_implementation == 'PyPy' and sys_platform == 'linux'",
"python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_machine == 'aarch64' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'",
"(python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_machine != 'aarch64' and platform_python_implementation == 'PyPy' and sys_platform == 'linux') or (python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_python_implementation == 'PyPy' and sys_platform != 'darwin' and sys_platform != 'linux')",
"(python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_machine != 'aarch64' and platform_python_implementation != 'PyPy' and sys_platform == 'linux') or (python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform != 'darwin' and sys_platform != 'linux')",
"python_full_version >= '3.12.4' and platform_python_implementation == 'PyPy' and sys_platform == 'darwin'",
"python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'darwin'",
"python_full_version >= '3.12.4' and platform_machine == 'aarch64' and platform_python_implementation == 'PyPy' and sys_platform == 'linux'",
"python_full_version >= '3.12.4' and platform_machine == 'aarch64' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'",
"(python_full_version >= '3.12.4' and platform_machine != 'aarch64' and platform_python_implementation == 'PyPy' and sys_platform == 'linux') or (python_full_version >= '3.12.4' and platform_python_implementation == 'PyPy' and sys_platform != 'darwin' and sys_platform != 'linux')",
"(python_full_version >= '3.12.4' and platform_machine != 'aarch64' and platform_python_implementation != 'PyPy' and sys_platform == 'linux') or (python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform != 'darwin' and sys_platform != 'linux')",
"python_full_version < '3.11' and sys_platform == 'darwin'",
"python_full_version < '3.11' and platform_machine == 'aarch64' and sys_platform == 'linux'",
"(python_full_version < '3.11' and platform_machine != 'aarch64' and sys_platform == 'linux') or (python_full_version < '3.11' and sys_platform != 'darwin' and sys_platform != 'linux')",
"python_full_version == '3.11.*' and sys_platform == 'darwin'",
"python_full_version == '3.11.*' and platform_machine == 'aarch64' and sys_platform == 'linux'",
"(python_full_version == '3.11.*' and platform_machine != 'aarch64' and sys_platform == 'linux') or (python_full_version == '3.11.*' and sys_platform != 'darwin' and sys_platform != 'linux')",
"python_full_version >= '3.12' and python_full_version < '3.12.4' and sys_platform == 'darwin'",
"python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_machine == 'aarch64' and sys_platform == 'linux'",
"(python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_machine != 'aarch64' and sys_platform == 'linux') or (python_full_version >= '3.12' and python_full_version < '3.12.4' and sys_platform != 'darwin' and sys_platform != 'linux')",
"python_full_version >= '3.12.4' and sys_platform == 'darwin'",
"python_full_version >= '3.12.4' and platform_machine == 'aarch64' and sys_platform == 'linux'",
"(python_full_version >= '3.12.4' and platform_machine != 'aarch64' and sys_platform == 'linux') or (python_full_version >= '3.12.4' and sys_platform != 'darwin' and sys_platform != 'linux')",
]
[[package]]
@@ -128,18 +114,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/76/ac/a7305707cb852b7e16ff80eaf5692309bde30e2b1100a1fcacdc8f731d97/aiosignal-1.3.1-py3-none-any.whl", hash = "sha256:f8376fb07dd1e86a584e4fcdec80b36b7f81aac666ebc724e2c090300dd83b17", size = 7617 },
]
[[package]]
name = "aisuite"
version = "0.1.10"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "httpx" },
]
sdist = { url = "https://files.pythonhosted.org/packages/6a/9d/c7a8a76abb9011dd2bc9a5cb8ffa8231640e20bbdae177ce9ab6cb67c66c/aisuite-0.1.10.tar.gz", hash = "sha256:170e62d4c91fecb22e82a04e058154a111cef473681171e5df7346272e77f414", size = 29052 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/58/c2/9a34a01516de107e5f9406dbfd319b6004340708101d67fa107373da4058/aisuite-0.1.10-py3-none-any.whl", hash = "sha256:c8510ebe38d6546b6a06819171e201fcaf0bf9ae020ffcfe19b6bd90430781ad", size = 43984 },
]
[[package]]
name = "alembic"
version = "1.13.3"
@@ -566,14 +540,14 @@ wheels = [
[[package]]
name = "click"
version = "8.1.8"
version = "8.1.7"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "colorama", marker = "sys_platform == 'win32'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/b9/2e/0090cbf739cee7d23781ad4b89a9894a41538e4fcf4c31dcdd705b78eb8b/click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a", size = 226593 }
sdist = { url = "https://files.pythonhosted.org/packages/96/d3/f04c7bfcf5c1862a2a5b845c6b2b360488cf47af55dfa79c98f6a6bf98b5/click-8.1.7.tar.gz", hash = "sha256:ca9853ad459e787e2192211578cc907e7594e294c7ccc834310722b41b9ca6de", size = 336121 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/7e/d4/7ebdbd03970677812aac39c869717059dbb71a4cfc033ca6e5221787892c/click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2", size = 98188 },
{ url = "https://files.pythonhosted.org/packages/00/2e/d53fa4befbf2cfa713304affc7ca780ce4fc1fd8710527771b58311a3229/click-8.1.7-py3-none-any.whl", hash = "sha256:ae74fb96c20a0277a1d615f1e4d73c8414f5a98db8b799a7931d1582f3390c28", size = 97941 },
]
[[package]]
@@ -620,7 +594,7 @@ wheels = [
[[package]]
name = "crewai"
version = "0.114.0"
version = "0.86.0"
source = { editable = "." }
dependencies = [
{ name = "appdirs" },
@@ -630,7 +604,6 @@ dependencies = [
{ name = "click" },
{ name = "instructor" },
{ name = "json-repair" },
{ name = "json5" },
{ name = "jsonref" },
{ name = "litellm" },
{ name = "openai" },
@@ -652,15 +625,9 @@ dependencies = [
agentops = [
{ name = "agentops" },
]
aisuite = [
{ name = "aisuite" },
]
docling = [
{ name = "docling" },
]
embeddings = [
{ name = "tiktoken" },
]
fastembed = [
{ name = "fastembed" },
]
@@ -693,8 +660,8 @@ dev = [
{ name = "pre-commit" },
{ name = "pytest" },
{ name = "pytest-asyncio" },
{ name = "pytest-recording" },
{ name = "pytest-subprocess" },
{ name = "pytest-vcr" },
{ name = "python-dotenv" },
{ name = "ruff" },
]
@@ -702,27 +669,25 @@ dev = [
[package.metadata]
requires-dist = [
{ name = "agentops", marker = "extra == 'agentops'", specifier = ">=0.3.0" },
{ name = "aisuite", marker = "extra == 'aisuite'", specifier = ">=0.1.10" },
{ name = "appdirs", specifier = ">=1.4.4" },
{ name = "auth0-python", specifier = ">=4.7.1" },
{ name = "blinker", specifier = ">=1.9.0" },
{ name = "chromadb", specifier = ">=0.5.23" },
{ name = "click", specifier = ">=8.1.7" },
{ name = "crewai-tools", marker = "extra == 'tools'", specifier = "~=0.40.1" },
{ name = "crewai-tools", marker = "extra == 'tools'", specifier = ">=0.17.0" },
{ name = "docling", marker = "extra == 'docling'", specifier = ">=2.12.0" },
{ name = "fastembed", marker = "extra == 'fastembed'", specifier = ">=0.4.1" },
{ name = "instructor", specifier = ">=1.3.3" },
{ name = "json-repair", specifier = ">=0.25.2" },
{ name = "json5", specifier = ">=0.10.0" },
{ name = "jsonref", specifier = ">=1.1.0" },
{ name = "litellm", specifier = "==1.60.2" },
{ name = "litellm", specifier = ">=1.44.22" },
{ name = "mem0ai", marker = "extra == 'mem0'", specifier = ">=0.1.29" },
{ name = "openai", specifier = ">=1.13.3" },
{ name = "openpyxl", specifier = ">=3.1.5" },
{ name = "openpyxl", marker = "extra == 'openpyxl'", specifier = ">=3.1.5" },
{ name = "opentelemetry-api", specifier = ">=1.30.0" },
{ name = "opentelemetry-exporter-otlp-proto-http", specifier = ">=1.30.0" },
{ name = "opentelemetry-sdk", specifier = ">=1.30.0" },
{ name = "opentelemetry-api", specifier = ">=1.22.0" },
{ name = "opentelemetry-exporter-otlp-proto-http", specifier = ">=1.22.0" },
{ name = "opentelemetry-sdk", specifier = ">=1.22.0" },
{ name = "pandas", marker = "extra == 'pandas'", specifier = ">=2.2.3" },
{ name = "pdfplumber", specifier = ">=0.11.4" },
{ name = "pdfplumber", marker = "extra == 'pdfplumber'", specifier = ">=0.11.4" },
@@ -730,12 +695,10 @@ requires-dist = [
{ name = "python-dotenv", specifier = ">=1.0.0" },
{ name = "pyvis", specifier = ">=0.3.2" },
{ name = "regex", specifier = ">=2024.9.11" },
{ name = "tiktoken", marker = "extra == 'embeddings'", specifier = "~=0.7.0" },
{ name = "tomli", specifier = ">=2.0.2" },
{ name = "tomli-w", specifier = ">=1.1.0" },
{ name = "uv", specifier = ">=0.4.25" },
]
provides-extras = ["tools", "embeddings", "agentops", "fastembed", "pdfplumber", "pandas", "openpyxl", "mem0", "docling", "aisuite"]
[package.metadata.requires-dev]
dev = [
@@ -750,32 +713,34 @@ dev = [
{ name = "pre-commit", specifier = ">=3.6.0" },
{ name = "pytest", specifier = ">=8.0.0" },
{ name = "pytest-asyncio", specifier = ">=0.23.7" },
{ name = "pytest-recording", specifier = ">=0.13.2" },
{ name = "pytest-subprocess", specifier = ">=1.5.2" },
{ name = "pytest-vcr", specifier = ">=1.0.2" },
{ name = "python-dotenv", specifier = ">=1.0.0" },
{ name = "ruff", specifier = ">=0.8.2" },
]
[[package]]
name = "crewai-tools"
version = "0.40.1"
version = "0.17.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "beautifulsoup4" },
{ name = "chromadb" },
{ name = "click" },
{ name = "crewai" },
{ name = "docker" },
{ name = "docx2txt" },
{ name = "embedchain" },
{ name = "lancedb" },
{ name = "openai" },
{ name = "pydantic" },
{ name = "pyright" },
{ name = "pytest" },
{ name = "pytube" },
{ name = "requests" },
{ name = "selenium" },
]
sdist = { url = "https://files.pythonhosted.org/packages/16/ff/0c16c9943ec1501b12fc72aca7815f191ffe94d5f1fe4e9c353ee8c4ad1d/crewai_tools-0.40.1.tar.gz", hash = "sha256:6af5040b2277df8fd592238a17bf584f95dcc9ef7766236534999c8a9e9d0b52", size = 744094 }
sdist = { url = "https://files.pythonhosted.org/packages/cc/15/365f74e0e8313e7a3399bf01d908aa73575c823275f9196ec14c23159878/crewai_tools-0.17.0.tar.gz", hash = "sha256:2a2986000775c76bad45b9f3a2be857d293cf5daffe5f316abc052e630b1e5ce", size = 818983 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/35/05/619c00bae2dda038f0d218dd5197120c938e9c9ccef1b9e50cfb037486f6/crewai_tools-0.40.1-py3-none-any.whl", hash = "sha256:8f459f74dee64364bfdbc524c815c4afcfb9ed532b51e6b8b4f616398d46cf1e", size = 573286 },
{ url = "https://files.pythonhosted.org/packages/f4/1d/976adc2a4e5237cb03625de412cd051dea7d524084ed442adedfda871526/crewai_tools-0.17.0-py3-none-any.whl", hash = "sha256:85cf15286684ecad579b5a497888c6bf8a079ca443f7dd63a52bf1709655e4a3", size = 467975 },
]
[[package]]
@@ -1066,6 +1031,12 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/d5/7c/e9fcff7623954d86bdc17782036cbf715ecab1bec4847c008557affe1ca8/docstring_parser-0.16-py3-none-any.whl", hash = "sha256:bf0a1387354d3691d102edef7ec124f219ef639982d096e26e3b60aeffa90637", size = 36533 },
]
[[package]]
name = "docx2txt"
version = "0.8"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/7d/7d/60ee3f2b16d9bfdfa72e8599470a2c1a5b759cb113c6fe1006be28359327/docx2txt-0.8.tar.gz", hash = "sha256:2c06d98d7cfe2d3947e5760a57d924e3ff07745b379c8737723922e7009236e5", size = 2814 }
[[package]]
name = "durationpy"
version = "0.9"
@@ -1623,42 +1594,39 @@ wheels = [
[[package]]
name = "grpcio-tools"
version = "1.67.0"
version = "1.62.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "grpcio" },
{ name = "protobuf" },
{ name = "setuptools" },
]
sdist = { url = "https://files.pythonhosted.org/packages/e7/f8/62e15867651b72f6f95313e21d81f5f1c210b69a4cc664aecf52ec4c8a53/grpcio_tools-1.67.0.tar.gz", hash = "sha256:181b3d4e61b83142c182ec366f3079b0023509743986e54c9465ca38cac255f8", size = 5159163 }
sdist = { url = "https://files.pythonhosted.org/packages/54/fa/b69bd8040eafc09b88bb0ec0fea59e8aacd1a801e688af087cead213b0d0/grpcio-tools-1.62.3.tar.gz", hash = "sha256:7c7136015c3d62c3eef493efabaf9e3380e3e66d24ee8e94c01cb71377f57833", size = 4538520 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/91/9d/7608eb89b41433a49dbf96f56d9c05b3a5ba08951702d54c368d370ab6aa/grpcio_tools-1.67.0-cp310-cp310-linux_armv7l.whl", hash = "sha256:12aa38af76b5ef00a55808c7c374ed18d5dc7cc8081b717e56da3c50df1776e2", size = 2308120 },
{ url = "https://files.pythonhosted.org/packages/93/f2/d8cbc35e63bba98e4352427d01c64801fef9e9d9cd7fc5eea0538128e0e6/grpcio_tools-1.67.0-cp310-cp310-macosx_12_0_universal2.whl", hash = "sha256:b0b03d055127bbc7c629454804b53b5cad2cedfcf904576d159a8a04c22b8e66", size = 5500124 },
{ url = "https://files.pythonhosted.org/packages/eb/b5/131d0eac92205d0ae3d3f7eecf655884ef7746aecac5a93520fb83d907d0/grpcio_tools-1.67.0-cp310-cp310-manylinux_2_17_aarch64.whl", hash = "sha256:02b0b50c59a8f7428326197027a2f586d216c46138c547f861533c46bff78bfe", size = 2282058 },
{ url = "https://files.pythonhosted.org/packages/3f/3f/5e4de8d7fe38e8e42567a49a39f77d67e2905b00c69165e2e88f9d3005ac/grpcio_tools-1.67.0-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b2afdfe151ed9edbd4a3fd646716f83b58010769c57f9c0aa1cf4c3bfb1240a8", size = 2617363 },
{ url = "https://files.pythonhosted.org/packages/eb/53/3eb4eb7c178a229676d1ff0bcda640ebc0a104d12cdbd884f6796d118c56/grpcio_tools-1.67.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fc3eeb87575b2b360c5ef5aef22eb76cfdd6a255d2f628a48ab0e5a61a0039fb", size = 2416026 },
{ url = "https://files.pythonhosted.org/packages/a6/9a/9c584d21ed1fb8f7adac6135a569c9b3b1378b6b467fba8d94d14de70606/grpcio_tools-1.67.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:ead78089c4771605a1ff8894e47f2267440693f1beeee06fd5a788aede83370f", size = 3224904 },
{ url = "https://files.pythonhosted.org/packages/93/6a/dab92a7aa1bae0d2e0735462fbde778011916e5124d7ee9b52d214f42552/grpcio_tools-1.67.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:0671dcdccef09ca4eb415c1d6f470a857c6486733c146676f6810a3ade1d42cb", size = 2870381 },
{ url = "https://files.pythonhosted.org/packages/49/be/3f2c958ef65161f3eeae5a1013358ca3c2eab25174ec4fc8d46b6d6146c8/grpcio_tools-1.67.0-cp310-cp310-win32.whl", hash = "sha256:a7398d90b8c7da479aec8f853d3664d5a93c209f8ac3bd41cb7ae4e8677a45c6", size = 941140 },
{ url = "https://files.pythonhosted.org/packages/17/e9/461db9af08badc647659fa4a382ab546981ebccb413fc625e4b7c0413305/grpcio_tools-1.67.0-cp310-cp310-win_amd64.whl", hash = "sha256:f7e7d70a74df7e07be7cceaa694b7e8e5f3bef8e0299906f60885ecf7a40adb4", size = 1091151 },
{ url = "https://files.pythonhosted.org/packages/cd/0d/88f181eecef84c9c8c009fa4d49ce812a5717539b75aacea4a7be8b9587c/grpcio_tools-1.67.0-cp311-cp311-linux_armv7l.whl", hash = "sha256:655716bf931a22a090134d87953710033640996d31e36f5f9b0106ff5f552d8e", size = 2307990 },
{ url = "https://files.pythonhosted.org/packages/de/22/94855e18588800c96eca95af3be918249f635e4635e3e46895949b0ca27e/grpcio_tools-1.67.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:484ae782f9d3ff58e0bbb2f4cad14d5f5d9132fc701835b1dffd2c2a06f73ba6", size = 5526488 },
{ url = "https://files.pythonhosted.org/packages/c3/c7/086f6c287fed85c2a5e19cb457a42a0eae2df9534666ed252947170daf8e/grpcio_tools-1.67.0-cp311-cp311-manylinux_2_17_aarch64.whl", hash = "sha256:f3e34de876efe1273f91e25ef241e449ed7f9411472dd9ff56d2039618017c30", size = 2282139 },
{ url = "https://files.pythonhosted.org/packages/40/1a/d8e2171ef7b5b1fda54fa2dc82807725c9e31dd6b4878e9d68ab8f3c48b7/grpcio_tools-1.67.0-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d8301719edde2c3d388995703cdd962f558b76e9750405f772dce61402e4c3d0", size = 2617333 },
{ url = "https://files.pythonhosted.org/packages/08/e8/e2b0a3e5890ad650d0cc9d92227f87a407784a9fc110438b85d01cf1ec71/grpcio_tools-1.67.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1629ea246044ccd473d9ac4c9f137a440d830b5e08d35225e1b354dbbb15b75d", size = 2415805 },
{ url = "https://files.pythonhosted.org/packages/6a/43/a1731299e1662c24d89795a8ae4bb725f4a8a0c8e2dc6e12d3276eb96e14/grpcio_tools-1.67.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:d77a3c5cec0065267ca1a0b2589ececd1277ce25aa67f13ec50c816ee6f26f7f", size = 3224764 },
{ url = "https://files.pythonhosted.org/packages/5d/03/968dd4b8de9ec4c6d287a8ba8b844f515a2cfcb350acefdb1fcb6f2945d9/grpcio_tools-1.67.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:c9bf992bcc7d9e6eaa20705056e1b955593092a38cec1746fef389d873ab2056", size = 2870440 },
{ url = "https://files.pythonhosted.org/packages/9f/ea/e6bb028fec6f37aace620bd0a68e7c369bc975ece940dd3de08a2ef66edc/grpcio_tools-1.67.0-cp311-cp311-win32.whl", hash = "sha256:7e6e3db119c38629e0767cdb2ee18726ecc87e2249117d4c9e7ce06ea8c894ea", size = 940888 },
{ url = "https://files.pythonhosted.org/packages/e5/26/b6f98fc9c1e6b8fa5b676bbb07e2bc70f388d4c513140fa38ffa9a15b934/grpcio_tools-1.67.0-cp311-cp311-win_amd64.whl", hash = "sha256:c6c27aec301a0e6cf231f9ee1c467c64002af51170fa7c0f3bb10bbfcd03fee7", size = 1091094 },
{ url = "https://files.pythonhosted.org/packages/d6/b6/57e67c0244db8d7c0c312041293b806bfb1c9d66c26159e6faf39cc10356/grpcio_tools-1.67.0-cp312-cp312-linux_armv7l.whl", hash = "sha256:dca7f053cbdb26a587d4410ddb893877c585fb60a31f22fdd128e4f7c4dab27c", size = 2307646 },
{ url = "https://files.pythonhosted.org/packages/52/43/837f08b85b04ac225aebe1d7da1a7a79fc313f231306c865b5112cef7dc4/grpcio_tools-1.67.0-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:de8c4f68ffa690769d84329c17c7fdd5fbe4c61b8f8a0de03f1ad8ef8bb06963", size = 5525447 },
{ url = "https://files.pythonhosted.org/packages/3e/5f/adb8b87f5c403ba53529b6645184beddfa63abf2c524a6dabaa430e6bab3/grpcio_tools-1.67.0-cp312-cp312-manylinux_2_17_aarch64.whl", hash = "sha256:6e4ecb24c27a78f09fead45d4ed873805d6026124ccb6793b6fb93a490b78ddf", size = 2281767 },
{ url = "https://files.pythonhosted.org/packages/6e/cd/3d6a7971e28b96cb618abb281325517443744ecfe48aa03f27a17cd5d4e1/grpcio_tools-1.67.0-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:004d6ef1b5f724480f05c0bdc904bf8c78c43d633c537d99abe51b52ce0cadeb", size = 2617363 },
{ url = "https://files.pythonhosted.org/packages/2d/a9/b8f1eae3db0f1b6f9548bd2032f48cb6f1ec9bc6781436d52dff4b352fab/grpcio_tools-1.67.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9dd257072c86eb9b36791b3674a513a215ba76bbdd38fc228f0e8c6dc5ce3524", size = 2415322 },
{ url = "https://files.pythonhosted.org/packages/9b/fc/0045bf2e5c97a5ffe0ff2c9a4e4a62894402e8d7094162c2084a809c9d1c/grpcio_tools-1.67.0-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:a8cca551317ed26e17d13b6ee27b2bd62f5fe9b3842b4e88389deb984f995848", size = 3225044 },
{ url = "https://files.pythonhosted.org/packages/dc/73/eaf40958dd648dd98a0fbd30df2b51c5beb7ee24127c1f0bb99ea44fd435/grpcio_tools-1.67.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:a7ac3b4f837c693142f6688b629d1f6408f6ab250d927159b572555f5339fe25", size = 2870418 },
{ url = "https://files.pythonhosted.org/packages/b4/77/e307e91816123444ff657bbae2269cb912f31a9390118ed371bde9d0c1f3/grpcio_tools-1.67.0-cp312-cp312-win32.whl", hash = "sha256:95feec33388e2a8f72c360a68efe6f0bfed9c771e94d21b7f2359d0010f60219", size = 940540 },
{ url = "https://files.pythonhosted.org/packages/be/2a/0c1a64e88fbc17235b68d3178be6cf4a69aea5bd1deed683c0bbd2f5e9f9/grpcio_tools-1.67.0-cp312-cp312-win_amd64.whl", hash = "sha256:50a31d035193ebe7154181eac84734e25bdcdb36adba849d3b2adf1c3b0c382b", size = 1090427 },
{ url = "https://files.pythonhosted.org/packages/ff/eb/eb0a3aa9480c3689d31fd2ad536df6a828e97a60f667c8a93d05bdf07150/grpcio_tools-1.62.3-cp310-cp310-macosx_12_0_universal2.whl", hash = "sha256:2f968b049c2849540751ec2100ab05e8086c24bead769ca734fdab58698408c1", size = 5117556 },
{ url = "https://files.pythonhosted.org/packages/f3/fb/8be3dda485f7fab906bfa02db321c3ecef953a87cdb5f6572ca08b187bcb/grpcio_tools-1.62.3-cp310-cp310-manylinux_2_17_aarch64.whl", hash = "sha256:0a8c0c4724ae9c2181b7dbc9b186df46e4f62cb18dc184e46d06c0ebeccf569e", size = 2719330 },
{ url = "https://files.pythonhosted.org/packages/63/de/6978f8d10066e240141cd63d1fbfc92818d96bb53427074f47a8eda921e1/grpcio_tools-1.62.3-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5782883a27d3fae8c425b29a9d3dcf5f47d992848a1b76970da3b5a28d424b26", size = 3070818 },
{ url = "https://files.pythonhosted.org/packages/74/34/bb8f816893fc73fd6d830e895e8638d65d13642bb7a434f9175c5ca7da11/grpcio_tools-1.62.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f3d812daffd0c2d2794756bd45a353f89e55dc8f91eb2fc840c51b9f6be62667", size = 2804993 },
{ url = "https://files.pythonhosted.org/packages/78/60/b2198d7db83293cdb9760fc083f077c73e4c182da06433b3b157a1567d06/grpcio_tools-1.62.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:b47d0dda1bdb0a0ba7a9a6de88e5a1ed61f07fad613964879954961e36d49193", size = 3684915 },
{ url = "https://files.pythonhosted.org/packages/61/20/56dbdc4ecb14d42a03cd164ff45e6e84572bbe61ee59c50c39f4d556a8d5/grpcio_tools-1.62.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:ca246dffeca0498be9b4e1ee169b62e64694b0f92e6d0be2573e65522f39eea9", size = 3297482 },
{ url = "https://files.pythonhosted.org/packages/4a/dc/e417a313c905744ce8cedf1e1edd81c41dc45ff400ae1c45080e18f26712/grpcio_tools-1.62.3-cp310-cp310-win32.whl", hash = "sha256:6a56d344b0bab30bf342a67e33d386b0b3c4e65868ffe93c341c51e1a8853ca5", size = 909793 },
{ url = "https://files.pythonhosted.org/packages/d9/69/75e7ebfd8d755d3e7be5c6d1aa6d13220f5bba3a98965e4b50c329046777/grpcio_tools-1.62.3-cp310-cp310-win_amd64.whl", hash = "sha256:710fecf6a171dcbfa263a0a3e7070e0df65ba73158d4c539cec50978f11dad5d", size = 1052459 },
{ url = "https://files.pythonhosted.org/packages/23/52/2dfe0a46b63f5ebcd976570aa5fc62f793d5a8b169e211c6a5aede72b7ae/grpcio_tools-1.62.3-cp311-cp311-macosx_10_10_universal2.whl", hash = "sha256:703f46e0012af83a36082b5f30341113474ed0d91e36640da713355cd0ea5d23", size = 5147623 },
{ url = "https://files.pythonhosted.org/packages/f0/2e/29fdc6c034e058482e054b4a3c2432f84ff2e2765c1342d4f0aa8a5c5b9a/grpcio_tools-1.62.3-cp311-cp311-manylinux_2_17_aarch64.whl", hash = "sha256:7cc83023acd8bc72cf74c2edbe85b52098501d5b74d8377bfa06f3e929803492", size = 2719538 },
{ url = "https://files.pythonhosted.org/packages/f9/60/abe5deba32d9ec2c76cdf1a2f34e404c50787074a2fee6169568986273f1/grpcio_tools-1.62.3-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7ff7d58a45b75df67d25f8f144936a3e44aabd91afec833ee06826bd02b7fbe7", size = 3070964 },
{ url = "https://files.pythonhosted.org/packages/bc/ad/e2b066684c75f8d9a48508cde080a3a36618064b9cadac16d019ca511444/grpcio_tools-1.62.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7f2483ea232bd72d98a6dc6d7aefd97e5bc80b15cd909b9e356d6f3e326b6e43", size = 2805003 },
{ url = "https://files.pythonhosted.org/packages/9c/3f/59bf7af786eae3f9d24ee05ce75318b87f541d0950190ecb5ffb776a1a58/grpcio_tools-1.62.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:962c84b4da0f3b14b3cdb10bc3837ebc5f136b67d919aea8d7bb3fd3df39528a", size = 3685154 },
{ url = "https://files.pythonhosted.org/packages/f1/79/4dd62478b91e27084c67b35a2316ce8a967bd8b6cb8d6ed6c86c3a0df7cb/grpcio_tools-1.62.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:8ad0473af5544f89fc5a1ece8676dd03bdf160fb3230f967e05d0f4bf89620e3", size = 3297942 },
{ url = "https://files.pythonhosted.org/packages/b8/cb/86449ecc58bea056b52c0b891f26977afc8c4464d88c738f9648da941a75/grpcio_tools-1.62.3-cp311-cp311-win32.whl", hash = "sha256:db3bc9fa39afc5e4e2767da4459df82b095ef0cab2f257707be06c44a1c2c3e5", size = 910231 },
{ url = "https://files.pythonhosted.org/packages/45/a4/9736215e3945c30ab6843280b0c6e1bff502910156ea2414cd77fbf1738c/grpcio_tools-1.62.3-cp311-cp311-win_amd64.whl", hash = "sha256:e0898d412a434e768a0c7e365acabe13ff1558b767e400936e26b5b6ed1ee51f", size = 1052496 },
{ url = "https://files.pythonhosted.org/packages/2a/a5/d6887eba415ce318ae5005e8dfac3fa74892400b54b6d37b79e8b4f14f5e/grpcio_tools-1.62.3-cp312-cp312-macosx_10_10_universal2.whl", hash = "sha256:d102b9b21c4e1e40af9a2ab3c6d41afba6bd29c0aa50ca013bf85c99cdc44ac5", size = 5147690 },
{ url = "https://files.pythonhosted.org/packages/8a/7c/3cde447a045e83ceb4b570af8afe67ffc86896a2fe7f59594dc8e5d0a645/grpcio_tools-1.62.3-cp312-cp312-manylinux_2_17_aarch64.whl", hash = "sha256:0a52cc9444df978438b8d2332c0ca99000521895229934a59f94f37ed896b133", size = 2720538 },
{ url = "https://files.pythonhosted.org/packages/88/07/f83f2750d44ac4f06c07c37395b9c1383ef5c994745f73c6bfaf767f0944/grpcio_tools-1.62.3-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:141d028bf5762d4a97f981c501da873589df3f7e02f4c1260e1921e565b376fa", size = 3071571 },
{ url = "https://files.pythonhosted.org/packages/37/74/40175897deb61e54aca716bc2e8919155b48f33aafec8043dda9592d8768/grpcio_tools-1.62.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47a5c093ab256dec5714a7a345f8cc89315cb57c298b276fa244f37a0ba507f0", size = 2806207 },
{ url = "https://files.pythonhosted.org/packages/ec/ee/d8de915105a217cbcb9084d684abdc032030dcd887277f2ef167372287fe/grpcio_tools-1.62.3-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:f6831fdec2b853c9daa3358535c55eed3694325889aa714070528cf8f92d7d6d", size = 3685815 },
{ url = "https://files.pythonhosted.org/packages/fd/d9/4360a6c12be3d7521b0b8c39e5d3801d622fbb81cc2721dbd3eee31e28c8/grpcio_tools-1.62.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:e02d7c1a02e3814c94ba0cfe43d93e872c758bd8fd5c2797f894d0c49b4a1dfc", size = 3298378 },
{ url = "https://files.pythonhosted.org/packages/29/3b/7cdf4a9e5a3e0a35a528b48b111355cd14da601413a4f887aa99b6da468f/grpcio_tools-1.62.3-cp312-cp312-win32.whl", hash = "sha256:b881fd9505a84457e9f7e99362eeedd86497b659030cf57c6f0070df6d9c2b9b", size = 910416 },
{ url = "https://files.pythonhosted.org/packages/6c/66/dd3ec249e44c1cc15e902e783747819ed41ead1336fcba72bf841f72c6e9/grpcio_tools-1.62.3-cp312-cp312-win_amd64.whl", hash = "sha256:11c625eebefd1fd40a228fc8bae385e448c7e32a6ae134e43cf13bbc23f902b7", size = 1052856 },
]
[[package]]
@@ -1984,15 +1952,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/23/38/34cb843cee4c5c27aa5c822e90e99bf96feb3dfa705713b5b6e601d17f5c/json_repair-0.30.0-py3-none-any.whl", hash = "sha256:bda4a5552dc12085c6363ff5acfcdb0c9cafc629989a2112081b7e205828228d", size = 17641 },
]
[[package]]
name = "json5"
version = "0.10.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/85/3d/bbe62f3d0c05a689c711cff57b2e3ac3d3e526380adb7c781989f075115c/json5-0.10.0.tar.gz", hash = "sha256:e66941c8f0a02026943c52c2eb34ebeb2a6f819a0be05920a6f5243cd30fd559", size = 48202 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/aa/42/797895b952b682c3dafe23b1834507ee7f02f4d6299b65aaa61425763278/json5-0.10.0-py3-none-any.whl", hash = "sha256:19b23410220a7271e8377f81ba8aacba2fdd56947fbb137ee5977cbe1f5e8dfa", size = 34049 },
]
[[package]]
name = "jsonlines"
version = "3.1.0"
@@ -2266,24 +2225,24 @@ wheels = [
[[package]]
name = "litellm"
version = "1.60.2"
version = "1.50.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "aiohttp" },
{ name = "click" },
{ name = "httpx" },
{ name = "importlib-metadata" },
{ name = "jinja2" },
{ name = "jsonschema" },
{ name = "openai" },
{ name = "pydantic" },
{ name = "python-dotenv" },
{ name = "requests" },
{ name = "tiktoken" },
{ name = "tokenizers" },
]
sdist = { url = "https://files.pythonhosted.org/packages/94/8f/704cdb0fdbdd49dc5062a39ae5f1a8f308ae0ffd746df6e0137fc1776b8a/litellm-1.60.2.tar.gz", hash = "sha256:a8170584fcfd6f5175201d869e61ccd8a40ffe3264fc5e53c5b805ddf8a6e05a", size = 6447447 }
sdist = { url = "https://files.pythonhosted.org/packages/a7/45/4d54617b267a96f1f7c17c0010ea1aba20e30a3672b873fe92a6001e5952/litellm-1.50.2.tar.gz", hash = "sha256:b244c9a0e069cc626b85fb9f5cc252114aaff1225500da30ce0940f841aef8ea", size = 6096949 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/8a/ba/0eaec9aee9f99fdf46ef1c0bddcfe7f5720b182f84f6ed27f13145d5ded2/litellm-1.60.2-py3-none-any.whl", hash = "sha256:1cb08cda04bf8c5ef3e690171a779979e4b16a5e3a24cd8dc1f198e7f198d5c4", size = 6746809 },
{ url = "https://files.pythonhosted.org/packages/22/f3/89a4d65d1b9286eb5ac6a6e92dd93523d92f3142a832e60c00d5cad64176/litellm-1.50.2-py3-none-any.whl", hash = "sha256:99cac60c78037946ab809b7cfbbadad53507bb2db8ae39391b4be215a0869fdd", size = 6318265 },
]
[[package]]
@@ -2988,6 +2947,7 @@ name = "nvidia-nccl-cu12"
version = "2.20.5"
source = { registry = "https://pypi.org/simple" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c1/bb/d09dda47c881f9ff504afd6f9ca4f502ded6d8fc2f572cacc5e39da91c28/nvidia_nccl_cu12-2.20.5-py3-none-manylinux2014_aarch64.whl", hash = "sha256:1fc150d5c3250b170b29410ba682384b14581db722b2531b0d8d33c595f33d01", size = 176238458 },
{ url = "https://files.pythonhosted.org/packages/4b/2a/0a131f572aa09f741c30ccd45a8e56316e8be8dfc7bc19bf0ab7cfef7b19/nvidia_nccl_cu12-2.20.5-py3-none-manylinux2014_x86_64.whl", hash = "sha256:057f6bf9685f75215d0c53bf3ac4a10b3e6578351de307abad9e18a99182af56", size = 176249402 },
]
@@ -2997,6 +2957,7 @@ version = "12.6.85"
source = { registry = "https://pypi.org/simple" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/9d/d7/c5383e47c7e9bf1c99d5bd2a8c935af2b6d705ad831a7ec5c97db4d82f4f/nvidia_nvjitlink_cu12-12.6.85-py3-none-manylinux2010_x86_64.manylinux_2_12_x86_64.whl", hash = "sha256:eedc36df9e88b682efe4309aa16b5b4e78c2407eac59e8c10a6a47535164369a", size = 19744971 },
{ url = "https://files.pythonhosted.org/packages/31/db/dc71113d441f208cdfe7ae10d4983884e13f464a6252450693365e166dcf/nvidia_nvjitlink_cu12-12.6.85-py3-none-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:cf4eaa7d4b6b543ffd69d6abfb11efdeb2db48270d94dfd3a452c24150829e41", size = 19270338 },
]
[[package]]
@@ -3075,7 +3036,7 @@ wheels = [
[[package]]
name = "openai"
version = "1.68.2"
version = "1.52.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "anyio" },
@@ -3087,9 +3048,9 @@ dependencies = [
{ name = "tqdm" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/3f/6b/6b002d5d38794645437ae3ddb42083059d556558493408d39a0fcea608bc/openai-1.68.2.tar.gz", hash = "sha256:b720f0a95a1dbe1429c0d9bb62096a0d98057bcda82516f6e8af10284bdd5b19", size = 413429 }
sdist = { url = "https://files.pythonhosted.org/packages/80/ac/54c76352d493866637756b7c0ecec44f0b5bafb8fe753d98472cf6cfe4ce/openai-1.52.1.tar.gz", hash = "sha256:383b96c7e937cbec23cad5bf5718085381e4313ca33c5c5896b54f8e1b19d144", size = 310069 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/fd/34/cebce15f64eb4a3d609a83ac3568d43005cc9a1cba9d7fde5590fd415423/openai-1.68.2-py3-none-any.whl", hash = "sha256:24484cb5c9a33b58576fdc5acf0e5f92603024a4e39d0b99793dfa1eb14c2b36", size = 606073 },
{ url = "https://files.pythonhosted.org/packages/ad/31/28a83e124e9f9dd04c83b5aeb6f8b1770f45addde4dd3d34d9a9091590ad/openai-1.52.1-py3-none-any.whl", hash = "sha256:f23e83df5ba04ee0e82c8562571e8cb596cd88f9a84ab783e6c6259e5ffbfb4a", size = 386945 },
]
[[package]]
@@ -3123,32 +3084,32 @@ wheels = [
[[package]]
name = "opentelemetry-api"
version = "1.31.1"
version = "1.27.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "deprecated" },
{ name = "importlib-metadata" },
]
sdist = { url = "https://files.pythonhosted.org/packages/8a/cf/db26ab9d748bf50d6edf524fb863aa4da616ba1ce46c57a7dff1112b73fb/opentelemetry_api-1.31.1.tar.gz", hash = "sha256:137ad4b64215f02b3000a0292e077641c8611aab636414632a9b9068593b7e91", size = 64059 }
sdist = { url = "https://files.pythonhosted.org/packages/c9/83/93114b6de85a98963aec218a51509a52ed3f8de918fe91eb0f7299805c3f/opentelemetry_api-1.27.0.tar.gz", hash = "sha256:ed673583eaa5f81b5ce5e86ef7cdaf622f88ef65f0b9aab40b843dcae5bef342", size = 62693 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/6c/c8/86557ff0da32f3817bc4face57ea35cfdc2f9d3bcefd42311ef860dcefb7/opentelemetry_api-1.31.1-py3-none-any.whl", hash = "sha256:1511a3f470c9c8a32eeea68d4ea37835880c0eed09dd1a0187acc8b1301da0a1", size = 65197 },
{ url = "https://files.pythonhosted.org/packages/fb/1f/737dcdbc9fea2fa96c1b392ae47275165a7c641663fbb08a8d252968eed2/opentelemetry_api-1.27.0-py3-none-any.whl", hash = "sha256:953d5871815e7c30c81b56d910c707588000fff7a3ca1c73e6531911d53065e7", size = 63970 },
]
[[package]]
name = "opentelemetry-exporter-otlp-proto-common"
version = "1.31.1"
version = "1.27.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "opentelemetry-proto" },
]
sdist = { url = "https://files.pythonhosted.org/packages/53/e5/48662d9821d28f05ab8350a9a986ab99d9c0e8b23f8ff391c8df82742a9c/opentelemetry_exporter_otlp_proto_common-1.31.1.tar.gz", hash = "sha256:c748e224c01f13073a2205397ba0e415dcd3be9a0f95101ba4aace5fc730e0da", size = 20627 }
sdist = { url = "https://files.pythonhosted.org/packages/cd/2e/7eaf4ba595fb5213cf639c9158dfb64aacb2e4c7d74bfa664af89fa111f4/opentelemetry_exporter_otlp_proto_common-1.27.0.tar.gz", hash = "sha256:159d27cf49f359e3798c4c3eb8da6ef4020e292571bd8c5604a2a573231dd5c8", size = 17860 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/82/70/134282413000a3fc02e6b4e301b8c5d7127c43b50bd23cddbaf406ab33ff/opentelemetry_exporter_otlp_proto_common-1.31.1-py3-none-any.whl", hash = "sha256:7cadf89dbab12e217a33c5d757e67c76dd20ce173f8203e7370c4996f2e9efd8", size = 18823 },
{ url = "https://files.pythonhosted.org/packages/41/27/4610ab3d9bb3cde4309b6505f98b3aabca04a26aa480aa18cede23149837/opentelemetry_exporter_otlp_proto_common-1.27.0-py3-none-any.whl", hash = "sha256:675db7fffcb60946f3a5c43e17d1168a3307a94a930ecf8d2ea1f286f3d4f79a", size = 17848 },
]
[[package]]
name = "opentelemetry-exporter-otlp-proto-grpc"
version = "1.31.1"
version = "1.27.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "deprecated" },
@@ -3159,14 +3120,14 @@ dependencies = [
{ name = "opentelemetry-proto" },
{ name = "opentelemetry-sdk" },
]
sdist = { url = "https://files.pythonhosted.org/packages/0f/37/6ce465827ac69c52543afb5534146ccc40f54283a3a8a71ef87c91eb8933/opentelemetry_exporter_otlp_proto_grpc-1.31.1.tar.gz", hash = "sha256:c7f66b4b333c52248dc89a6583506222c896c74824d5d2060b818ae55510939a", size = 26620 }
sdist = { url = "https://files.pythonhosted.org/packages/a1/d0/c1e375b292df26e0ffebf194e82cd197e4c26cc298582bda626ce3ce74c5/opentelemetry_exporter_otlp_proto_grpc-1.27.0.tar.gz", hash = "sha256:af6f72f76bcf425dfb5ad11c1a6d6eca2863b91e63575f89bb7b4b55099d968f", size = 26244 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ee/25/9974fa3a431d7499bd9d179fb9bd7daaa3ad9eba3313f72da5226b6d02df/opentelemetry_exporter_otlp_proto_grpc-1.31.1-py3-none-any.whl", hash = "sha256:f4055ad2c9a2ea3ae00cbb927d6253233478b3b87888e197d34d095a62305fae", size = 18588 },
{ url = "https://files.pythonhosted.org/packages/8d/80/32217460c2c64c0568cea38410124ff680a9b65f6732867bbf857c4d8626/opentelemetry_exporter_otlp_proto_grpc-1.27.0-py3-none-any.whl", hash = "sha256:56b5bbd5d61aab05e300d9d62a6b3c134827bbd28d0b12f2649c2da368006c9e", size = 18541 },
]
[[package]]
name = "opentelemetry-exporter-otlp-proto-http"
version = "1.31.1"
version = "1.27.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "deprecated" },
@@ -3177,29 +3138,28 @@ dependencies = [
{ name = "opentelemetry-sdk" },
{ name = "requests" },
]
sdist = { url = "https://files.pythonhosted.org/packages/6d/9c/d8718fce3d14042beab5a41c8e17be1864c48d2067be3a99a5652d2414a3/opentelemetry_exporter_otlp_proto_http-1.31.1.tar.gz", hash = "sha256:723bd90eb12cfb9ae24598641cb0c92ca5ba9f1762103902f6ffee3341ba048e", size = 15140 }
sdist = { url = "https://files.pythonhosted.org/packages/31/0a/f05c55e8913bf58a033583f2580a0ec31a5f4cf2beacc9e286dcb74d6979/opentelemetry_exporter_otlp_proto_http-1.27.0.tar.gz", hash = "sha256:2103479092d8eb18f61f3fbff084f67cc7f2d4a7d37e75304b8b56c1d09ebef5", size = 15059 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/f2/19/5041dbfdd0b2a6ab340596693759bfa7dcfa8f30b9fa7112bb7117358571/opentelemetry_exporter_otlp_proto_http-1.31.1-py3-none-any.whl", hash = "sha256:5dee1f051f096b13d99706a050c39b08e3f395905f29088bfe59e54218bd1cf4", size = 17257 },
{ url = "https://files.pythonhosted.org/packages/2d/8d/4755884afc0b1db6000527cac0ca17273063b6142c773ce4ecd307a82e72/opentelemetry_exporter_otlp_proto_http-1.27.0-py3-none-any.whl", hash = "sha256:688027575c9da42e179a69fe17e2d1eba9b14d81de8d13553a21d3114f3b4d75", size = 17203 },
]
[[package]]
name = "opentelemetry-instrumentation"
version = "0.52b1"
version = "0.48b0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "opentelemetry-api" },
{ name = "opentelemetry-semantic-conventions" },
{ name = "packaging" },
{ name = "setuptools" },
{ name = "wrapt" },
]
sdist = { url = "https://files.pythonhosted.org/packages/49/c9/c52d444576b0776dbee71d2a4485be276cf46bec0123a5ba2f43f0cf7cde/opentelemetry_instrumentation-0.52b1.tar.gz", hash = "sha256:739f3bfadbbeec04dd59297479e15660a53df93c131d907bb61052e3d3c1406f", size = 28406 }
sdist = { url = "https://files.pythonhosted.org/packages/04/0e/d9394839af5d55c8feb3b22cd11138b953b49739b20678ca96289e30f904/opentelemetry_instrumentation-0.48b0.tar.gz", hash = "sha256:94929685d906380743a71c3970f76b5f07476eea1834abd5dd9d17abfe23cc35", size = 24724 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/61/dd/a2b35078170941990e7a5194b9600fa75868958a9a2196a752da0e7b97a0/opentelemetry_instrumentation-0.52b1-py3-none-any.whl", hash = "sha256:8c0059c4379d77bbd8015c8d8476020efe873c123047ec069bb335e4b8717477", size = 31036 },
{ url = "https://files.pythonhosted.org/packages/0a/7f/405c41d4f359121376c9d5117dcf68149b8122d3f6c718996d037bd4d800/opentelemetry_instrumentation-0.48b0-py3-none-any.whl", hash = "sha256:a69750dc4ba6a5c3eb67986a337185a25b739966d80479befe37b546fc870b44", size = 29449 },
]
[[package]]
name = "opentelemetry-instrumentation-asgi"
version = "0.52b1"
version = "0.48b0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "asgiref" },
@@ -3208,14 +3168,14 @@ dependencies = [
{ name = "opentelemetry-semantic-conventions" },
{ name = "opentelemetry-util-http" },
]
sdist = { url = "https://files.pythonhosted.org/packages/bc/db/79bdc2344b38e60fecc7e99159a3f5b4c0e1acec8de305fba0a713cc3692/opentelemetry_instrumentation_asgi-0.52b1.tar.gz", hash = "sha256:a6dbce9cb5b2c2f45ce4817ad21f44c67fd328358ad3ab911eb46f0be67f82ec", size = 24203 }
sdist = { url = "https://files.pythonhosted.org/packages/44/ac/fd3d40bab3234ec3f5c052a815100676baaae1832fa1067935f11e5c59c6/opentelemetry_instrumentation_asgi-0.48b0.tar.gz", hash = "sha256:04c32174b23c7fa72ddfe192dad874954968a6a924608079af9952964ecdf785", size = 23435 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/19/de/39ec078ae94a365d2f434b7e25886c267864aca5695b48fa5b60f80fbfb3/opentelemetry_instrumentation_asgi-0.52b1-py3-none-any.whl", hash = "sha256:f7179f477ed665ba21871972f979f21e8534edb971232e11920c8a22f4759236", size = 16338 },
{ url = "https://files.pythonhosted.org/packages/db/74/a0e0d38622856597dd8e630f2bd793760485eb165708e11b8be1696bbb5a/opentelemetry_instrumentation_asgi-0.48b0-py3-none-any.whl", hash = "sha256:ddb1b5fc800ae66e85a4e2eca4d9ecd66367a8c7b556169d9e7b57e10676e44d", size = 15958 },
]
[[package]]
name = "opentelemetry-instrumentation-fastapi"
version = "0.52b1"
version = "0.48b0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "opentelemetry-api" },
@@ -3224,57 +3184,57 @@ dependencies = [
{ name = "opentelemetry-semantic-conventions" },
{ name = "opentelemetry-util-http" },
]
sdist = { url = "https://files.pythonhosted.org/packages/30/01/d159829077f2795c716445df6f8edfdd33391e82d712ba4613fb62b99dc5/opentelemetry_instrumentation_fastapi-0.52b1.tar.gz", hash = "sha256:d26ab15dc49e041301d5c2571605b8f5c3a6ee4a85b60940338f56c120221e98", size = 19247 }
sdist = { url = "https://files.pythonhosted.org/packages/58/20/43477da5850ef2cd3792715d442aecd051e885e0603b6ee5783b2104ba8f/opentelemetry_instrumentation_fastapi-0.48b0.tar.gz", hash = "sha256:21a72563ea412c0b535815aeed75fc580240f1f02ebc72381cfab672648637a2", size = 18497 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/23/89/acef7f625b218523873e32584dc5243d95ffa4facba737fd8b854c049c58/opentelemetry_instrumentation_fastapi-0.52b1-py3-none-any.whl", hash = "sha256:73c8804f053c5eb2fd2c948218bff9561f1ef65e89db326a6ab0b5bf829969f4", size = 12114 },
{ url = "https://files.pythonhosted.org/packages/ee/50/745ab075a3041b7a5f29a579d2c28eaad54f64b4589d8f9fd364c62cf0f3/opentelemetry_instrumentation_fastapi-0.48b0-py3-none-any.whl", hash = "sha256:afeb820a59e139d3e5d96619600f11ce0187658b8ae9e3480857dd790bc024f2", size = 11777 },
]
[[package]]
name = "opentelemetry-proto"
version = "1.31.1"
version = "1.27.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "protobuf" },
]
sdist = { url = "https://files.pythonhosted.org/packages/5b/b0/e763f335b9b63482f1f31f46f9299c4d8388e91fc12737aa14fdb5d124ac/opentelemetry_proto-1.31.1.tar.gz", hash = "sha256:d93e9c2b444e63d1064fb50ae035bcb09e5822274f1683886970d2734208e790", size = 34363 }
sdist = { url = "https://files.pythonhosted.org/packages/9a/59/959f0beea798ae0ee9c979b90f220736fbec924eedbefc60ca581232e659/opentelemetry_proto-1.27.0.tar.gz", hash = "sha256:33c9345d91dafd8a74fc3d7576c5a38f18b7fdf8d02983ac67485386132aedd6", size = 34749 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/b6/f1/3baee86eab4f1b59b755f3c61a9b5028f380c88250bb9b7f89340502dbba/opentelemetry_proto-1.31.1-py3-none-any.whl", hash = "sha256:1398ffc6d850c2f1549ce355744e574c8cd7c1dba3eea900d630d52c41d07178", size = 55854 },
{ url = "https://files.pythonhosted.org/packages/94/56/3d2d826834209b19a5141eed717f7922150224d1a982385d19a9444cbf8d/opentelemetry_proto-1.27.0-py3-none-any.whl", hash = "sha256:b133873de5581a50063e1e4b29cdcf0c5e253a8c2d8dc1229add20a4c3830ace", size = 52464 },
]
[[package]]
name = "opentelemetry-sdk"
version = "1.31.1"
version = "1.27.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "opentelemetry-api" },
{ name = "opentelemetry-semantic-conventions" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/63/d9/4fe159908a63661e9e635e66edc0d0d816ed20cebcce886132b19ae87761/opentelemetry_sdk-1.31.1.tar.gz", hash = "sha256:c95f61e74b60769f8ff01ec6ffd3d29684743404603df34b20aa16a49dc8d903", size = 159523 }
sdist = { url = "https://files.pythonhosted.org/packages/0d/9a/82a6ac0f06590f3d72241a587cb8b0b751bd98728e896cc4cbd4847248e6/opentelemetry_sdk-1.27.0.tar.gz", hash = "sha256:d525017dea0ccce9ba4e0245100ec46ecdc043f2d7b8315d56b19aff0904fa6f", size = 145019 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/bc/36/758e5d3746bc86a2af20aa5e2236a7c5aa4264b501dc0e9f40efd9078ef0/opentelemetry_sdk-1.31.1-py3-none-any.whl", hash = "sha256:882d021321f223e37afaca7b4e06c1d8bbc013f9e17ff48a7aa017460a8e7dae", size = 118866 },
{ url = "https://files.pythonhosted.org/packages/c1/bd/a6602e71e315055d63b2ff07172bd2d012b4cba2d4e00735d74ba42fc4d6/opentelemetry_sdk-1.27.0-py3-none-any.whl", hash = "sha256:365f5e32f920faf0fd9e14fdfd92c086e317eaa5f860edba9cdc17a380d9197d", size = 110505 },
]
[[package]]
name = "opentelemetry-semantic-conventions"
version = "0.52b1"
version = "0.48b0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "deprecated" },
{ name = "opentelemetry-api" },
]
sdist = { url = "https://files.pythonhosted.org/packages/06/8c/599f9f27cff097ec4d76fbe9fe6d1a74577ceec52efe1a999511e3c42ef5/opentelemetry_semantic_conventions-0.52b1.tar.gz", hash = "sha256:7b3d226ecf7523c27499758a58b542b48a0ac8d12be03c0488ff8ec60c5bae5d", size = 111275 }
sdist = { url = "https://files.pythonhosted.org/packages/0a/89/1724ad69f7411772446067cdfa73b598694c8c91f7f8c922e344d96d81f9/opentelemetry_semantic_conventions-0.48b0.tar.gz", hash = "sha256:12d74983783b6878162208be57c9effcb89dc88691c64992d70bb89dc00daa1a", size = 89445 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/98/be/d4ba300cfc1d4980886efbc9b48ee75242b9fcf940d9c4ccdc9ef413a7cf/opentelemetry_semantic_conventions-0.52b1-py3-none-any.whl", hash = "sha256:72b42db327e29ca8bb1b91e8082514ddf3bbf33f32ec088feb09526ade4bc77e", size = 183409 },
{ url = "https://files.pythonhosted.org/packages/b7/7a/4f0063dbb0b6c971568291a8bc19a4ca70d3c185db2d956230dd67429dfc/opentelemetry_semantic_conventions-0.48b0-py3-none-any.whl", hash = "sha256:a0de9f45c413a8669788a38569c7e0a11ce6ce97861a628cca785deecdc32a1f", size = 149685 },
]
[[package]]
name = "opentelemetry-util-http"
version = "0.52b1"
version = "0.48b0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/23/3f/16a4225a953bbaae7d800140ed99813f092ea3071ba7780683299a87049b/opentelemetry_util_http-0.52b1.tar.gz", hash = "sha256:c03c8c23f1b75fadf548faece7ead3aecd50761c5593a2b2831b48730eee5b31", size = 8044 }
sdist = { url = "https://files.pythonhosted.org/packages/d6/d7/185c494754340e0a3928fd39fde2616ee78f2c9d66253affaad62d5b7935/opentelemetry_util_http-0.48b0.tar.gz", hash = "sha256:60312015153580cc20f322e5cdc3d3ecad80a71743235bdb77716e742814623c", size = 7863 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/2c/00/1591b397c9efc0e4215d223553a1cb9090c8499888a4447f842443077d31/opentelemetry_util_http-0.52b1-py3-none-any.whl", hash = "sha256:6a6ab6bfa23fef96f4995233e874f67602adf9d224895981b4ab9d4dde23de78", size = 7305 },
{ url = "https://files.pythonhosted.org/packages/ad/2e/36097c0a4d0115b8c7e377c90bab7783ac183bc5cb4071308f8959454311/opentelemetry_util_http-0.48b0-py3-none-any.whl", hash = "sha256:76f598af93aab50328d2a69c786beaedc8b6a7770f7a818cc307eb353debfffb", size = 6946 },
]
[[package]]
@@ -3315,6 +3275,18 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/9f/8a/ce7c28e4ea337f6d95261345d7c61322f8561c52f57b263a3ad7025984f4/orjson-3.10.10-cp312-none-win_amd64.whl", hash = "sha256:384cd13579a1b4cd689d218e329f459eb9ddc504fa48c5a83ef4889db7fd7a4f", size = 139389 },
]
[[package]]
name = "outcome"
version = "1.3.0.post0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "attrs" },
]
sdist = { url = "https://files.pythonhosted.org/packages/98/df/77698abfac98571e65ffeb0c1fba8ffd692ab8458d617a0eed7d9a8d38f2/outcome-1.3.0.post0.tar.gz", hash = "sha256:9dcf02e65f2971b80047b377468e72a268e15c0af3cf1238e6ff14f7f91143b8", size = 21060 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/55/8b/5ab7257531a5d830fc8000c476e63c935488d74609b50f9384a643ec0a62/outcome-1.3.0.post0-py2.py3-none-any.whl", hash = "sha256:e771c5ce06d1415e356078d3bdd68523f284b4ce5419828922b6871e65eda82b", size = 10692 },
]
[[package]]
name = "overrides"
version = "7.7.0"
@@ -3636,16 +3608,16 @@ wheels = [
[[package]]
name = "protobuf"
version = "5.29.4"
version = "4.25.5"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/17/7d/b9dca7365f0e2c4fa7c193ff795427cfa6290147e5185ab11ece280a18e7/protobuf-5.29.4.tar.gz", hash = "sha256:4f1dfcd7997b31ef8f53ec82781ff434a28bf71d9102ddde14d076adcfc78c99", size = 424902 }
sdist = { url = "https://files.pythonhosted.org/packages/67/dd/48d5fdb68ec74d70fabcc252e434492e56f70944d9f17b6a15e3746d2295/protobuf-4.25.5.tar.gz", hash = "sha256:7f8249476b4a9473645db7f8ab42b02fe1488cbe5fb72fddd445e0665afd8584", size = 380315 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/9a/b2/043a1a1a20edd134563699b0e91862726a0dc9146c090743b6c44d798e75/protobuf-5.29.4-cp310-abi3-win32.whl", hash = "sha256:13eb236f8eb9ec34e63fc8b1d6efd2777d062fa6aaa68268fb67cf77f6839ad7", size = 422709 },
{ url = "https://files.pythonhosted.org/packages/79/fc/2474b59570daa818de6124c0a15741ee3e5d6302e9d6ce0bdfd12e98119f/protobuf-5.29.4-cp310-abi3-win_amd64.whl", hash = "sha256:bcefcdf3976233f8a502d265eb65ea740c989bacc6c30a58290ed0e519eb4b8d", size = 434506 },
{ url = "https://files.pythonhosted.org/packages/46/de/7c126bbb06aa0f8a7b38aaf8bd746c514d70e6a2a3f6dd460b3b7aad7aae/protobuf-5.29.4-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:307ecba1d852ec237e9ba668e087326a67564ef83e45a0189a772ede9e854dd0", size = 417826 },
{ url = "https://files.pythonhosted.org/packages/a2/b5/bade14ae31ba871a139aa45e7a8183d869efe87c34a4850c87b936963261/protobuf-5.29.4-cp38-abi3-manylinux2014_aarch64.whl", hash = "sha256:aec4962f9ea93c431d5714ed1be1c93f13e1a8618e70035ba2b0564d9e633f2e", size = 319574 },
{ url = "https://files.pythonhosted.org/packages/46/88/b01ed2291aae68b708f7d334288ad5fb3e7aa769a9c309c91a0d55cb91b0/protobuf-5.29.4-cp38-abi3-manylinux2014_x86_64.whl", hash = "sha256:d7d3f7d1d5a66ed4942d4fefb12ac4b14a29028b209d4bfb25c68ae172059922", size = 319672 },
{ url = "https://files.pythonhosted.org/packages/12/fb/a586e0c973c95502e054ac5f81f88394f24ccc7982dac19c515acd9e2c93/protobuf-5.29.4-py3-none-any.whl", hash = "sha256:3fde11b505e1597f71b875ef2fc52062b6a9740e5f7c8997ce878b6009145862", size = 172551 },
{ url = "https://files.pythonhosted.org/packages/00/35/1b3c5a5e6107859c4ca902f4fbb762e48599b78129a05d20684fef4a4d04/protobuf-4.25.5-cp310-abi3-win32.whl", hash = "sha256:5e61fd921603f58d2f5acb2806a929b4675f8874ff5f330b7d6f7e2e784bbcd8", size = 392457 },
{ url = "https://files.pythonhosted.org/packages/a7/ad/bf3f358e90b7e70bf7fb520702cb15307ef268262292d3bdb16ad8ebc815/protobuf-4.25.5-cp310-abi3-win_amd64.whl", hash = "sha256:4be0571adcbe712b282a330c6e89eae24281344429ae95c6d85e79e84780f5ea", size = 413449 },
{ url = "https://files.pythonhosted.org/packages/51/49/d110f0a43beb365758a252203c43eaaad169fe7749da918869a8c991f726/protobuf-4.25.5-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:b2fde3d805354df675ea4c7c6338c1aecd254dfc9925e88c6d31a2bcb97eb173", size = 394248 },
{ url = "https://files.pythonhosted.org/packages/c6/ab/0f384ca0bc6054b1a7b6009000ab75d28a5506e4459378b81280ae7fd358/protobuf-4.25.5-cp37-abi3-manylinux2014_aarch64.whl", hash = "sha256:919ad92d9b0310070f8356c24b855c98df2b8bd207ebc1c0c6fcc9ab1e007f3d", size = 293717 },
{ url = "https://files.pythonhosted.org/packages/05/a6/094a2640be576d760baa34c902dcb8199d89bce9ed7dd7a6af74dcbbd62d/protobuf-4.25.5-cp37-abi3-manylinux2014_x86_64.whl", hash = "sha256:fe14e16c22be926d3abfcb500e60cab068baf10b542b8c858fa27e098123e331", size = 294635 },
{ url = "https://files.pythonhosted.org/packages/33/90/f198a61df8381fb43ae0fe81b3d2718e8dcc51ae8502c7657ab9381fbc4f/protobuf-4.25.5-py3-none-any.whl", hash = "sha256:0aebecb809cae990f8129ada5ca273d9d670b76d9bfc9b1809f0a9c02b7dbf41", size = 156467 },
]
[[package]]
@@ -4035,6 +4007,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/48/0a/c99fb7d7e176f8b176ef19704a32e6a9c6aafdf19ef75a187f701fc15801/pysbd-0.3.4-py3-none-any.whl", hash = "sha256:cd838939b7b0b185fcf86b0baf6636667dfb6e474743beeff878e9f42e022953", size = 71082 },
]
[[package]]
name = "pysocks"
version = "1.7.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/bd/11/293dd436aea955d45fc4e8a35b6ae7270f5b8e00b53cf6c024c83b657a11/PySocks-1.7.1.tar.gz", hash = "sha256:3f8804571ebe159c380ac6de37643bb4685970655d3bba243530d6558b799aa0", size = 284429 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/8d/59/b4572118e098ac8e46e399a1dd0f2d85403ce8bbaad9ec79373ed6badaf9/PySocks-1.7.1-py3-none-any.whl", hash = "sha256:2725bd0a9925919b9b51739eea5f9e2bae91e83288108a9ad338b2e3a4435ee5", size = 16725 },
]
[[package]]
name = "pytest"
version = "8.3.3"
@@ -4064,20 +4045,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/96/31/6607dab48616902f76885dfcf62c08d929796fc3b2d2318faf9fd54dbed9/pytest_asyncio-0.24.0-py3-none-any.whl", hash = "sha256:a811296ed596b69bf0b6f3dc40f83bcaf341b155a269052d82efa2b25ac7037b", size = 18024 },
]
[[package]]
name = "pytest-recording"
version = "0.13.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pytest" },
{ name = "vcrpy", version = "5.1.0", source = { registry = "https://pypi.org/simple" }, marker = "platform_python_implementation == 'PyPy'" },
{ name = "vcrpy", version = "7.0.0", source = { registry = "https://pypi.org/simple" }, marker = "platform_python_implementation != 'PyPy'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/fe/2a/ea6b8036ae01979eae02d8ad5a7da14dec90d9176b613e49fb8d134c78fc/pytest_recording-0.13.2.tar.gz", hash = "sha256:000c3babbb466681457fd65b723427c1779a0c6c17d9e381c3142a701e124877", size = 25270 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/72/52/8e67a969e9fad3fa5ec4eab9f2a7348ff04692065c7deda21d76e9112703/pytest_recording-0.13.2-py3-none-any.whl", hash = "sha256:3820fe5743d1ac46e807989e11d073cb776a60bdc544cf43ebca454051b22d13", size = 12783 },
]
[[package]]
name = "pytest-subprocess"
version = "1.5.2"
@@ -4090,6 +4057,19 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/10/77/a80e8f9126b95ffd5ad4d04bd14005c68dcbf0d88f53b2b14893f6cc7232/pytest_subprocess-1.5.2-py3-none-any.whl", hash = "sha256:23ac7732aa8bd45f1757265b1316eb72a7f55b41fb21e2ca22e149ba3629fa46", size = 20886 },
]
[[package]]
name = "pytest-vcr"
version = "1.0.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pytest" },
{ name = "vcrpy" },
]
sdist = { url = "https://files.pythonhosted.org/packages/1a/60/104c619483c1a42775d3f8b27293f1ecfc0728014874d065e68cb9702d49/pytest-vcr-1.0.2.tar.gz", hash = "sha256:23ee51b75abbcc43d926272773aae4f39f93aceb75ed56852d0bf618f92e1896", size = 3810 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/9d/d3/ff520d11e6ee400602711d1ece8168dcfc5b6d8146fb7db4244a6ad6a9c3/pytest_vcr-1.0.2-py2.py3-none-any.whl", hash = "sha256:2f316e0539399bea0296e8b8401145c62b6f85e9066af7e57b6151481b0d6d9c", size = 4137 },
]
[[package]]
name = "python-bidi"
version = "0.6.3"
@@ -4695,6 +4675,23 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/aa/7d/43ab67228ef98c6b5dd42ab386eae2d7877036970a0d7e3dd3eb47a0d530/scipy-1.14.1-cp312-cp312-win_amd64.whl", hash = "sha256:2ff38e22128e6c03ff73b6bb0f85f897d2362f8c052e3b8ad00532198fbdae3f", size = 44521212 },
]
[[package]]
name = "selenium"
version = "4.25.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "certifi" },
{ name = "trio" },
{ name = "trio-websocket" },
{ name = "typing-extensions" },
{ name = "urllib3", extra = ["socks"] },
{ name = "websocket-client" },
]
sdist = { url = "https://files.pythonhosted.org/packages/0e/5a/d3735b189b91715fd0f5a9b8d55e2605061309849470e96ab830f02cba40/selenium-4.25.0.tar.gz", hash = "sha256:95d08d3b82fb353f3c474895154516604c7f0e6a9a565ae6498ef36c9bac6921", size = 957765 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/aa/85/fa44f23dd5d5066a72f7c4304cce4b5ff9a6e7fd92431a48b2c63fbf63ec/selenium-4.25.0-py3-none-any.whl", hash = "sha256:3798d2d12b4a570bc5790163ba57fef10b2afee958bf1d80f2a3cf07c4141f33", size = 9693127 },
]
[[package]]
name = "semchunk"
version = "2.2.0"
@@ -4773,6 +4770,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235 },
]
[[package]]
name = "sortedcontainers"
version = "2.4.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/e8/c4/ba2f8066cceb6f23394729afe52f3bf7adec04bf9ed2c820b39e19299111/sortedcontainers-2.4.0.tar.gz", hash = "sha256:25caa5a06cc30b6b83d11423433f65d1f9d76c4c6a0c90e3379eaa43b9bfdb88", size = 30594 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/32/46/9cb0e58b2deb7f82b84065f37f3bffeb12413f947f9388e4cac22c4621ce/sortedcontainers-2.4.0-py2.py3-none-any.whl", hash = "sha256:a163dcaede0f1c021485e957a39245190e74249897e2ae4b2aa38595db237ee0", size = 29575 },
]
[[package]]
name = "soupsieve"
version = "2.6"
@@ -5118,6 +5124,38 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/51/51/b87caa939fedf307496e4dbf412f4b909af3d9ca8b189fc3b65c1faa456f/transformers-4.46.3-py3-none-any.whl", hash = "sha256:a12ef6f52841fd190a3e5602145b542d03507222f2c64ebb7ee92e8788093aef", size = 10034536 },
]
[[package]]
name = "trio"
version = "0.27.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "attrs" },
{ name = "cffi", marker = "(implementation_name != 'pypy' and os_name == 'nt' and platform_machine != 'aarch64' and sys_platform == 'linux') or (implementation_name != 'pypy' and os_name == 'nt' and sys_platform != 'darwin' and sys_platform != 'linux')" },
{ name = "exceptiongroup", marker = "python_full_version < '3.11'" },
{ name = "idna" },
{ name = "outcome" },
{ name = "sniffio" },
{ name = "sortedcontainers" },
]
sdist = { url = "https://files.pythonhosted.org/packages/17/d1/a83dee5be404da7afe5a71783a33b8907bacb935a6dc8c69ab785e4a3eed/trio-0.27.0.tar.gz", hash = "sha256:1dcc95ab1726b2da054afea8fd761af74bad79bd52381b84eae408e983c76831", size = 568064 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/3c/83/ec3196c360afffbc5b342ead48d1eb7393dd74fa70bca75d33905a86f211/trio-0.27.0-py3-none-any.whl", hash = "sha256:68eabbcf8f457d925df62da780eff15ff5dc68fd6b367e2dde59f7aaf2a0b884", size = 481734 },
]
[[package]]
name = "trio-websocket"
version = "0.11.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "exceptiongroup", marker = "python_full_version < '3.11'" },
{ name = "trio" },
{ name = "wsproto" },
]
sdist = { url = "https://files.pythonhosted.org/packages/dd/36/abad2385853077424a11b818d9fd8350d249d9e31d583cb9c11cd4c85eda/trio-websocket-0.11.1.tar.gz", hash = "sha256:18c11793647703c158b1f6e62de638acada927344d534e3c7628eedcb746839f", size = 26511 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/48/be/a9ae5f50cad5b6f85bd2574c2c923730098530096e170c1ce7452394d7aa/trio_websocket-0.11.1-py3-none-any.whl", hash = "sha256:520d046b0d030cf970b8b2b2e00c4c2245b3807853ecd44214acd33d74581638", size = 17408 },
]
[[package]]
name = "triton"
version = "3.0.0"
@@ -5198,6 +5236,11 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/ce/d9/5f4c13cecde62396b0d3fe530a50ccea91e7dfc1ccf0e09c228841bb5ba8/urllib3-2.2.3-py3-none-any.whl", hash = "sha256:ca899ca043dcb1bafa3e262d73aa25c465bfb49e0bd9dd5d59f1d0acba2f8fac", size = 126338 },
]
[package.optional-dependencies]
socks = [
{ name = "pysocks" },
]
[[package]]
name = "uv"
version = "0.4.26"
@@ -5278,59 +5321,16 @@ wheels = [
name = "vcrpy"
version = "5.1.0"
source = { registry = "https://pypi.org/simple" }
resolution-markers = [
"python_full_version < '3.11' and platform_python_implementation == 'PyPy' and sys_platform == 'darwin'",
"python_full_version < '3.11' and platform_machine == 'aarch64' and platform_python_implementation == 'PyPy' and sys_platform == 'linux'",
"(python_full_version < '3.11' and platform_machine != 'aarch64' and platform_python_implementation == 'PyPy' and sys_platform == 'linux') or (python_full_version < '3.11' and platform_python_implementation == 'PyPy' and sys_platform != 'darwin' and sys_platform != 'linux')",
"python_full_version == '3.11.*' and platform_python_implementation == 'PyPy' and sys_platform == 'darwin'",
"python_full_version == '3.11.*' and platform_machine == 'aarch64' and platform_python_implementation == 'PyPy' and sys_platform == 'linux'",
"(python_full_version == '3.11.*' and platform_machine != 'aarch64' and platform_python_implementation == 'PyPy' and sys_platform == 'linux') or (python_full_version == '3.11.*' and platform_python_implementation == 'PyPy' and sys_platform != 'darwin' and sys_platform != 'linux')",
"python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_python_implementation == 'PyPy' and sys_platform == 'darwin'",
"python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_machine == 'aarch64' and platform_python_implementation == 'PyPy' and sys_platform == 'linux'",
"(python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_machine != 'aarch64' and platform_python_implementation == 'PyPy' and sys_platform == 'linux') or (python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_python_implementation == 'PyPy' and sys_platform != 'darwin' and sys_platform != 'linux')",
"python_full_version >= '3.12.4' and platform_python_implementation == 'PyPy' and sys_platform == 'darwin'",
"python_full_version >= '3.12.4' and platform_machine == 'aarch64' and platform_python_implementation == 'PyPy' and sys_platform == 'linux'",
"(python_full_version >= '3.12.4' and platform_machine != 'aarch64' and platform_python_implementation == 'PyPy' and sys_platform == 'linux') or (python_full_version >= '3.12.4' and platform_python_implementation == 'PyPy' and sys_platform != 'darwin' and sys_platform != 'linux')",
]
dependencies = [
{ name = "pyyaml", marker = "platform_python_implementation == 'PyPy'" },
{ name = "wrapt", marker = "platform_python_implementation == 'PyPy'" },
{ name = "yarl", marker = "platform_python_implementation == 'PyPy'" },
{ name = "pyyaml" },
{ name = "wrapt" },
{ name = "yarl" },
]
sdist = { url = "https://files.pythonhosted.org/packages/a5/ea/a166a3cce4ac5958ba9bbd9768acdb1ba38ae17ff7986da09fa5b9dbc633/vcrpy-5.1.0.tar.gz", hash = "sha256:bbf1532f2618a04f11bce2a99af3a9647a32c880957293ff91e0a5f187b6b3d2", size = 84576 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/2a/5b/3f70bcb279ad30026cc4f1df0a0491a0205a24dddd88301f396c485de9e7/vcrpy-5.1.0-py2.py3-none-any.whl", hash = "sha256:605e7b7a63dcd940db1df3ab2697ca7faf0e835c0852882142bafb19649d599e", size = 41969 },
]
[[package]]
name = "vcrpy"
version = "7.0.0"
source = { registry = "https://pypi.org/simple" }
resolution-markers = [
"python_full_version < '3.11' and platform_python_implementation != 'PyPy' and sys_platform == 'darwin'",
"python_full_version < '3.11' and platform_machine == 'aarch64' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'",
"(python_full_version < '3.11' and platform_machine != 'aarch64' and platform_python_implementation != 'PyPy' and sys_platform == 'linux') or (python_full_version < '3.11' and platform_python_implementation != 'PyPy' and sys_platform != 'darwin' and sys_platform != 'linux')",
"python_full_version == '3.11.*' and platform_python_implementation != 'PyPy' and sys_platform == 'darwin'",
"python_full_version == '3.11.*' and platform_machine == 'aarch64' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'",
"(python_full_version == '3.11.*' and platform_machine != 'aarch64' and platform_python_implementation != 'PyPy' and sys_platform == 'linux') or (python_full_version == '3.11.*' and platform_python_implementation != 'PyPy' and sys_platform != 'darwin' and sys_platform != 'linux')",
"python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'darwin'",
"python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_machine == 'aarch64' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'",
"(python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_machine != 'aarch64' and platform_python_implementation != 'PyPy' and sys_platform == 'linux') or (python_full_version >= '3.12' and python_full_version < '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform != 'darwin' and sys_platform != 'linux')",
"python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform == 'darwin'",
"python_full_version >= '3.12.4' and platform_machine == 'aarch64' and platform_python_implementation != 'PyPy' and sys_platform == 'linux'",
"(python_full_version >= '3.12.4' and platform_machine != 'aarch64' and platform_python_implementation != 'PyPy' and sys_platform == 'linux') or (python_full_version >= '3.12.4' and platform_python_implementation != 'PyPy' and sys_platform != 'darwin' and sys_platform != 'linux')",
]
dependencies = [
{ name = "pyyaml", marker = "platform_python_implementation != 'PyPy'" },
{ name = "urllib3", marker = "platform_python_implementation != 'PyPy'" },
{ name = "wrapt", marker = "platform_python_implementation != 'PyPy'" },
{ name = "yarl", marker = "platform_python_implementation != 'PyPy'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/25/d3/856e06184d4572aada1dd559ddec3bedc46df1f2edc5ab2c91121a2cccdb/vcrpy-7.0.0.tar.gz", hash = "sha256:176391ad0425edde1680c5b20738ea3dc7fb942520a48d2993448050986b3a50", size = 85502 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/13/5d/1f15b252890c968d42b348d1e9b0aa12d5bf3e776704178ec37cceccdb63/vcrpy-7.0.0-py2.py3-none-any.whl", hash = "sha256:55791e26c18daa363435054d8b35bd41a4ac441b6676167635d1b37a71dbe124", size = 42321 },
]
[[package]]
name = "virtualenv"
version = "20.27.0"
@@ -5550,6 +5550,18 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/ff/21/abdedb4cdf6ff41ebf01a74087740a709e2edb146490e4d9beea054b0b7a/wrapt-1.16.0-py3-none-any.whl", hash = "sha256:6906c4100a8fcbf2fa735f6059214bb13b97f75b1a61777fcf6432121ef12ef1", size = 23362 },
]
[[package]]
name = "wsproto"
version = "1.2.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "h11" },
]
sdist = { url = "https://files.pythonhosted.org/packages/c9/4a/44d3c295350d776427904d73c189e10aeae66d7f555bb2feee16d1e4ba5a/wsproto-1.2.0.tar.gz", hash = "sha256:ad565f26ecb92588a3e43bc3d96164de84cd9902482b130d0ddbaa9664a85065", size = 53425 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/78/58/e860788190eba3bcce367f74d29c4675466ce8dddfba85f7827588416f01/wsproto-1.2.0-py3-none-any.whl", hash = "sha256:b9acddd652b585d75b20477888c56642fdade28bdfd3579aa24a4d2c037dd736", size = 24226 },
]
[[package]]
name = "xlsxwriter"
version = "3.2.0"