Mirror of https://github.com/crewAIInc/crewAI.git (synced 2026-01-03 13:18:29 +00:00)

Compare commits: lorenze/ag... to devin/1746... (10 commits)
| Author | SHA1 | Date |
|---|---|---|
| | 0f0623d31c | |
| | 6db161465b | |
| | 2f7cba6b23 | |
| | 5bc01b15a0 | |
| | cb1a98cabf | |
| | 369e6d109c | |
| | 2c011631f9 | |
| | d3fc2b4477 | |
| | 516d45deaa | |
| | 7ad51d9d05 | |
38 .github/security.md vendored

@@ -1,19 +1,27 @@
CrewAI takes the security of our software products and services seriously, which includes all source code repositories managed through our GitHub organization.

If you believe you have found a security vulnerability in any CrewAI product or service, please report it to us as described below.

## CrewAI Security Vulnerability Reporting Policy

## Reporting a Vulnerability

Please do not report security vulnerabilities through public GitHub issues.

To report a vulnerability, please email us at security@crewai.com.

Please include the requested information listed below so that we can triage your report more quickly

CrewAI prioritizes the security of our software products, services, and GitHub repositories. To promptly address vulnerabilities, follow these steps for reporting security issues:

- Type of issue (e.g. SQL injection, cross-site scripting, etc.)
- Full paths of source file(s) related to the manifestation of the issue
- The location of the affected source code (tag/branch/commit or direct URL)
- Any special configuration required to reproduce the issue
- Step-by-step instructions to reproduce the issue (please include screenshots if needed)
- Proof-of-concept or exploit code (if possible)
- Impact of the issue, including how an attacker might exploit the issue

### Reporting Process

Do **not** report vulnerabilities via public GitHub issues.

Once we have received your report, we will respond to you at the email address you provide. If the issue is confirmed, we will release a patch as soon as possible depending on the complexity of the issue.

Email all vulnerability reports directly to:

**security@crewai.com**

At this time, we are not offering a bug bounty program. Any rewards will be at our discretion.

### Required Information

To help us quickly validate and remediate the issue, your report must include:

- **Vulnerability Type:** Clearly state the vulnerability type (e.g., SQL injection, XSS, privilege escalation).
- **Affected Source Code:** Provide full file paths and direct URLs (branch, tag, or commit).
- **Reproduction Steps:** Include detailed, step-by-step instructions. Screenshots are recommended.
- **Special Configuration:** Document any special settings or configurations required to reproduce.
- **Proof-of-Concept (PoC):** Provide exploit or PoC code (if available).
- **Impact Assessment:** Clearly explain the severity and potential exploitation scenarios.

### Our Response

- We will acknowledge receipt of your report promptly via your provided email.
- Confirmed vulnerabilities will receive priority remediation based on severity.
- Patches will be released as swiftly as possible following verification.

### Reward Notice

Currently, we do not offer a bug bounty program. Rewards, if issued, are discretionary.
8 .github/workflows/tests.yml vendored

@@ -13,8 +13,9 @@ jobs:
    runs-on: ubuntu-latest
    timeout-minutes: 15
    strategy:
      fail-fast: true # Stop testing on subsequent Python versions if an earlier one fails
      matrix:
        python-version: ['3.10', '3.11', '3.12']
        python-version: ['3.10', '3.11', '3.12', '3.13']
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
@@ -28,7 +29,12 @@ jobs:
        run: uv python install ${{ matrix.python-version }}

      - name: Install the project
        if: matrix.python-version != '3.13'
        run: uv sync --dev --all-extras

      - name: Install the project (Python 3.13)
        if: matrix.python-version == '3.13'
        run: uv sync --dev # Skip all-extras for Python 3.13 due to onnxruntime compatibility

      - name: Run tests
        run: uv run pytest --block-network --timeout=60 -vv
22 Dockerfile.test Normal file

@@ -0,0 +1,22 @@
FROM python:3.13-slim-bookworm

WORKDIR /app

# Health check
HEALTHCHECK --interval=30s --timeout=30s --start-period=5s --retries=3 CMD python -c "import crewai" || exit 1

# Improved installation process
RUN pip install --upgrade pip && pip install --no-cache-dir -U pip setuptools wheel && rm -rf /root/.cache/pip/* \
    || { echo 'Installation failed'; exit 1; }

# Copy the current directory contents into the container
COPY . /app/

# Install the package with error handling
RUN pip install -e . || { echo 'Package installation failed'; exit 1; }

# Version confirmation
RUN python -c "import crewai; assert crewai.__version__, 'Version check failed'"

# Test importing the package
CMD ["python", "-c", "import crewai; print(f'Successfully imported crewai version {crewai.__version__}')"]
@@ -397,6 +397,53 @@ result = crew.kickoff(inputs={"question": "What city does John live in and how o
John is 30 years old and lives in San Francisco.
```
</CodeGroup>

## Query Rewriting

CrewAI implements an intelligent query rewriting mechanism to optimize knowledge retrieval. When an agent needs to search through knowledge sources, the raw task prompt is automatically transformed into a more effective search query.

### How Query Rewriting Works

1. When an agent executes a task with knowledge sources available, the `_get_knowledge_search_query` method is triggered
2. The agent's LLM is used to transform the original task prompt into an optimized search query
3. This optimized query is then used to retrieve relevant information from knowledge sources

### Benefits of Query Rewriting

<CardGroup cols={2}>
  <Card title="Improved Retrieval Accuracy" icon="bullseye-arrow">
    By focusing on key concepts and removing irrelevant content, query rewriting helps retrieve more relevant information.
  </Card>
  <Card title="Context Awareness" icon="brain">
    The rewritten queries are designed to be more specific and context-aware for vector database retrieval.
  </Card>
</CardGroup>

### Implementation Details

Query rewriting happens transparently using a system prompt that instructs the LLM to:

- Focus on key words of the intended task
- Make the query more specific and context-aware
- Remove irrelevant content like output format instructions
- Generate only the rewritten query without preamble or postamble
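Conceptually, the rewriting step is a single LLM call with the rewriting instructions as the system message and the task prompt as the user message. The sketch below illustrates that idea; the prompt text and the `rewrite_for_retrieval` helper are illustrative assumptions rather than the library's internal code, though `LLM.call` is the same message-list API the agent uses internally.

```python
from crewai import LLM

# Illustrative stand-in for CrewAI's internal rewriting instructions;
# the real wording lives in the library's translation file.
REWRITER_SYSTEM_PROMPT = (
    "Rewrite the user query so it is optimized for retrieval from a vector "
    "database. Return only the rewritten query, with no preamble or postamble."
)


def rewrite_for_retrieval(task_prompt: str, llm: LLM) -> str:
    """Ask an LLM to turn a raw task prompt into a focused search query."""
    return llm.call(
        [
            {"role": "system", "content": REWRITER_SYSTEM_PROMPT},
            {"role": "user", "content": f"The original query is: {task_prompt}"},
        ]
    )


# Example usage (assumes an API key for the chosen model is configured):
# llm = LLM(model="gpt-4o-mini")
# rewrite_for_retrieval("What movie did John watch last week? Format in JSON.", llm)
```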
<Tip>
This mechanism is fully automatic and requires no configuration from users. The agent's LLM is used to perform the query rewriting, so using a more capable LLM can improve the quality of rewritten queries.
</Tip>

### Example

```python
# Original task prompt
task_prompt = "Answer the following questions about the user's favorite movies: What movie did John watch last week? Format your answer in JSON."

# Behind the scenes, this might be rewritten as:
rewritten_query = "What movies did John watch last week?"
```

The rewritten query is more focused on the core information need and removes irrelevant instructions about output formatting.

## Clearing Knowledge

If you need to clear the knowledge stored in CrewAI, you can use the `crewai reset-memories` command with the `--knowledge` option.
@@ -169,19 +169,55 @@ In this section, you'll find detailed examples that help you select, configure,
    ```
</Accordion>

<Accordion title="Google">
    Set the following environment variables in your `.env` file:
<Accordion title="Google (Gemini API)">
    Set your API key in your `.env` file. If you need a key, or need to find an
    existing key, check [AI Studio](https://aistudio.google.com/apikey).

    ```toml Code
    # Option 1: Gemini accessed with an API key.
    ```toml .env
    # https://ai.google.dev/gemini-api/docs/api-key
    GEMINI_API_KEY=<your-api-key>

    # Option 2: Vertex AI IAM credentials for Gemini, Anthropic, and Model Garden.
    # https://cloud.google.com/vertex-ai/generative-ai/docs/overview
    ```

    Get credentials from your Google Cloud Console and save it to a JSON file with the following code:
    Example usage in your CrewAI project:
    ```python Code
    from crewai import LLM

    llm = LLM(
        model="gemini/gemini-2.0-flash",
        temperature=0.7,
    )
    ```
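For a slightly fuller picture, here is a hedged sketch of wiring that LLM into an agent. The `Agent` fields shown (`role`, `goal`, `backstory`, `llm`) are the standard CrewAI constructor arguments; the specific strings are placeholders.

```python
from crewai import Agent, LLM

# Assumes GEMINI_API_KEY is set in the environment as described above.
gemini_llm = LLM(
    model="gemini/gemini-2.0-flash",
    temperature=0.7,
)

# Placeholder role/goal/backstory, used only to illustrate passing the LLM.
researcher = Agent(
    role="Research Analyst",
    goal="Summarize recent findings on a given topic",
    backstory="You are a meticulous analyst who cites sources.",
    llm=gemini_llm,
)
```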
### Gemini models

Google offers a range of powerful models optimized for different use cases.

| Model                          | Context Window | Best For |
|--------------------------------|----------------|----------|
| gemini-2.5-flash-preview-04-17 | 1M tokens      | Adaptive thinking, cost efficiency |
| gemini-2.5-pro-preview-05-06   | 1M tokens      | Enhanced thinking and reasoning, multimodal understanding, advanced coding, and more |
| gemini-2.0-flash               | 1M tokens      | Next generation features, speed, thinking, and realtime streaming |
| gemini-2.0-flash-lite          | 1M tokens      | Cost efficiency and low latency |
| gemini-1.5-flash               | 1M tokens      | Balanced multimodal model, good for most tasks |
| gemini-1.5-flash-8B            | 1M tokens      | Fastest, most cost-efficient, good for high-frequency tasks |
| gemini-1.5-pro                 | 2M tokens      | Best performing, wide variety of reasoning tasks including logical reasoning, coding, and creative collaboration |

The full list of models is available in the [Gemini model docs](https://ai.google.dev/gemini-api/docs/models).

### Gemma

The Gemini API also allows you to use your API key to access [Gemma models](https://ai.google.dev/gemma/docs) hosted on Google infrastructure.

| Model          | Context Window |
|----------------|----------------|
| gemma-3-1b-it  | 32k tokens     |
| gemma-3-4b-it  | 32k tokens     |
| gemma-3-12b-it | 32k tokens     |
| gemma-3-27b-it | 128k tokens    |

</Accordion>
<Accordion title="Google (Vertex AI)">
    Get credentials from your Google Cloud Console and save it to a JSON file, then load it with the following code:
    ```python Code
    import json

@@ -205,14 +241,18 @@ In this section, you'll find detailed examples that help you select, configure,
        vertex_credentials=vertex_credentials_json
    )
    ```
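The diff elides the middle of that snippet, so as a hedged reconstruction, loading the service-account key typically looks something like the sketch below. The file path, model name, and temperature are illustrative assumptions; `vertex_credentials` is the parameter visible in the snippet above.

```python
import json

from crewai import LLM

# Placeholder path to a Google Cloud service-account key file.
file_path = "path/to/vertex_ai_service_account.json"

# Read the key file and re-serialize it as a JSON string for the LLM client.
with open(file_path, "r") as file:
    vertex_credentials = json.load(file)

vertex_credentials_json = json.dumps(vertex_credentials)

llm = LLM(
    model="gemini-1.5-pro",
    temperature=0.7,
    vertex_credentials=vertex_credentials_json,
)
```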
Google offers a range of powerful models optimized for different use cases:

| Model                 | Context Window | Best For |
|-----------------------|----------------|----------|
| gemini-2.0-flash-exp  | 1M tokens      | Higher quality at faster speed, multimodal model, good for most tasks |
| gemini-1.5-flash      | 1M tokens      | Balanced multimodal model, good for most tasks |
| gemini-1.5-flash-8B   | 1M tokens      | Fastest, most cost-efficient, good for high-frequency tasks |
| gemini-1.5-pro        | 2M tokens      | Best performing, wide variety of reasoning tasks including logical reasoning, coding, and creative collaboration |
| Model                          | Context Window | Best For |
|--------------------------------|----------------|----------|
| gemini-2.5-flash-preview-04-17 | 1M tokens      | Adaptive thinking, cost efficiency |
| gemini-2.5-pro-preview-05-06   | 1M tokens      | Enhanced thinking and reasoning, multimodal understanding, advanced coding, and more |
| gemini-2.0-flash               | 1M tokens      | Next generation features, speed, thinking, and realtime streaming |
| gemini-2.0-flash-lite          | 1M tokens      | Cost efficiency and low latency |
| gemini-1.5-flash               | 1M tokens      | Balanced multimodal model, good for most tasks |
| gemini-1.5-flash-8B            | 1M tokens      | Fastest, most cost-efficient, good for high-frequency tasks |
| gemini-1.5-pro                 | 2M tokens      | Best performing, wide variety of reasoning tasks including logical reasoning, coding, and creative collaboration |
</Accordion>

<Accordion title="Azure">
@@ -68,7 +68,13 @@ We'll create a CrewAI application where two agents collaborate to research and w
```python
from crewai import Agent, Crew, Process, Task
from crewai_tools import SerperDevTool
from openinference.instrumentation.crewai import CrewAIInstrumentor
from phoenix.otel import register

# setup monitoring for your crew
tracer_provider = register(
    endpoint="http://localhost:6006/v1/traces")
CrewAIInstrumentor().instrument(skip_dep_check=True, tracer_provider=tracer_provider)
search_tool = SerperDevTool()

# Define your agents with roles and goals
@@ -1,9 +1,9 @@
[project]
name = "crewai"
version = "0.118.0"
version = "0.119.0"
description = "Cutting-edge framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks."
readme = "README.md"
requires-python = ">=3.10,<3.13"
requires-python = ">=3.10,<3.14"
authors = [
    { name = "Joao Moura", email = "joao@crewai.com" }
]
@@ -45,7 +45,7 @@ Documentation = "https://docs.crewai.com"
Repository = "https://github.com/crewAIInc/crewAI"

[project.optional-dependencies]
tools = ["crewai-tools~=0.42.2"]
tools = ["crewai-tools~=0.44.0"]
embeddings = [
    "tiktoken~=0.7.0"
]
@@ -17,7 +17,7 @@ warnings.filterwarnings(
    category=UserWarning,
    module="pydantic.main",
)
__version__ = "0.118.0"
__version__ = "0.119.0"
__all__ = [
    "Agent",
    "Crew",
@@ -31,6 +31,14 @@ from crewai.utilities.events.agent_events import (
    AgentExecutionStartedEvent,
)
from crewai.utilities.events.crewai_event_bus import crewai_event_bus
from crewai.utilities.events.knowledge_events import (
    KnowledgeQueryCompletedEvent,
    KnowledgeQueryFailedEvent,
    KnowledgeQueryStartedEvent,
    KnowledgeRetrievalCompletedEvent,
    KnowledgeRetrievalStartedEvent,
    KnowledgeSearchQueryFailedEvent,
)
from crewai.utilities.llm_utils import create_llm
from crewai.utilities.token_counter_callback import TokenCalcHandler
from crewai.utilities.training_handler import CrewTrainingHandler
@@ -122,6 +130,10 @@ class Agent(BaseAgent):
        default=None,
        description="Knowledge context for the crew.",
    )
    knowledge_search_query: Optional[str] = Field(
        default=None,
        description="Knowledge search query for the agent dynamically generated by the agent.",
    )

    @model_validator(mode="after")
    def post_init_setup(self):
@@ -185,7 +197,7 @@ class Agent(BaseAgent):
        self,
        task: Task,
        context: Optional[str] = None,
        tools: Optional[List[BaseTool]] = None
        tools: Optional[List[BaseTool]] = None,
    ) -> str:
        """Execute a task with the agent.
@@ -245,27 +257,65 @@ class Agent(BaseAgent):
        knowledge_config = (
            self.knowledge_config.model_dump() if self.knowledge_config else {}
        )
        if self.knowledge:
            agent_knowledge_snippets = self.knowledge.query(
                [task.prompt()], **knowledge_config
            )
            if agent_knowledge_snippets:
                self.agent_knowledge_context = extract_knowledge_context(
                    agent_knowledge_snippets
                )
                if self.agent_knowledge_context:
                    task_prompt += self.agent_knowledge_context

        if self.crew:
            knowledge_snippets = self.crew.query_knowledge(
                [task.prompt()], **knowledge_config
        if self.knowledge:
            crewai_event_bus.emit(
                self,
                event=KnowledgeRetrievalStartedEvent(
                    agent=self,
                ),
            )
            if knowledge_snippets:
                self.crew_knowledge_context = extract_knowledge_context(
                    knowledge_snippets
            try:
                self.knowledge_search_query = self._get_knowledge_search_query(
                    task_prompt
                )
                if self.knowledge_search_query:
                    agent_knowledge_snippets = self.knowledge.query(
                        [self.knowledge_search_query], **knowledge_config
                    )
                    if agent_knowledge_snippets:
                        self.agent_knowledge_context = extract_knowledge_context(
                            agent_knowledge_snippets
                        )
                        if self.agent_knowledge_context:
                            task_prompt += self.agent_knowledge_context
                    if self.crew:
                        knowledge_snippets = self.crew.query_knowledge(
                            [self.knowledge_search_query], **knowledge_config
                        )
                        if knowledge_snippets:
                            self.crew_knowledge_context = extract_knowledge_context(
                                knowledge_snippets
                            )
                            if self.crew_knowledge_context:
                                task_prompt += self.crew_knowledge_context

                crewai_event_bus.emit(
                    self,
                    event=KnowledgeRetrievalCompletedEvent(
                        query=self.knowledge_search_query,
                        agent=self,
                        retrieved_knowledge=(
                            (self.agent_knowledge_context or "")
                            + (
                                "\n"
                                if self.agent_knowledge_context
                                and self.crew_knowledge_context
                                else ""
                            )
                            + (self.crew_knowledge_context or "")
                        ),
                    ),
                )
            except Exception as e:
                crewai_event_bus.emit(
                    self,
                    event=KnowledgeSearchQueryFailedEvent(
                        query=self.knowledge_search_query or "",
                        agent=self,
                        error=str(e),
                    ),
                )
            if self.crew_knowledge_context:
                task_prompt += self.crew_knowledge_context

        tools = tools or self.tools or []
        self.create_agent_executor(tools=tools, task=task)
@@ -288,12 +338,19 @@ class Agent(BaseAgent):

            # Determine execution method based on timeout setting
            if self.max_execution_time is not None:
                if not isinstance(self.max_execution_time, int) or self.max_execution_time <= 0:
                    raise ValueError("Max Execution time must be a positive integer greater than zero")
                result = self._execute_with_timeout(task_prompt, task, self.max_execution_time)
                if (
                    not isinstance(self.max_execution_time, int)
                    or self.max_execution_time <= 0
                ):
                    raise ValueError(
                        "Max Execution time must be a positive integer greater than zero"
                    )
                result = self._execute_with_timeout(
                    task_prompt, task, self.max_execution_time
                )
            else:
                result = self._execute_without_timeout(task_prompt, task)

        except TimeoutError as e:
            # Propagate TimeoutError without retry
            crewai_event_bus.emit(
@@ -345,54 +402,46 @@ class Agent(BaseAgent):
            )
            return result

    def _execute_with_timeout(
        self,
        task_prompt: str,
        task: Task,
        timeout: int
    ) -> str:
    def _execute_with_timeout(self, task_prompt: str, task: Task, timeout: int) -> str:
        """Execute a task with a timeout.

        Args:
            task_prompt: The prompt to send to the agent.
            task: The task being executed.
            timeout: Maximum execution time in seconds.

        Returns:
            The output of the agent.

        Raises:
            TimeoutError: If execution exceeds the timeout.
            RuntimeError: If execution fails for other reasons.
        """
        import concurrent.futures

        with concurrent.futures.ThreadPoolExecutor() as executor:
            future = executor.submit(
                self._execute_without_timeout,
                task_prompt=task_prompt,
                task=task
                self._execute_without_timeout, task_prompt=task_prompt, task=task
            )

            try:
                return future.result(timeout=timeout)
            except concurrent.futures.TimeoutError:
                future.cancel()
                raise TimeoutError(f"Task '{task.description}' execution timed out after {timeout} seconds. Consider increasing max_execution_time or optimizing the task.")
                raise TimeoutError(
                    f"Task '{task.description}' execution timed out after {timeout} seconds. Consider increasing max_execution_time or optimizing the task."
                )
            except Exception as e:
                future.cancel()
                raise RuntimeError(f"Task execution failed: {str(e)}")

    def _execute_without_timeout(
        self,
        task_prompt: str,
        task: Task
    ) -> str:
    def _execute_without_timeout(self, task_prompt: str, task: Task) -> str:
        """Execute a task without a timeout.

        Args:
            task_prompt: The prompt to send to the agent.
            task: The task being executed.

        Returns:
            The output of the agent.
        """
@@ -560,6 +609,61 @@ class Agent(BaseAgent):
    def set_fingerprint(self, fingerprint: Fingerprint):
        self.security_config.fingerprint = fingerprint

    def _get_knowledge_search_query(self, task_prompt: str) -> str | None:
        """Generate a search query for the knowledge base based on the task description."""
        crewai_event_bus.emit(
            self,
            event=KnowledgeQueryStartedEvent(
                task_prompt=task_prompt,
                agent=self,
            ),
        )
        query = self.i18n.slice("knowledge_search_query").format(
            task_prompt=task_prompt
        )
        rewriter_prompt = self.i18n.slice("knowledge_search_query_system_prompt")
        if not isinstance(self.llm, BaseLLM):
            self._logger.log(
                "warning",
                f"Knowledge search query failed: LLM for agent '{self.role}' is not an instance of BaseLLM",
            )
            crewai_event_bus.emit(
                self,
                event=KnowledgeQueryFailedEvent(
                    agent=self,
                    error="LLM is not compatible with knowledge search queries",
                ),
            )
            return None

        try:
            rewritten_query = self.llm.call(
                [
                    {
                        "role": "system",
                        "content": rewriter_prompt,
                    },
                    {"role": "user", "content": query},
                ]
            )
            crewai_event_bus.emit(
                self,
                event=KnowledgeQueryCompletedEvent(
                    query=query,
                    agent=self,
                ),
            )
            return rewritten_query
        except Exception as e:
            crewai_event_bus.emit(
                self,
                event=KnowledgeQueryFailedEvent(
                    agent=self,
                    error=str(e),
                ),
            )
            return None

    def kickoff(
        self,
        messages: Union[str, List[Dict[str, str]]],
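The timeout path above is driven by the agent's `max_execution_time` setting: a positive integer number of seconds, validated before execution and enforced via the thread-pool future. As a hedged illustration (the role, goal, task text, and the 60-second limit are placeholders), configuring it looks roughly like this:

```python
from crewai import Agent, Crew, Task

# If a single task execution exceeds max_execution_time (seconds),
# _execute_with_timeout raises a TimeoutError that is not retried.
agent = Agent(
    role="Researcher",
    goal="Answer questions concisely",
    backstory="A careful researcher.",
    max_execution_time=60,
)

task = Task(
    description="Summarize the main idea of the provided notes.",
    expected_output="A two-sentence summary.",
    agent=agent,
)

crew = Crew(agents=[agent], tasks=[task])
# result = crew.kickoff()  # raises TimeoutError if the task runs past 60 seconds
```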
@@ -13,7 +13,7 @@ ENV_VARS = {
    ],
    "gemini": [
        {
            "prompt": "Enter your GEMINI API key (press Enter to skip)",
            "prompt": "Enter your GEMINI API key from https://ai.dev/apikey (press Enter to skip)",
            "key_name": "GEMINI_API_KEY",
        }
    ],
@@ -3,9 +3,9 @@ name = "{{folder_name}}"
version = "0.1.0"
description = "{{name}} using crewAI"
authors = [{ name = "Your Name", email = "you@example.com" }]
requires-python = ">=3.10,<3.13"
requires-python = ">=3.10,<3.14"
dependencies = [
    "crewai[tools]>=0.118.0,<1.0.0"
    "crewai[tools]>=0.119.0,<1.0.0"
]

[project.scripts]
@@ -3,9 +3,9 @@ name = "{{folder_name}}"
version = "0.1.0"
description = "{{name}} using crewAI"
authors = [{ name = "Your Name", email = "you@example.com" }]
requires-python = ">=3.10,<3.13"
requires-python = ">=3.10,<3.14"
dependencies = [
    "crewai[tools]>=0.118.0,<1.0.0",
    "crewai[tools]>=0.119.0,<1.0.0",
]

[project.scripts]
@@ -3,9 +3,9 @@ name = "{{folder_name}}"
version = "0.1.0"
description = "Power up your crews with {{folder_name}}"
readme = "README.md"
requires-python = ">=3.10,<3.13"
requires-python = ">=3.10,<3.14"
dependencies = [
    "crewai[tools]>=0.118.0"
    "crewai[tools]>=0.119.0"
]

[tool.crewai]
@@ -27,7 +27,9 @@
    "feedback_instructions": "User feedback: {feedback}\nInstructions: Use this feedback to enhance the next output iteration.\nNote: Do not respond or add commentary.",
    "lite_agent_system_prompt_with_tools": "You are {role}. {backstory}\nYour personal goal is: {goal}\n\nYou ONLY have access to the following tools, and should NEVER make up tools that are not listed here:\n\n{tools}\n\nIMPORTANT: Use the following format in your response:\n\n```\nThought: you should always think about what to do\nAction: the action to take, only one name of [{tool_names}], just the name, exactly as it's written.\nAction Input: the input to the action, just a simple JSON object, enclosed in curly braces, using \" to wrap keys and values.\nObservation: the result of the action\n```\n\nOnce all necessary information is gathered, return the following format:\n\n```\nThought: I now know the final answer\nFinal Answer: the final answer to the original input question\n```",
    "lite_agent_system_prompt_without_tools": "You are {role}. {backstory}\nYour personal goal is: {goal}\n\nTo give my best complete final answer to the task respond using the exact following format:\n\nThought: I now can give a great answer\nFinal Answer: Your final answer must be the great and the most complete as possible, it must be outcome described.\n\nI MUST use these formats, my job depends on it!",
    "lite_agent_response_format": "\nIMPORTANT: Your final answer MUST contain all the information requested in the following format: {response_format}\n\nIMPORTANT: Ensure the final output does not include any code block markers like ```json or ```python."
    "lite_agent_response_format": "\nIMPORTANT: Your final answer MUST contain all the information requested in the following format: {response_format}\n\nIMPORTANT: Ensure the final output does not include any code block markers like ```json or ```python.",
    "knowledge_search_query": "The original query is: {task_prompt}.",
    "knowledge_search_query_system_prompt": "Your goal is to rewrite the user query so that it is optimized for retrieval from a vector database. Consider how the query will be used to find relevant documents, and aim to make it more specific and context-aware. \n\n Do not include any other text than the rewritten query, especially any preamble or postamble and only add expected output format if its relevant to the rewritten query. \n\n Focus on the key words of the intended task and to retrieve the most relevant information. \n\n There will be some extra context provided that might need to be removed such as expected_output formats structured_outputs and other instructions."
  },
  "errors": {
    "force_final_answer_error": "You can't keep going, here is the best final answer you generated:\n\n {formatted_answer}",
@@ -8,6 +8,14 @@ from crewai.telemetry.telemetry import Telemetry
from crewai.utilities import Logger
from crewai.utilities.constants import EMITTER_COLOR
from crewai.utilities.events.base_event_listener import BaseEventListener
from crewai.utilities.events.knowledge_events import (
    KnowledgeQueryCompletedEvent,
    KnowledgeQueryFailedEvent,
    KnowledgeQueryStartedEvent,
    KnowledgeRetrievalCompletedEvent,
    KnowledgeRetrievalStartedEvent,
    KnowledgeSearchQueryFailedEvent,
)
from crewai.utilities.events.llm_events import (
    LLMCallCompletedEvent,
    LLMCallFailedEvent,
@@ -57,6 +65,8 @@ class EventListener(BaseEventListener):
    execution_spans: Dict[Task, Any] = Field(default_factory=dict)
    next_chunk = 0
    text_stream = StringIO()
    knowledge_retrieval_in_progress = False
    knowledge_query_in_progress = False

    def __new__(cls):
        if cls._instance is None:
@@ -342,5 +352,59 @@ class EventListener(BaseEventListener):
        def on_crew_test_failed(source, event: CrewTestFailedEvent):
            self.formatter.handle_crew_test_failed(event.crew_name or "Crew")

        @crewai_event_bus.on(KnowledgeRetrievalStartedEvent)
        def on_knowledge_retrieval_started(
            source, event: KnowledgeRetrievalStartedEvent
        ):
            if self.knowledge_retrieval_in_progress:
                return

            self.knowledge_retrieval_in_progress = True

            self.formatter.handle_knowledge_retrieval_started(
                self.formatter.current_agent_branch,
                self.formatter.current_crew_tree,
            )

        @crewai_event_bus.on(KnowledgeRetrievalCompletedEvent)
        def on_knowledge_retrieval_completed(
            source, event: KnowledgeRetrievalCompletedEvent
        ):
            if not self.knowledge_retrieval_in_progress:
                return

            self.knowledge_retrieval_in_progress = False
            self.formatter.handle_knowledge_retrieval_completed(
                self.formatter.current_agent_branch,
                self.formatter.current_crew_tree,
                event.retrieved_knowledge,
            )

        @crewai_event_bus.on(KnowledgeQueryStartedEvent)
        def on_knowledge_query_started(source, event: KnowledgeQueryStartedEvent):
            pass

        @crewai_event_bus.on(KnowledgeQueryFailedEvent)
        def on_knowledge_query_failed(source, event: KnowledgeQueryFailedEvent):
            self.formatter.handle_knowledge_query_failed(
                self.formatter.current_agent_branch,
                event.error,
                self.formatter.current_crew_tree,
            )

        @crewai_event_bus.on(KnowledgeQueryCompletedEvent)
        def on_knowledge_query_completed(source, event: KnowledgeQueryCompletedEvent):
            pass

        @crewai_event_bus.on(KnowledgeSearchQueryFailedEvent)
        def on_knowledge_search_query_failed(
            source, event: KnowledgeSearchQueryFailedEvent
        ):
            self.formatter.handle_knowledge_search_query_failed(
                self.formatter.current_agent_branch,
                event.error,
                self.formatter.current_crew_tree,
            )


event_listener = EventListener()
56 src/crewai/utilities/events/knowledge_events.py Normal file

@@ -0,0 +1,56 @@
from typing import TYPE_CHECKING, Any

from crewai.agents.agent_builder.base_agent import BaseAgent
from crewai.utilities.events.base_events import BaseEvent

if TYPE_CHECKING:
    from crewai.agents.agent_builder.base_agent import BaseAgent


class KnowledgeRetrievalStartedEvent(BaseEvent):
    """Event emitted when a knowledge retrieval is started."""

    type: str = "knowledge_search_query_started"
    agent: BaseAgent


class KnowledgeRetrievalCompletedEvent(BaseEvent):
    """Event emitted when a knowledge retrieval is completed."""

    query: str
    type: str = "knowledge_search_query_completed"
    agent: BaseAgent
    retrieved_knowledge: Any


class KnowledgeQueryStartedEvent(BaseEvent):
    """Event emitted when a knowledge query is started."""

    task_prompt: str
    type: str = "knowledge_query_started"
    agent: BaseAgent


class KnowledgeQueryFailedEvent(BaseEvent):
    """Event emitted when a knowledge query fails."""

    type: str = "knowledge_query_failed"
    agent: BaseAgent
    error: str


class KnowledgeQueryCompletedEvent(BaseEvent):
    """Event emitted when a knowledge query is completed."""

    query: str
    type: str = "knowledge_query_completed"
    agent: BaseAgent


class KnowledgeSearchQueryFailedEvent(BaseEvent):
    """Event emitted when a knowledge search query fails."""

    query: str
    type: str = "knowledge_search_query_failed"
    agent: BaseAgent
    error: str
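These events are consumed by the built-in console listener in the next diff, but a downstream project could also subscribe to them directly. The following is a hedged sketch of a minimal custom listener, modeled on the `BaseEventListener` and `crewai_event_bus.on` pattern used in this changeset; the class name, `setup_listeners` hook, and print statements are illustrative, not part of this PR.

```python
from crewai.utilities.events.base_event_listener import BaseEventListener
from crewai.utilities.events.knowledge_events import (
    KnowledgeRetrievalCompletedEvent,
    KnowledgeSearchQueryFailedEvent,
)


class KnowledgeLoggingListener(BaseEventListener):
    """Example listener that logs knowledge retrieval activity."""

    def setup_listeners(self, crewai_event_bus):
        @crewai_event_bus.on(KnowledgeRetrievalCompletedEvent)
        def on_retrieval_completed(source, event: KnowledgeRetrievalCompletedEvent):
            # event.query is the rewritten search query;
            # event.retrieved_knowledge is the combined context text.
            print(f"Knowledge retrieved for query: {event.query!r}")

        @crewai_event_bus.on(KnowledgeSearchQueryFailedEvent)
        def on_search_query_failed(source, event: KnowledgeSearchQueryFailedEvent):
            print(f"Knowledge search failed: {event.error}")


# Instantiating the listener registers the handlers on the shared event bus.
knowledge_logger = KnowledgeLoggingListener()
```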
@@ -783,3 +783,202 @@ class ConsoleFormatter:
|
||||
self.update_lite_agent_status(
|
||||
self.current_lite_agent_branch, lite_agent_role, status, **fields
|
||||
)
|
||||
|
||||
def handle_knowledge_retrieval_started(
|
||||
self,
|
||||
agent_branch: Optional[Tree],
|
||||
crew_tree: Optional[Tree],
|
||||
) -> Optional[Tree]:
|
||||
"""Handle knowledge retrieval started event."""
|
||||
if not self.verbose:
|
||||
return None
|
||||
|
||||
branch_to_use = agent_branch or self.current_lite_agent_branch
|
||||
tree_to_use = branch_to_use or crew_tree
|
||||
|
||||
if branch_to_use is None or tree_to_use is None:
|
||||
# If we don't have a valid branch, use crew_tree as the branch if available
|
||||
if crew_tree is not None:
|
||||
branch_to_use = tree_to_use = crew_tree
|
||||
else:
|
||||
return None
|
||||
|
||||
knowledge_branch = branch_to_use.add("")
|
||||
self.update_tree_label(
|
||||
knowledge_branch, "🔍", "Knowledge Retrieval Started", "blue"
|
||||
)
|
||||
|
||||
self.print(tree_to_use)
|
||||
self.print()
|
||||
return knowledge_branch
|
||||
|
||||
def handle_knowledge_retrieval_completed(
|
||||
self,
|
||||
agent_branch: Optional[Tree],
|
||||
crew_tree: Optional[Tree],
|
||||
retrieved_knowledge: Any,
|
||||
) -> None:
|
||||
"""Handle knowledge retrieval completed event."""
|
||||
if not self.verbose:
|
||||
return None
|
||||
|
||||
branch_to_use = self.current_lite_agent_branch or agent_branch
|
||||
tree_to_use = branch_to_use or crew_tree
|
||||
|
||||
if branch_to_use is None and tree_to_use is not None:
|
||||
branch_to_use = tree_to_use
|
||||
|
||||
if branch_to_use is None or tree_to_use is None:
|
||||
if retrieved_knowledge:
|
||||
knowledge_text = str(retrieved_knowledge)
|
||||
if len(knowledge_text) > 500:
|
||||
knowledge_text = knowledge_text[:497] + "..."
|
||||
|
||||
knowledge_panel = Panel(
|
||||
Text(knowledge_text, style="white"),
|
||||
title="📚 Retrieved Knowledge",
|
||||
border_style="green",
|
||||
padding=(1, 2),
|
||||
)
|
||||
self.print(knowledge_panel)
|
||||
self.print()
|
||||
return None
|
||||
|
||||
knowledge_branch_found = False
|
||||
for child in branch_to_use.children:
|
||||
if "Knowledge Retrieval Started" in str(child.label):
|
||||
self.update_tree_label(
|
||||
child, "✅", "Knowledge Retrieval Completed", "green"
|
||||
)
|
||||
knowledge_branch_found = True
|
||||
break
|
||||
|
||||
if not knowledge_branch_found:
|
||||
for child in branch_to_use.children:
|
||||
if (
|
||||
"Knowledge Retrieval" in str(child.label)
|
||||
and "Started" not in str(child.label)
|
||||
and "Completed" not in str(child.label)
|
||||
):
|
||||
self.update_tree_label(
|
||||
child, "✅", "Knowledge Retrieval Completed", "green"
|
||||
)
|
||||
knowledge_branch_found = True
|
||||
break
|
||||
|
||||
if not knowledge_branch_found:
|
||||
knowledge_branch = branch_to_use.add("")
|
||||
self.update_tree_label(
|
||||
knowledge_branch, "✅", "Knowledge Retrieval Completed", "green"
|
||||
)
|
||||
|
||||
self.print(tree_to_use)
|
||||
|
||||
if retrieved_knowledge:
|
||||
knowledge_text = str(retrieved_knowledge)
|
||||
if len(knowledge_text) > 500:
|
||||
knowledge_text = knowledge_text[:497] + "..."
|
||||
|
||||
knowledge_panel = Panel(
|
||||
Text(knowledge_text, style="white"),
|
||||
title="📚 Retrieved Knowledge",
|
||||
border_style="green",
|
||||
padding=(1, 2),
|
||||
)
|
||||
self.print(knowledge_panel)
|
||||
|
||||
self.print()
|
||||
|
||||
def handle_knowledge_query_started(
|
||||
self,
|
||||
agent_branch: Optional[Tree],
|
||||
task_prompt: str,
|
||||
crew_tree: Optional[Tree],
|
||||
) -> None:
|
||||
"""Handle knowledge query generated event."""
|
||||
if not self.verbose:
|
||||
return None
|
||||
|
||||
branch_to_use = self.current_lite_agent_branch or agent_branch
|
||||
tree_to_use = branch_to_use or crew_tree
|
||||
if branch_to_use is None or tree_to_use is None:
|
||||
return None
|
||||
|
||||
query_branch = branch_to_use.add("")
|
||||
self.update_tree_label(
|
||||
query_branch, "🔎", f"Query: {task_prompt[:50]}...", "yellow"
|
||||
)
|
||||
|
||||
self.print(tree_to_use)
|
||||
self.print()
|
||||
|
||||
def handle_knowledge_query_failed(
|
||||
self,
|
||||
agent_branch: Optional[Tree],
|
||||
error: str,
|
||||
crew_tree: Optional[Tree],
|
||||
) -> None:
|
||||
"""Handle knowledge query failed event."""
|
||||
if not self.verbose:
|
||||
return
|
||||
|
||||
tree_to_use = self.current_lite_agent_branch or crew_tree
|
||||
branch_to_use = self.current_lite_agent_branch or agent_branch
|
||||
|
||||
if branch_to_use and tree_to_use:
|
||||
query_branch = branch_to_use.add("")
|
||||
self.update_tree_label(query_branch, "❌", "Knowledge Query Failed", "red")
|
||||
self.print(tree_to_use)
|
||||
self.print()
|
||||
|
||||
# Show error panel
|
||||
error_content = self.create_status_content(
|
||||
"Knowledge Query Failed", "Query Error", "red", Error=error
|
||||
)
|
||||
self.print_panel(error_content, "Knowledge Error", "red")
|
||||
|
||||
def handle_knowledge_query_completed(
|
||||
self,
|
||||
agent_branch: Optional[Tree],
|
||||
crew_tree: Optional[Tree],
|
||||
) -> None:
|
||||
"""Handle knowledge query completed event."""
|
||||
if not self.verbose:
|
||||
return None
|
||||
|
||||
branch_to_use = self.current_lite_agent_branch or agent_branch
|
||||
tree_to_use = branch_to_use or crew_tree
|
||||
|
||||
if branch_to_use is None or tree_to_use is None:
|
||||
return None
|
||||
|
||||
query_branch = branch_to_use.add("")
|
||||
self.update_tree_label(query_branch, "✅", "Knowledge Query Completed", "green")
|
||||
|
||||
self.print(tree_to_use)
|
||||
self.print()
|
||||
|
||||
def handle_knowledge_search_query_failed(
|
||||
self,
|
||||
agent_branch: Optional[Tree],
|
||||
error: str,
|
||||
crew_tree: Optional[Tree],
|
||||
) -> None:
|
||||
"""Handle knowledge search query failed event."""
|
||||
if not self.verbose:
|
||||
return
|
||||
|
||||
tree_to_use = self.current_lite_agent_branch or crew_tree
|
||||
branch_to_use = self.current_lite_agent_branch or agent_branch
|
||||
|
||||
if branch_to_use and tree_to_use:
|
||||
query_branch = branch_to_use.add("")
|
||||
self.update_tree_label(query_branch, "❌", "Knowledge Search Failed", "red")
|
||||
self.print(tree_to_use)
|
||||
self.print()
|
||||
|
||||
# Show error panel
|
||||
error_content = self.create_status_content(
|
||||
"Knowledge Search Failed", "Search Error", "red", Error=error
|
||||
)
|
||||
self.print_panel(error_content, "Search Error", "red")
|
||||
|
||||
@@ -9,7 +9,6 @@ import pytest
|
||||
from crewai import Agent, Crew, Task
|
||||
from crewai.agents.cache import CacheHandler
|
||||
from crewai.agents.crew_agent_executor import AgentFinish, CrewAgentExecutor
|
||||
from crewai.agents.parser import CrewAgentParser, OutputParserException
|
||||
from crewai.knowledge.knowledge import Knowledge
|
||||
from crewai.knowledge.knowledge_config import KnowledgeConfig
|
||||
from crewai.knowledge.source.base_knowledge_source import BaseKnowledgeSource
|
||||
@@ -73,6 +72,7 @@ def test_agent_creation():
|
||||
assert agent.goal == "test goal"
|
||||
assert agent.backstory == "test backstory"
|
||||
|
||||
|
||||
def test_agent_with_only_system_template():
|
||||
"""Test that an agent with only system_template works without errors."""
|
||||
agent = Agent(
|
||||
@@ -88,6 +88,7 @@ def test_agent_with_only_system_template():
|
||||
assert agent.goal == "Test Goal"
|
||||
assert agent.backstory == "Test Backstory"
|
||||
|
||||
|
||||
def test_agent_with_only_prompt_template():
|
||||
"""Test that an agent with only system_template works without errors."""
|
||||
agent = Agent(
|
||||
@@ -119,7 +120,8 @@ def test_agent_with_missing_response_template():
|
||||
assert agent.role == "Test Role"
|
||||
assert agent.goal == "Test Goal"
|
||||
assert agent.backstory == "Test Backstory"
|
||||
|
||||
|
||||
|
||||
def test_agent_default_values():
|
||||
agent = Agent(role="test role", goal="test goal", backstory="test backstory")
|
||||
assert agent.llm.model == "gpt-4o-mini"
|
||||
@@ -1630,13 +1632,10 @@ def test_agent_with_knowledge_sources():
|
||||
# Create a knowledge source with some content
|
||||
content = "Brandon's favorite color is red and he likes Mexican food."
|
||||
string_source = StringKnowledgeSource(content=content)
|
||||
|
||||
with patch(
|
||||
"crewai.knowledge.storage.knowledge_storage.KnowledgeStorage"
|
||||
) as MockKnowledge:
|
||||
with patch("crewai.knowledge") as MockKnowledge:
|
||||
mock_knowledge_instance = MockKnowledge.return_value
|
||||
mock_knowledge_instance.sources = [string_source]
|
||||
mock_knowledge_instance.query.return_value = [{"content": content}]
|
||||
mock_knowledge_instance.search.return_value = [{"content": content}]
|
||||
|
||||
agent = Agent(
|
||||
role="Information Agent",
|
||||
@@ -1690,7 +1689,7 @@ def test_agent_with_knowledge_sources_with_query_limit_and_score_threshold():
|
||||
|
||||
assert agent.knowledge is not None
|
||||
mock_knowledge_query.assert_called_once_with(
|
||||
[task.prompt()],
|
||||
["Brandon's favorite color"],
|
||||
**knowledge_config.model_dump(),
|
||||
)
|
||||
|
||||
@@ -1727,7 +1726,7 @@ def test_agent_with_knowledge_sources_with_query_limit_and_score_threshold_defau
|
||||
|
||||
assert agent.knowledge is not None
|
||||
mock_knowledge_query.assert_called_once_with(
|
||||
[task.prompt()],
|
||||
["Brandon's favorite color"],
|
||||
**knowledge_config.model_dump(),
|
||||
)
|
||||
|
||||
@@ -1737,9 +1736,7 @@ def test_agent_with_knowledge_sources_extensive_role():
|
||||
content = "Brandon's favorite color is red and he likes Mexican food."
|
||||
string_source = StringKnowledgeSource(content=content)
|
||||
|
||||
with patch(
|
||||
"crewai.knowledge.storage.knowledge_storage.KnowledgeStorage"
|
||||
) as MockKnowledge:
|
||||
with patch("crewai.knowledge") as MockKnowledge:
|
||||
mock_knowledge_instance = MockKnowledge.return_value
|
||||
mock_knowledge_instance.sources = [string_source]
|
||||
mock_knowledge_instance.query.return_value = [{"content": content}]
|
||||
@@ -1803,6 +1800,40 @@ def test_agent_with_knowledge_sources_works_with_copy():
|
||||
assert isinstance(agent_copy.llm, LLM)
|
||||
|
||||
|
||||
@pytest.mark.vcr(filter_headers=["authorization"])
|
||||
def test_agent_with_knowledge_sources_generate_search_query():
|
||||
content = "Brandon's favorite color is red and he likes Mexican food."
|
||||
string_source = StringKnowledgeSource(content=content)
|
||||
|
||||
with patch("crewai.knowledge") as MockKnowledge:
|
||||
mock_knowledge_instance = MockKnowledge.return_value
|
||||
mock_knowledge_instance.sources = [string_source]
|
||||
mock_knowledge_instance.query.return_value = [{"content": content}]
|
||||
|
||||
agent = Agent(
|
||||
role="Information Agent with extensive role description that is longer than 80 characters",
|
||||
goal="Provide information based on knowledge sources",
|
||||
backstory="You have access to specific knowledge sources.",
|
||||
llm=LLM(model="gpt-4o-mini"),
|
||||
knowledge_sources=[string_source],
|
||||
)
|
||||
|
||||
task = Task(
|
||||
description="What is Brandon's favorite color?",
|
||||
expected_output="The answer to the question, in a format like this: `{{name: str, favorite_color: str}}`",
|
||||
agent=agent,
|
||||
)
|
||||
|
||||
crew = Crew(agents=[agent], tasks=[task])
|
||||
result = crew.kickoff()
|
||||
|
||||
# Updated assertion to check the JSON content
|
||||
assert "Brandon" in str(agent.knowledge_search_query)
|
||||
assert "favorite color" in str(agent.knowledge_search_query)
|
||||
|
||||
assert "red" in result.raw.lower()
|
||||
|
||||
|
||||
@pytest.mark.vcr(filter_headers=["authorization"])
|
||||
def test_litellm_auth_error_handling():
|
||||
"""Test that LiteLLM authentication errors are handled correctly and not retried."""
|
||||
@@ -1940,3 +1971,57 @@ def test_litellm_anthropic_error_handling():
|
||||
|
||||
# Verify the LLM call was only made once (no retries)
|
||||
mock_llm_call.assert_called_once()
|
||||
|
||||
|
||||
@pytest.mark.vcr(filter_headers=["authorization"])
|
||||
def test_get_knowledge_search_query():
|
||||
"""Test that _get_knowledge_search_query calls the LLM with the correct prompts."""
|
||||
from crewai.utilities.i18n import I18N
|
||||
|
||||
content = "The capital of France is Paris."
|
||||
string_source = StringKnowledgeSource(content=content)
|
||||
|
||||
agent = Agent(
|
||||
role="Information Agent",
|
||||
goal="Provide information based on knowledge sources",
|
||||
backstory="I have access to knowledge sources",
|
||||
llm=LLM(model="gpt-4"),
|
||||
knowledge_sources=[string_source],
|
||||
)
|
||||
|
||||
task = Task(
|
||||
description="What is the capital of France?",
|
||||
expected_output="The capital of France is Paris.",
|
||||
agent=agent,
|
||||
)
|
||||
|
||||
i18n = I18N()
|
||||
task_prompt = task.prompt()
|
||||
|
||||
with patch.object(agent, "_get_knowledge_search_query") as mock_get_query:
|
||||
mock_get_query.return_value = "Capital of France"
|
||||
|
||||
crew = Crew(agents=[agent], tasks=[task])
|
||||
crew.kickoff()
|
||||
|
||||
mock_get_query.assert_called_once_with(task_prompt)
|
||||
|
||||
with patch.object(agent.llm, "call") as mock_llm_call:
|
||||
agent._get_knowledge_search_query(task_prompt)
|
||||
|
||||
mock_llm_call.assert_called_once_with(
|
||||
[
|
||||
{
|
||||
"role": "system",
|
||||
"content": i18n.slice(
|
||||
"knowledge_search_query_system_prompt"
|
||||
).format(task_prompt=task.description),
|
||||
},
|
||||
{
|
||||
"role": "user",
|
||||
"content": i18n.slice("knowledge_search_query").format(
|
||||
task_prompt=task_prompt
|
||||
),
|
||||
},
|
||||
]
|
||||
)
|
||||
|
||||
File diff suppressed because one or more lines are too long
@@ -0,0 +1,660 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: '{"input": ["Brandon''s favorite color is red and he likes Mexican food."],
|
||||
"model": "text-embedding-3-small", "encoding_format": "base64"}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '137'
|
||||
content-type:
|
||||
- application/json
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.68.2
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.68.2
|
||||
x-stainless-read-timeout:
|
||||
- '600'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.9
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/embeddings
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAA1SaWw+6SrPm799PsbJumTcCIt297hDkjDQnESeTCSAqB0VODfTO/u4T/O/smbkx
|
||||
EYimm+qq5/dU/ce//vrr7zarinz8+5+//m7KYfz7f2zX7umY/v3PX//zX3/99ddf//H7/P+eLN5Z
|
||||
cb+Xn+fv8d/N8nMvlr//+Yv97yv/96F//vpbqXVC1KY6Ai5tbEYUvL2Fz3d3UKbls2ooTr4sDq2d
|
||||
H3I7zRfROAtPcpupB9i58SM02CpPTBYpFV92+RsAu3VxJt5ItRyJHKMCiSkJHii1FzN45mier6KL
|
||||
Ip7L5nWpWVTu8pScO3MAizDvZ1Br3wLbot335HPPZbheQ8HlvOuQzV81D0SsxfIkjqoL9pgYb1Q+
|
||||
zXbinHJXEY9NT+ix5hM+rQMCq7GWOepKI8RXuDL0vftIJ5iHrwKfMaParIgPOQwa5kEkgbyrZRSy
|
||||
Dl6qJCJ3xdLCqeGqGYbZeYdlLiz71WQDBkXLyGPNY7yQ92PjDYVFvOOz4rfZHLYNj8YH5+AsdFSF
|
||||
fF8vF7Xw8iU4PqFsoZFdI98/mwRfEMjoTvVb1JiOQwqjTKqJQ7WHOI5KRG0Ene4vpiGJX7vN8Xkl
|
||||
n2r5vuUCBR9vj43K+tLZPXcWQJp6Jtfiydm0/ewk2JVWSHz18sqmmGQz9HHeupQ5JP3qJVwCucbm
|
||||
iGw9cMhP6ZogL1sb4iS3WqFpw7qQaxpE5KvaKcNuKixoJM6daLe7aXMPwYzgmaQKCcX7XNHYeUkI
|
||||
3JNpip3ZydagABq8jpk18R2gYNB6+IRDyi4kX4/Hig7rUUSxdohIKLtKSCMb8pBrPgjrRbAos0nL
|
||||
CaR9qBOJHTjAy7H4hqOzRi4PdbPijmzII/C8f4hxEqRsXzq6AR7YWXG0OqeeawJpBgXKA+y7Ec0G
|
||||
yecsxOQWcMNwfmSL03AMOF0ilahvBSjr4wZEqLCTiPX38gXLuQ069BIFndg8zik7v4QIqDBQiMIN
|
||||
ZTWnp8uKVFNMiDXMLV1jtgnQddesOL60MFu68VSgM2Ft7KlVXu1fx8GCa3xrybkBLaDcNXmCZiCO
|
||||
OwuvsmeVWx/BipUnbHweGJCBigm8na4m0YW7QOc3QKUo+H6ATQ4nlApvY4BqGj8JJk1CuYtleJCB
|
||||
s+YKZyYKOc1yNdifqDLdWrkEs07uJQTkExPJnyV7r59DGRZItojy5EKFJlfBQaadJ9hMEbEpdzvK
|
||||
SCmfAYn3rVfR7FNPgDGZkERrcQDklt1c+DK/Cw6ekq1weTMbSOdHmaQVNAE3io4FsehbOCbegY4S
|
||||
JR3QQ1fGdx71/bK66wlcM7fEsqwn/eTHUg2Jfg6xw9d2tb/4bgTjA9GJbmpZv9zn0ICi/lRwiGXZ
|
||||
5sGFuqg7qhJ2BWlP50fKWgjsYp8c+92x2gP+lSCr5B/u82GLyrzX0hm+zPOXON/sDRZfoDG6nahP
|
||||
7Gs7KmsrVyX8NPVEgowXlEG87mb4PkdXUhjmMeQv7W2GU21kJG0zmfIN18/geHNaLH3yCcxKFrJI
|
||||
oA5H/OzoKTT7DBM4KB3CEttQMJR1wKPxSe7YYaCc8aK7uOiaa+sE+fu3WlRt1KDb7CBxjVKoZsk7
|
||||
dTCF9pUY6X1UZjm6sfDUpoiYVWcDrn0YOTQn1cBe2noKPw/XFAmlXWLF2i3Z2ATSCq5TmGJ8O3TK
|
||||
nFthDm+nizm9u5GAxWkQhNE5bohmaM9wrYfFg8wOUyIfSNfPTEUt9CgJnRY347Kl4IYW+IF+cytO
|
||||
1Ko12R8ZxIyPkKjgHlXf4yyyUHC7F972l850vUmQJdIJxy/4ohQAbYCN6ToTD/pXOKqKo4H9aGrY
|
||||
Oi5l2LHU9OBvv9x3ZgH+g25QPCbr013aB7AXTX3OaOKpTbD08Kqp80gJOVlDWDITM+RZagZwMDWf
|
||||
HIdxZ9PpMJ/Qh88ZfHycJLr/PJzytz5s+PRoc7obetAvXy8ixaGl7Et1idEnemXEOnZStoJb5ICv
|
||||
c6qxP2Qm4J/pU0OGATrivBLeXlEmiBDYnUvkx/qmQ0NtCcq7oSDF/n7uB/dcWsjcGRnRy0ql2/4G
|
||||
KLy5KZGOetS3wjlfoVUmV3K356PNFzc8HELz/iAyF8pVd13mHI3OHGHND/mMPnfgjWZmzon/gTOg
|
||||
/TAniEFt/ItXezb46xNm7nVH9LPcZVOmlwOcqinE1wvR7dXRahcSo+qxqgth+Lrltxyk+qpjo5VT
|
||||
ZQ8OhQuDg3LFUkpWe76hbICwrM8ktM9cSHfG2qH2Ox1ccZgNMKPMaKHlX0/YpUe1mv1kiX7xj939
|
||||
/VyxBn8tYeKKDZafLlaWxVVFKJrqk1zh81vN4nW3AsZKMnxJkG5zyf0+Q9/Hpsva0dPeD1RMQWFf
|
||||
B+LCfAdIabwN9OEP2sRF7UtZnM87h0LZdMQRD0vfy9A9QaL3GZYqSaP88hE1kfYf7U88sR/x4KCX
|
||||
cbnhmyqAamFO3xSOj72Dj8H3ln0J48dI87DjHg6npqI73bJAaILPdOBiEXybJK6hUiUlya7t2eb3
|
||||
uiIjqdHuxMHDR1kqK6hhHlaFy1OnrVa1PM5IOt0Dl6HGp1oPZ2OCvSAxOBsLF6zcLlrRNWs74iS2
|
||||
ZK9z+awRx8XCFr8PZeaA8oSbnnKJ/wrDRegnD8wj2fTCaijzbioMML6akfjb+Zg7TSpQ3RoHXEA3
|
||||
A7yNDi1sRjvCtpa8s4kpVwGhz7slcnaXwJ6RDAt8wlTDJmfEgGryLEPrBTE2VK0Oeb9wBTDxiz2l
|
||||
zIXrF3GpGCSscYN//8+2SrUikgRvbI9DQOdmSZ/Q0ESV6HSw+lmOrRjy7ZtipVH2ShsUFg8eLyPG
|
||||
j/h0zzhNmT10ip0LVncfPSPv6PsEXBQ2031NXxUhygThbJ9uGPtkHy5OZRqI9hYz7YvnRaHnT5r+
|
||||
0Wd5/mntNRFLiLhafrrgSrBNdayUKEyfK5akg9cvhPEjdLrE6jQqSh1+F6sW/uRPk8MCnSykFvDA
|
||||
i8Ikcne1X6y7ncPds7Cw+z5rYI5qo4StGJ+2fCWH/On90eBdvOduLfJzNaGGTICTd2di57qmjOsF
|
||||
C3Bmogi7p4jtl/zzbKHH1AmJof6tRqn/RpB8LR6fI54LqXk5uvBqFjLGpS9R+nk4T/jwVGVi3u9C
|
||||
mfmdN8HO33ku/JYtWJKHFkBivHoSbnqkzYwSok+0q9y9AYyQ/dUP4JziCSnKKVzrOQ2gY0YXcjvU
|
||||
BZ0L72bA3VMIyFZPlZF/pDN0mIbDDm5Ue8bRYIBm8E940/M9T5dkhqfYvUyH4WrQ2j2XBhqM1iQP
|
||||
QnM6+jCP4bb+ieu0HqyPxE4hcOCbeCzd04XLlhlpLz13m12f9PT8lUV0EvbqtLv0b3u10dL+7mOn
|
||||
OZeAi+15Rf3hGpOTzH3CQe8EB92i04nku8utpy8NtX/0Xt48BGWkF1aDRP9m+KTosKcNTmQw38Oj
|
||||
W08eBxY8JTG6vYOBuKfz2q+PGxXgquoe0Yn5rPaA+UzwOu6DjX9egACGDPAuHiOMy8eJssdLbIHF
|
||||
Qyr25mIBmx57wpPAqcRTlSrjM72cwBolyh99Ph66F4O08zvDGlW1fv4wgQPK4qaTNDya9srLiIWH
|
||||
xkqmw7Hb91TkkAuRhu7Tejyw/cZHBcqr6UVcZ+izrX5O8PR5jiTe9PS7bAcNTpwASJYdXnRZ/UmE
|
||||
qv3gXHFgBtCCGOSAkysV//LLEp6sN6BomdzO+gbZ/PrgFhp6ExBv9kbavj56Czf9j1VQ8/0as6MH
|
||||
Nr2Iz07YZySyTjF8rNIbXxS9CRfDZz1oiJbkvmrrYi/D4cpAekuK7Tx/MvLSuA7C41PE+aZ3VlTt
|
||||
HKhwtkzcib6qpTicO/F8zp8kkjk95NXSnGGYnjribfVnYuoqhYyVZuQ+FSudplRMIAZ2OrHf0Qbv
|
||||
3pFdpHCmPAHZr8NZXzQJTopuEOtwGKvlmjgepGA2sd/vjv0yUNlA+t623Pc0S9UsOdWmRzsLY8qP
|
||||
1Rq9Xx6qL6WJpY9r0UGO5QiJRp9MjGG+wqki0wk0w+i4MzoN/bjxL/zwBYOlQf7S1QIRhM6QnbFu
|
||||
nlG/nN+dAbnIbyakj3M2RbWwAl68hhPIP4+MllbcQYutd+QK1wLQ17G20OGkqxOzg+9s6tydBKaA
|
||||
VdxdX8Rg73YggR0V6UQ3XprFGAcwc+J0klAw9xSiKkH3NmCx07xFe8js64o2fsPqrk+qVbnDGlJE
|
||||
pz/nmwcX4ICqDNyJczJUrVv8omya1Gku+EiZp9QyoP9KHOJVN0aZn8XFgqrVNC58Wauy2vJVglxj
|
||||
cuQX3+ve3LOwgYWOjSCF4aBWsgT6CxsTjXg3One6wECF9Vas+U6Utc8XY0G9PgXEn/cLndOdkMAU
|
||||
SLLLrYPWc96yRCgPd+8fH1RrNXJv9LXdiEhVALf6Fxvgfmgcl9lxdTjeh9VAgsdZOHsugTLXpJRA
|
||||
3VoH7BZKpLCdLkAYvNMP1vapUy23RCjgj88C5jRXy+KeRcjuxIiob39nf4l0FEXDeJRYMjumH6Iy
|
||||
kVD/9gqcfB4ErMu3e4P2GyzY0s5dNqi1WIIfn2kv+AIU710Bph2/c9eM8Eqrj6MkNj0Pib1Qli7S
|
||||
rLXgIl+PE++ra0jbwnChuasrnLqMEA43uQkgFklAsCA9+9lmFgHaQii7ezidq3nMtAJ+bSciGF60
|
||||
aonPXw/8ePynlyZwyx3Yfr2FOJ6BQ37bP/jT6/r5Omft56hrsNwVqSseDud+MaMEwnE9cthUEzmk
|
||||
XZdNMDs7HdGFoKnmaHB5yN7h4s6D/AX7C77w8NWNJ6zublVG+aqWwOngn0mkuEVFH3vzBOPY0bEj
|
||||
rrginrBPYPKabvh8z2eFPsXyBJeXYZB7VEsKpwhfCT7f1uLujLkJF90SPbj5T64f7O9h5yiKA//s
|
||||
78b/+8V4RtC+YoK3fE2XGg8ayIqXTI6sWFajuPQM7AXgEjs5cdk8Ph8MfKzymxzVyyucBcP2oGoW
|
||||
HpYzEtu//At++coO06/dpscWgrRL9vj2WUYwXfnLyhXmion7jgw68YyXo90ztwje6h+9j2UCnzV/
|
||||
c5d+96rmQzAF0AGDhM2Y2MqP58DmJ2DfPfvZaqxdDnNJmMi1nA7KpJbmCgv7Mkzr1RkUyl6eEnx4
|
||||
u3Dal9x/8Q/c/C5sHrtrxelOuf7hA1P1n9kKi3GCadJe3Z1PrtmMo9pAoS7IRB5YE6zYk0WEk1NJ
|
||||
/DY52N2l9VcgLMKdaJfjS6HD7KdwhxOeHDd+5ZB8GuAnQhU+4axUFsBKHbrmpeiiiQpgunRWAC9c
|
||||
DYm5HCO6Hv3v6ceX5DwNPP2j/356wsxOvrJ22jxAdN2P02IKfTYLJpjB8rIMrH53JlhC5iChrT7h
|
||||
k/ypwuWW+wWMD6OOpTPrZ5y3l3m45XtiBs5LebvOV4QqMkIcG2c5ZO/ObKA2SV+b3ooqXm2vNeS/
|
||||
D2USiulF52c1dVDww8A9iWTqF0YWRKiE5weRysOJUoJ8Db3XiE7M7WbYvJz0GtwtUYqTjb/Hfr0z
|
||||
P97A8jWd7T/rPV4Fj5wm7wKo1H1PUKq7CzHO+zUcL5YUQHQ1nljttJ7O1uGZon0+MdP7YigK5fJS
|
||||
QhsfEXm5huC7b88BnHf3L9Hh7VMtC+YTSAy1wmbF7ezFygIG5U3IEDVqX/b8FvISPrzKJuebnWUT
|
||||
LMYBPjwUYifnHJtIg1/D/BjzExivA31u8Sx+ai/GXhx2NmmCgwwqKeo3fXalXKl4b1TuhJKoIA76
|
||||
Wx1xNfz5r9bjU4Hxk/crjGbjRh4G8822emUAqYEjMX88xcVtAFLDxUS5Uk8Z3EDPYXX8KsTO3VaZ
|
||||
bZCIf/wGTEKlWh4iY0DjCxEpJnqsuFsyFz99g2/cPQeUK3csjHVaEyNQ44x7opQHWz7E+Kz22bc6
|
||||
0zesr+rbfW56bHR6nMOOjYF7mMUAtEEexOgQspIL5UrOxn4UWiSmNwEf2+SmbPqLB/L5I2CjZzkw
|
||||
DBfIwI+8Apcxrk1GY399QulyLbG2MnU1ie7BhfaHFbGCs1s4mkFbgO284eIUsdUW7xH45uVA1M1/
|
||||
Xq/8fQbauodukxBFWa8HZv35f+6idnq4wENqgOf+bk5I6XbV5v92sP5435/fbv/hva1eu0ip7v2f
|
||||
+GbdL7PpMTtcvetJhJorVS7Lp20186h1RW3lIHYYTgo5TX2ukBpPF99ccw6/12Bd0fZ98yMayglN
|
||||
UsA3vkL3l2+IGSYTLKfNR+/BHvzqN6LwEROJgE81bHoSKsFyJdbP/9p4C6boQbCpLoeK6qYo/+Fv
|
||||
W8731Xd7v+DsRrMr9OBK6fctDrBAkoXd27OzVzn03/DVeXBaH0i0l0mcYySUnw6rCa8qG6/F0HMO
|
||||
Ila+j52yJNkYwNlqfXI9LVzVYR+kf/jJvd1aez44Sf5bDz7xrdPT9jFPEF258cf3IS8Y9PnLp0Q1
|
||||
tGs/t0fqgahwOKy9/Q+YOEBXyDDXCp82/2GuSSf98atd6AKw6v2zgG3XmPhMjYISorwZQAy9mvpt
|
||||
vfT7+joQOMwbH9tHpkwJ5xfwp0d+ft0Mb4kh/n5PPh6ifm1PuST+/F6FG+SKN8gxR+PZzSfOZkuF
|
||||
+ntwAhwrzhufSP1Saoc3HIzOJPbX6+hbcKIS7dZu/cPzU3C33pAc1qPLQMuquJ1hJTCE0pkch0gH
|
||||
K128GdZJxrh15HjZ5D/2KQg7RnRFxqX920sbAT5eVkzO3ouEgy11OdzqlVtt55lMsBaR9qAKdsR+
|
||||
7B8PYXlD4VVcsBEfG+V9D87Dn3oqFb1EefSSU5jt6guWj9MroxfTkNEasTb2Y9G1xydKWbhIJw5f
|
||||
9GoMp/HqJ5AmMMbFln/XsrFP8Mfv51uAsiZ7swXc/A13af29stVr56eP8HE4N4CeqyUFbJEciC5V
|
||||
fkUVxNTQjXDu8lt94K7JKYDylGrToTVrSmhoR/DZeK8pQVJfzTeJnWHO3Y/kGHwP4Sycoxm2Wnvc
|
||||
3iew6ZFzNHi14mCCTVTTdetvQTFhM4Jj+xG+7yHjAM2VK/fwEQhdRe81occq8vjXX6Dh7uuB9/Md
|
||||
uqJVztlqQ9MQsRha5GzPR2UeC1JA1B4kcnbUTzgn/G0Amz9F7KV9/eFBmDxZ0+W2/txe2RENkG/N
|
||||
EFm7wH6+yY0HeZVRsPOdB7A41dGCWz/mj5+4565eiZJX4E+7/sqCbz8KHXy8soz8/FOuOJxb2DD3
|
||||
AXv+vrTpS8lbsPl7RNUFmq1iJE/QMePL5udcbfqOUxlozxS7C1qDat3rioRqlbpYL3MF0F//4sdf
|
||||
kR1J9h4dTzlc6CpM4tOalGGvBTNaL7un+2GNlZLj0p/Eza+aIJBoP5472wA6x65Ewd97yIXSq/75
|
||||
ycTb9MBry29IapjRre3oqczxHMmoo3I18ZeXGP78fuAM+wvZ+AysfuRp6HPZU+J8ZwewWz4Twdx2
|
||||
5HGpa2V+lV0Mpkbup91Z7cPp5+/ibyLih8/fMmrFhwkeosEjOpzGfvj1UxrmWE/Lxpe8l6AEzsya
|
||||
E5MzeDDNn7MgMqiLp2TwGJsCuHdAfYG++/O/53VSeCiFs4oTsxw2ntMDSHuDwa4xw359vcoTMgwl
|
||||
ndYDmnpaykkEt/7rxFP13X8Lrm5h/IEfbB7sT7VawtNA1Dhp2DW+Yr+SF5vAzb8mliqcMv5UMzVc
|
||||
SjnBqr3jsrn8DjEcX5+R2PUogc9LPRuwVi8a1s+yFU7raNdwPHc1sTmk99yw/xRw84snTrl8w7Uz
|
||||
Xi4gh/mIzfil2/vpyOQwZ40vzvLcUXiTTSHMQz3++e+ApXP7hMZ1veCzvVfsrf/AgG19LsXmyZ61
|
||||
dyWhsSjZaeWPO7DCnHPQ2+3IdKguKli2fjTc9Oq0p43Rc4dJZuBUWxl2vqVBOV+sWMQdW45oG9+z
|
||||
boALKLykM85zONnfrR8CdUXa4bhUFEAjI2NhflwZojnfLltRNovItGoDu2Cl1Yxx0P76M79+Srgg
|
||||
pmpRsJ92+LxzpYzOp9CAlpSzW/+rt5cY32aw9Yuwdqt7e3g1DwOKt6Da/KMadBaIGCAphojTMGTC
|
||||
xRcrHm36kNypwdCxSfWTyIwgI9KnmsHIqM3w6zfhokxSZZG+M4+6Kmyxw8efihxGeYJYHANshcev
|
||||
vb4eyxtZkgCx2Zo1WNdRecNxVbiffwXWmt3XcJIdYetHN/YQvCoWloQ8poO6u1e0LSQX/v2bCvjP
|
||||
f/311//6TRi823vRbIMBY7GM//7vUYF/7/89vNOm+TOGMA3ps/j7n/+aQPj727fv7/i/x7YuPsPf
|
||||
//zF8X9mDf4e2zFt/t/r/9r+6j//9X8AAAD//wMAEEMP2eAgAAA=
|
||||
headers:
CF-RAY:
- 93bd468618792506-SJC
Connection:
- keep-alive
Content-Encoding:
- gzip
Content-Type:
- application/json
Date:
- Wed, 07 May 2025 02:26:58 GMT
Server:
- cloudflare
Set-Cookie:
- __cf_bm=b8RyPEId4yq9HJyuFnK7KNXV1hEa38vaf3KsPaYMi6U-1746584818-1.0.1.1-D2L05owANBA1NNJNxdD5avYizVIMB0Q9M_6PgN4YJzuXkQLOyORtRMDfNCF4SCptihGS_hISsNIh4LqfOcp9pQDRlLaFsYpAvHOaWt6teXk;
path=/; expires=Wed, 07-May-25 02:56:58 GMT; domain=.api.openai.com; HttpOnly;
Secure; SameSite=None
- _cfuvid=xH94XekAl_WXtZ8yJYk4wagWOpjufglIcgBHuIK4j5s-1746584818263-0.0.1.1-604800000;
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
Transfer-Encoding:
- chunked
X-Content-Type-Options:
- nosniff
access-control-allow-origin:
- '*'
access-control-expose-headers:
- X-Request-ID
alt-svc:
- h3=":443"; ma=86400
cf-cache-status:
- DYNAMIC
openai-model:
- text-embedding-3-small
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '271'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=31536000; includeSubDomains; preload
via:
- envoy-router-6fcbcbb5fd-rlx2b
x-envoy-upstream-service-time:
- '276'
x-ratelimit-limit-requests:
- '10000'
x-ratelimit-limit-tokens:
- '10000000'
x-ratelimit-remaining-requests:
- '9999'
x-ratelimit-remaining-tokens:
- '9999986'
x-ratelimit-reset-requests:
- 6ms
x-ratelimit-reset-tokens:
- 0s
x-request-id:
- req_dfb1b7e20cfae7dd4c21a591f5989210
status:
code: 200
message: OK
- request:
body: '{"messages": [{"role": "system", "content": "Your goal is to rewrite the
user query so that it is optimized for retrieval from a vector database. Consider
how the query will be used to find relevant documents, and aim to make it more
specific and context-aware. \n\n Do not include any other text than the rewritten
query, especially any preamble or postamble and only add expected output format
if its relevant to the rewritten query. \n\n Focus on the key words of the intended
task and to retrieve the most relevant information. \n\n There will be some
extra context provided that might need to be removed such as expected_output
formats structured_outputs and other instructions."}, {"role": "user", "content":
"The original query is: What is Brandon''s favorite color?\n\nThis is the expected
criteria for your final answer: The answer to the question, in a format like
this: `{{name: str, favorite_color: str}}`\nyou MUST return the actual complete
content as the final answer, not a summary.."}], "model": "gpt-4o-mini", "stop":
["\nObservation:"]}'
headers:
accept:
- application/json
accept-encoding:
- gzip, deflate
connection:
- keep-alive
content-length:
- '1054'
content-type:
- application/json
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.68.2
x-stainless-arch:
- arm64
x-stainless-async:
- 'false'
x-stainless-lang:
- python
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.68.2
x-stainless-raw-response:
- 'true'
x-stainless-read-timeout:
- '600.0'
x-stainless-retry-count:
- '0'
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
- 3.12.9
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
body:
string: !!binary |
H4sIAAAAAAAAA4xSsW7bMBTc9RXEW7pYhaw6leylQJClU4AGyRIEAkM+yUwoPoJ8MloE/veAkmMp
|
||||
aQp04cB7d7w7vpdMCDAadgLUXrLqvc0vb697eroKNzdey/hLF9XzdXm4uLqTZfUTVolBj0+o+I31
|
||||
VVHvLbIhN8EqoGRMqutq8/2i3tTregR60mgTrfOcbyjvjTN5WZSbvKjydX1i78kojLAT95kQQryM
|
||||
Z/LpNP6GnShWbzc9xig7hN15SAgIZNMNyBhNZOkYVjOoyDG60fplkE6T+xJFKw8UDKNQZCn8WM4H
|
||||
bIcok2c3WLsApHPEMmUenT6ckOPZm6XOB3qMH6jQGmfivgkoI7nkIzJ5GNFjJsTD2MHwLhb4QL3n
|
||||
hukZx+fW23LSg7n6Ga1OGBNLuyRtV5/INRpZGhsXJYKSao96ps6Ny0EbWgDZIvTfZj7TnoIb1/2P
|
||||
/AwohZ5RNz6gNup94HksYFrMf42dSx4NQ8RwMAobNhjSR2hs5WCndYH4JzL2TWtch8EHM+1M65vi
|
||||
27asy7LYFpAds1cAAAD//wMA3xmId0EDAAA=
|
||||
headers:
|
||||
CF-RAY:
|
||||
- 93bd468ac97dcedd-SJC
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Wed, 07 May 2025 02:26:58 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Set-Cookie:
|
||||
- __cf_bm=RAnX9bxMu6FRFRvWLdkruoVeTpKeJSsewnbE5u1SKNc-1746584818-1.0.1.1-08O3HvJLNgXLW2GhIFer0bWIw7kc_bnco7201aq5kLNaI2.5R_LzcmmIHlEQmos6TsjWG..AYDzzeYQBts4AfDWCT__jWc1iMNREXvz_Bk4;
|
||||
path=/; expires=Wed, 07-May-25 02:56:58 GMT; domain=.api.openai.com; HttpOnly;
|
||||
Secure; SameSite=None
|
||||
- _cfuvid=hVuA8E89306pCEvNIEtxK0bavBXUyyJLC45CNZ0NFcY-1746584818774-0.0.1.1-604800000;
|
||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
cf-cache-status:
|
||||
- DYNAMIC
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '267'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-envoy-upstream-service-time:
|
||||
- '300'
|
||||
x-ratelimit-limit-requests:
|
||||
- '30000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '150000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '29999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '149999769'
|
||||
x-ratelimit-reset-requests:
|
||||
- 2ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_9be67025184f64bbc77df86b89c5f894
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"input": ["Brandon''s favorite color?"], "model": "text-embedding-3-small",
|
||||
"encoding_format": "base64"}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '104'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=b8RyPEId4yq9HJyuFnK7KNXV1hEa38vaf3KsPaYMi6U-1746584818-1.0.1.1-D2L05owANBA1NNJNxdD5avYizVIMB0Q9M_6PgN4YJzuXkQLOyORtRMDfNCF4SCptihGS_hISsNIh4LqfOcp9pQDRlLaFsYpAvHOaWt6teXk;
|
||||
_cfuvid=xH94XekAl_WXtZ8yJYk4wagWOpjufglIcgBHuIK4j5s-1746584818263-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.68.2
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.68.2
|
||||
x-stainless-read-timeout:
|
||||
- '600'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.9
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/embeddings
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAA1R6SROyTLPl/vsVT7xb+oZMUsW3YxIRkEJBxI6ODlBkEpGhqqBu3P/eoU9HDxsX
|
||||
QIAkmSfPOZn/+a8/f/7p86a4z//8+88/r3qa//lv32OPbM7++fef//6vP3/+/PnP3+//d2XR5cXj
|
||||
Ub/L3+W/k/X7USz//PsP/3+O/N+L/v3nHy10PsRPcxuIYz6sIG03O6RXYmZOpt0pqpupCN1ta2+K
|
||||
2T5U1Gg2d8TiXAzYNTdb1V+PVxL4y4exz8UvwQOkRxSSJ4tWZ2tkKpuTicTBlRsXtnHarSCuWzyr
|
||||
8Xukk9e3ahtwIUF5NIA1fk8ZqD/yndw55DQYvzVRNRuDD/Jwh/O1q+ZYwffBJUfYrWDZhO2qCmzb
|
||||
4QTlN7B8vB1VcQJa5DzSyqOKN1nwJKknZL74FSx8MrfwwkdHdAzUEqzqZ+tDDT0zooO9ZK6Orqwg
|
||||
AmVAvE8yRvh1ppPagBvB8527NWyD7ETNW8fGwD6xCKcF5OCsG/sAOI+7t67vA/4br9M48GCun6da
|
||||
fSsdR4xB2edrsb9RVQVaRyxn64IFtEuq0mU6kui6e0VLGm0n2LVGhA6R30UCfT4MYLEpRIaVUrYc
|
||||
l2cB6+QCiN9yXrPcxmKAxZO+SCjT2Vv0MS1hpRUFySFXeVQ7nGVYB+cuKB/RPl/X4pRu7waTMTjg
|
||||
1VwV99jD6+2iByUHhYYdz2kPa+0+EVQYMViot6/Vw7Zw8bIUCND94d7DPHy/CNo9p4adDpWv6p/u
|
||||
HbB933hT7Wxt9TpoCJ0rvsqXOWl4NUFbg6T9E+a09NEAw3QwyYH7kIbGfG+oFPY8CmUjjeh4ag3o
|
||||
hZNFTjuReYJX1zx4QfpGns3zDTUymMFFjUOyy9wLmDAeJpgnsYcuo1mNrJw6ET5AdkSaC9/N/Ny8
|
||||
FFh212dwPTveNx6ZDCctGYnzOB4bCZ6UBPI62QWcHzgNW5owUU1NrJCOhiOTqtVp4ZGaJbHmuzi+
|
||||
04vjqq/D1ibIPlYR3atSpjYFZ6Br/Nzn5HVOVjV0XQUd7feNSSF4WoBu6j3eEv+YSwsyeCilmwzL
|
||||
NyPzhOwVBZBkRYSlQmzyCb5UGUqhYpPd8X0yv/Evf89HXnIS2Rw7eQ+54dIRywzNSIKnNYbf+KNg
|
||||
N6WM3yw5B5zS8P/+XywQRYbPOyjxBgNtFDPABvh2spEYM28x5lVWqr6yNkbXi+GMVFhuCaSJfSX7
|
||||
s+OZS9UfeHg5rw65Fk7QYAhOoqr49Qtdju/FZJ+LVapbRxVJoah3sKgbM4DZpSfol89DZA0h5Er7
|
||||
iE58OkfUJ34N59sux8vWNjxh/xRsGMVhg5KLemjoR3NLyEwFEuNYJIBPyN6BJpsM9FSAZdKzJHbw
|
||||
E2MPoYIeGsE+aYW6d7kr8atqYoT6F6iiTXAmaCx0wISsl+HeWc9ffHEBlbTAAEZOFWJvDb3heSle
|
||||
VfvoBOQUG5to0Z4ght7n3gWC1Nk5aWRLho8T9yC7d44i8XC3Wugn7yrgztniMa24KzDE1ZlcrTjM
|
||||
cZt/NGXoUIHs3ftp4t08JX/z86hx51HaZE2tajtfJ9dCbKJfvNXv/YgpbA5MDMGiwfi1iOSuWBeP
|
||||
YfvWw2x73f6+hyfJykb7/T/0zJ1rJPDJq1Nr7xYhJxGXCBstcOHUEhMFxPEZ/w64EH7xErnKxhp/
|
||||
9Q648/lMdro15ROQihKS8tqQo4iShu7VTQbrVrCIb7HFW5v50CuczBNitNrgkXP6EOFhe3fxBsJb
|
||||
RLR1SIFd1W9k3ayK4Zjpraqn4Ib2KJibicXnXv3ia8AP9c1cEHYUqHBYxtDqH80SnkwLLpFcooP9
|
||||
0UfpW1/wXDSYOA9Q52xfJL0an62JPIcvvjWiWCid29Yo2ZtOI9WHVweHRYjJbqqEkfFmV0M7pVYg
|
||||
KaIb8VTzz5BGyMJL5+ueiI63CXzxnOziyTCla252Kt/tHmgHJ3+kurQGanHwHeJowyuX3uRRwM1o
|
||||
GsSysJTTxLxjmHpNTHQeBM38q4+jNpXk3JGIsbHDBrRswuFyaDW2ZvtUUeNBCIgWH/VcFEBoQb2t
|
||||
ECl6aOW8cMI9tPEYBQCfVDZdYiGGEGgpcu/CJl/1QexUd3tPyQ21sbdYV1NWK+1ekKc3XjzJsEyo
|
||||
lpI2B3RoNSBs7gcOVJvlgfL7bgHk9OEK2HJcgQ79o2lGon7OimKzzbcfnpn4CF0bLsfYQDegvMal
|
||||
3WkdFPyPgrdC8xnJ5sQC9W4sMnL85yWnwnKKVXsTV8Ra3q5Jedze1UKuHGJvgsFk2GxXOLbmGx36
|
||||
c2hSKuUywDcqkdPqtw1VroczcMfhQdwdLke+/gQxuIz1QOwmaDwpvxwMsL3eN6TIDDOfe3iy1E8Z
|
||||
60RTdNMUkQ5LOCVHhbgnbHkC3LNW/eIdbrK6NakzJSt875qUGFfhDMRL1UN1e6+vyFW3hrd21SuB
|
||||
xzf0AmixxSSrWrvqvg5M5JXhLhK440YBuK4y4g78sWHsE/fwx29uFYgAf8g4F+rHqCHWM3Sj5UGX
|
||||
Mzic/SfJb8mbzTFvW7AXGp64G0Xypq2zCQBnD3dk8dV7pMpz2/3wCcsvuuZTorSxOixSTJwnvEYS
|
||||
K6cYyrdBxJBDfcNSbjfBMb0txOvuR8b83OFAROSUhJ/IBbz2KDM1CdsTikvM5WtIc1GJEUqRkxVm
|
||||
s6Kt5UAhjSiyyu0CyBXhDJqCvP++7ytipk1FNRILEWlH8WVK+WMbw6tBWxL3Vestet37cLOtPbS/
|
||||
mo/mh9dqe8UMrxpYvSWjcgqtoP+QxJlCb5X0ewG+/REdnsTz+lAZOnALMgFpppSMIu/LFDrD6YXX
|
||||
O9iac3avZbjKJxsZhY88Atd9q2KPg2i/0DaafOUzwVTrZqzozWwOizwkcNdMBQkFkYxLUs936Nht
|
||||
jpwPeAE6PvRAJZw4Eo0e+3ztZreEW0ZNlNunKF9wWKbQn+J9cDOJZeINUWvloeANCZZYGNngxz5c
|
||||
RpUPBNddc9zNbg1fC68hC+QWE2tuOINtgO/IaAQxWty+C2Eevl7EnfFofutfg6vneIHs6XouRCeP
|
||||
A6/y2ZKDTm85e5hlCnmZo+iLb9FyEVcI90l7JLYmad58Hu8yvIGKIsPwHZMeMtGBab1fgx9+0Zz/
|
||||
UNiLR4Pku4Zrpv7hxODHt8zJQ6YgzSUPb8gyg0tYBQ29+UcFHvTqRFIo2EzafwAPOnyaSGQQhS3c
|
||||
cSMDg2kdik930mDh7bZgErITCr71x1tC4AO0ZClxz6ddxHb6FcIMIp/YUTdH81wfDOgM0QsL9LAH
|
||||
4tBGtvp8UkauerYyTM4sVkXU5HhrDQmgLD4P6o/f/fCe3zWvFl619olSpo+A//EDpaEnct29umiF
|
||||
hO9VoxiOxJht6jFVbSlsm/iDLlb1jfdqBrCJpTPabThpXGitywrmDgi/tugImGGZnCqFsk2MvYDy
|
||||
5X1cViXmww3J9aRkQhzJIuTmYItF+NRydtaorFrb9zPgL+/GFGh9kJVXmqFAufpXwMdqmijJuktI
|
||||
npwSIOI2hyDbXrbIvHtjQ0/lbYB9bBDkjrWRT+Yhp2DsZIU8jpMLpNc5obBa+wV58k4GS9FrHBT4
|
||||
lidewYNxiW7wDAnHjwRVw9tjgp1n21er7Uhehq98KfHxDGXPm0ngB864Zlt7hbdXX5Hd5516bC7r
|
||||
DnZbUCOX9v3Irg9aqJJpewTxug1EfR7vMNyf3iRNImouKCh8GM9ThdeN8MjJgHRL/fIPdNzLBKyJ
|
||||
Lp6VrbMRkQc53RT2T9WCZ/O9x9tvPq+Jzp0Be9YLcgO+a5ju8yGshNJCD1ZOIymnjodhlXUEzU/J
|
||||
XBJsDEBlO0CCV7yaU2X7PiyeuyvJrMfQLHFERTWtdyu5e3Pw49cpTN98Sg5BxHmTkfGZim+rRAL5
|
||||
1jWUjuIEv/gdwJKlJhnMRoaiEIboqqu8t2puFav34jZjNVkpY8X5osHayyMsa88T6G+BQiFejAYF
|
||||
0bj3/urDbz4FlMRPRp/jmihavguRMfMtGILN/hvfbY3ubT7kL2WAPNyNbEX6V+/hWf4UIOHhB7lS
|
||||
vQOL9rDvQCyMlljvPh/xLd2WQDc1l+QFn4+L+ZIG9boBBab02EfsroHyL58/AP2R85Ey3uG4Ge+Y
|
||||
aXjH2E5/cnAjDjI6ikgc13TIHHh7DRXa11HtkbZrz+pciDq6bDipmQYpc2C8ja8ohEAbx+/3hXz2
|
||||
eAdLS68Nc3zOgJzJDnjrnz8mcdLcB9ne64KtUWCTNbKvKLMZ+MiXOz2nnaYP0FFFmzieXuWMv0ID
|
||||
DI8CBPAp76J1IasGw4x30F5xhBFf6aOD6T2okcl2fkTts89BHPDo10/ZKs2BDGXODNAxQL4nlWYV
|
||||
qnPgMmQN9dac194O1MSgCTm665z3lZqL4NJ/rmR/ysR8Bu02hWnyhEiv9eO44tC+w6cejZh9+bRQ
|
||||
4mMIX53vojztI3Oh9UGBFPNbdDFZ4rHc5jjIl0FAPFJHYG1S3oWBnzQBj10vYqp2mtRXmiJUfPn+
|
||||
mnNlDUue71Fs1EGDNbdKgH48Nd/+swPzoyY+9NIlJZpMZ3Ot328F4mu3Ek+0zEZwPNIBUQsG4juc
|
||||
Nf7w8McvUbB36og2IwjgpXltvv2g9OiY3WP4qtyVfOuRkVHtMwiTQcMrvtreeqWPFn7u7EkQvfHj
|
||||
rLr9/ZefxMl1i0m/fPniAXH15mjS0t8PMMvwHgv3avVw+B5r0LVahAzVjppBF/MV5pZKvveLG1rZ
|
||||
fgBj8WEhm5iRJ3RG3MHJqGkAo0s5rqJrUdgc0J1YidtGsxVfNSgYaYU0NthsbS08gIPenLCQyW4j
|
||||
gauwQnWqHiRpFivi1fgSwmktOWIb8sVkNtmEAGBiB6FlDDl9ZCEPo7c6o+Ag+ZEoW3oMY/FpoeAV
|
||||
n80v/8igAwY92IwRzrFyGURQiFeT+PVqAOGgBAosF/VC3MwW8l7Grxg+7PiNUC5OHjW6WweV7afE
|
||||
KzvLY1/zTakmk2+jXOt9b8E7z4cI+RY5Z3o+kvowt1AY+RNyK6+LcETvGIRn8CDGp3gAelhoDd54
|
||||
4+CJz48Md9WcQMm0PHRw7TJfLkYawPCoHUl4E51xfDTq8KsXTENjZsumVUJ4FKhNnvNT8uiLnzKl
|
||||
3Moaun/10t/3cfvFRdZ8T8a14KpMFXeh8fMHwCLfNV/VNVciewn6jG4dKVA5YXMnpnASzOnLXxUh
|
||||
DO4BhceR0clhGM7kIwciC7pmDQolAd2ucoN5K/WMBjHrIFdax4CFnuMxon5C+Mq6mPzw5qdPFeua
|
||||
+8ES3K6MteTeQUEnEbK+/RPDtaqhI4D8y//EkYGrQGEGjz666GvlLQFvxSpI3Q+WLkbfsOG6zZTt
|
||||
66xgSZY2I50Neob9wdr/xfvv+wVqXoUMXdRMj8SvH/jLZ+Qcj4PJnz46hEnYnb76rjaHiSEMwirt
|
||||
8DY9T+MCbrEPJyE9kSh6MG8KynBV7cgsvvxWHZfzGCtwkV4D8tx7xyjGNVbdfbhDN0BWb0mCJ4T8
|
||||
6AhY/NbD4vLmXX02Dw6rc2yZxJuqTj0RH6Nbut+a9Oz7LqTbpUIet0lMQT5yDpTeRYPpU3qM7Omf
|
||||
OrhgYU+O1AzMJb5XjvrYjbcAPrbZSNORdmp43j4IGujRW5NUHqAwiifiUw552NEVCiHJaoIK+hlX
|
||||
zx0hPNIJkwNRDHMdCA3lc22F6Kye3h7zKj+DrwOwv/6mFy2fgRYqKIGPtx+tBYv52vSKvgsQMjnl
|
||||
nq/mXk3gvvbNQIXdyr71k0Bx5J9E/+q51XMbDrrociM/fCf5Rdf+6ttD/ekbxi7IBSeXT0j6SHVT
|
||||
ZAIXAwf0OkkMWTAXttE66A+Oga7ffMLuXrIgbZQzOUZylPOXWEigtObe14+bGWvM2oHTh+bkHtzl
|
||||
aN4/BUt17C4PODlgJuGbWwAbsmyJ230ykzm+qKnK0vroDtnGXDcgw6DxLYLulCMmbq2uh+9pL+DY
|
||||
SQ6juH2bdxioQo98bC3jHJ5MWzVewYiM6qR5K+HmFdj7vP/y897sm8aQwY//m9GDmdQgugxfloSR
|
||||
K/KLSd3Dowblsrng5C7tzeXnZx5Zp+MTRz7R6sxz8IsvKr78Y6WX3odffR5sHcHNhWiHKfzqE+Sm
|
||||
uzKaVdZpCr3kB2R3ow6o+9ECWEn4gPbFiJi0XIG79VKWIstJDg3lFX1QfUO+In+Ueybwt3Px80sC
|
||||
enMYWL79FTjPWCX3XfVi86fWHLVL/JQ83uXHXI3bBSo/f8wvmWziX79JumcZvKJQjn7nwVcfBuC2
|
||||
PZh8rTkUBm/uRnan1WpEWZ4sSBpfJI7XGs1S0dWFP7/7UA4qYMGhT/7i+R6LNsPPIVeU6oa0QG7a
|
||||
T7OYfVvDYm5LcqzisplyNbMh3u4lFNjZPNJt/5HBT997t2QP6PbhYyguMwnWr34Sja3Wg1+9G6eP
|
||||
G61VXrp/+aHBe5W5+lZZq2jjn1F04EdAxf5jgJ8etV7WK18Eusm2pLMOyBss7dcvWvidL2Dl6kts
|
||||
5dKwhcoms4NqHHg24dea/vX3dSI4Ed0sOYTn4dAGbbzF+eJCsEKgBgghC47RmoW6r6KSuFi5KlPD
|
||||
vvobTIX7wuI6vnIGmoyDwYvG5Iqd1RyfGp+pNA3vJBtrI2KSf52UzvhIAfvmH8ty6w4F6b7Hy6vw
|
||||
o/HupxP44gVeluieT1//AYjCOURaEojRfMNmCS6bI0/8XXLOmSL1dyC97w0yNsIjouTUKjC6XyRk
|
||||
BL7XjPXFpdB2uUOw7R/myN7mqEEnThN0dNdjxIeLEoIvf/j5n953njCot8P1gQzl+mleX7yHXSUG
|
||||
6MgHVUOJA1twuD/lQFzUcJzSkbZq7k4X4kikbGi4rCHMlvML2Xuzb5Yz1qkqBcYaiO10NdmqqKHy
|
||||
9d/xOtuhyQcxaKEZV2PA2gay7/kz+PrLxDtsX82y3a+T6pi6ggL5ZjfrqQA+OAtLhoyrsIIpfXQ1
|
||||
GFv9jX75OS2KtkLj1gmYa5xPRKIGGFAPxR0ySX9sVqG9TMpPDxyATxl9v+igroV0+enniLd8p4ej
|
||||
UdfE//KlhTtKMrxKvEvyg0cZuU2zDcEx0DF3iEeTlObnDIRt7yMU3+ZxahpDgew+FgHzdjqQuos3
|
||||
wFXIGQlwJnv05u8U0AYwRPHPX4Vho8B3xd+xuFvXhp2EfIB0/w7IcX/B4+IEcfHjr+TXn7/40MPH
|
||||
HYlIjwQ9X79+CDhYaUscctuOS+rpmWL2to90Pj3ma+qSAkrFpAWc+9JN9p0/qD9+f/j2V5Z11QAZ
|
||||
QxDZm2XM1932UIKPfhjIUZ/eEeWuhqNa5ytP9s6zYXOTVAG8hK1JDOV6GJfyKGtQrUOLxGgXRaIZ
|
||||
ZSXUNw+R6JWomOTLLwDYRQk5oDY2KfWqQqXnSsQdWfZMCDafGih++SLhj2/8/OoPu6rE8np+XGeB
|
||||
7/7iz1OhtMGkCGu1vVOP7MRLGC0342ZDbtR4lE0PwVx+84X2POa4t8UZUEsbC3hDtvmdL1SMSP5z
|
||||
gsknZiRqmztbbtisFVUWK3QIouLrh8V3db7tc/ylGc0iH0UXuoY/I4eea3Ntkn0Ibkt4JfoSwZwW
|
||||
p7JUlwrnxAtDnf2d511vVx3tjbFn82lTp5D4Q0qOcLEjfvLKVt341TZQ50M48j6xapgomo6877yU
|
||||
uo/GgE+L91GcVtooTZddBre9eyS7Kzebf/X1N/7IvGOfsRBcLQjCIxeI1Xobhbx3bKjvfER2UZjm
|
||||
tXgWbHh82XIgBJcd4GtNW0ElTQf0zCOXSfY6OTBlRYBM80FG2uDSUtu9+UK691RyvKSbAZ6Su4+8
|
||||
O+uaQSCrrGp8o+OXqhrjb97306OBRGMF4CEVFODubI3YL/Xo8bJi1PAFd1+LKzQjPlTqVu3criaH
|
||||
5LMzv/rxrOYL1NHztd+Bv3yzw9GEfv7PGrgTD1fP9TD39fuWi+fUwLymXMCxSIvEdf/U4JcPIRS8
|
||||
1maRw0ZTv/WJfn71Ejw5HjavskTpV+9JR8601Wve2Mjn8yOg7akO1e/9kasJaU4HKXPl6L2ZkQ72
|
||||
V5OKD1NU8bVdySnaT834Ohcr3PQeIYZ3xt5X7/pQ64GHAkPFjLlvJMIp50rkHF5XtoSvtIVK6ubI
|
||||
knYqoCigrvrXP5OINi6Yqj3cR1hB+5sKPJY0TIbXIr2hvGeDtwhh7YCv3sbiWi5s2fZ7CP/5bQX8
|
||||
17/+/Pkfvw2Drn8Ur+9iwFws83/8n1WB/5D+Y+qy1+vvGgKesrL459//ewPhn8/Yd5/5f859W7yn
|
||||
f/79Z/t31eCfuZ+z1/9z+F/fB/3Xv/4XAAAA//8DAHXQUXneIAAA
|
||||
headers:
|
||||
CF-RAY:
|
||||
- 93bd468e08302506-SJC
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Wed, 07 May 2025 02:26:59 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-allow-origin:
|
||||
- '*'
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
cf-cache-status:
|
||||
- DYNAMIC
|
||||
openai-model:
|
||||
- text-embedding-3-small
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '140'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
via:
|
||||
- envoy-router-678b766599-k7s96
|
||||
x-envoy-upstream-service-time:
|
||||
- '61'
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '10000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '9999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '9999994'
|
||||
x-ratelimit-reset-requests:
|
||||
- 6ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_22e020337220a8384462c62d1e51bcc6
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are Information Agent
|
||||
with extensive role description that is longer than 80 characters. You have
|
||||
access to specific knowledge sources.\nYour personal goal is: Provide information
|
||||
based on knowledge sources\nTo give my best complete final answer to the task
|
||||
respond using the exact following format:\n\nThought: I now can give a great
|
||||
answer\nFinal Answer: Your final answer must be the great and the most complete
|
||||
as possible, it must be outcome described.\n\nI MUST use these formats, my job
|
||||
depends on it!"}, {"role": "user", "content": "\nCurrent Task: What is Brandon''s
|
||||
favorite color?\n\nThis is the expected criteria for your final answer: The
|
||||
answer to the question, in a format like this: `{{name: str, favorite_color:
|
||||
str}}`\nyou MUST return the actual complete content as the final answer, not
|
||||
a summary.Additional Information: Brandon''s favorite color is red and he likes
|
||||
Mexican food.\n\nBegin! This is VERY important to you, use the tools available
|
||||
and give your best Final Answer, your job depends on it!\n\nThought:"}], "model":
|
||||
"gpt-4o-mini", "stop": ["\nObservation:"]}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '1136'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=RAnX9bxMu6FRFRvWLdkruoVeTpKeJSsewnbE5u1SKNc-1746584818-1.0.1.1-08O3HvJLNgXLW2GhIFer0bWIw7kc_bnco7201aq5kLNaI2.5R_LzcmmIHlEQmos6TsjWG..AYDzzeYQBts4AfDWCT__jWc1iMNREXvz_Bk4;
|
||||
_cfuvid=hVuA8E89306pCEvNIEtxK0bavBXUyyJLC45CNZ0NFcY-1746584818774-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.68.2
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.68.2
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-read-timeout:
|
||||
- '600.0'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.9
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAAwAAAP//jFNNb+IwEL3nV4x8JqsQYAu50UOl7mG/JE5LFU3tSXBxPJZt6K4Q/33l
|
||||
QCHtdqVeInnevOf3ZpxDBiC0EhUIucEoO2fy29W3zq2+L2dmpb4sFj+2X5dm80Td9qe8j2KUGPz4
|
||||
RDK+sD5J7pyhqNmeYOkJIyXV8c3082w+nY8XPdCxIpNorYv5lPNOW52XRTnNi5t8PD+zN6wlBVHB
|
||||
rwwA4NB/k0+r6LeooBi9VDoKAVsS1aUJQHg2qSIwBB0i2pPnMyjZRrK99Xuw/AwSLbR6T4DQJtuA
|
||||
NjyTB1jbO23RwLI/V3A4WOyogrW49WgV27UYQYN79jpSLdmwT6AntRbH4/BOT80uYMptd8YMALSW
|
||||
I6a59Wkfzsjxks9w6zw/hjdU0Wirw6b2hIFtyhIiO9GjxwzgoZ/j7tVohPPcuVhH3lJ/XTmenPTE
|
||||
dX0DdHYGI0c0g/pkPnpHr1YUUZsw2ISQKDekrtTr2nCnNA+AbJD6XzfvaZ+Sa9t+RP4KSEkukqqd
|
||||
J6Xl68TXNk/pdf+v7TLl3rAI5PdaUh01+bQJRQ3uzPk/CX9CpK5utG3JO69PD69xdTFZlPOyLBaF
|
||||
yI7ZXwAAAP//AwCISUFdhgMAAA==
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 93bd46929f55cedd-SJC
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Wed, 07 May 2025 02:27:00 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '394'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-envoy-upstream-service-time:
|
||||
- '399'
|
||||
x-ratelimit-limit-requests:
|
||||
- '30000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '150000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '29999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '149999749'
|
||||
x-ratelimit-reset-requests:
|
||||
- 2ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_08f3bc0843f6a5d9afa8380d28251c47
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
version: 1
|
||||
@@ -6,7 +6,7 @@ interactions:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate, zstd
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
@@ -151,7 +151,7 @@ interactions:
|
||||
6NrP9D+nrsnf4z///rMV/t41+GfqpvT1/z7/1/dV//Wv/wUAAP//AwBcfFVx4CAAAA==
|
||||
headers:
|
||||
CF-RAY:
|
||||
- 931fcf607c16eb34-SJC
|
||||
- 93bd535cca31f973-SJC
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
@@ -159,14 +159,14 @@ interactions:
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Thu, 17 Apr 2025 23:47:53 GMT
|
||||
- Wed, 07 May 2025 02:35:43 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Set-Cookie:
|
||||
- __cf_bm=CncSMPCav.9EJL3emmM0sTqugx5GN6_Oy8JPFBssVho-1744933673-1.0.1.1-Q1XMvHbQdrfEWkkCYeeNHwFdZ1NpjAGJ_0jOUYIk_APelFe7nCanjW_xlOj12b3JQql9.iWQDiHvCeYJDTWkdxnNiMQOEiFMYHX5YZXUuJs;
|
||||
path=/; expires=Fri, 18-Apr-25 00:17:53 GMT; domain=.api.openai.com; HttpOnly;
|
||||
- __cf_bm=FaqN2sfsTata5eZF3jpzsswr9Ry6.aLOWPP..HstyKk-1746585343-1.0.1.1-9IGOA.WxYd0mtZoXXs5PV_DSi6IzwCB.H8l4mQxLdl3V1cQ9rGr5FSQPLoDVJA5uPwxduxFEbLVxJobTW2J_P0iBVcEQSvxcMnsJ8Jtnsxk;
|
||||
path=/; expires=Wed, 07-May-25 03:05:43 GMT; domain=.api.openai.com; HttpOnly;
|
||||
Secure; SameSite=None
|
||||
- _cfuvid=unfPTYCpF5COtm5PuZDuaJqlhefP0iibfjsXHc9lKq0-1744933673515-0.0.1.1-604800000;
|
||||
- _cfuvid=SlYSO8wQlhrJsTTYoTXd7IBl_D9ZddMlIzW1PTFiZIE-1746585343627-0.0.1.1-604800000;
|
||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
@@ -185,15 +185,15 @@ interactions:
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '75'
|
||||
- '38'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
via:
|
||||
- envoy-router-8687b6cbdb-4qpmr
|
||||
- envoy-router-6fcbcbb5fd-pxw6t
|
||||
x-envoy-upstream-service-time:
|
||||
- '46'
|
||||
- '41'
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
@@ -207,32 +207,33 @@ interactions:
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_b8c884a7fe2bd4732903ecbdc632576d
|
||||
- req_39d01dc72178a8952d00ba36c7512521
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are Information Agent.
|
||||
You have access to specific knowledge sources.\nYour personal goal is: Provide
|
||||
information based on knowledge sources\nTo give my best complete final answer
|
||||
to the task respond using the exact following format:\n\nThought: I now can
|
||||
give a great answer\nFinal Answer: Your final answer must be the great and the
|
||||
most complete as possible, it must be outcome described.\n\nI MUST use these
|
||||
formats, my job depends on it!"}, {"role": "user", "content": "\nCurrent Task:
|
||||
What is Brandon''s favorite color?\n\nThis is the expected criteria for your
|
||||
final answer: Brandon''s favorite color.\nyou MUST return the actual complete
|
||||
content as the final answer, not a summary.\n\nBegin! This is VERY important
|
||||
to you, use the tools available and give your best Final Answer, your job depends
|
||||
on it!\n\nThought:"}], "model": "gpt-4o-mini", "stop": ["\nObservation:"]}'
|
||||
body: '{"messages": [{"role": "system", "content": "Your goal is to rewrite the
|
||||
user query so that it is optimized for retrieval from a vector database. Consider
|
||||
how the query will be used to find relevant documents, and aim to make it more
|
||||
specific and context-aware. \n\n Do not include any other text than the rewritten
|
||||
query, especially any preamble or postamble and only add expected output format
|
||||
if its relevant to the rewritten query. \n\n Focus on the key words of the intended
|
||||
task and to retrieve the most relevant information. \n\n There will be some
|
||||
extra context provided that might need to be removed such as expected_output
|
||||
formats structured_outputs and other instructions."}, {"role": "user", "content":
|
||||
"The original query is: What is Brandon''s favorite color?\n\nThis is the expected
|
||||
criteria for your final answer: Brandon''s favorite color.\nyou MUST return
|
||||
the actual complete content as the final answer, not a summary.."}], "model":
|
||||
"gpt-4o-mini", "stop": ["\nObservation:"]}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate, zstd
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '926'
|
||||
- '992'
|
||||
content-type:
|
||||
- application/json
|
||||
host:
|
||||
@@ -264,18 +265,19 @@ interactions:
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAAwAAAP//jJNNi9swEIbv+RWDLr0ki/NBvm5NYUsplFK29NAuZiKNnWlkjVaSk02X
|
||||
/PdiJxtn2y30YrCeecfvvCM/9QAUG7UEpTeYdOXtYPXp7vbLw7evm4ePmObvt/jrsVgVn9/5dbhj
|
||||
1W8Usv5JOj2rbrRU3lJicSesA2GiputwNpksxuPpbNyCSgzZRlb6NJjIoGLHg1E2mgyy2WA4P6s3
|
||||
wpqiWsL3HgDAU/tsfDpDj2oJWf/5pKIYsSS1vBQBqCC2OVEYI8eELql+B7W4RK61/gGc7EGjg5J3
|
||||
BAhlYxvQxT0FgB/ulh1aeNu+L2EV0BlxbyIUuJPAiUCLlQAcwUkCX68ta3sAI7quyCUywA72bMge
|
||||
AHfIFteWYOtkb8mUBFHqoCneXPsLVNQRm4xcbe0VQOckYZNxm8z9mRwvWVgpfZB1/EOqCnYcN3kg
|
||||
jOKauWMSr1p67AHct5nXL2JUPkjlU55kS+3nhtP5qZ/qVt3R0fQMkyS0V6rFpP9Kv9xQQrbxamtK
|
||||
o96Q6aTdirE2LFegdzX1325e632anF35P+07oDX5RCb3gQzrlxN3ZYGaP+FfZZeUW8MqUtixpjwx
|
||||
hWYThgqs7el+qniIiaq8YFdS8IFPl7TweTZejOajUbbIVO/Y+w0AAP//AwA4a1/QsgMAAA==
|
||||
H4sIAAAAAAAAAwAAAP//jFJNa9wwFLz7V4h36WVdvF5nv46BQEsPpYWeSjCK9GwrlfVU6XlpCfvf
|
||||
i+zN2klT6EUHzZvRzOg9ZUKA0XAUoDrJqvc2v/32+fTh68e7bel/UtXFR6/vKv+FP6191cMqMejh
|
||||
ERU/s94r6r1FNuQmWAWUjEl1vau2N/ubTbUZgZ402kRrPecV5b1xJi+LssqLXb7eX9gdGYURjuJ7
|
||||
JoQQT+OZfDqNv+AoitXzTY8xyhbheB0SAgLZdAMyRhNZOobVDCpyjG60fhuk0+TeRdHIEwXDKBRZ
|
||||
CsvxgM0QZbLsBmsXgHSOWKbIo9H7C3K+WrPU+kAP8RUVGuNM7OqAMpJLNiKThxE9Z0LcjxUML1KB
|
||||
D9R7rpl+4PjcereZ9GBufka3F4yJpV2SDqs35GqNLI2Niw5BSdWhnqlz4XLQhhZAtgj9t5m3tKfg
|
||||
xrX/Iz8DSqFn1LUPqI16GXgeC5j28l9j15JHwxAxnIzCmg2G9BEaGznYaVsg/o6Mfd0Y12LwwUwr
|
||||
0/i62BzKfVkWhwKyc/YHAAD//wMAwl9O/EADAAA=
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 931fcf649bdbed40-SJC
|
||||
- 93bd535e5f0b3ad4-SJC
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
@@ -283,14 +285,14 @@ interactions:
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Thu, 17 Apr 2025 23:47:54 GMT
|
||||
- Wed, 07 May 2025 02:35:43 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Set-Cookie:
|
||||
- __cf_bm=8vySwO0xqpm0u93_1_rXQwTeEIWa2ei3_CD5sAdoo3o-1744933674-1.0.1.1-iqZDpH5poOUp4Rcnhfrb0N2Z0c2662RBiPEcx_gefNW.m3tBA3qyFa8tmFv7PitH8u9vyYK7jxUwy4lPiSi830QWNbTMgCMTbrJ7iaUV7hY;
|
||||
path=/; expires=Fri, 18-Apr-25 00:17:54 GMT; domain=.api.openai.com; HttpOnly;
|
||||
- __cf_bm=4ExRXOhgXGvPCnJZJFlvggG1kkRKGLpJmVtf53soQhg-1746585343-1.0.1.1-X3_EsGB.4aHojKVKihPI6WFlCtq43Qvk.iFgVlsU18nGDyeau8Mi0Y.LCQ8J8.g512gWoCQCEakoWWjNpR4G.sMDqDrKit3KUFaL71iPZXo;
|
||||
path=/; expires=Wed, 07-May-25 03:05:43 GMT; domain=.api.openai.com; HttpOnly;
|
||||
Secure; SameSite=None
|
||||
- _cfuvid=IXAyT8eWpFERM53ngcYNmaqhocfGbOHWSoe7SFNdoGI-1744933674288-0.0.1.1-604800000;
|
||||
- _cfuvid=vNgB2gnZiY_kSsrGNv.zug22PCkhqeyHmMQUQ5_FfM8-1746585343998-0.0.1.1-604800000;
|
||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
@@ -300,16 +302,133 @@ interactions:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '167'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-envoy-upstream-service-time:
|
||||
- '174'
|
||||
x-ratelimit-limit-requests:
|
||||
- '30000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '150000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '29999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '149999783'
|
||||
x-ratelimit-reset-requests:
|
||||
- 2ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_efb615e12a042605322c615ab896925c
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are Information Agent.
|
||||
You have access to specific knowledge sources.\nYour personal goal is: Provide
|
||||
information based on knowledge sources\nTo give my best complete final answer
|
||||
to the task respond using the exact following format:\n\nThought: I now can
|
||||
give a great answer\nFinal Answer: Your final answer must be the great and the
|
||||
most complete as possible, it must be outcome described.\n\nI MUST use these
|
||||
formats, my job depends on it!"}, {"role": "user", "content": "\nCurrent Task:
|
||||
What is Brandon''s favorite color?\n\nThis is the expected criteria for your
|
||||
final answer: Brandon''s favorite color.\nyou MUST return the actual complete
|
||||
content as the final answer, not a summary.\n\nBegin! This is VERY important
|
||||
to you, use the tools available and give your best Final Answer, your job depends
|
||||
on it!\n\nThought:"}], "model": "gpt-4o-mini", "stop": ["\nObservation:"]}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '926'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=4ExRXOhgXGvPCnJZJFlvggG1kkRKGLpJmVtf53soQhg-1746585343-1.0.1.1-X3_EsGB.4aHojKVKihPI6WFlCtq43Qvk.iFgVlsU18nGDyeau8Mi0Y.LCQ8J8.g512gWoCQCEakoWWjNpR4G.sMDqDrKit3KUFaL71iPZXo;
|
||||
_cfuvid=vNgB2gnZiY_kSsrGNv.zug22PCkhqeyHmMQUQ5_FfM8-1746585343998-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.68.2
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.68.2
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-read-timeout:
|
||||
- '600.0'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.9
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAA4xTTU/bQBC951eM9tJLghITIORWVKFSDq0qoR5aZE12x/aW9Yy7O06IEP+9shPi
|
||||
0FKpF0ueN+/tm6+nEYDxzizB2ArV1k2YXN19Xt/cVtnjh+2XbLH99W399a759HFzy8Xs0Yw7hqx+
|
||||
ktUX1omVugmkXngH20io1KnOLubnZ4uz0/m8B2pxFDpa2ehkLpPas59k02w+mV5MZos9uxJvKZkl
|
||||
fB8BADz1384nO3o0S5iOXyI1pYQlmeUhCcBECV3EYEo+KbKa8QBaYSXurd8AywYsMpR+TYBQdrYB
|
||||
OW0oAvzga88Y4H3/v4SriOyE3yUocC3RK4GVIBF8AhaFpl0Fb8MWnNi2JlZy4Bms1LVw2AKu0Qdc
|
||||
BYIHlk0gVxIkaaOldALXEgGtbSMqgedCYo1dP8fgFTbSBgcrghUlBRXA9PBiB5yPZDVsQSJY4dQG
|
||||
hYZiks77Xh82FUUCrXw6Focat51sqjCSOzluU6SiTdiNitsQjgBkFu3Z/YDu98jzYSRByibKKv1B
|
||||
NYVnn6o8Eibhrv1JpTE9+jwCuO9H376apmmi1I3mKg/UPzc7X+z0zLBxAzq/3IMqimGIZ7OL8Rt6
|
||||
uSNFH9LR8hiLtiI3UIdNw9Z5OQJGR1X/7eYt7V3lnsv/kR8Aa6lRcnkTyXn7uuIhLVJ3kP9KO3S5
|
||||
N2wSxbW3lKun2E3CUYFt2J2JSdukVOeF55JiE/3uVoomn55eZossm15Ozeh59BsAAP//AwAaTaZd
|
||||
OQQAAA==
|
||||
headers:
|
||||
CF-RAY:
|
||||
- 93bd53604e3f3ad4-SJC
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Wed, 07 May 2025 02:35:45 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
cf-cache-status:
|
||||
- DYNAMIC
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '489'
|
||||
- '933'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-envoy-upstream-service-time:
|
||||
- '936'
|
||||
x-ratelimit-limit-requests:
|
||||
- '30000'
|
||||
x-ratelimit-limit-tokens:
|
||||
@@ -317,13 +436,13 @@ interactions:
|
||||
x-ratelimit-remaining-requests:
|
||||
- '29999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '149999801'
|
||||
- '149999802'
|
||||
x-ratelimit-reset-requests:
|
||||
- 2ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_151f9d0b786f2022f249ee20ea108b43
|
||||
- req_0001c38df543cc383617c370087f0ee3
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
|
||||
@@ -6,7 +6,7 @@ interactions:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate, zstd
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
@@ -150,8 +150,10 @@ interactions:
|
||||
Mj9mH/7zuxXwX//68+d//G4YtN0jf30vBkz5Mv3H/7kq8B/if4xt+nr9vYZAxrTI//n3/76B8M9n
|
||||
6NrP9D+nrsnf4z///rMV/t41+GfqpvT1/z7/1/dV//Wv/wUAAP//AwBcfFVx4CAAAA==
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 931fceef786ded38-SJC
|
||||
- 93bd57189acf15be-SJC
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
@@ -159,14 +161,14 @@ interactions:
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Thu, 17 Apr 2025 23:47:35 GMT
|
||||
- Wed, 07 May 2025 02:38:16 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Set-Cookie:
|
||||
- __cf_bm=fj4RMXSXRDQjE2CFC6CGC3dVcJ8cl2Cbu8alijwMHA8-1744933655-1.0.1.1-M3c3AI4XQa.0GJoanNACuOm2aEL4xjqHR1grxIP3olFvq3e0eFHwQTvCF20YwR_OLiMJUH87eNUwgziawMccsxjR9OVZyDr5._5Wts6CrqA;
|
||||
path=/; expires=Fri, 18-Apr-25 00:17:35 GMT; domain=.api.openai.com; HttpOnly;
|
||||
- __cf_bm=VGdrMAj2834vuX5RC6lPbHVNwWHXnBmqLb0kAhiGO4g-1746585496-1.0.1.1-kvgkEGO9fI9sasCfJjizGBG4k82_KhCRbH8CEyFrjJatzMoxhM0Z3suJO_hFFH13Wyi2wThiM9QSPvH3dddjfC7hC_tscxijZwiGqtCVnnE;
|
||||
path=/; expires=Wed, 07-May-25 03:08:16 GMT; domain=.api.openai.com; HttpOnly;
|
||||
Secure; SameSite=None
|
||||
- _cfuvid=MSkpJsQZtdyIGvrl2mIwy0a_We8H6CIrS7etFgRBl2Y-1744933655703-0.0.1.1-604800000;
|
||||
- _cfuvid=sAoMYVxAaEFBkQttcKO7GlBZ5NlUNUIaJomZ05pGlCs-1746585496569-0.0.1.1-604800000;
|
||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
@@ -178,22 +180,20 @@ interactions:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
cf-cache-status:
|
||||
- DYNAMIC
|
||||
openai-model:
|
||||
- text-embedding-3-small
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '140'
|
||||
- '69'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
via:
|
||||
- envoy-router-84959bbcd5-rzqvq
|
||||
- envoy-router-7d545f8f56-jx5wk
|
||||
x-envoy-upstream-service-time:
|
||||
- '110'
|
||||
- '52'
|
||||
x-ratelimit-limit-requests:
|
||||
- '10000'
|
||||
x-ratelimit-limit-tokens:
|
||||
@@ -207,32 +207,33 @@ interactions:
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_dd3ef61c4765b46ed7db80ddfe261f41
|
||||
- req_73f3f0d371e3c19b16c7a6d7cc45d3ee
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are Information Agent.
|
||||
You have access to specific knowledge sources.\nYour personal goal is: Provide
|
||||
information based on knowledge sources\nTo give my best complete final answer
|
||||
to the task respond using the exact following format:\n\nThought: I now can
|
||||
give a great answer\nFinal Answer: Your final answer must be the great and the
|
||||
most complete as possible, it must be outcome described.\n\nI MUST use these
|
||||
formats, my job depends on it!"}, {"role": "user", "content": "\nCurrent Task:
|
||||
What is Brandon''s favorite color?\n\nThis is the expected criteria for your
|
||||
final answer: Brandon''s favorite color.\nyou MUST return the actual complete
|
||||
content as the final answer, not a summary.\n\nBegin! This is VERY important
|
||||
to you, use the tools available and give your best Final Answer, your job depends
|
||||
on it!\n\nThought:"}], "model": "gpt-4o-mini", "stop": ["\nObservation:"]}'
|
||||
body: '{"messages": [{"role": "system", "content": "Your goal is to rewrite the
|
||||
user query so that it is optimized for retrieval from a vector database. Consider
|
||||
how the query will be used to find relevant documents, and aim to make it more
|
||||
specific and context-aware. \n\n Do not include any other text than the rewritten
|
||||
query, especially any preamble or postamble and only add expected output format
|
||||
if its relevant to the rewritten query. \n\n Focus on the key words of the intended
|
||||
task and to retrieve the most relevant information. \n\n There will be some
|
||||
extra context provided that might need to be removed such as expected_output
|
||||
formats structured_outputs and other instructions."}, {"role": "user", "content":
|
||||
"The original query is: What is Brandon''s favorite color?\n\nThis is the expected
|
||||
criteria for your final answer: Brandon''s favorite color.\nyou MUST return
|
||||
the actual complete content as the final answer, not a summary.."}], "model":
|
||||
"gpt-4o-mini", "stop": ["\nObservation:"]}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate, zstd
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '926'
|
||||
- '992'
|
||||
content-type:
|
||||
- application/json
|
||||
host:
|
||||
@@ -264,20 +265,19 @@ interactions:
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAAwAAAP//jFLBbtswDL37KwhdeokLx0mbOLd2WIAC63YZdthWGIpMO9xkUZDkpEWR
|
||||
fx/kpLG7dUAvBszHR733yOcEQFAlViDUVgbVWp3efv66LvKPRd7Q/f57vtaf7m6ePjze774sv23E
|
||||
JDJ48wtVeGFdKm6txkBsjrByKAPGqdPFfF7MZtdXVz3QcoU60hob0jmnLRlK8yyfp9kinS5P7C2T
|
||||
Qi9W8CMBAHjuv1GnqfBRrCCbvFRa9F42KFbnJgDhWMeKkN6TD9IEMRlAxSag6aXfgeE9KGmgoR2C
|
||||
hCbKBmn8Hh3AT7MmIzXc9P8ruHXSVGwuPNRyx44CgmLNDsjDRnd4OX7GYd15Ga2aTusRII3hIGNU
|
||||
vcGHE3I4W9LcWMcb/xdV1GTIb0uH0rOJ8n1gK3r0kAA89NF1r9IQ1nFrQxn4N/bPTa+Xx3li2NgI
|
||||
LU5g4CD1qL5cTN6YV1YYJGk/Cl8oqbZYDdRhU7KriEdAMnL9r5q3Zh+dk2neM34AlEIbsCqtw4rU
|
||||
a8dDm8N40P9rO6fcCxYe3Y4UloHQxU1UWMtOH89M+CcfsC1rMg066+h4a7Uts1mRL/M8KzKRHJI/
|
||||
AAAA//8DALRhJdF5AwAA
|
||||
H4sIAAAAAAAAA4xSy27bMBC86yuIvfRiFbKs+HVLUKBFL0YPRosWgcCQK5kNxSXItZEi8L8XlBxL
|
||||
aVOgFx44O8OZ4T5nQoDRsBWgDpJV521+t989PX78tF/fnr5X+w/fdgv6WnxuWruzX25hlhj08BMV
|
||||
v7DeK+q8RTbkBlgFlIxJdb6qljfrm2qz7IGONNpEaz3nFeWdcSYvi7LKi1U+X1/YBzIKI2zFj0wI
|
||||
IZ77M/l0Gp9gK4rZy02HMcoWYXsdEgIC2XQDMkYTWTqG2Qgqcoyut34XpNPk3kXRyBMFwygUWQrT
|
||||
8YDNMcpk2R2tnQDSOWKZIvdG7y/I+WrNUusDPcQ/qNAYZ+KhDigjuWQjMnno0XMmxH1fwfFVKvCB
|
||||
Os810yP2z81Xi0EPxuZHdHnBmFjaKWkze0Ou1sjS2DjpEJRUB9QjdSxcHrWhCZBNQv9t5i3tIbhx
|
||||
7f/Ij4BS6Bl17QNqo14HHscCpr3819i15N4wRAwno7BmgyF9hMZGHu2wLRB/RcauboxrMfhghpVp
|
||||
fF0sNuW6LItNAdk5+w0AAP//AwDAmd1xQAMAAA==
|
||||
headers:
|
||||
CF-Cache-Status:
|
||||
- DYNAMIC
|
||||
CF-RAY:
|
||||
- 931fcef51a67f947-SJC
|
||||
- 93bd571a5a7267e2-SJC
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
@@ -285,14 +285,14 @@ interactions:
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Thu, 17 Apr 2025 23:47:36 GMT
|
||||
- Wed, 07 May 2025 02:38:17 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Set-Cookie:
|
||||
- __cf_bm=7agwu5JV1OJvEFNhvfvqdgWf.HoMyIni9D85soRl3WE-1744933656-1.0.1.1-dKUwAZnjjuuiswFKWGsxpwHNBJUpjhYlZvfZpyNQIejxEJrXMCppgPvtQ9wa4SKezLmKqftvn_H.bAx_AEFJD2EWm5V6R_uK8.odneErR6A;
|
||||
path=/; expires=Fri, 18-Apr-25 00:17:36 GMT; domain=.api.openai.com; HttpOnly;
|
||||
- __cf_bm=62_LRbzx15KBnTorpnulb_ZMoUJCYXHWEnTXVApNOr4-1746585497-1.0.1.1-KqnrR_1Udr1SzCiZW4umsNj1gQgcKOjAPf24HsqotTebuxO48nvo8g_X5O7Mng9tGurC0otvvkjYjsSWuRaddXculJnfdeGq5W3hJhxI21k;
|
||||
path=/; expires=Wed, 07-May-25 03:08:17 GMT; domain=.api.openai.com; HttpOnly;
|
||||
Secure; SameSite=None
|
||||
- _cfuvid=LdTrzwZYrB6ZyQLY7NdaaHVpDVFvIjYm3arSpNy87wU-1744933656504-0.0.1.1-604800000;
|
||||
- _cfuvid=LPWfk79PGAoGrMHseblqRazN9H8qdBY0BP50Y1Bp5wI-1746585497006-0.0.1.1-604800000;
|
||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
@@ -305,11 +305,130 @@ interactions:
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '540'
|
||||
- '183'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-envoy-upstream-service-time:
|
||||
- '187'
|
||||
x-ratelimit-limit-requests:
|
||||
- '30000'
|
||||
x-ratelimit-limit-tokens:
|
||||
- '150000000'
|
||||
x-ratelimit-remaining-requests:
|
||||
- '29999'
|
||||
x-ratelimit-remaining-tokens:
|
||||
- '149999783'
|
||||
x-ratelimit-reset-requests:
|
||||
- 2ms
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_50fa35cb9ba592c55aacf7ddded877ac
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
- request:
|
||||
body: '{"messages": [{"role": "system", "content": "You are Information Agent.
|
||||
You have access to specific knowledge sources.\nYour personal goal is: Provide
|
||||
information based on knowledge sources\nTo give my best complete final answer
|
||||
to the task respond using the exact following format:\n\nThought: I now can
|
||||
give a great answer\nFinal Answer: Your final answer must be the great and the
|
||||
most complete as possible, it must be outcome described.\n\nI MUST use these
|
||||
formats, my job depends on it!"}, {"role": "user", "content": "\nCurrent Task:
|
||||
What is Brandon''s favorite color?\n\nThis is the expected criteria for your
|
||||
final answer: Brandon''s favorite color.\nyou MUST return the actual complete
|
||||
content as the final answer, not a summary.\n\nBegin! This is VERY important
|
||||
to you, use the tools available and give your best Final Answer, your job depends
|
||||
on it!\n\nThought:"}], "model": "gpt-4o-mini", "stop": ["\nObservation:"]}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '926'
|
||||
content-type:
|
||||
- application/json
|
||||
cookie:
|
||||
- __cf_bm=62_LRbzx15KBnTorpnulb_ZMoUJCYXHWEnTXVApNOr4-1746585497-1.0.1.1-KqnrR_1Udr1SzCiZW4umsNj1gQgcKOjAPf24HsqotTebuxO48nvo8g_X5O7Mng9tGurC0otvvkjYjsSWuRaddXculJnfdeGq5W3hJhxI21k;
|
||||
_cfuvid=LPWfk79PGAoGrMHseblqRazN9H8qdBY0BP50Y1Bp5wI-1746585497006-0.0.1.1-604800000
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.68.2
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.68.2
|
||||
x-stainless-raw-response:
|
||||
- 'true'
|
||||
x-stainless-read-timeout:
|
||||
- '600.0'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.9
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/chat/completions
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAAwAAAP//jFNNb9swDL3nVxC67JIMSZo0aW4ttmI77bIO3UdhMBLtcJVJQZKTBkX/
|
||||
+2CnrdOuA3YxYD4+8lGPvB8AGHZmBcZuMNs6+NHF1Zc7Xycnp/Rhdv3x2/fL82r+9Tr/uDrxn8yw
|
||||
Zej6N9n8xHpvtQ6eMqscYBsJM7VVJ4vZ6Xw5n50tOqBWR76lVSGPZjqqWXg0HU9no/FiNFk+sjfK
|
||||
lpJZwc8BAMB99211iqM7s4Lx8ClSU0pYkVk9JwGYqL6NGEyJU0bJZtiDViWTdNI/g+gOLApUvCVA
|
||||
qFrZgJJ2FAF+ySULejjv/ldwEVGcyrsEJW41ciaw6jUCJxDNEJq1Z+v3cCu6E9AIuEX2uPYELGC1
|
||||
rlU60JOrCJI20VIaAiYIFJO2zUKkkiKJpQSeb+lVrwQYCfI+sEXv9xAibzEToLhukC3GPezYkd8D
|
||||
1ioVsDjesmvQJ9hx3mhzpDRtMJIDllJjja1/74/fKlLZJGz9ksb7IwBFNHf5nUs3j8jDsy9eqxB1
|
||||
nV5RTcnCaVNEwqTSepCyBtOhDwOAm87/5oWlJkStQy6y3lLXbnK6PNQz/dr16GzxCGbN6Pv4dDIf
|
||||
vlGvcJSRfTraIGPRbsj11H7dsHGsR8DgaOq/1bxV+zA5S/U/5XvAWgqZXBEiObYvJ+7TIrVX+a+0
|
||||
51fuBJtEccuWiswUWyccldj4w62YtE+Z6qJkqSiGyIeDKUMxPjmbLqfT8dnYDB4GfwAAAP//AwA/
|
||||
0jeHPgQAAA==
|
||||
headers:
|
||||
CF-RAY:
|
||||
- 93bd571c9cf367e2-SJC
|
||||
Connection:
|
||||
- keep-alive
|
||||
Content-Encoding:
|
||||
- gzip
|
||||
Content-Type:
|
||||
- application/json
|
||||
Date:
|
||||
- Wed, 07 May 2025 02:38:18 GMT
|
||||
Server:
|
||||
- cloudflare
|
||||
Transfer-Encoding:
|
||||
- chunked
|
||||
X-Content-Type-Options:
|
||||
- nosniff
|
||||
access-control-expose-headers:
|
||||
- X-Request-ID
|
||||
alt-svc:
|
||||
- h3=":443"; ma=86400
|
||||
cf-cache-status:
|
||||
- DYNAMIC
|
||||
openai-organization:
|
||||
- crewai-iuxna1
|
||||
openai-processing-ms:
|
||||
- '785'
|
||||
openai-version:
|
||||
- '2020-10-01'
|
||||
strict-transport-security:
|
||||
- max-age=31536000; includeSubDomains; preload
|
||||
x-envoy-upstream-service-time:
|
||||
- '931'
|
||||
x-ratelimit-limit-requests:
|
||||
- '30000'
|
||||
x-ratelimit-limit-tokens:
|
||||
@@ -323,7 +442,7 @@ interactions:
|
||||
x-ratelimit-reset-tokens:
|
||||
- 0s
|
||||
x-request-id:
|
||||
- req_8837be6510731522fd5ac4b75c11d486
|
||||
- req_9bf7c8e011b2b1a8e8546b68c82384a7
|
||||
status:
|
||||
code: 200
|
||||
message: OK
|
||||
|
||||
333
tests/cassettes/test_get_knowledge_search_query.yaml
Normal file
333
tests/cassettes/test_get_knowledge_search_query.yaml
Normal file
@@ -0,0 +1,333 @@
|
||||
interactions:
|
||||
- request:
|
||||
body: '{"input": ["Capital of France"], "model": "text-embedding-3-small", "encoding_format":
|
||||
"base64"}'
|
||||
headers:
|
||||
accept:
|
||||
- application/json
|
||||
accept-encoding:
|
||||
- gzip, deflate, zstd
|
||||
connection:
|
||||
- keep-alive
|
||||
content-length:
|
||||
- '96'
|
||||
content-type:
|
||||
- application/json
|
||||
host:
|
||||
- api.openai.com
|
||||
user-agent:
|
||||
- OpenAI/Python 1.68.2
|
||||
x-stainless-arch:
|
||||
- arm64
|
||||
x-stainless-async:
|
||||
- 'false'
|
||||
x-stainless-lang:
|
||||
- python
|
||||
x-stainless-os:
|
||||
- MacOS
|
||||
x-stainless-package-version:
|
||||
- 1.68.2
|
||||
x-stainless-read-timeout:
|
||||
- '600'
|
||||
x-stainless-retry-count:
|
||||
- '0'
|
||||
x-stainless-runtime:
|
||||
- CPython
|
||||
x-stainless-runtime-version:
|
||||
- 3.12.9
|
||||
method: POST
|
||||
uri: https://api.openai.com/v1/embeddings
|
||||
response:
|
||||
body:
|
||||
string: !!binary |
|
||||
H4sIAAAAAAAAA1R6Ww+yyrbl+/4VK+uV3pGLUFXrjbsISKEgYqfTAUEEBeRWQJ2c/97Br/t094uJ
|
||||
WIZUzVljjjHm/I9//fXX321a5Y/x73/++vtTDuPf/217liVj8vc/f/33f/31119//cfv8/9bmddp
|
||||
nmVlU/yW/34smyxf/v7nL/a/nvzfRf/89XfIFDaOu8OtX0ttiOESiQE+pXaRrnxot/BYFAy5kEUP
|
||||
ONOCJaymU0mOVijT9bPeH/BZXg4kYptvv4S7/QQztojx/ay+Ul5kFR8F+9XEUeizziKFvQ+uIr4T
|
||||
E107h77ZSw1r9zPiUCqSYL4SxUbv5usRy5ZLusgkecBnVXUYOzGbrs25voiDG1c4vEhCTxHz8GCW
|
||||
BAdvL9+sngv6iwv3uxDj7HWgPUkXUIBvFy3YnkpJo7AQIvipwoIEgINatQMnCZbG54EP4NtQGgm1
|
||||
j84HJBDbhxeNI+7DBsboBuS5G27VeLUbGV4TmSEyXDVNYO2qg963GbCJrrazpK3iIk9XzQn1/aGa
|
||||
HvHpDatPvZDj9a5UvL67PpApmxGO6maite0dfTjujIfHLN4F0Dun8IiTkpSc0vyuLRl9hRAz+on4
|
||||
10ruubG6mchYruLEKW2rfYOdyUKFSzKs4XeRsjXNYsmeAMVeXOk9n7hjDu9xnGNTZ57p3DVSDWr5
|
||||
fMXpOdNo4zDGG1m7/QtHFVHoPOe4gw9gDliBlhTQ8z7hRY3aLjneZ4Uu8u3TAfe1ODjU9lrFPfww
|
||||
hvez5uNTtZcBx1Zhglg5CrF231NAhv5ewND/BCQ5FX2/CJdjAW8sKYkyly7lHFe2AdR9HZ9Jc6EC
|
||||
8O8qelzOmcdoegrac2xJ0KLvI8be8UXp28I+fJ/uiCQH4ews2KhKpFTMwxPPo5S+M+Oxh4esr7Hb
|
||||
TDigi6VIaPf8aMS6GVol/PItMgbfA8z3HghaTguYSWJFrmr1AovFCxN6lWOCH8Zy0Cj3tG24rysZ
|
||||
n9r61c8ncz9A32GbqY46xeETxDNIpUswCWHyqRbOfMVIaG0bG4uy9Et2iGSYhTeJqEEyARpwbgEt
|
||||
fzpMxzJXU9bImBBMrPAicq0fwCyP9wiGaWsRR481jeNRCZHD8pG348i7IurdyaE9iRTHha1X0yeE
|
||||
NbicCPCK19j063LnYvSxK5+oxyWshHE3y2h/CD/kfrjXdEaVFIL3y9fwbQWHgApiPMGnM2U4gotf
|
||||
rZ/1nKNkskqcWfRTrXUytNBlfAWr5VD1rak4ETSWm4iVsLxt+8tVwDlaOfWqg7XlfHY6cDjfPGIy
|
||||
4EFJveg+1JsHxmnzzgDl35IHDwafeKabnYLRmXoXamJ1m+ZoDTQara0OOTVZsHr21YoviMHCV9w2
|
||||
Xhyo0Fma0VZh/dXfON9HRTV/7vMAFpFeJxbnZ20Rs28L+x0YiFcKNzDj21P95SexzvRYcVejYNHL
|
||||
3p9w6l15ZzaGYQZWc7gQo/rMKQEr1P+s12KUOD+8gU/9ir193EcVh3mkwnF3eGDjcbS1LV9l8aiE
|
||||
lbfU4hqMtzd6S8X3POFnBStnNPtbjcDbqLGm0n06n77BCkrCD948wBDMDjY8+L7vWRztmSPoLobs
|
||||
/fCAaLN67eeMnxKYHFWJHOpFTvlQEfcwPpWLR7+T6nBPHQ9S9MI6ceprH0yrM/OI8IDBTs6jdNu/
|
||||
DN9WTr3yzmgaX5ATD3fr0/R4tjlWbCmuMlpPOSTqnak0sgPGHo6vm7bhs619j3KhIsKLzDT/8EeV
|
||||
1hpe4nkhF+HtaLPNtTnkXo2LtUA2KUfDD4SscAUY21nSc339iVEYXiSiAuAEfKu9BnSa9hl+HG4i
|
||||
WMTs1aGT3p/w7z7wL7dc0RLgAeMrbKtB08gM7kdXw9nexVrfiZGOPs9uT3R6PmksFWUT1dXxhuWD
|
||||
++l7duVnMGVqiPVdGjpcC+YHhF+uw9bK7bWFrcIYct54xGcd2g6b9t7lT71zGdpRKl2frhTXkeJ9
|
||||
TEnVKDafDxh0yteD3rOp1rgp3uiZ2Tnxjq2vsVdrlKRLvC5Y9vRLT+tFv6DrpZ+JWoHIWVRpfcMt
|
||||
HsSNn2oq3OsgRiqlAbaSl+1wVXSckWv6N5I43Q3wrfdkwHzTHyRWXyxYSb/av/wg3m6t6cJPsJTU
|
||||
lDFxoE2rs6T3ywxD+rxO+8ukpUIowTesh9kh0fhM6dqy1IXtM9OxfBtdwKUjTuA8ePrEcrmiCTwT
|
||||
STCTvSvOZOPjbPjeIvdFHeKJitivfqDP6G7CjBzN/KSNuyV4w21/WAuA3As69GPIvT4uPsx2oq1R
|
||||
zQ+wegGbGLWJtCH83k0QNKHhzVnUpHzifh4ofz1PHmdKpdOuB30AeUY/3jc+Tz2NKrH9xYPkr7Oh
|
||||
Cca7f0PXOb2wdXocA5Zw7AOBV6wSP9bHtPM5wkrIHQdispqv0fNBidD3XJUT3zHYoW7nQyh809Hb
|
||||
H42vI2QobaEa+0+c4XB11putTSA5yhLxrDFKZ41QHd4WDXit8S2rqTo+HvA1zphoPP4GRHncV4jl
|
||||
YzR9g2oE81l+mGLTgpToVcqAWc7LPUpQhDzY2FfA8rkO4Wjw2OOy+FEtmjBc4GMML0S1StXhbOH1
|
||||
RkrTavhxApPGqcevDleddYhy+egBOUmCj2q3GT3JSz/BELGi+wd/5LfHBvRY7WTYIQETXDSPlN3w
|
||||
AgyJhXDEAEgH7RTLcOc5KjlcTueeFtkl+lMPsEISh33czh26z7gnp8K9g29/kic0WAMkh8WgQBh0
|
||||
5gKKy74ix0vLassrjy5o6d4SkfkP0eawuEyQu5V3YufKBcyXbIboy6bvaTl1di+8IzRImSWY09kl
|
||||
PhAI/A6QdbuBKLcG9QMPvrzQTt8nztr6VbHzwJhSFKcslmP9lAp5UXQIjFPiOUTrwWqpX+tP/mtc
|
||||
iPvJCZZQcly+J9rXtirhtexZIJhVTzb8C9aIGy1YfS8x8T7tJ2gMl1/h/az4ntBbRb9AtS7gEoEA
|
||||
O80bgd/9hnyvCuQQJp+eWnp5kXhcatOeQwZdNHy2YWsZPLbXz6tarvfrAz1hKGKv65j+veEV5BhT
|
||||
nahzKpxZqqMVupNRYVvMvJ7/Sv0FbnhHrLpfHHrX5hB+PE8gWos9sAiP3Sy1RnzBxzQK+3l9v2oQ
|
||||
0uyKQ3RuANH8kw6/kxzjbOOXQmSGK/RVTCdwMqtq3Yn6/MtPbL1fOhBOr0uNBll38F2LngHHFZ4O
|
||||
F/4wEcv4qpUQa14Ob2eh8Fgdvx1ivKs3xPWww7kIg2BelEcLLr4tT8vjyqRrVS/RH34h2+2gLexq
|
||||
s7DjvHxipUJK6cangcZxM06uwSWgOHcvcMoDwXvkDy7o+ciZ4XqpP8RRHezwQ/JiID/bxSS6vaJ9
|
||||
X/arQyKBPTGGbglG4SGscPceBhzQQ9ovtzf3hvU1QRPParO27KWwg/zJ2OHj1B6D9ZuZMVQiV8VB
|
||||
9DgFg8a8CpAnB8YTZlvSfvkNtXlf4xv4HgArpW0BNzwm6l37BlRT8g4wnxudqslQg8mLAhOZgoG8
9cdHEsRDIKwumnapmPbzQJMJHoKaxcrGX3/4iMDpBYnmfc1KcLsYwvq4PxLtNA2UCF3ng+h10qdd
nRFap50aojJPeqx+TnPaSb26R8wcf7G/C0j1zbuOgXs3IB6bjExA90dBB32sF95Mj1YvbN/Rxk9I
Yow64HOQFKChzJngJv72q6r2rOSwbIQV5VgFg4dWH5k3scTu+q0Ambs3A1fR8vBt/I7BPBwOCdyJ
fEiMV6pro+d9JVCY94FYXzcDvaWXPkpFGeN4qY2eqyJlRnBfH7yPfmTpwOc6I92mMMJX5MQVbYyO
lcbbxd/yray2/V/g5/o2cZ7qi7a691eC1FoVf3yhmp+q+YbvCjUkQ/lXowa9qHA9PSBR7WFK5/KR
WX/qXYfYCfQ5m13gGLy/2FuPQz/o+rlD6mwM5NZbcs9zXlLD4/1IyHH+wIrGZ1aCA7l88AlWJFjk
OjQRMydfD5qDDFojYyL44+PO+lyDRa1OM7zlzh5rytugQh2QGEz6ycAGcvbV+sZ6iXKDuU/rwW/o
d+MbwJyHFWdfjnVmfgAJ+O4PDnHzB5cSK/AHwJeJOQmrPFakbNccJRwi2OqRo/G96Tykl1buvPHw
clK2ZVAJxITTSJ4chWAKqBPDTb8RV3zNdEnvyQq9em9h88GE/WrRtQU/fXXAg06Xj36SITeKb2Ie
P0Ww6GaygmOUO8RwzmUwd/xLRdM9GsnJrJd+0x8h2PALO1nTASp/Yhvm/JvDxySVwAgnkUcbvyWG
SC4O6198HlbOtcAGhiCYHkzBI2ZiGqw97RP9jNXThGlkn7APhKPDO8ESAeMqGMTK0Fwt+nOXgy6S
NSID4aiRXVyyUGVYi4RZ7lA+oE4CZRrJRNvPvEM13zChJr5u3szEVzoPV8SDa6Iy+CSpaiXgQZog
aOgJK2fz7FBWGX3QS7DE2vHm9iy+3VT49mJ54saPowmPq1xA4SSJZLu/gGbJ/EblmwrePn1UYCK7
eoYnK2KJ9uNb2KgKVLZZThyLVMEsj+fwd37TaxrELV8T9Yd3WJ51tV+b8+RDJN/DTZ/LgQAPlxYq
OU+J+Rk4rVX2hxruibNMHGNeAqGLqIys0rLIvTqCYH6xCgPZ83rFch+9wNAyXAmsTH16/J75gtWF
7gPmXmOTw+GLUrryJxPeB93A12BXgpmmvQtxpcX48OHNfnnqeIK/88eTH/XLcjR4KJbu5efngHm3
93kwOe+AeJirwLpOwR5BE7XE84vM4YmX+fAwV19vJ5lvSpMskyHoZIpl6B37ae/DCGqWBIjiiZlG
iywJYQpG6rH53ajmvBRXsHfPBGNmycDS5EHNFfYhmCStmOliGZMLW2Xq8UkJb9Wq3ezo5/8Qyw7T
fmzBPof5x2L/3C9CSz9H3rMQt/yWA77eKwWsDl+RHGuurKZz0/vQct45OZiaXa0JtSZo4zEmj/Rt
pbN/vOSIUZ8L0eXqQckS7UNkZfKThPdD3xP541uIpHebeIOwVpueV9FTv2FsVB8/ZfscenCPUYST
p2AGwtwNDEj1ycWHwOpTOuj8Bb7qu4LtyXsFlG2tEpYscyVHxaXOUvrnAaaXR01wRnC62rtklTa+
hR1hGdI581Ifci2TTNLkKcH8uJ1bBKvHk+SPqA8oLeMc6L6h4OTerRUlju8iIn0+5KkQyVm3+waR
LVv48PMn9kfNhsGS7DF2A+/P+6Dn6fuJ9xqxmupvasKyKsNpOZtnjQaXVw43vj7B6kaccfN34GGI
NXw5LmxFJLDm6E7BBePzxwros/Yn1LRiSix/jyk7MRIPzVGzscmVWspl+6aAnkk6rL9V2elEET+A
wWkFUTAbAbI2UQdTLmwnfsOrmR9oAn5+1MZXNYrIZwbnqSqwBaq4GqD8SX7xxNiBI/3hF6yk4U4y
G52BUAdNjN479eEtHybu1+SdXODiFxYxjF2bzk3UxaAy44hc47HqOU6Q9rC7hA/y03NLt+/sP3h0
P9xNSoB/lmHER8MEnsvNmdeDPv38S2IEO5XysGFruOHHJLEO2087f+mQJl9sfCx0Lp2E9mVLDfz8
H3z9+Y+92T5J0N1TjRejdwKb2e+wB0eZCoo+MPBz3j9JnD4qOvO2HyI218epZ773dGaZvSVKTvbE
zmRdnMU2YxNsfi22FokJ6DU/1jDx4yOJDFF0Vv0hJbBdbwdyXIoo5cluWiHiISLKs/s4czHd9/B0
OjPeromPlaDloICptzJTtfEvfqsvUHbCK85pvNPWTI8hXKAMiBJBvRLq3TpA53jUsXWm34rW5ZWF
9sFsiCwUFqDrTsvBhic/Pzcgh1PRIl5/etNSlhdnLcx1QMHBabB5CWQgiPpeguvudfeks9CDGV01
CRJhrxDsxGGwgK8yo9N95TwmONvpli8xOuFBxUotrikVTkYsbviDbZsw2urm5A0vjnojxqbvOYja
C9TOk7Pxt5VOh+fDhrITXYlyMPlgIl52gT7lLwTv/XMw9zl0oV0HKsafMtJ6rRH3YNPHHkpz0SEO
c6rBFl8PHm4inU971UM/v3njc/1YKr0PkuQBMN78ofVjehbIh0KdOss79/RCwgeaUWdgIypwRa/D
Xv35H8T/HCetu+elBze/yqNB+OxX7aaG6B4OAlZrq3MWOC0s2vQrPi4Fn071/ljCULAhdkKxC4Tj
3Vp//yd4ifV0OUlaKL5cZ/akMI77OWsECAfxGk/DdQrB/LAqV9rwHpt6V1fLK899wIZPcfps/IGA
t9FC/eN+SSSKVT/sv6MJvOvjgI/bfRluJcx/+OsJwuVQkW1/8OcnRZ/hqpEhGAawOMmTyGe3SJfY
st9wjdGM5UEBThMOng02PMHpeIrTdZ1SCT7vuYCtpf5Um1/jwWU3lfiw3yf9eH5KJuSr73d68c8d
HZ5CfIGJ2EdeluC6H+7+bpB0r9thNb9qwdLg0Yab3+S9SLMC2j3F5Fff8LFdOUCYizTDyzv4bHpP
6ac6eXfwgFh92vCkWhRNVCWPNyti2/rH+cMHNj1JsNoFlHuN7gB/eu6w6c25a9Y3PPUePzEgrau5
H8sWfs+vkiiS0zgr2U3zD3+J+wVawLMiq4p5qitb/jt0nC+6h+BYnT1287/G5XhioWDjCza+Xgfo
O+ImOEGXkOweukDoxFwH0XNRsU+Pbb/A8wMC9TsU5K64VBuDR79CY/QCchIPLaBRtbTIL+Mz9oJz
l87v0OrgtH9Zf/g0a+CDBJ4NPhNN2ec9laPXgPKhVLE5JdeKX0x5RUNfM8S5srrDea1Xw+FCCXFD
CrRe5NUYNpa/89Ys+VYziBMXuJ5oEWVcdsEqioccNP1RmVB9P1V00BkfIvKZsDKadUA2fQ/XhcrE
PJNDwNvzOwTIJQO2ETvR9YVYH8nXu0WwA0+UZRIA4bQcpenr9i9tuJvHHAYnk8XWk+oOnYzWRvVR
OmI1HL7OWl4SHsTCqnprBXhtBfPK/PETVLK8gxXoTQt/9ce1rUOwvtxuhTrfdjgUOSXd+jUR4n1o
kNvVYPqhOQAIG9ioRFWYkfbV1JVQNdl8i9/ozBoBJpS6XvLKPnlr9BBHe9hO/dMT/UNTzbS7t+Cn
f8yBVakwn+UL0voZYVUyBNpMT52Hm77CrtAjbfzohiotXS0RXSqkYNz8Ydg4TE0co7lWa/SwHrAX
fEDOF8WmA2aphGqp/GAVnYyK/eknTG4zdrf+Bx837RvWt64izidZKpKhtIOWjn1vTU9cvw4OKP74
G27y9SqujqvyT3ycnM+C+aenLkXDTEAzW0B//mckqQGxxPuQbniXw7NPD9P+ZlT93BwoA5Rl4rFN
Mgp+fgtqC2AS75kuwXjlpQhKB7qfaIwkrb37wgCUnKWbuXTUtn5OCX563rt7WbVemb6Al0cee4Ji
zXRuojL+g/+O63zBuvVv4MjnPTGKwk6XJCgH4BhZNoHhwlS02KEYZucu8sQDklN+5y8t2urptLjE
p+MLExaQUztjL3T1YN09CxMpjOYRZ/emv35E9/N7sBeNM/0mQTdB8WR+seXga7pgoy/h7/w3Pkzn
PNQiKO/PNn7gRAl+/BSuzXTC0fgEdG57aw+as4mx7J/NXiidaw09XTbxTx99Gc/rwOPb9eTg3phf
P6aFy90WPenD3qtl67cARTR1bJH7uZ/rZ7GiD7u+iMWhD5imW9BB/l7zxFJq4AxN1CVQi/rrNOtp
nC7P1/AGSuFzePOHqzk9f99w81vxIXqfKf1IdvvTo+QEKxyspfSB0OpmER8SsDrLvm5l2Do0m3a+
eXXW2pxldLw7hDjbeQ9bPwKW7TPHbrC8Kirydgw7WUqJcUw9IJCR+n/8Gquw39Xy45PdyaNbf0yv
eNeWINSDucUGIUK6/PrV/bd4TWCrH6yUFgXqtM8O6+cdAtMtsHW4a5OAWN6t0pYqPUAoNUzk9bJS
/vRVDiRDD6dff41w4/wGh3Z4ELvwTMpLvSr9/GmibPySpgstkYZcxoNKYlLaRUCFW78HywNk6fLh
8lL6Puob1hP1E7TNATDQGTST2JE3ONRIPy6MmF3mwQ0vh7h9F1B+zBGJk5et8Y3bmpDgffTzryjX
vp0WTiz3msRVJ2CuVWcPHGXhyeH2dCrOlDwVdIXJEwdNu75Tzxcf7YziTDJ2bqu1Fx4Q/v2bCvjP
f/311//4TRjUbZZ/tsGAMV/Gf//XqMC/hX8PdfL5/BlDmIakyP/+539PIPz97dv6O/7PsX3nzfD3
P38Jf0YN/h7bMfn8P4//tb3oP//1vwAAAP//AwDPjjDU3iAAAA==
headers:
CF-Cache-Status:
- DYNAMIC
CF-RAY:
- 93c2407849cb943e-SJC
Connection:
- keep-alive
Content-Encoding:
- gzip
Content-Type:
- application/json
Date:
- Wed, 07 May 2025 16:56:39 GMT
Server:
- cloudflare
Set-Cookie:
- __cf_bm=iRcThWdZ.NO0HPhZg.pUV1AiG0u.0Dkd58N9HGucKdQ-1746636999-1.0.1.1-Cswtia9bUNC0npExHV2GcZLT2MVo6tEQbFU_dsKpjNN5R3s37B6JGWTE1IIZV9V0UGLhiy04og474anpJW4c6yLw0.9q5F4MPcxtAOjwBvo;
path=/; expires=Wed, 07-May-25 17:26:39 GMT; domain=.api.openai.com; HttpOnly;
Secure; SameSite=None
- _cfuvid=rvDDZbBWaissP0luvtyuyyAWcPx3AiaoZS9LkAuK4sM-1746636999152-0.0.1.1-604800000;
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
Transfer-Encoding:
- chunked
X-Content-Type-Options:
- nosniff
access-control-allow-origin:
- '*'
access-control-expose-headers:
- X-Request-ID
alt-svc:
- h3=":443"; ma=86400
openai-model:
- text-embedding-3-small
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '116'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=31536000; includeSubDomains; preload
via:
- envoy-router-6b78fbf94c-z6prb
x-envoy-upstream-service-time:
- '123'
x-ratelimit-limit-requests:
- '10000'
x-ratelimit-limit-tokens:
- '10000000'
x-ratelimit-remaining-requests:
- '9999'
x-ratelimit-remaining-tokens:
- '9999996'
x-ratelimit-reset-requests:
- 6ms
x-ratelimit-reset-tokens:
- 0s
x-request-id:
- req_3f67e7a1b90d845c25e9cef31147aba0
status:
code: 200
message: OK
- request:
body: '{"messages": [{"role": "system", "content": "You are Information Agent.
I have access to knowledge sources\nYour personal goal is: Provide information
based on knowledge sources\nTo give my best complete final answer to the task
respond using the exact following format:\n\nThought: I now can give a great
answer\nFinal Answer: Your final answer must be the great and the most complete
as possible, it must be outcome described.\n\nI MUST use these formats, my job
depends on it!"}, {"role": "user", "content": "\nCurrent Task: What is the capital
of France?\n\nThis is the expected criteria for your final answer: The capital
of France is Paris.\nyou MUST return the actual complete content as the final
answer, not a summary.\n\nBegin! This is VERY important to you, use the tools
available and give your best Final Answer, your job depends on it!\n\nThought:"}],
"model": "gpt-4", "stop": ["\nObservation:"]}'
headers:
accept:
- application/json
accept-encoding:
- gzip, deflate, zstd
connection:
- keep-alive
content-length:
- '911'
content-type:
- application/json
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.68.2
x-stainless-arch:
- arm64
x-stainless-async:
- 'false'
x-stainless-lang:
- python
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.68.2
x-stainless-raw-response:
- 'true'
x-stainless-read-timeout:
- '600.0'
x-stainless-retry-count:
- '0'
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
- 3.12.9
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
body:
string: !!binary |
H4sIAAAAAAAAA4xSwW7bMAy9+ysIXXpJgmbN0iW3DEPQAtswDBt2WAuDlWhbjSxpEt00KPLvg+Uk
drYeetGBj3x6fHwvGYDQSixByApZ1t6MP/6UX75vV82n7Yf17uvKX4dVuXn/Sxc3n2+uxKidcA+P
JPk4NZGu9oZYO9vBMhAytazT69l8fjVfLBYJqJ0i046Vnsez8eV8eiCUldOSoljC7wwA4CW9rTar
6Fks4XJ0rNQUI5YklqcmABGcaSsCY9SR0bIY9aB0lskmubcg0VrH4IN70ooA7Q4cVxRA28KFGtst
ACNwRcAYNyANYTA7iIxMXZ2ePUkmBYW2aABt3FIAtAqUo2gvGAL9aXQgQKV0y4hmyD+BW4iVa4w6
6ehoUfKR7cCgJnf2zq7TP6uELOFHRSDRa0YDroB1QCsJdIRvGHScDFcPVDQRW8ttY8wASC4kMcn0
+wOyP9lsXOmDe4j/jIpCWx2rPBBGZ1tLIzsvErrPAO7TOZuzCwkfXO05Z7eh9N10vuj4RJ+cHp1N
DyA7RtPX300PITjnyxUxahMHgRASZUWqH+3Tg43SbgBkg63/V/Mad7e5tuVb6HtASvJMKveBlJbn
G/dtgR5Tsl5vO7mcBItI4UlLyllTaC+hqMDGdNEXcReZ6rzQtqTgg075by+Z7bO/AAAA//8DAI9H
rN32AwAA
headers:
CF-RAY:
- 93c2407d5cfaface-SJC
Connection:
- keep-alive
Content-Encoding:
- gzip
Content-Type:
- application/json
Date:
- Wed, 07 May 2025 16:56:41 GMT
Server:
- cloudflare
Set-Cookie:
- __cf_bm=g.ZMxB8fB7ZSkwD6w5ws93pGGw6nEi3uFVh.JDp2OOU-1746637001-1.0.1.1-59mPPW0bDWyD6ngFx6m9LdHurrdN9Kaem.eFcKAwWp_H_4kabp2CzCRiEaW2QhRYYPWE6fZPgqWU8amQtZqpRZHtEjTyoL8t6UtyzyTCoAQ;
path=/; expires=Wed, 07-May-25 17:26:41 GMT; domain=.api.openai.com; HttpOnly;
Secure; SameSite=None
- _cfuvid=WOtfHIloFTPkupN1gC2z.3cExzObgfz.p4fXYpCK0aI-1746637001631-0.0.1.1-604800000;
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
Transfer-Encoding:
- chunked
X-Content-Type-Options:
- nosniff
access-control-expose-headers:
- X-Request-ID
alt-svc:
- h3=":443"; ma=86400
cf-cache-status:
- DYNAMIC
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '2084'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=31536000; includeSubDomains; preload
x-envoy-upstream-service-time:
- '2086'
x-ratelimit-limit-requests:
- '10000'
x-ratelimit-limit-tokens:
- '1000000'
x-ratelimit-remaining-requests:
- '9999'
x-ratelimit-remaining-tokens:
- '999805'
x-ratelimit-reset-requests:
- 6ms
x-ratelimit-reset-tokens:
- 11ms
x-request-id:
- req_1ff0f0c079f8e7f5feb17fe762b5e40a
status:
code: 200
message: OK
version: 1
@@ -231,7 +231,7 @@ class TestDeployCommand(unittest.TestCase):
[project]
name = "test_project"
version = "0.1.0"
requires-python = ">=3.10,<3.13"
requires-python = ">=3.10,<3.14"
dependencies = ["crewai"]
""",
)
@@ -250,7 +250,7 @@ class TestDeployCommand(unittest.TestCase):
[project]
name = "test_project"
version = "0.1.0"
requires-python = ">=3.10,<3.13"
requires-python = ">=3.10,<3.14"
dependencies = ["crewai"]
""",
)
39
tests/test_python_compatibility.py
Normal file
@@ -0,0 +1,39 @@
"""Tests for Python version compatibility."""
import sys

import pytest
from packaging import version


def validate_python_version():
    """Validate that the current Python version is supported."""
    min_version = (3, 10)
    max_version = (3, 14)
    current = sys.version_info[:2]

    if not (min_version <= current < max_version):
        raise RuntimeError(
            f"This package requires Python {min_version[0]}.{min_version[1]} to "
            f"{max_version[0]}.{max_version[1]-1}. You have Python {current[0]}.{current[1]}"
        )


def test_python_version_compatibility():
    """Test that the package supports the current Python version."""
    assert isinstance(sys.version_info, tuple), "Version Information must be a tuple"

    current_version = version.parse(f"{sys.version_info.major}.{sys.version_info.minor}")
    assert current_version >= version.parse("3.10"), "Python version too old"
    assert current_version < version.parse("3.14"), "Python version too new"

    # This test will fail if the package doesn't support the current Python version
    import crewai

    # Print the Python version for debugging
    print(f"Python version: {sys.version}")

    # Print the crewai version for debugging
    print(f"CrewAI version: {crewai.__version__}")

    # Validate Python version
    validate_python_version()
12
uv.lock
generated
@@ -1,5 +1,5 @@
version = 1
requires-python = ">=3.10, <3.13"
requires-python = ">=3.10, <3.14"
resolution-markers = [
"python_full_version < '3.11' and platform_python_implementation == 'PyPy' and platform_system == 'Darwin' and sys_platform == 'darwin'",
"python_full_version < '3.11' and platform_machine == 'aarch64' and platform_python_implementation == 'PyPy' and platform_system == 'Linux' and sys_platform == 'darwin'",
@@ -738,7 +738,7 @@ wheels = [

[[package]]
name = "crewai"
version = "0.118.0"
version = "0.119.0"
source = { editable = "." }
dependencies = [
{ name = "appdirs" },
@@ -828,7 +828,7 @@ requires-dist = [
{ name = "blinker", specifier = ">=1.9.0" },
{ name = "chromadb", specifier = ">=0.5.23" },
{ name = "click", specifier = ">=8.1.7" },
{ name = "crewai-tools", marker = "extra == 'tools'", specifier = "~=0.42.2" },
{ name = "crewai-tools", marker = "extra == 'tools'", specifier = "~=0.44.0" },
{ name = "docling", marker = "extra == 'docling'", specifier = ">=2.12.0" },
{ name = "fastembed", marker = "extra == 'fastembed'", specifier = ">=0.4.1" },
{ name = "instructor", specifier = ">=1.3.3" },
@@ -879,7 +879,7 @@ dev = [

[[package]]
name = "crewai-tools"
version = "0.42.2"
version = "0.44.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "chromadb" },
@@ -894,9 +894,9 @@ dependencies = [
{ name = "pytube" },
{ name = "requests" },
]
sdist = { url = "https://files.pythonhosted.org/packages/17/34/9e63e2db53d8f5c30353f271a3240687a48e55204bbd176a057c0b7658c8/crewai_tools-0.42.2.tar.gz", hash = "sha256:69365ffb168cccfea970e09b308905aa5007cfec60024d731ffac1362a0153c0", size = 754967 }
sdist = { url = "https://files.pythonhosted.org/packages/b8/1f/2977dc72628c1225bf5788ae22a65e5a53df384d19b197646d2c4760684e/crewai_tools-0.44.0.tar.gz", hash = "sha256:44e0c26079396503a326efdd9ff34bf369d410cbf95c362cc523db65b18f3c3a", size = 892004 }
wheels = [
{ url = "https://files.pythonhosted.org/packages/4e/43/0f70b95350084e5cb1e1d74e9acb9e18a89ba675b1d579c787c2662baba7/crewai_tools-0.42.2-py3-none-any.whl", hash = "sha256:13727fb68f0efefd21edeb281be3d66ff2f5a3b5029d4e6adef388b11fd5846a", size = 583933 },
{ url = "https://files.pythonhosted.org/packages/ba/80/b91aa837d06edbb472445ea3c92d7619518894fd3049d480e5fffbf0c21b/crewai_tools-0.44.0-py3-none-any.whl", hash = "sha256:119e2365fe66ee16e18a5e8e222994b19f76bafcc8c1bb87f61609c1e39b2463", size = 583462 },
]

[[package]]