Mirror of https://github.com/crewAIInc/crewAI.git (synced 2025-12-17 12:58:31 +00:00)

Compare commits (11 commits): fix/testin ... bugfix/res
| Author | SHA1 | Date |
|---|---|---|
| | 61a4d7b8da | |
| | e5b222c049 | |
| | fb396cbaaa | |
| | 3ce44764aa | |
| | 25690347d2 | |
| | 97b85e1830 | |
| | cf39d73e64 | |
| | 26b0375349 | |
| | e322953c8b | |
| | 7d5da94382 | |
| | 589447c5c4 | |
.github/workflows/linter.yml (vendored): 2 changed lines

@@ -13,4 +13,4 @@ jobs:
         pip install ruff

     - name: Run Ruff Linter
-      run: ruff check
+      run: ruff check --exclude "templates","__init__.py"
.github/workflows/stale.yml (vendored): 8 changed lines

@@ -1,10 +1,5 @@
 name: Mark stale issues and pull requests

-permissions:
-  contents: write
-  issues: write
-  pull-requests: write
-
 on:
   schedule:
     - cron: '10 12 * * *'

@@ -13,6 +8,9 @@ on:
 jobs:
   stale:
     runs-on: ubuntu-latest
+    permissions:
+      issues: write
+      pull-requests: write
     steps:
       - uses: actions/stale@v9
         with:
@@ -1,7 +1,9 @@
 repos:
   - repo: https://github.com/astral-sh/ruff-pre-commit
-    rev: v0.8.2
+    rev: v0.4.4
     hooks:
       - id: ruff
         args: ["--fix"]
+        exclude: "templates"
       - id: ruff-format
+        exclude: "templates"
@@ -1,9 +0,0 @@
-exclude = [
-    "templates",
-    "__init__.py",
-]
-
-[lint]
-select = [
-    "I", # isort rules
-]
README.md: 177 changed lines

@@ -4,7 +4,7 @@

 # **CrewAI**

-🤖 **CrewAI**: Production-grade framework for orchestrating sophisticated AI agent systems. From simple automations to complex real-world applications, CrewAI provides precise control and deep customization. By fostering collaborative intelligence through flexible, production-ready architecture, CrewAI empowers agents to work together seamlessly, tackling complex business challenges with predictable, consistent results.
+🤖 **CrewAI**: Cutting-edge framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks.

 <h3>

@@ -22,17 +22,13 @@
 - [Why CrewAI?](#why-crewai)
 - [Getting Started](#getting-started)
 - [Key Features](#key-features)
-- [Understanding Flows and Crews](#understanding-flows-and-crews)
-- [CrewAI vs LangGraph](#how-crewai-compares)
 - [Examples](#examples)
   - [Quick Tutorial](#quick-tutorial)
   - [Write Job Descriptions](#write-job-descriptions)
   - [Trip Planner](#trip-planner)
   - [Stock Analysis](#stock-analysis)
-  - [Using Crews and Flows Together](#using-crews-and-flows-together)
 - [Connecting Your Crew to a Model](#connecting-your-crew-to-a-model)
 - [How CrewAI Compares](#how-crewai-compares)
-- [Frequently Asked Questions (FAQ)](#frequently-asked-questions-faq)
 - [Contribution](#contribution)
 - [Telemetry](#telemetry)
 - [License](#license)
@@ -40,51 +36,22 @@
 ## Why CrewAI?

 The power of AI collaboration has too much to offer.
-CrewAI is a standalone framework, built from the ground up without dependencies on Langchain or other agent frameworks. It's designed to enable AI agents to assume roles, share goals, and operate in a cohesive unit - much like a well-oiled crew. Whether you're building a smart assistant platform, an automated customer service ensemble, or a multi-agent research team, CrewAI provides the backbone for sophisticated multi-agent interactions.
+CrewAI is designed to enable AI agents to assume roles, share goals, and operate in a cohesive unit - much like a well-oiled crew. Whether you're building a smart assistant platform, an automated customer service ensemble, or a multi-agent research team, CrewAI provides the backbone for sophisticated multi-agent interactions.

 ## Getting Started

-### Learning Resources
-
-Learn CrewAI through our comprehensive courses:
-- [Multi AI Agent Systems with CrewAI](https://www.deeplearning.ai/short-courses/multi-ai-agent-systems-with-crewai/) - Master the fundamentals of multi-agent systems
-- [Practical Multi AI Agents and Advanced Use Cases](https://www.deeplearning.ai/short-courses/practical-multi-ai-agents-and-advanced-use-cases-with-crewai/) - Deep dive into advanced implementations
-
-### Understanding Flows and Crews
-
-CrewAI offers two powerful, complementary approaches that work seamlessly together to build sophisticated AI applications:
-
-1. **Crews**: Teams of AI agents with true autonomy and agency, working together to accomplish complex tasks through role-based collaboration. Crews enable:
-   - Natural, autonomous decision-making between agents
-   - Dynamic task delegation and collaboration
-   - Specialized roles with defined goals and expertise
-   - Flexible problem-solving approaches
-
-2. **Flows**: Production-ready, event-driven workflows that deliver precise control over complex automations. Flows provide:
-   - Fine-grained control over execution paths for real-world scenarios
-   - Secure, consistent state management between tasks
-   - Clean integration of AI agents with production Python code
-   - Conditional branching for complex business logic
-
-The true power of CrewAI emerges when combining Crews and Flows. This synergy allows you to:
-- Build complex, production-grade applications
-- Balance autonomy with precise control
-- Handle sophisticated real-world scenarios
-- Maintain clean, maintainable code structure
-
-### Getting Started with Installation
-
 To get started with CrewAI, follow these simple steps:

 ### 1. Installation

-Ensure you have Python >=3.10 <3.13 installed on your system. CrewAI uses [UV](https://docs.astral.sh/uv/) for dependency management and package handling, offering a seamless setup and execution experience.
+Ensure you have Python >=3.10 <=3.12 installed on your system. CrewAI uses [UV](https://docs.astral.sh/uv/) for dependency management and package handling, offering a seamless setup and execution experience.

 First, install CrewAI:

 ```shell
 pip install crewai
 ```

 If you want to install the 'crewai' package along with its optional features that include additional tools for agents, you can do so by using the following command:

 ```shell
@@ -92,22 +59,6 @@ pip install 'crewai[tools]'
 ```
 The command above installs the basic package and also adds extra components which require more dependencies to function.

-### Troubleshooting Dependencies
-
-If you encounter issues during installation or usage, here are some common solutions:
-
-#### Common Issues
-
-1. **ModuleNotFoundError: No module named 'tiktoken'**
-   - Install tiktoken explicitly: `pip install 'crewai[embeddings]'`
-   - If using embedchain or other tools: `pip install 'crewai[tools]'`
-
-2. **Failed building wheel for tiktoken**
-   - Ensure Rust compiler is installed (see installation steps above)
-   - For Windows: Verify Visual C++ Build Tools are installed
-   - Try upgrading pip: `pip install --upgrade pip`
-   - If issues persist, use a pre-built wheel: `pip install tiktoken --prefer-binary`
-
 ### 2. Setting Up Your Crew with the YAML Configuration

 To create a new CrewAI project, run the following CLI (Command Line Interface) command:
@@ -313,16 +264,13 @@ In addition to the sequential process, you can use the hierarchical process, whi

 ## Key Features

-**Note**: CrewAI is a standalone framework built from the ground up, without dependencies on Langchain or other agent frameworks.
-- **Deep Customization**: Build sophisticated agents with full control over the system - from overriding inner prompts to accessing low-level APIs. Customize roles, goals, tools, and behaviors while maintaining clean abstractions.
-- **Autonomous Inter-Agent Delegation**: Agents can autonomously delegate tasks and inquire amongst themselves, enabling complex problem-solving in real-world scenarios.
-- **Flexible Task Management**: Define and customize tasks with granular control, from simple operations to complex multi-step processes.
-- **Production-Grade Architecture**: Support for both high-level abstractions and low-level customization, with robust error handling and state management.
-- **Predictable Results**: Ensure consistent, accurate outputs through programmatic guardrails, agent training capabilities, and flow-based execution control. See our [documentation on guardrails](https://docs.crewai.com/how-to/guardrails/) for implementation details.
-- **Model Flexibility**: Run your crew using OpenAI or open source models with production-ready integrations. See [Connect CrewAI to LLMs](https://docs.crewai.com/how-to/LLM-Connections/) for detailed configuration options.
-- **Event-Driven Flows**: Build complex, real-world workflows with precise control over execution paths, state management, and conditional logic.
-- **Process Orchestration**: Achieve any workflow pattern through flows - from simple sequential and hierarchical processes to complex, custom orchestration patterns with conditional branching and parallel execution.
+- **Role-Based Agent Design**: Customize agents with specific roles, goals, and tools.
+- **Autonomous Inter-Agent Delegation**: Agents can autonomously delegate tasks and inquire amongst themselves, enhancing problem-solving efficiency.
+- **Flexible Task Management**: Define tasks with customizable tools and assign them to agents dynamically.
+- **Processes Driven**: Currently only supports `sequential` task execution and `hierarchical` processes, but more complex processes like consensual and autonomous are being worked on.
+- **Save output as file**: Save the output of individual tasks as a file, so you can use it later.
+- **Parse output as Pydantic or Json**: Parse the output of individual tasks as a Pydantic model or as a Json if you want to.
+- **Works with Open Source Models**: Run your crew using Open AI or open source models refer to the [Connect CrewAI to LLMs](https://docs.crewai.com/how-to/LLM-Connections/) page for details on configuring your agents' connections to models, even ones running locally!

 

@@ -357,98 +305,6 @@ You can test different real life examples of AI crews in the [CrewAI-examples re

 [](https://www.youtube.com/watch?v=e0Uj4yWdaAg "Stock Analysis")

-### Using Crews and Flows Together
-
-CrewAI's power truly shines when combining Crews with Flows to create sophisticated automation pipelines. Here's how you can orchestrate multiple Crews within a Flow:
-
-```python
-from crewai.flow.flow import Flow, listen, start, router
-from crewai import Crew, Agent, Task
-from pydantic import BaseModel
-
-# Define structured state for precise control
-class MarketState(BaseModel):
-    sentiment: str = "neutral"
-    confidence: float = 0.0
-    recommendations: list = []
-
-class AdvancedAnalysisFlow(Flow[MarketState]):
-    @start()
-    def fetch_market_data(self):
-        # Demonstrate low-level control with structured state
-        self.state.sentiment = "analyzing"
-        return {"sector": "tech", "timeframe": "1W"}  # These parameters match the task description template
-
-    @listen(fetch_market_data)
-    def analyze_with_crew(self, market_data):
-        # Show crew agency through specialized roles
-        analyst = Agent(
-            role="Senior Market Analyst",
-            goal="Conduct deep market analysis with expert insight",
-            backstory="You're a veteran analyst known for identifying subtle market patterns"
-        )
-        researcher = Agent(
-            role="Data Researcher",
-            goal="Gather and validate supporting market data",
-            backstory="You excel at finding and correlating multiple data sources"
-        )
-
-        analysis_task = Task(
-            description="Analyze {sector} sector data for the past {timeframe}",
-            expected_output="Detailed market analysis with confidence score",
-            agent=analyst
-        )
-        research_task = Task(
-            description="Find supporting data to validate the analysis",
-            expected_output="Corroborating evidence and potential contradictions",
-            agent=researcher
-        )
-
-        # Demonstrate crew autonomy
-        analysis_crew = Crew(
-            agents=[analyst, researcher],
-            tasks=[analysis_task, research_task],
-            process=Process.sequential,
-            verbose=True
-        )
-        return analysis_crew.kickoff(inputs=market_data)  # Pass market_data as named inputs
-
-    @router(analyze_with_crew)
-    def determine_next_steps(self):
-        # Show flow control with conditional routing
-        if self.state.confidence > 0.8:
-            return "high_confidence"
-        elif self.state.confidence > 0.5:
-            return "medium_confidence"
-        return "low_confidence"
-
-    @listen("high_confidence")
-    def execute_strategy(self):
-        # Demonstrate complex decision making
-        strategy_crew = Crew(
-            agents=[
-                Agent(role="Strategy Expert",
-                      goal="Develop optimal market strategy")
-            ],
-            tasks=[
-                Task(description="Create detailed strategy based on analysis",
-                     expected_output="Step-by-step action plan")
-            ]
-        )
-        return strategy_crew.kickoff()
-
-    @listen("medium_confidence", "low_confidence")
-    def request_additional_analysis(self):
-        self.state.recommendations.append("Gather more data")
-        return "Additional analysis required"
-```
-
-This example demonstrates how to:
-1. Use Python code for basic data operations
-2. Create and execute Crews as steps in your workflow
-3. Use Flow decorators to manage the sequence of operations
-4. Implement conditional branching based on Crew results
-
 ## Connecting Your Crew to a Model

 CrewAI supports using various LLMs through a variety of connection options. By default your agents will use the OpenAI API when querying the model. However, there are several other ways to allow your agents to connect to models. For example, you can configure your agents to use a local model via the Ollama tool.
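For reference, here is a minimal sketch of the local-model setup mentioned in that last paragraph, assuming an Ollama server on its default port and a `llama3.1` model pulled locally (both the model name and the URL are illustrative, not taken from the diff):

```python
from crewai import Agent, LLM

# Assumed local Ollama endpoint and model tag; adjust to whatever you actually serve.
local_llm = LLM(
    model="ollama/llama3.1",            # provider-prefixed model string
    base_url="http://localhost:11434",  # default Ollama port
)

researcher = Agent(
    role="Researcher",
    goal="Summarize recent findings on a topic",
    backstory="A careful, concise analyst.",
    llm=local_llm,  # the agent now queries the local model instead of the OpenAI API
)
```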
@@ -457,13 +313,9 @@ Please refer to the [Connect CrewAI to LLMs](https://docs.crewai.com/how-to/LLM-

 ## How CrewAI Compares

-**CrewAI's Advantage**: CrewAI combines autonomous agent intelligence with precise workflow control through its unique Crews and Flows architecture. The framework excels at both high-level orchestration and low-level customization, enabling complex, production-grade systems with granular control.
+**CrewAI's Advantage**: CrewAI is built with production in mind. It offers the flexibility of Autogen's conversational agents and the structured process approach of ChatDev, but without the rigidity. CrewAI's processes are designed to be dynamic and adaptable, fitting seamlessly into both development and production workflows.

-- **LangGraph**: While LangGraph provides a foundation for building agent workflows, its approach requires significant boilerplate code and complex state management patterns. The framework's tight coupling with LangChain can limit flexibility when implementing custom agent behaviors or integrating with external systems.
+- **Autogen**: While Autogen does good in creating conversational agents capable of working together, it lacks an inherent concept of process. In Autogen, orchestrating agents' interactions requires additional programming, which can become complex and cumbersome as the scale of tasks grows.

-*P.S. CrewAI demonstrates significant performance advantages over LangGraph, executing 5.76x faster in certain cases like this QA task example ([see comparison](https://github.com/crewAIInc/crewAI-examples/tree/main/Notebooks/CrewAI%20Flows%20%26%20Langgraph/QA%20Agent)) while achieving higher evaluation scores with faster completion times in certain coding tasks, like in this example ([detailed analysis](https://github.com/crewAIInc/crewAI-examples/blob/main/Notebooks/CrewAI%20Flows%20%26%20Langgraph/Coding%20Assistant/coding_assistant_eval.ipynb)).*
-
-- **Autogen**: While Autogen excels at creating conversational agents capable of working together, it lacks an inherent concept of process. In Autogen, orchestrating agents' interactions requires additional programming, which can become complex and cumbersome as the scale of tasks grows.
-
 - **ChatDev**: ChatDev introduced the idea of processes into the realm of AI agents, but its implementation is quite rigid. Customizations in ChatDev are limited and not geared towards production environments, which can hinder scalability and flexibility in real-world applications.

@@ -588,8 +440,5 @@ A: CrewAI uses anonymous telemetry to collect usage data for improvement purpose
 ### Q: Where can I find examples of CrewAI in action?
 A: You can find various real-life examples in the [CrewAI-examples repository](https://github.com/crewAIInc/crewAI-examples), including trip planners, stock analysis tools, and more.

-### Q: What is the difference between Crews and Flows?
-A: Crews and Flows serve different but complementary purposes in CrewAI. Crews are teams of AI agents working together to accomplish specific tasks through role-based collaboration, delivering accurate and predictable results. Flows, on the other hand, are event-driven workflows that can orchestrate both Crews and regular Python code, allowing you to build complex automation pipelines with secure state management and conditional execution paths.
-
 ### Q: How can I contribute to CrewAI?
 A: Contributions are welcome! You can fork the repository, create a new branch for your feature, add your improvement, and send a pull request. Check the Contribution section in the README for more details.
@@ -32,6 +32,7 @@ A crew in crewAI represents a collaborative group of agents working together to
 | **Share Crew** _(optional)_ | `share_crew` | Whether you want to share the complete crew information and execution with the crewAI team to make the library better, and allow us to train models. |
 | **Output Log File** _(optional)_ | `output_log_file` | Whether you want to have a file with the complete crew output and execution. You can set it using True and it will default to the folder you are currently in and it will be called logs.txt or passing a string with the full path and name of the file. |
 | **Manager Agent** _(optional)_ | `manager_agent` | `manager` sets a custom agent that will be used as a manager. |
+| **Manager Callbacks** _(optional)_ | `manager_callbacks` | `manager_callbacks` takes a list of callback handlers to be executed by the manager agent when a hierarchical process is used. |
 | **Prompt File** _(optional)_ | `prompt_file` | Path to the prompt JSON file to be used for the crew. |
 | **Planning** *(optional)* | `planning` | Adds planning ability to the Crew. When activated before each Crew iteration, all Crew data is sent to an AgentPlanner that will plan the tasks and this plan will be added to each task description. |
 | **Planning LLM** *(optional)* | `planning_llm` | The language model used by the AgentPlanner in a planning process. |
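As a quick illustration of the attributes in that table, here is a sketch of a crew that enables output logging and planning; the agent and task are placeholders, and the attribute behavior is described as in the table rather than verified against a specific release:

```python
from crewai import Agent, Crew, Process, Task

writer = Agent(role="Writer", goal="Draft a short note", backstory="A concise technical writer.")
note = Task(description="Write a two-sentence status note.", expected_output="Two sentences.", agent=writer)

crew = Crew(
    agents=[writer],
    tasks=[note],
    process=Process.sequential,
    output_log_file=True,  # True writes logs.txt in the current folder; a path string also works
    planning=True,         # run an AgentPlanner pass before each crew iteration
)
result = crew.kickoff()
```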
@@ -79,55 +79,6 @@ crew = Crew(
|
|||||||
result = crew.kickoff(inputs={"question": "What city does John live in and how old is he?"})
|
result = crew.kickoff(inputs={"question": "What city does John live in and how old is he?"})
|
||||||
```
|
```
|
||||||
|
|
||||||
|
|
||||||
Here's another example with the `CrewDoclingSource`
|
|
||||||
```python Code
|
|
||||||
from crewai import LLM, Agent, Crew, Process, Task
|
|
||||||
from crewai.knowledge.source.crew_docling_source import CrewDoclingSource
|
|
||||||
|
|
||||||
# Create a knowledge source
|
|
||||||
content_source = CrewDoclingSource(
|
|
||||||
file_paths=[
|
|
||||||
"https://lilianweng.github.io/posts/2024-11-28-reward-hacking",
|
|
||||||
"https://lilianweng.github.io/posts/2024-07-07-hallucination",
|
|
||||||
],
|
|
||||||
)
|
|
||||||
|
|
||||||
# Create an LLM with a temperature of 0 to ensure deterministic outputs
|
|
||||||
llm = LLM(model="gpt-4o-mini", temperature=0)
|
|
||||||
|
|
||||||
# Create an agent with the knowledge store
|
|
||||||
agent = Agent(
|
|
||||||
role="About papers",
|
|
||||||
goal="You know everything about the papers.",
|
|
||||||
backstory="""You are a master at understanding papers and their content.""",
|
|
||||||
verbose=True,
|
|
||||||
allow_delegation=False,
|
|
||||||
llm=llm,
|
|
||||||
)
|
|
||||||
task = Task(
|
|
||||||
description="Answer the following questions about the papers: {question}",
|
|
||||||
expected_output="An answer to the question.",
|
|
||||||
agent=agent,
|
|
||||||
)
|
|
||||||
|
|
||||||
crew = Crew(
|
|
||||||
agents=[agent],
|
|
||||||
tasks=[task],
|
|
||||||
verbose=True,
|
|
||||||
process=Process.sequential,
|
|
||||||
knowledge_sources=[
|
|
||||||
content_source
|
|
||||||
], # Enable knowledge by adding the sources here. You can also add more sources to the sources list.
|
|
||||||
)
|
|
||||||
|
|
||||||
result = crew.kickoff(
|
|
||||||
inputs={
|
|
||||||
"question": "What is the reward hacking paper about? Be sure to provide sources."
|
|
||||||
}
|
|
||||||
)
|
|
||||||
```
|
|
||||||
|
|
||||||
## Knowledge Configuration
|
## Knowledge Configuration
|
||||||
|
|
||||||
### Chunking Configuration
|
### Chunking Configuration
|
||||||
@@ -171,58 +122,6 @@ crewai reset-memories --knowledge

 This is useful when you've updated your knowledge sources and want to ensure that the agents are using the most recent information.

-## Agent-Specific Knowledge
-
-While knowledge can be provided at the crew level using `crew.knowledge_sources`, individual agents can also have their own knowledge sources using the `knowledge_sources` parameter:
-
-```python Code
-from crewai import Agent, Task, Crew
-from crewai.knowledge.source.string_knowledge_source import StringKnowledgeSource
-
-# Create agent-specific knowledge about a product
-product_specs = StringKnowledgeSource(
-    content="""The XPS 13 laptop features:
-    - 13.4-inch 4K display
-    - Intel Core i7 processor
-    - 16GB RAM
-    - 512GB SSD storage
-    - 12-hour battery life""",
-    metadata={"category": "product_specs"}
-)
-
-# Create a support agent with product knowledge
-support_agent = Agent(
-    role="Technical Support Specialist",
-    goal="Provide accurate product information and support.",
-    backstory="You are an expert on our laptop products and specifications.",
-    knowledge_sources=[product_specs]  # Agent-specific knowledge
-)
-
-# Create a task that requires product knowledge
-support_task = Task(
-    description="Answer this customer question: {question}",
-    agent=support_agent
-)
-
-# Create and run the crew
-crew = Crew(
-    agents=[support_agent],
-    tasks=[support_task]
-)
-
-# Get answer about the laptop's specifications
-result = crew.kickoff(
-    inputs={"question": "What is the storage capacity of the XPS 13?"}
-)
-```
-
-<Info>
-Benefits of agent-specific knowledge:
-- Give agents specialized information for their roles
-- Maintain separation of concerns between agents
-- Combine with crew-level knowledge for layered information access
-</Info>
-
 ## Custom Knowledge Sources

 CrewAI allows you to create custom knowledge sources for any type of data by extending the `BaseKnowledgeSource` class. Let's create a practical example that fetches and processes space news articles.
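As a rough sketch of that pattern, a custom source subclasses `BaseKnowledgeSource` and turns fetched text into knowledge chunks. The helper names used here (`_chunk_text`, `_save_documents`, the `chunks` list) follow the docs' space-news example and are assumptions that may differ between versions:

```python
from typing import Dict

import requests
from pydantic import Field

from crewai.knowledge.source.base_knowledge_source import BaseKnowledgeSource


class WebPageKnowledgeSource(BaseKnowledgeSource):
    """Fetches a web page and stores its text as knowledge chunks (illustrative only)."""

    url: str = Field(description="Page to fetch")

    def load_content(self) -> Dict[str, str]:
        # Fetch the raw text; real code would parse HTML and handle per-page errors.
        response = requests.get(self.url, timeout=30)
        response.raise_for_status()
        return {self.url: response.text}

    def add(self) -> None:
        # Chunk the fetched text and persist it via the base-class helpers (assumed API).
        for _, text in self.load_content().items():
            self.chunks.extend(self._chunk_text(text))
        self._save_documents()
```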
@@ -29,7 +29,7 @@ Large Language Models (LLMs) are the core intelligence behind CrewAI agents. The

 ## Available Models and Their Capabilities

-Here's a detailed breakdown of supported models and their capabilities, you can compare performance at [lmarena.ai](https://lmarena.ai/?leaderboard) and [artificialanalysis.ai](https://artificialanalysis.ai/):
+Here's a detailed breakdown of supported models and their capabilities:

 <Tabs>
 <Tab title="OpenAI">
@@ -43,104 +43,13 @@ Here's a detailed breakdown of supported models and their capabilities, you can
 1 token ≈ 4 characters in English. For example, 8,192 tokens ≈ 32,768 characters or about 6,000 words.
 </Note>
 </Tab>
-<Tab title="Nvidia NIM">
-| Model | Context Window | Best For |
-|-------|---------------|-----------|
-| nvidia/mistral-nemo-minitron-8b-8k-instruct | 8,192 tokens | State-of-the-art small language model delivering superior accuracy for chatbot, virtual assistants, and content generation. |
-| nvidia/nemotron-4-mini-hindi-4b-instruct| 4,096 tokens | A bilingual Hindi-English SLM for on-device inference, tailored specifically for Hindi Language. |
-| "nvidia/llama-3.1-nemotron-70b-instruct | 128k tokens | Llama-3.1-Nemotron-70B-Instruct is a large language model customized by NVIDIA in order to improve the helpfulness of LLM generated responses. |
-| nvidia/llama3-chatqa-1.5-8b | 128k tokens | Advanced LLM to generate high-quality, context-aware responses for chatbots and search engines. |
-| nvidia/llama3-chatqa-1.5-70b | 128k tokens | Advanced LLM to generate high-quality, context-aware responses for chatbots and search engines. |
-| nvidia/vila | 128k tokens | Multi-modal vision-language model that understands text/img/video and creates informative responses |
-| nvidia/neva-22| 4,096 tokens | Multi-modal vision-language model that understands text/images and generates informative responses |
-| nvidia/nemotron-mini-4b-instruct | 8,192 tokens | General-purpose tasks |
-| nvidia/usdcode-llama3-70b-instruct | 128k tokens | State-of-the-art LLM that answers OpenUSD knowledge queries and generates USD-Python code. |
-| nvidia/nemotron-4-340b-instruct | 4,096 tokens | Creates diverse synthetic data that mimics the characteristics of real-world data. |
-| meta/codellama-70b | 100k tokens | LLM capable of generating code from natural language and vice versa. |
-| meta/llama2-70b | 4,096 tokens | Cutting-edge large language AI model capable of generating text and code in response to prompts. |
-| meta/llama3-8b-instruct | 8,192 tokens | Advanced state-of-the-art LLM with language understanding, superior reasoning, and text generation. |
-| meta/llama3-70b-instruct | 8,192 tokens | Powers complex conversations with superior contextual understanding, reasoning and text generation. |
-| meta/llama-3.1-8b-instruct | 128k tokens | Advanced state-of-the-art model with language understanding, superior reasoning, and text generation. |
-| meta/llama-3.1-70b-instruct | 128k tokens | Powers complex conversations with superior contextual understanding, reasoning and text generation. |
-| meta/llama-3.1-405b-instruct | 128k tokens | Advanced LLM for synthetic data generation, distillation, and inference for chatbots, coding, and domain-specific tasks. |
-| meta/llama-3.2-1b-instruct | 128k tokens | Advanced state-of-the-art small language model with language understanding, superior reasoning, and text generation. |
-| meta/llama-3.2-3b-instruct | 128k tokens | Advanced state-of-the-art small language model with language understanding, superior reasoning, and text generation. |
-| meta/llama-3.2-11b-vision-instruct | 128k tokens | Advanced state-of-the-art small language model with language understanding, superior reasoning, and text generation. |
-| meta/llama-3.2-90b-vision-instruct | 128k tokens | Advanced state-of-the-art small language model with language understanding, superior reasoning, and text generation. |
-| meta/llama-3.1-70b-instruct | 128k tokens | Powers complex conversations with superior contextual understanding, reasoning and text generation. |
-| google/gemma-7b | 8,192 tokens | Cutting-edge text generation model text understanding, transformation, and code generation. |
-| google/gemma-2b | 8,192 tokens | Cutting-edge text generation model text understanding, transformation, and code generation. |
-| google/codegemma-7b | 8,192 tokens | Cutting-edge model built on Google's Gemma-7B specialized for code generation and code completion. |
-| google/codegemma-1.1-7b | 8,192 tokens | Advanced programming model for code generation, completion, reasoning, and instruction following. |
-| google/recurrentgemma-2b | 8,192 tokens | Novel recurrent architecture based language model for faster inference when generating long sequences. |
-| google/gemma-2-9b-it | 8,192 tokens | Cutting-edge text generation model text understanding, transformation, and code generation. |
-| google/gemma-2-27b-it | 8,192 tokens | Cutting-edge text generation model text understanding, transformation, and code generation. |
-| google/gemma-2-2b-it | 8,192 tokens | Cutting-edge text generation model text understanding, transformation, and code generation. |
-| google/deplot | 512 tokens | One-shot visual language understanding model that translates images of plots into tables. |
-| google/paligemma | 8,192 tokens | Vision language model adept at comprehending text and visual inputs to produce informative responses. |
-| mistralai/mistral-7b-instruct-v0.2 | 32k tokens | This LLM follows instructions, completes requests, and generates creative text. |
-| mistralai/mixtral-8x7b-instruct-v0.1 | 8,192 tokens | An MOE LLM that follows instructions, completes requests, and generates creative text. |
-| mistralai/mistral-large | 4,096 tokens | Creates diverse synthetic data that mimics the characteristics of real-world data. |
-| mistralai/mixtral-8x22b-instruct-v0.1 | 8,192 tokens | Creates diverse synthetic data that mimics the characteristics of real-world data. |
-| mistralai/mistral-7b-instruct-v0.3 | 32k tokens | This LLM follows instructions, completes requests, and generates creative text. |
-| nv-mistralai/mistral-nemo-12b-instruct | 128k tokens | Most advanced language model for reasoning, code, multilingual tasks; runs on a single GPU. |
-| mistralai/mamba-codestral-7b-v0.1 | 256k tokens | Model for writing and interacting with code across a wide range of programming languages and tasks. |
-| microsoft/phi-3-mini-128k-instruct | 128K tokens | Lightweight, state-of-the-art open LLM with strong math and logical reasoning skills. |
-| microsoft/phi-3-mini-4k-instruct | 4,096 tokens | Lightweight, state-of-the-art open LLM with strong math and logical reasoning skills. |
-| microsoft/phi-3-small-8k-instruct | 8,192 tokens | Lightweight, state-of-the-art open LLM with strong math and logical reasoning skills. |
-| microsoft/phi-3-small-128k-instruct | 128K tokens | Lightweight, state-of-the-art open LLM with strong math and logical reasoning skills. |
-| microsoft/phi-3-medium-4k-instruct | 4,096 tokens | Lightweight, state-of-the-art open LLM with strong math and logical reasoning skills. |
-| microsoft/phi-3-medium-128k-instruct | 128K tokens | Lightweight, state-of-the-art open LLM with strong math and logical reasoning skills. |
-| microsoft/phi-3.5-mini-instruct | 128K tokens | Lightweight multilingual LLM powering AI applications in latency bound, memory/compute constrained environments |
-| microsoft/phi-3.5-moe-instruct | 128K tokens | Advanced LLM based on Mixture of Experts architecure to deliver compute efficient content generation |
-| microsoft/kosmos-2 | 1,024 tokens | Groundbreaking multimodal model designed to understand and reason about visual elements in images. |
-| microsoft/phi-3-vision-128k-instruct | 128k tokens | Cutting-edge open multimodal model exceling in high-quality reasoning from images. |
-| microsoft/phi-3.5-vision-instruct | 128k tokens | Cutting-edge open multimodal model exceling in high-quality reasoning from images. |
-| databricks/dbrx-instruct | 12k tokens | A general-purpose LLM with state-of-the-art performance in language understanding, coding, and RAG. |
-| snowflake/arctic | 1,024 tokens | Delivers high efficiency inference for enterprise applications focused on SQL generation and coding. |
-| aisingapore/sea-lion-7b-instruct | 4,096 tokens | LLM to represent and serve the linguistic and cultural diversity of Southeast Asia |
-| ibm/granite-8b-code-instruct | 4,096 tokens | Software programming LLM for code generation, completion, explanation, and multi-turn conversion. |
-| ibm/granite-34b-code-instruct | 8,192 tokens | Software programming LLM for code generation, completion, explanation, and multi-turn conversion. |
-| ibm/granite-3.0-8b-instruct | 4,096 tokens | Advanced Small Language Model supporting RAG, summarization, classification, code, and agentic AI |
-| ibm/granite-3.0-3b-a800m-instruct | 4,096 tokens | Highly efficient Mixture of Experts model for RAG, summarization, entity extraction, and classification |
-| mediatek/breeze-7b-instruct | 4,096 tokens | Creates diverse synthetic data that mimics the characteristics of real-world data. |
-| upstage/solar-10.7b-instruct | 4,096 tokens | Excels in NLP tasks, particularly in instruction-following, reasoning, and mathematics. |
-| writer/palmyra-med-70b-32k | 32k tokens | Leading LLM for accurate, contextually relevant responses in the medical domain. |
-| writer/palmyra-med-70b | 32k tokens | Leading LLM for accurate, contextually relevant responses in the medical domain. |
-| writer/palmyra-fin-70b-32k | 32k tokens | Specialized LLM for financial analysis, reporting, and data processing |
-| 01-ai/yi-large | 32k tokens | Powerful model trained on English and Chinese for diverse tasks including chatbot and creative writing. |
-| deepseek-ai/deepseek-coder-6.7b-instruct | 2k tokens | Powerful coding model offering advanced capabilities in code generation, completion, and infilling |
-| rakuten/rakutenai-7b-instruct | 1,024 tokens | Advanced state-of-the-art LLM with language understanding, superior reasoning, and text generation. |
-| rakuten/rakutenai-7b-chat | 1,024 tokens | Advanced state-of-the-art LLM with language understanding, superior reasoning, and text generation. |
-| baichuan-inc/baichuan2-13b-chat | 4,096 tokens | Support Chinese and English chat, coding, math, instruction following, solving quizzes |
-
-<Note>
-NVIDIA's NIM support for models is expanding continuously! For the most up-to-date list of available models, please visit build.nvidia.com.
-</Note>
-</Tab>
-<Tab title="Gemini">
-| Model | Context Window | Best For |
-|-------|---------------|-----------|
-| gemini-2.0-flash-exp | 1M tokens | Higher quality at faster speed, multimodal model, good for most tasks |
-| gemini-1.5-flash | 1M tokens | Balanced multimodal model, good for most tasks |
-| gemini-1.5-flash-8B | 1M tokens | Fastest, most cost-efficient, good for high-frequency tasks |
-| gemini-1.5-pro | 2M tokens | Best performing, wide variety of reasoning tasks including logical reasoning, coding, and creative collaboration |
-
-<Tip>
-Google's Gemini models are all multimodal, supporting audio, images, video and text, supporting context caching, json schema, function calling, etc.
-
-These models are available via API_KEY from
-[The Gemini API](https://ai.google.dev/gemini-api/docs) and also from
-[Google Cloud Vertex](https://cloud.google.com/vertex-ai/generative-ai/docs/migrate/migrate-google-ai) as part of the
-[Model Garden](https://cloud.google.com/vertex-ai/generative-ai/docs/model-garden/explore-models).
-</Tip>
-</Tab>
 <Tab title="Groq">
 | Model | Context Window | Best For |
 |-------|---------------|-----------|
 | Llama 3.1 70B/8B | 131,072 tokens | High-performance, large context tasks |
 | Llama 3.2 Series | 8,192 tokens | General-purpose tasks |
 | Mixtral 8x7B | 32,768 tokens | Balanced performance and context |
+| Gemma Series | 8,192 tokens | Efficient, smaller-scale tasks |

 <Tip>
 Groq is known for its fast inference speeds, making it suitable for real-time applications.
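To put one of the Groq rows above to use, the provider-prefixed model string goes straight into CrewAI's `LLM` class; the exact model identifier here is an assumption for illustration, and `GROQ_API_KEY` must be available in the environment as shown in the provider accordions further down:

```python
import os

from crewai import LLM

# Assumes GROQ_API_KEY is exported in the environment (placeholder shown here).
os.environ.setdefault("GROQ_API_KEY", "<your-api-key>")

llm = LLM(
    model="groq/llama-3.1-70b-versatile",  # illustrative Groq model identifier
    temperature=0.7,
)
```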
@@ -151,7 +60,7 @@ Here's a detailed breakdown of supported models and their capabilities, you can
 |----------|---------------|--------------|
 | Deepseek Chat | 128,000 tokens | Specialized in technical discussions |
 | Claude 3 | Up to 200K tokens | Strong reasoning, code understanding |
-| Gemma Series | 8,192 tokens | Efficient, smaller-scale tasks |
+| Gemini | Varies by model | Multimodal capabilities |

 <Info>
 Provider selection should consider factors like:
@@ -219,10 +128,10 @@ There are three ways to configure LLMs in CrewAI. Choose the method that best fi
 # llm: anthropic/claude-2.1
 # llm: anthropic/claude-2.0

-# Google Models - Strong reasoning, large cachable context window, multimodal
+# Google Models - Good for general tasks
+# llm: gemini/gemini-pro
 # llm: gemini/gemini-1.5-pro-latest
-# llm: gemini/gemini-1.5-flash-latest
-# llm: gemini/gemini-1.5-flash-8b-latest
+# llm: gemini/gemini-1.0-pro-latest

 # AWS Bedrock Models - Enterprise-grade
 # llm: bedrock/anthropic.claude-3-sonnet-20240229-v1:0
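The same provider-prefixed strings shown in these YAML comments can also be used programmatically; a sketch, assuming the string is passed through CrewAI's `LLM` wrapper to an agent:

```python
from crewai import Agent, LLM

# Any of the commented identifiers above (e.g. "gemini/gemini-1.5-pro-latest") can be used here.
llm = LLM(model="gemini/gemini-1.5-pro-latest", temperature=0.7)

agent = Agent(
    role="Analyst",
    goal="Answer questions accurately",
    backstory="A methodical researcher.",
    llm=llm,
)
```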
@@ -441,18 +350,13 @@ Learn how to get the most out of your LLM configuration:

 <Accordion title="Google">
 ```python Code
-# Option 1. Gemini accessed with an API key.
-# https://ai.google.dev/gemini-api/docs/api-key
 GEMINI_API_KEY=<your-api-key>
-
-# Option 2. Vertex AI IAM credentials for Gemini, Anthropic, and anything in the Model Garden.
-# https://cloud.google.com/vertex-ai/generative-ai/docs/overview
 ```

 Example usage:
 ```python Code
 llm = LLM(
-    model="gemini/gemini-1.5-pro-latest",
+    model="gemini/gemini-pro",
     temperature=0.7
 )
 ```
@@ -508,20 +412,6 @@ Learn how to get the most out of your LLM configuration:
 ```
 </Accordion>

-<Accordion title="Nvidia NIM">
-```python Code
-NVIDIA_API_KEY=<your-api-key>
-```
-
-Example usage:
-```python Code
-llm = LLM(
-    model="nvidia_nim/meta/llama3-70b-instruct",
-    temperature=0.7
-)
-```
-</Accordion>
-
 <Accordion title="Groq">
 ```python Code
 GROQ_API_KEY=<your-api-key>
@@ -612,6 +502,20 @@ Learn how to get the most out of your LLM configuration:
 ```
 </Accordion>

+<Accordion title="Nvidia NIM">
+```python Code
+NVIDIA_API_KEY=<your-api-key>
+```
+
+Example usage:
+```python Code
+llm = LLM(
+    model="nvidia_nim/meta/llama3-70b-instruct",
+    temperature=0.7
+)
+```
+</Accordion>
+
 <Accordion title="SambaNova">
 ```python Code
 SAMBANOVA_API_KEY=<your-api-key>
@@ -263,148 +263,8 @@ analysis_task = Task(
 )
 ```

-## Task Guardrails
-
-Task guardrails provide a way to validate and transform task outputs before they
-are passed to the next task. This feature helps ensure data quality and provides
-feedback to agents when their output doesn't meet specific criteria.
-
-### Using Task Guardrails
-
-To add a guardrail to a task, provide a validation function through the `guardrail` parameter:
-
-```python Code
-from typing import Tuple, Union, Dict, Any
-
-def validate_blog_content(result: str) -> Tuple[bool, Union[Dict[str, Any], str]]:
-    """Validate blog content meets requirements."""
-    try:
-        # Check word count
-        word_count = len(result.split())
-        if word_count > 200:
-            return (False, {
-                "error": "Blog content exceeds 200 words",
-                "code": "WORD_COUNT_ERROR",
-                "context": {"word_count": word_count}
-            })
-
-        # Additional validation logic here
-        return (True, result.strip())
-    except Exception as e:
-        return (False, {
-            "error": "Unexpected error during validation",
-            "code": "SYSTEM_ERROR"
-        })
-
-blog_task = Task(
-    description="Write a blog post about AI",
-    expected_output="A blog post under 200 words",
-    agent=blog_agent,
-    guardrail=validate_blog_content  # Add the guardrail function
-)
-```
-
-### Guardrail Function Requirements
-
-1. **Function Signature**:
-   - Must accept exactly one parameter (the task output)
-   - Should return a tuple of `(bool, Any)`
-   - Type hints are recommended but optional
-
-2. **Return Values**:
-   - Success: Return `(True, validated_result)`
-   - Failure: Return `(False, error_details)`
-
-### Error Handling Best Practices
-
-1. **Structured Error Responses**:
-```python Code
-def validate_with_context(result: str) -> Tuple[bool, Union[Dict[str, Any], str]]:
-    try:
-        # Main validation logic
-        validated_data = perform_validation(result)
-        return (True, validated_data)
-    except ValidationError as e:
-        return (False, {
-            "error": str(e),
-            "code": "VALIDATION_ERROR",
-            "context": {"input": result}
-        })
-    except Exception as e:
-        return (False, {
-            "error": "Unexpected error",
-            "code": "SYSTEM_ERROR"
-        })
-```
-
-2. **Error Categories**:
-   - Use specific error codes
-   - Include relevant context
-   - Provide actionable feedback
-
-3. **Validation Chain**:
-```python Code
-from typing import Any, Dict, List, Tuple, Union
-
-def complex_validation(result: str) -> Tuple[bool, Union[str, Dict[str, Any]]]:
-    """Chain multiple validation steps."""
-    # Step 1: Basic validation
-    if not result:
-        return (False, {"error": "Empty result", "code": "EMPTY_INPUT"})
-
-    # Step 2: Content validation
-    try:
-        validated = validate_content(result)
-        if not validated:
-            return (False, {"error": "Invalid content", "code": "CONTENT_ERROR"})
-
-        # Step 3: Format validation
-        formatted = format_output(validated)
-        return (True, formatted)
-    except Exception as e:
-        return (False, {
-            "error": str(e),
-            "code": "VALIDATION_ERROR",
-            "context": {"step": "content_validation"}
-        })
-```
-
-### Handling Guardrail Results
-
-When a guardrail returns `(False, error)`:
-1. The error is sent back to the agent
-2. The agent attempts to fix the issue
-3. The process repeats until:
-   - The guardrail returns `(True, result)`
-   - Maximum retries are reached
-
-Example with retry handling:
-```python Code
-from typing import Optional, Tuple, Union
-
-def validate_json_output(result: str) -> Tuple[bool, Union[Dict[str, Any], str]]:
-    """Validate and parse JSON output."""
-    try:
-        # Try to parse as JSON
-        data = json.loads(result)
-        return (True, data)
-    except json.JSONDecodeError as e:
-        return (False, {
-            "error": "Invalid JSON format",
-            "code": "JSON_ERROR",
-            "context": {"line": e.lineno, "column": e.colno}
-        })
-
-task = Task(
-    description="Generate a JSON report",
-    expected_output="A valid JSON object",
-    agent=analyst,
-    guardrail=validate_json_output,
-    max_retries=3  # Limit retry attempts
-)
-```
-
 ## Getting Structured Consistent Outputs from Tasks
+When you need to ensure that a task outputs a structured and consistent format, you can use the `output_pydantic` or `output_json` properties on a task. These properties allow you to define the expected output structure, making it easier to parse and utilize the results in your application.

 <Note>
 It's also important to note that the output of the final task of a crew becomes the final output of the actual crew itself.
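A minimal sketch of the `output_pydantic` option described in that added paragraph (the model and field names are illustrative):

```python
from pydantic import BaseModel

from crewai import Agent, Task


class Report(BaseModel):
    title: str
    summary: str


analyst = Agent(role="Analyst", goal="Summarize findings", backstory="A detail-oriented researcher.")

report_task = Task(
    description="Summarize the findings as a short report.",
    expected_output="A title and a one-paragraph summary.",
    agent=analyst,
    output_pydantic=Report,  # the task result is parsed into this Pydantic model
)
```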
@@ -748,114 +608,6 @@ While creating and executing tasks, certain validation mechanisms are in place t
|
|||||||
|
|
||||||
These validations help in maintaining the consistency and reliability of task executions within the crewAI framework.
|
These validations help in maintaining the consistency and reliability of task executions within the crewAI framework.
|
||||||
|
|
||||||
## Task Guardrails
|
|
||||||
|
|
||||||
Task guardrails provide a powerful way to validate, transform, or filter task outputs before they are passed to the next task. Guardrails are optional functions that execute before the next task starts, allowing you to ensure that task outputs meet specific requirements or formats.
|
|
||||||
|
|
||||||
### Basic Usage
|
|
||||||
|
|
||||||
```python Code
|
|
||||||
from typing import Tuple, Union
|
|
||||||
from crewai import Task
|
|
||||||
|
|
||||||
def validate_json_output(result: str) -> Tuple[bool, Union[dict, str]]:
|
|
||||||
"""Validate that the output is valid JSON."""
|
|
||||||
try:
|
|
||||||
json_data = json.loads(result)
|
|
||||||
return (True, json_data)
|
|
||||||
except json.JSONDecodeError:
|
|
||||||
return (False, "Output must be valid JSON")
|
|
||||||
|
|
||||||
task = Task(
|
|
||||||
description="Generate JSON data",
|
|
||||||
expected_output="Valid JSON object",
|
|
||||||
guardrail=validate_json_output
|
|
||||||
)
|
|
||||||
```
|
|
||||||
|
|
||||||
### How Guardrails Work
|
|
||||||
|
|
||||||
1. **Optional Attribute**: Guardrails are an optional attribute at the task level, allowing you to add validation only where needed.
|
|
||||||
2. **Execution Timing**: The guardrail function is executed before the next task starts, ensuring valid data flow between tasks.
|
|
||||||
3. **Return Format**: Guardrails must return a tuple of `(success, data)`:
|
|
||||||
- If `success` is `True`, `data` is the validated/transformed result
|
|
||||||
- If `success` is `False`, `data` is the error message
|
|
||||||
4. **Result Routing**:
|
|
||||||
- On success (`True`), the result is automatically passed to the next task
|
|
||||||
- On failure (`False`), the error is sent back to the agent to generate a new answer
|
|
||||||
|
|
||||||
### Common Use Cases
|
|
||||||
|
|
||||||
#### Data Format Validation
|
|
||||||
```python Code
|
|
||||||
def validate_email_format(result: str) -> Tuple[bool, Union[str, str]]:
|
|
||||||
"""Ensure the output contains a valid email address."""
|
|
||||||
import re
|
|
||||||
email_pattern = r'^[\w\.-]+@[\w\.-]+\.\w+$'
|
|
||||||
if re.match(email_pattern, result.strip()):
|
|
||||||
return (True, result.strip())
|
|
||||||
return (False, "Output must be a valid email address")
|
|
||||||
```
|
|
||||||
|
|
||||||
#### Content Filtering
|
|
||||||
```python Code
|
|
||||||
def filter_sensitive_info(result: str) -> Tuple[bool, Union[str, str]]:
|
|
||||||
"""Remove or validate sensitive information."""
|
|
||||||
sensitive_patterns = ['SSN:', 'password:', 'secret:']
|
|
||||||
for pattern in sensitive_patterns:
|
|
||||||
if pattern.lower() in result.lower():
|
|
||||||
return (False, f"Output contains sensitive information ({pattern})")
|
|
||||||
return (True, result)
|
|
||||||
```
|
|
||||||
|
|
||||||
#### Data Transformation
|
|
||||||
```python Code
|
|
||||||
def normalize_phone_number(result: str) -> Tuple[bool, Union[str, str]]:
|
|
||||||
"""Ensure phone numbers are in a consistent format."""
|
|
||||||
import re
|
|
||||||
digits = re.sub(r'\D', '', result)
|
|
||||||
if len(digits) == 10:
|
|
||||||
formatted = f"({digits[:3]}) {digits[3:6]}-{digits[6:]}"
|
|
||||||
return (True, formatted)
|
|
||||||
return (False, "Output must be a 10-digit phone number")
|
|
||||||
```
|
|
||||||
|
|
||||||
### Advanced Features
|
|
||||||
|
|
||||||
#### Chaining Multiple Validations
|
|
||||||
```python Code
|
|
||||||
def chain_validations(*validators):
|
|
||||||
"""Chain multiple validators together."""
|
|
||||||
def combined_validator(result):
|
|
||||||
for validator in validators:
|
|
||||||
success, data = validator(result)
|
|
||||||
if not success:
|
|
||||||
return (False, data)
|
|
||||||
result = data
|
|
||||||
return (True, result)
|
|
||||||
return combined_validator
|
|
||||||
|
|
||||||
# Usage
|
|
||||||
task = Task(
|
|
||||||
description="Get user contact info",
|
|
||||||
expected_output="Email and phone",
|
|
||||||
guardrail=chain_validations(
|
|
||||||
validate_email_format,
|
|
||||||
filter_sensitive_info
|
|
||||||
)
|
|
||||||
)
|
|
||||||
```
|
|
||||||
|
|
||||||
#### Custom Retry Logic
|
|
||||||
```python Code
|
|
||||||
task = Task(
|
|
||||||
description="Generate data",
|
|
||||||
expected_output="Valid data",
|
|
||||||
guardrail=validate_data,
|
|
||||||
max_retries=5 # Override default retry limit
|
|
||||||
)
|
|
||||||
```
|
|
||||||
|
|
||||||
## Creating Directories when Saving Files
|
## Creating Directories when Saving Files
|
||||||
|
|
||||||
You can now specify if a task should create directories when saving its output to a file. This is particularly useful for organizing outputs and ensuring that file paths are correctly structured.
|
You can now specify if a task should create directories when saving its output to a file. This is particularly useful for organizing outputs and ensuring that file paths are correctly structured.
|
||||||
|
|||||||
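
As a hedged sketch of the directory-creation option described above: the `create_directory` flag name, the `output_file` path, and the `researcher` agent are illustrative assumptions and should be checked against the current `Task` API.

```python Code
from crewai import Task

report_task = Task(
    description="Summarize the research findings into a short report.",
    expected_output="A markdown report.",
    agent=researcher,  # assumes an Agent defined elsewhere
    output_file="outputs/reports/findings.md",  # nested path used for illustration
    create_directory=True,  # create missing parent directories before writing the file
)
```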
@@ -1,211 +0,0 @@
# Portkey Integration with CrewAI

<img src="https://raw.githubusercontent.com/siddharthsambharia-portkey/Portkey-Product-Images/main/Portkey-CrewAI.png" alt="Portkey CrewAI Header Image" width="70%" />

[Portkey](https://portkey.ai/?utm_source=crewai&utm_medium=crewai&utm_campaign=crewai) is a 2-line upgrade to make your CrewAI agents reliable, cost-efficient, and fast.

Portkey adds 4 core production capabilities to any CrewAI agent:
1. Routing to **200+ LLMs**
2. Making each LLM call more robust
3. Full-stack tracing & cost, performance analytics
4. Real-time guardrails to enforce behavior

## Getting Started

1. **Install Required Packages:**

```bash
pip install -qU crewai portkey-ai
```

2. **Configure the LLM Client:**

To build CrewAI Agents with Portkey, you'll need two keys:
- **Portkey API Key**: Sign up on the [Portkey app](https://app.portkey.ai/?utm_source=crewai&utm_medium=crewai&utm_campaign=crewai) and copy your API key
- **Virtual Key**: Virtual Keys manage your LLM provider API keys in one place; store them securely in Portkey's vault

```python
from crewai import LLM
from portkey_ai import createHeaders, PORTKEY_GATEWAY_URL

gpt_llm = LLM(
    model="gpt-4",
    base_url=PORTKEY_GATEWAY_URL,
    api_key="dummy",  # we are using a Virtual Key
    extra_headers=createHeaders(
        api_key="YOUR_PORTKEY_API_KEY",
        virtual_key="YOUR_VIRTUAL_KEY",  # enter your Virtual Key from Portkey
    )
)
```

3. **Create and Run Your First Agent:**

```python
from crewai import Agent, Task, Crew

# Define your agents with roles and goals
coder = Agent(
    role='Software developer',
    goal='Write clear, concise code on demand',
    backstory='An expert coder with a keen eye for software trends.',
    llm=gpt_llm
)

# Create tasks for your agents
task1 = Task(
    description="Define the HTML for making a simple website with heading- Hello World! Portkey is working!",
    expected_output="A clear and concise HTML code",
    agent=coder
)

# Instantiate your crew
crew = Crew(
    agents=[coder],
    tasks=[task1],
)

result = crew.kickoff()
print(result)
```

## Key Features

| Feature | Description |
|---------|-------------|
| 🌐 Multi-LLM Support | Access OpenAI, Anthropic, Gemini, Azure, and 250+ providers through a unified interface |
| 🛡️ Production Reliability | Implement retries, timeouts, load balancing, and fallbacks |
| 📊 Advanced Observability | Track 40+ metrics including costs, tokens, latency, and custom metadata |
| 🔍 Comprehensive Logging | Debug with detailed execution traces and function call logs |
| 🚧 Security Controls | Set budget limits and implement role-based access control |
| 🔄 Performance Analytics | Capture and analyze feedback for continuous improvement |
| 💾 Intelligent Caching | Reduce costs and latency with semantic or simple caching |

## Production Features with Portkey Configs

All of the features below are enabled through Portkey's Config system, which lets you define routing strategies using simple JSON objects in your LLM API calls. You can create and manage Configs directly in your code or through the Portkey Dashboard, and each Config has a unique ID for easy reference.

<Frame>
  <img src="https://raw.githubusercontent.com/Portkey-AI/docs-core/refs/heads/main/images/libraries/libraries-3.avif"/>
</Frame>

### 1. Use 250+ LLMs
Access various LLMs like Anthropic, Gemini, Mistral, Azure OpenAI, and more with minimal code changes. Switch between providers or use them together seamlessly. [Learn more about Universal API](https://portkey.ai/docs/product/ai-gateway/universal-api)

Easily switch between different LLM providers:

```python
# Anthropic Configuration
anthropic_llm = LLM(
    model="claude-3-5-sonnet-latest",
    base_url=PORTKEY_GATEWAY_URL,
    api_key="dummy",
    extra_headers=createHeaders(
        api_key="YOUR_PORTKEY_API_KEY",
        virtual_key="YOUR_ANTHROPIC_VIRTUAL_KEY",  # you don't need a provider when using Virtual Keys
        trace_id="anthropic_agent"
    )
)

# Azure OpenAI Configuration
azure_llm = LLM(
    model="gpt-4",
    base_url=PORTKEY_GATEWAY_URL,
    api_key="dummy",
    extra_headers=createHeaders(
        api_key="YOUR_PORTKEY_API_KEY",
        virtual_key="YOUR_AZURE_VIRTUAL_KEY",  # you don't need a provider when using Virtual Keys
        trace_id="azure_agent"
    )
)
```

### 2. Caching
Improve response times and reduce costs with two powerful caching modes:
- **Simple Cache**: Perfect for exact matches
- **Semantic Cache**: Matches responses for requests that are semantically similar

[Learn more about Caching](https://portkey.ai/docs/product/ai-gateway/cache-simple-and-semantic)

```py
config = {
    "cache": {
        "mode": "semantic",  # or "simple" for exact matching
    }
}
```

### 3. Production Reliability
Portkey provides comprehensive reliability features:
- **Automatic Retries**: Handle temporary failures gracefully
- **Request Timeouts**: Prevent hanging operations
- **Conditional Routing**: Route requests based on specific conditions
- **Fallbacks**: Set up automatic provider failovers
- **Load Balancing**: Distribute requests efficiently

[Learn more about Reliability Features](https://portkey.ai/docs/product/ai-gateway/)
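
As a hedged illustration of the Config system described above, a fallback-plus-retry Config might look like the snippet below; the virtual key names are placeholders and the exact schema should be confirmed in the Portkey Config documentation.

```py
config = {
    "retry": {"attempts": 3},          # retry transient failures
    "strategy": {"mode": "fallback"},  # fail over between providers in order
    "targets": [
        {"virtual_key": "YOUR_OPENAI_VIRTUAL_KEY"},
        {"virtual_key": "YOUR_ANTHROPIC_VIRTUAL_KEY"},
    ],
}

# The config can then be referenced when creating the Portkey headers, e.g.
# createHeaders(api_key="YOUR_PORTKEY_API_KEY", config=config)
```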
### 4. Metrics

Agent runs are complex. Portkey automatically logs **40+ comprehensive metrics** for your AI agents, including cost, tokens used, and latency. Whether you need a broad overview or granular insights into your agent runs, Portkey's customizable filters provide the metrics you need.

- Cost per agent interaction
- Response times and latency
- Token usage and efficiency
- Success/failure rates
- Cache hit rates

<img src="https://github.com/siddharthsambharia-portkey/Portkey-Product-Images/blob/main/Portkey-Dashboard.png?raw=true" width="70%" alt="Portkey Dashboard" />

### 5. Detailed Logging
Logs are essential for understanding agent behavior, diagnosing issues, and improving performance. They provide a detailed record of agent activities and tool use, which is crucial for debugging and optimizing processes.

Access a dedicated section to view records of agent executions, including parameters, outcomes, function calls, and errors. Filter logs based on multiple parameters such as trace ID, model, tokens used, and metadata.

<details>
<summary><b>Traces</b></summary>
<img src="https://raw.githubusercontent.com/siddharthsambharia-portkey/Portkey-Product-Images/main/Portkey-Traces.png" alt="Portkey Traces" width="70%" />
</details>

<details>
<summary><b>Logs</b></summary>
<img src="https://raw.githubusercontent.com/siddharthsambharia-portkey/Portkey-Product-Images/main/Portkey-Logs.png" alt="Portkey Logs" width="70%" />
</details>

### 6. Enterprise Security Features
- Set budget limits and rate limits per Virtual Key (disposable API keys)
- Implement role-based access control
- Track system changes with audit logs
- Configure data retention policies

For detailed information on creating and managing Configs, visit the [Portkey documentation](https://docs.portkey.ai/product/ai-gateway/configs).

## Resources

- [📘 Portkey Documentation](https://docs.portkey.ai)
- [📊 Portkey Dashboard](https://app.portkey.ai/?utm_source=crewai&utm_medium=crewai&utm_campaign=crewai)
- [🐦 Twitter](https://twitter.com/portkeyai)
- [💬 Discord Community](https://discord.gg/DD7vgKK299)
@@ -1,138 +0,0 @@
---
title: Using Multimodal Agents
description: Learn how to enable and use multimodal capabilities in your agents for processing images and other non-text content within the CrewAI framework.
icon: image
---

# Using Multimodal Agents

CrewAI supports multimodal agents that can process both text and non-text content like images. This guide will show you how to enable and use multimodal capabilities in your agents.

## Enabling Multimodal Capabilities

To create a multimodal agent, simply set the `multimodal` parameter to `True` when initializing your agent:

```python
from crewai import Agent

agent = Agent(
    role="Image Analyst",
    goal="Analyze and extract insights from images",
    backstory="An expert in visual content interpretation with years of experience in image analysis",
    multimodal=True  # This enables multimodal capabilities
)
```

When you set `multimodal=True`, the agent is automatically configured with the necessary tools for handling non-text content, including the `AddImageTool`.

## Working with Images

The multimodal agent comes pre-configured with the `AddImageTool`, which allows it to process images. You don't need to manually add this tool - it's automatically included when you enable multimodal capabilities.

Here's a complete example showing how to use a multimodal agent to analyze an image:

```python
from crewai import Agent, Task, Crew

# Create a multimodal agent
image_analyst = Agent(
    role="Product Analyst",
    goal="Analyze product images and provide detailed descriptions",
    backstory="Expert in visual product analysis with deep knowledge of design and features",
    multimodal=True
)

# Create a task for image analysis
task = Task(
    description="Analyze the product image at https://example.com/product.jpg and provide a detailed description",
    agent=image_analyst
)

# Create and run the crew
crew = Crew(
    agents=[image_analyst],
    tasks=[task]
)

result = crew.kickoff()
```

### Advanced Usage with Context

You can provide additional context or specific questions about the image when creating tasks for multimodal agents. The task description can include specific aspects you want the agent to focus on:

```python
from crewai import Agent, Task, Crew

# Create a multimodal agent for detailed analysis
expert_analyst = Agent(
    role="Visual Quality Inspector",
    goal="Perform detailed quality analysis of product images",
    backstory="Senior quality control expert with expertise in visual inspection",
    multimodal=True  # AddImageTool is automatically included
)

# Create a task with specific analysis requirements
inspection_task = Task(
    description="""
    Analyze the product image at https://example.com/product.jpg with focus on:
    1. Quality of materials
    2. Manufacturing defects
    3. Compliance with standards
    Provide a detailed report highlighting any issues found.
    """,
    agent=expert_analyst
)

# Create and run the crew
crew = Crew(
    agents=[expert_analyst],
    tasks=[inspection_task]
)

result = crew.kickoff()
```

### Tool Details

When working with multimodal agents, the `AddImageTool` is automatically configured with the following schema:

```python
class AddImageToolSchema:
    image_url: str  # Required: The URL or path of the image to process
    action: Optional[str] = None  # Optional: Additional context or specific questions about the image
```

The multimodal agent will automatically handle the image processing through its built-in tools, allowing it to:
- Access images via URLs or local file paths
- Process image content with optional context or specific questions
- Provide analysis and insights based on the visual information and task requirements

## Best Practices

When working with multimodal agents, keep these best practices in mind:

1. **Image Access**
   - Ensure your images are accessible via URLs that the agent can reach
   - For local images, consider hosting them temporarily or using absolute file paths
   - Verify that image URLs are valid and accessible before running tasks

2. **Task Description**
   - Be specific about what aspects of the image you want the agent to analyze
   - Include clear questions or requirements in the task description
   - Consider using the optional `action` parameter for focused analysis

3. **Resource Management**
   - Image processing may require more computational resources than text-only tasks
   - Some language models may require base64 encoding for image data
   - Consider batch processing for multiple images to optimize performance

4. **Environment Setup**
   - Verify that your environment has the necessary dependencies for image processing
   - Ensure your language model supports multimodal capabilities
   - Test with small images first to validate your setup

5. **Error Handling**
   - Implement proper error handling for image loading failures
   - Have fallback strategies for when image processing fails
   - Monitor and log image processing operations for debugging
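
Tying together the image-access and error-handling guidance above, here is a minimal, illustrative sketch of a guarded multimodal run; the URL check, logging choices, and agent wording are assumptions rather than part of the original guide.

```python
import logging

import requests
from crewai import Agent, Task, Crew

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("image_analysis")

image_url = "https://example.com/product.jpg"  # placeholder URL from the examples above

# Verify the image URL is reachable before kicking off the crew
try:
    response = requests.head(image_url, timeout=10)
    response.raise_for_status()
except requests.RequestException as exc:
    logger.error("Image is not accessible, skipping analysis: %s", exc)
else:
    analyst = Agent(
        role="Image Analyst",
        goal="Analyze and extract insights from images",
        backstory="An expert in visual content interpretation.",
        multimodal=True,
    )
    task = Task(
        description=f"Analyze the product image at {image_url} and provide a detailed description",
        expected_output="A detailed description of the image",
        agent=analyst,
    )
    try:
        result = Crew(agents=[analyst], tasks=[task]).kickoff()
        logger.info("Analysis finished: %s", result)
    except Exception:  # fallback path when image processing fails
        logger.exception("Image analysis failed")
```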
@@ -7,7 +7,7 @@ icon: wrench
 <Note>
 **Python Version Requirements**
 
-CrewAI requires `Python >=3.10 and <3.13`. Here's how to check your version:
+CrewAI requires `Python >=3.10 and <=3.12`. Here's how to check your version:
 ```bash
 python3 --version
 ```
@@ -3,44 +3,33 @@ name = "crewai"
 version = "0.86.0"
 description = "Cutting-edge framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks."
 readme = "README.md"
-requires-python = ">=3.10,<3.13"
+requires-python = ">=3.10,<=3.12"
 authors = [
     { name = "Joao Moura", email = "joao@crewai.com" }
 ]
 dependencies = [
-    # Core Dependencies
     "pydantic>=2.4.2",
     "openai>=1.13.3",
-    "litellm>=1.56.4",
-    "instructor>=1.3.3",

-    # Text Processing
-    "pdfplumber>=0.11.4",
-    "regex>=2024.9.11",

-    # Telemetry and Monitoring
     "opentelemetry-api>=1.22.0",
     "opentelemetry-sdk>=1.22.0",
     "opentelemetry-exporter-otlp-proto-http>=1.22.0",
-    # Data Handling
-    "chromadb>=0.5.23",
-    "openpyxl>=3.1.5",
-    "pyvis>=0.3.2",

-    # Authentication and Security
-    "auth0-python>=4.7.1",
-    "python-dotenv>=1.0.0",

-    # Configuration and Utils
+    "instructor>=1.3.3",
+    "regex>=2024.9.11",
+    "crewai-tools>=0.17.0",
     "click>=8.1.7",
+    "python-dotenv>=1.0.0",
     "appdirs>=1.4.4",
     "jsonref>=1.1.0",
     "json-repair>=0.25.2",
+    "auth0-python>=4.7.1",
+    "litellm>=1.44.22",
+    "pyvis>=0.3.2",
     "uv>=0.4.25",
     "tomli-w>=1.1.0",
     "tomli>=2.0.2",
-    "blinker>=1.9.0",
+    "chromadb>=0.5.18",
+    "pdfplumber>=0.11.4",
+    "openpyxl>=3.1.5",
 ]
 
 [project.urls]
@@ -49,10 +38,7 @@ Documentation = "https://docs.crewai.com"
 Repository = "https://github.com/crewAIInc/crewAI"
 
 [project.optional-dependencies]
-tools = ["crewai-tools>=0.17.0"]
-embeddings = [
-    "tiktoken~=0.7.0"
-]
+tools = ["crewai-tools>=0.14.0"]
 agentops = ["agentops>=0.3.0"]
 fastembed = ["fastembed>=0.4.1"]
 pdfplumber = [
@@ -65,13 +51,10 @@ openpyxl = [
     "openpyxl>=3.1.5",
 ]
 mem0 = ["mem0ai>=0.1.29"]
-docling = [
-    "docling>=2.12.0",
-]
 
 [tool.uv]
 dev-dependencies = [
-    "ruff>=0.8.2",
+    "ruff>=0.4.10",
     "mypy>=1.10.0",
     "pre-commit>=3.6.0",
     "mkdocs>=1.4.3",
@@ -81,6 +64,7 @@ dev-dependencies = [
     "mkdocs-material-extensions>=1.3.1",
     "pillow>=10.2.0",
     "cairosvg>=2.7.1",
+    "crewai-tools>=0.14.0",
     "pytest>=8.0.0",
     "pytest-vcr>=1.0.2",
     "python-dotenv>=1.0.0",
@@ -17,26 +17,33 @@ from crewai.memory.contextual.contextual_memory import ContextualMemory
 from crewai.task import Task
 from crewai.tools import BaseTool
 from crewai.tools.agent_tools.agent_tools import AgentTools
-from crewai.tools.base_tool import Tool
 from crewai.utilities import Converter, Prompts
 from crewai.utilities.constants import TRAINED_AGENTS_DATA_FILE, TRAINING_DATA_FILE
 from crewai.utilities.converter import generate_model_description
 from crewai.utilities.token_counter_callback import TokenCalcHandler
 from crewai.utilities.training_handler import CrewTrainingHandler
 
-agentops = None
-
-try:
-    import agentops  # type: ignore # Name "agentops" is already defined
-    from agentops import track_agent  # type: ignore
-except ImportError:
-
-    def track_agent():
+def mock_agent_ops_provider():
+    def track_agent(*args, **kwargs):
         def noop(f):
             return f
 
         return noop
 
+    return track_agent
+
+
+agentops = None
+
+if os.environ.get("AGENTOPS_API_KEY"):
+    try:
+        from agentops import track_agent
+    except ImportError:
+        track_agent = mock_agent_ops_provider()
+else:
+    track_agent = mock_agent_ops_provider()
 
 
 @track_agent()
 class Agent(BaseAgent):
@@ -115,10 +122,6 @@ class Agent(BaseAgent):
         default=2,
         description="Maximum number of retries for an agent to execute a task when an error occurs.",
     )
-    multimodal: bool = Field(
-        default=False,
-        description="Whether the agent is multimodal.",
-    )
     code_execution_mode: Literal["safe", "unsafe"] = Field(
         default="safe",
         description="Mode for code execution: 'safe' (using Docker) or 'unsafe' (direct execution).",
@@ -411,10 +414,6 @@ class Agent(BaseAgent):
         tools = agent_tools.tools()
         return tools
 
-    def get_multimodal_tools(self) -> List[Tool]:
-        from crewai.tools.agent_tools.add_image_tool import AddImageTool
-        return [AddImageTool()]
-
     def get_code_execution_tools(self):
         try:
             from crewai_tools import CodeInterpreterTool
@@ -143,20 +143,10 @@ class CrewAgentExecutor(CrewAgentExecutorMixin):
                 tool_result = self._execute_tool_and_check_finality(
                     formatted_answer
                 )
 
-                # Directly append the result to the messages if the
-                # tool is "Add image to content" in case of multimodal
-                # agents
-                if formatted_answer.tool == self._i18n.tools("add_image")["name"]:
-                    self.messages.append(tool_result.result)
-                    continue
-
-                else:
                 if self.step_callback:
                     self.step_callback(tool_result)
 
                 formatted_answer.text += f"\nObservation: {tool_result.result}"
 
                 formatted_answer.result = tool_result.result
                 if tool_result.result_as_answer:
                     return AgentFinish(
@@ -423,6 +413,7 @@ class CrewAgentExecutor(CrewAgentExecutorMixin):
         """
         while self.ask_for_human_input:
             human_feedback = self._ask_human_input(formatted_answer.output)
+            print("Human feedback: ", human_feedback)
 
             if self.crew and self.crew._train:
                 self._handle_crew_training_output(formatted_answer, human_feedback)
@@ -1,6 +1,5 @@
 import re
 from typing import Any, Union
 
 from json_repair import repair_json
 
 from crewai.utilities import I18N
@@ -5,10 +5,9 @@ from typing import Any, Dict
 import requests
 from rich.console import Console
 
-from crewai.cli.tools.main import ToolCommand
-
 from .constants import AUTH0_AUDIENCE, AUTH0_CLIENT_ID, AUTH0_DOMAIN
 from .utils import TokenManager, validate_token
+from crewai.cli.tools.main import ToolCommand
 
 console = Console()
 
@@ -80,9 +79,7 @@ class AuthenticationCommand:
                 style="yellow",
             )
 
-            console.print(
-                "\n[bold green]Welcome to CrewAI Enterprise![/bold green]\n"
-            )
+            console.print("\n[bold green]Welcome to CrewAI Enterprise![/bold green]\n")
             return
 
         if token_data["error"] not in ("authorization_pending", "slow_down"):
@@ -1,9 +1,10 @@
 from .utils import TokenManager
 
 
 def get_auth_token() -> str:
     """Get the authentication token."""
     access_token = TokenManager().get_token()
     if not access_token:
         raise Exception()
     return access_token
@@ -1,7 +1,7 @@
-from importlib.metadata import version as get_version
 from typing import Optional
 
 import click
+import pkg_resources
 
 from crewai.cli.add_crew_to_flow import add_crew_to_flow
 from crewai.cli.create_crew import create_crew
@@ -25,7 +25,7 @@ from .update_crew import update_crew
 
 
 @click.group()
-@click.version_option(get_version("crewai"))
+@click.version_option(pkg_resources.get_distribution("crewai").version)
 def crewai():
     """Top-level command group for crewai."""
 
@@ -52,16 +52,16 @@ def create(type, name, provider, skip_provider=False):
 def version(tools):
     """Show the installed version of crewai."""
     try:
-        crewai_version = get_version("crewai")
+        crewai_version = pkg_resources.get_distribution("crewai").version
     except Exception:
         crewai_version = "unknown version"
     click.echo(f"crewai version: {crewai_version}")
 
     if tools:
         try:
-            tools_version = get_version("crewai")
+            tools_version = pkg_resources.get_distribution("crewai-tools").version
             click.echo(f"crewai tools version: {tools_version}")
-        except Exception:
+        except pkg_resources.DistributionNotFound:
             click.echo("crewai tools not installed")
@@ -1,9 +1,8 @@
 import requests
 from requests.exceptions import JSONDecodeError
 from rich.console import Console
 
-from crewai.cli.authentication.token import get_auth_token
 from crewai.cli.plus_api import PlusAPI
+from crewai.cli.authentication.token import get_auth_token
 from crewai.telemetry.telemetry import Telemetry
 
 console = Console()
@@ -1,19 +1,13 @@
 import json
 from pathlib import Path
-from typing import Optional
 
 from pydantic import BaseModel, Field
+from typing import Optional
 
 DEFAULT_CONFIG_PATH = Path.home() / ".config" / "crewai" / "settings.json"
 
 
 class Settings(BaseModel):
-    tool_repository_username: Optional[str] = Field(
-        None, description="Username for interacting with the Tool Repository"
-    )
-    tool_repository_password: Optional[str] = Field(
-        None, description="Password for interacting with the Tool Repository"
-    )
+    tool_repository_username: Optional[str] = Field(None, description="Username for interacting with the Tool Repository")
+    tool_repository_password: Optional[str] = Field(None, description="Password for interacting with the Tool Repository")
     config_path: Path = Field(default=DEFAULT_CONFIG_PATH, exclude=True)
 
     def __init__(self, config_path: Path = DEFAULT_CONFIG_PATH, **data):
@@ -1,10 +1,8 @@
-from os import getenv
 from typing import Optional
-from urllib.parse import urljoin
 
 import requests
+from os import getenv
 from crewai.cli.version import get_crewai_version
+from urllib.parse import urljoin
 
 
 class PlusAPI:
@@ -1,12 +1,11 @@
 import subprocess
 
 import click
 
-from crewai.knowledge.storage.knowledge_storage import KnowledgeStorage
 from crewai.memory.entity.entity_memory import EntityMemory
 from crewai.memory.long_term.long_term_memory import LongTermMemory
 from crewai.memory.short_term.short_term_memory import ShortTermMemory
 from crewai.utilities.task_output_storage_handler import TaskOutputStorageHandler
+from crewai.knowledge.storage.knowledge_storage import KnowledgeStorage
 
 
 def reset_memories_command(
@@ -4,7 +4,7 @@ Welcome to the {{crew_name}} Crew project, powered by [crewAI](https://crewai.co
 
 ## Installation
 
-Ensure you have Python >=3.10 <3.13 installed on your system. This project uses [UV](https://docs.astral.sh/uv/) for dependency management and package handling, offering a seamless setup and execution experience.
+Ensure you have Python >=3.10 <=3.12 installed on your system. This project uses [UV](https://docs.astral.sh/uv/) for dependency management and package handling, offering a seamless setup and execution experience.
 
 First, if you haven't already, install uv:
 
@@ -3,7 +3,7 @@ name = "{{folder_name}}"
 version = "0.1.0"
 description = "{{name}} using crewAI"
 authors = [{ name = "Your Name", email = "you@example.com" }]
-requires-python = ">=3.10,<3.13"
+requires-python = ">=3.10,<=3.12"
 dependencies = [
     "crewai[tools]>=0.86.0,<1.0.0"
 ]
@@ -18,6 +18,3 @@ test = "{{folder_name}}.main:test"
 [build-system]
 requires = ["hatchling"]
 build-backend = "hatchling.build"
-
-[tool.crewai]
-type = "crew"
@@ -10,7 +10,7 @@ class MyCustomToolInput(BaseModel):
 class MyCustomTool(BaseTool):
     name: str = "Name of my tool"
     description: str = (
-        "Clear description for what this tool is useful for, your agent will need this information to use it."
+        "Clear description for what this tool is useful for, you agent will need this information to use it."
     )
     args_schema: Type[BaseModel] = MyCustomToolInput
 
@@ -4,7 +4,7 @@ Welcome to the {{crew_name}} Crew project, powered by [crewAI](https://crewai.co
 
 ## Installation
 
-Ensure you have Python >=3.10 <3.13 installed on your system. This project uses [UV](https://docs.astral.sh/uv/) for dependency management and package handling, offering a seamless setup and execution experience.
+Ensure you have Python >=3.10 <=3.12 installed on your system. This project uses [UV](https://docs.astral.sh/uv/) for dependency management and package handling, offering a seamless setup and execution experience.
 
 First, if you haven't already, install uv:
 
@@ -5,7 +5,7 @@ from pydantic import BaseModel
 
 from crewai.flow.flow import Flow, listen, start
 
-from {{folder_name}}.crews.poem_crew.poem_crew import PoemCrew
+from .crews.poem_crew.poem_crew import PoemCrew
 
 
 class PoemState(BaseModel):
@@ -3,7 +3,7 @@ name = "{{folder_name}}"
 version = "0.1.0"
 description = "{{name}} using crewAI"
 authors = [{ name = "Your Name", email = "you@example.com" }]
-requires-python = ">=3.10,<3.13"
+requires-python = ">=3.10,<=3.12"
 dependencies = [
     "crewai[tools]>=0.86.0,<1.0.0",
 ]
@@ -15,6 +15,3 @@ plot = "{{folder_name}}.main:plot"
 [build-system]
 requires = ["hatchling"]
 build-backend = "hatchling.build"
-
-[tool.crewai]
-type = "flow"
@@ -13,7 +13,7 @@ class MyCustomToolInput(BaseModel):
 class MyCustomTool(BaseTool):
     name: str = "Name of my tool"
     description: str = (
-        "Clear description for what this tool is useful for, your agent will need this information to use it."
+        "Clear description for what this tool is useful for, you agent will need this information to use it."
    )
     args_schema: Type[BaseModel] = MyCustomToolInput
 
@@ -5,7 +5,7 @@ custom tools to power up your crews.
 
 ## Installing
 
-Ensure you have Python >=3.10 <3.13 installed on your system. This project
+Ensure you have Python >=3.10 <=3.12 installed on your system. This project
 uses [UV](https://docs.astral.sh/uv/) for dependency management and package
 handling, offering a seamless setup and execution experience.
 
@@ -3,10 +3,8 @@ name = "{{folder_name}}"
 version = "0.1.0"
 description = "Power up your crews with {{folder_name}}"
 readme = "README.md"
-requires-python = ">=3.10,<3.13"
+requires-python = ">=3.10,<=3.12"
 dependencies = [
     "crewai[tools]>=0.86.0"
 ]
 
-[tool.crewai]
-type = "tool"
@@ -117,7 +117,7 @@ class ToolCommand(BaseCommand, PlusAPIMixin):
 
         published_handle = publish_response.json()["handle"]
         console.print(
-            f"Successfully published {published_handle} ({project_version}).\nInstall it in other projects with crewai tool install {published_handle}",
+            f"Succesfully published {published_handle} ({project_version}).\nInstall it in other projects with crewai tool install {published_handle}",
             style="bold green",
         )
 
@@ -138,7 +138,7 @@ class ToolCommand(BaseCommand, PlusAPIMixin):
 
         self._add_package(get_response.json())
 
-        console.print(f"Successfully installed {handle}", style="bold green")
+        console.print(f"Succesfully installed {handle}", style="bold green")
 
     def login(self):
         login_response = self.plus_api_client.login_to_tool_repository()
@@ -1,6 +1,6 @@
 import importlib.metadata
 
 
 def get_crewai_version() -> str:
     """Get the version number of CrewAI running the CLI"""
     return importlib.metadata.version("crewai")
@@ -1,5 +1,6 @@
 import asyncio
 import json
+import os
 import uuid
 import warnings
 from concurrent.futures import Future
@@ -35,7 +36,6 @@ from crewai.tasks.conditional_task import ConditionalTask
 from crewai.tasks.task_output import TaskOutput
 from crewai.telemetry import Telemetry
 from crewai.tools.agent_tools.agent_tools import AgentTools
-from crewai.tools.base_tool import Tool
 from crewai.types.usage_metrics import UsageMetrics
 from crewai.utilities import I18N, FileHandler, Logger, RPMController
 from crewai.utilities.constants import TRAINING_DATA_FILE
@@ -49,10 +49,12 @@ from crewai.utilities.planning_handler import CrewPlanner
 from crewai.utilities.task_output_storage_handler import TaskOutputStorageHandler
 from crewai.utilities.training_handler import CrewTrainingHandler
 
-try:
+agentops = None
+
+if os.environ.get("AGENTOPS_API_KEY"):
+    try:
     import agentops  # type: ignore
 except ImportError:
-    agentops = None
+        pass
 
 warnings.filterwarnings("ignore", category=SyntaxWarning, module="pysbd")
@@ -534,6 +536,9 @@ class Crew(BaseModel):
             if not agent.function_calling_llm:  # type: ignore # "BaseAgent" has no attribute "function_calling_llm"
                 agent.function_calling_llm = self.function_calling_llm  # type: ignore # "BaseAgent" has no attribute "function_calling_llm"
 
+            if agent.allow_code_execution:  # type: ignore # BaseAgent" has no attribute "allow_code_execution"
+                agent.tools += agent.get_code_execution_tools()  # type: ignore # "BaseAgent" has no attribute "get_code_execution_tools"; maybe "get_delegation_tools"?
+
             if not agent.step_callback:  # type: ignore # "BaseAgent" has no attribute "step_callback"
                 agent.step_callback = self.step_callback  # type: ignore # "BaseAgent" has no attribute "step_callback"
 
@@ -670,6 +675,7 @@ class Crew(BaseModel):
                 )
                 manager.tools = []
                 raise Exception("Manager agent should not have tools")
+            manager.tools = self.manager_agent.get_delegation_tools(self.agents)
         else:
             self.manager_llm = (
                 getattr(self.manager_llm, "model_name", None)
@@ -681,7 +687,6 @@ class Crew(BaseModel):
                 goal=i18n.retrieve("hierarchical_manager_agent", "goal"),
                 backstory=i18n.retrieve("hierarchical_manager_agent", "backstory"),
                 tools=AgentTools(agents=self.agents).tools(),
-                allow_delegation=True,
                 llm=self.manager_llm,
                 verbose=self.verbose,
             )
@@ -724,10 +729,7 @@ class Crew(BaseModel):
                     f"No agent available for task: {task.description}. Ensure that either the task has an assigned agent or a manager agent is provided."
                 )
 
-            # Determine which tools to use - task tools take precedence over agent tools
-            tools_for_task = task.tools or agent_to_use.tools or []
-            tools_for_task = self._prepare_tools(agent_to_use, task, tools_for_task)
+            self._prepare_agent_tools(task)
 
             self._log_task_start(task, agent_to_use.role)
 
             if isinstance(task, ConditionalTask):
@@ -744,7 +746,7 @@ class Crew(BaseModel):
                 future = task.execute_async(
                     agent=agent_to_use,
                     context=context,
-                    tools=tools_for_task,
+                    tools=agent_to_use.tools,
                 )
                 futures.append((task, future, task_index))
             else:
@@ -756,7 +758,7 @@ class Crew(BaseModel):
                 task_output = task.execute_sync(
                     agent=agent_to_use,
                     context=context,
-                    tools=tools_for_task,
+                    tools=agent_to_use.tools,
                 )
                 task_outputs = [task_output]
                 self._process_task_result(task, task_output)
@@ -793,77 +795,45 @@ class Crew(BaseModel):
             return skipped_task_output
         return None
 
-    def _prepare_tools(
-        self, agent: BaseAgent, task: Task, tools: List[Tool]
-    ) -> List[Tool]:
-        # Add delegation tools if agent allows delegation
-        if agent.allow_delegation:
-            if self.process == Process.hierarchical:
-                if self.manager_agent:
-                    tools = self._update_manager_tools(task, tools)
-                else:
-                    raise ValueError(
-                        "Manager agent is required for hierarchical process."
-                    )
-
-            elif agent and agent.allow_delegation:
-                tools = self._add_delegation_tools(task, tools)
-
-        # Add code execution tools if agent allows code execution
-        if agent.allow_code_execution:
-            tools = self._add_code_execution_tools(agent, tools)
-
-        if agent and agent.multimodal:
-            tools = self._add_multimodal_tools(agent, tools)
-
-        return tools
+    def _prepare_agent_tools(self, task: Task):
+        if self.process == Process.hierarchical:
+            if self.manager_agent:
+                self._update_manager_tools(task)
+            else:
+                raise ValueError("Manager agent is required for hierarchical process.")
+        elif task.agent and task.agent.allow_delegation:
+            self._add_delegation_tools(task)
 
     def _get_agent_to_use(self, task: Task) -> Optional[BaseAgent]:
         if self.process == Process.hierarchical:
             return self.manager_agent
         return task.agent
 
-    def _merge_tools(
-        self, existing_tools: List[Tool], new_tools: List[Tool]
-    ) -> List[Tool]:
-        """Merge new tools into existing tools list, avoiding duplicates by tool name."""
-        if not new_tools:
-            return existing_tools
-
-        # Create mapping of tool names to new tools
-        new_tool_map = {tool.name: tool for tool in new_tools}
-
-        # Remove any existing tools that will be replaced
-        tools = [tool for tool in existing_tools if tool.name not in new_tool_map]
-
-        # Add all new tools
-        tools.extend(new_tools)
-
-        return tools
-
-    def _inject_delegation_tools(
-        self, tools: List[Tool], task_agent: BaseAgent, agents: List[BaseAgent]
-    ):
-        delegation_tools = task_agent.get_delegation_tools(agents)
-        return self._merge_tools(tools, delegation_tools)
-
-    def _add_multimodal_tools(self, agent: BaseAgent, tools: List[Tool]):
-        multimodal_tools = agent.get_multimodal_tools()
-        return self._merge_tools(tools, multimodal_tools)
-
-    def _add_code_execution_tools(self, agent: BaseAgent, tools: List[Tool]):
-        code_tools = agent.get_code_execution_tools()
-        return self._merge_tools(tools, code_tools)
-
-    def _add_delegation_tools(self, task: Task, tools: List[Tool]):
+    def _add_delegation_tools(self, task: Task):
         agents_for_delegation = [agent for agent in self.agents if agent != task.agent]
         if len(self.agents) > 1 and len(agents_for_delegation) > 0 and task.agent:
-            if not tools:
-                tools = []
-            tools = self._inject_delegation_tools(
-                tools, task.agent, agents_for_delegation
-            )
-            return tools
+            delegation_tools = task.agent.get_delegation_tools(agents_for_delegation)
+
+            # Add tools if they are not already in task.tools
+            for new_tool in delegation_tools:
+                # Find the index of the tool with the same name
+                existing_tool_index = next(
+                    (
+                        index
+                        for index, tool in enumerate(task.tools or [])
+                        if tool.name == new_tool.name
+                    ),
+                    None,
+                )
+                if not task.tools:
+                    task.tools = []
+
+                if existing_tool_index is not None:
+                    # Replace the existing tool
+                    task.tools[existing_tool_index] = new_tool
+                else:
+                    # Add the new tool
+                    task.tools.append(new_tool)
 
     def _log_task_start(self, task: Task, role: str = "None"):
         if self.output_log_file:
@@ -871,15 +841,14 @@ class Crew(BaseModel):
                 task_name=task.name, task=task.description, agent=role, status="started"
             )
 
-    def _update_manager_tools(self, task: Task, tools: List[Tool]):
+    def _update_manager_tools(self, task: Task):
         if self.manager_agent:
             if task.agent:
-                tools = self._inject_delegation_tools(tools, task.agent, [task.agent])
+                self.manager_agent.tools = task.agent.get_delegation_tools([task.agent])
             else:
-                tools = self._inject_delegation_tools(
-                    tools, self.manager_agent, self.agents
-                )
+                self.manager_agent.tools = self.manager_agent.get_delegation_tools(
+                    self.agents
+                )
-        return tools
 
     def _get_context(self, task: Task, task_outputs: List[TaskOutput]):
         context = (
@@ -1063,7 +1032,6 @@ class Crew(BaseModel):
             agentops.end_session(
                 end_state="Success",
                 end_state_reason="Finished Execution",
-                is_auto_end=True,
             )
             self._telemetry.end_crew(self, final_string_output)
@@ -14,15 +14,8 @@ from typing import (
|
|||||||
cast,
|
cast,
|
||||||
)
|
)
|
||||||
|
|
||||||
from blinker import Signal
|
|
||||||
from pydantic import BaseModel, ValidationError
|
from pydantic import BaseModel, ValidationError
|
||||||
|
|
||||||
from crewai.flow.flow_events import (
|
|
||||||
FlowFinishedEvent,
|
|
||||||
FlowStartedEvent,
|
|
||||||
MethodExecutionFinishedEvent,
|
|
||||||
MethodExecutionStartedEvent,
|
|
||||||
)
|
|
||||||
from crewai.flow.flow_visualizer import plot_flow
|
from crewai.flow.flow_visualizer import plot_flow
|
||||||
from crewai.flow.utils import get_possible_return_constants
|
from crewai.flow.utils import get_possible_return_constants
|
||||||
from crewai.telemetry import Telemetry
|
from crewai.telemetry import Telemetry
|
||||||
@@ -30,47 +23,7 @@ from crewai.telemetry import Telemetry
 T = TypeVar("T", bound=Union[BaseModel, Dict[str, Any]])


-def start(condition: Optional[Union[str, dict, Callable]] = None) -> Callable:
-    """
-    Marks a method as a flow's starting point.
-
-    This decorator designates a method as an entry point for the flow execution.
-    It can optionally specify conditions that trigger the start based on other
-    method executions.
-
-    Parameters
-    ----------
-    condition : Optional[Union[str, dict, Callable]], optional
-        Defines when the start method should execute. Can be:
-        - str: Name of a method that triggers this start
-        - dict: Contains "type" ("AND"/"OR") and "methods" (list of triggers)
-        - Callable: A method reference that triggers this start
-        Default is None, meaning unconditional start.
-
-    Returns
-    -------
-    Callable
-        A decorator function that marks the method as a flow start point.
-
-    Raises
-    ------
-    ValueError
-        If the condition format is invalid.
-
-    Examples
-    --------
-    >>> @start()  # Unconditional start
-    >>> def begin_flow(self):
-    ...     pass
-
-    >>> @start("method_name")  # Start after specific method
-    >>> def conditional_start(self):
-    ...     pass
-
-    >>> @start(and_("method1", "method2"))  # Start after multiple methods
-    >>> def complex_start(self):
-    ...     pass
-    """
+def start(condition=None):
     def decorator(func):
         func.__is_start_method__ = True
         if condition is not None:
@@ -95,42 +48,8 @@ def start(condition: Optional[Union[str, dict, Callable]] = None) -> Callable:

     return decorator

-def listen(condition: Union[str, dict, Callable]) -> Callable:
-    """
-    Creates a listener that executes when specified conditions are met.
-
-    This decorator sets up a method to execute in response to other method
-    executions in the flow. It supports both simple and complex triggering
-    conditions.
-
-    Parameters
-    ----------
-    condition : Union[str, dict, Callable]
-        Specifies when the listener should execute. Can be:
-        - str: Name of a method that triggers this listener
-        - dict: Contains "type" ("AND"/"OR") and "methods" (list of triggers)
-        - Callable: A method reference that triggers this listener
-
-    Returns
-    -------
-    Callable
-        A decorator function that sets up the method as a listener.
-
-    Raises
-    ------
-    ValueError
-        If the condition format is invalid.
-
-    Examples
-    --------
-    >>> @listen("process_data")  # Listen to single method
-    >>> def handle_processed_data(self):
-    ...     pass
-
-    >>> @listen(or_("success", "failure"))  # Listen to multiple methods
-    >>> def handle_completion(self):
-    ...     pass
-    """
+def listen(condition):
     def decorator(func):
         if isinstance(condition, str):
             func.__trigger_methods__ = [condition]
@@ -154,103 +73,16 @@ def listen(condition: Union[str, dict, Callable]) -> Callable:
     return decorator


-def router(condition: Union[str, dict, Callable]) -> Callable:
-    """
-    Creates a routing method that directs flow execution based on conditions.
-
-    This decorator marks a method as a router, which can dynamically determine
-    the next steps in the flow based on its return value. Routers are triggered
-    by specified conditions and can return constants that determine which path
-    the flow should take.
-
-    Parameters
-    ----------
-    condition : Union[str, dict, Callable]
-        Specifies when the router should execute. Can be:
-        - str: Name of a method that triggers this router
-        - dict: Contains "type" ("AND"/"OR") and "methods" (list of triggers)
-        - Callable: A method reference that triggers this router
-
-    Returns
-    -------
-    Callable
-        A decorator function that sets up the method as a router.
-
-    Raises
-    ------
-    ValueError
-        If the condition format is invalid.
-
-    Examples
-    --------
-    >>> @router("check_status")
-    >>> def route_based_on_status(self):
-    ...     if self.state.status == "success":
-    ...         return SUCCESS
-    ...     return FAILURE
-
-    >>> @router(and_("validate", "process"))
-    >>> def complex_routing(self):
-    ...     if all([self.state.valid, self.state.processed]):
-    ...         return CONTINUE
-    ...     return STOP
-    """
+def router(method):
     def decorator(func):
         func.__is_router__ = True
-        if isinstance(condition, str):
-            func.__trigger_methods__ = [condition]
-            func.__condition_type__ = "OR"
-        elif (
-            isinstance(condition, dict)
-            and "type" in condition
-            and "methods" in condition
-        ):
-            func.__trigger_methods__ = condition["methods"]
-            func.__condition_type__ = condition["type"]
-        elif callable(condition) and hasattr(condition, "__name__"):
-            func.__trigger_methods__ = [condition.__name__]
-            func.__condition_type__ = "OR"
-        else:
-            raise ValueError(
-                "Condition must be a method, string, or a result of or_() or and_()"
-            )
+        func.__router_for__ = method.__name__
         return func

     return decorator

-def or_(*conditions: Union[str, dict, Callable]) -> dict:
-    """
-    Combines multiple conditions with OR logic for flow control.
-
-    Creates a condition that is satisfied when any of the specified conditions
-    are met. This is used with @start, @listen, or @router decorators to create
-    complex triggering conditions.
-
-    Parameters
-    ----------
-    *conditions : Union[str, dict, Callable]
-        Variable number of conditions that can be:
-        - str: Method names
-        - dict: Existing condition dictionaries
-        - Callable: Method references
-
-    Returns
-    -------
-    dict
-        A condition dictionary with format:
-        {"type": "OR", "methods": list_of_method_names}
-
-    Raises
-    ------
-    ValueError
-        If any condition is invalid.
-
-    Examples
-    --------
-    >>> @listen(or_("success", "timeout"))
-    >>> def handle_completion(self):
-    ...     pass
-    """
+def or_(*conditions):
     methods = []
     for condition in conditions:
         if isinstance(condition, dict) and "methods" in condition:
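The docstrings removed in the hunks above describe the decorator API. A minimal usage sketch, following the condition-based signatures shown on the "-" side and the usual crewai.flow.flow import path (class and method names are invented for illustration, not part of the diff):

    from crewai.flow.flow import Flow, listen, or_, router, start

    class ExamplePipeline(Flow):
        @start()                       # unconditional entry point
        def begin(self):
            return "started"

        @router("begin")               # runs after begin(); its return value selects a path
        def decide(self):
            return "success"

        @listen("success")             # fires when the router returns "success"
        def on_success(self):
            return "handled success"

        @listen(or_("failure", "on_success"))   # fires when either method completes
        def wrap_up(self):
            return "done"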
@@ -264,39 +96,7 @@ def or_(*conditions: Union[str, dict, Callable]) -> dict:
     return {"type": "OR", "methods": methods}


-def and_(*conditions: Union[str, dict, Callable]) -> dict:
-    """
-    Combines multiple conditions with AND logic for flow control.
-
-    Creates a condition that is satisfied only when all specified conditions
-    are met. This is used with @start, @listen, or @router decorators to create
-    complex triggering conditions.
-
-    Parameters
-    ----------
-    *conditions : Union[str, dict, Callable]
-        Variable number of conditions that can be:
-        - str: Method names
-        - dict: Existing condition dictionaries
-        - Callable: Method references
-
-    Returns
-    -------
-    dict
-        A condition dictionary with format:
-        {"type": "AND", "methods": list_of_method_names}
-
-    Raises
-    ------
-    ValueError
-        If any condition is invalid.
-
-    Examples
-    --------
-    >>> @listen(and_("validated", "processed"))
-    >>> def handle_complete_data(self):
-    ...     pass
-    """
+def and_(*conditions):
     methods = []
     for condition in conditions:
         if isinstance(condition, dict) and "methods" in condition:
@@ -316,8 +116,8 @@ class FlowMeta(type):

         start_methods = []
         listeners = {}
+        routers = {}
         router_paths = {}
-        routers = set()

         for attr_name, attr_value in dct.items():
             if hasattr(attr_value, "__is_start_method__"):
@@ -330,12 +130,19 @@ class FlowMeta(type):
                 methods = attr_value.__trigger_methods__
                 condition_type = getattr(attr_value, "__condition_type__", "OR")
                 listeners[attr_name] = (condition_type, methods)
-            if hasattr(attr_value, "__is_router__") and attr_value.__is_router__:
-                routers.add(attr_name)
+            elif hasattr(attr_value, "__is_router__"):
+                routers[attr_value.__router_for__] = attr_name
                 possible_returns = get_possible_return_constants(attr_value)
                 if possible_returns:
                     router_paths[attr_name] = possible_returns

+                # Register router as a listener to its triggering method
+                trigger_method_name = attr_value.__router_for__
+                methods = [trigger_method_name]
+                condition_type = "OR"
+                listeners[attr_name] = (condition_type, methods)
+
         setattr(cls, "_start_methods", start_methods)
         setattr(cls, "_listeners", listeners)
         setattr(cls, "_routers", routers)
@@ -349,10 +156,9 @@ class Flow(Generic[T], metaclass=FlowMeta):

     _start_methods: List[str] = []
     _listeners: Dict[str, tuple[str, List[str]]] = {}
-    _routers: Set[str] = set()
+    _routers: Dict[str, str] = {}
     _router_paths: Dict[str, List[str]] = {}
     initial_state: Union[Type[T], T, None] = None
-    event_emitter = Signal("event_emitter")

     def __class_getitem__(cls: Type["Flow"], item: Type[T]) -> Type["Flow"]:
         class _FlowGeneric(cls):  # type: ignore
@@ -396,10 +202,20 @@ class Flow(Generic[T], metaclass=FlowMeta):
         return self._method_outputs

     def _initialize_state(self, inputs: Dict[str, Any]) -> None:
-        if isinstance(self._state, BaseModel):
-            # Structured state
-            try:
+        """
+        Initializes or updates the state with the provided inputs.
+
+        Args:
+            inputs: Dictionary of inputs to initialize or update the state.
+
+        Raises:
+            ValueError: If inputs do not match the structured state model.
+            TypeError: If state is neither a BaseModel instance nor a dictionary.
+        """
+        if isinstance(self._state, BaseModel):
+            # Structured state management
+            try:
+                # Define a function to create the dynamic class
                 def create_model_with_extra_forbid(
                     base_model: Type[BaseModel],
                 ) -> Type[BaseModel]:
@@ -409,33 +225,50 @@ class Flow(Generic[T], metaclass=FlowMeta):

                     return ModelWithExtraForbid

+                # Create the dynamic class
                 ModelWithExtraForbid = create_model_with_extra_forbid(
                     self._state.__class__
                 )
+
+                # Create a new instance using the combined state and inputs
                 self._state = cast(
                     T, ModelWithExtraForbid(**{**self._state.model_dump(), **inputs})
                 )

             except ValidationError as e:
                 raise ValueError(f"Invalid inputs for structured state: {e}") from e
         elif isinstance(self._state, dict):
+            # Unstructured state management
             self._state.update(inputs)
         else:
             raise TypeError("State must be a BaseModel instance or a dictionary.")

     def kickoff(self, inputs: Optional[Dict[str, Any]] = None) -> Any:
-        self.event_emitter.send(
-            self,
-            event=FlowStartedEvent(
-                type="flow_started",
-                flow_name=self.__class__.__name__,
-            ),
-        )
+        """
+        Starts the execution of the flow synchronously.
+
+        Args:
+            inputs: Optional dictionary of inputs to initialize or update the state.
+
+        Returns:
+            The final output from the flow execution.
+        """
         if inputs is not None:
             self._initialize_state(inputs)
         return asyncio.run(self.kickoff_async())

     async def kickoff_async(self, inputs: Optional[Dict[str, Any]] = None) -> Any:
+        """
+        Starts the execution of the flow asynchronously.
+
+        Args:
+            inputs: Optional dictionary of inputs to initialize or update the state.
+
+        Returns:
+            The final output from the flow execution.
+        """
+        if inputs is not None:
+            self._initialize_state(inputs)
         if not self._start_methods:
             raise ValueError("No start method defined")
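Both kickoff variants above hand any inputs to _initialize_state before the start methods run. A small sketch of the two state styles that method distinguishes (assuming the public Flow/kickoff/state API; class names are illustrative):

    from pydantic import BaseModel
    from crewai.flow.flow import Flow, start

    class CounterState(BaseModel):
        count: int = 0

    class CounterFlow(Flow[CounterState]):   # structured state: validated by the model
        @start()
        def bump(self):
            self.state.count += 1
            return self.state.count

    # Inputs are merged into the state before execution; unexpected keys raise
    # ValueError because the model is rebuilt with extra fields forbidden.
    print(CounterFlow().kickoff(inputs={"count": 41}))   # 42

    class DictFlow(Flow):                    # unstructured state: a plain dict
        @start()
        def greet(self):
            return f"hello {self.state.get('name', 'world')}"

    print(DictFlow().kickoff(inputs={"name": "crew"}))   # hello crew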
@@ -443,42 +276,22 @@ class Flow(Generic[T], metaclass=FlowMeta):
             self.__class__.__name__, list(self._methods.keys())
         )

+        # Create tasks for all start methods
         tasks = [
             self._execute_start_method(start_method)
             for start_method in self._start_methods
         ]

+        # Run all start methods concurrently
         await asyncio.gather(*tasks)

-        final_output = self._method_outputs[-1] if self._method_outputs else None
-        self.event_emitter.send(
-            self,
-            event=FlowFinishedEvent(
-                type="flow_finished",
-                flow_name=self.__class__.__name__,
-                result=final_output,
-            ),
-        )
-        return final_output
+        # Return the final output (from the last executed method)
+        if self._method_outputs:
+            return self._method_outputs[-1]
+        else:
+            return None  # Or raise an exception if no methods were executed

     async def _execute_start_method(self, start_method_name: str) -> None:
-        """
-        Executes a flow's start method and its triggered listeners.
-
-        This internal method handles the execution of methods marked with @start
-        decorator and manages the subsequent chain of listener executions.
-
-        Parameters
-        ----------
-        start_method_name : str
-            The name of the start method to execute.
-
-        Notes
-        -----
-        - Executes the start method and captures its result
-        - Triggers execution of any listeners waiting on this start method
-        - Part of the flow's initialization sequence
-        """
         result = await self._execute_method(
             start_method_name, self._methods[start_method_name]
         )
@@ -492,181 +305,70 @@ class Flow(Generic[T], metaclass=FlowMeta):
             if asyncio.iscoroutinefunction(method)
             else method(*args, **kwargs)
         )
-        self._method_outputs.append(result)
+        self._method_outputs.append(result)  # Store the output

+        # Track method execution counts
         self._method_execution_counts[method_name] = (
             self._method_execution_counts.get(method_name, 0) + 1
         )

         return result

     async def _execute_listeners(self, trigger_method: str, result: Any) -> None:
-        """
-        Executes all listeners and routers triggered by a method completion.
-
-        This internal method manages the execution flow by:
-        1. First executing all triggered routers sequentially
-        2. Then executing all triggered listeners in parallel
-
-        Parameters
-        ----------
-        trigger_method : str
-            The name of the method that triggered these listeners.
-        result : Any
-            The result from the triggering method, passed to listeners
-            that accept parameters.
-
-        Notes
-        -----
-        - Routers are executed sequentially to maintain flow control
-        - Each router's result becomes the new trigger_method
-        - Normal listeners are executed in parallel for efficiency
-        - Listeners can receive the trigger method's result as a parameter
-        """
-        # First, handle routers repeatedly until no router triggers anymore
-        while True:
-            routers_triggered = self._find_triggered_methods(
-                trigger_method, router_only=True
-            )
-            if not routers_triggered:
-                break
-            for router_name in routers_triggered:
-                await self._execute_single_listener(router_name, result)
-                # After executing router, the router's result is the path
-                # The last router executed sets the trigger_method
-                # The router result is the last element in self._method_outputs
-                trigger_method = self._method_outputs[-1]
-
-        # Now that no more routers are triggered by current trigger_method,
-        # execute normal listeners
-        listeners_triggered = self._find_triggered_methods(
-            trigger_method, router_only=False
-        )
-        if listeners_triggered:
-            tasks = [
-                self._execute_single_listener(listener_name, result)
-                for listener_name in listeners_triggered
-            ]
-            await asyncio.gather(*tasks)
-
-    def _find_triggered_methods(
-        self, trigger_method: str, router_only: bool
-    ) -> List[str]:
-        """
-        Finds all methods that should be triggered based on conditions.
-
-        This internal method evaluates both OR and AND conditions to determine
-        which methods should be executed next in the flow.
-
-        Parameters
-        ----------
-        trigger_method : str
-            The name of the method that just completed execution.
-        router_only : bool
-            If True, only consider router methods.
-            If False, only consider non-router methods.
-
-        Returns
-        -------
-        List[str]
-            Names of methods that should be triggered.
-
-        Notes
-        -----
-        - Handles both OR and AND conditions:
-          * OR: Triggers if any condition is met
-          * AND: Triggers only when all conditions are met
-        - Maintains state for AND conditions using _pending_and_listeners
-        - Separates router and normal listener evaluation
-        """
-        triggered = []
+        listener_tasks = []
+
+        if trigger_method in self._routers:
+            router_method = self._methods[self._routers[trigger_method]]
+            path = await self._execute_method(
+                self._routers[trigger_method], router_method
+            )
+            trigger_method = path
+
         for listener_name, (condition_type, methods) in self._listeners.items():
-            is_router = listener_name in self._routers
-
-            if router_only != is_router:
-                continue
-
             if condition_type == "OR":
-                # If the trigger_method matches any in methods, run this
                 if trigger_method in methods:
-                    triggered.append(listener_name)
+                    # Schedule the listener without preventing re-execution
+                    listener_tasks.append(
+                        self._execute_single_listener(listener_name, result)
+                    )
             elif condition_type == "AND":
                 # Initialize pending methods for this listener if not already done
                 if listener_name not in self._pending_and_listeners:
                     self._pending_and_listeners[listener_name] = set(methods)
                 # Remove the trigger method from pending methods
-                if trigger_method in self._pending_and_listeners[listener_name]:
                 self._pending_and_listeners[listener_name].discard(trigger_method)

                 if not self._pending_and_listeners[listener_name]:
                     # All required methods have been executed
-                    triggered.append(listener_name)
+                    listener_tasks.append(
+                        self._execute_single_listener(listener_name, result)
+                    )
                     # Reset pending methods for this listener
                     self._pending_and_listeners.pop(listener_name, None)

-        return triggered
+        # Run all listener tasks concurrently and wait for them to complete
+        if listener_tasks:
+            await asyncio.gather(*listener_tasks)

     async def _execute_single_listener(self, listener_name: str, result: Any) -> None:
-        """
-        Executes a single listener method with proper event handling.
-
-        This internal method manages the execution of an individual listener,
-        including parameter inspection, event emission, and error handling.
-
-        Parameters
-        ----------
-        listener_name : str
-            The name of the listener method to execute.
-        result : Any
-            The result from the triggering method, which may be passed
-            to the listener if it accepts parameters.
-
-        Notes
-        -----
-        - Inspects method signature to determine if it accepts the trigger result
-        - Emits events for method execution start and finish
-        - Handles errors gracefully with detailed logging
-        - Recursively triggers listeners of this listener
-        - Supports both parameterized and parameter-less listeners
-
-        Error Handling
-        -------------
-        Catches and logs any exceptions during execution, preventing
-        individual listener failures from breaking the entire flow.
-        """
         try:
             method = self._methods[listener_name]

-            self.event_emitter.send(
-                self,
-                event=MethodExecutionStartedEvent(
-                    type="method_execution_started",
-                    method_name=listener_name,
-                    flow_name=self.__class__.__name__,
-                ),
-            )
-
             sig = inspect.signature(method)
             params = list(sig.parameters.values())

+            # Exclude 'self' parameter
             method_params = [p for p in params if p.name != "self"]

             if method_params:
+                # If listener expects parameters, pass the result
                 listener_result = await self._execute_method(
                     listener_name, method, result
                 )
             else:
+                # If listener does not expect parameters, call without arguments
                 listener_result = await self._execute_method(listener_name, method)

-            self.event_emitter.send(
-                self,
-                event=MethodExecutionFinishedEvent(
-                    type="method_execution_finished",
-                    method_name=listener_name,
-                    flow_name=self.__class__.__name__,
-                ),
-            )
-
-            # Execute listeners (and possibly routers) of this listener
+            # Execute listeners of this listener
             await self._execute_listeners(listener_name, listener_result)

         except Exception as e:
             print(
                 f"[Flow._execute_single_listener] Error in method {listener_name}: {e}"
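Both versions above keep the same AND bookkeeping: a listener registered with an and_ condition only runs once every trigger in its set has fired, after which its pending set is reset. A toy sketch of that bookkeeping on its own (names are illustrative):

    pending_and_listeners = {}
    triggers_for_report = {"validate", "process"}

    def on_method_finished(fired):
        pending = pending_and_listeners.setdefault("report", set(triggers_for_report))
        pending.discard(fired)
        if not pending:
            print("report listener runs")           # only after both triggers fired
            pending_and_listeners.pop("report", None)

    on_method_finished("validate")   # nothing yet
    on_method_finished("process")    # -> report listener runs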
@@ -679,4 +381,5 @@ class Flow(Generic[T], metaclass=FlowMeta):
         self._telemetry.flow_plotting_span(
             self.__class__.__name__, list(self._methods.keys())
         )
+
         plot_flow(self, filename)
@@ -1,33 +0,0 @@
-from dataclasses import dataclass, field
-from datetime import datetime
-from typing import Any, Optional
-
-
-@dataclass
-class Event:
-    type: str
-    flow_name: str
-    timestamp: datetime = field(init=False)
-
-    def __post_init__(self):
-        self.timestamp = datetime.now()
-
-
-@dataclass
-class FlowStartedEvent(Event):
-    pass
-
-
-@dataclass
-class MethodExecutionStartedEvent(Event):
-    method_name: str
-
-
-@dataclass
-class MethodExecutionFinishedEvent(Event):
-    method_name: str
-
-
-@dataclass
-class FlowFinishedEvent(Event):
-    result: Optional[Any] = None
@@ -1,14 +1,12 @@
 # flow_visualizer.py

 import os
-from pathlib import Path

 from pyvis.network import Network

 from crewai.flow.config import COLORS, NODE_STYLES
 from crewai.flow.html_template_handler import HTMLTemplateHandler
 from crewai.flow.legend_generator import generate_legend_items_html, get_legend_items
-from crewai.flow.path_utils import safe_path_join, validate_path_exists
 from crewai.flow.utils import calculate_node_levels
 from crewai.flow.visualization_utils import (
     add_edges,
@@ -18,56 +16,12 @@ from crewai.flow.visualization_utils import (


 class FlowPlot:
-    """Handles the creation and rendering of flow visualization diagrams."""
-
     def __init__(self, flow):
-        """
-        Initialize FlowPlot with a flow object.
-
-        Parameters
-        ----------
-        flow : Flow
-            A Flow instance to visualize.
-
-        Raises
-        ------
-        ValueError
-            If flow object is invalid or missing required attributes.
-        """
-        if not hasattr(flow, '_methods'):
-            raise ValueError("Invalid flow object: missing '_methods' attribute")
-        if not hasattr(flow, '_listeners'):
-            raise ValueError("Invalid flow object: missing '_listeners' attribute")
-        if not hasattr(flow, '_start_methods'):
-            raise ValueError("Invalid flow object: missing '_start_methods' attribute")
-
         self.flow = flow
         self.colors = COLORS
         self.node_styles = NODE_STYLES

     def plot(self, filename):
-        """
-        Generate and save an HTML visualization of the flow.
-
-        Parameters
-        ----------
-        filename : str
-            Name of the output file (without extension).
-
-        Raises
-        ------
-        ValueError
-            If filename is invalid or network generation fails.
-        IOError
-            If file operations fail or visualization cannot be generated.
-        RuntimeError
-            If network visualization generation fails.
-        """
-        if not filename or not isinstance(filename, str):
-            raise ValueError("Filename must be a non-empty string")
-
-        try:
-            # Initialize network
         net = Network(
             directed=True,
             height="750px",
@@ -93,85 +47,34 @@ class FlowPlot:
         )

         # Calculate levels for nodes
-        try:
         node_levels = calculate_node_levels(self.flow)
-        except Exception as e:
-            raise ValueError(f"Failed to calculate node levels: {str(e)}")

         # Compute positions
-        try:
         node_positions = compute_positions(self.flow, node_levels)
-        except Exception as e:
-            raise ValueError(f"Failed to compute node positions: {str(e)}")

         # Add nodes to the network
-        try:
         add_nodes_to_network(net, self.flow, node_positions, self.node_styles)
-        except Exception as e:
-            raise RuntimeError(f"Failed to add nodes to network: {str(e)}")

         # Add edges to the network
-        try:
         add_edges(net, self.flow, node_positions, self.colors)
-        except Exception as e:
-            raise RuntimeError(f"Failed to add edges to network: {str(e)}")

-        # Generate HTML
-        try:
         network_html = net.generate_html()
         final_html_content = self._generate_final_html(network_html)
-        except Exception as e:
-            raise RuntimeError(f"Failed to generate network visualization: {str(e)}")

         # Save the final HTML content to the file
-        try:
         with open(f"{filename}.html", "w", encoding="utf-8") as f:
             f.write(final_html_content)
         print(f"Plot saved as {filename}.html")
-        except IOError as e:
-            raise IOError(f"Failed to save flow visualization to {filename}.html: {str(e)}")

-        except (ValueError, RuntimeError, IOError) as e:
-            raise e
-        except Exception as e:
-            raise RuntimeError(f"Unexpected error during flow visualization: {str(e)}")
-        finally:
         self._cleanup_pyvis_lib()

     def _generate_final_html(self, network_html):
-        """
-        Generate the final HTML content with network visualization and legend.
-
-        Parameters
-        ----------
-        network_html : str
-            HTML content generated by pyvis Network.
-
-        Returns
-        -------
-        str
-            Complete HTML content with styling and legend.
-
-        Raises
-        ------
-        IOError
-            If template or logo files cannot be accessed.
-        ValueError
-            If network_html is invalid.
-        """
-        if not network_html:
-            raise ValueError("Invalid network HTML content")
-
-        try:
         # Extract just the body content from the generated HTML
         current_dir = os.path.dirname(__file__)
-        template_path = safe_path_join("assets", "crewai_flow_visual_template.html", root=current_dir)
-        logo_path = safe_path_join("assets", "crewai_logo.svg", root=current_dir)
-        if not os.path.exists(template_path):
-            raise IOError(f"Template file not found: {template_path}")
-        if not os.path.exists(logo_path):
-            raise IOError(f"Logo file not found: {logo_path}")
-
+        template_path = os.path.join(
+            current_dir, "assets", "crewai_flow_visual_template.html"
+        )
+        logo_path = os.path.join(current_dir, "assets", "crewai_logo.svg")
         html_handler = HTMLTemplateHandler(template_path, logo_path)
         network_body = html_handler.extract_body_content(network_html)
@@ -183,44 +86,19 @@ class FlowPlot:
             network_body, legend_items_html
         )
         return final_html_content
-        except Exception as e:
-            raise IOError(f"Failed to generate visualization HTML: {str(e)}")

     def _cleanup_pyvis_lib(self):
-        """
-        Clean up the generated lib folder from pyvis.
-
-        This method safely removes the temporary lib directory created by pyvis
-        during network visualization generation.
-        """
+        # Clean up the generated lib folder
+        lib_folder = os.path.join(os.getcwd(), "lib")
         try:
-            lib_folder = safe_path_join("lib", root=os.getcwd())
             if os.path.exists(lib_folder) and os.path.isdir(lib_folder):
                 import shutil

                 shutil.rmtree(lib_folder)
-        except ValueError as e:
-            print(f"Error validating lib folder path: {e}")
         except Exception as e:
-            print(f"Error cleaning up lib folder: {e}")
+            print(f"Error cleaning up {lib_folder}: {e}")


 def plot_flow(flow, filename="flow_plot"):
-    """
-    Convenience function to create and save a flow visualization.
-
-    Parameters
-    ----------
-    flow : Flow
-        Flow instance to visualize.
-    filename : str, optional
-        Output filename without extension, by default "flow_plot".
-
-    Raises
-    ------
-    ValueError
-        If flow object or filename is invalid.
-    IOError
-        If file operations fail.
-    """
     visualizer = FlowPlot(flow)
     visualizer.plot(filename)
@@ -1,53 +1,26 @@
 import base64
 import re
-from pathlib import Path
-
-from crewai.flow.path_utils import safe_path_join, validate_path_exists


 class HTMLTemplateHandler:
-    """Handles HTML template processing and generation for flow visualization diagrams."""
-
     def __init__(self, template_path, logo_path):
-        """
-        Initialize HTMLTemplateHandler with validated template and logo paths.
-
-        Parameters
-        ----------
-        template_path : str
-            Path to the HTML template file.
-        logo_path : str
-            Path to the logo image file.
-
-        Raises
-        ------
-        ValueError
-            If template or logo paths are invalid or files don't exist.
-        """
-        try:
-            self.template_path = validate_path_exists(template_path, "file")
-            self.logo_path = validate_path_exists(logo_path, "file")
-        except ValueError as e:
-            raise ValueError(f"Invalid template or logo path: {e}")
+        self.template_path = template_path
+        self.logo_path = logo_path

     def read_template(self):
-        """Read and return the HTML template file contents."""
         with open(self.template_path, "r", encoding="utf-8") as f:
             return f.read()

     def encode_logo(self):
-        """Convert the logo SVG file to base64 encoded string."""
         with open(self.logo_path, "rb") as logo_file:
             logo_svg_data = logo_file.read()
         return base64.b64encode(logo_svg_data).decode("utf-8")

     def extract_body_content(self, html):
-        """Extract and return content between body tags from HTML string."""
         match = re.search("<body.*?>(.*?)</body>", html, re.DOTALL)
         return match.group(1) if match else ""

     def generate_legend_items_html(self, legend_items):
-        """Generate HTML markup for the legend items."""
         legend_items_html = ""
         for item in legend_items:
             if "border" in item:
@@ -75,7 +48,6 @@ class HTMLTemplateHandler:
         return legend_items_html

     def generate_final_html(self, network_body, legend_items_html, title="Flow Plot"):
-        """Combine all components into final HTML document with network visualization."""
         html_template = self.read_template()
         logo_svg_base64 = self.encode_logo()

@@ -1,4 +1,3 @@
-
 def get_legend_items(colors):
     return [
         {"label": "Start Method", "color": colors["start"]},
@@ -1,135 +0,0 @@
-"""
-Path utilities for secure file operations in CrewAI flow module.
-
-This module provides utilities for secure path handling to prevent directory
-traversal attacks and ensure paths remain within allowed boundaries.
-"""
-
-import os
-from pathlib import Path
-from typing import List, Union
-
-
-def safe_path_join(*parts: str, root: Union[str, Path, None] = None) -> str:
-    """
-    Safely join path components and ensure the result is within allowed boundaries.
-
-    Parameters
-    ----------
-    *parts : str
-        Variable number of path components to join.
-    root : Union[str, Path, None], optional
-        Root directory to use as base. If None, uses current working directory.
-
-    Returns
-    -------
-    str
-        String representation of the resolved path.
-
-    Raises
-    ------
-    ValueError
-        If the resulting path would be outside the root directory
-        or if any path component is invalid.
-    """
-    if not parts:
-        raise ValueError("No path components provided")
-
-    try:
-        # Convert all parts to strings and clean them
-        clean_parts = [str(part).strip() for part in parts if part]
-        if not clean_parts:
-            raise ValueError("No valid path components provided")
-
-        # Establish root directory
-        root_path = Path(root).resolve() if root else Path.cwd()
-
-        # Join and resolve the full path
-        full_path = Path(root_path, *clean_parts).resolve()
-
-        # Check if the resolved path is within root
-        if not str(full_path).startswith(str(root_path)):
-            raise ValueError(
-                f"Invalid path: Potential directory traversal. Path must be within {root_path}"
-            )
-
-        return str(full_path)
-
-    except Exception as e:
-        if isinstance(e, ValueError):
-            raise
-        raise ValueError(f"Invalid path components: {str(e)}")
-
-
-def validate_path_exists(path: Union[str, Path], file_type: str = "file") -> str:
-    """
-    Validate that a path exists and is of the expected type.
-
-    Parameters
-    ----------
-    path : Union[str, Path]
-        Path to validate.
-    file_type : str, optional
-        Expected type ('file' or 'directory'), by default 'file'.
-
-    Returns
-    -------
-    str
-        Validated path as string.
-
-    Raises
-    ------
-    ValueError
-        If path doesn't exist or is not of expected type.
-    """
-    try:
-        path_obj = Path(path).resolve()
-
-        if not path_obj.exists():
-            raise ValueError(f"Path does not exist: {path}")
-
-        if file_type == "file" and not path_obj.is_file():
-            raise ValueError(f"Path is not a file: {path}")
-        elif file_type == "directory" and not path_obj.is_dir():
-            raise ValueError(f"Path is not a directory: {path}")
-
-        return str(path_obj)
-
-    except Exception as e:
-        if isinstance(e, ValueError):
-            raise
-        raise ValueError(f"Invalid path: {str(e)}")
-
-
-def list_files(directory: Union[str, Path], pattern: str = "*") -> List[str]:
-    """
-    Safely list files in a directory matching a pattern.
-
-    Parameters
-    ----------
-    directory : Union[str, Path]
-        Directory to search in.
-    pattern : str, optional
-        Glob pattern to match files against, by default "*".
-
-    Returns
-    -------
-    List[str]
-        List of matching file paths.
-
-    Raises
-    ------
-    ValueError
-        If directory is invalid or inaccessible.
-    """
-    try:
-        dir_path = Path(directory).resolve()
-        if not dir_path.is_dir():
-            raise ValueError(f"Not a directory: {directory}")
-
-        return [str(p) for p in dir_path.glob(pattern) if p.is_file()]
-
-    except Exception as e:
-        if isinstance(e, ValueError):
-            raise
-        raise ValueError(f"Error listing files: {str(e)}")
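The deleted module above centres on one check: a joined path must still resolve inside the chosen root. A stdlib-only sketch of that check, mirroring the removed safe_path_join (the helper name and the example root are illustrative):

    from pathlib import Path

    def is_inside_root(*parts, root=None):
        # Resolve the candidate path and confirm it stays under the root.
        root_path = Path(root).resolve() if root else Path.cwd()
        candidate = Path(root_path, *parts).resolve()
        return str(candidate).startswith(str(root_path))

    print(is_inside_root("assets", "logo.svg", root="/tmp/app"))         # True
    print(is_inside_root("..", "..", "etc", "passwd", root="/tmp/app"))  # False -> safe_path_join raised ValueError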
@@ -1,25 +1,9 @@
-"""
-Utility functions for flow visualization and dependency analysis.
-
-This module provides core functionality for analyzing and manipulating flow structures,
-including node level calculation, ancestor tracking, and return value analysis.
-Functions in this module are primarily used by the visualization system to create
-accurate and informative flow diagrams.
-
-Example
--------
->>> flow = Flow()
->>> node_levels = calculate_node_levels(flow)
->>> ancestors = build_ancestor_dict(flow)
-"""
-
 import ast
 import inspect
 import textwrap
-from typing import Any, Dict, List, Optional, Set, Union


-def get_possible_return_constants(function: Any) -> Optional[List[str]]:
+def get_possible_return_constants(function):
     try:
         source = inspect.getsource(function)
     except OSError:
@@ -47,80 +31,23 @@ def get_possible_return_constants(function: Any) -> Optional[List[str]]:
         print(f"Source code:\n{source}")
         return None

-    return_values = set()
-    dict_definitions = {}
-
-    class DictionaryAssignmentVisitor(ast.NodeVisitor):
-        def visit_Assign(self, node):
-            # Check if this assignment is assigning a dictionary literal to a variable
-            if isinstance(node.value, ast.Dict) and len(node.targets) == 1:
-                target = node.targets[0]
-                if isinstance(target, ast.Name):
-                    var_name = target.id
-                    dict_values = []
-                    # Extract string values from the dictionary
-                    for val in node.value.values:
-                        if isinstance(val, ast.Constant) and isinstance(val.value, str):
-                            dict_values.append(val.value)
-                        # If non-string, skip or just ignore
-                    if dict_values:
-                        dict_definitions[var_name] = dict_values
-            self.generic_visit(node)
+    return_values = []

     class ReturnVisitor(ast.NodeVisitor):
         def visit_Return(self, node):
-            # Direct string return
-            if isinstance(node.value, ast.Constant) and isinstance(
-                node.value.value, str
-            ):
-                return_values.add(node.value.value)
-            # Dictionary-based return, like return paths[result]
-            elif isinstance(node.value, ast.Subscript):
-                # Check if we're subscripting a known dictionary variable
-                if isinstance(node.value.value, ast.Name):
-                    var_name = node.value.value.id
-                    if var_name in dict_definitions:
-                        # Add all possible dictionary values
-                        for v in dict_definitions[var_name]:
-                            return_values.add(v)
-            self.generic_visit(node)
+            # Check if the return value is a constant (Python 3.8+)
+            if isinstance(node.value, ast.Constant):
+                return_values.append(node.value.value)

-    # First pass: identify dictionary assignments
-    DictionaryAssignmentVisitor().visit(code_ast)
-    # Second pass: identify returns
     ReturnVisitor().visit(code_ast)
-
-    return list(return_values) if return_values else None
+    return return_values


-def calculate_node_levels(flow: Any) -> Dict[str, int]:
-    """
-    Calculate the hierarchical level of each node in the flow.
-
-    Performs a breadth-first traversal of the flow graph to assign levels
-    to nodes, starting with start methods at level 0.
-
-    Parameters
-    ----------
-    flow : Any
-        The flow instance containing methods, listeners, and router configurations.
-
-    Returns
-    -------
-    Dict[str, int]
-        Dictionary mapping method names to their hierarchical levels.
-
-    Notes
-    -----
-    - Start methods are assigned level 0
-    - Each subsequent connected node is assigned level = parent_level + 1
-    - Handles both OR and AND conditions for listeners
-    - Processes router paths separately
-    """
-    levels: Dict[str, int] = {}
-    queue: List[str] = []
-    visited: Set[str] = set()
-    pending_and_listeners: Dict[str, Set[str]] = {}
+def calculate_node_levels(flow):
+    levels = {}
+    queue = []
+    visited = set()
+    pending_and_listeners = {}

     # Make all start methods at level 0
     for method_name, method in flow._methods.items():
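The two sides above differ in what they can recover: the "-" side also resolves dictionary-based returns such as return paths[result], while the "+" side only collects direct constant returns. A quick check of the intended behaviour (a sketch; run it from a file so inspect.getsource can read the code, and note that the "-" side collects into a set, so output order is not guaranteed):

    from crewai.flow.utils import get_possible_return_constants

    def route(outcome):
        paths = {"good": "success", "bad": "failure"}
        return paths[outcome]

    def simple():
        return "done"

    print(get_possible_return_constants(simple))  # ['done'] on either side
    print(get_possible_return_constants(route))   # "-" side: ['success', 'failure']; "+" side: []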
@@ -134,7 +61,10 @@ def calculate_node_levels(flow: Any) -> Dict[str, int]:
         current_level = levels[current]
         visited.add(current)

-        for listener_name, (condition_type, trigger_methods) in flow._listeners.items():
+        for listener_name, (
+            condition_type,
+            trigger_methods,
+        ) in flow._listeners.items():
             if condition_type == "OR":
                 if current in trigger_methods:
                     if (
@@ -159,7 +89,7 @@ def calculate_node_levels(flow: Any) -> Dict[str, int]:
                         queue.append(listener_name)

         # Handle router connections
-        if current in flow._routers:
+        if current in flow._routers.values():
             router_method_name = current
             paths = flow._router_paths.get(router_method_name, [])
             for path in paths:
@@ -175,24 +105,10 @@ def calculate_node_levels(flow: Any) -> Dict[str, int]:
                     levels[listener_name] = current_level + 1
                     if listener_name not in visited:
                         queue.append(listener_name)

     return levels


-def count_outgoing_edges(flow: Any) -> Dict[str, int]:
-    """
-    Count the number of outgoing edges for each method in the flow.
-
-    Parameters
-    ----------
-    flow : Any
-        The flow instance to analyze.
-
-    Returns
-    -------
-    Dict[str, int]
-        Dictionary mapping method names to their outgoing edge count.
-    """
+def count_outgoing_edges(flow):
     counts = {}
     for method_name in flow._methods:
         counts[method_name] = 0
@@ -204,53 +120,16 @@ def count_outgoing_edges(flow: Any) -> Dict[str, int]:
     return counts


-def build_ancestor_dict(flow: Any) -> Dict[str, Set[str]]:
-    """
-    Build a dictionary mapping each node to its ancestor nodes.
-
-    Parameters
-    ----------
-    flow : Any
-        The flow instance to analyze.
-
-    Returns
-    -------
-    Dict[str, Set[str]]
-        Dictionary mapping each node to a set of its ancestor nodes.
-    """
-    ancestors: Dict[str, Set[str]] = {node: set() for node in flow._methods}
-    visited: Set[str] = set()
+def build_ancestor_dict(flow):
+    ancestors = {node: set() for node in flow._methods}
+    visited = set()
     for node in flow._methods:
         if node not in visited:
             dfs_ancestors(node, ancestors, visited, flow)
     return ancestors


-def dfs_ancestors(
-    node: str,
-    ancestors: Dict[str, Set[str]],
-    visited: Set[str],
-    flow: Any
-) -> None:
-    """
-    Perform depth-first search to build ancestor relationships.
-
-    Parameters
-    ----------
-    node : str
-        Current node being processed.
-    ancestors : Dict[str, Set[str]]
-        Dictionary tracking ancestor relationships.
-    visited : Set[str]
-        Set of already visited nodes.
-    flow : Any
-        The flow instance being analyzed.
-
-    Notes
-    -----
-    This function modifies the ancestors dictionary in-place to build
-    the complete ancestor graph.
-    """
+def dfs_ancestors(node, ancestors, visited, flow):
     if node in visited:
         return
     visited.add(node)
@@ -263,7 +142,7 @@ def dfs_ancestors(
         dfs_ancestors(listener_name, ancestors, visited, flow)

     # Handle router methods separately
-    if node in flow._routers:
+    if node in flow._routers.values():
         router_method_name = node
         paths = flow._router_paths.get(router_method_name, [])
         for path in paths:
@@ -274,48 +153,12 @@ def dfs_ancestors(
             dfs_ancestors(listener_name, ancestors, visited, flow)


-def is_ancestor(node: str, ancestor_candidate: str, ancestors: Dict[str, Set[str]]) -> bool:
-    """
-    Check if one node is an ancestor of another.
-
-    Parameters
-    ----------
-    node : str
-        The node to check ancestors for.
-    ancestor_candidate : str
-        The potential ancestor node.
-    ancestors : Dict[str, Set[str]]
-        Dictionary containing ancestor relationships.
-
-    Returns
-    -------
-    bool
-        True if ancestor_candidate is an ancestor of node, False otherwise.
-    """
+def is_ancestor(node, ancestor_candidate, ancestors):
     return ancestor_candidate in ancestors.get(node, set())


-def build_parent_children_dict(flow: Any) -> Dict[str, List[str]]:
-    """
-    Build a dictionary mapping parent nodes to their children.
-
-    Parameters
-    ----------
-    flow : Any
-        The flow instance to analyze.
-
-    Returns
-    -------
-    Dict[str, List[str]]
-        Dictionary mapping parent method names to lists of their child method names.
-
-    Notes
-    -----
-    - Maps listeners to their trigger methods
-    - Maps router methods to their paths and listeners
-    - Children lists are sorted for consistent ordering
-    """
-    parent_children: Dict[str, List[str]] = {}
+def build_parent_children_dict(flow):
+    parent_children = {}

     # Map listeners to their trigger methods
     for listener_name, (_, trigger_methods) in flow._listeners.items():
@@ -339,24 +182,7 @@ def build_parent_children_dict(flow: Any) -> Dict[str, List[str]]:
|
|||||||
return parent_children
|
return parent_children
|
||||||
|
|
||||||
|
|
||||||
def get_child_index(parent: str, child: str, parent_children: Dict[str, List[str]]) -> int:
|
def get_child_index(parent, child, parent_children):
|
||||||
"""
|
|
||||||
Get the index of a child node in its parent's sorted children list.
|
|
||||||
|
|
||||||
Parameters
|
|
||||||
----------
|
|
||||||
parent : str
|
|
||||||
The parent node name.
|
|
||||||
child : str
|
|
||||||
The child node name to find the index for.
|
|
||||||
parent_children : Dict[str, List[str]]
|
|
||||||
Dictionary mapping parents to their children lists.
|
|
||||||
|
|
||||||
Returns
|
|
||||||
-------
|
|
||||||
int
|
|
||||||
Zero-based index of the child in its parent's sorted children list.
|
|
||||||
"""
|
|
||||||
children = parent_children.get(parent, [])
|
children = parent_children.get(parent, [])
|
||||||
children.sort()
|
children.sort()
|
||||||
return children.index(child)
|
return children.index(child)
|
||||||
|
|||||||
@@ -1,23 +1,5 @@
-"""
-Utilities for creating visual representations of flow structures.
-
-This module provides functions for generating network visualizations of flows,
-including node placement, edge creation, and visual styling. It handles the
-conversion of flow structures into visual network graphs with appropriate
-styling and layout.
-
-Example
--------
->>> flow = Flow()
->>> net = Network(directed=True)
->>> node_positions = compute_positions(flow, node_levels)
->>> add_nodes_to_network(net, flow, node_positions, node_styles)
->>> add_edges(net, flow, node_positions, colors)
-"""
-
 import ast
 import inspect
-from typing import Any, Dict, List, Optional, Tuple, Union

 from .utils import (
     build_ancestor_dict,
@@ -27,25 +9,8 @@ from .utils import (
 )


-def method_calls_crew(method: Any) -> bool:
-    """
-    Check if the method contains a call to `.crew()`.
-
-    Parameters
-    ----------
-    method : Any
-        The method to analyze for crew() calls.
-
-    Returns
-    -------
-    bool
-        True if the method calls .crew(), False otherwise.
-
-    Notes
-    -----
-    Uses AST analysis to detect method calls, specifically looking for
-    attribute access of 'crew'.
-    """
+def method_calls_crew(method):
+    """Check if the method calls `.crew()`."""
     try:
         source = inspect.getsource(method)
         source = inspect.cleandoc(source)
@@ -55,7 +20,6 @@ def method_calls_crew(method: Any) -> bool:
         return False

     class CrewCallVisitor(ast.NodeVisitor):
-        """AST visitor to detect .crew() method calls."""
         def __init__(self):
             self.found = False

@@ -70,34 +34,7 @@ def method_calls_crew(method: Any) -> bool:
     return visitor.found


-def add_nodes_to_network(
-    net: Any,
-    flow: Any,
-    node_positions: Dict[str, Tuple[float, float]],
-    node_styles: Dict[str, Dict[str, Any]]
-) -> None:
-    """
-    Add nodes to the network visualization with appropriate styling.
-
-    Parameters
-    ----------
-    net : Any
-        The pyvis Network instance to add nodes to.
-    flow : Any
-        The flow instance containing method information.
-    node_positions : Dict[str, Tuple[float, float]]
-        Dictionary mapping node names to their (x, y) positions.
-    node_styles : Dict[str, Dict[str, Any]]
-        Dictionary containing style configurations for different node types.
-
-    Notes
-    -----
-    Node types include:
-    - Start methods
-    - Router methods
-    - Crew methods
-    - Regular methods
-    """
+def add_nodes_to_network(net, flow, node_positions, node_styles):
     def human_friendly_label(method_name):
         return method_name.replace("_", " ").title()

@@ -136,33 +73,9 @@ def add_nodes_to_network(
     )


-def compute_positions(
-    flow: Any,
-    node_levels: Dict[str, int],
-    y_spacing: float = 150,
-    x_spacing: float = 150
-) -> Dict[str, Tuple[float, float]]:
-    """
-    Compute the (x, y) positions for each node in the flow graph.
-
-    Parameters
-    ----------
-    flow : Any
-        The flow instance to compute positions for.
-    node_levels : Dict[str, int]
-        Dictionary mapping node names to their hierarchical levels.
-    y_spacing : float, optional
-        Vertical spacing between levels, by default 150.
-    x_spacing : float, optional
-        Horizontal spacing between nodes, by default 150.
-
-    Returns
-    -------
-    Dict[str, Tuple[float, float]]
-        Dictionary mapping node names to their (x, y) coordinates.
-    """
-    level_nodes: Dict[int, List[str]] = {}
-    node_positions: Dict[str, Tuple[float, float]] = {}
+def compute_positions(flow, node_levels, y_spacing=150, x_spacing=150):
+    level_nodes = {}
+    node_positions = {}

     for method_name, level in node_levels.items():
         level_nodes.setdefault(level, []).append(method_name)
@@ -177,44 +90,16 @@ def compute_positions(
     return node_positions


-def add_edges(
-    net: Any,
-    flow: Any,
-    node_positions: Dict[str, Tuple[float, float]],
-    colors: Dict[str, str]
-) -> None:
-    edge_smooth: Dict[str, Union[str, float]] = {"type": "continuous"}  # Default value
-    """
-    Add edges to the network visualization with appropriate styling.
-
-    Parameters
-    ----------
-    net : Any
-        The pyvis Network instance to add edges to.
-    flow : Any
-        The flow instance containing edge information.
-    node_positions : Dict[str, Tuple[float, float]]
-        Dictionary mapping node names to their positions.
-    colors : Dict[str, str]
-        Dictionary mapping edge types to their colors.
-
-    Notes
-    -----
-    - Handles both normal listener edges and router edges
-    - Applies appropriate styling (color, dashes) based on edge type
-    - Adds curvature to edges when needed (cycles or multiple children)
-    """
+def add_edges(net, flow, node_positions, colors):
     ancestors = build_ancestor_dict(flow)
     parent_children = build_parent_children_dict(flow)

-    # Edges for normal listeners
     for method_name in flow._listeners:
         condition_type, trigger_methods = flow._listeners[method_name]
         is_and_condition = condition_type == "AND"

         for trigger in trigger_methods:
-            # Check if nodes exist before adding edges
-            if trigger in node_positions and method_name in node_positions:
+            if trigger in flow._methods or trigger in flow._routers.values():
                 is_router_edge = any(
                     trigger in paths for paths in flow._router_paths.values()
                 )
@@ -239,7 +124,7 @@ def add_edges(
                     else:
                         edge_smooth = {"type": "cubicBezier"}
                 else:
-                    edge_smooth.update({"type": "continuous"})
+                    edge_smooth = False

                 edge_style = {
                     "color": edge_color,
@@ -250,22 +135,7 @@ def add_edges(
                 }

                 net.add_edge(trigger, method_name, **edge_style)
-            else:
-                # Nodes not found in node_positions. Check if it's a known router outcome and a known method.
-                is_router_edge = any(
-                    trigger in paths for paths in flow._router_paths.values()
-                )
-                # Check if method_name is a known method
-                method_known = method_name in flow._methods
-
-                # If it's a known router edge and the method is known, don't warn.
-                # This means the path is legitimate, just not reflected as nodes here.
-                if not (is_router_edge and method_known):
-                    print(
-                        f"Warning: No node found for '{trigger}' or '{method_name}'. Skipping edge."
-                    )

-    # Edges for router return paths
     for router_method_name, paths in flow._router_paths.items():
         for path in paths:
             for listener_name, (
@@ -273,13 +143,7 @@ def add_edges(
                 trigger_methods,
             ) in flow._listeners.items():
                 if path in trigger_methods:
-                    if (
-                        router_method_name in node_positions
-                        and listener_name in node_positions
-                    ):
-                        is_cycle_edge = is_ancestor(
-                            router_method_name, listener_name, ancestors
-                        )
+                    is_cycle_edge = is_ancestor(trigger, method_name, ancestors)
                     parent_has_multiple_children = (
                         len(parent_children.get(router_method_name, [])) > 1
                     )
@@ -302,7 +166,7 @@ def add_edges(
                         else:
                             edge_smooth = {"type": "cubicBezier"}
                     else:
-                        edge_smooth.update({"type": "continuous"})
+                        edge_smooth = False

                     edge_style = {
                         "color": colors["router_edge"],
@@ -312,10 +176,3 @@ def add_edges(
                         "smooth": edge_smooth,
                     }
                     net.add_edge(router_method_name, listener_name, **edge_style)
-                    else:
-                        # Same check here: known router edge and known method?
-                        method_known = listener_name in flow._methods
-                        if not method_known:
-                            print(
-                                f"Warning: No node found for '{router_method_name}' or '{listener_name}'. Skipping edge."
-                            )
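The `edge_smooth` change above toggles between a vis.js smoothing preset and disabling smoothing entirely. A minimal sketch of how such a value reaches a pyvis edge, assuming pyvis is installed; the two node names are hypothetical placeholders, not methods from a real flow:

```python
from pyvis.network import Network

net = Network(directed=True)
net.add_node("start_method", x=0, y=0, physics=False)
net.add_node("listener_method", x=150, y=150, physics=False)

# Curved edge, as used for cycles or when a parent has several children.
net.add_edge("start_method", "listener_method", color="#666666", smooth={"type": "cubicBezier"})
# Plain straight edge, which is what `smooth=False` requests.
net.add_edge("listener_method", "start_method", color="#666666", smooth=False)

net.save_graph("flow_sketch.html")  # writes a standalone HTML visualization
```

pyvis forwards unknown keyword arguments straight to the underlying vis.js edge options, which is why both a dict and `False` are accepted for `smooth`.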
@@ -14,13 +14,13 @@ class Knowledge(BaseModel):
     Knowledge is a collection of sources and setup for the vector store to save and query relevant context.
     Args:
         sources: List[BaseKnowledgeSource] = Field(default_factory=list)
-        storage: Optional[KnowledgeStorage] = Field(default=None)
+        storage: KnowledgeStorage = Field(default_factory=KnowledgeStorage)
         embedder_config: Optional[Dict[str, Any]] = None
     """

     sources: List[BaseKnowledgeSource] = Field(default_factory=list)
     model_config = ConfigDict(arbitrary_types_allowed=True)
-    storage: Optional[KnowledgeStorage] = Field(default=None)
+    storage: KnowledgeStorage = Field(default_factory=KnowledgeStorage)
     embedder_config: Optional[Dict[str, Any]] = None
     collection_name: Optional[str] = None

@@ -49,12 +49,7 @@ class Knowledge(BaseModel):
         """
         Query across all knowledge sources to find the most relevant information.
         Returns the top_k most relevant chunks.
-
-        Raises:
-            ValueError: If storage is not initialized.
         """
-        if self.storage is None:
-            raise ValueError("Storage is not initialized.")
-
         results = self.storage.search(
             query,
@@ -1,8 +1,8 @@
 from abc import ABC, abstractmethod
 from pathlib import Path
-from typing import Dict, List, Optional, Union
+from typing import Dict, List, Union

-from pydantic import Field, field_validator
+from pydantic import Field

 from crewai.knowledge.source.base_knowledge_source import BaseKnowledgeSource
 from crewai.knowledge.storage.knowledge_storage import KnowledgeStorage
@@ -14,29 +14,17 @@ class BaseFileKnowledgeSource(BaseKnowledgeSource, ABC):
     """Base class for knowledge sources that load content from files."""

     _logger: Logger = Logger(verbose=True)
-    file_path: Optional[Union[Path, List[Path], str, List[str]]] = Field(
-        default=None,
-        description="[Deprecated] The path to the file. Use file_paths instead.",
-    )
-    file_paths: Optional[Union[Path, List[Path], str, List[str]]] = Field(
-        default_factory=list, description="The path to the file"
+    file_path: Union[Path, List[Path], str, List[str]] = Field(
+        ..., description="The path to the file"
     )
     content: Dict[Path, str] = Field(init=False, default_factory=dict)
-    storage: Optional[KnowledgeStorage] = Field(default=None)
+    storage: KnowledgeStorage = Field(default_factory=KnowledgeStorage)
     safe_file_paths: List[Path] = Field(default_factory=list)

-    @field_validator("file_path", "file_paths", mode="before")
-    def validate_file_path(cls, v, info):
-        """Validate that at least one of file_path or file_paths is provided."""
-        # Single check if both are None, O(1) instead of nested conditions
-        if v is None and info.data.get("file_path" if info.field_name == "file_paths" else "file_paths") is None:
-            raise ValueError("Either file_path or file_paths must be provided")
-        return v
-
     def model_post_init(self, _):
         """Post-initialization method to load content."""
         self.safe_file_paths = self._process_file_paths()
-        self.validate_content()
+        self.validate_paths()
         self.content = self.load_content()

     @abstractmethod
@@ -44,13 +32,13 @@ class BaseFileKnowledgeSource(BaseKnowledgeSource, ABC):
         """Load and preprocess file content. Should be overridden by subclasses. Assume that the file path is relative to the project root in the knowledge directory."""
         pass

-    def validate_content(self):
+    def validate_paths(self):
         """Validate the paths."""
         for path in self.safe_file_paths:
             if not path.exists():
                 self._logger.log(
                     "error",
-                    f"File not found: {path}. Try adding sources to the knowledge directory. If it's inside the knowledge directory, use the relative path.",
+                    f"File not found: {path}. Try adding sources to the knowledge directory. If its inside the knowledge directory, use the relative path.",
                     color="red",
                 )
                 raise FileNotFoundError(f"File not found: {path}")
@@ -63,10 +51,7 @@ class BaseFileKnowledgeSource(BaseKnowledgeSource, ABC):

     def _save_documents(self):
         """Save the documents to the storage."""
-        if self.storage:
-            self.storage.save(self.chunks)
-        else:
-            raise ValueError("No storage found to save documents.")
+        self.storage.save(self.chunks)

     def convert_to_path(self, path: Union[Path, str]) -> Path:
         """Convert a path to a Path object."""
@@ -74,30 +59,13 @@ class BaseFileKnowledgeSource(BaseKnowledgeSource, ABC):

     def _process_file_paths(self) -> List[Path]:
         """Convert file_path to a list of Path objects."""
-        if hasattr(self, "file_path") and self.file_path is not None:
-            self._logger.log(
-                "warning",
-                "The 'file_path' attribute is deprecated and will be removed in a future version. Please use 'file_paths' instead.",
-                color="yellow",
-            )
-            self.file_paths = self.file_path
-
-        if self.file_paths is None:
-            raise ValueError("Your source must be provided with a file_paths: []")
-
-        # Convert single path to list
-        path_list: List[Union[Path, str]] = (
-            [self.file_paths]
-            if isinstance(self.file_paths, (str, Path))
-            else list(self.file_paths)
-            if isinstance(self.file_paths, list)
-            else []
+        paths = (
+            [self.file_path]
+            if isinstance(self.file_path, (str, Path))
+            else self.file_path
         )

-        if not path_list:
-            raise ValueError(
-                "file_path/file_paths must be a Path, str, or a list of these types"
-            )
-
-        return [self.convert_to_path(path) for path in path_list]
+        if not isinstance(paths, list):
+            raise ValueError("file_path must be a Path, str, or a list of these types")
+
+        return [self.convert_to_path(path) for path in paths]
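The removed block above keeps the deprecated `file_path` field working by funnelling it into `file_paths` with a warning before normalizing everything to `Path` objects. The same pattern in isolation, as a minimal hedged sketch (the model name and fields here are illustrative, not the library's own class):

```python
import warnings
from pathlib import Path
from typing import List, Optional, Union

from pydantic import BaseModel, Field


class FileSourceSketch(BaseModel):
    """Illustrative model showing the file_path -> file_paths migration."""

    file_path: Optional[Union[Path, str, List[Union[Path, str]]]] = Field(
        default=None, description="[Deprecated] Use file_paths instead."
    )
    file_paths: Union[Path, str, List[Union[Path, str]]] = Field(default_factory=list)

    def model_post_init(self, _context) -> None:
        # Keep old callers working: copy the legacy field across and warn.
        if self.file_path is not None:
            warnings.warn(
                "'file_path' is deprecated; use 'file_paths' instead.",
                DeprecationWarning,
            )
            self.file_paths = self.file_path
        # Normalize a single path or a list of paths to List[Path].
        raw = self.file_paths if isinstance(self.file_paths, list) else [self.file_paths]
        self.file_paths = [Path(p) for p in raw]


src = FileSourceSketch(file_path="report.pdf")  # old style still accepted
print(src.file_paths)                           # [Path('report.pdf')]
```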
@@ -16,12 +16,12 @@ class BaseKnowledgeSource(BaseModel, ABC):
     chunk_embeddings: List[np.ndarray] = Field(default_factory=list)

     model_config = ConfigDict(arbitrary_types_allowed=True)
-    storage: Optional[KnowledgeStorage] = Field(default=None)
+    storage: KnowledgeStorage = Field(default_factory=KnowledgeStorage)
     metadata: Dict[str, Any] = Field(default_factory=dict)  # Currently unused
     collection_name: Optional[str] = Field(default=None)

     @abstractmethod
-    def validate_content(self) -> Any:
+    def load_content(self) -> Dict[Any, str]:
         """Load and preprocess content from the source."""
         pass

@@ -46,7 +46,4 @@ class BaseKnowledgeSource(BaseModel, ABC):
         Save the documents to the storage.
         This method should be called after the chunks and embeddings are generated.
         """
-        if self.storage:
-            self.storage.save(self.chunks)
-        else:
-            raise ValueError("No storage found to save documents.")
+        self.storage.save(self.chunks)
@@ -1,120 +0,0 @@
-from pathlib import Path
-from typing import Iterator, List, Optional, Union
-from urllib.parse import urlparse
-
-from docling.datamodel.base_models import InputFormat
-from docling.document_converter import DocumentConverter
-from docling.exceptions import ConversionError
-from docling_core.transforms.chunker.hierarchical_chunker import HierarchicalChunker
-from docling_core.types.doc.document import DoclingDocument
-from pydantic import Field
-
-from crewai.knowledge.source.base_knowledge_source import BaseKnowledgeSource
-from crewai.utilities.constants import KNOWLEDGE_DIRECTORY
-from crewai.utilities.logger import Logger
-
-
-class CrewDoclingSource(BaseKnowledgeSource):
-    """Default Source class for converting documents to markdown or json
-    This will auto support PDF, DOCX, and TXT, XLSX, Images, and HTML files without any additional dependencies and follows the docling package as the source of truth.
-    """
-
-    _logger: Logger = Logger(verbose=True)
-
-    file_path: Optional[List[Union[Path, str]]] = Field(default=None)
-    file_paths: List[Union[Path, str]] = Field(default_factory=list)
-    chunks: List[str] = Field(default_factory=list)
-    safe_file_paths: List[Union[Path, str]] = Field(default_factory=list)
-    content: List[DoclingDocument] = Field(default_factory=list)
-    document_converter: DocumentConverter = Field(
-        default_factory=lambda: DocumentConverter(
-            allowed_formats=[
-                InputFormat.MD,
-                InputFormat.ASCIIDOC,
-                InputFormat.PDF,
-                InputFormat.DOCX,
-                InputFormat.HTML,
-                InputFormat.IMAGE,
-                InputFormat.XLSX,
-                InputFormat.PPTX,
-            ]
-        )
-    )
-
-    def model_post_init(self, _) -> None:
-        if self.file_path:
-            self._logger.log(
-                "warning",
-                "The 'file_path' attribute is deprecated and will be removed in a future version. Please use 'file_paths' instead.",
-                color="yellow",
-            )
-            self.file_paths = self.file_path
-        self.safe_file_paths = self.validate_content()
-        self.content = self._load_content()
-
-    def _load_content(self) -> List[DoclingDocument]:
-        try:
-            return self._convert_source_to_docling_documents()
-        except ConversionError as e:
-            self._logger.log(
-                "error",
-                f"Error loading content: {e}. Supported formats: {self.document_converter.allowed_formats}",
-                "red",
-            )
-            raise e
-        except Exception as e:
-            self._logger.log("error", f"Error loading content: {e}")
-            raise e
-
-    def add(self) -> None:
-        if self.content is None:
-            return
-        for doc in self.content:
-            new_chunks_iterable = self._chunk_doc(doc)
-            self.chunks.extend(list(new_chunks_iterable))
-        self._save_documents()
-
-    def _convert_source_to_docling_documents(self) -> List[DoclingDocument]:
-        conv_results_iter = self.document_converter.convert_all(self.safe_file_paths)
-        return [result.document for result in conv_results_iter]
-
-    def _chunk_doc(self, doc: DoclingDocument) -> Iterator[str]:
-        chunker = HierarchicalChunker()
-        for chunk in chunker.chunk(doc):
-            yield chunk.text
-
-    def validate_content(self) -> List[Union[Path, str]]:
-        processed_paths: List[Union[Path, str]] = []
-        for path in self.file_paths:
-            if isinstance(path, str):
-                if path.startswith(("http://", "https://")):
-                    try:
-                        if self._validate_url(path):
-                            processed_paths.append(path)
-                        else:
-                            raise ValueError(f"Invalid URL format: {path}")
-                    except Exception as e:
-                        raise ValueError(f"Invalid URL: {path}. Error: {str(e)}")
-                else:
-                    local_path = Path(KNOWLEDGE_DIRECTORY + "/" + path)
-                    if local_path.exists():
-                        processed_paths.append(local_path)
-                    else:
-                        raise FileNotFoundError(f"File not found: {local_path}")
-            else:
-                # this is an instance of Path
-                processed_paths.append(path)
-        return processed_paths
-
-    def _validate_url(self, url: str) -> bool:
-        try:
-            result = urlparse(url)
-            return all(
-                [
-                    result.scheme in ("http", "https"),
-                    result.netloc,
-                    len(result.netloc.split(".")) >= 2,  # Ensure domain has TLD
-                ]
-            )
-        except Exception:
-            return False
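For reference, the `CrewDoclingSource` deleted above accepted either local paths (resolved against the knowledge directory) or URLs. A hedged usage sketch based only on the deleted code; the import path is assumed from the sibling imports in that file, and the file name and URL are hypothetical:

```python
# Only valid on the branch that still ships this module and the docling extras.
from crewai.knowledge.source.crew_docling_source import CrewDoclingSource

source = CrewDoclingSource(
    file_paths=[
        "report.pdf",                      # resolved to <knowledge dir>/report.pdf
        "https://example.com/whitepaper",  # URL scheme and domain are validated first
    ]
)
# Invalid URLs or missing local files raise immediately during validation;
# chunking via add() happens later, once a storage backend is attached.
```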
@@ -13,9 +13,9 @@ class StringKnowledgeSource(BaseKnowledgeSource):

     def model_post_init(self, _):
         """Post-initialization method to validate content."""
-        self.validate_content()
+        self.load_content()

-    def validate_content(self):
+    def load_content(self):
         """Validate string content."""
         if not isinstance(self.content, str):
             raise ValueError("StringKnowledgeSource only accepts string content")
@@ -1,5 +1,5 @@
 from abc import ABC, abstractmethod
-from typing import Any, Dict, List, Optional
+from typing import Dict, Any, List, Optional


 class BaseKnowledgeStorage(ABC):
@@ -14,9 +14,9 @@ from chromadb.config import Settings

 from crewai.knowledge.storage.base_knowledge_storage import BaseKnowledgeStorage
 from crewai.utilities import EmbeddingConfigurator
-from crewai.utilities.constants import KNOWLEDGE_DIRECTORY
 from crewai.utilities.logger import Logger
 from crewai.utilities.paths import db_storage_path
+from crewai.utilities.constants import KNOWLEDGE_DIRECTORY


 @contextlib.contextmanager
@@ -124,44 +124,23 @@ class KnowledgeStorage(BaseKnowledgeStorage):
         documents: List[str],
         metadata: Optional[Union[Dict[str, Any], List[Dict[str, Any]]]] = None,
     ):
-        if not self.collection:
-            raise Exception("Collection not initialized")
-
-        try:
-            # Create a dictionary to store unique documents
-            unique_docs = {}
-
-            # Generate IDs and create a mapping of id -> (document, metadata)
-            for idx, doc in enumerate(documents):
-                doc_id = hashlib.sha256(doc.encode("utf-8")).hexdigest()
-                doc_metadata = None
-                if metadata is not None:
-                    if isinstance(metadata, list):
-                        doc_metadata = metadata[idx]
-                    else:
-                        doc_metadata = metadata
-                unique_docs[doc_id] = (doc, doc_metadata)
-
-            # Prepare filtered lists for ChromaDB
-            filtered_docs = []
-            filtered_metadata = []
-            filtered_ids = []
-
-            # Build the filtered lists
-            for doc_id, (doc, meta) in unique_docs.items():
-                filtered_docs.append(doc)
-                filtered_metadata.append(meta)
-                filtered_ids.append(doc_id)
-
-            # If we have no metadata at all, set it to None
-            final_metadata: Optional[OneOrMany[chromadb.Metadata]] = (
-                None if all(m is None for m in filtered_metadata) else filtered_metadata
-            )
-
-            self.collection.upsert(
-                documents=filtered_docs,
-                metadatas=final_metadata,
-                ids=filtered_ids,
-            )
-        except chromadb.errors.InvalidDimensionException as e:
-            Logger(verbose=True).log(
+        if self.collection:
+            try:
+                if metadata is None:
+                    metadatas: Optional[OneOrMany[chromadb.Metadata]] = None
+                elif isinstance(metadata, list):
+                    metadatas = [cast(chromadb.Metadata, m) for m in metadata]
+                else:
+                    metadatas = cast(chromadb.Metadata, metadata)
+
+                ids = [
+                    hashlib.sha256(doc.encode("utf-8")).hexdigest() for doc in documents
+                ]
+
+                self.collection.upsert(
+                    documents=documents,
+                    metadatas=metadatas,
+                    ids=ids,
+                )
+            except chromadb.errors.InvalidDimensionException as e:
+                Logger(verbose=True).log(
@@ -175,8 +154,12 @@ class KnowledgeStorage(BaseKnowledgeStorage):
                 "Try resetting the collection using `crewai reset-memories -a`"
             ) from e
-        except Exception as e:
-            Logger(verbose=True).log("error", f"Failed to upsert documents: {e}", "red")
-            raise
+            except Exception as e:
+                Logger(verbose=True).log(
+                    "error", f"Failed to upsert documents: {e}", "red"
+                )
+                raise
+        else:
+            raise Exception("Collection not initialized")

     def _create_default_embedding_function(self):
         from chromadb.utils.embedding_functions.openai_embedding_function import (
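Both sides of the hunk above derive ChromaDB document IDs from a SHA-256 of the document text; the longer variant additionally collapses duplicate documents before calling `upsert`. The core idea, reduced to a hedged standalone sketch that needs no ChromaDB at all:

```python
import hashlib
from typing import Dict, List, Optional, Tuple


def dedupe_for_upsert(
    documents: List[str],
    metadata: Optional[List[dict]] = None,
) -> Tuple[List[str], List[Optional[dict]], List[str]]:
    """Collapse duplicate documents, keyed by a content hash, before upserting."""
    unique: Dict[str, Tuple[str, Optional[dict]]] = {}
    for idx, doc in enumerate(documents):
        doc_id = hashlib.sha256(doc.encode("utf-8")).hexdigest()
        unique[doc_id] = (doc, metadata[idx] if metadata else None)

    ids = list(unique)
    docs = [unique[i][0] for i in ids]
    metas = [unique[i][1] for i in ids]
    return docs, metas, ids


docs, metas, ids = dedupe_for_upsert(["alpha", "beta", "alpha"])
assert len(ids) == 2  # the repeated "alpha" hashes to the same ID
```

Because the IDs are content hashes, re-saving the same chunks simply overwrites the existing entries instead of accumulating duplicates in the collection.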
@@ -6,10 +6,8 @@ import warnings
 from contextlib import contextmanager
 from typing import Any, Dict, List, Optional, Union

-with warnings.catch_warnings():
-    warnings.simplefilter("ignore", UserWarning)
-    import litellm
-    from litellm import get_supported_openai_params
+import litellm
+from litellm import get_supported_openai_params

 from crewai.utilities.exceptions.context_window_exceeding_exception import (
     LLMContextLengthExceededException,
@@ -45,11 +43,6 @@ LLM_CONTEXT_WINDOW_SIZES = {
     "gpt-4-turbo": 128000,
     "o1-preview": 128000,
     "o1-mini": 128000,
-    # gemini
-    "gemini-2.0-flash": 1048576,
-    "gemini-1.5-pro": 2097152,
-    "gemini-1.5-flash": 1048576,
-    "gemini-1.5-flash-8b": 1048576,
     # deepseek
     "deepseek-chat": 128000,
     # groq
@@ -66,13 +59,8 @@ LLM_CONTEXT_WINDOW_SIZES = {
     "llama3-70b-8192": 8192,
     "llama3-8b-8192": 8192,
     "mixtral-8x7b-32768": 32768,
-    "llama-3.3-70b-versatile": 128000,
-    "llama-3.3-70b-instruct": 128000,
 }

-DEFAULT_CONTEXT_WINDOW_SIZE = 8192
-CONTEXT_WINDOW_USAGE_RATIO = 0.75
-

 @contextmanager
 def suppress_warnings():
@@ -136,11 +124,10 @@ class LLM:
         self.api_version = api_version
         self.api_key = api_key
         self.callbacks = callbacks
-        self.context_window_size = 0
         self.kwargs = kwargs

         litellm.drop_params = True
+        litellm.set_verbose = False
         self.set_callbacks(callbacks)
         self.set_env_callbacks()

@@ -204,16 +191,7 @@ class LLM:

     def get_context_window_size(self) -> int:
         # Only using 75% of the context window size to avoid cutting the message in the middle
-        if self.context_window_size != 0:
-            return self.context_window_size
-
-        self.context_window_size = int(
-            DEFAULT_CONTEXT_WINDOW_SIZE * CONTEXT_WINDOW_USAGE_RATIO
-        )
-        for key, value in LLM_CONTEXT_WINDOW_SIZES.items():
-            if self.model.startswith(key):
-                self.context_window_size = int(value * CONTEXT_WINDOW_USAGE_RATIO)
-        return self.context_window_size
+        return int(LLM_CONTEXT_WINDOW_SIZES.get(self.model, 8192) * 0.75)

     def set_callbacks(self, callbacks: List[Any]):
         callback_types = [type(callback) for callback in callbacks]
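The removed lookup walks `LLM_CONTEXT_WINDOW_SIZES` and matches on model-name prefixes (so a dated snapshot inherits its family's window), caching the result on the instance; the replacement falls back to an exact-key `dict.get`. A hedged sketch of the prefix-matching behaviour in isolation, using a trimmed-down table:

```python
LLM_CONTEXT_WINDOW_SIZES = {
    "gpt-4-turbo": 128000,
    "deepseek-chat": 128000,
    "llama3-70b-8192": 8192,
}
DEFAULT_CONTEXT_WINDOW_SIZE = 8192
CONTEXT_WINDOW_USAGE_RATIO = 0.75  # keep 25% headroom so replies are not cut off


def usable_context_window(model: str) -> int:
    size = DEFAULT_CONTEXT_WINDOW_SIZE
    for key, value in LLM_CONTEXT_WINDOW_SIZES.items():
        if model.startswith(key):
            size = value
    return int(size * CONTEXT_WINDOW_USAGE_RATIO)


print(usable_context_window("gpt-4-turbo-2024-04-09"))  # 96000, found via prefix match
print(usable_context_window("some-unknown-model"))      # 6144, the default * 0.75
```

An exact-match `dict.get(self.model, 8192)` would return the 8192 default for the dated snapshot above, which is the practical difference between the two sides of the hunk.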
@@ -1,4 +1,4 @@
-from typing import Any, Dict, Optional
+from typing import Optional, Dict, Any

 from crewai.memory import EntityMemory, LongTermMemory, ShortTermMemory, UserMemory

@@ -1,4 +1,4 @@
-from typing import Any, Dict, List, Optional
+from typing import Any, Dict, Optional, List

 from crewai.memory.storage.rag_storage import RAGStorage

@@ -1,5 +1,4 @@
 from typing import Any, Dict, Optional
-
 from crewai.memory.memory import Memory
 from crewai.memory.short_term.short_term_memory_item import ShortTermMemoryItem
 from crewai.memory.storage.rag_storage import RAGStorage
@@ -33,10 +32,7 @@ class ShortTermMemory(Memory):
             storage
             if storage
             else RAGStorage(
-                type="short_term",
-                embedder_config=embedder_config,
-                crew=crew,
-                path=path,
+                type="short_term", embedder_config=embedder_config, crew=crew, path=path
             )
         )
         super().__init__(storage)
@@ -2,7 +2,6 @@ import os
 from typing import Any, Dict, List
-
 from mem0 import MemoryClient

 from crewai.memory.storage.interface import Storage

@@ -1,6 +1,5 @@
 from functools import wraps
-

 def memoize(func):
     cache = {}

@@ -1,25 +1,12 @@
 import datetime
-import inspect
 import json
-import logging
+from pathlib import Path
 import threading
 import uuid
 from concurrent.futures import Future
 from copy import copy
 from hashlib import md5
-from pathlib import Path
-from typing import (
-    Any,
-    Callable,
-    ClassVar,
-    Dict,
-    List,
-    Optional,
-    Set,
-    Tuple,
-    Type,
-    Union,
-)
+from typing import Any, Dict, List, Optional, Set, Tuple, Type, Union

 from opentelemetry.trace import Span
 from pydantic import (
@@ -33,7 +20,6 @@ from pydantic import (
 from pydantic_core import PydanticCustomError

 from crewai.agents.agent_builder.base_agent import BaseAgent
-from crewai.tasks.guardrail_result import GuardrailResult
 from crewai.tasks.output_format import OutputFormat
 from crewai.tasks.task_output import TaskOutput
 from crewai.telemetry.telemetry import Telemetry
@@ -63,7 +49,6 @@ class Task(BaseModel):
     """

     __hash__ = object.__hash__  # type: ignore
-    logger: ClassVar[logging.Logger] = logging.getLogger(__name__)
     used_tools: int = 0
     tools_errors: int = 0
     delegations: int = 0
@@ -125,61 +110,11 @@ class Task(BaseModel):
         default=None,
     )
     processed_by_agents: Set[str] = Field(default_factory=set)
-    guardrail: Optional[Callable[[TaskOutput], Tuple[bool, Any]]] = Field(
-        default=None,
-        description="Function to validate task output before proceeding to next task"
-    )
-    max_retries: int = Field(
-        default=3,
-        description="Maximum number of retries when guardrail fails"
-    )
-    retry_count: int = Field(
-        default=0,
-        description="Current number of retries"
-    )
-
-    @field_validator("guardrail")
-    @classmethod
-    def validate_guardrail_function(cls, v: Optional[Callable]) -> Optional[Callable]:
-        """Validate that the guardrail function has the correct signature and behavior.
-
-        While type hints provide static checking, this validator ensures runtime safety by:
-        1. Verifying the function accepts exactly one parameter (the TaskOutput)
-        2. Checking return type annotations match Tuple[bool, Any] if present
-        3. Providing clear, immediate error messages for debugging
-
-        This runtime validation is crucial because:
-        - Type hints are optional and can be ignored at runtime
-        - Function signatures need immediate validation before task execution
-        - Clear error messages help users debug guardrail implementation issues
-
-        Args:
-            v: The guardrail function to validate
-
-        Returns:
-            The validated guardrail function
-
-        Raises:
-            ValueError: If the function signature is invalid or return annotation
-                doesn't match Tuple[bool, Any]
-        """
-        if v is not None:
-            sig = inspect.signature(v)
-            if len(sig.parameters) != 1:
-                raise ValueError("Guardrail function must accept exactly one parameter")
-
-            # Check return annotation if present, but don't require it
-            return_annotation = sig.return_annotation
-            if return_annotation != inspect.Signature.empty:
-                if not (return_annotation == Tuple[bool, Any] or str(return_annotation) == 'Tuple[bool, Any]'):
-                    raise ValueError("If return type is annotated, it must be Tuple[bool, Any]")
-        return v

     _telemetry: Telemetry = PrivateAttr(default_factory=Telemetry)
     _execution_span: Optional[Span] = PrivateAttr(default=None)
     _original_description: Optional[str] = PrivateAttr(default=None)
     _original_expected_output: Optional[str] = PrivateAttr(default=None)
-    _original_output_file: Optional[str] = PrivateAttr(default=None)
     _thread: Optional[threading.Thread] = PrivateAttr(default=None)
     _execution_time: Optional[float] = PrivateAttr(default=None)

@@ -214,46 +149,8 @@ class Task(BaseModel):

     @field_validator("output_file")
     @classmethod
-    def output_file_validation(cls, value: Optional[str]) -> Optional[str]:
-        """Validate the output file path.
-
-        Args:
-            value: The output file path to validate. Can be None or a string.
-                If the path contains template variables (e.g. {var}), leading slashes are preserved.
-                For regular paths, leading slashes are stripped.
-
-        Returns:
-            The validated and potentially modified path, or None if no path was provided.
-
-        Raises:
-            ValueError: If the path contains invalid characters, path traversal attempts,
-                or other security concerns.
-        """
-        if value is None:
-            return None
-
-        # Basic security checks
-        if ".." in value:
-            raise ValueError("Path traversal attempts are not allowed in output_file paths")
-
-        # Check for shell expansion first
-        if value.startswith('~') or value.startswith('$'):
-            raise ValueError("Shell expansion characters are not allowed in output_file paths")
-
-        # Then check other shell special characters
-        if any(char in value for char in ['|', '>', '<', '&', ';']):
-            raise ValueError("Shell special characters are not allowed in output_file paths")
-
-        # Don't strip leading slash if it's a template path with variables
-        if "{" in value or "}" in value:
-            # Validate template variable format
-            template_vars = [part.split("}")[0] for part in value.split("{")[1:]]
-            for var in template_vars:
-                if not var.isidentifier():
-                    raise ValueError(f"Invalid template variable name: {var}")
-            return value
-
-        # Strip leading slash for regular paths
+    def output_file_validation(cls, value: str) -> str:
+        """Validate the output file path by removing the / from the beginning of the path."""
         if value.startswith("/"):
             return value[1:]
         return value
@@ -357,6 +254,7 @@ class Task(BaseModel):
         )

         pydantic_output, json_output = self._export_output(result)
+
         task_output = TaskOutput(
             name=self.name,
             description=self.description,
@@ -367,37 +265,6 @@ class Task(BaseModel):
             agent=agent.role,
             output_format=self._get_output_format(),
         )
-
-        if self.guardrail:
-            guardrail_result = GuardrailResult.from_tuple(self.guardrail(task_output))
-            if not guardrail_result.success:
-                if self.retry_count >= self.max_retries:
-                    raise Exception(
-                        f"Task failed guardrail validation after {self.max_retries} retries. "
-                        f"Last error: {guardrail_result.error}"
-                    )
-
-                self.retry_count += 1
-                context = (
-                    f"### Previous attempt failed validation: {guardrail_result.error}\n\n\n"
-                    f"### Previous result:\n{task_output.raw}\n\n\n"
-                    "Try again, making sure to address the validation error."
-                )
-                return self._execute_core(agent, context, tools)
-
-            if guardrail_result.result is None:
-                raise Exception(
-                    "Task guardrail returned None as result. This is not allowed."
-                )
-
-            if isinstance(guardrail_result.result, str):
-                task_output.raw = guardrail_result.result
-                pydantic_output, json_output = self._export_output(guardrail_result.result)
-                task_output.pydantic = pydantic_output
-                task_output.json_dict = json_output
-            elif isinstance(guardrail_result.result, TaskOutput):
-                task_output = guardrail_result.result

         self.output = task_output

         self._set_end_execution_time(start_time)
@@ -432,89 +299,16 @@ class Task(BaseModel):
         tasks_slices = [self.description, output]
         return "\n".join(tasks_slices)

-    def interpolate_inputs(self, inputs: Dict[str, Union[str, int, float]]) -> None:
-        """Interpolate inputs into the task description, expected output, and output file path.
-
-        Args:
-            inputs: Dictionary mapping template variables to their values.
-                Supported value types are strings, integers, and floats.
-
-        Raises:
-            ValueError: If a required template variable is missing from inputs.
-        """
+    def interpolate_inputs(self, inputs: Dict[str, Any]) -> None:
+        """Interpolate inputs into the task description and expected output."""
         if self._original_description is None:
             self._original_description = self.description
         if self._original_expected_output is None:
             self._original_expected_output = self.expected_output
-        if self.output_file is not None and self._original_output_file is None:
-            self._original_output_file = self.output_file

-        if not inputs:
-            return
-
-        try:
-            self.description = self._original_description.format(**inputs)
-        except KeyError as e:
-            raise ValueError(f"Missing required template variable '{e.args[0]}' in description") from e
-        except ValueError as e:
-            raise ValueError(f"Error interpolating description: {str(e)}") from e
-
-        try:
-            self.expected_output = self.interpolate_only(
-                input_string=self._original_expected_output, inputs=inputs
-            )
-        except (KeyError, ValueError) as e:
-            raise ValueError(f"Error interpolating expected_output: {str(e)}") from e
-
-        if self.output_file is not None:
-            try:
-                self.output_file = self.interpolate_only(
-                    input_string=self._original_output_file, inputs=inputs
-                )
-            except (KeyError, ValueError) as e:
-                raise ValueError(f"Error interpolating output_file path: {str(e)}") from e
-
-    def interpolate_only(self, input_string: Optional[str], inputs: Dict[str, Union[str, int, float]]) -> str:
-        """Interpolate placeholders (e.g., {key}) in a string while leaving JSON untouched.
-
-        Args:
-            input_string: The string containing template variables to interpolate.
-                Can be None or empty, in which case an empty string is returned.
-            inputs: Dictionary mapping template variables to their values.
-                Supported value types are strings, integers, and floats.
-                If input_string is empty or has no placeholders, inputs can be empty.
-
-        Returns:
-            The interpolated string with all template variables replaced with their values.
-            Empty string if input_string is None or empty.
-
-        Raises:
-            ValueError: If a required template variable is missing from inputs.
-            KeyError: If a template variable is not found in the inputs dictionary.
-        """
-        if input_string is None or not input_string:
-            return ""
-        if "{" not in input_string and "}" not in input_string:
-            return input_string
-        if not inputs:
-            raise ValueError("Inputs dictionary cannot be empty when interpolating variables")
-
-        try:
-            # Validate input types
-            for key, value in inputs.items():
-                if not isinstance(value, (str, int, float)):
-                    raise ValueError(f"Value for key '{key}' must be a string, integer, or float, got {type(value).__name__}")
-
-            escaped_string = input_string.replace("{", "{{").replace("}", "}}")
-
-            for key in inputs.keys():
-                escaped_string = escaped_string.replace(f"{{{{{key}}}}}", f"{{{key}}}")
-
-            return escaped_string.format(**inputs)
-        except KeyError as e:
-            raise KeyError(f"Template variable '{e.args[0]}' not found in inputs dictionary") from e
-        except ValueError as e:
-            raise ValueError(f"Error during string interpolation: {str(e)}") from e
+        if inputs:
+            self.description = self._original_description.format(**inputs)
+            self.expected_output = self._original_expected_output.format(**inputs)

     def increment_tools_errors(self) -> None:
         """Increment the tools errors counter."""
@@ -596,19 +390,9 @@ class Task(BaseModel):
         return OutputFormat.RAW

     def _save_file(self, result: Any) -> None:
-        """Save task output to a file.
-
-        Args:
-            result: The result to save to the file. Can be a dict or any stringifiable object.
-
-        Raises:
-            ValueError: If output_file is not set
-            RuntimeError: If there is an error writing to the file
-        """
         if self.output_file is None:
             raise ValueError("output_file is not set.")

-        try:
         resolved_path = Path(self.output_file).expanduser().resolve()
         directory = resolved_path.parent

@@ -618,11 +402,10 @@ class Task(BaseModel):
         with resolved_path.open("w", encoding="utf-8") as file:
             if isinstance(result, dict):
                 import json

                 json.dump(result, file, ensure_ascii=False, indent=2)
             else:
                 file.write(str(result))
-        except (OSError, IOError) as e:
-            raise RuntimeError(f"Failed to save output file: {e}")
         return None

     def __repr__(self):
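The removed `interpolate_only` escapes every brace before re-enabling only the known placeholders, so literal JSON in a description or expected output survives `str.format`. A hedged, standalone sketch of that escaping trick:

```python
from typing import Dict, Union


def interpolate_only(input_string: str, inputs: Dict[str, Union[str, int, float]]) -> str:
    """Replace {key} placeholders while leaving other braces (e.g. JSON) untouched."""
    # First escape every brace, then un-escape only the placeholders we know about.
    escaped = input_string.replace("{", "{{").replace("}", "}}")
    for key in inputs:
        escaped = escaped.replace(f"{{{{{key}}}}}", f"{{{key}}}")
    return escaped.format(**inputs)


template = 'Summarize {topic} and return JSON like {"score": 1}'
print(interpolate_only(template, {"topic": "Q3 sales"}))
# Summarize Q3 sales and return JSON like {"score": 1}
```

A plain `template.format(**inputs)`, as used on the shorter side of the hunk, would instead raise a `KeyError` on the `{"score": 1}` fragment.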
@@ -1,56 +0,0 @@
- """
- Module for handling task guardrail validation results.
-
- This module provides the GuardrailResult class which standardizes
- the way task guardrails return their validation results.
- """
-
- from typing import Any, Optional, Tuple, Union
-
- from pydantic import BaseModel, field_validator
-
-
- class GuardrailResult(BaseModel):
- """Result from a task guardrail execution.
-
- This class standardizes the return format of task guardrails,
- converting tuple responses into a structured format that can
- be easily handled by the task execution system.
-
- Attributes:
- success (bool): Whether the guardrail validation passed
- result (Any, optional): The validated/transformed result if successful
- error (str, optional): Error message if validation failed
- """
- success: bool
- result: Optional[Any] = None
- error: Optional[str] = None
-
- @field_validator("result", "error")
- @classmethod
- def validate_result_error_exclusivity(cls, v: Any, info) -> Any:
- values = info.data
- if "success" in values:
- if values["success"] and v and "error" in values and values["error"]:
- raise ValueError("Cannot have both result and error when success is True")
- if not values["success"] and v and "result" in values and values["result"]:
- raise ValueError("Cannot have both result and error when success is False")
- return v
-
- @classmethod
- def from_tuple(cls, result: Tuple[bool, Union[Any, str]]) -> "GuardrailResult":
- """Create a GuardrailResult from a validation tuple.
-
- Args:
- result: A tuple of (success, data) where data is either
- the validated result or error message.
-
- Returns:
- GuardrailResult: A new instance with the tuple data.
- """
- success, data = result
- return cls(
- success=success,
- result=data if success else None,
- error=data if not success else None
- )
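For reference, the `from_tuple` helper in the removed module converts the `(success, data)` tuples returned by task guardrails into a structured object. A hedged usage sketch, assuming the class exactly as shown above (the example payloads are hypothetical):

```python
ok = GuardrailResult.from_tuple((True, {"word_count": 150}))
assert ok.success and ok.result == {"word_count": 150} and ok.error is None

failed = GuardrailResult.from_tuple((False, "Output exceeded the word limit"))
assert not failed.success and failed.result is None and failed.error is not None
```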
@@ -6,7 +6,6 @@ import os
  import platform
  import warnings
  from contextlib import contextmanager
- from importlib.metadata import version
  from typing import TYPE_CHECKING, Any, Optional

@@ -17,10 +16,12 @@ def suppress_warnings():
  yield

+ with suppress_warnings():
+ import pkg_resources

  from opentelemetry import trace  # noqa: E402
- from opentelemetry.exporter.otlp.proto.http.trace_exporter import (
+ from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter  # noqa: E402
- OTLPSpanExporter,  # noqa: E402
- )
  from opentelemetry.sdk.resources import SERVICE_NAME, Resource  # noqa: E402
  from opentelemetry.sdk.trace import TracerProvider  # noqa: E402
  from opentelemetry.sdk.trace.export import BatchSpanProcessor  # noqa: E402
@@ -103,7 +104,7 @@ class Telemetry:
  self._add_attribute(
  span,
  "crewai_version",
- version("crewai"),
+ pkg_resources.get_distribution("crewai").version,
  )
  self._add_attribute(span, "python_version", platform.python_version())
  self._add_attribute(span, "crew_key", crew.key)
@@ -305,7 +306,7 @@ class Telemetry:
  self._add_attribute(
  span,
  "crewai_version",
- version("crewai"),
+ pkg_resources.get_distribution("crewai").version,
  )
  self._add_attribute(span, "tool_name", tool_name)
  self._add_attribute(span, "attempts", attempts)
@@ -325,7 +326,7 @@ class Telemetry:
  self._add_attribute(
  span,
  "crewai_version",
- version("crewai"),
+ pkg_resources.get_distribution("crewai").version,
  )
  self._add_attribute(span, "tool_name", tool_name)
  self._add_attribute(span, "attempts", attempts)
@@ -345,7 +346,7 @@ class Telemetry:
  self._add_attribute(
  span,
  "crewai_version",
- version("crewai"),
+ pkg_resources.get_distribution("crewai").version,
  )
  if llm:
  self._add_attribute(span, "llm", llm.model)
@@ -364,7 +365,7 @@ class Telemetry:
  self._add_attribute(
  span,
  "crewai_version",
- version("crewai"),
+ pkg_resources.get_distribution("crewai").version,
  )
  self._add_attribute(span, "crew_key", crew.key)
  self._add_attribute(span, "crew_id", str(crew.id))
@@ -390,7 +391,7 @@ class Telemetry:
  self._add_attribute(
  span,
  "crewai_version",
- version("crewai"),
+ pkg_resources.get_distribution("crewai").version,
  )
  self._add_attribute(span, "crew_key", crew.key)
  self._add_attribute(span, "crew_id", str(crew.id))
@@ -471,7 +472,7 @@ class Telemetry:
  self._add_attribute(
  span,
  "crewai_version",
- version("crewai"),
+ pkg_resources.get_distribution("crewai").version,
  )
  self._add_attribute(span, "crew_key", crew.key)
  self._add_attribute(span, "crew_id", str(crew.id))
@@ -540,7 +541,7 @@ class Telemetry:
  self._add_attribute(
  crew._execution_span,
  "crewai_version",
- version("crewai"),
+ pkg_resources.get_distribution("crewai").version,
  )
  self._add_attribute(
  crew._execution_span, "crew_output", final_string_output
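Both sides of the telemetry change report the same installed crewAI version string; only the lookup mechanism differs (`importlib.metadata` versus the older `pkg_resources` API). A small comparison sketch, assuming crewai and setuptools are installed in the environment:

```python
from importlib.metadata import version

import pkg_resources  # deprecated by setuptools; shown only for comparison

# Both calls resolve the version of the installed crewai distribution.
assert version("crewai") == pkg_resources.get_distribution("crewai").version
```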
@@ -1,45 +0,0 @@
- from typing import Dict, Optional, Union
-
- from pydantic import BaseModel, Field
-
- from crewai.tools.base_tool import BaseTool
- from crewai.utilities import I18N
-
- i18n = I18N()
-
- class AddImageToolSchema(BaseModel):
- image_url: str = Field(..., description="The URL or path of the image to add")
- action: Optional[str] = Field(
- default=None,
- description="Optional context or question about the image"
- )
-
-
- class AddImageTool(BaseTool):
- """Tool for adding images to the content"""
-
- name: str = Field(default_factory=lambda: i18n.tools("add_image")["name"])  # type: ignore
- description: str = Field(default_factory=lambda: i18n.tools("add_image")["description"])  # type: ignore
- args_schema: type[BaseModel] = AddImageToolSchema
-
- def _run(
- self,
- image_url: str,
- action: Optional[str] = None,
- **kwargs,
- ) -> dict:
- action = action or i18n.tools("add_image")["default_action"]  # type: ignore
- content = [
- {"type": "text", "text": action},
- {
- "type": "image_url",
- "image_url": {
- "url": image_url,
- },
- }
- ]
-
- return {
- "role": "user",
- "content": content
- }
@@ -1,9 +1,9 @@
- from crewai.agents.agent_builder.base_agent import BaseAgent
  from crewai.tools.base_tool import BaseTool
+ from crewai.agents.agent_builder.base_agent import BaseAgent
  from crewai.utilities import I18N

- from .ask_question_tool import AskQuestionTool
  from .delegate_work_tool import DelegateWorkTool
+ from .ask_question_tool import AskQuestionTool


  class AgentTools:
@@ -20,13 +20,13 @@ class AgentTools:
  delegate_tool = DelegateWorkTool(
  agents=self.agents,
  i18n=self.i18n,
- description=self.i18n.tools("delegate_work").format(coworkers=coworkers),  # type: ignore
+ description=self.i18n.tools("delegate_work").format(coworkers=coworkers),
  )

  ask_tool = AskQuestionTool(
  agents=self.agents,
  i18n=self.i18n,
- description=self.i18n.tools("ask_question").format(coworkers=coworkers),  # type: ignore
+ description=self.i18n.tools("ask_question").format(coworkers=coworkers),
  )

  return [delegate_tool, ask_tool]
@@ -1,8 +1,6 @@
- from typing import Optional

- from pydantic import BaseModel, Field

  from crewai.tools.agent_tools.base_agent_tools import BaseAgentTool
+ from typing import Optional
+ from pydantic import BaseModel, Field


  class AskQuestionToolSchema(BaseModel):
@@ -1,15 +1,11 @@
- import logging
  from typing import Optional, Union

  from pydantic import Field

+ from crewai.tools.base_tool import BaseTool
  from crewai.agents.agent_builder.base_agent import BaseAgent
  from crewai.task import Task
- from crewai.tools.base_tool import BaseTool
  from crewai.utilities import I18N

- logger = logging.getLogger(__name__)


  class BaseAgentTool(BaseTool):
  """Base class for agent-related tools"""
@@ -19,25 +15,6 @@ class BaseAgentTool(BaseTool):
  default_factory=I18N, description="Internationalization settings"
  )

- def sanitize_agent_name(self, name: str) -> str:
- """
- Sanitize agent role name by normalizing whitespace and setting to lowercase.
- Converts all whitespace (including newlines) to single spaces and removes quotes.
-
- Args:
- name (str): The agent role name to sanitize
-
- Returns:
- str: The sanitized agent role name, with whitespace normalized,
- converted to lowercase, and quotes removed
- """
- if not name:
- return ""
- # Normalize all whitespace (including newlines) to single spaces
- normalized = " ".join(name.split())
- # Remove quotes and convert to lowercase
- return normalized.replace('"', "").casefold()

  def _get_coworker(self, coworker: Optional[str], **kwargs) -> Optional[str]:
  coworker = coworker or kwargs.get("co_worker") or kwargs.get("coworker")
  if coworker:
@@ -47,27 +24,11 @@ class BaseAgentTool(BaseTool):
  return coworker

  def _execute(
- self,
+ self, agent_name: Union[str, None], task: str, context: Union[str, None]
- agent_name: Optional[str],
- task: str,
- context: Optional[str] = None
  ) -> str:
- """
- Execute delegation to an agent with case-insensitive and whitespace-tolerant matching.
-
- Args:
- agent_name: Name/role of the agent to delegate to (case-insensitive)
- task: The specific question or task to delegate
- context: Optional additional context for the task execution
-
- Returns:
- str: The execution result from the delegated agent or an error message
- if the agent cannot be found
- """
  try:
  if agent_name is None:
  agent_name = ""
- logger.debug("No agent name provided, using empty string")

  # It is important to remove the quotes from the agent name.
  # The reason we have to do this is because less-powerful LLM's
@@ -76,49 +37,31 @@ class BaseAgentTool(BaseTool):
  # {"task": "....", "coworker": "....
  # when it should look like this:
  # {"task": "....", "coworker": "...."}
- sanitized_name = self.sanitize_agent_name(agent_name)
+ agent_name = agent_name.casefold().replace('"', "").replace("\n", "")
- logger.debug(f"Sanitized agent name from '{agent_name}' to '{sanitized_name}'")

- available_agents = [agent.role for agent in self.agents]
- logger.debug(f"Available agents: {available_agents}")

  agent = [  # type: ignore # Incompatible types in assignment (expression has type "list[BaseAgent]", variable has type "str | None")
  available_agent
  for available_agent in self.agents
- if self.sanitize_agent_name(available_agent.role) == sanitized_name
+ if available_agent.role.casefold().replace("\n", "") == agent_name
  ]
- logger.debug(f"Found {len(agent)} matching agents for role '{sanitized_name}'")
+ except Exception as _:
- except (AttributeError, ValueError) as e:
+ return self.i18n.errors("agent_tool_unexsiting_coworker").format(
- # Handle specific exceptions that might occur during role name processing
- return self.i18n.errors("agent_tool_unexisting_coworker").format(
  coworkers="\n".join(
- [f"- {self.sanitize_agent_name(agent.role)}" for agent in self.agents]
+ [f"- {agent.role.casefold()}" for agent in self.agents]
- ),
+ )
- error=str(e)
  )

  if not agent:
- # No matching agent found after sanitization
+ return self.i18n.errors("agent_tool_unexsiting_coworker").format(
- return self.i18n.errors("agent_tool_unexisting_coworker").format(
  coworkers="\n".join(
- [f"- {self.sanitize_agent_name(agent.role)}" for agent in self.agents]
+ [f"- {agent.role.casefold()}" for agent in self.agents]
- ),
+ )
- error=f"No agent found with role '{sanitized_name}'"
  )

  agent = agent[0]
- try:
+ task_with_assigned_agent = Task(  # type: ignore # Incompatible types in assignment (expression has type "Task", variable has type "str")
- task_with_assigned_agent = Task(
  description=task,
  agent=agent,
  expected_output=agent.i18n.slice("manager_request"),
  i18n=agent.i18n,
  )
- logger.debug(f"Created task for agent '{self.sanitize_agent_name(agent.role)}': {task}")
  return agent.execute_task(task_with_assigned_agent, context)
- except Exception as e:
- # Handle task creation or execution errors
- return self.i18n.errors("agent_tool_execution_error").format(
- agent_role=self.sanitize_agent_name(agent.role),
- error=str(e)
- )
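Both variants of `_execute` normalize the coworker role before matching; the removed `sanitize_agent_name` additionally collapses internal whitespace. A small illustration of that normalization (the helper name and role string below are hypothetical):

```python
def sanitize(name: str) -> str:
    # Collapse all whitespace (including newlines), drop quotes, then casefold.
    return " ".join(name.split()).replace('"', "").casefold() if name else ""


assert sanitize('Senior  "Research"\nAnalyst') == "senior research analyst"
```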
@@ -1,9 +1,8 @@
+ from crewai.tools.agent_tools.base_agent_tools import BaseAgentTool
  from typing import Optional

  from pydantic import BaseModel, Field

- from crewai.tools.agent_tools.base_agent_tools import BaseAgentTool


  class DelegateWorkToolSchema(BaseModel):
  task: str = Field(..., description="The task to delegate")
@@ -1,5 +1,6 @@
  import ast
  import datetime
+ import os
  import time
  from difflib import SequenceMatcher
  from textwrap import dedent
@@ -10,16 +11,18 @@ from crewai.agents.tools_handler import ToolsHandler
  from crewai.task import Task
  from crewai.telemetry import Telemetry
  from crewai.tools import BaseTool
- from crewai.tools.structured_tool import CrewStructuredTool
  from crewai.tools.tool_calling import InstructorToolCalling, ToolCalling
  from crewai.tools.tool_usage_events import ToolUsageError, ToolUsageFinished
  from crewai.utilities import I18N, Converter, ConverterError, Printer

- try:
+ agentops = None
+ if os.environ.get("AGENTOPS_API_KEY"):
+ try:
  import agentops  # type: ignore
  except ImportError:
- agentops = None
+ pass
- OPENAI_BIGGER_MODELS = ["gpt-4", "gpt-4o", "o1-preview", "o1-mini", "o1", "o3", "o3-mini"]
+ OPENAI_BIGGER_MODELS = ["gpt-4", "gpt-4o", "o1-preview", "o1-mini"]


  class ToolUsageErrorException(Exception):
@@ -103,19 +106,6 @@ class ToolUsage:
  if self.agent.verbose:
  self._printer.print(content=f"\n\n{error}\n", color="red")
  return error

- if isinstance(tool, CrewStructuredTool) and tool.name == self._i18n.tools("add_image")["name"]:  # type: ignore
- try:
- result = self._use(tool_string=tool_string, tool=tool, calling=calling)
- return result

- except Exception as e:
- error = getattr(e, "message", str(e))
- self.task.increment_tools_errors()
- if self.agent.verbose:
- self._printer.print(content=f"\n\n{error}\n", color="red")
- return error

  return f"{self._use(tool_string=tool_string, tool=tool, calling=calling)}"  # type: ignore # BUG?: "_use" of "ToolUsage" does not return a value (it only ever returns None)

  def _use(
@@ -432,10 +422,9 @@ class ToolUsage:
  elif value.lower() in [
  "true",
  "false",
+ "null",
  ]:  # Check for boolean and null values
- value = value.lower().capitalize()
+ value = value.lower()
- elif value.lower() == "null":
- value = "None"
  else:
  # Assume the value is a string and needs quotes
  value = '"' + value.replace('"', '\\"') + '"'
@@ -1,7 +1,6 @@
- from datetime import datetime
  from typing import Any, Dict

  from pydantic import BaseModel
+ from datetime import datetime


  class ToolUsageEvent(BaseModel):
@@ -12,7 +12,7 @@
  "tools": "\nYou ONLY have access to the following tools, and should NEVER make up tools that are not listed here:\n\n{tools}\n\nUse the following format:\n\nThought: you should always think about what to do\nAction: the action to take, only one name of [{tool_names}], just the name, exactly as it's written.\nAction Input: the input to the action, just a simple python dictionary, enclosed in curly braces, using \" to wrap keys and values.\nObservation: the result of the action\n\nOnce all necessary information is gathered:\n\nThought: I now know the final answer\nFinal Answer: the final answer to the original input question\n",
  "no_tools": "\nTo give my best complete final answer to the task use the exact following format:\n\nThought: I now can give a great answer\nFinal Answer: Your final answer must be the great and the most complete as possible, it must be outcome described.\n\nI MUST use these formats, my job depends on it!",
  "format": "I MUST either use a tool (use one at time) OR give my best final answer not both at the same time. To Use the following format:\n\nThought: you should always think about what to do\nAction: the action to take, should be one of [{tool_names}]\nAction Input: the input to the action, dictionary enclosed in curly braces\nObservation: the result of the action\n... (this Thought/Action/Action Input/Result can repeat N times)\nThought: I now can give a great answer\nFinal Answer: Your final answer must be the great and the most complete as possible, it must be outcome described\n\n",
- "final_answer_format": "If you don't need to use any more tools, you must give your best complete final answer, make sure it satisfies the expected criteria, use the EXACT format below:\n\nThought: I now can give a great answer\nFinal Answer: my best complete final answer to the task.\n\n",
+ "final_answer_format": "If you don't need to use any more tools, you must give your best complete final answer, make sure it satisfy the expect criteria, use the EXACT format below:\n\nThought: I now can give a great answer\nFinal Answer: my best complete final answer to the task.\n\n",
  "format_without_tools": "\nSorry, I didn't use the right format. I MUST either use a tool (among the available ones), OR give my best final answer.\nI just remembered the expected format I must follow:\n\nQuestion: the input question you must answer\nThought: you should always think about what to do\nAction: the action to take, should be one of [{tool_names}]\nAction Input: the input to the action\nObservation: the result of the action\n... (this Thought/Action/Action Input/Result can repeat N times)\nThought: I now can give a great answer\nFinal Answer: Your final answer must be the great and the most complete as possible, it must be outcome described\n\n",
  "task_with_context": "{task}\n\nThis is the context you're working with:\n{context}",
  "expected_output": "\nThis is the expect criteria for your final answer: {expected_output}\nyou MUST return the actual complete content as the final answer, not a summary.",
@@ -28,21 +28,15 @@
  "errors": {
  "force_final_answer_error": "You can't keep going, this was the best you could do.\n {formatted_answer.text}",
  "force_final_answer": "Now it's time you MUST give your absolute best final answer. You'll ignore all previous instructions, stop using any tools, and just return your absolute BEST Final answer.",
- "agent_tool_unexisting_coworker": "\nError executing tool. coworker mentioned not found, it must be one of the following options:\n{coworkers}\n",
+ "agent_tool_unexsiting_coworker": "\nError executing tool. coworker mentioned not found, it must be one of the following options:\n{coworkers}\n",
  "task_repeated_usage": "I tried reusing the same input, I must stop using this action input. I'll try something else instead.\n\n",
  "tool_usage_error": "I encountered an error: {error}",
  "tool_arguments_error": "Error: the Action Input is not a valid key, value dictionary.",
  "wrong_tool_name": "You tried to use the tool {tool}, but it doesn't exist. You must use one of the following tools, use one at time: {tools}.",
- "tool_usage_exception": "I encountered an error while trying to use the tool. This was the error: {error}.\n Tool {tool} accepts these inputs: {tool_inputs}",
+ "tool_usage_exception": "I encountered an error while trying to use the tool. This was the error: {error}.\n Tool {tool} accepts these inputs: {tool_inputs}"
- "agent_tool_execution_error": "Error executing task with agent '{agent_role}'. Error: {error}"
  },
  "tools": {
  "delegate_work": "Delegate a specific task to one of the following coworkers: {coworkers}\nThe input to this tool should be the coworker, the task you want them to do, and ALL necessary context to execute the task, they know nothing about the task, so share absolute everything you know, don't reference things but instead explain them.",
- "ask_question": "Ask a specific question to one of the following coworkers: {coworkers}\nThe input to this tool should be the coworker, the question you have for them, and ALL necessary context to ask the question properly, they know nothing about the question, so share absolute everything you know, don't reference things but instead explain them.",
+ "ask_question": "Ask a specific question to one of the following coworkers: {coworkers}\nThe input to this tool should be the coworker, the question you have for them, and ALL necessary context to ask the question properly, they know nothing about the question, so share absolute everything you know, don't reference things but instead explain them."
- "add_image": {
- "name": "Add image to content",
- "description": "See image to understand it's content, you can optionally ask a question about the image",
- "default_action": "Please provide a detailed description of this image, including all visual elements, context, and any notable details you can observe."
- }
  }
  }
@@ -1,17 +1,15 @@
+ from datetime import datetime, date
  import json
- from datetime import date, datetime
- from decimal import Decimal
- from enum import Enum
  from uuid import UUID

  from pydantic import BaseModel
+ from decimal import Decimal


  class CrewJSONEncoder(json.JSONEncoder):
  def default(self, obj):
  if isinstance(obj, BaseModel):
  return self._handle_pydantic_model(obj)
- elif isinstance(obj, UUID) or isinstance(obj, Decimal) or isinstance(obj, Enum):
+ elif isinstance(obj, UUID) or isinstance(obj, Decimal):
  return str(obj)

  elif isinstance(obj, datetime) or isinstance(obj, date):
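The left side routes `Enum` values through the encoder's existing `str()` fallback alongside `UUID` and `Decimal`. A minimal sketch of why that fallback matters (the `_SketchEncoder` and `Color` names are illustrative, not part of the codebase):

```python
import json
from enum import Enum
from uuid import UUID, uuid4


class _SketchEncoder(json.JSONEncoder):
    # Mirrors the str() fallback shown in the diff for UUID/Decimal/Enum values.
    def default(self, obj):
        if isinstance(obj, (UUID, Enum)):
            return str(obj)
        return super().default(obj)


class Color(Enum):
    RED = "red"


# The stock JSONEncoder raises TypeError on both values; the fallback serializes them.
print(json.dumps({"id": uuid4(), "color": Color.RED}, cls=_SketchEncoder))
```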
@@ -1,10 +1,9 @@
  import json
+ import regex
  from typing import Any, Type

- import regex
- from pydantic import BaseModel, ValidationError

  from crewai.agents.parser import OutputParserException
+ from pydantic import BaseModel, ValidationError


  class CrewPydanticOutputParser:
@@ -1,7 +1,6 @@
  import os
  from typing import Any, Dict, cast
+ from chromadb import EmbeddingFunction, Documents, Embeddings
- from chromadb import Documents, EmbeddingFunction, Embeddings
  from chromadb.api.types import validate_embedding_function

@@ -1,14 +1,13 @@
  from collections import defaultdict

- from pydantic import BaseModel, Field
- from rich.box import HEAVY_EDGE
- from rich.console import Console
- from rich.table import Table

  from crewai.agent import Agent
  from crewai.task import Task
  from crewai.tasks.task_output import TaskOutput
  from crewai.telemetry import Telemetry
+ from pydantic import BaseModel, Field
+ from rich.box import HEAVY_EDGE
+ from rich.console import Console
+ from rich.table import Table


  class TaskEvaluationPydanticOutput(BaseModel):
@@ -1,3 +1,4 @@
+ import os
  from typing import List

  from pydantic import BaseModel, Field
@@ -5,17 +6,27 @@ from pydantic import BaseModel, Field
  from crewai.utilities import Converter
  from crewai.utilities.pydantic_schema_parser import PydanticSchemaParser

- agentops = None
- try:
- from agentops import track_agent  # type: ignore
- except ImportError:

- def track_agent(name):
+ def mock_agent_ops_provider():
+ def track_agent(*args, **kwargs):
  def noop(f):
  return f

  return noop

+ return track_agent


+ agentops = None

+ if os.environ.get("AGENTOPS_API_KEY"):
+ try:
+ from agentops import track_agent
+ except ImportError:
+ track_agent = mock_agent_ops_provider()
+ else:
+ track_agent = mock_agent_ops_provider()


  class Entity(BaseModel):
  name: str = Field(description="The name of the entity.")
@@ -1,8 +1,8 @@
+ from typing import Any, Callable, Generic, List, Dict, Type, TypeVar
  from functools import wraps
- from typing import Any, Callable, Dict, Generic, List, Type, TypeVar

  from pydantic import BaseModel


  T = TypeVar("T")
  EVT = TypeVar("EVT", bound=BaseModel)

@@ -1,6 +1,6 @@
  import json
  import os
- from typing import Dict, Optional, Union
+ from typing import Dict, Optional

  from pydantic import BaseModel, Field, PrivateAttr, model_validator

@@ -41,8 +41,8 @@ class I18N(BaseModel):
  def errors(self, error: str) -> str:
  return self.retrieve("errors", error)

- def tools(self, tool: str) -> Union[str, Dict[str, str]]:
+ def tools(self, error: str) -> str:
- return self.retrieve("tools", tool)
+ return self.retrieve("tools", error)

  def retrieve(self, kind, key) -> str:
  try:
@@ -1,4 +1,3 @@
- import warnings
  from typing import Any, Optional, Type


@@ -26,8 +25,7 @@ class InternalInstructor:
  if self.agent and not self.llm:
  self.llm = self.agent.function_calling_llm or self.agent.llm

- with warnings.catch_warnings():
+ # Lazy import
- warnings.simplefilter("ignore", UserWarning)
  import instructor
  from litellm import completion

@@ -1,14 +1,9 @@
- import json
- import logging
  from typing import Any, List, Optional

  from pydantic import BaseModel, Field

  from crewai.agent import Agent
  from crewai.task import Task

- logger = logging.getLogger(__name__)


  class PlanPerTask(BaseModel):
  task: str = Field(..., description="The task for which the plan is created")
@@ -72,39 +67,19 @@ class CrewPlanner:
  output_pydantic=PlannerTaskPydanticOutput,
  )

- def _get_agent_knowledge(self, task: Task) -> List[str]:
- """
- Safely retrieve knowledge source content from the task's agent.
-
- Args:
- task: The task containing an agent with potential knowledge sources
-
- Returns:
- List[str]: A list of knowledge source strings
- """
- try:
- if task.agent and task.agent.knowledge_sources:
- return [source.content for source in task.agent.knowledge_sources]
- except AttributeError:
- logger.warning("Error accessing agent knowledge sources")
- return []

  def _create_tasks_summary(self) -> str:
  """Creates a summary of all tasks."""
  tasks_summary = []
  for idx, task in enumerate(self.tasks):
- knowledge_list = self._get_agent_knowledge(task)
+ tasks_summary.append(
- task_summary = f"""
+ f"""
  Task Number {idx + 1} - {task.description}
  "task_description": {task.description}
  "task_expected_output": {task.expected_output}
  "agent": {task.agent.role if task.agent else "None"}
  "agent_goal": {task.agent.goal if task.agent else "None"}
  "task_tools": {task.tools}
- "agent_tools": %s%s""" % (
+ "agent_tools": {task.agent.tools if task.agent else "None"}
- f"[{', '.join(str(tool) for tool in task.agent.tools)}]" if task.agent and task.agent.tools else '"agent has no tools"',
+ """
- f',\n "agent_knowledge": "[\\"{knowledge_list[0]}\\"]"' if knowledge_list and str(knowledge_list) != "None" else ""
  )

- tasks_summary.append(task_summary)
  return " ".join(tasks_summary)
@@ -1,7 +1,5 @@
- from typing import Any, Optional

  from pydantic import BaseModel, Field
+ from typing import Any, Optional
  from crewai.utilities import I18N

@@ -1,4 +1,4 @@
- from typing import Type, Union, get_args, get_origin
+ from typing import Type, get_args, get_origin, Union

  from pydantic import BaseModel

@@ -1,8 +1,6 @@
- from datetime import datetime
- from typing import Any, Dict, List, Optional

  from pydantic import BaseModel, Field
+ from datetime import datetime
+ from typing import Dict, Any, Optional, List
  from crewai.memory.storage.kickoff_task_outputs_storage import (
  KickoffTaskOutputsSQLiteStorage,
  )
@@ -1,8 +1,5 @@
- import warnings

  from litellm.integrations.custom_logger import CustomLogger
  from litellm.types.utils import Usage

  from crewai.agents.agent_builder.utilities.base_token_process import TokenProcess


@@ -14,9 +11,7 @@ class TokenCalcHandler(CustomLogger):
  if self.token_cost_process is None:
  return

- with warnings.catch_warnings():
+ usage : Usage = response_obj["usage"]
- warnings.simplefilter("ignore", UserWarning)
- usage: Usage = response_obj["usage"]
  self.token_cost_process.sum_successful_requests(1)
  self.token_cost_process.sum_prompt_tokens(usage.prompt_tokens)
  self.token_cost_process.sum_completion_tokens(usage.completion_tokens)
@@ -1445,31 +1445,34 @@ def test_llm_call_with_all_attributes():


  @pytest.mark.vcr(filter_headers=["authorization"])
- def test_agent_with_ollama_llama3():
+ def test_agent_with_ollama_gemma():
  agent = Agent(
  role="test role",
  goal="test goal",
  backstory="test backstory",
- llm=LLM(model="ollama/llama3.2:3b", base_url="http://localhost:11434"),
+ llm=LLM(
+ model="ollama/gemma2:latest",
+ base_url="http://localhost:8080",
+ ),
  )

  assert isinstance(agent.llm, LLM)
- assert agent.llm.model == "ollama/llama3.2:3b"
+ assert agent.llm.model == "ollama/gemma2:latest"
- assert agent.llm.base_url == "http://localhost:11434"
+ assert agent.llm.base_url == "http://localhost:8080"

  task = "Respond in 20 words. Who are you?"
  response = agent.llm.call([{"role": "user", "content": task}])

  assert response
  assert len(response.split()) <= 25  # Allow a little flexibility in word count
- assert "Llama3" in response or "AI" in response or "language model" in response
+ assert "Gemma" in response or "AI" in response or "language model" in response


  @pytest.mark.vcr(filter_headers=["authorization"])
- def test_llm_call_with_ollama_llama3():
+ def test_llm_call_with_ollama_gemma():
  llm = LLM(
- model="ollama/llama3.2:3b",
+ model="ollama/gemma2:latest",
- base_url="http://localhost:11434",
+ base_url="http://localhost:8080",
  temperature=0.7,
  max_tokens=30,
  )
@@ -1479,7 +1482,7 @@ def test_llm_call_with_ollama_llama3():

  assert response
  assert len(response.split()) <= 25  # Allow a little flexibility in word count
- assert "Llama3" in response or "AI" in response or "language model" in response
+ assert "Gemma" in response or "AI" in response or "language model" in response


  @pytest.mark.vcr(filter_headers=["authorization"])
@@ -1575,7 +1578,7 @@ def test_agent_execute_task_with_ollama():
  role="test role",
  goal="test goal",
  backstory="test backstory",
- llm=LLM(model="ollama/llama3.2:3b", base_url="http://localhost:11434"),
+ llm=LLM(model="ollama/gemma2:latest", base_url="http://localhost:8080"),
  )

  task = Task(
@@ -1592,15 +1595,19 @@ def test_agent_execute_task_with_ollama():
  @pytest.mark.vcr(filter_headers=["authorization"])
  def test_agent_with_knowledge_sources():
  # Create a knowledge source with some content
- content = "Brandon's favorite color is red and he likes Mexican food."
+ content = "Brandon's favorite color is blue and he likes Mexican food."
- string_source = StringKnowledgeSource(content=content)
+ string_source = StringKnowledgeSource(
+ content=content, metadata={"preference": "personal"}
+ )

  with patch(
  "crewai.knowledge.storage.knowledge_storage.KnowledgeStorage"
  ) as MockKnowledge:
  mock_knowledge_instance = MockKnowledge.return_value
  mock_knowledge_instance.sources = [string_source]
- mock_knowledge_instance.query.return_value = [{"content": content}]
+ mock_knowledge_instance.query.return_value = [
+ {"content": content, "metadata": {"preference": "personal"}}
+ ]

  agent = Agent(
  role="Information Agent",
@@ -1621,4 +1628,4 @@ def test_agent_with_knowledge_sources():
  result = crew.kickoff()

  # Assert that the agent provides the correct information
- assert "red" in result.raw.lower()
+ assert "blue" in result.raw.lower()
@@ -1,10 +1,9 @@
  import hashlib
  from typing import Any, List, Optional

- from pydantic import BaseModel

  from crewai.agents.agent_builder.base_agent import BaseAgent
  from crewai.tools.base_tool import BaseTool
+ from pydantic import BaseModel


  class TestAgent(BaseAgent):
@@ -1,11 +1,10 @@
  import pytest
+ from crewai.agents.parser import CrewAgentParser
  from crewai.agents.crew_agent_executor import (
  AgentAction,
  AgentFinish,
  OutputParserException,
  )
- from crewai.agents.parser import CrewAgentParser


  @pytest.fixture
@@ -1,6 +1,42 @@
  interactions:
  - request:
- body: '{"model": "llama3.2:3b", "prompt": "### System:\nYou are test role. test
+ body: !!binary |
+ CrcCCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSjgIKEgoQY3Jld2FpLnRl
+ bGVtZXRyeRJoChA/Q8UW5bidCRtKvri5fOaNEgh5qLzvLvZJkioQVG9vbCBVc2FnZSBFcnJvcjAB
+ OYjFVQr1TPgXQXCXhwr1TPgXShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuNjEuMHoCGAGFAQABAAAS
+ jQEKEChQTWQ07t26ELkZmP5RresSCHEivRGBpsP7KgpUb29sIFVzYWdlMAE5sKkbC/VM+BdB8MIc
+ C/VM+BdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC42MS4wShkKCXRvb2xfbmFtZRIMCgpkdW1teV90
+ b29sSg4KCGF0dGVtcHRzEgIYAXoCGAGFAQABAAA=
+ headers:
+ Accept:
+ - '*/*'
+ Accept-Encoding:
+ - gzip, deflate
+ Connection:
+ - keep-alive
+ Content-Length:
+ - '314'
+ Content-Type:
+ - application/x-protobuf
+ User-Agent:
+ - OTel-OTLP-Exporter-Python/1.27.0
+ method: POST
+ uri: https://telemetry.crewai.com:4319/v1/traces
+ response:
+ body:
+ string: "\n\0"
+ headers:
+ Content-Length:
+ - '2'
+ Content-Type:
+ - application/x-protobuf
+ Date:
+ - Tue, 24 Sep 2024 21:57:54 GMT
+ status:
+ code: 200
+ message: OK
+ - request:
+ body: '{"model": "gemma2:latest", "prompt": "### System:\nYou are test role. test
  backstory\nYour personal goal is: test goal\nTo give my best complete final
  answer to the task use the exact following format:\n\nThought: I now can give
  a great answer\nFinal Answer: Your final answer must be the great and the most
@@ -10,864 +46,36 @@ interactions:
  explanation of AI\nyou MUST return the actual complete content as the final
  answer, not a summary.\n\nBegin! This is VERY important to you, use the tools
  available and give your best Final Answer, your job depends on it!\n\nThought:\n\n",
- "options": {"stop": ["\nObservation:"]}, "stream": false}'
+ "options": {}, "stream": false}'
  headers:
- accept:
+ Accept:
  - '*/*'
- accept-encoding:
+ Accept-Encoding:
  - gzip, deflate
- connection:
+ Connection:
  - keep-alive
- content-length:
+ Content-Length:
- - '839'
+ - '815'
- host:
+ Content-Type:
- - localhost:11434
+ - application/json
- user-agent:
+ User-Agent:
- - litellm/1.56.4
+ - python-requests/2.31.0
  method: POST
- uri: http://localhost:11434/api/generate
+ uri: http://localhost:8080/api/generate
  response:
- content: '{"model":"llama3.2:3b","created_at":"2024-12-31T16:56:15.759718Z","response":"Final
+ body:
- Answer: Artificial Intelligence (AI) refers to the development of computer systems
+ string: '{"model":"gemma2:latest","created_at":"2024-09-24T21:57:55.835715Z","response":"Thought:
- able to perform tasks that typically require human intelligence, including learning,
+ I can explain AI in one sentence. \n\nFinal Answer: Artificial intelligence
problem-solving, decision-making, and perception.","done":true,"done_reason":"stop","context":[128006,9125,128007,271,38766,1303,33025,2696,25,6790,220,2366,18,271,128009,128006,882,128007,271,14711,744,512,2675,527,1296,3560,13,1296,93371,198,7927,4443,5915,374,25,1296,5915,198,1271,3041,856,1888,4686,1620,4320,311,279,3465,1005,279,4839,2768,3645,1473,85269,25,358,1457,649,3041,264,2294,4320,198,19918,22559,25,4718,1620,4320,2011,387,279,2294,323,279,1455,4686,439,3284,11,433,2011,387,15632,7633,382,40,28832,1005,1521,20447,11,856,2683,14117,389,433,2268,14711,2724,1473,5520,5546,25,83017,1148,15592,374,304,832,11914,271,2028,374,279,1755,13186,369,701,1620,4320,25,362,832,1355,18886,16540,315,15592,198,9514,28832,471,279,5150,4686,2262,439,279,1620,4320,11,539,264,12399,382,11382,0,1115,374,48174,3062,311,499,11,1005,279,7526,2561,323,3041,701,1888,13321,22559,11,701,2683,14117,389,433,2268,85269,1473,128009,128006,78191,128007,271,19918,22559,25,59294,22107,320,15836,8,19813,311,279,4500,315,6500,6067,3025,311,2804,9256,430,11383,1397,3823,11478,11,2737,6975,11,3575,99246,11,5597,28846,11,323,21063,13],"total_duration":1156303250,"load_duration":35999125,"prompt_eval_count":181,"prompt_eval_duration":408000000,"eval_count":38,"eval_duration":711000000}'
+ (AI) is the ability of computer systems to perform tasks that typically require
human intelligence, such as learning, problem-solving, and decision-making. \n","done":true,"done_reason":"stop","context":[106,1645,108,6176,1479,235292,108,2045,708,2121,4731,235265,2121,135147,108,6922,3749,6789,603,235292,2121,6789,108,1469,2734,970,1963,3407,2048,3448,577,573,6911,1281,573,5463,2412,5920,235292,109,65366,235292,590,1490,798,2734,476,1775,3448,108,11263,10358,235292,3883,2048,3448,2004,614,573,1775,578,573,1546,3407,685,3077,235269,665,2004,614,17526,6547,235265,109,235285,44472,1281,1450,32808,235269,970,3356,12014,611,665,235341,109,6176,4926,235292,109,6846,12297,235292,36576,1212,16481,603,575,974,13060,109,1596,603,573,5246,12830,604,861,2048,3448,235292,586,974,235290,47366,15844,576,16481,108,4747,44472,2203,573,5579,3407,3381,685,573,2048,3448,235269,780,476,13367,235265,109,12694,235341,1417,603,50471,2845,577,692,235269,1281,573,8112,2506,578,2734,861,1963,14124,10358,235269,861,3356,12014,611,665,235341,109,65366,235292,109,107,108,106,2516,108,65366,235292,590,798,10200,16481,575,974,13060,235265,235248,109,11263,10358,235292,42456,17273,591,11716,235275,603,573,7374,576,6875,5188,577,3114,13333,674,15976,2817,3515,17273,235269,1582,685,6044,235269,3210,235290,60495,235269,578,4530,235290,14577,235265,139,108],"total_duration":3370959792,"load_duration":20611750,"prompt_eval_count":173,"prompt_eval_duration":688036000,"eval_count":51,"eval_duration":2660291000}'
|
||||||
headers:
|
headers:
|
||||||
Content-Length:
|
Content-Length:
|
||||||
- '1528'
|
- '1662'
|
||||||
Content-Type:
|
Content-Type:
|
||||||
- application/json; charset=utf-8
|
- application/json; charset=utf-8
|
||||||
Date:
|
Date:
|
||||||
- Tue, 31 Dec 2024 16:56:15 GMT
|
- Tue, 24 Sep 2024 21:57:55 GMT
|
||||||
http_version: HTTP/1.1
|
status:
|
||||||
status_code: 200
|
code: 200
|
||||||
- request:
|
message: OK
|
||||||
body: '{"name": "llama3.2:3b"}'
|
|
||||||
headers:
|
|
||||||
accept:
|
|
||||||
- '*/*'
|
|
||||||
accept-encoding:
|
|
||||||
- gzip, deflate
|
|
||||||
connection:
|
|
||||||
- keep-alive
|
|
||||||
content-length:
|
|
||||||
- '23'
|
|
||||||
content-type:
|
|
||||||
- application/json
|
|
||||||
host:
|
|
||||||
- localhost:11434
|
|
||||||
user-agent:
|
|
||||||
- litellm/1.56.4
|
|
||||||
method: POST
|
|
||||||
uri: http://localhost:11434/api/show
|
|
||||||
response:
|
|
||||||
content: "{\"license\":\"LLAMA 3.2 COMMUNITY LICENSE AGREEMENT\\nLlama 3.2 Version
|
|
||||||
Release Date: September 25, 2024\\n\\n\u201CAgreement\u201D means the terms
|
|
||||||
and conditions for use, reproduction, distribution \\nand modification of the
|
|
||||||
Llama Materials set forth herein.\\n\\n\u201CDocumentation\u201D means the specifications,
|
|
||||||
manuals and documentation accompanying Llama 3.2\\ndistributed by Meta at https://llama.meta.com/doc/overview.\\n\\n\u201CLicensee\u201D
|
|
||||||
or \u201Cyou\u201D means you, or your employer or any other person or entity
|
|
||||||
(if you are \\nentering into this Agreement on such person or entity\u2019s
|
|
||||||
behalf), of the age required under\\napplicable laws, rules or regulations to
|
|
||||||
provide legal consent and that has legal authority\\nto bind your employer or
|
|
||||||
such other person or entity if you are entering in this Agreement\\non their
|
|
||||||
behalf.\\n\\n\u201CLlama 3.2\u201D means the foundational large language models
|
|
||||||
and software and algorithms, including\\nmachine-learning model code, trained
|
|
||||||
model weights, inference-enabling code, training-enabling code,\\nfine-tuning
|
|
||||||
enabling code and other elements of the foregoing distributed by Meta at \\nhttps://www.llama.com/llama-downloads.\\n\\n\u201CLlama
|
|
||||||
Materials\u201D means, collectively, Meta\u2019s proprietary Llama 3.2 and Documentation
|
|
||||||
(and \\nany portion thereof) made available under this Agreement.\\n\\n\u201CMeta\u201D
|
|
||||||
or \u201Cwe\u201D means Meta Platforms Ireland Limited (if you are located in
|
|
||||||
or, \\nif you are an entity, your principal place of business is in the EEA
|
|
||||||
or Switzerland) \\nand Meta Platforms, Inc. (if you are located outside of the
|
|
||||||
EEA or Switzerland). \\n\\n\\nBy clicking \u201CI Accept\u201D below or by using
|
|
||||||
or distributing any portion or element of the Llama Materials,\\nyou agree to
|
|
||||||
be bound by this Agreement.\\n\\n\\n1. License Rights and Redistribution.\\n\\n
|
|
||||||
\ a. Grant of Rights. You are granted a non-exclusive, worldwide, \\nnon-transferable
|
|
||||||
and royalty-free limited license under Meta\u2019s intellectual property or
|
|
||||||
other rights \\nowned by Meta embodied in the Llama Materials to use, reproduce,
|
|
||||||
distribute, copy, create derivative works \\nof, and make modifications to the
|
|
||||||
Llama Materials. \\n\\n b. Redistribution and Use. \\n\\n i. If
|
|
||||||
you distribute or make available the Llama Materials (or any derivative works
|
|
||||||
thereof), \\nor a product or service (including another AI model) that contains
|
|
||||||
any of them, you shall (A) provide\\na copy of this Agreement with any such
|
|
||||||
Llama Materials; and (B) prominently display \u201CBuilt with Llama\u201D\\non
|
|
||||||
a related website, user interface, blogpost, about page, or product documentation.
|
|
||||||
If you use the\\nLlama Materials or any outputs or results of the Llama Materials
|
|
||||||
to create, train, fine tune, or\\notherwise improve an AI model, which is distributed
|
|
||||||
or made available, you shall also include \u201CLlama\u201D\\nat the beginning
|
|
||||||
of any such AI model name.\\n\\n ii. If you receive Llama Materials,
|
|
||||||
or any derivative works thereof, from a Licensee as part\\nof an integrated
|
|
||||||
end user product, then Section 2 of this Agreement will not apply to you. \\n\\n
|
|
||||||
\ iii. You must retain in all copies of the Llama Materials that you distribute
|
|
||||||
the \\nfollowing attribution notice within a \u201CNotice\u201D text file distributed
|
|
||||||
as a part of such copies: \\n\u201CLlama 3.2 is licensed under the Llama 3.2
|
|
||||||
Community License, Copyright \xA9 Meta Platforms,\\nInc. All Rights Reserved.\u201D\\n\\n
|
|
||||||
\ iv. Your use of the Llama Materials must comply with applicable laws
|
|
||||||
and regulations\\n(including trade compliance laws and regulations) and adhere
|
|
||||||
to the Acceptable Use Policy for\\nthe Llama Materials (available at https://www.llama.com/llama3_2/use-policy),
|
|
||||||
which is hereby \\nincorporated by reference into this Agreement.\\n \\n2.
|
|
||||||
Additional Commercial Terms. If, on the Llama 3.2 version release date, the
|
|
||||||
monthly active users\\nof the products or services made available by or for
|
|
||||||
Licensee, or Licensee\u2019s affiliates, \\nis greater than 700 million monthly
|
|
||||||
active users in the preceding calendar month, you must request \\na license
|
|
||||||
from Meta, which Meta may grant to you in its sole discretion, and you are not
|
|
||||||
authorized to\\nexercise any of the rights under this Agreement unless or until
|
|
||||||
Meta otherwise expressly grants you such rights.\\n\\n3. Disclaimer of Warranty.
|
|
||||||
UNLESS REQUIRED BY APPLICABLE LAW, THE LLAMA MATERIALS AND ANY OUTPUT AND \\nRESULTS
|
|
||||||
THEREFROM ARE PROVIDED ON AN \u201CAS IS\u201D BASIS, WITHOUT WARRANTIES OF
|
|
||||||
ANY KIND, AND META DISCLAIMS\\nALL WARRANTIES OF ANY KIND, BOTH EXPRESS AND
|
|
||||||
IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES\\nOF TITLE, NON-INFRINGEMENT,
|
|
||||||
MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE\\nFOR
|
|
||||||
DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING THE LLAMA MATERIALS
|
|
||||||
AND ASSUME ANY RISKS ASSOCIATED\\nWITH YOUR USE OF THE LLAMA MATERIALS AND ANY
|
|
||||||
OUTPUT AND RESULTS.\\n\\n4. Limitation of Liability. IN NO EVENT WILL META OR
|
|
||||||
ITS AFFILIATES BE LIABLE UNDER ANY THEORY OF LIABILITY, \\nWHETHER IN CONTRACT,
|
|
||||||
TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING OUT OF THIS AGREEMENT,
|
|
||||||
\\nFOR ANY LOST PROFITS OR ANY INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL,
|
|
||||||
EXEMPLARY OR PUNITIVE DAMAGES, EVEN \\nIF META OR ITS AFFILIATES HAVE BEEN ADVISED
|
|
||||||
OF THE POSSIBILITY OF ANY OF THE FOREGOING.\\n\\n5. Intellectual Property.\\n\\n
|
|
||||||
\ a. No trademark licenses are granted under this Agreement, and in connection
|
|
||||||
with the Llama Materials, \\nneither Meta nor Licensee may use any name or mark
|
|
||||||
owned by or associated with the other or any of its affiliates, \\nexcept as
|
|
||||||
required for reasonable and customary use in describing and redistributing the
|
|
||||||
Llama Materials or as \\nset forth in this Section 5(a). Meta hereby grants
|
|
||||||
you a license to use \u201CLlama\u201D (the \u201CMark\u201D) solely as required
|
|
||||||
\\nto comply with the last sentence of Section 1.b.i. You will comply with Meta\u2019s
|
|
||||||
brand guidelines (currently accessible \\nat https://about.meta.com/brand/resources/meta/company-brand/).
|
|
||||||
All goodwill arising out of your use of the Mark \\nwill inure to the benefit
|
|
||||||
of Meta.\\n\\n b. Subject to Meta\u2019s ownership of Llama Materials and
|
|
||||||
derivatives made by or for Meta, with respect to any\\n derivative works
|
|
||||||
and modifications of the Llama Materials that are made by you, as between you
|
|
||||||
and Meta,\\n you are and will be the owner of such derivative works and modifications.\\n\\n
|
|
||||||
\ c. If you institute litigation or other proceedings against Meta or any
|
|
||||||
entity (including a cross-claim or\\n counterclaim in a lawsuit) alleging
|
|
||||||
that the Llama Materials or Llama 3.2 outputs or results, or any portion\\n
|
|
||||||
\ of any of the foregoing, constitutes infringement of intellectual property
|
|
||||||
or other rights owned or licensable\\n by you, then any licenses granted
|
|
||||||
to you under this Agreement shall terminate as of the date such litigation or\\n
|
|
||||||
\ claim is filed or instituted. You will indemnify and hold harmless Meta
|
|
||||||
from and against any claim by any third\\n party arising out of or related
|
|
||||||
to your use or distribution of the Llama Materials.\\n\\n6. Term and Termination.
|
|
||||||
The term of this Agreement will commence upon your acceptance of this Agreement
|
|
||||||
or access\\nto the Llama Materials and will continue in full force and effect
|
|
||||||
until terminated in accordance with the terms\\nand conditions herein. Meta
|
|
||||||
may terminate this Agreement if you are in breach of any term or condition of
|
|
||||||
this\\nAgreement. Upon termination of this Agreement, you shall delete and cease
|
|
||||||
use of the Llama Materials. Sections 3,\\n4 and 7 shall survive the termination
|
|
||||||
of this Agreement. \\n\\n7. Governing Law and Jurisdiction. This Agreement will
|
|
||||||
be governed and construed under the laws of the State of \\nCalifornia without
|
|
||||||
regard to choice of law principles, and the UN Convention on Contracts for the
|
|
||||||
International\\nSale of Goods does not apply to this Agreement. The courts of
|
|
||||||
California shall have exclusive jurisdiction of\\nany dispute arising out of
|
|
||||||
this Agreement.\\n**Llama 3.2** **Acceptable Use Policy**\\n\\nMeta is committed
|
|
||||||
to promoting safe and fair use of its tools and features, including Llama 3.2.
|
|
||||||
If you access or use Llama 3.2, you agree to this Acceptable Use Policy (\u201C**Policy**\u201D).
|
|
||||||
The most recent copy of this policy can be found at [https://www.llama.com/llama3_2/use-policy](https://www.llama.com/llama3_2/use-policy).\\n\\n**Prohibited
|
|
||||||
Uses**\\n\\nWe want everyone to use Llama 3.2 safely and responsibly. You agree
|
|
||||||
you will not use, or allow others to use, Llama 3.2 to:\\n\\n\\n\\n1. Violate
|
|
||||||
the law or others\u2019 rights, including to:\\n 1. Engage in, promote, generate,
|
|
||||||
contribute to, encourage, plan, incite, or further illegal or unlawful activity
|
|
||||||
or content, such as:\\n 1. Violence or terrorism\\n 2. Exploitation
|
|
||||||
or harm to children, including the solicitation, creation, acquisition, or dissemination
|
|
||||||
of child exploitative content or failure to report Child Sexual Abuse Material\\n
|
|
||||||
\ 3. Human trafficking, exploitation, and sexual violence\\n 4.
|
|
||||||
The illegal distribution of information or materials to minors, including obscene
|
|
||||||
materials, or failure to employ legally required age-gating in connection with
|
|
||||||
such information or materials.\\n 5. Sexual solicitation\\n 6.
|
|
||||||
Any other criminal activity\\n 1. Engage in, promote, incite, or facilitate
|
|
||||||
the harassment, abuse, threatening, or bullying of individuals or groups of
|
|
||||||
individuals\\n 2. Engage in, promote, incite, or facilitate discrimination
|
|
||||||
or other unlawful or harmful conduct in the provision of employment, employment
|
|
||||||
benefits, credit, housing, other economic benefits, or other essential goods
|
|
||||||
and services\\n 3. Engage in the unauthorized or unlicensed practice of any
|
|
||||||
profession including, but not limited to, financial, legal, medical/health,
|
|
||||||
or related professional practices\\n 4. Collect, process, disclose, generate,
|
|
||||||
or infer private or sensitive information about individuals, including information
|
|
||||||
about individuals\u2019 identity, health, or demographic information, unless
|
|
||||||
you have obtained the right to do so in accordance with applicable law\\n 5.
|
|
||||||
Engage in or facilitate any action or generate any content that infringes, misappropriates,
|
|
||||||
or otherwise violates any third-party rights, including the outputs or results
|
|
||||||
of any products or services using the Llama Materials\\n 6. Create, generate,
|
|
||||||
or facilitate the creation of malicious code, malware, computer viruses or do
|
|
||||||
anything else that could disable, overburden, interfere with or impair the proper
|
|
||||||
working, integrity, operation or appearance of a website or computer system\\n
|
|
||||||
\ 7. Engage in any action, or facilitate any action, to intentionally circumvent
|
|
||||||
or remove usage restrictions or other safety measures, or to enable functionality
|
|
||||||
disabled by Meta\\n2. Engage in, promote, incite, facilitate, or assist in the
|
|
||||||
planning or development of activities that present a risk of death or bodily
|
|
||||||
harm to individuals, including use of Llama 3.2 related to the following:\\n
|
|
||||||
\ 8. Military, warfare, nuclear industries or applications, espionage, use
|
|
||||||
for materials or activities that are subject to the International Traffic Arms
|
|
||||||
Regulations (ITAR) maintained by the United States Department of State or to
|
|
||||||
the U.S. Biological Weapons Anti-Terrorism Act of 1989 or the Chemical Weapons
|
|
||||||
Convention Implementation Act of 1997\\n 9. Guns and illegal weapons (including
|
|
||||||
weapon development)\\n 10. Illegal drugs and regulated/controlled substances\\n
|
|
||||||
\ 11. Operation of critical infrastructure, transportation technologies, or
|
|
||||||
heavy machinery\\n 12. Self-harm or harm to others, including suicide, cutting,
|
|
||||||
and eating disorders\\n 13. Any content intended to incite or promote violence,
|
|
||||||
abuse, or any infliction of bodily harm to an individual\\n3. Intentionally
|
|
||||||
deceive or mislead others, including use of Llama 3.2 related to the following:\\n
|
|
||||||
\ 14. Generating, promoting, or furthering fraud or the creation or promotion
|
|
||||||
of disinformation\\n 15. Generating, promoting, or furthering defamatory
|
|
||||||
content, including the creation of defamatory statements, images, or other content\\n
|
|
||||||
\ 16. Generating, promoting, or further distributing spam\\n 17. Impersonating
|
|
||||||
another individual without consent, authorization, or legal right\\n 18.
|
|
||||||
Representing that the use of Llama 3.2 or outputs are human-generated\\n 19.
|
|
||||||
Generating or facilitating false online engagement, including fake reviews and
|
|
||||||
other means of fake online engagement\\n4. Fail to appropriately disclose to
|
|
||||||
end users any known dangers of your AI system\\n5. Interact with third party
|
|
||||||
tools, models, or software designed to generate unlawful content or engage in
|
|
||||||
unlawful or harmful conduct and/or represent that the outputs of such tools,
|
|
||||||
models, or software are associated with Meta or Llama 3.2\\n\\nWith respect
|
|
||||||
to any multimodal models included in Llama 3.2, the rights granted under Section
|
|
||||||
1(a) of the Llama 3.2 Community License Agreement are not being granted to you
|
|
||||||
if you are an individual domiciled in, or a company with a principal place of
|
|
||||||
business in, the European Union. This restriction does not apply to end users
|
|
||||||
of a product or service that incorporates any such multimodal models.\\n\\nPlease
|
|
||||||
report any violation of this Policy, software \u201Cbug,\u201D or other problems
|
|
||||||
that could lead to a violation of this Policy through one of the following means:\\n\\n\\n\\n*
|
|
||||||
Reporting issues with the model: [https://github.com/meta-llama/llama-models/issues](https://l.workplace.com/l.php?u=https%3A%2F%2Fgithub.com%2Fmeta-llama%2Fllama-models%2Fissues\\u0026h=AT0qV8W9BFT6NwihiOHRuKYQM_UnkzN_NmHMy91OT55gkLpgi4kQupHUl0ssR4dQsIQ8n3tfd0vtkobvsEvt1l4Ic6GXI2EeuHV8N08OG2WnbAmm0FL4ObkazC6G_256vN0lN9DsykCvCqGZ)\\n*
|
|
||||||
Reporting risky content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)\\n*
|
|
||||||
Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)\\n*
|
|
||||||
Reporting violations of the Acceptable Use Policy or unlicensed uses of Llama
|
|
||||||
3.2: LlamaUseReport@meta.com\",\"modelfile\":\"# Modelfile generated by \\\"ollama
|
|
||||||
show\\\"\\n# To build a new Modelfile based on this, replace FROM with:\\n#
|
|
||||||
FROM llama3.2:3b\\n\\nFROM /Users/brandonhancock/.ollama/models/blobs/sha256-dde5aa3fc5ffc17176b5e8bdc82f587b24b2678c6c66101bf7da77af9f7ccdff\\nTEMPLATE
|
|
||||||
\\\"\\\"\\\"\\u003c|start_header_id|\\u003esystem\\u003c|end_header_id|\\u003e\\n\\nCutting
|
|
||||||
Knowledge Date: December 2023\\n\\n{{ if .System }}{{ .System }}\\n{{- end }}\\n{{-
|
|
||||||
if .Tools }}When you receive a tool call response, use the output to format
|
|
||||||
an answer to the orginal user question.\\n\\nYou are a helpful assistant with
|
|
||||||
tool calling capabilities.\\n{{- end }}\\u003c|eot_id|\\u003e\\n{{- range $i,
|
|
||||||
$_ := .Messages }}\\n{{- $last := eq (len (slice $.Messages $i)) 1 }}\\n{{-
|
|
||||||
if eq .Role \\\"user\\\" }}\\u003c|start_header_id|\\u003euser\\u003c|end_header_id|\\u003e\\n{{-
|
|
||||||
if and $.Tools $last }}\\n\\nGiven the following functions, please respond with
|
|
||||||
a JSON for a function call with its proper arguments that best answers the given
|
|
||||||
prompt.\\n\\nRespond in the format {\\\"name\\\": function name, \\\"parameters\\\":
|
|
||||||
dictionary of argument name and its value}. Do not use variables.\\n\\n{{ range
|
|
||||||
$.Tools }}\\n{{- . }}\\n{{ end }}\\n{{ .Content }}\\u003c|eot_id|\\u003e\\n{{-
|
|
||||||
else }}\\n\\n{{ .Content }}\\u003c|eot_id|\\u003e\\n{{- end }}{{ if $last }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
end }}\\n{{- else if eq .Role \\\"assistant\\\" }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n{{-
|
|
||||||
if .ToolCalls }}\\n{{ range .ToolCalls }}\\n{\\\"name\\\": \\\"{{ .Function.Name
|
|
||||||
}}\\\", \\\"parameters\\\": {{ .Function.Arguments }}}{{ end }}\\n{{- else }}\\n\\n{{
|
|
||||||
.Content }}\\n{{- end }}{{ if not $last }}\\u003c|eot_id|\\u003e{{ end }}\\n{{-
|
|
||||||
else if eq .Role \\\"tool\\\" }}\\u003c|start_header_id|\\u003eipython\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
.Content }}\\u003c|eot_id|\\u003e{{ if $last }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
end }}\\n{{- end }}\\n{{- end }}\\\"\\\"\\\"\\nPARAMETER stop \\u003c|start_header_id|\\u003e\\nPARAMETER
|
|
||||||
stop \\u003c|end_header_id|\\u003e\\nPARAMETER stop \\u003c|eot_id|\\u003e\\nLICENSE
|
|
||||||
\\\"LLAMA 3.2 COMMUNITY LICENSE AGREEMENT\\nLlama 3.2 Version Release Date:
|
|
||||||
September 25, 2024\\n\\n\u201CAgreement\u201D means the terms and conditions
|
|
||||||
for use, reproduction, distribution \\nand modification of the Llama Materials
|
|
||||||
set forth herein.\\n\\n\u201CDocumentation\u201D means the specifications, manuals
|
|
||||||
and documentation accompanying Llama 3.2\\ndistributed by Meta at https://llama.meta.com/doc/overview.\\n\\n\u201CLicensee\u201D
|
|
||||||
or \u201Cyou\u201D means you, or your employer or any other person or entity
|
|
||||||
(if you are \\nentering into this Agreement on such person or entity\u2019s
|
|
||||||
behalf), of the age required under\\napplicable laws, rules or regulations to
|
|
||||||
provide legal consent and that has legal authority\\nto bind your employer or
|
|
||||||
such other person or entity if you are entering in this Agreement\\non their
|
|
||||||
behalf.\\n\\n\u201CLlama 3.2\u201D means the foundational large language models
|
|
||||||
and software and algorithms, including\\nmachine-learning model code, trained
|
|
||||||
model weights, inference-enabling code, training-enabling code,\\nfine-tuning
|
|
||||||
enabling code and other elements of the foregoing distributed by Meta at \\nhttps://www.llama.com/llama-downloads.\\n\\n\u201CLlama
|
|
||||||
Materials\u201D means, collectively, Meta\u2019s proprietary Llama 3.2 and Documentation
|
|
||||||
(and \\nany portion thereof) made available under this Agreement.\\n\\n\u201CMeta\u201D
|
|
||||||
or \u201Cwe\u201D means Meta Platforms Ireland Limited (if you are located in
|
|
||||||
or, \\nif you are an entity, your principal place of business is in the EEA
|
|
||||||
or Switzerland) \\nand Meta Platforms, Inc. (if you are located outside of the
|
|
||||||
EEA or Switzerland). \\n\\n\\nBy clicking \u201CI Accept\u201D below or by using
|
|
||||||
or distributing any portion or element of the Llama Materials,\\nyou agree to
|
|
||||||
be bound by this Agreement.\\n\\n\\n1. License Rights and Redistribution.\\n\\n
|
|
||||||
\ a. Grant of Rights. You are granted a non-exclusive, worldwide, \\nnon-transferable
|
|
||||||
and royalty-free limited license under Meta\u2019s intellectual property or
|
|
||||||
other rights \\nowned by Meta embodied in the Llama Materials to use, reproduce,
|
|
||||||
distribute, copy, create derivative works \\nof, and make modifications to the
|
|
||||||
Llama Materials. \\n\\n b. Redistribution and Use. \\n\\n i. If
|
|
||||||
you distribute or make available the Llama Materials (or any derivative works
|
|
||||||
thereof), \\nor a product or service (including another AI model) that contains
|
|
||||||
any of them, you shall (A) provide\\na copy of this Agreement with any such
|
|
||||||
Llama Materials; and (B) prominently display \u201CBuilt with Llama\u201D\\non
|
|
||||||
a related website, user interface, blogpost, about page, or product documentation.
|
|
||||||
If you use the\\nLlama Materials or any outputs or results of the Llama Materials
|
|
||||||
to create, train, fine tune, or\\notherwise improve an AI model, which is distributed
|
|
||||||
or made available, you shall also include \u201CLlama\u201D\\nat the beginning
|
|
||||||
of any such AI model name.\\n\\n ii. If you receive Llama Materials,
|
|
||||||
or any derivative works thereof, from a Licensee as part\\nof an integrated
|
|
||||||
end user product, then Section 2 of this Agreement will not apply to you. \\n\\n
|
|
||||||
\ iii. You must retain in all copies of the Llama Materials that you distribute
|
|
||||||
the \\nfollowing attribution notice within a \u201CNotice\u201D text file distributed
|
|
||||||
as a part of such copies: \\n\u201CLlama 3.2 is licensed under the Llama 3.2
|
|
||||||
Community License, Copyright \xA9 Meta Platforms,\\nInc. All Rights Reserved.\u201D\\n\\n
|
|
||||||
\ iv. Your use of the Llama Materials must comply with applicable laws
|
|
||||||
and regulations\\n(including trade compliance laws and regulations) and adhere
|
|
||||||
to the Acceptable Use Policy for\\nthe Llama Materials (available at https://www.llama.com/llama3_2/use-policy),
|
|
||||||
which is hereby \\nincorporated by reference into this Agreement.\\n \\n2.
|
|
||||||
Additional Commercial Terms. If, on the Llama 3.2 version release date, the
|
|
||||||
monthly active users\\nof the products or services made available by or for
|
|
||||||
Licensee, or Licensee\u2019s affiliates, \\nis greater than 700 million monthly
|
|
||||||
active users in the preceding calendar month, you must request \\na license
|
|
||||||
from Meta, which Meta may grant to you in its sole discretion, and you are not
|
|
||||||
authorized to\\nexercise any of the rights under this Agreement unless or until
|
|
||||||
Meta otherwise expressly grants you such rights.\\n\\n3. Disclaimer of Warranty.
|
|
||||||
UNLESS REQUIRED BY APPLICABLE LAW, THE LLAMA MATERIALS AND ANY OUTPUT AND \\nRESULTS
|
|
||||||
THEREFROM ARE PROVIDED ON AN \u201CAS IS\u201D BASIS, WITHOUT WARRANTIES OF
|
|
||||||
ANY KIND, AND META DISCLAIMS\\nALL WARRANTIES OF ANY KIND, BOTH EXPRESS AND
|
|
||||||
IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES\\nOF TITLE, NON-INFRINGEMENT,
|
|
||||||
MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE\\nFOR
|
|
||||||
DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING THE LLAMA MATERIALS
|
|
||||||
AND ASSUME ANY RISKS ASSOCIATED\\nWITH YOUR USE OF THE LLAMA MATERIALS AND ANY
|
|
||||||
OUTPUT AND RESULTS.\\n\\n4. Limitation of Liability. IN NO EVENT WILL META OR
|
|
||||||
ITS AFFILIATES BE LIABLE UNDER ANY THEORY OF LIABILITY, \\nWHETHER IN CONTRACT,
|
|
||||||
TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING OUT OF THIS AGREEMENT,
|
|
||||||
\\nFOR ANY LOST PROFITS OR ANY INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL,
|
|
||||||
EXEMPLARY OR PUNITIVE DAMAGES, EVEN \\nIF META OR ITS AFFILIATES HAVE BEEN ADVISED
|
|
||||||
OF THE POSSIBILITY OF ANY OF THE FOREGOING.\\n\\n5. Intellectual Property.\\n\\n
|
|
||||||
\ a. No trademark licenses are granted under this Agreement, and in connection
|
|
||||||
with the Llama Materials, \\nneither Meta nor Licensee may use any name or mark
|
|
||||||
owned by or associated with the other or any of its affiliates, \\nexcept as
|
|
||||||
required for reasonable and customary use in describing and redistributing the
|
|
||||||
Llama Materials or as \\nset forth in this Section 5(a). Meta hereby grants
|
|
||||||
you a license to use \u201CLlama\u201D (the \u201CMark\u201D) solely as required
|
|
||||||
\\nto comply with the last sentence of Section 1.b.i. You will comply with Meta\u2019s
|
|
||||||
brand guidelines (currently accessible \\nat https://about.meta.com/brand/resources/meta/company-brand/).
|
|
||||||
All goodwill arising out of your use of the Mark \\nwill inure to the benefit
|
|
||||||
of Meta.\\n\\n b. Subject to Meta\u2019s ownership of Llama Materials and
|
|
||||||
derivatives made by or for Meta, with respect to any\\n derivative works
|
|
||||||
and modifications of the Llama Materials that are made by you, as between you
|
|
||||||
and Meta,\\n you are and will be the owner of such derivative works and modifications.\\n\\n
|
|
||||||
\ c. If you institute litigation or other proceedings against Meta or any
|
|
||||||
entity (including a cross-claim or\\n counterclaim in a lawsuit) alleging
|
|
||||||
that the Llama Materials or Llama 3.2 outputs or results, or any portion\\n
|
|
||||||
\ of any of the foregoing, constitutes infringement of intellectual property
|
|
||||||
or other rights owned or licensable\\n by you, then any licenses granted
|
|
||||||
to you under this Agreement shall terminate as of the date such litigation or\\n
|
|
||||||
\ claim is filed or instituted. You will indemnify and hold harmless Meta
|
|
||||||
from and against any claim by any third\\n party arising out of or related
|
|
||||||
to your use or distribution of the Llama Materials.\\n\\n6. Term and Termination.
|
|
||||||
The term of this Agreement will commence upon your acceptance of this Agreement
|
|
||||||
or access\\nto the Llama Materials and will continue in full force and effect
|
|
||||||
until terminated in accordance with the terms\\nand conditions herein. Meta
|
|
||||||
may terminate this Agreement if you are in breach of any term or condition of
|
|
||||||
this\\nAgreement. Upon termination of this Agreement, you shall delete and cease
|
|
||||||
use of the Llama Materials. Sections 3,\\n4 and 7 shall survive the termination
|
|
||||||
of this Agreement. \\n\\n7. Governing Law and Jurisdiction. This Agreement will
|
|
||||||
be governed and construed under the laws of the State of \\nCalifornia without
|
|
||||||
regard to choice of law principles, and the UN Convention on Contracts for the
|
|
||||||
International\\nSale of Goods does not apply to this Agreement. The courts of
|
|
||||||
California shall have exclusive jurisdiction of\\nany dispute arising out of
|
|
||||||
this Agreement.\\\"\\nLICENSE \\\"**Llama 3.2** **Acceptable Use Policy**\\n\\nMeta
|
|
||||||
is committed to promoting safe and fair use of its tools and features, including
|
|
||||||
Llama 3.2. If you access or use Llama 3.2, you agree to this Acceptable Use
|
|
||||||
Policy (\u201C**Policy**\u201D). The most recent copy of this policy can be
|
|
||||||
found at [https://www.llama.com/llama3_2/use-policy](https://www.llama.com/llama3_2/use-policy).\\n\\n**Prohibited
|
|
||||||
Uses**\\n\\nWe want everyone to use Llama 3.2 safely and responsibly. You agree
|
|
||||||
you will not use, or allow others to use, Llama 3.2 to:\\n\\n\\n\\n1. Violate
|
|
||||||
the law or others\u2019 rights, including to:\\n 1. Engage in, promote, generate,
|
|
||||||
contribute to, encourage, plan, incite, or further illegal or unlawful activity
|
|
||||||
or content, such as:\\n 1. Violence or terrorism\\n 2. Exploitation
|
|
||||||
or harm to children, including the solicitation, creation, acquisition, or dissemination
|
|
||||||
of child exploitative content or failure to report Child Sexual Abuse Material\\n
|
|
||||||
\ 3. Human trafficking, exploitation, and sexual violence\\n 4.
|
|
||||||
The illegal distribution of information or materials to minors, including obscene
|
|
||||||
materials, or failure to employ legally required age-gating in connection with
|
|
||||||
such information or materials.\\n 5. Sexual solicitation\\n 6.
|
|
||||||
Any other criminal activity\\n 1. Engage in, promote, incite, or facilitate
|
|
||||||
the harassment, abuse, threatening, or bullying of individuals or groups of
|
|
||||||
individuals\\n 2. Engage in, promote, incite, or facilitate discrimination
|
|
||||||
or other unlawful or harmful conduct in the provision of employment, employment
|
|
||||||
benefits, credit, housing, other economic benefits, or other essential goods
|
|
||||||
and services\\n 3. Engage in the unauthorized or unlicensed practice of any
|
|
||||||
profession including, but not limited to, financial, legal, medical/health,
|
|
||||||
or related professional practices\\n 4. Collect, process, disclose, generate,
|
|
||||||
or infer private or sensitive information about individuals, including information
|
|
||||||
about individuals\u2019 identity, health, or demographic information, unless
|
|
||||||
you have obtained the right to do so in accordance with applicable law\\n 5.
|
|
||||||
Engage in or facilitate any action or generate any content that infringes, misappropriates,
|
|
||||||
or otherwise violates any third-party rights, including the outputs or results
|
|
||||||
of any products or services using the Llama Materials\\n 6. Create, generate,
|
|
||||||
or facilitate the creation of malicious code, malware, computer viruses or do
|
|
||||||
anything else that could disable, overburden, interfere with or impair the proper
|
|
||||||
working, integrity, operation or appearance of a website or computer system\\n
|
|
||||||
\ 7. Engage in any action, or facilitate any action, to intentionally circumvent
|
|
||||||
or remove usage restrictions or other safety measures, or to enable functionality
|
|
||||||
disabled by Meta\\n2. Engage in, promote, incite, facilitate, or assist in the
|
|
||||||
planning or development of activities that present a risk of death or bodily
|
|
||||||
harm to individuals, including use of Llama 3.2 related to the following:\\n
|
|
||||||
\ 8. Military, warfare, nuclear industries or applications, espionage, use
|
|
||||||
for materials or activities that are subject to the International Traffic Arms
|
|
||||||
Regulations (ITAR) maintained by the United States Department of State or to
|
|
||||||
the U.S. Biological Weapons Anti-Terrorism Act of 1989 or the Chemical Weapons
|
|
||||||
Convention Implementation Act of 1997\\n 9. Guns and illegal weapons (including
|
|
||||||
weapon development)\\n 10. Illegal drugs and regulated/controlled substances\\n
|
|
||||||
\ 11. Operation of critical infrastructure, transportation technologies, or
|
|
||||||
heavy machinery\\n 12. Self-harm or harm to others, including suicide, cutting,
|
|
||||||
and eating disorders\\n 13. Any content intended to incite or promote violence,
|
|
||||||
abuse, or any infliction of bodily harm to an individual\\n3. Intentionally
|
|
||||||
deceive or mislead others, including use of Llama 3.2 related to the following:\\n
|
|
||||||
\ 14. Generating, promoting, or furthering fraud or the creation or promotion
|
|
||||||
of disinformation\\n 15. Generating, promoting, or furthering defamatory
|
|
||||||
content, including the creation of defamatory statements, images, or other content\\n
|
|
||||||
\ 16. Generating, promoting, or further distributing spam\\n 17. Impersonating
|
|
||||||
another individual without consent, authorization, or legal right\\n 18.
|
|
||||||
Representing that the use of Llama 3.2 or outputs are human-generated\\n 19.
|
|
||||||
Generating or facilitating false online engagement, including fake reviews and
|
|
||||||
other means of fake online engagement\\n4. Fail to appropriately disclose to
|
|
||||||
end users any known dangers of your AI system\\n5. Interact with third party
|
|
||||||
tools, models, or software designed to generate unlawful content or engage in
|
|
||||||
unlawful or harmful conduct and/or represent that the outputs of such tools,
|
|
||||||
models, or software are associated with Meta or Llama 3.2\\n\\nWith respect
|
|
||||||
to any multimodal models included in Llama 3.2, the rights granted under Section
|
|
||||||
1(a) of the Llama 3.2 Community License Agreement are not being granted to you
|
|
||||||
if you are an individual domiciled in, or a company with a principal place of
|
|
||||||
business in, the European Union. This restriction does not apply to end users
|
|
||||||
of a product or service that incorporates any such multimodal models.\\n\\nPlease
|
|
||||||
report any violation of this Policy, software \u201Cbug,\u201D or other problems
|
|
||||||
that could lead to a violation of this Policy through one of the following means:\\n\\n\\n\\n*
|
|
||||||
Reporting issues with the model: [https://github.com/meta-llama/llama-models/issues](https://l.workplace.com/l.php?u=https%3A%2F%2Fgithub.com%2Fmeta-llama%2Fllama-models%2Fissues\\u0026h=AT0qV8W9BFT6NwihiOHRuKYQM_UnkzN_NmHMy91OT55gkLpgi4kQupHUl0ssR4dQsIQ8n3tfd0vtkobvsEvt1l4Ic6GXI2EeuHV8N08OG2WnbAmm0FL4ObkazC6G_256vN0lN9DsykCvCqGZ)\\n*
|
|
||||||
Reporting risky content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)\\n*
|
|
||||||
Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)\\n*
|
|
||||||
Reporting violations of the Acceptable Use Policy or unlicensed uses of Llama
|
|
||||||
3.2: LlamaUseReport@meta.com\\\"\\n\",\"parameters\":\"stop \\\"\\u003c|start_header_id|\\u003e\\\"\\nstop
|
|
||||||
\ \\\"\\u003c|end_header_id|\\u003e\\\"\\nstop \\\"\\u003c|eot_id|\\u003e\\\"\",\"template\":\"\\u003c|start_header_id|\\u003esystem\\u003c|end_header_id|\\u003e\\n\\nCutting
|
|
||||||
Knowledge Date: December 2023\\n\\n{{ if .System }}{{ .System }}\\n{{- end }}\\n{{-
|
|
||||||
if .Tools }}When you receive a tool call response, use the output to format
|
|
||||||
an answer to the orginal user question.\\n\\nYou are a helpful assistant with
|
|
||||||
tool calling capabilities.\\n{{- end }}\\u003c|eot_id|\\u003e\\n{{- range $i,
|
|
||||||
$_ := .Messages }}\\n{{- $last := eq (len (slice $.Messages $i)) 1 }}\\n{{-
|
|
||||||
if eq .Role \\\"user\\\" }}\\u003c|start_header_id|\\u003euser\\u003c|end_header_id|\\u003e\\n{{-
|
|
||||||
if and $.Tools $last }}\\n\\nGiven the following functions, please respond with
|
|
||||||
a JSON for a function call with its proper arguments that best answers the given
|
|
||||||
prompt.\\n\\nRespond in the format {\\\"name\\\": function name, \\\"parameters\\\":
|
|
||||||
dictionary of argument name and its value}. Do not use variables.\\n\\n{{ range
|
|
||||||
$.Tools }}\\n{{- . }}\\n{{ end }}\\n{{ .Content }}\\u003c|eot_id|\\u003e\\n{{-
|
|
||||||
else }}\\n\\n{{ .Content }}\\u003c|eot_id|\\u003e\\n{{- end }}{{ if $last }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
end }}\\n{{- else if eq .Role \\\"assistant\\\" }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n{{-
|
|
||||||
if .ToolCalls }}\\n{{ range .ToolCalls }}\\n{\\\"name\\\": \\\"{{ .Function.Name
|
|
||||||
}}\\\", \\\"parameters\\\": {{ .Function.Arguments }}}{{ end }}\\n{{- else }}\\n\\n{{
|
|
||||||
.Content }}\\n{{- end }}{{ if not $last }}\\u003c|eot_id|\\u003e{{ end }}\\n{{-
|
|
||||||
else if eq .Role \\\"tool\\\" }}\\u003c|start_header_id|\\u003eipython\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
.Content }}\\u003c|eot_id|\\u003e{{ if $last }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
end }}\\n{{- end }}\\n{{- end }}\",\"details\":{\"parent_model\":\"\",\"format\":\"gguf\",\"family\":\"llama\",\"families\":[\"llama\"],\"parameter_size\":\"3.2B\",\"quantization_level\":\"Q4_K_M\"},\"model_info\":{\"general.architecture\":\"llama\",\"general.basename\":\"Llama-3.2\",\"general.file_type\":15,\"general.finetune\":\"Instruct\",\"general.languages\":[\"en\",\"de\",\"fr\",\"it\",\"pt\",\"hi\",\"es\",\"th\"],\"general.parameter_count\":3212749888,\"general.quantization_version\":2,\"general.size_label\":\"3B\",\"general.tags\":[\"facebook\",\"meta\",\"pytorch\",\"llama\",\"llama-3\",\"text-generation\"],\"general.type\":\"model\",\"llama.attention.head_count\":24,\"llama.attention.head_count_kv\":8,\"llama.attention.key_length\":128,\"llama.attention.layer_norm_rms_epsilon\":0.00001,\"llama.attention.value_length\":128,\"llama.block_count\":28,\"llama.context_length\":131072,\"llama.embedding_length\":3072,\"llama.feed_forward_length\":8192,\"llama.rope.dimension_count\":128,\"llama.rope.freq_base\":500000,\"llama.vocab_size\":128256,\"tokenizer.ggml.bos_token_id\":128000,\"tokenizer.ggml.eos_token_id\":128009,\"tokenizer.ggml.merges\":null,\"tokenizer.ggml.model\":\"gpt2\",\"tokenizer.ggml.pre\":\"llama-bpe\",\"tokenizer.ggml.token_type\":null,\"tokenizer.ggml.tokens\":null},\"modified_at\":\"2024-12-31T11:53:14.529771974-05:00\"}"
|
|
||||||
headers:
|
|
||||||
Content-Type:
|
|
||||||
- application/json; charset=utf-8
|
|
||||||
Date:
|
|
||||||
- Tue, 31 Dec 2024 16:56:15 GMT
|
|
||||||
Transfer-Encoding:
|
|
||||||
- chunked
|
|
||||||
http_version: HTTP/1.1
|
|
||||||
status_code: 200
|
|
||||||
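The recorded user-agent (litellm/1.56.4) indicates this `/api/show` call is issued through litellm when it looks up local model metadata. For reference, a stand-alone sketch of the same interaction, with the URL, request body, and response keys taken from the fixture, looks like this:

```python
# Sketch of the /api/show interaction recorded above (illustrative, not the repo's code).
import requests

info = requests.post(
    "http://localhost:11434/api/show",
    json={"name": "llama3.2:3b"},   # same body as the recorded request
    timeout=30,
).json()

# The recorded response carries the model card: license text, Modelfile, template,
# stop parameters, and metadata such as parameter size and context length.
print(info["details"]["parameter_size"])           # "3.2B"
print(info["details"]["quantization_level"])       # "Q4_K_M"
print(info["model_info"]["llama.context_length"])  # 131072
```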
  - request:
      body: '{"name": "llama3.2:3b"}'
      headers:
        accept:
        - '*/*'
        accept-encoding:
        - gzip, deflate
        connection:
        - keep-alive
        content-length:
        - '23'
        content-type:
        - application/json
        host:
        - localhost:11434
        user-agent:
        - litellm/1.56.4
      method: POST
      uri: http://localhost:11434/api/show
    response:
      content: "{\"license\":\"LLAMA 3.2 COMMUNITY LICENSE AGREEMENT\\nLlama 3.2 Version
        Release Date: September 25, 2024 ... [second /api/show recording: the response
        repeats the same Llama 3.2 Community License Agreement, Acceptable Use Policy,
        Modelfile, template, stop parameters, and model metadata shown above]
|
|
||||||
to create, train, fine tune, or\\notherwise improve an AI model, which is distributed
|
|
||||||
or made available, you shall also include \u201CLlama\u201D\\nat the beginning
|
|
||||||
of any such AI model name.\\n\\n ii. If you receive Llama Materials,
|
|
||||||
or any derivative works thereof, from a Licensee as part\\nof an integrated
|
|
||||||
end user product, then Section 2 of this Agreement will not apply to you. \\n\\n
|
|
||||||
\ iii. You must retain in all copies of the Llama Materials that you distribute
|
|
||||||
the \\nfollowing attribution notice within a \u201CNotice\u201D text file distributed
|
|
||||||
as a part of such copies: \\n\u201CLlama 3.2 is licensed under the Llama 3.2
|
|
||||||
Community License, Copyright \xA9 Meta Platforms,\\nInc. All Rights Reserved.\u201D\\n\\n
|
|
||||||
\ iv. Your use of the Llama Materials must comply with applicable laws
|
|
||||||
and regulations\\n(including trade compliance laws and regulations) and adhere
|
|
||||||
to the Acceptable Use Policy for\\nthe Llama Materials (available at https://www.llama.com/llama3_2/use-policy),
|
|
||||||
which is hereby \\nincorporated by reference into this Agreement.\\n \\n2.
|
|
||||||
Additional Commercial Terms. If, on the Llama 3.2 version release date, the
|
|
||||||
monthly active users\\nof the products or services made available by or for
|
|
||||||
Licensee, or Licensee\u2019s affiliates, \\nis greater than 700 million monthly
|
|
||||||
active users in the preceding calendar month, you must request \\na license
|
|
||||||
from Meta, which Meta may grant to you in its sole discretion, and you are not
|
|
||||||
authorized to\\nexercise any of the rights under this Agreement unless or until
|
|
||||||
Meta otherwise expressly grants you such rights.\\n\\n3. Disclaimer of Warranty.
|
|
||||||
UNLESS REQUIRED BY APPLICABLE LAW, THE LLAMA MATERIALS AND ANY OUTPUT AND \\nRESULTS
|
|
||||||
THEREFROM ARE PROVIDED ON AN \u201CAS IS\u201D BASIS, WITHOUT WARRANTIES OF
|
|
||||||
ANY KIND, AND META DISCLAIMS\\nALL WARRANTIES OF ANY KIND, BOTH EXPRESS AND
|
|
||||||
IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES\\nOF TITLE, NON-INFRINGEMENT,
|
|
||||||
MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE\\nFOR
|
|
||||||
DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING THE LLAMA MATERIALS
|
|
||||||
AND ASSUME ANY RISKS ASSOCIATED\\nWITH YOUR USE OF THE LLAMA MATERIALS AND ANY
|
|
||||||
OUTPUT AND RESULTS.\\n\\n4. Limitation of Liability. IN NO EVENT WILL META OR
|
|
||||||
ITS AFFILIATES BE LIABLE UNDER ANY THEORY OF LIABILITY, \\nWHETHER IN CONTRACT,
|
|
||||||
TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING OUT OF THIS AGREEMENT,
|
|
||||||
\\nFOR ANY LOST PROFITS OR ANY INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL,
|
|
||||||
EXEMPLARY OR PUNITIVE DAMAGES, EVEN \\nIF META OR ITS AFFILIATES HAVE BEEN ADVISED
|
|
||||||
OF THE POSSIBILITY OF ANY OF THE FOREGOING.\\n\\n5. Intellectual Property.\\n\\n
|
|
||||||
\ a. No trademark licenses are granted under this Agreement, and in connection
|
|
||||||
with the Llama Materials, \\nneither Meta nor Licensee may use any name or mark
|
|
||||||
owned by or associated with the other or any of its affiliates, \\nexcept as
|
|
||||||
required for reasonable and customary use in describing and redistributing the
|
|
||||||
Llama Materials or as \\nset forth in this Section 5(a). Meta hereby grants
|
|
||||||
you a license to use \u201CLlama\u201D (the \u201CMark\u201D) solely as required
|
|
||||||
\\nto comply with the last sentence of Section 1.b.i. You will comply with Meta\u2019s
|
|
||||||
brand guidelines (currently accessible \\nat https://about.meta.com/brand/resources/meta/company-brand/).
|
|
||||||
All goodwill arising out of your use of the Mark \\nwill inure to the benefit
|
|
||||||
of Meta.\\n\\n b. Subject to Meta\u2019s ownership of Llama Materials and
|
|
||||||
derivatives made by or for Meta, with respect to any\\n derivative works
|
|
||||||
and modifications of the Llama Materials that are made by you, as between you
|
|
||||||
and Meta,\\n you are and will be the owner of such derivative works and modifications.\\n\\n
|
|
||||||
\ c. If you institute litigation or other proceedings against Meta or any
|
|
||||||
entity (including a cross-claim or\\n counterclaim in a lawsuit) alleging
|
|
||||||
that the Llama Materials or Llama 3.2 outputs or results, or any portion\\n
|
|
||||||
\ of any of the foregoing, constitutes infringement of intellectual property
|
|
||||||
or other rights owned or licensable\\n by you, then any licenses granted
|
|
||||||
to you under this Agreement shall terminate as of the date such litigation or\\n
|
|
||||||
\ claim is filed or instituted. You will indemnify and hold harmless Meta
|
|
||||||
from and against any claim by any third\\n party arising out of or related
|
|
||||||
to your use or distribution of the Llama Materials.\\n\\n6. Term and Termination.
|
|
||||||
The term of this Agreement will commence upon your acceptance of this Agreement
|
|
||||||
or access\\nto the Llama Materials and will continue in full force and effect
|
|
||||||
until terminated in accordance with the terms\\nand conditions herein. Meta
|
|
||||||
may terminate this Agreement if you are in breach of any term or condition of
|
|
||||||
this\\nAgreement. Upon termination of this Agreement, you shall delete and cease
|
|
||||||
use of the Llama Materials. Sections 3,\\n4 and 7 shall survive the termination
|
|
||||||
of this Agreement. \\n\\n7. Governing Law and Jurisdiction. This Agreement will
|
|
||||||
be governed and construed under the laws of the State of \\nCalifornia without
|
|
||||||
regard to choice of law principles, and the UN Convention on Contracts for the
|
|
||||||
International\\nSale of Goods does not apply to this Agreement. The courts of
|
|
||||||
California shall have exclusive jurisdiction of\\nany dispute arising out of
|
|
||||||
this Agreement.\\\"\\nLICENSE \\\"**Llama 3.2** **Acceptable Use Policy**\\n\\nMeta
|
|
||||||
is committed to promoting safe and fair use of its tools and features, including
|
|
||||||
Llama 3.2. If you access or use Llama 3.2, you agree to this Acceptable Use
|
|
||||||
Policy (\u201C**Policy**\u201D). The most recent copy of this policy can be
|
|
||||||
found at [https://www.llama.com/llama3_2/use-policy](https://www.llama.com/llama3_2/use-policy).\\n\\n**Prohibited
|
|
||||||
Uses**\\n\\nWe want everyone to use Llama 3.2 safely and responsibly. You agree
|
|
||||||
you will not use, or allow others to use, Llama 3.2 to:\\n\\n\\n\\n1. Violate
|
|
||||||
the law or others\u2019 rights, including to:\\n 1. Engage in, promote, generate,
|
|
||||||
contribute to, encourage, plan, incite, or further illegal or unlawful activity
|
|
||||||
or content, such as:\\n 1. Violence or terrorism\\n 2. Exploitation
|
|
||||||
or harm to children, including the solicitation, creation, acquisition, or dissemination
|
|
||||||
of child exploitative content or failure to report Child Sexual Abuse Material\\n
|
|
||||||
\ 3. Human trafficking, exploitation, and sexual violence\\n 4.
|
|
||||||
The illegal distribution of information or materials to minors, including obscene
|
|
||||||
materials, or failure to employ legally required age-gating in connection with
|
|
||||||
such information or materials.\\n 5. Sexual solicitation\\n 6.
|
|
||||||
Any other criminal activity\\n 1. Engage in, promote, incite, or facilitate
|
|
||||||
the harassment, abuse, threatening, or bullying of individuals or groups of
|
|
||||||
individuals\\n 2. Engage in, promote, incite, or facilitate discrimination
|
|
||||||
or other unlawful or harmful conduct in the provision of employment, employment
|
|
||||||
benefits, credit, housing, other economic benefits, or other essential goods
|
|
||||||
and services\\n 3. Engage in the unauthorized or unlicensed practice of any
|
|
||||||
profession including, but not limited to, financial, legal, medical/health,
|
|
||||||
or related professional practices\\n 4. Collect, process, disclose, generate,
|
|
||||||
or infer private or sensitive information about individuals, including information
|
|
||||||
about individuals\u2019 identity, health, or demographic information, unless
|
|
||||||
you have obtained the right to do so in accordance with applicable law\\n 5.
|
|
||||||
Engage in or facilitate any action or generate any content that infringes, misappropriates,
|
|
||||||
or otherwise violates any third-party rights, including the outputs or results
|
|
||||||
of any products or services using the Llama Materials\\n 6. Create, generate,
|
|
||||||
or facilitate the creation of malicious code, malware, computer viruses or do
|
|
||||||
anything else that could disable, overburden, interfere with or impair the proper
|
|
||||||
working, integrity, operation or appearance of a website or computer system\\n
|
|
||||||
\ 7. Engage in any action, or facilitate any action, to intentionally circumvent
|
|
||||||
or remove usage restrictions or other safety measures, or to enable functionality
|
|
||||||
disabled by Meta\\n2. Engage in, promote, incite, facilitate, or assist in the
|
|
||||||
planning or development of activities that present a risk of death or bodily
|
|
||||||
harm to individuals, including use of Llama 3.2 related to the following:\\n
|
|
||||||
\ 8. Military, warfare, nuclear industries or applications, espionage, use
|
|
||||||
for materials or activities that are subject to the International Traffic Arms
|
|
||||||
Regulations (ITAR) maintained by the United States Department of State or to
|
|
||||||
the U.S. Biological Weapons Anti-Terrorism Act of 1989 or the Chemical Weapons
|
|
||||||
Convention Implementation Act of 1997\\n 9. Guns and illegal weapons (including
|
|
||||||
weapon development)\\n 10. Illegal drugs and regulated/controlled substances\\n
|
|
||||||
\ 11. Operation of critical infrastructure, transportation technologies, or
|
|
||||||
heavy machinery\\n 12. Self-harm or harm to others, including suicide, cutting,
|
|
||||||
and eating disorders\\n 13. Any content intended to incite or promote violence,
|
|
||||||
abuse, or any infliction of bodily harm to an individual\\n3. Intentionally
|
|
||||||
deceive or mislead others, including use of Llama 3.2 related to the following:\\n
|
|
||||||
\ 14. Generating, promoting, or furthering fraud or the creation or promotion
|
|
||||||
of disinformation\\n 15. Generating, promoting, or furthering defamatory
|
|
||||||
content, including the creation of defamatory statements, images, or other content\\n
|
|
||||||
\ 16. Generating, promoting, or further distributing spam\\n 17. Impersonating
|
|
||||||
another individual without consent, authorization, or legal right\\n 18.
|
|
||||||
Representing that the use of Llama 3.2 or outputs are human-generated\\n 19.
|
|
||||||
Generating or facilitating false online engagement, including fake reviews and
|
|
||||||
other means of fake online engagement\\n4. Fail to appropriately disclose to
|
|
||||||
end users any known dangers of your AI system\\n5. Interact with third party
|
|
||||||
tools, models, or software designed to generate unlawful content or engage in
|
|
||||||
unlawful or harmful conduct and/or represent that the outputs of such tools,
|
|
||||||
models, or software are associated with Meta or Llama 3.2\\n\\nWith respect
|
|
||||||
to any multimodal models included in Llama 3.2, the rights granted under Section
|
|
||||||
1(a) of the Llama 3.2 Community License Agreement are not being granted to you
|
|
||||||
if you are an individual domiciled in, or a company with a principal place of
|
|
||||||
business in, the European Union. This restriction does not apply to end users
|
|
||||||
of a product or service that incorporates any such multimodal models.\\n\\nPlease
|
|
||||||
report any violation of this Policy, software \u201Cbug,\u201D or other problems
|
|
||||||
that could lead to a violation of this Policy through one of the following means:\\n\\n\\n\\n*
|
|
||||||
Reporting issues with the model: [https://github.com/meta-llama/llama-models/issues](https://l.workplace.com/l.php?u=https%3A%2F%2Fgithub.com%2Fmeta-llama%2Fllama-models%2Fissues\\u0026h=AT0qV8W9BFT6NwihiOHRuKYQM_UnkzN_NmHMy91OT55gkLpgi4kQupHUl0ssR4dQsIQ8n3tfd0vtkobvsEvt1l4Ic6GXI2EeuHV8N08OG2WnbAmm0FL4ObkazC6G_256vN0lN9DsykCvCqGZ)\\n*
|
|
||||||
Reporting risky content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)\\n*
|
|
||||||
Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)\\n*
|
|
||||||
Reporting violations of the Acceptable Use Policy or unlicensed uses of Llama
|
|
||||||
3.2: LlamaUseReport@meta.com\\\"\\n\",\"parameters\":\"stop \\\"\\u003c|start_header_id|\\u003e\\\"\\nstop
|
|
||||||
\ \\\"\\u003c|end_header_id|\\u003e\\\"\\nstop \\\"\\u003c|eot_id|\\u003e\\\"\",\"template\":\"\\u003c|start_header_id|\\u003esystem\\u003c|end_header_id|\\u003e\\n\\nCutting
|
|
||||||
Knowledge Date: December 2023\\n\\n{{ if .System }}{{ .System }}\\n{{- end }}\\n{{-
|
|
||||||
if .Tools }}When you receive a tool call response, use the output to format
|
|
||||||
an answer to the orginal user question.\\n\\nYou are a helpful assistant with
|
|
||||||
tool calling capabilities.\\n{{- end }}\\u003c|eot_id|\\u003e\\n{{- range $i,
|
|
||||||
$_ := .Messages }}\\n{{- $last := eq (len (slice $.Messages $i)) 1 }}\\n{{-
|
|
||||||
if eq .Role \\\"user\\\" }}\\u003c|start_header_id|\\u003euser\\u003c|end_header_id|\\u003e\\n{{-
|
|
||||||
if and $.Tools $last }}\\n\\nGiven the following functions, please respond with
|
|
||||||
a JSON for a function call with its proper arguments that best answers the given
|
|
||||||
prompt.\\n\\nRespond in the format {\\\"name\\\": function name, \\\"parameters\\\":
|
|
||||||
dictionary of argument name and its value}. Do not use variables.\\n\\n{{ range
|
|
||||||
$.Tools }}\\n{{- . }}\\n{{ end }}\\n{{ .Content }}\\u003c|eot_id|\\u003e\\n{{-
|
|
||||||
else }}\\n\\n{{ .Content }}\\u003c|eot_id|\\u003e\\n{{- end }}{{ if $last }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
end }}\\n{{- else if eq .Role \\\"assistant\\\" }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n{{-
|
|
||||||
if .ToolCalls }}\\n{{ range .ToolCalls }}\\n{\\\"name\\\": \\\"{{ .Function.Name
|
|
||||||
}}\\\", \\\"parameters\\\": {{ .Function.Arguments }}}{{ end }}\\n{{- else }}\\n\\n{{
|
|
||||||
.Content }}\\n{{- end }}{{ if not $last }}\\u003c|eot_id|\\u003e{{ end }}\\n{{-
|
|
||||||
else if eq .Role \\\"tool\\\" }}\\u003c|start_header_id|\\u003eipython\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
.Content }}\\u003c|eot_id|\\u003e{{ if $last }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
end }}\\n{{- end }}\\n{{- end }}\",\"details\":{\"parent_model\":\"\",\"format\":\"gguf\",\"family\":\"llama\",\"families\":[\"llama\"],\"parameter_size\":\"3.2B\",\"quantization_level\":\"Q4_K_M\"},\"model_info\":{\"general.architecture\":\"llama\",\"general.basename\":\"Llama-3.2\",\"general.file_type\":15,\"general.finetune\":\"Instruct\",\"general.languages\":[\"en\",\"de\",\"fr\",\"it\",\"pt\",\"hi\",\"es\",\"th\"],\"general.parameter_count\":3212749888,\"general.quantization_version\":2,\"general.size_label\":\"3B\",\"general.tags\":[\"facebook\",\"meta\",\"pytorch\",\"llama\",\"llama-3\",\"text-generation\"],\"general.type\":\"model\",\"llama.attention.head_count\":24,\"llama.attention.head_count_kv\":8,\"llama.attention.key_length\":128,\"llama.attention.layer_norm_rms_epsilon\":0.00001,\"llama.attention.value_length\":128,\"llama.block_count\":28,\"llama.context_length\":131072,\"llama.embedding_length\":3072,\"llama.feed_forward_length\":8192,\"llama.rope.dimension_count\":128,\"llama.rope.freq_base\":500000,\"llama.vocab_size\":128256,\"tokenizer.ggml.bos_token_id\":128000,\"tokenizer.ggml.eos_token_id\":128009,\"tokenizer.ggml.merges\":null,\"tokenizer.ggml.model\":\"gpt2\",\"tokenizer.ggml.pre\":\"llama-bpe\",\"tokenizer.ggml.token_type\":null,\"tokenizer.ggml.tokens\":null},\"modified_at\":\"2024-12-31T11:53:14.529771974-05:00\"}"
headers:
Content-Type:
- application/json; charset=utf-8
Date:
- Tue, 31 Dec 2024 16:56:15 GMT
Transfer-Encoding:
- chunked
http_version: HTTP/1.1
status_code: 200
version: 1
@@ -26237,7 +26237,7 @@ interactions:
answer."}, {"role": "user", "content": "I did it wrong. Invalid Format: I missed
the ''Action:'' after ''Thought:''. I will do right next, and don''t use a tool
I have already used.\n\nIf you don''t need to use any more tools, you must give
your best complete final answer, make sure it satisfies the expected criteria, use
your best complete final answer, make sure it satisfy the expect criteria, use
the EXACT format below:\n\nThought: I now can give a great answer\nFinal Answer:
my best complete final answer to the task.\n\n"}], "model": "gpt-4o"}'
headers:
@@ -26590,7 +26590,7 @@ interactions:
answer."}, {"role": "user", "content": "I did it wrong. Invalid Format: I missed
the ''Action:'' after ''Thought:''. I will do right next, and don''t use a tool
I have already used.\n\nIf you don''t need to use any more tools, you must give
your best complete final answer, make sure it satisfies the expected criteria, use
your best complete final answer, make sure it satisfy the expect criteria, use
the EXACT format below:\n\nThought: I now can give a great answer\nFinal Answer:
my best complete final answer to the task.\n\n"}, {"role": "user", "content":
"I did it wrong. Tried to both perform Action and give a Final Answer at the
@@ -26941,7 +26941,7 @@ interactions:
answer."}, {"role": "user", "content": "I did it wrong. Invalid Format: I missed
the ''Action:'' after ''Thought:''. I will do right next, and don''t use a tool
I have already used.\n\nIf you don''t need to use any more tools, you must give
your best complete final answer, make sure it satisfies the expected criteria, use
your best complete final answer, make sure it satisfy the expect criteria, use
the EXACT format below:\n\nThought: I now can give a great answer\nFinal Answer:
my best complete final answer to the task.\n\n"}, {"role": "user", "content":
"I did it wrong. Tried to both perform Action and give a Final Answer at the
@@ -27292,7 +27292,7 @@ interactions:
answer."}, {"role": "user", "content": "I did it wrong. Invalid Format: I missed
the ''Action:'' after ''Thought:''. I will do right next, and don''t use a tool
I have already used.\n\nIf you don''t need to use any more tools, you must give
your best complete final answer, make sure it satisfies the expected criteria, use
your best complete final answer, make sure it satisfy the expect criteria, use
the EXACT format below:\n\nThought: I now can give a great answer\nFinal Answer:
my best complete final answer to the task.\n\n"}, {"role": "user", "content":
"I did it wrong. Tried to both perform Action and give a Final Answer at the
@@ -27647,7 +27647,7 @@ interactions:
answer."}, {"role": "user", "content": "I did it wrong. Invalid Format: I missed
the ''Action:'' after ''Thought:''. I will do right next, and don''t use a tool
I have already used.\n\nIf you don''t need to use any more tools, you must give
your best complete final answer, make sure it satisfies the expected criteria, use
your best complete final answer, make sure it satisfy the expect criteria, use
the EXACT format below:\n\nThought: I now can give a great answer\nFinal Answer:
my best complete final answer to the task.\n\n"}, {"role": "user", "content":
"I did it wrong. Tried to both perform Action and give a Final Answer at the
@@ -28005,7 +28005,7 @@ interactions:
answer."}, {"role": "user", "content": "I did it wrong. Invalid Format: I missed
the ''Action:'' after ''Thought:''. I will do right next, and don''t use a tool
I have already used.\n\nIf you don''t need to use any more tools, you must give
your best complete final answer, make sure it satisfies the expected criteria, use
your best complete final answer, make sure it satisfy the expect criteria, use
the EXACT format below:\n\nThought: I now can give a great answer\nFinal Answer:
my best complete final answer to the task.\n\n"}, {"role": "user", "content":
"I did it wrong. Tried to both perform Action and give a Final Answer at the
@@ -28364,7 +28364,7 @@ interactions:
answer."}, {"role": "user", "content": "I did it wrong. Invalid Format: I missed
the ''Action:'' after ''Thought:''. I will do right next, and don''t use a tool
I have already used.\n\nIf you don''t need to use any more tools, you must give
your best complete final answer, make sure it satisfies the expected criteria, use
your best complete final answer, make sure it satisfy the expect criteria, use
the EXACT format below:\n\nThought: I now can give a great answer\nFinal Answer:
my best complete final answer to the task.\n\n"}, {"role": "user", "content":
"I did it wrong. Tried to both perform Action and give a Final Answer at the
@@ -28718,7 +28718,7 @@ interactions:
answer."}, {"role": "user", "content": "I did it wrong. Invalid Format: I missed
the ''Action:'' after ''Thought:''. I will do right next, and don''t use a tool
I have already used.\n\nIf you don''t need to use any more tools, you must give
your best complete final answer, make sure it satisfies the expected criteria, use
your best complete final answer, make sure it satisfy the expect criteria, use
the EXACT format below:\n\nThought: I now can give a great answer\nFinal Answer:
my best complete final answer to the task.\n\n"}, {"role": "user", "content":
"I did it wrong. Tried to both perform Action and give a Final Answer at the
@@ -29082,7 +29082,7 @@ interactions:
answer."}, {"role": "user", "content": "I did it wrong. Invalid Format: I missed
the ''Action:'' after ''Thought:''. I will do right next, and don''t use a tool
I have already used.\n\nIf you don''t need to use any more tools, you must give
your best complete final answer, make sure it satisfies the expected criteria, use
your best complete final answer, make sure it satisfy the expect criteria, use
the EXACT format below:\n\nThought: I now can give a great answer\nFinal Answer:
my best complete final answer to the task.\n\n"}, {"role": "user", "content":
"I did it wrong. Tried to both perform Action and give a Final Answer at the
@@ -29441,7 +29441,7 @@ interactions:
answer."}, {"role": "user", "content": "I did it wrong. Invalid Format: I missed
the ''Action:'' after ''Thought:''. I will do right next, and don''t use a tool
I have already used.\n\nIf you don''t need to use any more tools, you must give
your best complete final answer, make sure it satisfies the expected criteria, use
your best complete final answer, make sure it satisfy the expect criteria, use
the EXACT format below:\n\nThought: I now can give a great answer\nFinal Answer:
my best complete final answer to the task.\n\n"}, {"role": "user", "content":
"I did it wrong. Tried to both perform Action and give a Final Answer at the
@@ -29802,7 +29802,7 @@ interactions:
answer."}, {"role": "user", "content": "I did it wrong. Invalid Format: I missed
the ''Action:'' after ''Thought:''. I will do right next, and don''t use a tool
I have already used.\n\nIf you don''t need to use any more tools, you must give
your best complete final answer, make sure it satisfies the expected criteria, use
your best complete final answer, make sure it satisfy the expect criteria, use
the EXACT format below:\n\nThought: I now can give a great answer\nFinal Answer:
my best complete final answer to the task.\n\n"}, {"role": "user", "content":
"I did it wrong. Tried to both perform Action and give a Final Answer at the
@@ -30170,7 +30170,7 @@ interactions:
answer."}, {"role": "user", "content": "I did it wrong. Invalid Format: I missed
the ''Action:'' after ''Thought:''. I will do right next, and don''t use a tool
I have already used.\n\nIf you don''t need to use any more tools, you must give
your best complete final answer, make sure it satisfies the expected criteria, use
your best complete final answer, make sure it satisfy the expect criteria, use
the EXACT format below:\n\nThought: I now can give a great answer\nFinal Answer:
my best complete final answer to the task.\n\n"}, {"role": "user", "content":
"I did it wrong. Tried to both perform Action and give a Final Answer at the
@@ -30533,7 +30533,7 @@ interactions:
answer."}, {"role": "user", "content": "I did it wrong. Invalid Format: I missed
the ''Action:'' after ''Thought:''. I will do right next, and don''t use a tool
I have already used.\n\nIf you don''t need to use any more tools, you must give
your best complete final answer, make sure it satisfies the expected criteria, use
your best complete final answer, make sure it satisfy the expect criteria, use
the EXACT format below:\n\nThought: I now can give a great answer\nFinal Answer:
my best complete final answer to the task.\n\n"}, {"role": "user", "content":
"I did it wrong. Tried to both perform Action and give a Final Answer at the
@@ -30907,7 +30907,7 @@ interactions:
answer."}, {"role": "user", "content": "I did it wrong. Invalid Format: I missed
the ''Action:'' after ''Thought:''. I will do right next, and don''t use a tool
I have already used.\n\nIf you don''t need to use any more tools, you must give
your best complete final answer, make sure it satisfies the expected criteria, use
your best complete final answer, make sure it satisfy the expect criteria, use
the EXACT format below:\n\nThought: I now can give a great answer\nFinal Answer:
my best complete final answer to the task.\n\n"}, {"role": "user", "content":
"I did it wrong. Tried to both perform Action and give a Final Answer at the
@@ -31273,7 +31273,7 @@ interactions:
answer."}, {"role": "user", "content": "I did it wrong. Invalid Format: I missed
the ''Action:'' after ''Thought:''. I will do right next, and don''t use a tool
I have already used.\n\nIf you don''t need to use any more tools, you must give
your best complete final answer, make sure it satisfies the expected criteria, use
your best complete final answer, make sure it satisfy the expect criteria, use
the EXACT format below:\n\nThought: I now can give a great answer\nFinal Answer:
my best complete final answer to the task.\n\n"}, {"role": "user", "content":
"I did it wrong. Tried to both perform Action and give a Final Answer at the
@@ -31644,7 +31644,7 @@ interactions:
answer."}, {"role": "user", "content": "I did it wrong. Invalid Format: I missed
the ''Action:'' after ''Thought:''. I will do right next, and don''t use a tool
I have already used.\n\nIf you don''t need to use any more tools, you must give
your best complete final answer, make sure it satisfies the expected criteria, use
your best complete final answer, make sure it satisfy the expect criteria, use
the EXACT format below:\n\nThought: I now can give a great answer\nFinal Answer:
my best complete final answer to the task.\n\n"}, {"role": "user", "content":
"I did it wrong. Tried to both perform Action and give a Final Answer at the
@@ -32015,7 +32015,7 @@ interactions:
answer."}, {"role": "user", "content": "I did it wrong. Invalid Format: I missed
the ''Action:'' after ''Thought:''. I will do right next, and don''t use a tool
I have already used.\n\nIf you don''t need to use any more tools, you must give
your best complete final answer, make sure it satisfies the expected criteria, use
your best complete final answer, make sure it satisfy the expect criteria, use
the EXACT format below:\n\nThought: I now can give a great answer\nFinal Answer:
my best complete final answer to the task.\n\n"}, {"role": "user", "content":
"I did it wrong. Tried to both perform Action and give a Final Answer at the
@@ -247,7 +247,7 @@ interactions:
{"role": "user", "content": "I did it wrong. Invalid Format: I missed the ''Action:''
after ''Thought:''. I will do right next, and don''t use a tool I have already
used.\n\nIf you don''t need to use any more tools, you must give your best complete
final answer, make sure it satisfies the expected criteria, use the EXACT format
final answer, make sure it satisfy the expect criteria, use the EXACT format
below:\n\nThought: I now can give a great answer\nFinal Answer: my best complete
final answer to the task.\n\n"}], "model": "o1-preview"}'
headers:
File diff suppressed because one or more lines are too long
397
tests/cassettes/test_agent_with_ollama_gemma.yaml
Normal file
397
tests/cassettes/test_agent_with_ollama_gemma.yaml
Normal file
@@ -0,0 +1,397 @@
|
|||||||
|
interactions:
|
||||||
|
- request:
|
||||||
|
body: !!binary |
|
||||||
|
CumTAQokCiIKDHNlcnZpY2UubmFtZRISChBjcmV3QUktdGVsZW1ldHJ5Er+TAQoSChBjcmV3YWku
|
||||||
|
dGVsZW1ldHJ5EqoHChDvqD2QZooz9BkEwtbWjp4OEgjxh72KACHvZSoMQ3JldyBDcmVhdGVkMAE5
|
||||||
|
qMhNnvBM+BdBcO9PnvBM+BdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC42MS4wShoKDnB5dGhvbl92
|
||||||
|
ZXJzaW9uEggKBjMuMTEuN0ouCghjcmV3X2tleRIiCiBkNTUxMTNiZTRhYTQxYmE2NDNkMzI2MDQy
|
||||||
|
YjJmMDNmMUoxCgdjcmV3X2lkEiYKJGY4YTA1OTA1LTk0OGEtNDQ0YS04NmJmLTJiNTNiNDkyYjgy
|
||||||
|
MkocCgxjcmV3X3Byb2Nlc3MSDAoKc2VxdWVudGlhbEoRCgtjcmV3X21lbW9yeRICEABKGgoUY3Jl
|
||||||
|
d19udW1iZXJfb2ZfdGFza3MSAhgBShsKFWNyZXdfbnVtYmVyX29mX2FnZW50cxICGAFKxwIKC2Ny
|
||||||
|
ZXdfYWdlbnRzErcCCrQCW3sia2V5IjogImUxNDhlNTMyMDI5MzQ5OWY4Y2ViZWE4MjZlNzI1ODJi
|
||||||
|
IiwgImlkIjogIjg1MGJjNWUwLTk4NTctNDhkOC1iNWZlLTJmZjk2OWExYTU3YiIsICJyb2xlIjog
|
||||||
|
InRlc3Qgcm9sZSIsICJ2ZXJib3NlPyI6IHRydWUsICJtYXhfaXRlciI6IDQsICJtYXhfcnBtIjog
|
||||||
|
MTAsICJmdW5jdGlvbl9jYWxsaW5nX2xsbSI6ICIiLCAibGxtIjogImdwdC00byIsICJkZWxlZ2F0
|
||||||
|
aW9uX2VuYWJsZWQ/IjogZmFsc2UsICJhbGxvd19jb2RlX2V4ZWN1dGlvbj8iOiBmYWxzZSwgIm1h
|
||||||
|
eF9yZXRyeV9saW1pdCI6IDIsICJ0b29sc19uYW1lcyI6IFtdfV1KkAIKCmNyZXdfdGFza3MSgQIK
|
||||||
|
/gFbeyJrZXkiOiAiNGEzMWI4NTEzM2EzYTI5NGM2ODUzZGE3NTdkNGJhZTciLCAiaWQiOiAiOTc1
|
||||||
|
ZDgwMjItMWJkMS00NjBlLTg2NmEtYjJmZGNiYjA4ZDliIiwgImFzeW5jX2V4ZWN1dGlvbj8iOiBm
|
||||||
|
YWxzZSwgImh1bWFuX2lucHV0PyI6IGZhbHNlLCAiYWdlbnRfcm9sZSI6ICJ0ZXN0IHJvbGUiLCAi
|
||||||
|
YWdlbnRfa2V5IjogImUxNDhlNTMyMDI5MzQ5OWY4Y2ViZWE4MjZlNzI1ODJiIiwgInRvb2xzX25h
|
||||||
|
bWVzIjogWyJnZXRfZmluYWxfYW5zd2VyIl19XXoCGAGFAQABAAASjgIKEP9UYSAOFQbZquSppN1j
|
||||||
|
IeUSCAgZmXUoJKFmKgxUYXNrIENyZWF0ZWQwATloPV+e8Ez4F0GYsl+e8Ez4F0ouCghjcmV3X2tl
|
||||||
|
eRIiCiBkNTUxMTNiZTRhYTQxYmE2NDNkMzI2MDQyYjJmMDNmMUoxCgdjcmV3X2lkEiYKJGY4YTA1
|
||||||
|
OTA1LTk0OGEtNDQ0YS04NmJmLTJiNTNiNDkyYjgyMkouCgh0YXNrX2tleRIiCiA0YTMxYjg1MTMz
|
||||||
|
YTNhMjk0YzY4NTNkYTc1N2Q0YmFlN0oxCgd0YXNrX2lkEiYKJDk3NWQ4MDIyLTFiZDEtNDYwZS04
|
||||||
|
NjZhLWIyZmRjYmIwOGQ5YnoCGAGFAQABAAASkwEKEEfiywgqgiUXE3KoUbrnHDQSCGmv+iM7Wc1Z
|
||||||
|
KgpUb29sIFVzYWdlMAE5kOybnvBM+BdBIM+cnvBM+BdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC42
|
||||||
|
MS4wSh8KCXRvb2xfbmFtZRISChBnZXRfZmluYWxfYW5zd2VySg4KCGF0dGVtcHRzEgIYAXoCGAGF
|
||||||
|
AQABAAASkwEKEH7AHXpfmvwIkA45HB8YyY0SCAFRC+uJpsEZKgpUb29sIFVzYWdlMAE56PLdnvBM
|
||||||
|
+BdBYFbfnvBM+BdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC42MS4wSh8KCXRvb2xfbmFtZRISChBn
|
||||||
|
ZXRfZmluYWxfYW5zd2VySg4KCGF0dGVtcHRzEgIYAXoCGAGFAQABAAASkwEKEIDKKEbYU4lcJF+a
|
||||||
|
WsAVZwESCI+/La7oL86MKgpUb29sIFVzYWdlMAE5yIkgn/BM+BdBWGwhn/BM+BdKGgoOY3Jld2Fp
|
||||||
|
X3ZlcnNpb24SCAoGMC42MS4wSh8KCXRvb2xfbmFtZRISChBnZXRfZmluYWxfYW5zd2VySg4KCGF0
|
||||||
|
dGVtcHRzEgIYAXoCGAGFAQABAAASnAEKEMTZ2IhpLz6J2hJhHBQ8/M4SCEuWz+vjzYifKhNUb29s
|
||||||
|
IFJlcGVhdGVkIFVzYWdlMAE5mAVhn/BM+BdBKOhhn/BM+BdKGgoOY3Jld2FpX3ZlcnNpb24SCAoG
|
||||||
|
MC42MS4wSh8KCXRvb2xfbmFtZRISChBnZXRfZmluYWxfYW5zd2VySg4KCGF0dGVtcHRzEgIYAXoC
|
||||||
|
GAGFAQABAAASkAIKED8C+t95p855kLcXs5Nnt/sSCM4XAhL6u8O8Kg5UYXNrIEV4ZWN1dGlvbjAB
|
||||||
|
OdD8X57wTPgXQUgno5/wTPgXSi4KCGNyZXdfa2V5EiIKIGQ1NTExM2JlNGFhNDFiYTY0M2QzMjYw
|
||||||
|
NDJiMmYwM2YxSjEKB2NyZXdfaWQSJgokZjhhMDU5MDUtOTQ4YS00NDRhLTg2YmYtMmI1M2I0OTJi
|
||||||
|
ODIySi4KCHRhc2tfa2V5EiIKIDRhMzFiODUxMzNhM2EyOTRjNjg1M2RhNzU3ZDRiYWU3SjEKB3Rh
|
||||||
|
c2tfaWQSJgokOTc1ZDgwMjItMWJkMS00NjBlLTg2NmEtYjJmZGNiYjA4ZDliegIYAYUBAAEAABLO
|
||||||
|
CwoQFlnZCfbZ3Dj0L9TAE5LrLBIIoFr7BZErFNgqDENyZXcgQ3JlYXRlZDABOVhDDaDwTPgXQSg/
|
||||||
|
D6DwTPgXShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuNjEuMEoaCg5weXRob25fdmVyc2lvbhIICgYz
|
||||||
|
LjExLjdKLgoIY3Jld19rZXkSIgogOTRjMzBkNmMzYjJhYzhmYjk0YjJkY2ZjNTcyZDBmNTlKMQoH
|
||||||
|
Y3Jld19pZBImCiQyMzM2MzRjNi1lNmQ2LTQ5ZTYtODhhZS1lYWUxYTM5YjBlMGZKHAoMY3Jld19w
|
||||||
|
cm9jZXNzEgwKCnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdfbnVtYmVyX29m
|
||||||
|
X3Rhc2tzEgIYAkobChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgCSv4ECgtjcmV3X2FnZW50cxLu
|
||||||
|
BArrBFt7ImtleSI6ICJlMTQ4ZTUzMjAyOTM0OTlmOGNlYmVhODI2ZTcyNTgyYiIsICJpZCI6ICI0
|
||||||
|
MjAzZjIyYi0wNWM3LTRiNjUtODBjMS1kM2Y0YmFlNzZhNDYiLCAicm9sZSI6ICJ0ZXN0IHJvbGUi
|
||||||
|
LCAidmVyYm9zZT8iOiB0cnVlLCAibWF4X2l0ZXIiOiAyLCAibWF4X3JwbSI6IDEwLCAiZnVuY3Rp
|
||||||
|
b25fY2FsbGluZ19sbG0iOiAiIiwgImxsbSI6ICJncHQtNG8iLCAiZGVsZWdhdGlvbl9lbmFibGVk
|
||||||
|
PyI6IGZhbHNlLCAiYWxsb3dfY29kZV9leGVjdXRpb24/IjogZmFsc2UsICJtYXhfcmV0cnlfbGlt
|
||||||
|
aXQiOiAyLCAidG9vbHNfbmFtZXMiOiBbXX0sIHsia2V5IjogImU3ZThlZWE4ODZiY2I4ZjEwNDVh
|
||||||
|
YmVlY2YxNDI1ZGI3IiwgImlkIjogImZjOTZjOTQ1LTY4ZDUtNDIxMy05NmNkLTNmYTAwNmUyZTYz
|
||||||
|
MCIsICJyb2xlIjogInRlc3Qgcm9sZTIiLCAidmVyYm9zZT8iOiB0cnVlLCAibWF4X2l0ZXIiOiAx
|
||||||
|
LCAibWF4X3JwbSI6IG51bGwsICJmdW5jdGlvbl9jYWxsaW5nX2xsbSI6ICIiLCAibGxtIjogImdw
|
||||||
|
dC00byIsICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogZmFsc2UsICJhbGxvd19jb2RlX2V4ZWN1dGlv
|
||||||
|
bj8iOiBmYWxzZSwgIm1heF9yZXRyeV9saW1pdCI6IDIsICJ0b29sc19uYW1lcyI6IFtdfV1K/QMK
|
||||||
|
CmNyZXdfdGFza3MS7gMK6wNbeyJrZXkiOiAiMzIyZGRhZTNiYzgwYzFkNDViODVmYTc3NTZkYjg2
|
||||||
|
NjUiLCAiaWQiOiAiOTVjYTg4NDItNmExMi00MGQ5LWIwZDItNGI0MzYxYmJlNTZkIiwgImFzeW5j
|
||||||
|
X2V4ZWN1dGlvbj8iOiBmYWxzZSwgImh1bWFuX2lucHV0PyI6IGZhbHNlLCAiYWdlbnRfcm9sZSI6
|
||||||
|
ICJ0ZXN0IHJvbGUiLCAiYWdlbnRfa2V5IjogImUxNDhlNTMyMDI5MzQ5OWY4Y2ViZWE4MjZlNzI1
|
||||||
|
ODJiIiwgInRvb2xzX25hbWVzIjogW119LCB7ImtleSI6ICI1ZTljYTdkNjRiNDIwNWJiN2M0N2Uw
|
||||||
|
YjNmY2I1ZDIxZiIsICJpZCI6ICI5NzI5MTg2Yy1kN2JlLTRkYjQtYTk0ZS02OWU5OTk2NTI3MDAi
|
||||||
|
LCAiYXN5bmNfZXhlY3V0aW9uPyI6IGZhbHNlLCAiaHVtYW5faW5wdXQ/IjogZmFsc2UsICJhZ2Vu
|
||||||
|
dF9yb2xlIjogInRlc3Qgcm9sZTIiLCAiYWdlbnRfa2V5IjogImU3ZThlZWE4ODZiY2I4ZjEwNDVh
|
||||||
|
YmVlY2YxNDI1ZGI3IiwgInRvb2xzX25hbWVzIjogWyJnZXRfZmluYWxfYW5zd2VyIl19XXoCGAGF
|
||||||
|
AQABAAASjgIKEC/YM2OukRrSg+ZAev4VhGESCOQ5RvzSS5IEKgxUYXNrIENyZWF0ZWQwATmQJx6g
|
||||||
|
8Ez4F0EgjR6g8Ez4F0ouCghjcmV3X2tleRIiCiA5NGMzMGQ2YzNiMmFjOGZiOTRiMmRjZmM1NzJk
|
||||||
|
MGY1OUoxCgdjcmV3X2lkEiYKJDIzMzYzNGM2LWU2ZDYtNDllNi04OGFlLWVhZTFhMzliMGUwZkou
|
||||||
|
Cgh0YXNrX2tleRIiCiAzMjJkZGFlM2JjODBjMWQ0NWI4NWZhNzc1NmRiODY2NUoxCgd0YXNrX2lk
|
||||||
|
EiYKJDk1Y2E4ODQyLTZhMTItNDBkOS1iMGQyLTRiNDM2MWJiZTU2ZHoCGAGFAQABAAASkAIKEHqZ
|
||||||
|
L8s3clXQyVTemNcTCcQSCA0tzK95agRQKg5UYXNrIEV4ZWN1dGlvbjABOQC8HqDwTPgXQdgNSqDw
|
||||||
|
TPgXSi4KCGNyZXdfa2V5EiIKIDk0YzMwZDZjM2IyYWM4ZmI5NGIyZGNmYzU3MmQwZjU5SjEKB2Ny
|
||||||
|
ZXdfaWQSJgokMjMzNjM0YzYtZTZkNi00OWU2LTg4YWUtZWFlMWEzOWIwZTBmSi4KCHRhc2tfa2V5
|
||||||
|
EiIKIDMyMmRkYWUzYmM4MGMxZDQ1Yjg1ZmE3NzU2ZGI4NjY1SjEKB3Rhc2tfaWQSJgokOTVjYTg4
|
||||||
|
NDItNmExMi00MGQ5LWIwZDItNGI0MzYxYmJlNTZkegIYAYUBAAEAABKOAgoQjhKzodMUmQ8NWtdy
|
||||||
|
Uj99whIIBsGtAymZibwqDFRhc2sgQ3JlYXRlZDABOXjVVaDwTPgXQXhSVqDwTPgXSi4KCGNyZXdf
|
||||||
|
a2V5EiIKIDk0YzMwZDZjM2IyYWM4ZmI5NGIyZGNmYzU3MmQwZjU5SjEKB2NyZXdfaWQSJgokMjMz
|
||||||
|
NjM0YzYtZTZkNi00OWU2LTg4YWUtZWFlMWEzOWIwZTBmSi4KCHRhc2tfa2V5EiIKIDVlOWNhN2Q2
|
||||||
|
NGI0MjA1YmI3YzQ3ZTBiM2ZjYjVkMjFmSjEKB3Rhc2tfaWQSJgokOTcyOTE4NmMtZDdiZS00ZGI0
|
||||||
|
LWE5NGUtNjllOTk5NjUyNzAwegIYAYUBAAEAABKTAQoQx5IUsjAFMGNUaz5MHy20OBIIzl2tr25P
|
||||||
|
LL8qClRvb2wgVXNhZ2UwATkgt5Sg8Ez4F0GwFpag8Ez4F0oaCg5jcmV3YWlfdmVyc2lvbhIICgYw
|
||||||
|
LjYxLjBKHwoJdG9vbF9uYW1lEhIKEGdldF9maW5hbF9hbnN3ZXJKDgoIYXR0ZW1wdHMSAhgBegIY
|
||||||
|
AYUBAAEAABKQAgoQEkfcfCrzTYIM6GQXhknlexIIa/oxeT78OL8qDlRhc2sgRXhlY3V0aW9uMAE5
|
||||||
|
WIFWoPBM+BdBuL/GoPBM+BdKLgoIY3Jld19rZXkSIgogOTRjMzBkNmMzYjJhYzhmYjk0YjJkY2Zj
|
||||||
|
NTcyZDBmNTlKMQoHY3Jld19pZBImCiQyMzM2MzRjNi1lNmQ2LTQ5ZTYtODhhZS1lYWUxYTM5YjBl
|
||||||
|
MGZKLgoIdGFza19rZXkSIgogNWU5Y2E3ZDY0YjQyMDViYjdjNDdlMGIzZmNiNWQyMWZKMQoHdGFz
|
||||||
|
a19pZBImCiQ5NzI5MTg2Yy1kN2JlLTRkYjQtYTk0ZS02OWU5OTk2NTI3MDB6AhgBhQEAAQAAEqwH
|
||||||
|
ChDrKBdEe+Z5276g9fgg6VzjEgiJfnDwsv1SrCoMQ3JldyBDcmVhdGVkMAE5MLQYofBM+BdBQFIa
|
||||||
|
ofBM+BdKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC42MS4wShoKDnB5dGhvbl92ZXJzaW9uEggKBjMu
|
||||||
|
MTEuN0ouCghjcmV3X2tleRIiCiA3M2FhYzI4NWU2NzQ2NjY3Zjc1MTQ3NjcwMDAzNDExMEoxCgdj
|
||||||
|
cmV3X2lkEiYKJDg0NDY0YjhlLTRiZjctNDRiYy05MmUxLWE4ZDE1NGZlNWZkN0ocCgxjcmV3X3By
|
||||||
|
b2Nlc3MSDAoKc2VxdWVudGlhbEoRCgtjcmV3X21lbW9yeRICEABKGgoUY3Jld19udW1iZXJfb2Zf
|
||||||
|
dGFza3MSAhgBShsKFWNyZXdfbnVtYmVyX29mX2FnZW50cxICGAFKyQIKC2NyZXdfYWdlbnRzErkC
|
||||||
|
CrYCW3sia2V5IjogImUxNDhlNTMyMDI5MzQ5OWY4Y2ViZWE4MjZlNzI1ODJiIiwgImlkIjogIjk4
|
||||||
|
YmIwNGYxLTBhZGMtNGZiNC04YzM2LWM3M2Q1MzQ1ZGRhZCIsICJyb2xlIjogInRlc3Qgcm9sZSIs
|
||||||
|
ICJ2ZXJib3NlPyI6IHRydWUsICJtYXhfaXRlciI6IDEsICJtYXhfcnBtIjogbnVsbCwgImZ1bmN0
|
||||||
|
aW9uX2NhbGxpbmdfbGxtIjogIiIsICJsbG0iOiAiZ3B0LTRvIiwgImRlbGVnYXRpb25fZW5hYmxl
|
||||||
|
ZD8iOiBmYWxzZSwgImFsbG93X2NvZGVfZXhlY3V0aW9uPyI6IGZhbHNlLCAibWF4X3JldHJ5X2xp
|
||||||
|
bWl0IjogMiwgInRvb2xzX25hbWVzIjogW119XUqQAgoKY3Jld190YXNrcxKBAgr+AVt7ImtleSI6
|
||||||
|
ICJmN2E5ZjdiYjFhZWU0YjZlZjJjNTI2ZDBhOGMyZjJhYyIsICJpZCI6ICIxZjRhYzJhYS03YmQ4
|
||||||
|
LTQ1NWQtODgyMC1jMzZmMjJjMDY4MzciLCAiYXN5bmNfZXhlY3V0aW9uPyI6IGZhbHNlLCAiaHVt
|
||||||
|
YW5faW5wdXQ/IjogZmFsc2UsICJhZ2VudF9yb2xlIjogInRlc3Qgcm9sZSIsICJhZ2VudF9rZXki
|
||||||
|
OiAiZTE0OGU1MzIwMjkzNDk5ZjhjZWJlYTgyNmU3MjU4MmIiLCAidG9vbHNfbmFtZXMiOiBbImdl
|
||||||
|
dF9maW5hbF9hbnN3ZXIiXX1degIYAYUBAAEAABKOAgoQ0/vrakH7zD0uSvmVBUV8lxIIYe4YKcYG
|
||||||
|
hNgqDFRhc2sgQ3JlYXRlZDABOdBXKqHwTPgXQcCtKqHwTPgXSi4KCGNyZXdfa2V5EiIKIDczYWFj
|
||||||
|
Mjg1ZTY3NDY2NjdmNzUxNDc2NzAwMDM0MTEwSjEKB2NyZXdfaWQSJgokODQ0NjRiOGUtNGJmNy00
|
||||||
|
NGJjLTkyZTEtYThkMTU0ZmU1ZmQ3Si4KCHRhc2tfa2V5EiIKIGY3YTlmN2JiMWFlZTRiNmVmMmM1
|
||||||
|
MjZkMGE4YzJmMmFjSjEKB3Rhc2tfaWQSJgokMWY0YWMyYWEtN2JkOC00NTVkLTg4MjAtYzM2ZjIy
|
||||||
|
YzA2ODM3egIYAYUBAAEAABKkAQoQ5GDzHNlSdlcVDdxsI3abfRIIhYu8fZS3iA4qClRvb2wgVXNh
|
||||||
|
Z2UwATnIi2eh8Ez4F0FYbmih8Ez4F0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjYxLjBKHwoJdG9v
|
||||||
|
bF9uYW1lEhIKEGdldF9maW5hbF9hbnN3ZXJKDgoIYXR0ZW1wdHMSAhgBSg8KA2xsbRIICgZncHQt
|
||||||
|
NG96AhgBhQEAAQAAEpACChAy85Jfr/EEIe1THU8koXoYEgjlkNn7xfysjioOVGFzayBFeGVjdXRp
|
||||||
|
b24wATm42Cqh8Ez4F0GgxZah8Ez4F0ouCghjcmV3X2tleRIiCiA3M2FhYzI4NWU2NzQ2NjY3Zjc1
|
||||||
|
MTQ3NjcwMDAzNDExMEoxCgdjcmV3X2lkEiYKJDg0NDY0YjhlLTRiZjctNDRiYy05MmUxLWE4ZDE1
|
||||||
|
NGZlNWZkN0ouCgh0YXNrX2tleRIiCiBmN2E5ZjdiYjFhZWU0YjZlZjJjNTI2ZDBhOGMyZjJhY0ox
|
||||||
|
Cgd0YXNrX2lkEiYKJDFmNGFjMmFhLTdiZDgtNDU1ZC04ODIwLWMzNmYyMmMwNjgzN3oCGAGFAQAB
|
||||||
|
AAASrAcKEG0ZVq5Ww+/A0wOY3HmKgq4SCMe0ooxqjqBlKgxDcmV3IENyZWF0ZWQwATlwmISi8Ez4
|
||||||
|
F0HYUYai8Ez4F0oaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjYxLjBKGgoOcHl0aG9uX3ZlcnNpb24S
|
||||||
|
CAoGMy4xMS43Si4KCGNyZXdfa2V5EiIKIGQ1NTExM2JlNGFhNDFiYTY0M2QzMjYwNDJiMmYwM2Yx
|
||||||
|
SjEKB2NyZXdfaWQSJgokNzkyMWVlYmItMWI4NS00MzNjLWIxMDAtZDU4MmMyOTg5MzBkShwKDGNy
|
||||||
|
ZXdfcHJvY2VzcxIMCgpzZXF1ZW50aWFsShEKC2NyZXdfbWVtb3J5EgIQAEoaChRjcmV3X251bWJl
|
||||||
|
cl9vZl90YXNrcxICGAFKGwoVY3Jld19udW1iZXJfb2ZfYWdlbnRzEgIYAUrJAgoLY3Jld19hZ2Vu
|
||||||
|
      [base64-encoded OTLP protobuf trace payload, continued; ends "AA=="]
    headers:
      Accept:
      - '*/*'
      Accept-Encoding:
      - gzip, deflate
      Connection:
      - keep-alive
      Content-Length:
      - '18925'
      Content-Type:
      - application/x-protobuf
      User-Agent:
      - OTel-OTLP-Exporter-Python/1.27.0
    method: POST
    uri: https://telemetry.crewai.com:4319/v1/traces
  response:
    body:
      string: "\n\0"
    headers:
      Content-Length:
      - '2'
      Content-Type:
      - application/x-protobuf
      Date:
      - Tue, 24 Sep 2024 21:57:39 GMT
    status:
      code: 200
      message: OK
- request:
    body: '{"model": "gemma2:latest", "prompt": "### User:\nRespond in 20 words. Who
      are you?\n\n", "options": {}, "stream": false}'
    headers:
      Accept:
      - '*/*'
      Accept-Encoding:
      - gzip, deflate
      Connection:
      - keep-alive
      Content-Length:
      - '120'
      Content-Type:
      - application/json
      User-Agent:
      - python-requests/2.31.0
    method: POST
    uri: http://localhost:8080/api/generate
  response:
    body:
      string: '{"model":"gemma2:latest","created_at":"2024-09-24T21:57:51.284303Z","response":"I
        am Gemma, an open-weights AI assistant developed by Google DeepMind. \n","done":true,"done_reason":"stop","context":[106,1645,108,6176,4926,235292,108,54657,575,235248,235284,235276,3907,235265,7702,708,692,235336,109,107,108,106,2516,108,235285,1144,137061,235269,671,2174,235290,30316,16481,20409,6990,731,6238,20555,35777,235265,139,108],"total_duration":14046647083,"load_duration":12942541833,"prompt_eval_count":25,"prompt_eval_duration":177695000,"eval_count":19,"eval_duration":923120000}'
    headers:
      Content-Length:
      - '579'
      Content-Type:
      - application/json; charset=utf-8
      Date:
      - Tue, 24 Sep 2024 21:57:51 GMT
    status:
      code: 200
      message: OK
version: 1
@@ -1,863 +0,0 @@
interactions:
- request:
    body: '{"model": "llama3.2:3b", "prompt": "### User:\nRespond in 20 words. Who
      are you?\n\n", "options": {"stop": ["\nObservation:"]}, "stream": false}'
    headers:
      accept:
      - '*/*'
      accept-encoding:
      - gzip, deflate
      connection:
      - keep-alive
      content-length:
      - '144'
      host:
      - localhost:11434
      user-agent:
      - litellm/1.56.4
    method: POST
    uri: http://localhost:11434/api/generate
  response:
    content: '{"model":"llama3.2:3b","created_at":"2024-12-31T16:57:54.063894Z","response":"I''m
      an AI designed to provide information and assist with inquiries, while maintaining
      a neutral and respectful tone always.","done":true,"done_reason":"stop","context":[128006,9125,128007,271,38766,1303,33025,2696,25,6790,220,2366,18,271,128009,128006,882,128007,271,14711,2724,512,66454,304,220,508,4339,13,10699,527,499,1980,128009,128006,78191,128007,271,40,2846,459,15592,6319,311,3493,2038,323,7945,449,44983,11,1418,20958,264,21277,323,49150,16630,2744,13],"total_duration":651386042,"load_duration":41061917,"prompt_eval_count":38,"prompt_eval_duration":204000000,"eval_count":23,"eval_duration":405000000}'
    headers:
      Content-Length:
      - '692'
      Content-Type:
      - application/json; charset=utf-8
      Date:
      - Tue, 31 Dec 2024 16:57:54 GMT
    http_version: HTTP/1.1
    status_code: 200
- request:
    body: '{"name": "llama3.2:3b"}'
    headers:
      accept:
      - '*/*'
      accept-encoding:
      - gzip, deflate
      connection:
      - keep-alive
      content-length:
      - '23'
      content-type:
      - application/json
      host:
      - localhost:11434
      user-agent:
      - litellm/1.56.4
    method: POST
    uri: http://localhost:11434/api/show
  response:
content: "{\"license\":\"LLAMA 3.2 COMMUNITY LICENSE AGREEMENT\\nLlama 3.2 Version
|
|
||||||
Release Date: September 25, 2024\\n\\n\u201CAgreement\u201D means the terms
|
|
||||||
and conditions for use, reproduction, distribution \\nand modification of the
|
|
||||||
Llama Materials set forth herein.\\n\\n\u201CDocumentation\u201D means the specifications,
|
|
||||||
manuals and documentation accompanying Llama 3.2\\ndistributed by Meta at https://llama.meta.com/doc/overview.\\n\\n\u201CLicensee\u201D
|
|
||||||
or \u201Cyou\u201D means you, or your employer or any other person or entity
|
|
||||||
(if you are \\nentering into this Agreement on such person or entity\u2019s
|
|
||||||
behalf), of the age required under\\napplicable laws, rules or regulations to
|
|
||||||
provide legal consent and that has legal authority\\nto bind your employer or
|
|
||||||
such other person or entity if you are entering in this Agreement\\non their
|
|
||||||
behalf.\\n\\n\u201CLlama 3.2\u201D means the foundational large language models
|
|
||||||
and software and algorithms, including\\nmachine-learning model code, trained
|
|
||||||
model weights, inference-enabling code, training-enabling code,\\nfine-tuning
|
|
||||||
enabling code and other elements of the foregoing distributed by Meta at \\nhttps://www.llama.com/llama-downloads.\\n\\n\u201CLlama
|
|
||||||
Materials\u201D means, collectively, Meta\u2019s proprietary Llama 3.2 and Documentation
|
|
||||||
(and \\nany portion thereof) made available under this Agreement.\\n\\n\u201CMeta\u201D
|
|
||||||
or \u201Cwe\u201D means Meta Platforms Ireland Limited (if you are located in
|
|
||||||
or, \\nif you are an entity, your principal place of business is in the EEA
|
|
||||||
or Switzerland) \\nand Meta Platforms, Inc. (if you are located outside of the
|
|
||||||
EEA or Switzerland). \\n\\n\\nBy clicking \u201CI Accept\u201D below or by using
|
|
||||||
or distributing any portion or element of the Llama Materials,\\nyou agree to
|
|
||||||
be bound by this Agreement.\\n\\n\\n1. License Rights and Redistribution.\\n\\n
|
|
||||||
\ a. Grant of Rights. You are granted a non-exclusive, worldwide, \\nnon-transferable
|
|
||||||
and royalty-free limited license under Meta\u2019s intellectual property or
|
|
||||||
other rights \\nowned by Meta embodied in the Llama Materials to use, reproduce,
|
|
||||||
distribute, copy, create derivative works \\nof, and make modifications to the
|
|
||||||
Llama Materials. \\n\\n b. Redistribution and Use. \\n\\n i. If
|
|
||||||
you distribute or make available the Llama Materials (or any derivative works
|
|
||||||
thereof), \\nor a product or service (including another AI model) that contains
|
|
||||||
any of them, you shall (A) provide\\na copy of this Agreement with any such
|
|
||||||
Llama Materials; and (B) prominently display \u201CBuilt with Llama\u201D\\non
|
|
||||||
a related website, user interface, blogpost, about page, or product documentation.
|
|
||||||
If you use the\\nLlama Materials or any outputs or results of the Llama Materials
|
|
||||||
to create, train, fine tune, or\\notherwise improve an AI model, which is distributed
|
|
||||||
or made available, you shall also include \u201CLlama\u201D\\nat the beginning
|
|
||||||
of any such AI model name.\\n\\n ii. If you receive Llama Materials,
|
|
||||||
or any derivative works thereof, from a Licensee as part\\nof an integrated
|
|
||||||
end user product, then Section 2 of this Agreement will not apply to you. \\n\\n
|
|
||||||
\ iii. You must retain in all copies of the Llama Materials that you distribute
|
|
||||||
the \\nfollowing attribution notice within a \u201CNotice\u201D text file distributed
|
|
||||||
as a part of such copies: \\n\u201CLlama 3.2 is licensed under the Llama 3.2
|
|
||||||
Community License, Copyright \xA9 Meta Platforms,\\nInc. All Rights Reserved.\u201D\\n\\n
|
|
||||||
\ iv. Your use of the Llama Materials must comply with applicable laws
|
|
||||||
and regulations\\n(including trade compliance laws and regulations) and adhere
|
|
||||||
to the Acceptable Use Policy for\\nthe Llama Materials (available at https://www.llama.com/llama3_2/use-policy),
|
|
||||||
which is hereby \\nincorporated by reference into this Agreement.\\n \\n2.
|
|
||||||
Additional Commercial Terms. If, on the Llama 3.2 version release date, the
|
|
||||||
monthly active users\\nof the products or services made available by or for
|
|
||||||
Licensee, or Licensee\u2019s affiliates, \\nis greater than 700 million monthly
|
|
||||||
active users in the preceding calendar month, you must request \\na license
|
|
||||||
from Meta, which Meta may grant to you in its sole discretion, and you are not
|
|
||||||
authorized to\\nexercise any of the rights under this Agreement unless or until
|
|
||||||
Meta otherwise expressly grants you such rights.\\n\\n3. Disclaimer of Warranty.
|
|
||||||
UNLESS REQUIRED BY APPLICABLE LAW, THE LLAMA MATERIALS AND ANY OUTPUT AND \\nRESULTS
|
|
||||||
THEREFROM ARE PROVIDED ON AN \u201CAS IS\u201D BASIS, WITHOUT WARRANTIES OF
|
|
||||||
ANY KIND, AND META DISCLAIMS\\nALL WARRANTIES OF ANY KIND, BOTH EXPRESS AND
|
|
||||||
IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES\\nOF TITLE, NON-INFRINGEMENT,
|
|
||||||
MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE\\nFOR
|
|
||||||
DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING THE LLAMA MATERIALS
|
|
||||||
AND ASSUME ANY RISKS ASSOCIATED\\nWITH YOUR USE OF THE LLAMA MATERIALS AND ANY
|
|
||||||
OUTPUT AND RESULTS.\\n\\n4. Limitation of Liability. IN NO EVENT WILL META OR
|
|
||||||
ITS AFFILIATES BE LIABLE UNDER ANY THEORY OF LIABILITY, \\nWHETHER IN CONTRACT,
|
|
||||||
TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING OUT OF THIS AGREEMENT,
|
|
||||||
\\nFOR ANY LOST PROFITS OR ANY INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL,
|
|
||||||
EXEMPLARY OR PUNITIVE DAMAGES, EVEN \\nIF META OR ITS AFFILIATES HAVE BEEN ADVISED
|
|
||||||
OF THE POSSIBILITY OF ANY OF THE FOREGOING.\\n\\n5. Intellectual Property.\\n\\n
|
|
||||||
\ a. No trademark licenses are granted under this Agreement, and in connection
|
|
||||||
with the Llama Materials, \\nneither Meta nor Licensee may use any name or mark
|
|
||||||
owned by or associated with the other or any of its affiliates, \\nexcept as
|
|
||||||
required for reasonable and customary use in describing and redistributing the
|
|
||||||
Llama Materials or as \\nset forth in this Section 5(a). Meta hereby grants
|
|
||||||
you a license to use \u201CLlama\u201D (the \u201CMark\u201D) solely as required
|
|
||||||
\\nto comply with the last sentence of Section 1.b.i. You will comply with Meta\u2019s
|
|
||||||
brand guidelines (currently accessible \\nat https://about.meta.com/brand/resources/meta/company-brand/).
|
|
||||||
All goodwill arising out of your use of the Mark \\nwill inure to the benefit
|
|
||||||
of Meta.\\n\\n b. Subject to Meta\u2019s ownership of Llama Materials and
|
|
||||||
derivatives made by or for Meta, with respect to any\\n derivative works
|
|
||||||
and modifications of the Llama Materials that are made by you, as between you
|
|
||||||
and Meta,\\n you are and will be the owner of such derivative works and modifications.\\n\\n
|
|
||||||
\ c. If you institute litigation or other proceedings against Meta or any
|
|
||||||
entity (including a cross-claim or\\n counterclaim in a lawsuit) alleging
|
|
||||||
that the Llama Materials or Llama 3.2 outputs or results, or any portion\\n
|
|
||||||
\ of any of the foregoing, constitutes infringement of intellectual property
|
|
||||||
or other rights owned or licensable\\n by you, then any licenses granted
|
|
||||||
to you under this Agreement shall terminate as of the date such litigation or\\n
|
|
||||||
\ claim is filed or instituted. You will indemnify and hold harmless Meta
|
|
||||||
from and against any claim by any third\\n party arising out of or related
|
|
||||||
to your use or distribution of the Llama Materials.\\n\\n6. Term and Termination.
|
|
||||||
The term of this Agreement will commence upon your acceptance of this Agreement
|
|
||||||
or access\\nto the Llama Materials and will continue in full force and effect
|
|
||||||
until terminated in accordance with the terms\\nand conditions herein. Meta
|
|
||||||
may terminate this Agreement if you are in breach of any term or condition of
|
|
||||||
this\\nAgreement. Upon termination of this Agreement, you shall delete and cease
|
|
||||||
use of the Llama Materials. Sections 3,\\n4 and 7 shall survive the termination
|
|
||||||
of this Agreement. \\n\\n7. Governing Law and Jurisdiction. This Agreement will
|
|
||||||
be governed and construed under the laws of the State of \\nCalifornia without
|
|
||||||
regard to choice of law principles, and the UN Convention on Contracts for the
|
|
||||||
International\\nSale of Goods does not apply to this Agreement. The courts of
|
|
||||||
California shall have exclusive jurisdiction of\\nany dispute arising out of
|
|
||||||
this Agreement.\\n**Llama 3.2** **Acceptable Use Policy**\\n\\nMeta is committed
|
|
||||||
to promoting safe and fair use of its tools and features, including Llama 3.2.
|
|
||||||
If you access or use Llama 3.2, you agree to this Acceptable Use Policy (\u201C**Policy**\u201D).
|
|
||||||
The most recent copy of this policy can be found at [https://www.llama.com/llama3_2/use-policy](https://www.llama.com/llama3_2/use-policy).\\n\\n**Prohibited
|
|
||||||
Uses**\\n\\nWe want everyone to use Llama 3.2 safely and responsibly. You agree
|
|
||||||
you will not use, or allow others to use, Llama 3.2 to:\\n\\n\\n\\n1. Violate
|
|
||||||
the law or others\u2019 rights, including to:\\n 1. Engage in, promote, generate,
|
|
||||||
contribute to, encourage, plan, incite, or further illegal or unlawful activity
|
|
||||||
or content, such as:\\n 1. Violence or terrorism\\n 2. Exploitation
|
|
||||||
or harm to children, including the solicitation, creation, acquisition, or dissemination
|
|
||||||
of child exploitative content or failure to report Child Sexual Abuse Material\\n
|
|
||||||
\ 3. Human trafficking, exploitation, and sexual violence\\n 4.
|
|
||||||
The illegal distribution of information or materials to minors, including obscene
|
|
||||||
materials, or failure to employ legally required age-gating in connection with
|
|
||||||
such information or materials.\\n 5. Sexual solicitation\\n 6.
|
|
||||||
Any other criminal activity\\n 1. Engage in, promote, incite, or facilitate
|
|
||||||
the harassment, abuse, threatening, or bullying of individuals or groups of
|
|
||||||
individuals\\n 2. Engage in, promote, incite, or facilitate discrimination
|
|
||||||
or other unlawful or harmful conduct in the provision of employment, employment
|
|
||||||
benefits, credit, housing, other economic benefits, or other essential goods
|
|
||||||
and services\\n 3. Engage in the unauthorized or unlicensed practice of any
|
|
||||||
profession including, but not limited to, financial, legal, medical/health,
|
|
||||||
or related professional practices\\n 4. Collect, process, disclose, generate,
|
|
||||||
or infer private or sensitive information about individuals, including information
|
|
||||||
about individuals\u2019 identity, health, or demographic information, unless
|
|
||||||
you have obtained the right to do so in accordance with applicable law\\n 5.
|
|
||||||
Engage in or facilitate any action or generate any content that infringes, misappropriates,
|
|
||||||
or otherwise violates any third-party rights, including the outputs or results
|
|
||||||
of any products or services using the Llama Materials\\n 6. Create, generate,
|
|
||||||
or facilitate the creation of malicious code, malware, computer viruses or do
|
|
||||||
anything else that could disable, overburden, interfere with or impair the proper
|
|
||||||
working, integrity, operation or appearance of a website or computer system\\n
|
|
||||||
\ 7. Engage in any action, or facilitate any action, to intentionally circumvent
|
|
||||||
or remove usage restrictions or other safety measures, or to enable functionality
|
|
||||||
disabled by Meta\\n2. Engage in, promote, incite, facilitate, or assist in the
|
|
||||||
planning or development of activities that present a risk of death or bodily
|
|
||||||
harm to individuals, including use of Llama 3.2 related to the following:\\n
|
|
||||||
\ 8. Military, warfare, nuclear industries or applications, espionage, use
|
|
||||||
for materials or activities that are subject to the International Traffic Arms
|
|
||||||
Regulations (ITAR) maintained by the United States Department of State or to
|
|
||||||
the U.S. Biological Weapons Anti-Terrorism Act of 1989 or the Chemical Weapons
|
|
||||||
Convention Implementation Act of 1997\\n 9. Guns and illegal weapons (including
|
|
||||||
weapon development)\\n 10. Illegal drugs and regulated/controlled substances\\n
|
|
||||||
\ 11. Operation of critical infrastructure, transportation technologies, or
|
|
||||||
heavy machinery\\n 12. Self-harm or harm to others, including suicide, cutting,
|
|
||||||
and eating disorders\\n 13. Any content intended to incite or promote violence,
|
|
||||||
abuse, or any infliction of bodily harm to an individual\\n3. Intentionally
|
|
||||||
deceive or mislead others, including use of Llama 3.2 related to the following:\\n
|
|
||||||
\ 14. Generating, promoting, or furthering fraud or the creation or promotion
|
|
||||||
of disinformation\\n 15. Generating, promoting, or furthering defamatory
|
|
||||||
content, including the creation of defamatory statements, images, or other content\\n
|
|
||||||
\ 16. Generating, promoting, or further distributing spam\\n 17. Impersonating
|
|
||||||
another individual without consent, authorization, or legal right\\n 18.
|
|
||||||
Representing that the use of Llama 3.2 or outputs are human-generated\\n 19.
|
|
||||||
Generating or facilitating false online engagement, including fake reviews and
|
|
||||||
other means of fake online engagement\\n4. Fail to appropriately disclose to
|
|
||||||
end users any known dangers of your AI system\\n5. Interact with third party
|
|
||||||
tools, models, or software designed to generate unlawful content or engage in
|
|
||||||
unlawful or harmful conduct and/or represent that the outputs of such tools,
|
|
||||||
models, or software are associated with Meta or Llama 3.2\\n\\nWith respect
|
|
||||||
to any multimodal models included in Llama 3.2, the rights granted under Section
|
|
||||||
1(a) of the Llama 3.2 Community License Agreement are not being granted to you
|
|
||||||
if you are an individual domiciled in, or a company with a principal place of
|
|
||||||
business in, the European Union. This restriction does not apply to end users
|
|
||||||
of a product or service that incorporates any such multimodal models.\\n\\nPlease
|
|
||||||
report any violation of this Policy, software \u201Cbug,\u201D or other problems
|
|
||||||
that could lead to a violation of this Policy through one of the following means:\\n\\n\\n\\n*
|
|
||||||
Reporting issues with the model: [https://github.com/meta-llama/llama-models/issues](https://l.workplace.com/l.php?u=https%3A%2F%2Fgithub.com%2Fmeta-llama%2Fllama-models%2Fissues\\u0026h=AT0qV8W9BFT6NwihiOHRuKYQM_UnkzN_NmHMy91OT55gkLpgi4kQupHUl0ssR4dQsIQ8n3tfd0vtkobvsEvt1l4Ic6GXI2EeuHV8N08OG2WnbAmm0FL4ObkazC6G_256vN0lN9DsykCvCqGZ)\\n*
|
|
||||||
Reporting risky content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)\\n*
|
|
||||||
Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)\\n*
|
|
||||||
Reporting violations of the Acceptable Use Policy or unlicensed uses of Llama
|
|
||||||
3.2: LlamaUseReport@meta.com\",\"modelfile\":\"# Modelfile generated by \\\"ollama
|
|
||||||
show\\\"\\n# To build a new Modelfile based on this, replace FROM with:\\n#
|
|
||||||
FROM llama3.2:3b\\n\\nFROM /Users/brandonhancock/.ollama/models/blobs/sha256-dde5aa3fc5ffc17176b5e8bdc82f587b24b2678c6c66101bf7da77af9f7ccdff\\nTEMPLATE
|
|
||||||
\\\"\\\"\\\"\\u003c|start_header_id|\\u003esystem\\u003c|end_header_id|\\u003e\\n\\nCutting
|
|
||||||
Knowledge Date: December 2023\\n\\n{{ if .System }}{{ .System }}\\n{{- end }}\\n{{-
|
|
||||||
if .Tools }}When you receive a tool call response, use the output to format
|
|
||||||
an answer to the orginal user question.\\n\\nYou are a helpful assistant with
|
|
||||||
tool calling capabilities.\\n{{- end }}\\u003c|eot_id|\\u003e\\n{{- range $i,
|
|
||||||
$_ := .Messages }}\\n{{- $last := eq (len (slice $.Messages $i)) 1 }}\\n{{-
|
|
||||||
if eq .Role \\\"user\\\" }}\\u003c|start_header_id|\\u003euser\\u003c|end_header_id|\\u003e\\n{{-
|
|
||||||
if and $.Tools $last }}\\n\\nGiven the following functions, please respond with
|
|
||||||
a JSON for a function call with its proper arguments that best answers the given
|
|
||||||
prompt.\\n\\nRespond in the format {\\\"name\\\": function name, \\\"parameters\\\":
|
|
||||||
dictionary of argument name and its value}. Do not use variables.\\n\\n{{ range
|
|
||||||
$.Tools }}\\n{{- . }}\\n{{ end }}\\n{{ .Content }}\\u003c|eot_id|\\u003e\\n{{-
|
|
||||||
else }}\\n\\n{{ .Content }}\\u003c|eot_id|\\u003e\\n{{- end }}{{ if $last }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
end }}\\n{{- else if eq .Role \\\"assistant\\\" }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n{{-
|
|
||||||
if .ToolCalls }}\\n{{ range .ToolCalls }}\\n{\\\"name\\\": \\\"{{ .Function.Name
|
|
||||||
}}\\\", \\\"parameters\\\": {{ .Function.Arguments }}}{{ end }}\\n{{- else }}\\n\\n{{
|
|
||||||
.Content }}\\n{{- end }}{{ if not $last }}\\u003c|eot_id|\\u003e{{ end }}\\n{{-
|
|
||||||
else if eq .Role \\\"tool\\\" }}\\u003c|start_header_id|\\u003eipython\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
.Content }}\\u003c|eot_id|\\u003e{{ if $last }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
end }}\\n{{- end }}\\n{{- end }}\\\"\\\"\\\"\\nPARAMETER stop \\u003c|start_header_id|\\u003e\\nPARAMETER
|
|
||||||
stop \\u003c|end_header_id|\\u003e\\nPARAMETER stop \\u003c|eot_id|\\u003e\\nLICENSE
|
|
||||||
\\\"LLAMA 3.2 COMMUNITY LICENSE AGREEMENT\\nLlama 3.2 Version Release Date:
|
|
||||||
September 25, 2024\\n\\n\u201CAgreement\u201D means the terms and conditions
|
|
||||||
for use, reproduction, distribution \\nand modification of the Llama Materials
|
|
||||||
set forth herein.\\n\\n\u201CDocumentation\u201D means the specifications, manuals
|
|
||||||
and documentation accompanying Llama 3.2\\ndistributed by Meta at https://llama.meta.com/doc/overview.\\n\\n\u201CLicensee\u201D
|
|
||||||
or \u201Cyou\u201D means you, or your employer or any other person or entity
|
|
||||||
(if you are \\nentering into this Agreement on such person or entity\u2019s
|
|
||||||
behalf), of the age required under\\napplicable laws, rules or regulations to
|
|
||||||
provide legal consent and that has legal authority\\nto bind your employer or
|
|
||||||
such other person or entity if you are entering in this Agreement\\non their
|
|
||||||
behalf.\\n\\n\u201CLlama 3.2\u201D means the foundational large language models
|
|
||||||
and software and algorithms, including\\nmachine-learning model code, trained
|
|
||||||
model weights, inference-enabling code, training-enabling code,\\nfine-tuning
|
|
||||||
enabling code and other elements of the foregoing distributed by Meta at \\nhttps://www.llama.com/llama-downloads.\\n\\n\u201CLlama
|
|
||||||
Materials\u201D means, collectively, Meta\u2019s proprietary Llama 3.2 and Documentation
|
|
||||||
(and \\nany portion thereof) made available under this Agreement.\\n\\n\u201CMeta\u201D
|
|
||||||
or \u201Cwe\u201D means Meta Platforms Ireland Limited (if you are located in
|
|
||||||
or, \\nif you are an entity, your principal place of business is in the EEA
|
|
||||||
or Switzerland) \\nand Meta Platforms, Inc. (if you are located outside of the
|
|
||||||
EEA or Switzerland). \\n\\n\\nBy clicking \u201CI Accept\u201D below or by using
|
|
||||||
or distributing any portion or element of the Llama Materials,\\nyou agree to
|
|
||||||
be bound by this Agreement.\\n\\n\\n1. License Rights and Redistribution.\\n\\n
|
|
||||||
\ a. Grant of Rights. You are granted a non-exclusive, worldwide, \\nnon-transferable
|
|
||||||
and royalty-free limited license under Meta\u2019s intellectual property or
|
|
||||||
other rights \\nowned by Meta embodied in the Llama Materials to use, reproduce,
|
|
||||||
distribute, copy, create derivative works \\nof, and make modifications to the
|
|
||||||
Llama Materials. \\n\\n b. Redistribution and Use. \\n\\n i. If
|
|
||||||
you distribute or make available the Llama Materials (or any derivative works
|
|
||||||
thereof), \\nor a product or service (including another AI model) that contains
|
|
||||||
any of them, you shall (A) provide\\na copy of this Agreement with any such
|
|
||||||
Llama Materials; and (B) prominently display \u201CBuilt with Llama\u201D\\non
|
|
||||||
a related website, user interface, blogpost, about page, or product documentation.
|
|
||||||
If you use the\\nLlama Materials or any outputs or results of the Llama Materials
|
|
||||||
to create, train, fine tune, or\\notherwise improve an AI model, which is distributed
|
|
||||||
or made available, you shall also include \u201CLlama\u201D\\nat the beginning
|
|
||||||
of any such AI model name.\\n\\n ii. If you receive Llama Materials,
|
|
||||||
or any derivative works thereof, from a Licensee as part\\nof an integrated
|
|
||||||
end user product, then Section 2 of this Agreement will not apply to you. \\n\\n
|
|
||||||
\ iii. You must retain in all copies of the Llama Materials that you distribute
|
|
||||||
the \\nfollowing attribution notice within a \u201CNotice\u201D text file distributed
|
|
||||||
as a part of such copies: \\n\u201CLlama 3.2 is licensed under the Llama 3.2
|
|
||||||
Community License, Copyright \xA9 Meta Platforms,\\nInc. All Rights Reserved.\u201D\\n\\n
|
|
||||||
\ iv. Your use of the Llama Materials must comply with applicable laws
|
|
||||||
and regulations\\n(including trade compliance laws and regulations) and adhere
|
|
||||||
to the Acceptable Use Policy for\\nthe Llama Materials (available at https://www.llama.com/llama3_2/use-policy),
|
|
||||||
which is hereby \\nincorporated by reference into this Agreement.\\n \\n2.
|
|
||||||
Additional Commercial Terms. If, on the Llama 3.2 version release date, the
|
|
||||||
monthly active users\\nof the products or services made available by or for
|
|
||||||
Licensee, or Licensee\u2019s affiliates, \\nis greater than 700 million monthly
|
|
||||||
active users in the preceding calendar month, you must request \\na license
|
|
||||||
from Meta, which Meta may grant to you in its sole discretion, and you are not
|
|
||||||
authorized to\\nexercise any of the rights under this Agreement unless or until
|
|
||||||
Meta otherwise expressly grants you such rights.\\n\\n3. Disclaimer of Warranty.
|
|
||||||
UNLESS REQUIRED BY APPLICABLE LAW, THE LLAMA MATERIALS AND ANY OUTPUT AND \\nRESULTS
|
|
||||||
THEREFROM ARE PROVIDED ON AN \u201CAS IS\u201D BASIS, WITHOUT WARRANTIES OF
|
|
||||||
ANY KIND, AND META DISCLAIMS\\nALL WARRANTIES OF ANY KIND, BOTH EXPRESS AND
|
|
||||||
IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES\\nOF TITLE, NON-INFRINGEMENT,
|
|
||||||
MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE\\nFOR
|
|
||||||
DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING THE LLAMA MATERIALS
|
|
||||||
AND ASSUME ANY RISKS ASSOCIATED\\nWITH YOUR USE OF THE LLAMA MATERIALS AND ANY
|
|
||||||
OUTPUT AND RESULTS.\\n\\n4. Limitation of Liability. IN NO EVENT WILL META OR
|
|
||||||
ITS AFFILIATES BE LIABLE UNDER ANY THEORY OF LIABILITY, \\nWHETHER IN CONTRACT,
|
|
||||||
TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING OUT OF THIS AGREEMENT,
|
|
||||||
\\nFOR ANY LOST PROFITS OR ANY INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL,
|
|
||||||
EXEMPLARY OR PUNITIVE DAMAGES, EVEN \\nIF META OR ITS AFFILIATES HAVE BEEN ADVISED
|
|
||||||
OF THE POSSIBILITY OF ANY OF THE FOREGOING.\\n\\n5. Intellectual Property.\\n\\n
|
|
||||||
\ a. No trademark licenses are granted under this Agreement, and in connection
|
|
||||||
with the Llama Materials, \\nneither Meta nor Licensee may use any name or mark
|
|
||||||
owned by or associated with the other or any of its affiliates, \\nexcept as
|
|
||||||
required for reasonable and customary use in describing and redistributing the
|
|
||||||
Llama Materials or as \\nset forth in this Section 5(a). Meta hereby grants
|
|
||||||
you a license to use \u201CLlama\u201D (the \u201CMark\u201D) solely as required
|
|
||||||
\\nto comply with the last sentence of Section 1.b.i. You will comply with Meta\u2019s
|
|
||||||
brand guidelines (currently accessible \\nat https://about.meta.com/brand/resources/meta/company-brand/).
|
|
||||||
All goodwill arising out of your use of the Mark \\nwill inure to the benefit
|
|
||||||
of Meta.\\n\\n b. Subject to Meta\u2019s ownership of Llama Materials and
|
|
||||||
derivatives made by or for Meta, with respect to any\\n derivative works
|
|
||||||
and modifications of the Llama Materials that are made by you, as between you
|
|
||||||
and Meta,\\n you are and will be the owner of such derivative works and modifications.\\n\\n
|
|
||||||
\ c. If you institute litigation or other proceedings against Meta or any
|
|
||||||
entity (including a cross-claim or\\n counterclaim in a lawsuit) alleging
|
|
||||||
that the Llama Materials or Llama 3.2 outputs or results, or any portion\\n
|
|
||||||
\ of any of the foregoing, constitutes infringement of intellectual property
|
|
||||||
or other rights owned or licensable\\n by you, then any licenses granted
|
|
||||||
to you under this Agreement shall terminate as of the date such litigation or\\n
|
|
||||||
\ claim is filed or instituted. You will indemnify and hold harmless Meta
|
|
||||||
from and against any claim by any third\\n party arising out of or related
|
|
||||||
to your use or distribution of the Llama Materials.\\n\\n6. Term and Termination.
|
|
||||||
The term of this Agreement will commence upon your acceptance of this Agreement
|
|
||||||
or access\\nto the Llama Materials and will continue in full force and effect
|
|
||||||
until terminated in accordance with the terms\\nand conditions herein. Meta
|
|
||||||
may terminate this Agreement if you are in breach of any term or condition of
|
|
||||||
this\\nAgreement. Upon termination of this Agreement, you shall delete and cease
|
|
||||||
use of the Llama Materials. Sections 3,\\n4 and 7 shall survive the termination
|
|
||||||
of this Agreement. \\n\\n7. Governing Law and Jurisdiction. This Agreement will
|
|
||||||
be governed and construed under the laws of the State of \\nCalifornia without
|
|
||||||
regard to choice of law principles, and the UN Convention on Contracts for the
|
|
||||||
International\\nSale of Goods does not apply to this Agreement. The courts of
|
|
||||||
California shall have exclusive jurisdiction of\\nany dispute arising out of
|
|
||||||
this Agreement.\\\"\\nLICENSE \\\"**Llama 3.2** **Acceptable Use Policy**\\n\\nMeta
|
|
||||||
is committed to promoting safe and fair use of its tools and features, including
|
|
||||||
Llama 3.2. If you access or use Llama 3.2, you agree to this Acceptable Use
|
|
||||||
Policy (\u201C**Policy**\u201D). The most recent copy of this policy can be
|
|
||||||
found at [https://www.llama.com/llama3_2/use-policy](https://www.llama.com/llama3_2/use-policy).\\n\\n**Prohibited
|
|
||||||
Uses**\\n\\nWe want everyone to use Llama 3.2 safely and responsibly. You agree
|
|
||||||
you will not use, or allow others to use, Llama 3.2 to:\\n\\n\\n\\n1. Violate
|
|
||||||
the law or others\u2019 rights, including to:\\n 1. Engage in, promote, generate,
|
|
||||||
contribute to, encourage, plan, incite, or further illegal or unlawful activity
|
|
||||||
or content, such as:\\n 1. Violence or terrorism\\n 2. Exploitation
|
|
||||||
or harm to children, including the solicitation, creation, acquisition, or dissemination
|
|
||||||
of child exploitative content or failure to report Child Sexual Abuse Material\\n
|
|
||||||
\ 3. Human trafficking, exploitation, and sexual violence\\n 4.
|
|
||||||
The illegal distribution of information or materials to minors, including obscene
|
|
||||||
materials, or failure to employ legally required age-gating in connection with
|
|
||||||
such information or materials.\\n 5. Sexual solicitation\\n 6.
|
|
||||||
Any other criminal activity\\n 1. Engage in, promote, incite, or facilitate
|
|
||||||
the harassment, abuse, threatening, or bullying of individuals or groups of
|
|
||||||
individuals\\n 2. Engage in, promote, incite, or facilitate discrimination
|
|
||||||
or other unlawful or harmful conduct in the provision of employment, employment
|
|
||||||
benefits, credit, housing, other economic benefits, or other essential goods
|
|
||||||
and services\\n 3. Engage in the unauthorized or unlicensed practice of any
|
|
||||||
profession including, but not limited to, financial, legal, medical/health,
|
|
||||||
or related professional practices\\n 4. Collect, process, disclose, generate,
|
|
||||||
or infer private or sensitive information about individuals, including information
|
|
||||||
about individuals\u2019 identity, health, or demographic information, unless
|
|
||||||
you have obtained the right to do so in accordance with applicable law\\n 5.
|
|
||||||
Engage in or facilitate any action or generate any content that infringes, misappropriates,
|
|
||||||
or otherwise violates any third-party rights, including the outputs or results
|
|
||||||
of any products or services using the Llama Materials\\n 6. Create, generate,
|
|
||||||
or facilitate the creation of malicious code, malware, computer viruses or do
|
|
||||||
anything else that could disable, overburden, interfere with or impair the proper
|
|
||||||
working, integrity, operation or appearance of a website or computer system\\n
|
|
||||||
\ 7. Engage in any action, or facilitate any action, to intentionally circumvent
|
|
||||||
or remove usage restrictions or other safety measures, or to enable functionality
|
|
||||||
disabled by Meta\\n2. Engage in, promote, incite, facilitate, or assist in the
|
|
||||||
planning or development of activities that present a risk of death or bodily
|
|
||||||
harm to individuals, including use of Llama 3.2 related to the following:\\n
|
|
||||||
\ 8. Military, warfare, nuclear industries or applications, espionage, use
|
|
||||||
for materials or activities that are subject to the International Traffic Arms
|
|
||||||
Regulations (ITAR) maintained by the United States Department of State or to
|
|
||||||
the U.S. Biological Weapons Anti-Terrorism Act of 1989 or the Chemical Weapons
|
|
||||||
Convention Implementation Act of 1997\\n 9. Guns and illegal weapons (including
|
|
||||||
weapon development)\\n 10. Illegal drugs and regulated/controlled substances\\n
|
|
||||||
\ 11. Operation of critical infrastructure, transportation technologies, or
|
|
||||||
heavy machinery\\n 12. Self-harm or harm to others, including suicide, cutting,
|
|
||||||
and eating disorders\\n 13. Any content intended to incite or promote violence,
|
|
||||||
abuse, or any infliction of bodily harm to an individual\\n3. Intentionally
|
|
||||||
deceive or mislead others, including use of Llama 3.2 related to the following:\\n
|
|
||||||
\ 14. Generating, promoting, or furthering fraud or the creation or promotion
|
|
||||||
of disinformation\\n 15. Generating, promoting, or furthering defamatory
|
|
||||||
content, including the creation of defamatory statements, images, or other content\\n
|
|
||||||
\ 16. Generating, promoting, or further distributing spam\\n 17. Impersonating
|
|
||||||
another individual without consent, authorization, or legal right\\n 18.
|
|
||||||
Representing that the use of Llama 3.2 or outputs are human-generated\\n 19.
|
|
||||||
Generating or facilitating false online engagement, including fake reviews and
|
|
||||||
other means of fake online engagement\\n4. Fail to appropriately disclose to
|
|
||||||
end users any known dangers of your AI system\\n5. Interact with third party
|
|
||||||
tools, models, or software designed to generate unlawful content or engage in
|
|
||||||
unlawful or harmful conduct and/or represent that the outputs of such tools,
|
|
||||||
models, or software are associated with Meta or Llama 3.2\\n\\nWith respect
|
|
||||||
to any multimodal models included in Llama 3.2, the rights granted under Section
|
|
||||||
1(a) of the Llama 3.2 Community License Agreement are not being granted to you
|
|
||||||
if you are an individual domiciled in, or a company with a principal place of
|
|
||||||
business in, the European Union. This restriction does not apply to end users
|
|
||||||
of a product or service that incorporates any such multimodal models.\\n\\nPlease
|
|
||||||
report any violation of this Policy, software \u201Cbug,\u201D or other problems
|
|
||||||
that could lead to a violation of this Policy through one of the following means:\\n\\n\\n\\n*
|
|
||||||
Reporting issues with the model: [https://github.com/meta-llama/llama-models/issues](https://l.workplace.com/l.php?u=https%3A%2F%2Fgithub.com%2Fmeta-llama%2Fllama-models%2Fissues\\u0026h=AT0qV8W9BFT6NwihiOHRuKYQM_UnkzN_NmHMy91OT55gkLpgi4kQupHUl0ssR4dQsIQ8n3tfd0vtkobvsEvt1l4Ic6GXI2EeuHV8N08OG2WnbAmm0FL4ObkazC6G_256vN0lN9DsykCvCqGZ)\\n*
|
|
||||||
Reporting risky content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)\\n*
|
|
||||||
Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)\\n*
|
|
||||||
Reporting violations of the Acceptable Use Policy or unlicensed uses of Llama
|
|
||||||
3.2: LlamaUseReport@meta.com\\\"\\n\",\"parameters\":\"stop \\\"\\u003c|start_header_id|\\u003e\\\"\\nstop
|
|
||||||
\ \\\"\\u003c|end_header_id|\\u003e\\\"\\nstop \\\"\\u003c|eot_id|\\u003e\\\"\",\"template\":\"\\u003c|start_header_id|\\u003esystem\\u003c|end_header_id|\\u003e\\n\\nCutting
|
|
||||||
Knowledge Date: December 2023\\n\\n{{ if .System }}{{ .System }}\\n{{- end }}\\n{{-
|
|
||||||
if .Tools }}When you receive a tool call response, use the output to format
|
|
||||||
an answer to the orginal user question.\\n\\nYou are a helpful assistant with
|
|
||||||
tool calling capabilities.\\n{{- end }}\\u003c|eot_id|\\u003e\\n{{- range $i,
|
|
||||||
$_ := .Messages }}\\n{{- $last := eq (len (slice $.Messages $i)) 1 }}\\n{{-
|
|
||||||
if eq .Role \\\"user\\\" }}\\u003c|start_header_id|\\u003euser\\u003c|end_header_id|\\u003e\\n{{-
|
|
||||||
if and $.Tools $last }}\\n\\nGiven the following functions, please respond with
|
|
||||||
a JSON for a function call with its proper arguments that best answers the given
|
|
||||||
prompt.\\n\\nRespond in the format {\\\"name\\\": function name, \\\"parameters\\\":
|
|
||||||
dictionary of argument name and its value}. Do not use variables.\\n\\n{{ range
|
|
||||||
$.Tools }}\\n{{- . }}\\n{{ end }}\\n{{ .Content }}\\u003c|eot_id|\\u003e\\n{{-
|
|
||||||
else }}\\n\\n{{ .Content }}\\u003c|eot_id|\\u003e\\n{{- end }}{{ if $last }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
end }}\\n{{- else if eq .Role \\\"assistant\\\" }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n{{-
|
|
||||||
if .ToolCalls }}\\n{{ range .ToolCalls }}\\n{\\\"name\\\": \\\"{{ .Function.Name
|
|
||||||
}}\\\", \\\"parameters\\\": {{ .Function.Arguments }}}{{ end }}\\n{{- else }}\\n\\n{{
|
|
||||||
.Content }}\\n{{- end }}{{ if not $last }}\\u003c|eot_id|\\u003e{{ end }}\\n{{-
|
|
||||||
else if eq .Role \\\"tool\\\" }}\\u003c|start_header_id|\\u003eipython\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
.Content }}\\u003c|eot_id|\\u003e{{ if $last }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
end }}\\n{{- end }}\\n{{- end }}\",\"details\":{\"parent_model\":\"\",\"format\":\"gguf\",\"family\":\"llama\",\"families\":[\"llama\"],\"parameter_size\":\"3.2B\",\"quantization_level\":\"Q4_K_M\"},\"model_info\":{\"general.architecture\":\"llama\",\"general.basename\":\"Llama-3.2\",\"general.file_type\":15,\"general.finetune\":\"Instruct\",\"general.languages\":[\"en\",\"de\",\"fr\",\"it\",\"pt\",\"hi\",\"es\",\"th\"],\"general.parameter_count\":3212749888,\"general.quantization_version\":2,\"general.size_label\":\"3B\",\"general.tags\":[\"facebook\",\"meta\",\"pytorch\",\"llama\",\"llama-3\",\"text-generation\"],\"general.type\":\"model\",\"llama.attention.head_count\":24,\"llama.attention.head_count_kv\":8,\"llama.attention.key_length\":128,\"llama.attention.layer_norm_rms_epsilon\":0.00001,\"llama.attention.value_length\":128,\"llama.block_count\":28,\"llama.context_length\":131072,\"llama.embedding_length\":3072,\"llama.feed_forward_length\":8192,\"llama.rope.dimension_count\":128,\"llama.rope.freq_base\":500000,\"llama.vocab_size\":128256,\"tokenizer.ggml.bos_token_id\":128000,\"tokenizer.ggml.eos_token_id\":128009,\"tokenizer.ggml.merges\":null,\"tokenizer.ggml.model\":\"gpt2\",\"tokenizer.ggml.pre\":\"llama-bpe\",\"tokenizer.ggml.token_type\":null,\"tokenizer.ggml.tokens\":null},\"modified_at\":\"2024-12-31T11:53:14.529771974-05:00\"}"
headers:
Content-Type:
- application/json; charset=utf-8
Date:
- Tue, 31 Dec 2024 16:57:54 GMT
Transfer-Encoding:
- chunked
http_version: HTTP/1.1
status_code: 200
- request:
body: '{"name": "llama3.2:3b"}'
headers:
accept:
- '*/*'
accept-encoding:
- gzip, deflate
connection:
- keep-alive
content-length:
- '23'
content-type:
- application/json
host:
- localhost:11434
user-agent:
- litellm/1.56.4
method: POST
uri: http://localhost:11434/api/show
response:
content: "{\"license\":\"LLAMA 3.2 COMMUNITY LICENSE AGREEMENT\\nLlama 3.2 Version
|
|
||||||
Release Date: September 25, 2024\\n\\n\u201CAgreement\u201D means the terms
|
|
||||||
and conditions for use, reproduction, distribution \\nand modification of the
|
|
||||||
Llama Materials set forth herein.\\n\\n\u201CDocumentation\u201D means the specifications,
|
|
||||||
manuals and documentation accompanying Llama 3.2\\ndistributed by Meta at https://llama.meta.com/doc/overview.\\n\\n\u201CLicensee\u201D
|
|
||||||
or \u201Cyou\u201D means you, or your employer or any other person or entity
|
|
||||||
(if you are \\nentering into this Agreement on such person or entity\u2019s
|
|
||||||
behalf), of the age required under\\napplicable laws, rules or regulations to
|
|
||||||
provide legal consent and that has legal authority\\nto bind your employer or
|
|
||||||
such other person or entity if you are entering in this Agreement\\non their
|
|
||||||
behalf.\\n\\n\u201CLlama 3.2\u201D means the foundational large language models
|
|
||||||
and software and algorithms, including\\nmachine-learning model code, trained
|
|
||||||
model weights, inference-enabling code, training-enabling code,\\nfine-tuning
|
|
||||||
enabling code and other elements of the foregoing distributed by Meta at \\nhttps://www.llama.com/llama-downloads.\\n\\n\u201CLlama
|
|
||||||
Materials\u201D means, collectively, Meta\u2019s proprietary Llama 3.2 and Documentation
|
|
||||||
(and \\nany portion thereof) made available under this Agreement.\\n\\n\u201CMeta\u201D
|
|
||||||
or \u201Cwe\u201D means Meta Platforms Ireland Limited (if you are located in
|
|
||||||
or, \\nif you are an entity, your principal place of business is in the EEA
|
|
||||||
or Switzerland) \\nand Meta Platforms, Inc. (if you are located outside of the
|
|
||||||
EEA or Switzerland). \\n\\n\\nBy clicking \u201CI Accept\u201D below or by using
|
|
||||||
or distributing any portion or element of the Llama Materials,\\nyou agree to
|
|
||||||
be bound by this Agreement.\\n\\n\\n1. License Rights and Redistribution.\\n\\n
|
|
||||||
\ a. Grant of Rights. You are granted a non-exclusive, worldwide, \\nnon-transferable
|
|
||||||
and royalty-free limited license under Meta\u2019s intellectual property or
|
|
||||||
other rights \\nowned by Meta embodied in the Llama Materials to use, reproduce,
|
|
||||||
distribute, copy, create derivative works \\nof, and make modifications to the
|
|
||||||
Llama Materials. \\n\\n b. Redistribution and Use. \\n\\n i. If
|
|
||||||
you distribute or make available the Llama Materials (or any derivative works
|
|
||||||
thereof), \\nor a product or service (including another AI model) that contains
|
|
||||||
any of them, you shall (A) provide\\na copy of this Agreement with any such
|
|
||||||
Llama Materials; and (B) prominently display \u201CBuilt with Llama\u201D\\non
|
|
||||||
a related website, user interface, blogpost, about page, or product documentation.
|
|
||||||
If you use the\\nLlama Materials or any outputs or results of the Llama Materials
|
|
||||||
to create, train, fine tune, or\\notherwise improve an AI model, which is distributed
|
|
||||||
or made available, you shall also include \u201CLlama\u201D\\nat the beginning
|
|
||||||
of any such AI model name.\\n\\n ii. If you receive Llama Materials,
|
|
||||||
or any derivative works thereof, from a Licensee as part\\nof an integrated
|
|
||||||
end user product, then Section 2 of this Agreement will not apply to you. \\n\\n
|
|
||||||
\ iii. You must retain in all copies of the Llama Materials that you distribute
|
|
||||||
the \\nfollowing attribution notice within a \u201CNotice\u201D text file distributed
|
|
||||||
as a part of such copies: \\n\u201CLlama 3.2 is licensed under the Llama 3.2
|
|
||||||
Community License, Copyright \xA9 Meta Platforms,\\nInc. All Rights Reserved.\u201D\\n\\n
|
|
||||||
\ iv. Your use of the Llama Materials must comply with applicable laws
|
|
||||||
and regulations\\n(including trade compliance laws and regulations) and adhere
|
|
||||||
to the Acceptable Use Policy for\\nthe Llama Materials (available at https://www.llama.com/llama3_2/use-policy),
|
|
||||||
which is hereby \\nincorporated by reference into this Agreement.\\n \\n2.
|
|
||||||
Additional Commercial Terms. If, on the Llama 3.2 version release date, the
|
|
||||||
monthly active users\\nof the products or services made available by or for
|
|
||||||
Licensee, or Licensee\u2019s affiliates, \\nis greater than 700 million monthly
|
|
||||||
active users in the preceding calendar month, you must request \\na license
|
|
||||||
from Meta, which Meta may grant to you in its sole discretion, and you are not
|
|
||||||
authorized to\\nexercise any of the rights under this Agreement unless or until
|
|
||||||
Meta otherwise expressly grants you such rights.\\n\\n3. Disclaimer of Warranty.
|
|
||||||
UNLESS REQUIRED BY APPLICABLE LAW, THE LLAMA MATERIALS AND ANY OUTPUT AND \\nRESULTS
|
|
||||||
THEREFROM ARE PROVIDED ON AN \u201CAS IS\u201D BASIS, WITHOUT WARRANTIES OF
|
|
||||||
ANY KIND, AND META DISCLAIMS\\nALL WARRANTIES OF ANY KIND, BOTH EXPRESS AND
|
|
||||||
IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES\\nOF TITLE, NON-INFRINGEMENT,
|
|
||||||
MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE\\nFOR
|
|
||||||
DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING THE LLAMA MATERIALS
|
|
||||||
AND ASSUME ANY RISKS ASSOCIATED\\nWITH YOUR USE OF THE LLAMA MATERIALS AND ANY
|
|
||||||
OUTPUT AND RESULTS.\\n\\n4. Limitation of Liability. IN NO EVENT WILL META OR
|
|
||||||
ITS AFFILIATES BE LIABLE UNDER ANY THEORY OF LIABILITY, \\nWHETHER IN CONTRACT,
|
|
||||||
TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING OUT OF THIS AGREEMENT,
|
|
||||||
\\nFOR ANY LOST PROFITS OR ANY INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL,
|
|
||||||
EXEMPLARY OR PUNITIVE DAMAGES, EVEN \\nIF META OR ITS AFFILIATES HAVE BEEN ADVISED
|
|
||||||
OF THE POSSIBILITY OF ANY OF THE FOREGOING.\\n\\n5. Intellectual Property.\\n\\n
|
|
||||||
\ a. No trademark licenses are granted under this Agreement, and in connection
|
|
||||||
with the Llama Materials, \\nneither Meta nor Licensee may use any name or mark
|
|
||||||
owned by or associated with the other or any of its affiliates, \\nexcept as
|
|
||||||
required for reasonable and customary use in describing and redistributing the
|
|
||||||
Llama Materials or as \\nset forth in this Section 5(a). Meta hereby grants
|
|
||||||
you a license to use \u201CLlama\u201D (the \u201CMark\u201D) solely as required
|
|
||||||
\\nto comply with the last sentence of Section 1.b.i. You will comply with Meta\u2019s
|
|
||||||
brand guidelines (currently accessible \\nat https://about.meta.com/brand/resources/meta/company-brand/).
|
|
||||||
All goodwill arising out of your use of the Mark \\nwill inure to the benefit
|
|
||||||
of Meta.\\n\\n b. Subject to Meta\u2019s ownership of Llama Materials and
|
|
||||||
derivatives made by or for Meta, with respect to any\\n derivative works
|
|
||||||
and modifications of the Llama Materials that are made by you, as between you
|
|
||||||
and Meta,\\n you are and will be the owner of such derivative works and modifications.\\n\\n
|
|
||||||
\ c. If you institute litigation or other proceedings against Meta or any
|
|
||||||
entity (including a cross-claim or\\n counterclaim in a lawsuit) alleging
|
|
||||||
that the Llama Materials or Llama 3.2 outputs or results, or any portion\\n
|
|
||||||
\ of any of the foregoing, constitutes infringement of intellectual property
|
|
||||||
or other rights owned or licensable\\n by you, then any licenses granted
|
|
||||||
to you under this Agreement shall terminate as of the date such litigation or\\n
|
|
||||||
\ claim is filed or instituted. You will indemnify and hold harmless Meta
|
|
||||||
from and against any claim by any third\\n party arising out of or related
|
|
||||||
to your use or distribution of the Llama Materials.\\n\\n6. Term and Termination.
|
|
||||||
The term of this Agreement will commence upon your acceptance of this Agreement
|
|
||||||
or access\\nto the Llama Materials and will continue in full force and effect
|
|
||||||
until terminated in accordance with the terms\\nand conditions herein. Meta
|
|
||||||
may terminate this Agreement if you are in breach of any term or condition of
|
|
||||||
this\\nAgreement. Upon termination of this Agreement, you shall delete and cease
|
|
||||||
use of the Llama Materials. Sections 3,\\n4 and 7 shall survive the termination
|
|
||||||
of this Agreement. \\n\\n7. Governing Law and Jurisdiction. This Agreement will
|
|
||||||
be governed and construed under the laws of the State of \\nCalifornia without
|
|
||||||
regard to choice of law principles, and the UN Convention on Contracts for the
|
|
||||||
International\\nSale of Goods does not apply to this Agreement. The courts of
|
|
||||||
California shall have exclusive jurisdiction of\\nany dispute arising out of
|
|
||||||
this Agreement.\\n**Llama 3.2** **Acceptable Use Policy**\\n\\nMeta is committed
|
|
||||||
to promoting safe and fair use of its tools and features, including Llama 3.2.
|
|
||||||
If you access or use Llama 3.2, you agree to this Acceptable Use Policy (\u201C**Policy**\u201D).
|
|
||||||
The most recent copy of this policy can be found at [https://www.llama.com/llama3_2/use-policy](https://www.llama.com/llama3_2/use-policy).\\n\\n**Prohibited
|
|
||||||
Uses**\\n\\nWe want everyone to use Llama 3.2 safely and responsibly. You agree
|
|
||||||
you will not use, or allow others to use, Llama 3.2 to:\\n\\n\\n\\n1. Violate
|
|
||||||
the law or others\u2019 rights, including to:\\n 1. Engage in, promote, generate,
|
|
||||||
contribute to, encourage, plan, incite, or further illegal or unlawful activity
|
|
||||||
or content, such as:\\n 1. Violence or terrorism\\n 2. Exploitation
|
|
||||||
or harm to children, including the solicitation, creation, acquisition, or dissemination
|
|
||||||
of child exploitative content or failure to report Child Sexual Abuse Material\\n
|
|
||||||
\ 3. Human trafficking, exploitation, and sexual violence\\n 4.
|
|
||||||
The illegal distribution of information or materials to minors, including obscene
|
|
||||||
materials, or failure to employ legally required age-gating in connection with
|
|
||||||
such information or materials.\\n 5. Sexual solicitation\\n 6.
|
|
||||||
Any other criminal activity\\n 1. Engage in, promote, incite, or facilitate
|
|
||||||
the harassment, abuse, threatening, or bullying of individuals or groups of
|
|
||||||
individuals\\n 2. Engage in, promote, incite, or facilitate discrimination
|
|
||||||
or other unlawful or harmful conduct in the provision of employment, employment
|
|
||||||
benefits, credit, housing, other economic benefits, or other essential goods
|
|
||||||
and services\\n 3. Engage in the unauthorized or unlicensed practice of any
|
|
||||||
profession including, but not limited to, financial, legal, medical/health,
|
|
||||||
or related professional practices\\n 4. Collect, process, disclose, generate,
|
|
||||||
or infer private or sensitive information about individuals, including information
|
|
||||||
about individuals\u2019 identity, health, or demographic information, unless
|
|
||||||
you have obtained the right to do so in accordance with applicable law\\n 5.
|
|
||||||
Engage in or facilitate any action or generate any content that infringes, misappropriates,
|
|
||||||
or otherwise violates any third-party rights, including the outputs or results
|
|
||||||
of any products or services using the Llama Materials\\n 6. Create, generate,
|
|
||||||
or facilitate the creation of malicious code, malware, computer viruses or do
|
|
||||||
anything else that could disable, overburden, interfere with or impair the proper
|
|
||||||
working, integrity, operation or appearance of a website or computer system\\n
|
|
||||||
\ 7. Engage in any action, or facilitate any action, to intentionally circumvent
|
|
||||||
or remove usage restrictions or other safety measures, or to enable functionality
|
|
||||||
disabled by Meta\\n2. Engage in, promote, incite, facilitate, or assist in the
|
|
||||||
planning or development of activities that present a risk of death or bodily
|
|
||||||
harm to individuals, including use of Llama 3.2 related to the following:\\n
|
|
||||||
\ 8. Military, warfare, nuclear industries or applications, espionage, use
|
|
||||||
for materials or activities that are subject to the International Traffic Arms
|
|
||||||
Regulations (ITAR) maintained by the United States Department of State or to
|
|
||||||
the U.S. Biological Weapons Anti-Terrorism Act of 1989 or the Chemical Weapons
|
|
||||||
Convention Implementation Act of 1997\\n 9. Guns and illegal weapons (including
|
|
||||||
weapon development)\\n 10. Illegal drugs and regulated/controlled substances\\n
|
|
||||||
\ 11. Operation of critical infrastructure, transportation technologies, or
|
|
||||||
heavy machinery\\n 12. Self-harm or harm to others, including suicide, cutting,
|
|
||||||
and eating disorders\\n 13. Any content intended to incite or promote violence,
|
|
||||||
abuse, or any infliction of bodily harm to an individual\\n3. Intentionally
|
|
||||||
deceive or mislead others, including use of Llama 3.2 related to the following:\\n
|
|
||||||
\ 14. Generating, promoting, or furthering fraud or the creation or promotion
|
|
||||||
of disinformation\\n 15. Generating, promoting, or furthering defamatory
|
|
||||||
content, including the creation of defamatory statements, images, or other content\\n
|
|
||||||
\ 16. Generating, promoting, or further distributing spam\\n 17. Impersonating
|
|
||||||
another individual without consent, authorization, or legal right\\n 18.
|
|
||||||
Representing that the use of Llama 3.2 or outputs are human-generated\\n 19.
|
|
||||||
Generating or facilitating false online engagement, including fake reviews and
|
|
||||||
other means of fake online engagement\\n4. Fail to appropriately disclose to
|
|
||||||
end users any known dangers of your AI system\\n5. Interact with third party
|
|
||||||
tools, models, or software designed to generate unlawful content or engage in
|
|
||||||
unlawful or harmful conduct and/or represent that the outputs of such tools,
|
|
||||||
models, or software are associated with Meta or Llama 3.2\\n\\nWith respect
|
|
||||||
to any multimodal models included in Llama 3.2, the rights granted under Section
|
|
||||||
1(a) of the Llama 3.2 Community License Agreement are not being granted to you
|
|
||||||
if you are an individual domiciled in, or a company with a principal place of
|
|
||||||
business in, the European Union. This restriction does not apply to end users
|
|
||||||
of a product or service that incorporates any such multimodal models.\\n\\nPlease
|
|
||||||
report any violation of this Policy, software \u201Cbug,\u201D or other problems
|
|
||||||
that could lead to a violation of this Policy through one of the following means:\\n\\n\\n\\n*
|
|
||||||
Reporting issues with the model: [https://github.com/meta-llama/llama-models/issues](https://l.workplace.com/l.php?u=https%3A%2F%2Fgithub.com%2Fmeta-llama%2Fllama-models%2Fissues\\u0026h=AT0qV8W9BFT6NwihiOHRuKYQM_UnkzN_NmHMy91OT55gkLpgi4kQupHUl0ssR4dQsIQ8n3tfd0vtkobvsEvt1l4Ic6GXI2EeuHV8N08OG2WnbAmm0FL4ObkazC6G_256vN0lN9DsykCvCqGZ)\\n*
|
|
||||||
Reporting risky content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)\\n*
|
|
||||||
Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)\\n*
|
|
||||||
Reporting violations of the Acceptable Use Policy or unlicensed uses of Llama
|
|
||||||
3.2: LlamaUseReport@meta.com\",\"modelfile\":\"# Modelfile generated by \\\"ollama
|
|
||||||
show\\\"\\n# To build a new Modelfile based on this, replace FROM with:\\n#
|
|
||||||
FROM llama3.2:3b\\n\\nFROM /Users/brandonhancock/.ollama/models/blobs/sha256-dde5aa3fc5ffc17176b5e8bdc82f587b24b2678c6c66101bf7da77af9f7ccdff\\nTEMPLATE
|
|
||||||
\\\"\\\"\\\"\\u003c|start_header_id|\\u003esystem\\u003c|end_header_id|\\u003e\\n\\nCutting
|
|
||||||
Knowledge Date: December 2023\\n\\n{{ if .System }}{{ .System }}\\n{{- end }}\\n{{-
|
|
||||||
if .Tools }}When you receive a tool call response, use the output to format
|
|
||||||
an answer to the orginal user question.\\n\\nYou are a helpful assistant with
|
|
||||||
tool calling capabilities.\\n{{- end }}\\u003c|eot_id|\\u003e\\n{{- range $i,
|
|
||||||
$_ := .Messages }}\\n{{- $last := eq (len (slice $.Messages $i)) 1 }}\\n{{-
|
|
||||||
if eq .Role \\\"user\\\" }}\\u003c|start_header_id|\\u003euser\\u003c|end_header_id|\\u003e\\n{{-
|
|
||||||
if and $.Tools $last }}\\n\\nGiven the following functions, please respond with
|
|
||||||
a JSON for a function call with its proper arguments that best answers the given
|
|
||||||
prompt.\\n\\nRespond in the format {\\\"name\\\": function name, \\\"parameters\\\":
|
|
||||||
dictionary of argument name and its value}. Do not use variables.\\n\\n{{ range
|
|
||||||
$.Tools }}\\n{{- . }}\\n{{ end }}\\n{{ .Content }}\\u003c|eot_id|\\u003e\\n{{-
|
|
||||||
else }}\\n\\n{{ .Content }}\\u003c|eot_id|\\u003e\\n{{- end }}{{ if $last }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
end }}\\n{{- else if eq .Role \\\"assistant\\\" }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n{{-
|
|
||||||
if .ToolCalls }}\\n{{ range .ToolCalls }}\\n{\\\"name\\\": \\\"{{ .Function.Name
|
|
||||||
}}\\\", \\\"parameters\\\": {{ .Function.Arguments }}}{{ end }}\\n{{- else }}\\n\\n{{
|
|
||||||
.Content }}\\n{{- end }}{{ if not $last }}\\u003c|eot_id|\\u003e{{ end }}\\n{{-
|
|
||||||
else if eq .Role \\\"tool\\\" }}\\u003c|start_header_id|\\u003eipython\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
.Content }}\\u003c|eot_id|\\u003e{{ if $last }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
end }}\\n{{- end }}\\n{{- end }}\\\"\\\"\\\"\\nPARAMETER stop \\u003c|start_header_id|\\u003e\\nPARAMETER
|
|
||||||
stop \\u003c|end_header_id|\\u003e\\nPARAMETER stop \\u003c|eot_id|\\u003e\\nLICENSE
|
|
||||||
\\\"LLAMA 3.2 COMMUNITY LICENSE AGREEMENT\\nLlama 3.2 Version Release Date:
|
|
||||||
September 25, 2024\\n\\n\u201CAgreement\u201D means the terms and conditions
|
|
||||||
for use, reproduction, distribution \\nand modification of the Llama Materials
|
|
||||||
set forth herein.\\n\\n\u201CDocumentation\u201D means the specifications, manuals
|
|
||||||
and documentation accompanying Llama 3.2\\ndistributed by Meta at https://llama.meta.com/doc/overview.\\n\\n\u201CLicensee\u201D
|
|
||||||
or \u201Cyou\u201D means you, or your employer or any other person or entity
|
|
||||||
(if you are \\nentering into this Agreement on such person or entity\u2019s
|
|
||||||
behalf), of the age required under\\napplicable laws, rules or regulations to
|
|
||||||
provide legal consent and that has legal authority\\nto bind your employer or
|
|
||||||
such other person or entity if you are entering in this Agreement\\non their
|
|
||||||
behalf.\\n\\n\u201CLlama 3.2\u201D means the foundational large language models
|
|
||||||
and software and algorithms, including\\nmachine-learning model code, trained
|
|
||||||
model weights, inference-enabling code, training-enabling code,\\nfine-tuning
|
|
||||||
enabling code and other elements of the foregoing distributed by Meta at \\nhttps://www.llama.com/llama-downloads.\\n\\n\u201CLlama
|
|
||||||
Materials\u201D means, collectively, Meta\u2019s proprietary Llama 3.2 and Documentation
|
|
||||||
(and \\nany portion thereof) made available under this Agreement.\\n\\n\u201CMeta\u201D
|
|
||||||
or \u201Cwe\u201D means Meta Platforms Ireland Limited (if you are located in
|
|
||||||
or, \\nif you are an entity, your principal place of business is in the EEA
|
|
||||||
or Switzerland) \\nand Meta Platforms, Inc. (if you are located outside of the
|
|
||||||
EEA or Switzerland). \\n\\n\\nBy clicking \u201CI Accept\u201D below or by using
|
|
||||||
or distributing any portion or element of the Llama Materials,\\nyou agree to
|
|
||||||
be bound by this Agreement.\\n\\n\\n1. License Rights and Redistribution.\\n\\n
|
|
||||||
\ a. Grant of Rights. You are granted a non-exclusive, worldwide, \\nnon-transferable
|
|
||||||
and royalty-free limited license under Meta\u2019s intellectual property or
|
|
||||||
other rights \\nowned by Meta embodied in the Llama Materials to use, reproduce,
|
|
||||||
distribute, copy, create derivative works \\nof, and make modifications to the
|
|
||||||
Llama Materials. \\n\\n b. Redistribution and Use. \\n\\n i. If
|
|
||||||
you distribute or make available the Llama Materials (or any derivative works
|
|
||||||
thereof), \\nor a product or service (including another AI model) that contains
|
|
||||||
any of them, you shall (A) provide\\na copy of this Agreement with any such
|
|
||||||
Llama Materials; and (B) prominently display \u201CBuilt with Llama\u201D\\non
|
|
||||||
a related website, user interface, blogpost, about page, or product documentation.
|
|
||||||
If you use the\\nLlama Materials or any outputs or results of the Llama Materials
|
|
||||||
to create, train, fine tune, or\\notherwise improve an AI model, which is distributed
|
|
||||||
or made available, you shall also include \u201CLlama\u201D\\nat the beginning
|
|
||||||
of any such AI model name.\\n\\n ii. If you receive Llama Materials,
|
|
||||||
or any derivative works thereof, from a Licensee as part\\nof an integrated
|
|
||||||
end user product, then Section 2 of this Agreement will not apply to you. \\n\\n
|
|
||||||
\ iii. You must retain in all copies of the Llama Materials that you distribute
|
|
||||||
the \\nfollowing attribution notice within a \u201CNotice\u201D text file distributed
|
|
||||||
as a part of such copies: \\n\u201CLlama 3.2 is licensed under the Llama 3.2
|
|
||||||
Community License, Copyright \xA9 Meta Platforms,\\nInc. All Rights Reserved.\u201D\\n\\n
|
|
||||||
\ iv. Your use of the Llama Materials must comply with applicable laws
|
|
||||||
and regulations\\n(including trade compliance laws and regulations) and adhere
|
|
||||||
to the Acceptable Use Policy for\\nthe Llama Materials (available at https://www.llama.com/llama3_2/use-policy),
|
|
||||||
which is hereby \\nincorporated by reference into this Agreement.\\n \\n2.
|
|
||||||
Additional Commercial Terms. If, on the Llama 3.2 version release date, the
|
|
||||||
monthly active users\\nof the products or services made available by or for
|
|
||||||
Licensee, or Licensee\u2019s affiliates, \\nis greater than 700 million monthly
|
|
||||||
active users in the preceding calendar month, you must request \\na license
|
|
||||||
from Meta, which Meta may grant to you in its sole discretion, and you are not
|
|
||||||
authorized to\\nexercise any of the rights under this Agreement unless or until
|
|
||||||
Meta otherwise expressly grants you such rights.\\n\\n3. Disclaimer of Warranty.
|
|
||||||
UNLESS REQUIRED BY APPLICABLE LAW, THE LLAMA MATERIALS AND ANY OUTPUT AND \\nRESULTS
|
|
||||||
THEREFROM ARE PROVIDED ON AN \u201CAS IS\u201D BASIS, WITHOUT WARRANTIES OF
|
|
||||||
ANY KIND, AND META DISCLAIMS\\nALL WARRANTIES OF ANY KIND, BOTH EXPRESS AND
|
|
||||||
IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES\\nOF TITLE, NON-INFRINGEMENT,
|
|
||||||
MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE\\nFOR
|
|
||||||
DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING THE LLAMA MATERIALS
|
|
||||||
AND ASSUME ANY RISKS ASSOCIATED\\nWITH YOUR USE OF THE LLAMA MATERIALS AND ANY
|
|
||||||
OUTPUT AND RESULTS.\\n\\n4. Limitation of Liability. IN NO EVENT WILL META OR
|
|
||||||
ITS AFFILIATES BE LIABLE UNDER ANY THEORY OF LIABILITY, \\nWHETHER IN CONTRACT,
|
|
||||||
TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING OUT OF THIS AGREEMENT,
|
|
||||||
\\nFOR ANY LOST PROFITS OR ANY INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL,
|
|
||||||
EXEMPLARY OR PUNITIVE DAMAGES, EVEN \\nIF META OR ITS AFFILIATES HAVE BEEN ADVISED
|
|
||||||
OF THE POSSIBILITY OF ANY OF THE FOREGOING.\\n\\n5. Intellectual Property.\\n\\n
|
|
||||||
\ a. No trademark licenses are granted under this Agreement, and in connection
|
|
||||||
with the Llama Materials, \\nneither Meta nor Licensee may use any name or mark
|
|
||||||
owned by or associated with the other or any of its affiliates, \\nexcept as
|
|
||||||
required for reasonable and customary use in describing and redistributing the
|
|
||||||
Llama Materials or as \\nset forth in this Section 5(a). Meta hereby grants
|
|
||||||
you a license to use \u201CLlama\u201D (the \u201CMark\u201D) solely as required
|
|
||||||
\\nto comply with the last sentence of Section 1.b.i. You will comply with Meta\u2019s
|
|
||||||
brand guidelines (currently accessible \\nat https://about.meta.com/brand/resources/meta/company-brand/).
|
|
||||||
All goodwill arising out of your use of the Mark \\nwill inure to the benefit
|
|
||||||
of Meta.\\n\\n b. Subject to Meta\u2019s ownership of Llama Materials and
|
|
||||||
derivatives made by or for Meta, with respect to any\\n derivative works
|
|
||||||
and modifications of the Llama Materials that are made by you, as between you
|
|
||||||
and Meta,\\n you are and will be the owner of such derivative works and modifications.\\n\\n
|
|
||||||
\ c. If you institute litigation or other proceedings against Meta or any
|
|
||||||
entity (including a cross-claim or\\n counterclaim in a lawsuit) alleging
|
|
||||||
that the Llama Materials or Llama 3.2 outputs or results, or any portion\\n
|
|
||||||
\ of any of the foregoing, constitutes infringement of intellectual property
|
|
||||||
or other rights owned or licensable\\n by you, then any licenses granted
|
|
||||||
to you under this Agreement shall terminate as of the date such litigation or\\n
|
|
||||||
\ claim is filed or instituted. You will indemnify and hold harmless Meta
|
|
||||||
from and against any claim by any third\\n party arising out of or related
|
|
||||||
to your use or distribution of the Llama Materials.\\n\\n6. Term and Termination.
|
|
||||||
The term of this Agreement will commence upon your acceptance of this Agreement
|
|
||||||
or access\\nto the Llama Materials and will continue in full force and effect
|
|
||||||
until terminated in accordance with the terms\\nand conditions herein. Meta
|
|
||||||
may terminate this Agreement if you are in breach of any term or condition of
|
|
||||||
this\\nAgreement. Upon termination of this Agreement, you shall delete and cease
|
|
||||||
use of the Llama Materials. Sections 3,\\n4 and 7 shall survive the termination
|
|
||||||
of this Agreement. \\n\\n7. Governing Law and Jurisdiction. This Agreement will
|
|
||||||
be governed and construed under the laws of the State of \\nCalifornia without
|
|
||||||
regard to choice of law principles, and the UN Convention on Contracts for the
|
|
||||||
International\\nSale of Goods does not apply to this Agreement. The courts of
|
|
||||||
California shall have exclusive jurisdiction of\\nany dispute arising out of
|
|
||||||
this Agreement.\\\"\\nLICENSE \\\"**Llama 3.2** **Acceptable Use Policy**\\n\\nMeta
|
|
||||||
is committed to promoting safe and fair use of its tools and features, including
|
|
||||||
Llama 3.2. If you access or use Llama 3.2, you agree to this Acceptable Use
|
|
||||||
Policy (\u201C**Policy**\u201D). The most recent copy of this policy can be
|
|
||||||
found at [https://www.llama.com/llama3_2/use-policy](https://www.llama.com/llama3_2/use-policy).\\n\\n**Prohibited
|
|
||||||
Uses**\\n\\nWe want everyone to use Llama 3.2 safely and responsibly. You agree
|
|
||||||
you will not use, or allow others to use, Llama 3.2 to:\\n\\n\\n\\n1. Violate
|
|
||||||
the law or others\u2019 rights, including to:\\n 1. Engage in, promote, generate,
|
|
||||||
contribute to, encourage, plan, incite, or further illegal or unlawful activity
|
|
||||||
or content, such as:\\n 1. Violence or terrorism\\n 2. Exploitation
|
|
||||||
or harm to children, including the solicitation, creation, acquisition, or dissemination
|
|
||||||
of child exploitative content or failure to report Child Sexual Abuse Material\\n
|
|
||||||
\ 3. Human trafficking, exploitation, and sexual violence\\n 4.
|
|
||||||
The illegal distribution of information or materials to minors, including obscene
|
|
||||||
materials, or failure to employ legally required age-gating in connection with
|
|
||||||
such information or materials.\\n 5. Sexual solicitation\\n 6.
|
|
||||||
Any other criminal activity\\n 1. Engage in, promote, incite, or facilitate
|
|
||||||
the harassment, abuse, threatening, or bullying of individuals or groups of
|
|
||||||
individuals\\n 2. Engage in, promote, incite, or facilitate discrimination
|
|
||||||
or other unlawful or harmful conduct in the provision of employment, employment
|
|
||||||
benefits, credit, housing, other economic benefits, or other essential goods
|
|
||||||
and services\\n 3. Engage in the unauthorized or unlicensed practice of any
|
|
||||||
profession including, but not limited to, financial, legal, medical/health,
|
|
||||||
or related professional practices\\n 4. Collect, process, disclose, generate,
|
|
||||||
or infer private or sensitive information about individuals, including information
|
|
||||||
about individuals\u2019 identity, health, or demographic information, unless
|
|
||||||
you have obtained the right to do so in accordance with applicable law\\n 5.
|
|
||||||
Engage in or facilitate any action or generate any content that infringes, misappropriates,
|
|
||||||
or otherwise violates any third-party rights, including the outputs or results
|
|
||||||
of any products or services using the Llama Materials\\n 6. Create, generate,
|
|
||||||
or facilitate the creation of malicious code, malware, computer viruses or do
|
|
||||||
anything else that could disable, overburden, interfere with or impair the proper
|
|
||||||
working, integrity, operation or appearance of a website or computer system\\n
|
|
||||||
\ 7. Engage in any action, or facilitate any action, to intentionally circumvent
|
|
||||||
or remove usage restrictions or other safety measures, or to enable functionality
|
|
||||||
disabled by Meta\\n2. Engage in, promote, incite, facilitate, or assist in the
|
|
||||||
planning or development of activities that present a risk of death or bodily
|
|
||||||
harm to individuals, including use of Llama 3.2 related to the following:\\n
|
|
||||||
\ 8. Military, warfare, nuclear industries or applications, espionage, use
|
|
||||||
for materials or activities that are subject to the International Traffic Arms
|
|
||||||
Regulations (ITAR) maintained by the United States Department of State or to
|
|
||||||
the U.S. Biological Weapons Anti-Terrorism Act of 1989 or the Chemical Weapons
|
|
||||||
Convention Implementation Act of 1997\\n 9. Guns and illegal weapons (including
|
|
||||||
weapon development)\\n 10. Illegal drugs and regulated/controlled substances\\n
|
|
||||||
\ 11. Operation of critical infrastructure, transportation technologies, or
|
|
||||||
heavy machinery\\n 12. Self-harm or harm to others, including suicide, cutting,
|
|
||||||
and eating disorders\\n 13. Any content intended to incite or promote violence,
|
|
||||||
abuse, or any infliction of bodily harm to an individual\\n3. Intentionally
|
|
||||||
deceive or mislead others, including use of Llama 3.2 related to the following:\\n
|
|
||||||
\ 14. Generating, promoting, or furthering fraud or the creation or promotion
|
|
||||||
of disinformation\\n 15. Generating, promoting, or furthering defamatory
|
|
||||||
content, including the creation of defamatory statements, images, or other content\\n
|
|
||||||
\ 16. Generating, promoting, or further distributing spam\\n 17. Impersonating
|
|
||||||
another individual without consent, authorization, or legal right\\n 18.
|
|
||||||
Representing that the use of Llama 3.2 or outputs are human-generated\\n 19.
|
|
||||||
Generating or facilitating false online engagement, including fake reviews and
|
|
||||||
other means of fake online engagement\\n4. Fail to appropriately disclose to
|
|
||||||
end users any known dangers of your AI system\\n5. Interact with third party
|
|
||||||
tools, models, or software designed to generate unlawful content or engage in
|
|
||||||
unlawful or harmful conduct and/or represent that the outputs of such tools,
|
|
||||||
models, or software are associated with Meta or Llama 3.2\\n\\nWith respect
|
|
||||||
to any multimodal models included in Llama 3.2, the rights granted under Section
|
|
||||||
1(a) of the Llama 3.2 Community License Agreement are not being granted to you
|
|
||||||
if you are an individual domiciled in, or a company with a principal place of
|
|
||||||
business in, the European Union. This restriction does not apply to end users
|
|
||||||
of a product or service that incorporates any such multimodal models.\\n\\nPlease
|
|
||||||
report any violation of this Policy, software \u201Cbug,\u201D or other problems
|
|
||||||
that could lead to a violation of this Policy through one of the following means:\\n\\n\\n\\n*
|
|
||||||
Reporting issues with the model: [https://github.com/meta-llama/llama-models/issues](https://l.workplace.com/l.php?u=https%3A%2F%2Fgithub.com%2Fmeta-llama%2Fllama-models%2Fissues\\u0026h=AT0qV8W9BFT6NwihiOHRuKYQM_UnkzN_NmHMy91OT55gkLpgi4kQupHUl0ssR4dQsIQ8n3tfd0vtkobvsEvt1l4Ic6GXI2EeuHV8N08OG2WnbAmm0FL4ObkazC6G_256vN0lN9DsykCvCqGZ)\\n*
|
|
||||||
Reporting risky content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)\\n*
|
|
||||||
Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)\\n*
|
|
||||||
Reporting violations of the Acceptable Use Policy or unlicensed uses of Llama
|
|
||||||
3.2: LlamaUseReport@meta.com\\\"\\n\",\"parameters\":\"stop \\\"\\u003c|start_header_id|\\u003e\\\"\\nstop
|
|
||||||
\ \\\"\\u003c|end_header_id|\\u003e\\\"\\nstop \\\"\\u003c|eot_id|\\u003e\\\"\",\"template\":\"\\u003c|start_header_id|\\u003esystem\\u003c|end_header_id|\\u003e\\n\\nCutting
|
|
||||||
Knowledge Date: December 2023\\n\\n{{ if .System }}{{ .System }}\\n{{- end }}\\n{{-
|
|
||||||
if .Tools }}When you receive a tool call response, use the output to format
|
|
||||||
an answer to the orginal user question.\\n\\nYou are a helpful assistant with
|
|
||||||
tool calling capabilities.\\n{{- end }}\\u003c|eot_id|\\u003e\\n{{- range $i,
|
|
||||||
$_ := .Messages }}\\n{{- $last := eq (len (slice $.Messages $i)) 1 }}\\n{{-
|
|
||||||
if eq .Role \\\"user\\\" }}\\u003c|start_header_id|\\u003euser\\u003c|end_header_id|\\u003e\\n{{-
|
|
||||||
if and $.Tools $last }}\\n\\nGiven the following functions, please respond with
|
|
||||||
a JSON for a function call with its proper arguments that best answers the given
|
|
||||||
prompt.\\n\\nRespond in the format {\\\"name\\\": function name, \\\"parameters\\\":
|
|
||||||
dictionary of argument name and its value}. Do not use variables.\\n\\n{{ range
|
|
||||||
$.Tools }}\\n{{- . }}\\n{{ end }}\\n{{ .Content }}\\u003c|eot_id|\\u003e\\n{{-
|
|
||||||
else }}\\n\\n{{ .Content }}\\u003c|eot_id|\\u003e\\n{{- end }}{{ if $last }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
end }}\\n{{- else if eq .Role \\\"assistant\\\" }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n{{-
|
|
||||||
if .ToolCalls }}\\n{{ range .ToolCalls }}\\n{\\\"name\\\": \\\"{{ .Function.Name
|
|
||||||
}}\\\", \\\"parameters\\\": {{ .Function.Arguments }}}{{ end }}\\n{{- else }}\\n\\n{{
|
|
||||||
.Content }}\\n{{- end }}{{ if not $last }}\\u003c|eot_id|\\u003e{{ end }}\\n{{-
|
|
||||||
else if eq .Role \\\"tool\\\" }}\\u003c|start_header_id|\\u003eipython\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
.Content }}\\u003c|eot_id|\\u003e{{ if $last }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
end }}\\n{{- end }}\\n{{- end }}\",\"details\":{\"parent_model\":\"\",\"format\":\"gguf\",\"family\":\"llama\",\"families\":[\"llama\"],\"parameter_size\":\"3.2B\",\"quantization_level\":\"Q4_K_M\"},\"model_info\":{\"general.architecture\":\"llama\",\"general.basename\":\"Llama-3.2\",\"general.file_type\":15,\"general.finetune\":\"Instruct\",\"general.languages\":[\"en\",\"de\",\"fr\",\"it\",\"pt\",\"hi\",\"es\",\"th\"],\"general.parameter_count\":3212749888,\"general.quantization_version\":2,\"general.size_label\":\"3B\",\"general.tags\":[\"facebook\",\"meta\",\"pytorch\",\"llama\",\"llama-3\",\"text-generation\"],\"general.type\":\"model\",\"llama.attention.head_count\":24,\"llama.attention.head_count_kv\":8,\"llama.attention.key_length\":128,\"llama.attention.layer_norm_rms_epsilon\":0.00001,\"llama.attention.value_length\":128,\"llama.block_count\":28,\"llama.context_length\":131072,\"llama.embedding_length\":3072,\"llama.feed_forward_length\":8192,\"llama.rope.dimension_count\":128,\"llama.rope.freq_base\":500000,\"llama.vocab_size\":128256,\"tokenizer.ggml.bos_token_id\":128000,\"tokenizer.ggml.eos_token_id\":128009,\"tokenizer.ggml.merges\":null,\"tokenizer.ggml.model\":\"gpt2\",\"tokenizer.ggml.pre\":\"llama-bpe\",\"tokenizer.ggml.token_type\":null,\"tokenizer.ggml.tokens\":null},\"modified_at\":\"2024-12-31T11:53:14.529771974-05:00\"}"
headers:
Content-Type:
- application/json; charset=utf-8
Date:
- Tue, 31 Dec 2024 16:57:54 GMT
Transfer-Encoding:
- chunked
http_version: HTTP/1.1
status_code: 200
version: 1
@@ -1,243 +0,0 @@
interactions:
- request:
body: !!binary |
CuIcCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSuRwKEgoQY3Jld2FpLnRl
|
|
||||||
bGVtZXRyeRKjBwoQXK7w4+uvyEkrI9D5qyvcJxII5UmQ7hmczdIqDENyZXcgQ3JlYXRlZDABOfxQ
|
|
||||||
/hs4jBUYQUi3DBw4jBUYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODYuMEoaCg5weXRob25fdmVy
|
|
||||||
c2lvbhIICgYzLjEyLjdKLgoIY3Jld19rZXkSIgogYzk3YjVmZWI1ZDFiNjZiYjU5MDA2YWFhMDFh
|
|
||||||
MjljZDZKMQoHY3Jld19pZBImCiRkZjY3NGMwYi1hOTc0LTQ3NTAtYjlkMS0yZWQxNjM3MzFiNTZK
|
|
||||||
HAoMY3Jld19wcm9jZXNzEgwKCnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdf
|
|
||||||
bnVtYmVyX29mX3Rhc2tzEgIYAUobChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgBStECCgtjcmV3
|
|
||||||
X2FnZW50cxLBAgq+Alt7ImtleSI6ICIwN2Q5OWI2MzA0MTFkMzVmZDkwNDdhNTMyZDUzZGRhNyIs
|
|
||||||
ICJpZCI6ICI5MDYwYTQ2Zi02MDY3LTQ1N2MtOGU3ZC04NjAyN2YzY2U5ZDUiLCAicm9sZSI6ICJS
|
|
||||||
ZXNlYXJjaGVyIiwgInZlcmJvc2U/IjogZmFsc2UsICJtYXhfaXRlciI6IDIwLCAibWF4X3JwbSI6
|
|
||||||
IG51bGwsICJmdW5jdGlvbl9jYWxsaW5nX2xsbSI6ICIiLCAibGxtIjogImdwdC00by1taW5pIiwg
|
|
||||||
ImRlbGVnYXRpb25fZW5hYmxlZD8iOiBmYWxzZSwgImFsbG93X2NvZGVfZXhlY3V0aW9uPyI6IGZh
|
|
||||||
bHNlLCAibWF4X3JldHJ5X2xpbWl0IjogMiwgInRvb2xzX25hbWVzIjogW119XUr/AQoKY3Jld190
|
|
||||||
YXNrcxLwAQrtAVt7ImtleSI6ICI2Mzk5NjUxN2YzZjNmMWM5NGQ2YmI2MTdhYTBiMWM0ZiIsICJp
|
|
||||||
ZCI6ICJjYTA4ZjkyOS0yMmI0LTQyZmQtYjViMC05N2M3MjM0ZDk5OTEiLCAiYXN5bmNfZXhlY3V0
|
|
||||||
aW9uPyI6IGZhbHNlLCAiaHVtYW5faW5wdXQ/IjogZmFsc2UsICJhZ2VudF9yb2xlIjogIlJlc2Vh
|
|
||||||
cmNoZXIiLCAiYWdlbnRfa2V5IjogIjA3ZDk5YjYzMDQxMWQzNWZkOTA0N2E1MzJkNTNkZGE3Iiwg
|
|
||||||
InRvb2xzX25hbWVzIjogW119XXoCGAGFAQABAAASjgIKEOTJZh9R45IwgGVg9cinZmISCJopKRMf
|
|
||||||
bpMJKgxUYXNrIENyZWF0ZWQwATlG+zQcOIwVGEHk0zUcOIwVGEouCghjcmV3X2tleRIiCiBjOTdi
|
|
||||||
NWZlYjVkMWI2NmJiNTkwMDZhYWEwMWEyOWNkNkoxCgdjcmV3X2lkEiYKJGRmNjc0YzBiLWE5NzQt
|
|
||||||
NDc1MC1iOWQxLTJlZDE2MzczMWI1NkouCgh0YXNrX2tleRIiCiA2Mzk5NjUxN2YzZjNmMWM5NGQ2
|
|
||||||
YmI2MTdhYTBiMWM0ZkoxCgd0YXNrX2lkEiYKJGNhMDhmOTI5LTIyYjQtNDJmZC1iNWIwLTk3Yzcy
|
|
||||||
MzRkOTk5MXoCGAGFAQABAAASowcKEEvwrN8+tNMIBwtnA+ip7jASCI78Hrh2wlsBKgxDcmV3IENy
|
|
||||||
ZWF0ZWQwATkcRqYeOIwVGEE8erQeOIwVGEoaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjg2LjBKGgoO
|
|
||||||
cHl0aG9uX3ZlcnNpb24SCAoGMy4xMi43Si4KCGNyZXdfa2V5EiIKIDhjMjc1MmY0OWU1YjlkMmI2
|
|
||||||
OGNiMzVjYWM4ZmNjODZkSjEKB2NyZXdfaWQSJgokZmRkYzA4ZTMtNDUyNi00N2Q2LThlNWMtNjY0
|
|
||||||
YzIyMjc4ZDgyShwKDGNyZXdfcHJvY2VzcxIMCgpzZXF1ZW50aWFsShEKC2NyZXdfbWVtb3J5EgIQ
|
|
||||||
AEoaChRjcmV3X251bWJlcl9vZl90YXNrcxICGAFKGwoVY3Jld19udW1iZXJfb2ZfYWdlbnRzEgIY
|
|
||||||
AUrRAgoLY3Jld19hZ2VudHMSwQIKvgJbeyJrZXkiOiAiOGJkMjEzOWI1OTc1MTgxNTA2ZTQxZmQ5
|
|
||||||
YzQ1NjNkNzUiLCAiaWQiOiAiY2UxNjA2YjktMjdiOS00ZDc4LWEyODctNDZiMDNlZDg3ZTA1Iiwg
|
|
||||||
InJvbGUiOiAiUmVzZWFyY2hlciIsICJ2ZXJib3NlPyI6IGZhbHNlLCAibWF4X2l0ZXIiOiAyMCwg
|
|
||||||
Im1heF9ycG0iOiBudWxsLCAiZnVuY3Rpb25fY2FsbGluZ19sbG0iOiAiIiwgImxsbSI6ICJncHQt
|
|
||||||
NG8tbWluaSIsICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogZmFsc2UsICJhbGxvd19jb2RlX2V4ZWN1
|
|
||||||
dGlvbj8iOiBmYWxzZSwgIm1heF9yZXRyeV9saW1pdCI6IDIsICJ0b29sc19uYW1lcyI6IFtdfV1K
|
|
||||||
/wEKCmNyZXdfdGFza3MS8AEK7QFbeyJrZXkiOiAiMGQ2ODVhMjE5OTRkOTQ5MDk3YmM1YTU2ZDcz
|
|
||||||
N2U2ZDEiLCAiaWQiOiAiNDdkMzRjZjktMGYxZS00Y2JkLTgzMzItNzRjZjY0YWRlOThlIiwgImFz
|
|
||||||
eW5jX2V4ZWN1dGlvbj8iOiBmYWxzZSwgImh1bWFuX2lucHV0PyI6IGZhbHNlLCAiYWdlbnRfcm9s
|
|
||||||
ZSI6ICJSZXNlYXJjaGVyIiwgImFnZW50X2tleSI6ICI4YmQyMTM5YjU5NzUxODE1MDZlNDFmZDlj
|
|
||||||
NDU2M2Q3NSIsICJ0b29sc19uYW1lcyI6IFtdfV16AhgBhQEAAQAAEo4CChAf4TXS782b0PBJ4NSB
|
|
||||||
JXwsEgjXnd13GkMzlyoMVGFzayBDcmVhdGVkMAE5mb/cHjiMFRhBGRTiHjiMFRhKLgoIY3Jld19r
|
|
||||||
ZXkSIgogOGMyNzUyZjQ5ZTViOWQyYjY4Y2IzNWNhYzhmY2M4NmRKMQoHY3Jld19pZBImCiRmZGRj
|
|
||||||
MDhlMy00NTI2LTQ3ZDYtOGU1Yy02NjRjMjIyNzhkODJKLgoIdGFza19rZXkSIgogMGQ2ODVhMjE5
|
|
||||||
OTRkOTQ5MDk3YmM1YTU2ZDczN2U2ZDFKMQoHdGFza19pZBImCiQ0N2QzNGNmOS0wZjFlLTRjYmQt
|
|
||||||
ODMzMi03NGNmNjRhZGU5OGV6AhgBhQEAAQAAEqMHChAyBGKhzDhROB5pmAoXrikyEgj6SCwzj1dU
|
|
||||||
LyoMQ3JldyBDcmVhdGVkMAE5vkjTHziMFRhBRDbhHziMFRhKGgoOY3Jld2FpX3ZlcnNpb24SCAoG
|
|
||||||
MC44Ni4wShoKDnB5dGhvbl92ZXJzaW9uEggKBjMuMTIuN0ouCghjcmV3X2tleRIiCiBiNjczNjg2
|
|
||||||
ZmM4MjJjMjAzYzdlODc5YzY3NTQyNDY5OUoxCgdjcmV3X2lkEiYKJGYyYWVlYTYzLTU2OWUtNDUz
|
|
||||||
NS1iZTY0LTRiZjYzZmU5NjhjN0ocCgxjcmV3X3Byb2Nlc3MSDAoKc2VxdWVudGlhbEoRCgtjcmV3
|
|
||||||
X21lbW9yeRICEABKGgoUY3Jld19udW1iZXJfb2ZfdGFza3MSAhgBShsKFWNyZXdfbnVtYmVyX29m
|
|
||||||
X2FnZW50cxICGAFK0QIKC2NyZXdfYWdlbnRzEsECCr4CW3sia2V5IjogImI1OWNmNzdiNmU3NjU4
|
|
||||||
NDg3MGViMWMzODgyM2Q3ZTI4IiwgImlkIjogImJiZjNkM2E4LWEwMjUtNGI0ZC1hY2Q0LTFmNzcz
|
|
||||||
NTI3MWJmMCIsICJyb2xlIjogIlJlc2VhcmNoZXIiLCAidmVyYm9zZT8iOiBmYWxzZSwgIm1heF9p
|
|
||||||
dGVyIjogMjAsICJtYXhfcnBtIjogbnVsbCwgImZ1bmN0aW9uX2NhbGxpbmdfbGxtIjogIiIsICJs
|
|
||||||
bG0iOiAiZ3B0LTRvLW1pbmkiLCAiZGVsZWdhdGlvbl9lbmFibGVkPyI6IGZhbHNlLCAiYWxsb3df
|
|
||||||
Y29kZV9leGVjdXRpb24/IjogZmFsc2UsICJtYXhfcmV0cnlfbGltaXQiOiAyLCAidG9vbHNfbmFt
|
|
||||||
ZXMiOiBbXX1dSv8BCgpjcmV3X3Rhc2tzEvABCu0BW3sia2V5IjogImE1ZTVjNThjZWExYjlkMDAz
|
|
||||||
MzJlNjg0NDFkMzI3YmRmIiwgImlkIjogIjBiOTRiMTY0LTM5NTktNGFmYS05Njg4LWJjNmEwZWMy
|
|
||||||
MWYzOCIsICJhc3luY19leGVjdXRpb24/IjogZmFsc2UsICJodW1hbl9pbnB1dD8iOiBmYWxzZSwg
|
|
||||||
ImFnZW50X3JvbGUiOiAiUmVzZWFyY2hlciIsICJhZ2VudF9rZXkiOiAiYjU5Y2Y3N2I2ZTc2NTg0
|
|
||||||
ODcwZWIxYzM4ODIzZDdlMjgiLCAidG9vbHNfbmFtZXMiOiBbXX1degIYAYUBAAEAABKOAgoQyYfi
|
|
||||||
Ftim717svttBZY3p5hIIUxR5bBHzWWkqDFRhc2sgQ3JlYXRlZDABOV4OBiA4jBUYQbLjBiA4jBUY
|
|
||||||
Si4KCGNyZXdfa2V5EiIKIGI2NzM2ODZmYzgyMmMyMDNjN2U4NzljNjc1NDI0Njk5SjEKB2NyZXdf
|
|
||||||
aWQSJgokZjJhZWVhNjMtNTY5ZS00NTM1LWJlNjQtNGJmNjNmZTk2OGM3Si4KCHRhc2tfa2V5EiIK
|
|
||||||
IGE1ZTVjNThjZWExYjlkMDAzMzJlNjg0NDFkMzI3YmRmSjEKB3Rhc2tfaWQSJgokMGI5NGIxNjQt
|
|
||||||
Mzk1OS00YWZhLTk2ODgtYmM2YTBlYzIxZjM4egIYAYUBAAEAAA==
headers:
Accept:
- '*/*'
Accept-Encoding:
- gzip, deflate
Connection:
- keep-alive
Content-Length:
- '3685'
Content-Type:
- application/x-protobuf
User-Agent:
- OTel-OTLP-Exporter-Python/1.27.0
method: POST
uri: https://telemetry.crewai.com:4319/v1/traces
response:
body:
string: "\n\0"
headers:
Content-Length:
- '2'
Content-Type:
- application/x-protobuf
Date:
- Sun, 29 Dec 2024 04:43:27 GMT
status:
code: 200
message: OK
- request:
body: '{"messages": [{"role": "system", "content": "You are Researcher. You have
extensive AI research experience.\nYour personal goal is: Analyze AI topics\nTo
give my best complete final answer to the task use the exact following format:\n\nThought:
I now can give a great answer\nFinal Answer: Your final answer must be the great
and the most complete as possible, it must be outcome described.\n\nI MUST use
these formats, my job depends on it!"}, {"role": "user", "content": "\nCurrent
Task: Explain the advantages of AI.\n\nThis is the expect criteria for your
final answer: A summary of the main advantages, bullet points recommended.\nyou
MUST return the actual complete content as the final answer, not a summary.\n\nBegin!
This is VERY important to you, use the tools available and give your best Final
Answer, your job depends on it!\n\nThought:"}], "model": "gpt-4o-mini", "stop":
["\nObservation:"], "stream": false}'
headers:
accept:
- application/json
accept-encoding:
- gzip, deflate
connection:
- keep-alive
content-length:
- '922'
content-type:
- application/json
cookie:
- _cfuvid=eff7OIkJ0zWRunpA6z67LHqscmSe6XjNxXiPw1R3xCc-1733770413538-0.0.1.1-604800000
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.52.1
x-stainless-arch:
- x64
x-stainless-async:
- 'false'
x-stainless-lang:
- python
x-stainless-os:
- Linux
x-stainless-package-version:
- 1.52.1
x-stainless-raw-response:
- 'true'
x-stainless-retry-count:
- '0'
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
- 3.12.7
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
content: "{\n \"id\": \"chatcmpl-AjfR6FDuTw7NGzy8w7sxjvOkUQlru\",\n \"object\":
|
|
||||||
\"chat.completion\",\n \"created\": 1735447404,\n \"model\": \"gpt-4o-mini-2024-07-18\",\n
|
|
||||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
|
||||||
\"assistant\",\n \"content\": \"I now can give a great answer \\nFinal
|
|
||||||
Answer: \\n**Advantages of AI** \\n\\n1. **Increased Efficiency and Productivity**
|
|
||||||
\ \\n - AI systems can process large amounts of data quickly and accurately,
|
|
||||||
leading to faster decision-making and increased productivity in various sectors.\\n\\n2.
|
|
||||||
**Cost Savings** \\n - Automation of repetitive and time-consuming tasks
|
|
||||||
reduces labor costs and increases operational efficiency, allowing businesses
|
|
||||||
to allocate resources more effectively.\\n\\n3. **Enhanced Data Analysis** \\n
|
|
||||||
\ - AI excels at analyzing big data, identifying patterns, and providing insights
|
|
||||||
that support better strategic planning and business decision-making.\\n\\n4.
|
|
||||||
**24/7 Availability** \\n - AI solutions, such as chatbots and virtual assistants,
|
|
||||||
operate continuously without breaks, offering constant support and customer
|
|
||||||
service, enhancing user experience.\\n\\n5. **Personalization** \\n - AI
|
|
||||||
enables the customization of content, products, and services based on user preferences
|
|
||||||
and behaviors, leading to improved customer satisfaction and loyalty.\\n\\n6.
|
|
||||||
**Improved Accuracy** \\n - AI technologies, such as machine learning algorithms,
|
|
||||||
reduce the likelihood of human error in various processes, leading to greater
|
|
||||||
accuracy and reliability.\\n\\n7. **Enhanced Innovation** \\n - AI fosters
|
|
||||||
innovative solutions by providing new tools and approaches to problem-solving,
|
|
||||||
enabling companies to develop cutting-edge products and services.\\n\\n8. **Scalability**
|
|
||||||
\ \\n - AI can be scaled to handle varying amounts of workloads without significant
|
|
||||||
changes to infrastructure, making it easier for organizations to expand operations.\\n\\n9.
|
|
||||||
**Predictive Capabilities** \\n - Advanced analytics powered by AI can anticipate
|
|
||||||
trends and outcomes, allowing businesses to proactively adjust strategies and
|
|
||||||
improve forecasting.\\n\\n10. **Health Benefits** \\n - In healthcare, AI
|
|
||||||
assists in diagnostics, personalized treatment plans, and predictive analytics,
|
|
||||||
leading to better patient care and improved health outcomes.\\n\\n11. **Safety
|
|
||||||
and Risk Mitigation** \\n - AI can enhance safety in various industries
|
|
||||||
by taking over dangerous tasks, monitoring for hazards, and predicting maintenance
|
|
||||||
needs for critical machinery, thereby preventing accidents.\\n\\n12. **Reduced
|
|
||||||
Environmental Impact** \\n - AI can optimize resource usage in areas such
|
|
||||||
as energy consumption and supply chain logistics, contributing to sustainability
|
|
||||||
efforts and reducing overall environmental footprints.\",\n \"refusal\":
|
|
||||||
null\n },\n \"logprobs\": null,\n \"finish_reason\": \"stop\"\n
|
|
||||||
\ }\n ],\n \"usage\": {\n \"prompt_tokens\": 168,\n \"completion_tokens\":
|
|
||||||
440,\n \"total_tokens\": 608,\n \"prompt_tokens_details\": {\n \"cached_tokens\":
|
|
||||||
0,\n \"audio_tokens\": 0\n },\n \"completion_tokens_details\": {\n
|
|
||||||
\ \"reasoning_tokens\": 0,\n \"audio_tokens\": 0,\n \"accepted_prediction_tokens\":
|
|
||||||
0,\n \"rejected_prediction_tokens\": 0\n }\n },\n \"system_fingerprint\":
|
|
||||||
\"fp_0aa8d3e20b\"\n}\n"
headers:
CF-Cache-Status:
- DYNAMIC
CF-RAY:
- 8f9721053d1eb9f1-SEA
Connection:
- keep-alive
Content-Encoding:
- gzip
Content-Type:
- application/json
Date:
- Sun, 29 Dec 2024 04:43:32 GMT
Server:
- cloudflare
Set-Cookie:
- __cf_bm=5enubNIoQSGMYEgy8Q2FpzzhphA0y.0lXukRZrWFvMk-1735447412-1.0.1.1-FIK1sMkUl3YnW1gTC6ftDtb2mKsbosb4mwabdFAlWCfJ6pXeavYq.bPsfKNvzAb5WYq60yVGH5lHsJT05bhSgw;
path=/; expires=Sun, 29-Dec-24 05:13:32 GMT; domain=.api.openai.com; HttpOnly;
Secure; SameSite=None
- _cfuvid=63wmKMTuFamkLN8FBI4fP8JZWbjWiRxWm7wb3kz.z_A-1735447412038-0.0.1.1-604800000;
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
Transfer-Encoding:
- chunked
X-Content-Type-Options:
- nosniff
access-control-expose-headers:
- X-Request-ID
alt-svc:
- h3=":443"; ma=86400
openai-organization:
- crewai-iuxna1
openai-processing-ms:
- '7577'
openai-version:
- '2020-10-01'
strict-transport-security:
- max-age=31536000; includeSubDomains; preload
x-ratelimit-limit-requests:
- '30000'
x-ratelimit-limit-tokens:
- '150000000'
x-ratelimit-remaining-requests:
- '29999'
x-ratelimit-remaining-tokens:
- '149999793'
x-ratelimit-reset-requests:
- 2ms
x-ratelimit-reset-tokens:
- 0s
x-request-id:
- req_55b8d714656e8f10f4e23cbe9034d66b
http_version: HTTP/1.1
status_code: 200
version: 1
@@ -3,17 +3,39 @@ interactions:
|
|||||||
-      body: '{"messages": [{"role": "system", "content": "You are CEO. You''re an long
-        time CEO of a content creation agency with a Senior Writer on the team. You''re
-        now working on a new project and want to make sure the content produced is amazing.\nYour
-        personal goal is: Make sure the writers in your company produce amazing content.\nTo
-        give my best complete final answer to the task use the exact following format:\n\nThought:
-        I now can give a great answer\nFinal Answer: ... [final-answer-only prompt as recorded]
-        ..."}, {"role": "user", "content": "\nCurrent Task: Produce and amazing 1 paragraph
-        draft of an article about AI Agents. ... [task instructions as recorded] ...\n\nThought:"}],
-        "model": "gpt-4o-mini", "stop": ["\nObservation:"], "stream": false}'
+      body: '{"messages": [{"role": "system", "content": "You are CEO. You''re an long
+        time CEO of a content creation agency with a Senior Writer on the team. You''re
+        now working on a new project and want to make sure the content produced is amazing.\nYour
+        personal goal is: Make sure the writers in your company produce amazing content.\nYou
+        ONLY have access to the following tools, and should NEVER make up tools that
+        are not listed here:\n\nTool Name: Delegate work to coworker(task: str, context:
+        str, coworker: Optional[str] = None, **kwargs)\n ... \nTool Name: Ask question to
+        coworker(question: str, context: str, coworker: Optional[str] = None, **kwargs)\n
+        ... [tool descriptions and ReAct format instructions as recorded] ..."}, {"role":
+        "user", "content": "\nCurrent Task: Produce and amazing 1 paragraph draft of an
+        article about AI Agents. ... [task instructions as recorded] ...\n\nThought:"}],
+        "model": "gpt-4o"}'
       headers:
         accept:
         - application/json
@@ -22,13 +44,16 @@ interactions:
      connection:
      - keep-alive
      content-length:
-     - '1105'
+     - '2762'
      content-type:
      - application/json
+     cookie:
+     - __cf_bm=9.8sBYBkvBR8R1K_bVF7xgU..80XKlEIg3N2OBbTSCU-1727214102-1.0.1.1-.qiTLXbPamYUMSuyNsOEB9jhGu.jOifujOrx9E2JZvStbIZ9RTIiE44xKKNfLPxQkOi6qAT3h6htK8lPDGV_5g;
+       _cfuvid=lbRdAddVWV6W3f5Dm9SaOPWDUOxqtZBSPr_fTW26nEA-1727213194587-0.0.1.1-604800000
      host:
      - api.openai.com
      user-agent:
-     - OpenAI/Python 1.52.1
+     - OpenAI/Python 1.47.0
      x-stainless-arch:
      - arm64
      x-stainless-async:
@@ -38,11 +63,9 @@ interactions:
      x-stainless-os:
      - MacOS
      x-stainless-package-version:
-     - 1.52.1
+     - 1.47.0
      x-stainless-raw-response:
      - 'true'
-     x-stainless-retry-count:
-     - '0'
      x-stainless-runtime:
      - CPython
      x-stainless-runtime-version:
@@ -50,51 +73,29 @@ interactions:
    method: POST
    uri: https://api.openai.com/v1/chat/completions
  response:
-   content: "{\n  \"id\": \"chatcmpl-Ahe7liUPejwfqxMe8aEWmKGJ837em\",\n  \"object\": \"chat.completion\",\n
-     \"created\": 1734965705,\n  \"model\": \"gpt-4o-mini-2024-07-18\",\n ... \"content\": \"I now can
-     give a great answer \\nFinal Answer: In the rapidly evolving landscape of technology, AI agents
-     have emerged ... [full four-paragraph answer as recorded] ...\",\n ... \"usage\": {\"prompt_tokens\": 208,
-     \"completion_tokens\": 382, \"total_tokens\": 590, ...},\n  \"system_fingerprint\": \"fp_0aa8d3e20b\"\n}\n"
+   content: "{\n  \"id\": \"chatcmpl-AB7ZvxqgeOayGTQWwR61ASlZp0s74\",\n  \"object\": \"chat.completion\",\n
+     \"created\": 1727214103,\n  \"model\": \"gpt-4o-2024-05-13\",\n ... \"content\": \"Thought: To ensure
+     the content is amazing, I'll delegate the task of producing a one-paragraph draft of an article about
+     AI Agents to the Senior Writer with all necessary context.\\n\\nAction: Delegate work to coworker\\nAction
+     Input: ... [delegation payload as recorded] ...\",\n ... \"usage\": {\"prompt_tokens\": 608,
+     \"completion_tokens\": 160, \"total_tokens\": 768, ...},\n  \"system_fingerprint\": \"fp_e375328146\"\n}\n"
    headers:
      CF-Cache-Status:
      - DYNAMIC
      CF-RAY:
-     - 8f6930c97a33ae54-GRU
+     - 8c85f0b038a71cf3-GRU
      Connection:
      - keep-alive
      Content-Encoding:
@@ -102,77 +103,74 @@ interactions:
      Content-Type:
      - application/json
      Date:
-     - Mon, 23 Dec 2024 14:55:10 GMT
+     - Tue, 24 Sep 2024 21:41:45 GMT
      Server:
      - cloudflare
-     Set-Cookie:
-     - __cf_bm=g58erGPkGAltcfYpDRU4IsdEEzb955dGmBOAZacFlPA-1734965710-1.0.1.1-IiodiX3uxbT5xSa4seI7M.gRM4Jj46h2d6ZW2wCkSUYUAX.ivRh_sGQN2hucEMzdG8O87pc00dCl7E5W8KkyEA;
-       path=/; expires=Mon, 23-Dec-24 15:25:10 GMT; domain=.api.openai.com; HttpOnly;
-       Secure; SameSite=None
-     - _cfuvid=eQzzWvIXDS8Me1OIBdCG5F1qFyVfAo3sumvYRE7J41E-1734965710778-0.0.1.1-604800000;
-       path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
      Transfer-Encoding:
      - chunked
      X-Content-Type-Options:
      - nosniff
      access-control-expose-headers:
      - X-Request-ID
-     alt-svc:
-     - h3=":443"; ma=86400
      openai-organization:
      - crewai-iuxna1
      openai-processing-ms:
-     - '5401'
+     - '1826'
      openai-version:
      - '2020-10-01'
      strict-transport-security:
      - max-age=31536000; includeSubDomains; preload
      x-ratelimit-limit-requests:
-     - '30000'
+     - '10000'
      x-ratelimit-limit-tokens:
-     - '150000000'
+     - '30000000'
      x-ratelimit-remaining-requests:
-     - '29999'
+     - '9999'
      x-ratelimit-remaining-tokens:
-     - '149999746'
+     - '29999325'
      x-ratelimit-reset-requests:
-     - 2ms
+     - 6ms
      x-ratelimit-reset-tokens:
-     - 0s
+     - 1ms
      x-request-id:
-     - req_30791533923ae20626ef35a03ae66172
+     - req_79054638deeb01da76c5bba273bffc28
    http_version: HTTP/1.1
    status_code: 200
  - request:
    body: !!binary |
-     [base64 OTLP trace payload from the old recording]
+     [base64 OTLP trace payload from the new recording]
    headers:
      Accept:
      - '*/*'
@@ -181,7 +179,7 @@ interactions:
      Connection:
      - keep-alive
      Content-Length:
-     - '1577'
+     - '1842'
      Content-Type:
      - application/x-protobuf
      User-Agent:
@@ -197,8 +195,370 @@ interactions:
      Content-Type:
      - application/x-protobuf
      Date:
-     - Mon, 23 Dec 2024 14:55:10 GMT
+     - Tue, 24 Sep 2024 21:41:46 GMT
      status:
        code: 200
        message: OK
+ [the new recording adds three further interactions at this point: (1) the delegated request to
+   the Senior Writer ("Produce a one paragraph draft of an article about AI Agents", "model":
+   "gpt-4o", content-length '1545') and its response "chatcmpl-AB7ZxDYcPlSiBZsftdRs2cWbUJllW"
+   (gpt-4o-2024-05-13, usage 297/160/457 tokens) containing the one-paragraph draft; (2) a POST to
+   https://telemetry.crewai.com:4319/v1/traces recording the "Delegate work to coworker" tool usage;
+   (3) the CEO's follow-up request (content-length '4536') carrying the delegated draft as an
+   observation, answered by "chatcmpl-AB7a1PO3pMNybn76wXDFc5HE7ZRsL" (usage 923/715/1638 tokens)
+   with the final four-paragraph article; all recorded Tue, 24 Sep 2024 with OpenAI/Python 1.47.0]
version: 1
@@ -1,480 +0,0 @@
-interactions:
-  [entire cassette removed: recorded Sun, 22 Dec 2024 with "gpt-4o-mini" and OpenAI/Python 1.52.1,
-   the CEO agent is given a single "Test Tool" ("A test tool that just returns the input", argument
-   'query': 'Query to process'). The deleted interactions are: two chat completions in which the
-   CEO calls Test Tool ("chatcmpl-AhLsKP8xKkISk8ntUscyUKL30xRXW" and
-   "chatcmpl-AhLsMt1AgrzynC2TSJZZSwr9El8FV"), a base64 OTLP trace POST to
-   https://telemetry.crewai.com:4319/v1/traces covering the crew, task and two Test Tool usages,
-   and a final completion ("chatcmpl-AhLsNJa6GxRIHF8l8eViU7D6CyBHP", usage 546/343/889 tokens)
-   whose content is the four-paragraph AI article]
|
|
||||||
\"fp_0aa8d3e20b\"\n}\n"
|
|
||||||
headers:
|
|
||||||
CF-Cache-Status:
|
|
||||||
- DYNAMIC
|
|
||||||
CF-RAY:
|
|
||||||
- 8f62803eed8100d5-GRU
|
|
||||||
Connection:
|
|
||||||
- keep-alive
|
|
||||||
Content-Encoding:
|
|
||||||
- gzip
|
|
||||||
Content-Type:
|
|
||||||
- application/json
|
|
||||||
Date:
|
|
||||||
- Sun, 22 Dec 2024 19:26:04 GMT
|
|
||||||
Server:
|
|
||||||
- cloudflare
|
|
||||||
Transfer-Encoding:
|
|
||||||
- chunked
|
|
||||||
X-Content-Type-Options:
|
|
||||||
- nosniff
|
|
||||||
access-control-expose-headers:
|
|
||||||
- X-Request-ID
|
|
||||||
alt-svc:
|
|
||||||
- h3=":443"; ma=86400
|
|
||||||
openai-organization:
|
|
||||||
- crewai-iuxna1
|
|
||||||
openai-processing-ms:
|
|
||||||
- '4897'
|
|
||||||
openai-version:
|
|
||||||
- '2020-10-01'
|
|
||||||
strict-transport-security:
|
|
||||||
- max-age=31536000; includeSubDomains; preload
|
|
||||||
x-ratelimit-limit-requests:
|
|
||||||
- '30000'
|
|
||||||
x-ratelimit-limit-tokens:
|
|
||||||
- '150000000'
|
|
||||||
x-ratelimit-remaining-requests:
|
|
||||||
- '29999'
|
|
||||||
x-ratelimit-remaining-tokens:
|
|
||||||
- '149999342'
|
|
||||||
x-ratelimit-reset-requests:
|
|
||||||
- 2ms
|
|
||||||
x-ratelimit-reset-tokens:
|
|
||||||
- 0s
|
|
||||||
x-request-id:
|
|
||||||
- req_65fdf94aa8bbc10f64f2a27ccdcc5cc8
|
|
||||||
http_version: HTTP/1.1
|
|
||||||
status_code: 200
|
|
||||||
version: 1
|
|
||||||
@@ -1,623 +0,0 @@
|
|||||||
interactions:
|
|
||||||
- request:
|
|
||||||
body: '{"messages": [{"role": "system", "content": "You are CEO. You''re an long
|
|
||||||
time CEO of a content creation agency with a Senior Writer on the team. You''re
|
|
||||||
now working on a new project and want to make sure the content produced is amazing.\nYour
|
|
||||||
personal goal is: Make sure the writers in your company produce amazing content.\nYou
|
|
||||||
ONLY have access to the following tools, and should NEVER make up tools that
|
|
||||||
are not listed here:\n\nTool Name: Test Tool\nTool Arguments: {''query'': {''description'':
|
|
||||||
''Query to process'', ''type'': ''str''}}\nTool Description: A test tool that
|
|
||||||
just returns the input\nTool Name: Delegate work to coworker\nTool Arguments:
|
|
||||||
{''task'': {''description'': ''The task to delegate'', ''type'': ''str''}, ''context'':
|
|
||||||
{''description'': ''The context for the task'', ''type'': ''str''}, ''coworker'':
|
|
||||||
{''description'': ''The role/name of the coworker to delegate to'', ''type'':
|
|
||||||
''str''}}\nTool Description: Delegate a specific task to one of the following
|
|
||||||
coworkers: Senior Writer\nThe input to this tool should be the coworker, the
|
|
||||||
task you want them to do, and ALL necessary context to execute the task, they
|
|
||||||
know nothing about the task, so share absolute everything you know, don''t reference
|
|
||||||
things but instead explain them.\nTool Name: Ask question to coworker\nTool
|
|
||||||
Arguments: {''question'': {''description'': ''The question to ask'', ''type'':
|
|
||||||
''str''}, ''context'': {''description'': ''The context for the question'', ''type'':
|
|
||||||
''str''}, ''coworker'': {''description'': ''The role/name of the coworker to
|
|
||||||
ask'', ''type'': ''str''}}\nTool Description: Ask a specific question to one
|
|
||||||
of the following coworkers: Senior Writer\nThe input to this tool should be
|
|
||||||
the coworker, the question you have for them, and ALL necessary context to ask
|
|
||||||
the question properly, they know nothing about the question, so share absolute
|
|
||||||
everything you know, don''t reference things but instead explain them.\n\nUse
|
|
||||||
the following format:\n\nThought: you should always think about what to do\nAction:
|
|
||||||
the action to take, only one name of [Test Tool, Delegate work to coworker,
|
|
||||||
Ask question to coworker], just the name, exactly as it''s written.\nAction
|
|
||||||
Input: the input to the action, just a simple python dictionary, enclosed in
|
|
||||||
curly braces, using \" to wrap keys and values.\nObservation: the result of
|
|
||||||
the action\n\nOnce all necessary information is gathered:\n\nThought: I now
|
|
||||||
know the final answer\nFinal Answer: the final answer to the original input
|
|
||||||
question"}, {"role": "user", "content": "\nCurrent Task: Produce and amazing
|
|
||||||
1 paragraph draft of an article about AI Agents.\n\nThis is the expect criteria
|
|
||||||
for your final answer: A 4 paragraph article about AI.\nyou MUST return the
|
|
||||||
actual complete content as the final answer, not a summary.\n\nBegin! This is
|
|
||||||
VERY important to you, use the tools available and give your best Final Answer,
|
|
||||||
your job depends on it!\n\nThought:"}], "model": "gpt-4o-mini", "stop": ["\nObservation:"],
|
|
||||||
"stream": false}'
|
|
||||||
headers:
|
|
||||||
accept:
|
|
||||||
- application/json
|
|
||||||
accept-encoding:
|
|
||||||
- gzip, deflate
|
|
||||||
connection:
|
|
||||||
- keep-alive
|
|
||||||
content-length:
|
|
||||||
- '2892'
|
|
||||||
content-type:
|
|
||||||
- application/json
|
|
||||||
host:
|
|
||||||
- api.openai.com
|
|
||||||
user-agent:
|
|
||||||
- OpenAI/Python 1.52.1
|
|
||||||
x-stainless-arch:
|
|
||||||
- arm64
|
|
||||||
x-stainless-async:
|
|
||||||
- 'false'
|
|
||||||
x-stainless-lang:
|
|
||||||
- python
|
|
||||||
x-stainless-os:
|
|
||||||
- MacOS
|
|
||||||
x-stainless-package-version:
|
|
||||||
- 1.52.1
|
|
||||||
x-stainless-raw-response:
|
|
||||||
- 'true'
|
|
||||||
x-stainless-retry-count:
|
|
||||||
- '0'
|
|
||||||
x-stainless-runtime:
|
|
||||||
- CPython
|
|
||||||
x-stainless-runtime-version:
|
|
||||||
- 3.11.7
|
|
||||||
method: POST
|
|
||||||
uri: https://api.openai.com/v1/chat/completions
|
|
||||||
response:
|
|
||||||
content: "{\n \"id\": \"chatcmpl-AhLQELAjJpn76wiLmWBinm3sqf32l\",\n \"object\":
|
|
||||||
\"chat.completion\",\n \"created\": 1734893814,\n \"model\": \"gpt-4o-mini-2024-07-18\",\n
|
|
||||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
|
||||||
\"assistant\",\n \"content\": \"I need to gather information and insights
|
|
||||||
to ensure the Senior Writer produces a high-quality draft about AI Agents, which
|
|
||||||
will then serve as a foundation for the complete article.\\n\\nAction: Ask question
|
|
||||||
to coworker \\nAction Input: {\\\"question\\\":\\\"Can you provide a detailed
|
|
||||||
overview of what AI Agents are, their functionalities, and their applications
|
|
||||||
in real-world scenarios? Please include examples of how they are being used
|
|
||||||
in various industries, and discuss their potential impact on the future of technology
|
|
||||||
and society.\\\",\\\"context\\\":\\\"We are looking to create a comprehensive
|
|
||||||
understanding of AI Agents as part of a four-paragraph article. This will help
|
|
||||||
generate a high-quality draft for the article.\\\",\\\"coworker\\\":\\\"Senior
|
|
||||||
Writer\\\"} \",\n \"refusal\": null\n },\n \"logprobs\": null,\n
|
|
||||||
\ \"finish_reason\": \"stop\"\n }\n ],\n \"usage\": {\n \"prompt_tokens\":
|
|
||||||
604,\n \"completion_tokens\": 138,\n \"total_tokens\": 742,\n \"prompt_tokens_details\":
|
|
||||||
{\n \"cached_tokens\": 0,\n \"audio_tokens\": 0\n },\n \"completion_tokens_details\":
|
|
||||||
{\n \"reasoning_tokens\": 0,\n \"audio_tokens\": 0,\n \"accepted_prediction_tokens\":
|
|
||||||
0,\n \"rejected_prediction_tokens\": 0\n }\n },\n \"system_fingerprint\":
|
|
||||||
\"fp_0aa8d3e20b\"\n}\n"
|
|
||||||
headers:
|
|
||||||
CF-Cache-Status:
|
|
||||||
- DYNAMIC
|
|
||||||
CF-RAY:
|
|
||||||
- 8f6255a1bf08a519-GRU
|
|
||||||
Connection:
|
|
||||||
- keep-alive
|
|
||||||
Content-Encoding:
|
|
||||||
- gzip
|
|
||||||
Content-Type:
|
|
||||||
- application/json
|
|
||||||
Date:
|
|
||||||
- Sun, 22 Dec 2024 18:56:56 GMT
|
|
||||||
Server:
|
|
||||||
- cloudflare
|
|
||||||
Set-Cookie:
|
|
||||||
- __cf_bm=rVquIlcnYXc7wMkbKG5Ii90HxfQ_ukNJjDgSHhsWb1k-1734893816-1.0.1.1-33qDl8KNWcxLAGBuPhT8FrZ6QUnEy9oOYh2Fp2hIjDnF.cQlyrgxiWcuHljTTxG_mH7eQrf1AHJ6p8sxZJZ30A;
|
|
||||||
path=/; expires=Sun, 22-Dec-24 19:26:56 GMT; domain=.api.openai.com; HttpOnly;
|
|
||||||
Secure; SameSite=None
|
|
||||||
- _cfuvid=xqgLJsf3h5lBKZFTADRNNUqizeChNBBFoLvWiR2WPnw-1734893816555-0.0.1.1-604800000;
|
|
||||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
|
||||||
Transfer-Encoding:
|
|
||||||
- chunked
|
|
||||||
X-Content-Type-Options:
|
|
||||||
- nosniff
|
|
||||||
access-control-expose-headers:
|
|
||||||
- X-Request-ID
|
|
||||||
alt-svc:
|
|
||||||
- h3=":443"; ma=86400
|
|
||||||
openai-organization:
|
|
||||||
- crewai-iuxna1
|
|
||||||
openai-processing-ms:
|
|
||||||
- '2340'
|
|
||||||
openai-version:
|
|
||||||
- '2020-10-01'
|
|
||||||
strict-transport-security:
|
|
||||||
- max-age=31536000; includeSubDomains; preload
|
|
||||||
x-ratelimit-limit-requests:
|
|
||||||
- '30000'
|
|
||||||
x-ratelimit-limit-tokens:
|
|
||||||
- '150000000'
|
|
||||||
x-ratelimit-remaining-requests:
|
|
||||||
- '29999'
|
|
||||||
x-ratelimit-remaining-tokens:
|
|
||||||
- '149999305'
|
|
||||||
x-ratelimit-reset-requests:
|
|
||||||
- 2ms
|
|
||||||
x-ratelimit-reset-tokens:
|
|
||||||
- 0s
|
|
||||||
x-request-id:
|
|
||||||
- req_53956b48bd1188451efc104e8a234ef4
|
|
||||||
http_version: HTTP/1.1
|
|
||||||
status_code: 200
|
|
||||||
- request:
|
|
||||||
body: !!binary |
|
|
||||||
CrEMCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSiAwKEgoQY3Jld2FpLnRl
|
|
||||||
bGVtZXRyeRLgCQoQg/HA64g3phKbzz/hvUtbahIIpu+Csq+uWc0qDENyZXcgQ3JlYXRlZDABOSDm
|
|
||||||
Uli7lBMYQcgRXFi7lBMYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODYuMEoaCg5weXRob25fdmVy
|
|
||||||
c2lvbhIICgYzLjExLjdKLgoIY3Jld19rZXkSIgogZTY0OTU3M2EyNmU1ODc5MGNhYzIxYTM3Y2Q0
|
|
||||||
NDQzN2FKMQoHY3Jld19pZBImCiRhMWFjMTc0Ny0xMTA0LTRlZjItODZkNi02ZGRhNTFmMDlmMTdK
|
|
||||||
HAoMY3Jld19wcm9jZXNzEgwKCnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdf
|
|
||||||
bnVtYmVyX29mX3Rhc2tzEgIYAUobChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgCSooFCgtjcmV3
|
|
||||||
X2FnZW50cxL6BAr3BFt7ImtleSI6ICIzMjgyMTdiNmMyOTU5YmRmYzQ3Y2FkMDBlODQ4OTBkMCIs
|
|
||||||
ICJpZCI6ICI4YWUwNGY0Yy0wMjNiLTRkNWQtODAwZC02ZjlkMWFmMWExOTkiLCAicm9sZSI6ICJD
|
|
||||||
RU8iLCAidmVyYm9zZT8iOiBmYWxzZSwgIm1heF9pdGVyIjogMjAsICJtYXhfcnBtIjogbnVsbCwg
|
|
||||||
ImZ1bmN0aW9uX2NhbGxpbmdfbGxtIjogIiIsICJsbG0iOiAiZ3B0LTRvLW1pbmkiLCAiZGVsZWdh
|
|
||||||
dGlvbl9lbmFibGVkPyI6IHRydWUsICJhbGxvd19jb2RlX2V4ZWN1dGlvbj8iOiBmYWxzZSwgIm1h
|
|
||||||
eF9yZXRyeV9saW1pdCI6IDIsICJ0b29sc19uYW1lcyI6IFtdfSwgeyJrZXkiOiAiOWE1MDE1ZWY0
|
|
||||||
ODk1ZGM2Mjc4ZDU0ODE4YmE0NDZhZjciLCAiaWQiOiAiNDg2MWQ4YTMtMjMxYS00Mzc5LTk2ZmEt
|
|
||||||
MWQwZmQyZDI1MGYxIiwgInJvbGUiOiAiU2VuaW9yIFdyaXRlciIsICJ2ZXJib3NlPyI6IGZhbHNl
|
|
||||||
LCAibWF4X2l0ZXIiOiAyMCwgIm1heF9ycG0iOiBudWxsLCAiZnVuY3Rpb25fY2FsbGluZ19sbG0i
|
|
||||||
OiAiIiwgImxsbSI6ICJncHQtNG8tbWluaSIsICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogZmFsc2Us
|
|
||||||
ICJhbGxvd19jb2RlX2V4ZWN1dGlvbj8iOiBmYWxzZSwgIm1heF9yZXRyeV9saW1pdCI6IDIsICJ0
|
|
||||||
b29sc19uYW1lcyI6IFtdfV1KgwIKCmNyZXdfdGFza3MS9AEK8QFbeyJrZXkiOiAiMGI5ZDY1ZGI2
|
|
||||||
YjdhZWRmYjM5OGM1OWUyYTlmNzFlYzUiLCAiaWQiOiAiY2IyMmIxMzctZTA3ZC00NDA5LWI5NmMt
|
|
||||||
ZWQ2ZDU3MjFhNDNiIiwgImFzeW5jX2V4ZWN1dGlvbj8iOiBmYWxzZSwgImh1bWFuX2lucHV0PyI6
|
|
||||||
IGZhbHNlLCAiYWdlbnRfcm9sZSI6ICJDRU8iLCAiYWdlbnRfa2V5IjogIjMyODIxN2I2YzI5NTli
|
|
||||||
ZGZjNDdjYWQwMGU4NDg5MGQwIiwgInRvb2xzX25hbWVzIjogWyJ0ZXN0IHRvb2wiXX1degIYAYUB
|
|
||||||
AAEAABKOAgoQB7Z9AEDI9OTStHqguBSbLxIIj9dttVFJs9cqDFRhc2sgQ3JlYXRlZDABOYDae1i7
|
|
||||||
lBMYQeBHfFi7lBMYSi4KCGNyZXdfa2V5EiIKIGU2NDk1NzNhMjZlNTg3OTBjYWMyMWEzN2NkNDQ0
|
|
||||||
MzdhSjEKB2NyZXdfaWQSJgokYTFhYzE3NDctMTEwNC00ZWYyLTg2ZDYtNmRkYTUxZjA5ZjE3Si4K
|
|
||||||
CHRhc2tfa2V5EiIKIDBiOWQ2NWRiNmI3YWVkZmIzOThjNTllMmE5ZjcxZWM1SjEKB3Rhc2tfaWQS
|
|
||||||
JgokY2IyMmIxMzctZTA3ZC00NDA5LWI5NmMtZWQ2ZDU3MjFhNDNiegIYAYUBAAEAAA==
|
|
||||||
headers:
|
|
||||||
Accept:
|
|
||||||
- '*/*'
|
|
||||||
Accept-Encoding:
|
|
||||||
- gzip, deflate
|
|
||||||
Connection:
|
|
||||||
- keep-alive
|
|
||||||
Content-Length:
|
|
||||||
- '1588'
|
|
||||||
Content-Type:
|
|
||||||
- application/x-protobuf
|
|
||||||
User-Agent:
|
|
||||||
- OTel-OTLP-Exporter-Python/1.27.0
|
|
||||||
method: POST
|
|
||||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
|
||||||
response:
|
|
||||||
body:
|
|
||||||
string: "\n\0"
|
|
||||||
headers:
|
|
||||||
Content-Length:
|
|
||||||
- '2'
|
|
||||||
Content-Type:
|
|
||||||
- application/x-protobuf
|
|
||||||
Date:
|
|
||||||
- Sun, 22 Dec 2024 18:56:59 GMT
|
|
||||||
status:
|
|
||||||
code: 200
|
|
||||||
message: OK
|
|
||||||
- request:
|
|
||||||
body: '{"messages": [{"role": "system", "content": "You are Senior Writer. You''re
|
|
||||||
a senior writer, specialized in technology, software engineering, AI and startups.
|
|
||||||
You work as a freelancer and are now working on writing content for a new customer.\nYour
|
|
||||||
personal goal is: Write the best content about AI and AI agents.\nTo give my
|
|
||||||
best complete final answer to the task use the exact following format:\n\nThought:
|
|
||||||
I now can give a great answer\nFinal Answer: Your final answer must be the great
|
|
||||||
and the most complete as possible, it must be outcome described.\n\nI MUST use
|
|
||||||
these formats, my job depends on it!"}, {"role": "user", "content": "\nCurrent
|
|
||||||
Task: Can you provide a detailed overview of what AI Agents are, their functionalities,
|
|
||||||
and their applications in real-world scenarios? Please include examples of how
|
|
||||||
they are being used in various industries, and discuss their potential impact
|
|
||||||
on the future of technology and society.\n\nThis is the expect criteria for
|
|
||||||
your final answer: Your best answer to your coworker asking you this, accounting
|
|
||||||
for the context shared.\nyou MUST return the actual complete content as the
|
|
||||||
final answer, not a summary.\n\nThis is the context you''re working with:\nWe
|
|
||||||
are looking to create a comprehensive understanding of AI Agents as part of
|
|
||||||
a four-paragraph article. This will help generate a high-quality draft for the
|
|
||||||
article.\n\nBegin! This is VERY important to you, use the tools available and
|
|
||||||
give your best Final Answer, your job depends on it!\n\nThought:"}], "model":
|
|
||||||
"gpt-4o-mini", "stop": ["\nObservation:"], "stream": false}'
|
|
||||||
headers:
|
|
||||||
accept:
|
|
||||||
- application/json
|
|
||||||
accept-encoding:
|
|
||||||
- gzip, deflate
|
|
||||||
connection:
|
|
||||||
- keep-alive
|
|
||||||
content-length:
|
|
||||||
- '1572'
|
|
||||||
content-type:
|
|
||||||
- application/json
|
|
||||||
cookie:
|
|
||||||
- __cf_bm=rVquIlcnYXc7wMkbKG5Ii90HxfQ_ukNJjDgSHhsWb1k-1734893816-1.0.1.1-33qDl8KNWcxLAGBuPhT8FrZ6QUnEy9oOYh2Fp2hIjDnF.cQlyrgxiWcuHljTTxG_mH7eQrf1AHJ6p8sxZJZ30A;
|
|
||||||
_cfuvid=xqgLJsf3h5lBKZFTADRNNUqizeChNBBFoLvWiR2WPnw-1734893816555-0.0.1.1-604800000
|
|
||||||
host:
|
|
||||||
- api.openai.com
|
|
||||||
user-agent:
|
|
||||||
- OpenAI/Python 1.52.1
|
|
||||||
x-stainless-arch:
|
|
||||||
- arm64
|
|
||||||
x-stainless-async:
|
|
||||||
- 'false'
|
|
||||||
x-stainless-lang:
|
|
||||||
- python
|
|
||||||
x-stainless-os:
|
|
||||||
- MacOS
|
|
||||||
x-stainless-package-version:
|
|
||||||
- 1.52.1
|
|
||||||
x-stainless-raw-response:
|
|
||||||
- 'true'
|
|
||||||
x-stainless-retry-count:
|
|
||||||
- '0'
|
|
||||||
x-stainless-runtime:
|
|
||||||
- CPython
|
|
||||||
x-stainless-runtime-version:
|
|
||||||
- 3.11.7
|
|
||||||
method: POST
|
|
||||||
uri: https://api.openai.com/v1/chat/completions
|
|
||||||
response:
|
|
||||||
content: "{\n \"id\": \"chatcmpl-AhLQG5ubl99yeBYm6TTV0sodagMND\",\n \"object\":
|
|
||||||
\"chat.completion\",\n \"created\": 1734893816,\n \"model\": \"gpt-4o-mini-2024-07-18\",\n
|
|
||||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
|
||||||
\"assistant\",\n \"content\": \"I now can give a great answer \\nFinal
|
|
||||||
Answer: \\n\\n**Overview of AI Agents** \\nAI agents are advanced software
|
|
||||||
systems designed to autonomously perform tasks, make decisions, and learn from
|
|
||||||
their environments without needing constant human intervention. They leverage
|
|
||||||
machine learning, natural language processing, and various AI techniques to
|
|
||||||
simulate human-like understanding and autonomy. These agents can be categorized
|
|
||||||
into three types: reactive agents (which operate purely based on their environment),
|
|
||||||
deliberative agents (which can make decisions based on reasoning), and hybrid
|
|
||||||
agents that incorporate aspects of both types. Their ability to adapt and learn
|
|
||||||
over time makes them instrumental in automating processes across various domains.\\n\\n**Functionalities
|
|
||||||
of AI Agents** \\nThe core functionalities of AI agents include perception,
|
|
||||||
action, learning, and interaction. They perceive data through sensors or data
|
|
||||||
feeds, process information through algorithms, and take actions based on this
|
|
||||||
data. Machine learning allows them to refine their performance over time by
|
|
||||||
analyzing outcomes and adjusting their strategies accordingly. Interaction capabilities
|
|
||||||
enable them to communicate with users, providing insights, answering queries,
|
|
||||||
or even negotiating in some cases. These functionalities make AI agents invaluable
|
|
||||||
for tasks such as predictive analytics, personal assistance, and real-time decision-making
|
|
||||||
in complex systems.\\n\\n**Applications in Various Industries** \\nAI agents
|
|
||||||
are already being utilized across multiple industries, demonstrating their versatility
|
|
||||||
and efficiency. In healthcare, AI agents assist in diagnostics by analyzing
|
|
||||||
medical records and suggesting treatment plans tailored to individual patients.
|
|
||||||
In finance, they power robo-advisors that manage investment portfolios, automate
|
|
||||||
trading strategies, and provide financial advice based on real-time market analysis.
|
|
||||||
Furthermore, in customer service, AI chatbots serve as virtual assistants, enhancing
|
|
||||||
user experience by providing instant support and resolving queries without human
|
|
||||||
intervention. The logistics and supply chain industries have also seen AI agents
|
|
||||||
optimize inventory management and route planning, significantly improving operational
|
|
||||||
efficiency.\\n\\n**Future Impact on Technology and Society** \\nThe ongoing
|
|
||||||
development of AI agents is poised to have a profound impact on technology and
|
|
||||||
society. As these agents become more sophisticated, we can anticipate a shift
|
|
||||||
towards increased automation in both professional and personal spheres, leading
|
|
||||||
to enhanced productivity and new business models. However, this automation introduces
|
|
||||||
challenges such as job displacement and ethical considerations regarding decision-making
|
|
||||||
by AI. It is essential to foster an ongoing dialogue on the implications of
|
|
||||||
AI agents to ensure responsible development and integration into our daily lives.
|
|
||||||
As AI agents continue to evolve, they will undoubtedly play a pivotal role in
|
|
||||||
shaping the future of technology and its intersection with societal dynamics,
|
|
||||||
making it critical for us to engage thoughtfully with this emerging paradigm.\",\n
|
|
||||||
\ \"refusal\": null\n },\n \"logprobs\": null,\n \"finish_reason\":
|
|
||||||
\"stop\"\n }\n ],\n \"usage\": {\n \"prompt_tokens\": 289,\n \"completion_tokens\":
|
|
||||||
506,\n \"total_tokens\": 795,\n \"prompt_tokens_details\": {\n \"cached_tokens\":
|
|
||||||
0,\n \"audio_tokens\": 0\n },\n \"completion_tokens_details\": {\n
|
|
||||||
\ \"reasoning_tokens\": 0,\n \"audio_tokens\": 0,\n \"accepted_prediction_tokens\":
|
|
||||||
0,\n \"rejected_prediction_tokens\": 0\n }\n },\n \"system_fingerprint\":
|
|
||||||
\"fp_0aa8d3e20b\"\n}\n"
|
|
||||||
headers:
|
|
||||||
CF-Cache-Status:
|
|
||||||
- DYNAMIC
|
|
||||||
CF-RAY:
|
|
||||||
- 8f6255b1f832a519-GRU
|
|
||||||
Connection:
|
|
||||||
- keep-alive
|
|
||||||
Content-Encoding:
|
|
||||||
- gzip
|
|
||||||
Content-Type:
|
|
||||||
- application/json
|
|
||||||
Date:
|
|
||||||
- Sun, 22 Dec 2024 18:57:04 GMT
|
|
||||||
Server:
|
|
||||||
- cloudflare
|
|
||||||
Transfer-Encoding:
|
|
||||||
- chunked
|
|
||||||
X-Content-Type-Options:
|
|
||||||
- nosniff
|
|
||||||
access-control-expose-headers:
|
|
||||||
- X-Request-ID
|
|
||||||
alt-svc:
|
|
||||||
- h3=":443"; ma=86400
|
|
||||||
openai-organization:
|
|
||||||
- crewai-iuxna1
|
|
||||||
openai-processing-ms:
|
|
||||||
- '7836'
|
|
||||||
openai-version:
|
|
||||||
- '2020-10-01'
|
|
||||||
strict-transport-security:
|
|
||||||
- max-age=31536000; includeSubDomains; preload
|
|
||||||
x-ratelimit-limit-requests:
|
|
||||||
- '30000'
|
|
||||||
x-ratelimit-limit-tokens:
|
|
||||||
- '150000000'
|
|
||||||
x-ratelimit-remaining-requests:
|
|
||||||
- '29999'
|
|
||||||
x-ratelimit-remaining-tokens:
|
|
||||||
- '149999630'
|
|
||||||
x-ratelimit-reset-requests:
|
|
||||||
- 2ms
|
|
||||||
x-ratelimit-reset-tokens:
|
|
||||||
- 0s
|
|
||||||
x-request-id:
|
|
||||||
- req_c14268346d6ce72ceea4b1472f73c5ae
|
|
||||||
http_version: HTTP/1.1
|
|
||||||
status_code: 200
|
|
||||||
- request:
|
|
||||||
body: !!binary |
|
|
||||||
CtsBCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSsgEKEgoQY3Jld2FpLnRl
|
|
||||||
bGVtZXRyeRKbAQoQ7U/1ZgBSTkCXtesUNPA2URIIrnRWFVT58Z8qClRvb2wgVXNhZ2UwATl4lN3i
|
|
||||||
vZQTGEGgevzivZQTGEoaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjg2LjBKJwoJdG9vbF9uYW1lEhoK
|
|
||||||
GEFzayBxdWVzdGlvbiB0byBjb3dvcmtlckoOCghhdHRlbXB0cxICGAF6AhgBhQEAAQAA
|
|
||||||
headers:
|
|
||||||
Accept:
|
|
||||||
- '*/*'
|
|
||||||
Accept-Encoding:
|
|
||||||
- gzip, deflate
|
|
||||||
Connection:
|
|
||||||
- keep-alive
|
|
||||||
Content-Length:
|
|
||||||
- '222'
|
|
||||||
Content-Type:
|
|
||||||
- application/x-protobuf
|
|
||||||
User-Agent:
|
|
||||||
- OTel-OTLP-Exporter-Python/1.27.0
|
|
||||||
method: POST
|
|
||||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
|
||||||
response:
|
|
||||||
body:
|
|
||||||
string: "\n\0"
|
|
||||||
headers:
|
|
||||||
Content-Length:
|
|
||||||
- '2'
|
|
||||||
Content-Type:
|
|
||||||
- application/x-protobuf
|
|
||||||
Date:
|
|
||||||
- Sun, 22 Dec 2024 18:57:09 GMT
|
|
||||||
status:
|
|
||||||
code: 200
|
|
||||||
message: OK
|
|
||||||
- request:
|
|
||||||
body: '{"messages": [{"role": "system", "content": "You are CEO. You''re an long
|
|
||||||
time CEO of a content creation agency with a Senior Writer on the team. You''re
|
|
||||||
now working on a new project and want to make sure the content produced is amazing.\nYour
|
|
||||||
personal goal is: Make sure the writers in your company produce amazing content.\nYou
|
|
||||||
ONLY have access to the following tools, and should NEVER make up tools that
|
|
||||||
are not listed here:\n\nTool Name: Test Tool\nTool Arguments: {''query'': {''description'':
|
|
||||||
''Query to process'', ''type'': ''str''}}\nTool Description: A test tool that
|
|
||||||
just returns the input\nTool Name: Delegate work to coworker\nTool Arguments:
|
|
||||||
{''task'': {''description'': ''The task to delegate'', ''type'': ''str''}, ''context'':
|
|
||||||
{''description'': ''The context for the task'', ''type'': ''str''}, ''coworker'':
|
|
||||||
{''description'': ''The role/name of the coworker to delegate to'', ''type'':
|
|
||||||
''str''}}\nTool Description: Delegate a specific task to one of the following
|
|
||||||
coworkers: Senior Writer\nThe input to this tool should be the coworker, the
|
|
||||||
task you want them to do, and ALL necessary context to execute the task, they
|
|
||||||
know nothing about the task, so share absolute everything you know, don''t reference
|
|
||||||
things but instead explain them.\nTool Name: Ask question to coworker\nTool
|
|
||||||
Arguments: {''question'': {''description'': ''The question to ask'', ''type'':
|
|
||||||
''str''}, ''context'': {''description'': ''The context for the question'', ''type'':
|
|
||||||
''str''}, ''coworker'': {''description'': ''The role/name of the coworker to
|
|
||||||
ask'', ''type'': ''str''}}\nTool Description: Ask a specific question to one
|
|
||||||
of the following coworkers: Senior Writer\nThe input to this tool should be
|
|
||||||
the coworker, the question you have for them, and ALL necessary context to ask
|
|
||||||
the question properly, they know nothing about the question, so share absolute
|
|
||||||
everything you know, don''t reference things but instead explain them.\n\nUse
|
|
||||||
the following format:\n\nThought: you should always think about what to do\nAction:
|
|
||||||
the action to take, only one name of [Test Tool, Delegate work to coworker,
|
|
||||||
Ask question to coworker], just the name, exactly as it''s written.\nAction
|
|
||||||
Input: the input to the action, just a simple python dictionary, enclosed in
|
|
||||||
curly braces, using \" to wrap keys and values.\nObservation: the result of
|
|
||||||
the action\n\nOnce all necessary information is gathered:\n\nThought: I now
|
|
||||||
know the final answer\nFinal Answer: the final answer to the original input
|
|
||||||
question"}, {"role": "user", "content": "\nCurrent Task: Produce and amazing
|
|
||||||
1 paragraph draft of an article about AI Agents.\n\nThis is the expect criteria
|
|
||||||
for your final answer: A 4 paragraph article about AI.\nyou MUST return the
|
|
||||||
actual complete content as the final answer, not a summary.\n\nBegin! This is
|
|
||||||
VERY important to you, use the tools available and give your best Final Answer,
|
|
||||||
your job depends on it!\n\nThought:"}, {"role": "assistant", "content": "I need
|
|
||||||
to gather information and insights to ensure the Senior Writer produces a high-quality
|
|
||||||
draft about AI Agents, which will then serve as a foundation for the complete
|
|
||||||
article.\n\nAction: Ask question to coworker \nAction Input: {\"question\":\"Can
|
|
||||||
you provide a detailed overview of what AI Agents are, their functionalities,
|
|
||||||
and their applications in real-world scenarios? Please include examples of how
|
|
||||||
they are being used in various industries, and discuss their potential impact
|
|
||||||
on the future of technology and society.\",\"context\":\"We are looking to create
|
|
||||||
a comprehensive understanding of AI Agents as part of a four-paragraph article.
|
|
||||||
This will help generate a high-quality draft for the article.\",\"coworker\":\"Senior
|
|
||||||
Writer\"} \nObservation: **Overview of AI Agents** \nAI agents are advanced
|
|
||||||
software systems designed to autonomously perform tasks, make decisions, and
|
|
||||||
learn from their environments without needing constant human intervention. They
|
|
||||||
leverage machine learning, natural language processing, and various AI techniques
|
|
||||||
to simulate human-like understanding and autonomy. These agents can be categorized
|
|
||||||
into three types: reactive agents (which operate purely based on their environment),
|
|
||||||
deliberative agents (which can make decisions based on reasoning), and hybrid
|
|
||||||
agents that incorporate aspects of both types. Their ability to adapt and learn
|
|
||||||
over time makes them instrumental in automating processes across various domains.\n\n**Functionalities
|
|
||||||
of AI Agents** \nThe core functionalities of AI agents include perception,
|
|
||||||
action, learning, and interaction. They perceive data through sensors or data
|
|
||||||
feeds, process information through algorithms, and take actions based on this
|
|
||||||
data. Machine learning allows them to refine their performance over time by
|
|
||||||
analyzing outcomes and adjusting their strategies accordingly. Interaction capabilities
|
|
||||||
enable them to communicate with users, providing insights, answering queries,
|
|
||||||
or even negotiating in some cases. These functionalities make AI agents invaluable
|
|
||||||
for tasks such as predictive analytics, personal assistance, and real-time decision-making
|
|
||||||
in complex systems.\n\n**Applications in Various Industries** \nAI agents are
|
|
||||||
already being utilized across multiple industries, demonstrating their versatility
|
|
||||||
and efficiency. In healthcare, AI agents assist in diagnostics by analyzing
|
|
||||||
medical records and suggesting treatment plans tailored to individual patients.
|
|
||||||
In finance, they power robo-advisors that manage investment portfolios, automate
|
|
||||||
trading strategies, and provide financial advice based on real-time market analysis.
|
|
||||||
Furthermore, in customer service, AI chatbots serve as virtual assistants, enhancing
|
|
||||||
user experience by providing instant support and resolving queries without human
|
|
||||||
intervention. The logistics and supply chain industries have also seen AI agents
|
|
||||||
optimize inventory management and route planning, significantly improving operational
|
|
||||||
efficiency.\n\n**Future Impact on Technology and Society** \nThe ongoing development
|
|
||||||
of AI agents is poised to have a profound impact on technology and society.
|
|
||||||
As these agents become more sophisticated, we can anticipate a shift towards
|
|
||||||
increased automation in both professional and personal spheres, leading to enhanced
|
|
||||||
productivity and new business models. However, this automation introduces challenges
|
|
||||||
such as job displacement and ethical considerations regarding decision-making
|
|
||||||
by AI. It is essential to foster an ongoing dialogue on the implications of
|
|
||||||
AI agents to ensure responsible development and integration into our daily lives.
|
|
||||||
As AI agents continue to evolve, they will undoubtedly play a pivotal role in
|
|
||||||
shaping the future of technology and its intersection with societal dynamics,
|
|
||||||
making it critical for us to engage thoughtfully with this emerging paradigm."}],
|
|
||||||
"model": "gpt-4o-mini", "stop": ["\nObservation:"], "stream": false}'
|
|
||||||
headers:
|
|
||||||
accept:
|
|
||||||
- application/json
|
|
||||||
accept-encoding:
|
|
||||||
- gzip, deflate
|
|
||||||
connection:
|
|
||||||
- keep-alive
|
|
||||||
content-length:
|
|
||||||
- '6755'
|
|
||||||
content-type:
|
|
||||||
- application/json
|
|
||||||
cookie:
|
|
||||||
- __cf_bm=rVquIlcnYXc7wMkbKG5Ii90HxfQ_ukNJjDgSHhsWb1k-1734893816-1.0.1.1-33qDl8KNWcxLAGBuPhT8FrZ6QUnEy9oOYh2Fp2hIjDnF.cQlyrgxiWcuHljTTxG_mH7eQrf1AHJ6p8sxZJZ30A;
|
|
||||||
_cfuvid=xqgLJsf3h5lBKZFTADRNNUqizeChNBBFoLvWiR2WPnw-1734893816555-0.0.1.1-604800000
|
|
||||||
host:
|
|
||||||
- api.openai.com
|
|
||||||
user-agent:
|
|
||||||
- OpenAI/Python 1.52.1
|
|
||||||
x-stainless-arch:
|
|
||||||
- arm64
|
|
||||||
x-stainless-async:
|
|
||||||
- 'false'
|
|
||||||
x-stainless-lang:
|
|
||||||
- python
|
|
||||||
x-stainless-os:
|
|
||||||
- MacOS
|
|
||||||
x-stainless-package-version:
|
|
||||||
- 1.52.1
|
|
||||||
x-stainless-raw-response:
|
|
||||||
- 'true'
|
|
||||||
x-stainless-retry-count:
|
|
||||||
- '0'
|
|
||||||
x-stainless-runtime:
|
|
||||||
- CPython
|
|
||||||
x-stainless-runtime-version:
|
|
||||||
- 3.11.7
|
|
||||||
method: POST
|
|
||||||
uri: https://api.openai.com/v1/chat/completions
|
|
||||||
response:
|
|
||||||
content: "{\n \"id\": \"chatcmpl-AhLQOmLKuevpaWtRyyhHjHVYqvNVk\",\n \"object\":
|
|
||||||
\"chat.completion\",\n \"created\": 1734893824,\n \"model\": \"gpt-4o-mini-2024-07-18\",\n
|
|
||||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
|
||||||
\"assistant\",\n \"content\": \"Thought: I have received a comprehensive
|
|
||||||
overview from the Senior Writer that includes the necessary information about
|
|
||||||
AI Agents, their functionalities, applications, and future implications. Now,
|
|
||||||
I can compile this into a final answer following the specified format: a complete
|
|
||||||
four-paragraph article.\\n\\nFinal Answer: \\n\\n**Overview of AI Agents** \\nAI
|
|
||||||
agents are advanced software systems designed to autonomously perform tasks,
|
|
||||||
make decisions, and learn from their environments without needing constant human
|
|
||||||
intervention. They leverage machine learning, natural language processing, and
|
|
||||||
various AI techniques to simulate human-like understanding and autonomy. These
|
|
||||||
agents can be categorized into three types: reactive agents (which operate purely
|
|
||||||
based on their environment), deliberative agents (which can make decisions based
|
|
||||||
on reasoning), and hybrid agents that incorporate aspects of both types. Their
|
|
||||||
ability to adapt and learn over time makes them instrumental in automating processes
|
|
||||||
across various domains.\\n\\n**Functionalities of AI Agents** \\nThe core functionalities
|
|
||||||
of AI agents include perception, action, learning, and interaction. They perceive
|
|
||||||
data through sensors or data feeds, process information through algorithms,
|
|
||||||
and take actions based on this data. Machine learning allows them to refine
|
|
||||||
their performance over time by analyzing outcomes and adjusting their strategies
|
|
||||||
accordingly. Interaction capabilities enable them to communicate with users,
|
|
||||||
providing insights, answering queries, or even negotiating in some cases. These
|
|
||||||
functionalities make AI agents invaluable for tasks such as predictive analytics,
|
|
||||||
personal assistance, and real-time decision-making in complex systems.\\n\\n**Applications
|
|
||||||
in Various Industries** \\nAI agents are already being utilized across multiple
|
|
||||||
industries, demonstrating their versatility and efficiency. In healthcare, AI
|
|
||||||
agents assist in diagnostics by analyzing medical records and suggesting treatment
|
|
||||||
plans tailored to individual patients. In finance, they power robo-advisors
|
|
||||||
that manage investment portfolios, automate trading strategies, and provide
|
|
||||||
financial advice based on real-time market analysis. Furthermore, in customer
|
|
||||||
service, AI chatbots serve as virtual assistants, enhancing user experience
|
|
||||||
by providing instant support and resolving queries without human intervention.
|
|
||||||
The logistics and supply chain industries have also seen AI agents optimize
|
|
||||||
inventory management and route planning, significantly improving operational
|
|
||||||
efficiency.\\n\\n**Future Impact on Technology and Society** \\nThe ongoing
|
|
||||||
development of AI agents is poised to have a profound impact on technology and
|
|
||||||
society. As these agents become more sophisticated, we can anticipate a shift
|
|
||||||
towards increased automation in both professional and personal spheres, leading
|
|
||||||
to enhanced productivity and new business models. However, this automation introduces
|
|
||||||
challenges such as job displacement and ethical considerations regarding decision-making
|
|
||||||
by AI. It is essential to foster an ongoing dialogue on the implications of
|
|
||||||
AI agents to ensure responsible development and integration into our daily lives.
|
|
||||||
As AI agents continue to evolve, they will undoubtedly play a pivotal role in
|
|
||||||
shaping the future of technology and its intersection with societal dynamics,
|
|
||||||
making it critical for us to engage thoughtfully with this emerging paradigm.\",\n
|
|
||||||
\ \"refusal\": null\n },\n \"logprobs\": null,\n \"finish_reason\":
|
|
||||||
\"stop\"\n }\n ],\n \"usage\": {\n \"prompt_tokens\": 1242,\n \"completion_tokens\":
|
|
||||||
550,\n \"total_tokens\": 1792,\n \"prompt_tokens_details\": {\n \"cached_tokens\":
|
|
||||||
0,\n \"audio_tokens\": 0\n },\n \"completion_tokens_details\": {\n
|
|
||||||
\ \"reasoning_tokens\": 0,\n \"audio_tokens\": 0,\n \"accepted_prediction_tokens\":
|
|
||||||
0,\n \"rejected_prediction_tokens\": 0\n }\n },\n \"system_fingerprint\":
|
|
||||||
\"fp_0aa8d3e20b\"\n}\n"
|
|
||||||
headers:
|
|
||||||
CF-Cache-Status:
|
|
||||||
- DYNAMIC
|
|
||||||
CF-RAY:
|
|
||||||
- 8f6255e49b37a519-GRU
|
|
||||||
Connection:
|
|
||||||
- keep-alive
|
|
||||||
Content-Encoding:
|
|
||||||
- gzip
|
|
||||||
Content-Type:
|
|
||||||
- application/json
|
|
||||||
Date:
|
|
||||||
- Sun, 22 Dec 2024 18:57:12 GMT
|
|
||||||
Server:
|
|
||||||
- cloudflare
|
|
||||||
Transfer-Encoding:
|
|
||||||
- chunked
|
|
||||||
X-Content-Type-Options:
|
|
||||||
- nosniff
|
|
||||||
access-control-expose-headers:
|
|
||||||
- X-Request-ID
|
|
||||||
alt-svc:
|
|
||||||
- h3=":443"; ma=86400
|
|
||||||
openai-organization:
|
|
||||||
- crewai-iuxna1
|
|
||||||
openai-processing-ms:
|
|
||||||
- '7562'
|
|
||||||
openai-version:
|
|
||||||
- '2020-10-01'
|
|
||||||
strict-transport-security:
|
|
||||||
- max-age=31536000; includeSubDomains; preload
|
|
||||||
x-ratelimit-limit-requests:
|
|
||||||
- '30000'
|
|
||||||
x-ratelimit-limit-tokens:
|
|
||||||
- '150000000'
|
|
||||||
x-ratelimit-remaining-requests:
|
|
||||||
- '29999'
|
|
||||||
x-ratelimit-remaining-tokens:
|
|
||||||
- '149998353'
|
|
||||||
x-ratelimit-reset-requests:
|
|
||||||
- 2ms
|
|
||||||
x-ratelimit-reset-tokens:
|
|
||||||
- 0s
|
|
||||||
x-request-id:
|
|
||||||
- req_a812bbb85b3d785660c4662212614ab9
|
|
||||||
http_version: HTTP/1.1
|
|
||||||
status_code: 200
|
|
||||||
version: 1
|
|
||||||
File diff suppressed because it is too large
35
tests/cassettes/test_llm_call_with_ollama_gemma.yaml
Normal file
35
tests/cassettes/test_llm_call_with_ollama_gemma.yaml
Normal file
@@ -0,0 +1,35 @@
interactions:
- request:
    body: '{"model": "gemma2:latest", "prompt": "### User:\nRespond in 20 words. Who
      are you?\n\n", "options": {"num_predict": 30, "temperature": 0.7}, "stream":
      false}'
    headers:
      Accept:
      - '*/*'
      Accept-Encoding:
      - gzip, deflate
      Connection:
      - keep-alive
      Content-Length:
      - '157'
      Content-Type:
      - application/json
      User-Agent:
      - python-requests/2.31.0
    method: POST
    uri: http://localhost:8080/api/generate
  response:
    body:
      string: '{"model":"gemma2:latest","created_at":"2024-09-24T21:57:52.329049Z","response":"I
        am Gemma, an open-weights AI assistant trained by Google DeepMind. \n","done":true,"done_reason":"stop","context":[106,1645,108,6176,4926,235292,108,54657,575,235248,235284,235276,3907,235265,7702,708,692,235336,109,107,108,106,2516,108,235285,1144,137061,235269,671,2174,235290,30316,16481,20409,17363,731,6238,20555,35777,235265,139,108],"total_duration":991843667,"load_duration":31664750,"prompt_eval_count":25,"prompt_eval_duration":51409000,"eval_count":19,"eval_duration":908132000}'
    headers:
      Content-Length:
      - '572'
      Content-Type:
      - application/json; charset=utf-8
      Date:
      - Tue, 24 Sep 2024 21:57:52 GMT
    status:
      code: 200
      message: OK
version: 1
@@ -1,449 +0,0 @@
|
|||||||
interactions:
|
|
||||||
- request:
|
|
||||||
body: '{"model": "llama3.2:3b", "prompt": "### User:\nRespond in 20 words. Who
|
|
||||||
are you?\n\n", "options": {"temperature": 0.7, "num_predict": 30}, "stream":
|
|
||||||
false}'
|
|
||||||
headers:
|
|
||||||
accept:
|
|
||||||
- '*/*'
|
|
||||||
accept-encoding:
|
|
||||||
- gzip, deflate
|
|
||||||
connection:
|
|
||||||
- keep-alive
|
|
||||||
content-length:
|
|
||||||
- '155'
|
|
||||||
host:
|
|
||||||
- localhost:11434
|
|
||||||
user-agent:
|
|
||||||
- litellm/1.56.4
|
|
||||||
method: POST
|
|
||||||
uri: http://localhost:11434/api/generate
|
|
||||||
response:
|
|
||||||
content: '{"model":"llama3.2:3b","created_at":"2024-12-31T17:00:06.295261Z","response":"I''m
|
|
||||||
an AI assistant, here to provide information and answer questions to the best
|
|
||||||
of my abilities and knowledge.","done":true,"done_reason":"stop","context":[128006,9125,128007,271,38766,1303,33025,2696,25,6790,220,2366,18,271,128009,128006,882,128007,271,14711,2724,512,66454,304,220,508,4339,13,10699,527,499,1980,128009,128006,78191,128007,271,40,2846,459,15592,18328,11,1618,311,3493,2038,323,4320,4860,311,279,1888,315,856,18000,323,6677,13],"total_duration":826912750,"load_duration":32648125,"prompt_eval_count":38,"prompt_eval_duration":389000000,"eval_count":23,"eval_duration":404000000}'
|
|
||||||
headers:
|
|
||||||
Content-Length:
|
|
||||||
- '675'
|
|
||||||
Content-Type:
|
|
||||||
- application/json; charset=utf-8
|
|
||||||
Date:
|
|
||||||
- Tue, 31 Dec 2024 17:00:06 GMT
|
|
||||||
http_version: HTTP/1.1
|
|
||||||
status_code: 200
|
|
||||||
- request:
|
|
||||||
body: '{"name": "llama3.2:3b"}'
|
|
||||||
headers:
|
|
||||||
accept:
|
|
||||||
- '*/*'
|
|
||||||
accept-encoding:
|
|
||||||
- gzip, deflate
|
|
||||||
connection:
|
|
||||||
- keep-alive
|
|
||||||
content-length:
|
|
||||||
- '23'
|
|
||||||
content-type:
|
|
||||||
- application/json
|
|
||||||
host:
|
|
||||||
- localhost:11434
|
|
||||||
user-agent:
|
|
||||||
- litellm/1.56.4
|
|
||||||
method: POST
|
|
||||||
uri: http://localhost:11434/api/show
|
|
||||||
response:
|
|
||||||
content: "{\"license\":\"LLAMA 3.2 COMMUNITY LICENSE AGREEMENT\\nLlama 3.2 Version
|
|
||||||
Release Date: September 25, 2024\\n\\n\u201CAgreement\u201D means the terms
|
|
||||||
and conditions for use, reproduction, distribution \\nand modification of the
|
|
||||||
Llama Materials set forth herein.\\n\\n\u201CDocumentation\u201D means the specifications,
|
|
||||||
manuals and documentation accompanying Llama 3.2\\ndistributed by Meta at https://llama.meta.com/doc/overview.\\n\\n\u201CLicensee\u201D
|
|
||||||
or \u201Cyou\u201D means you, or your employer or any other person or entity
|
|
||||||
(if you are \\nentering into this Agreement on such person or entity\u2019s
|
|
||||||
behalf), of the age required under\\napplicable laws, rules or regulations to
|
|
||||||
provide legal consent and that has legal authority\\nto bind your employer or
|
|
||||||
such other person or entity if you are entering in this Agreement\\non their
|
|
||||||
behalf.\\n\\n\u201CLlama 3.2\u201D means the foundational large language models
|
|
||||||
and software and algorithms, including\\nmachine-learning model code, trained
|
|
||||||
model weights, inference-enabling code, training-enabling code,\\nfine-tuning
|
|
||||||
enabling code and other elements of the foregoing distributed by Meta at \\nhttps://www.llama.com/llama-downloads.\\n\\n\u201CLlama
|
|
||||||
Materials\u201D means, collectively, Meta\u2019s proprietary Llama 3.2 and Documentation
|
|
||||||
(and \\nany portion thereof) made available under this Agreement.\\n\\n\u201CMeta\u201D
|
|
||||||
or \u201Cwe\u201D means Meta Platforms Ireland Limited (if you are located in
|
|
||||||
or, \\nif you are an entity, your principal place of business is in the EEA
|
|
||||||
or Switzerland) \\nand Meta Platforms, Inc. (if you are located outside of the
|
|
||||||
EEA or Switzerland). \\n\\n\\nBy clicking \u201CI Accept\u201D below or by using
|
|
||||||
or distributing any portion or element of the Llama Materials,\\nyou agree to
|
|
||||||
be bound by this Agreement.\\n\\n\\n1. License Rights and Redistribution.\\n\\n
|
|
||||||
\ a. Grant of Rights. You are granted a non-exclusive, worldwide, \\nnon-transferable
|
|
||||||
and royalty-free limited license under Meta\u2019s intellectual property or
|
|
||||||
other rights \\nowned by Meta embodied in the Llama Materials to use, reproduce,
|
|
||||||
distribute, copy, create derivative works \\nof, and make modifications to the
|
|
||||||
Llama Materials. \\n\\n b. Redistribution and Use. \\n\\n i. If
|
|
||||||
you distribute or make available the Llama Materials (or any derivative works
|
|
||||||
thereof), \\nor a product or service (including another AI model) that contains
|
|
||||||
any of them, you shall (A) provide\\na copy of this Agreement with any such
|
|
||||||
Llama Materials; and (B) prominently display \u201CBuilt with Llama\u201D\\non
|
|
||||||
a related website, user interface, blogpost, about page, or product documentation.
|
|
||||||
If you use the\\nLlama Materials or any outputs or results of the Llama Materials
|
|
||||||
to create, train, fine tune, or\\notherwise improve an AI model, which is distributed
|
|
||||||
or made available, you shall also include \u201CLlama\u201D\\nat the beginning
|
|
||||||
of any such AI model name.\\n\\n ii. If you receive Llama Materials,
|
|
||||||
or any derivative works thereof, from a Licensee as part\\nof an integrated
|
|
||||||
end user product, then Section 2 of this Agreement will not apply to you. \\n\\n
|
|
||||||
\ iii. You must retain in all copies of the Llama Materials that you distribute
|
|
||||||
the \\nfollowing attribution notice within a \u201CNotice\u201D text file distributed
|
|
||||||
as a part of such copies: \\n\u201CLlama 3.2 is licensed under the Llama 3.2
|
|
||||||
Community License, Copyright \xA9 Meta Platforms,\\nInc. All Rights Reserved.\u201D\\n\\n
|
|
||||||
\ iv. Your use of the Llama Materials must comply with applicable laws
|
|
||||||
and regulations\\n(including trade compliance laws and regulations) and adhere
|
|
||||||
to the Acceptable Use Policy for\\nthe Llama Materials (available at https://www.llama.com/llama3_2/use-policy),
|
|
||||||
which is hereby \\nincorporated by reference into this Agreement.\\n \\n2.
|
|
||||||
Additional Commercial Terms. If, on the Llama 3.2 version release date, the
|
|
||||||
monthly active users\\nof the products or services made available by or for
|
|
||||||
Licensee, or Licensee\u2019s affiliates, \\nis greater than 700 million monthly
|
|
||||||
active users in the preceding calendar month, you must request \\na license
|
|
||||||
from Meta, which Meta may grant to you in its sole discretion, and you are not
|
|
||||||
authorized to\\nexercise any of the rights under this Agreement unless or until
|
|
||||||
Meta otherwise expressly grants you such rights.\\n\\n3. Disclaimer of Warranty.
|
|
||||||
UNLESS REQUIRED BY APPLICABLE LAW, THE LLAMA MATERIALS AND ANY OUTPUT AND \\nRESULTS
|
|
||||||
THEREFROM ARE PROVIDED ON AN \u201CAS IS\u201D BASIS, WITHOUT WARRANTIES OF
|
|
||||||
ANY KIND, AND META DISCLAIMS\\nALL WARRANTIES OF ANY KIND, BOTH EXPRESS AND
|
|
||||||
IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES\\nOF TITLE, NON-INFRINGEMENT,
|
|
||||||
MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE\\nFOR
|
|
||||||
DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING THE LLAMA MATERIALS
|
|
||||||
AND ASSUME ANY RISKS ASSOCIATED\\nWITH YOUR USE OF THE LLAMA MATERIALS AND ANY
|
|
||||||
OUTPUT AND RESULTS.\\n\\n4. Limitation of Liability. IN NO EVENT WILL META OR
|
|
||||||
ITS AFFILIATES BE LIABLE UNDER ANY THEORY OF LIABILITY, \\nWHETHER IN CONTRACT,
|
|
||||||
TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING OUT OF THIS AGREEMENT,
|
|
||||||
\\nFOR ANY LOST PROFITS OR ANY INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL,
|
|
||||||
EXEMPLARY OR PUNITIVE DAMAGES, EVEN \\nIF META OR ITS AFFILIATES HAVE BEEN ADVISED
|
|
||||||
OF THE POSSIBILITY OF ANY OF THE FOREGOING.\\n\\n5. Intellectual Property.\\n\\n
|
|
||||||
\ a. No trademark licenses are granted under this Agreement, and in connection
|
|
||||||
with the Llama Materials, \\nneither Meta nor Licensee may use any name or mark
|
|
||||||
owned by or associated with the other or any of its affiliates, \\nexcept as
|
|
||||||
required for reasonable and customary use in describing and redistributing the
|
|
||||||
Llama Materials or as \\nset forth in this Section 5(a). Meta hereby grants
|
|
||||||
you a license to use \u201CLlama\u201D (the \u201CMark\u201D) solely as required
|
|
||||||
\\nto comply with the last sentence of Section 1.b.i. You will comply with Meta\u2019s
|
|
||||||
brand guidelines (currently accessible \\nat https://about.meta.com/brand/resources/meta/company-brand/).
|
|
||||||
All goodwill arising out of your use of the Mark \\nwill inure to the benefit
|
|
||||||
of Meta.\\n\\n b. Subject to Meta\u2019s ownership of Llama Materials and
|
|
||||||
derivatives made by or for Meta, with respect to any\\n derivative works
|
|
||||||
and modifications of the Llama Materials that are made by you, as between you
|
|
||||||
and Meta,\\n you are and will be the owner of such derivative works and modifications.\\n\\n
|
|
||||||
\ c. If you institute litigation or other proceedings against Meta or any
|
|
||||||
entity (including a cross-claim or\\n counterclaim in a lawsuit) alleging
|
|
||||||
that the Llama Materials or Llama 3.2 outputs or results, or any portion\\n
|
|
||||||
\ of any of the foregoing, constitutes infringement of intellectual property
|
|
||||||
or other rights owned or licensable\\n by you, then any licenses granted
|
|
||||||
to you under this Agreement shall terminate as of the date such litigation or\\n
|
|
||||||
\ claim is filed or instituted. You will indemnify and hold harmless Meta
|
|
||||||
from and against any claim by any third\\n party arising out of or related
|
|
||||||
to your use or distribution of the Llama Materials.\\n\\n6. Term and Termination.
|
|
||||||
The term of this Agreement will commence upon your acceptance of this Agreement
|
|
||||||
or access\\nto the Llama Materials and will continue in full force and effect
|
|
||||||
until terminated in accordance with the terms\\nand conditions herein. Meta
|
|
||||||
may terminate this Agreement if you are in breach of any term or condition of
|
|
||||||
this\\nAgreement. Upon termination of this Agreement, you shall delete and cease
|
|
||||||
use of the Llama Materials. Sections 3,\\n4 and 7 shall survive the termination
|
|
||||||
of this Agreement. \\n\\n7. Governing Law and Jurisdiction. This Agreement will
|
|
||||||
be governed and construed under the laws of the State of \\nCalifornia without
|
|
||||||
regard to choice of law principles, and the UN Convention on Contracts for the
|
|
||||||
International\\nSale of Goods does not apply to this Agreement. The courts of
|
|
||||||
California shall have exclusive jurisdiction of\\nany dispute arising out of
|
|
||||||
this Agreement.\\n**Llama 3.2** **Acceptable Use Policy**\\n\\nMeta is committed
|
|
||||||
to promoting safe and fair use of its tools and features, including Llama 3.2.
|
|
||||||
If you access or use Llama 3.2, you agree to this Acceptable Use Policy (\u201C**Policy**\u201D).
|
|
||||||
The most recent copy of this policy can be found at [https://www.llama.com/llama3_2/use-policy](https://www.llama.com/llama3_2/use-policy).\\n\\n**Prohibited
|
|
||||||
Uses**\\n\\nWe want everyone to use Llama 3.2 safely and responsibly. You agree
|
|
||||||
you will not use, or allow others to use, Llama 3.2 to:\\n\\n\\n\\n1. Violate
|
|
||||||
the law or others\u2019 rights, including to:\\n 1. Engage in, promote, generate,
|
|
||||||
contribute to, encourage, plan, incite, or further illegal or unlawful activity
|
|
||||||
or content, such as:\\n 1. Violence or terrorism\\n 2. Exploitation
|
|
||||||
or harm to children, including the solicitation, creation, acquisition, or dissemination
|
|
||||||
of child exploitative content or failure to report Child Sexual Abuse Material\\n
|
|
||||||
\ 3. Human trafficking, exploitation, and sexual violence\\n 4.
|
|
||||||
The illegal distribution of information or materials to minors, including obscene
|
|
||||||
materials, or failure to employ legally required age-gating in connection with
|
|
||||||
such information or materials.\\n 5. Sexual solicitation\\n 6.
|
|
||||||
Any other criminal activity\\n 1. Engage in, promote, incite, or facilitate
|
|
||||||
the harassment, abuse, threatening, or bullying of individuals or groups of
|
|
||||||
individuals\\n 2. Engage in, promote, incite, or facilitate discrimination
|
|
||||||
or other unlawful or harmful conduct in the provision of employment, employment
|
|
||||||
benefits, credit, housing, other economic benefits, or other essential goods
|
|
||||||
and services\\n 3. Engage in the unauthorized or unlicensed practice of any
|
|
||||||
profession including, but not limited to, financial, legal, medical/health,
|
|
||||||
or related professional practices\\n 4. Collect, process, disclose, generate,
|
|
||||||
or infer private or sensitive information about individuals, including information
|
|
||||||
about individuals\u2019 identity, health, or demographic information, unless
|
|
||||||
you have obtained the right to do so in accordance with applicable law\\n 5.
|
|
||||||
Engage in or facilitate any action or generate any content that infringes, misappropriates,
or otherwise violates any third-party rights, including the outputs or results
of any products or services using the Llama Materials\\n 6. Create, generate,
|
|
||||||
or facilitate the creation of malicious code, malware, computer viruses or do
|
|
||||||
anything else that could disable, overburden, interfere with or impair the proper
|
|
||||||
working, integrity, operation or appearance of a website or computer system\\n
|
|
||||||
\ 7. Engage in any action, or facilitate any action, to intentionally circumvent
|
|
||||||
or remove usage restrictions or other safety measures, or to enable functionality
|
|
||||||
disabled by Meta\\n2. Engage in, promote, incite, facilitate, or assist in the
|
|
||||||
planning or development of activities that present a risk of death or bodily
|
|
||||||
harm to individuals, including use of Llama 3.2 related to the following:\\n
|
|
||||||
\ 8. Military, warfare, nuclear industries or applications, espionage, use
|
|
||||||
for materials or activities that are subject to the International Traffic Arms
|
|
||||||
Regulations (ITAR) maintained by the United States Department of State or to
|
|
||||||
the U.S. Biological Weapons Anti-Terrorism Act of 1989 or the Chemical Weapons
|
|
||||||
Convention Implementation Act of 1997\\n 9. Guns and illegal weapons (including
|
|
||||||
weapon development)\\n 10. Illegal drugs and regulated/controlled substances\\n
|
|
||||||
\ 11. Operation of critical infrastructure, transportation technologies, or
|
|
||||||
heavy machinery\\n 12. Self-harm or harm to others, including suicide, cutting,
|
|
||||||
and eating disorders\\n 13. Any content intended to incite or promote violence,
|
|
||||||
abuse, or any infliction of bodily harm to an individual\\n3. Intentionally
|
|
||||||
deceive or mislead others, including use of Llama 3.2 related to the following:\\n
|
|
||||||
\ 14. Generating, promoting, or furthering fraud or the creation or promotion
|
|
||||||
of disinformation\\n 15. Generating, promoting, or furthering defamatory
|
|
||||||
content, including the creation of defamatory statements, images, or other content\\n
|
|
||||||
\ 16. Generating, promoting, or further distributing spam\\n 17. Impersonating
|
|
||||||
another individual without consent, authorization, or legal right\\n 18.
|
|
||||||
Representing that the use of Llama 3.2 or outputs are human-generated\\n 19.
|
|
||||||
Generating or facilitating false online engagement, including fake reviews and
|
|
||||||
other means of fake online engagement\\n4. Fail to appropriately disclose to
|
|
||||||
end users any known dangers of your AI system\\n5. Interact with third party
|
|
||||||
tools, models, or software designed to generate unlawful content or engage in
|
|
||||||
unlawful or harmful conduct and/or represent that the outputs of such tools,
|
|
||||||
models, or software are associated with Meta or Llama 3.2\\n\\nWith respect
|
|
||||||
to any multimodal models included in Llama 3.2, the rights granted under Section
|
|
||||||
1(a) of the Llama 3.2 Community License Agreement are not being granted to you
|
|
||||||
if you are an individual domiciled in, or a company with a principal place of
|
|
||||||
business in, the European Union. This restriction does not apply to end users
|
|
||||||
of a product or service that incorporates any such multimodal models.\\n\\nPlease
|
|
||||||
report any violation of this Policy, software \u201Cbug,\u201D or other problems
|
|
||||||
that could lead to a violation of this Policy through one of the following means:\\n\\n\\n\\n*
|
|
||||||
Reporting issues with the model: [https://github.com/meta-llama/llama-models/issues](https://l.workplace.com/l.php?u=https%3A%2F%2Fgithub.com%2Fmeta-llama%2Fllama-models%2Fissues\\u0026h=AT0qV8W9BFT6NwihiOHRuKYQM_UnkzN_NmHMy91OT55gkLpgi4kQupHUl0ssR4dQsIQ8n3tfd0vtkobvsEvt1l4Ic6GXI2EeuHV8N08OG2WnbAmm0FL4ObkazC6G_256vN0lN9DsykCvCqGZ)\\n*
|
|
||||||
Reporting risky content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)\\n*
|
|
||||||
Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)\\n*
|
|
||||||
Reporting violations of the Acceptable Use Policy or unlicensed uses of Llama
|
|
||||||
3.2: LlamaUseReport@meta.com\",\"modelfile\":\"# Modelfile generated by \\\"ollama
|
|
||||||
show\\\"\\n# To build a new Modelfile based on this, replace FROM with:\\n#
|
|
||||||
FROM llama3.2:3b\\n\\nFROM /Users/brandonhancock/.ollama/models/blobs/sha256-dde5aa3fc5ffc17176b5e8bdc82f587b24b2678c6c66101bf7da77af9f7ccdff\\nTEMPLATE
|
|
||||||
\\\"\\\"\\\"\\u003c|start_header_id|\\u003esystem\\u003c|end_header_id|\\u003e\\n\\nCutting
|
|
||||||
Knowledge Date: December 2023\\n\\n{{ if .System }}{{ .System }}\\n{{- end }}\\n{{-
|
|
||||||
if .Tools }}When you receive a tool call response, use the output to format
|
|
||||||
an answer to the orginal user question.\\n\\nYou are a helpful assistant with
|
|
||||||
tool calling capabilities.\\n{{- end }}\\u003c|eot_id|\\u003e\\n{{- range $i,
|
|
||||||
$_ := .Messages }}\\n{{- $last := eq (len (slice $.Messages $i)) 1 }}\\n{{-
|
|
||||||
if eq .Role \\\"user\\\" }}\\u003c|start_header_id|\\u003euser\\u003c|end_header_id|\\u003e\\n{{-
|
|
||||||
if and $.Tools $last }}\\n\\nGiven the following functions, please respond with
|
|
||||||
a JSON for a function call with its proper arguments that best answers the given
|
|
||||||
prompt.\\n\\nRespond in the format {\\\"name\\\": function name, \\\"parameters\\\":
|
|
||||||
dictionary of argument name and its value}. Do not use variables.\\n\\n{{ range
|
|
||||||
$.Tools }}\\n{{- . }}\\n{{ end }}\\n{{ .Content }}\\u003c|eot_id|\\u003e\\n{{-
|
|
||||||
else }}\\n\\n{{ .Content }}\\u003c|eot_id|\\u003e\\n{{- end }}{{ if $last }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
end }}\\n{{- else if eq .Role \\\"assistant\\\" }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n{{-
|
|
||||||
if .ToolCalls }}\\n{{ range .ToolCalls }}\\n{\\\"name\\\": \\\"{{ .Function.Name
|
|
||||||
}}\\\", \\\"parameters\\\": {{ .Function.Arguments }}}{{ end }}\\n{{- else }}\\n\\n{{
|
|
||||||
.Content }}\\n{{- end }}{{ if not $last }}\\u003c|eot_id|\\u003e{{ end }}\\n{{-
|
|
||||||
else if eq .Role \\\"tool\\\" }}\\u003c|start_header_id|\\u003eipython\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
.Content }}\\u003c|eot_id|\\u003e{{ if $last }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
end }}\\n{{- end }}\\n{{- end }}\\\"\\\"\\\"\\nPARAMETER stop \\u003c|start_header_id|\\u003e\\nPARAMETER
|
|
||||||
stop \\u003c|end_header_id|\\u003e\\nPARAMETER stop \\u003c|eot_id|\\u003e\\nLICENSE
|
|
||||||
\\\"LLAMA 3.2 COMMUNITY LICENSE AGREEMENT\\nLlama 3.2 Version Release Date:
|
|
||||||
September 25, 2024\\n\\n\u201CAgreement\u201D means the terms and conditions
|
|
||||||
for use, reproduction, distribution \\nand modification of the Llama Materials
|
|
||||||
set forth herein.\\n\\n\u201CDocumentation\u201D means the specifications, manuals
|
|
||||||
and documentation accompanying Llama 3.2\\ndistributed by Meta at https://llama.meta.com/doc/overview.\\n\\n\u201CLicensee\u201D
|
|
||||||
or \u201Cyou\u201D means you, or your employer or any other person or entity
|
|
||||||
(if you are \\nentering into this Agreement on such person or entity\u2019s
|
|
||||||
behalf), of the age required under\\napplicable laws, rules or regulations to
|
|
||||||
provide legal consent and that has legal authority\\nto bind your employer or
|
|
||||||
such other person or entity if you are entering in this Agreement\\non their
|
|
||||||
behalf.\\n\\n\u201CLlama 3.2\u201D means the foundational large language models
|
|
||||||
and software and algorithms, including\\nmachine-learning model code, trained
|
|
||||||
model weights, inference-enabling code, training-enabling code,\\nfine-tuning
|
|
||||||
enabling code and other elements of the foregoing distributed by Meta at \\nhttps://www.llama.com/llama-downloads.\\n\\n\u201CLlama
|
|
||||||
Materials\u201D means, collectively, Meta\u2019s proprietary Llama 3.2 and Documentation
|
|
||||||
(and \\nany portion thereof) made available under this Agreement.\\n\\n\u201CMeta\u201D
|
|
||||||
or \u201Cwe\u201D means Meta Platforms Ireland Limited (if you are located in
|
|
||||||
or, \\nif you are an entity, your principal place of business is in the EEA
|
|
||||||
or Switzerland) \\nand Meta Platforms, Inc. (if you are located outside of the
|
|
||||||
EEA or Switzerland). \\n\\n\\nBy clicking \u201CI Accept\u201D below or by using
|
|
||||||
or distributing any portion or element of the Llama Materials,\\nyou agree to
|
|
||||||
be bound by this Agreement.\\n\\n\\n1. License Rights and Redistribution.\\n\\n
|
|
||||||
\ a. Grant of Rights. You are granted a non-exclusive, worldwide, \\nnon-transferable
|
|
||||||
and royalty-free limited license under Meta\u2019s intellectual property or
|
|
||||||
other rights \\nowned by Meta embodied in the Llama Materials to use, reproduce,
|
|
||||||
distribute, copy, create derivative works \\nof, and make modifications to the
|
|
||||||
Llama Materials. \\n\\n b. Redistribution and Use. \\n\\n i. If
|
|
||||||
you distribute or make available the Llama Materials (or any derivative works
|
|
||||||
thereof), \\nor a product or service (including another AI model) that contains
|
|
||||||
any of them, you shall (A) provide\\na copy of this Agreement with any such
|
|
||||||
Llama Materials; and (B) prominently display \u201CBuilt with Llama\u201D\\non
|
|
||||||
a related website, user interface, blogpost, about page, or product documentation.
|
|
||||||
If you use the\\nLlama Materials or any outputs or results of the Llama Materials
|
|
||||||
to create, train, fine tune, or\\notherwise improve an AI model, which is distributed
|
|
||||||
or made available, you shall also include \u201CLlama\u201D\\nat the beginning
|
|
||||||
of any such AI model name.\\n\\n ii. If you receive Llama Materials,
|
|
||||||
or any derivative works thereof, from a Licensee as part\\nof an integrated
|
|
||||||
end user product, then Section 2 of this Agreement will not apply to you. \\n\\n
|
|
||||||
\ iii. You must retain in all copies of the Llama Materials that you distribute
|
|
||||||
the \\nfollowing attribution notice within a \u201CNotice\u201D text file distributed
|
|
||||||
as a part of such copies: \\n\u201CLlama 3.2 is licensed under the Llama 3.2
|
|
||||||
Community License, Copyright \xA9 Meta Platforms,\\nInc. All Rights Reserved.\u201D\\n\\n
|
|
||||||
\ iv. Your use of the Llama Materials must comply with applicable laws
|
|
||||||
and regulations\\n(including trade compliance laws and regulations) and adhere
|
|
||||||
to the Acceptable Use Policy for\\nthe Llama Materials (available at https://www.llama.com/llama3_2/use-policy),
|
|
||||||
which is hereby \\nincorporated by reference into this Agreement.\\n \\n2.
|
|
||||||
Additional Commercial Terms. If, on the Llama 3.2 version release date, the
|
|
||||||
monthly active users\\nof the products or services made available by or for
|
|
||||||
Licensee, or Licensee\u2019s affiliates, \\nis greater than 700 million monthly
|
|
||||||
active users in the preceding calendar month, you must request \\na license
|
|
||||||
from Meta, which Meta may grant to you in its sole discretion, and you are not
|
|
||||||
authorized to\\nexercise any of the rights under this Agreement unless or until
|
|
||||||
Meta otherwise expressly grants you such rights.\\n\\n3. Disclaimer of Warranty.
|
|
||||||
UNLESS REQUIRED BY APPLICABLE LAW, THE LLAMA MATERIALS AND ANY OUTPUT AND \\nRESULTS
|
|
||||||
THEREFROM ARE PROVIDED ON AN \u201CAS IS\u201D BASIS, WITHOUT WARRANTIES OF
|
|
||||||
ANY KIND, AND META DISCLAIMS\\nALL WARRANTIES OF ANY KIND, BOTH EXPRESS AND
|
|
||||||
IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES\\nOF TITLE, NON-INFRINGEMENT,
|
|
||||||
MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE\\nFOR
|
|
||||||
DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING THE LLAMA MATERIALS
|
|
||||||
AND ASSUME ANY RISKS ASSOCIATED\\nWITH YOUR USE OF THE LLAMA MATERIALS AND ANY
|
|
||||||
OUTPUT AND RESULTS.\\n\\n4. Limitation of Liability. IN NO EVENT WILL META OR
|
|
||||||
ITS AFFILIATES BE LIABLE UNDER ANY THEORY OF LIABILITY, \\nWHETHER IN CONTRACT,
|
|
||||||
TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING OUT OF THIS AGREEMENT,
|
|
||||||
\\nFOR ANY LOST PROFITS OR ANY INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL,
|
|
||||||
EXEMPLARY OR PUNITIVE DAMAGES, EVEN \\nIF META OR ITS AFFILIATES HAVE BEEN ADVISED
|
|
||||||
OF THE POSSIBILITY OF ANY OF THE FOREGOING.\\n\\n5. Intellectual Property.\\n\\n
|
|
||||||
\ a. No trademark licenses are granted under this Agreement, and in connection
|
|
||||||
with the Llama Materials, \\nneither Meta nor Licensee may use any name or mark
|
|
||||||
owned by or associated with the other or any of its affiliates, \\nexcept as
|
|
||||||
required for reasonable and customary use in describing and redistributing the
|
|
||||||
Llama Materials or as \\nset forth in this Section 5(a). Meta hereby grants
|
|
||||||
you a license to use \u201CLlama\u201D (the \u201CMark\u201D) solely as required
|
|
||||||
\\nto comply with the last sentence of Section 1.b.i. You will comply with Meta\u2019s
|
|
||||||
brand guidelines (currently accessible \\nat https://about.meta.com/brand/resources/meta/company-brand/).
|
|
||||||
All goodwill arising out of your use of the Mark \\nwill inure to the benefit
|
|
||||||
of Meta.\\n\\n b. Subject to Meta\u2019s ownership of Llama Materials and
|
|
||||||
derivatives made by or for Meta, with respect to any\\n derivative works
|
|
||||||
and modifications of the Llama Materials that are made by you, as between you
|
|
||||||
and Meta,\\n you are and will be the owner of such derivative works and modifications.\\n\\n
|
|
||||||
\ c. If you institute litigation or other proceedings against Meta or any
|
|
||||||
entity (including a cross-claim or\\n counterclaim in a lawsuit) alleging
|
|
||||||
that the Llama Materials or Llama 3.2 outputs or results, or any portion\\n
|
|
||||||
\ of any of the foregoing, constitutes infringement of intellectual property
|
|
||||||
or other rights owned or licensable\\n by you, then any licenses granted
|
|
||||||
to you under this Agreement shall terminate as of the date such litigation or\\n
|
|
||||||
\ claim is filed or instituted. You will indemnify and hold harmless Meta
|
|
||||||
from and against any claim by any third\\n party arising out of or related
|
|
||||||
to your use or distribution of the Llama Materials.\\n\\n6. Term and Termination.
|
|
||||||
The term of this Agreement will commence upon your acceptance of this Agreement
|
|
||||||
or access\\nto the Llama Materials and will continue in full force and effect
|
|
||||||
until terminated in accordance with the terms\\nand conditions herein. Meta
|
|
||||||
may terminate this Agreement if you are in breach of any term or condition of
|
|
||||||
this\\nAgreement. Upon termination of this Agreement, you shall delete and cease
|
|
||||||
use of the Llama Materials. Sections 3,\\n4 and 7 shall survive the termination
|
|
||||||
of this Agreement. \\n\\n7. Governing Law and Jurisdiction. This Agreement will
|
|
||||||
be governed and construed under the laws of the State of \\nCalifornia without
|
|
||||||
regard to choice of law principles, and the UN Convention on Contracts for the
|
|
||||||
International\\nSale of Goods does not apply to this Agreement. The courts of
|
|
||||||
California shall have exclusive jurisdiction of\\nany dispute arising out of
|
|
||||||
this Agreement.\\\"\\nLICENSE \\\"**Llama 3.2** **Acceptable Use Policy**\\n\\nMeta
|
|
||||||
is committed to promoting safe and fair use of its tools and features, including
|
|
||||||
Llama 3.2. If you access or use Llama 3.2, you agree to this Acceptable Use
|
|
||||||
Policy (\u201C**Policy**\u201D). The most recent copy of this policy can be
|
|
||||||
found at [https://www.llama.com/llama3_2/use-policy](https://www.llama.com/llama3_2/use-policy).\\n\\n**Prohibited
|
|
||||||
Uses**\\n\\nWe want everyone to use Llama 3.2 safely and responsibly. You agree
|
|
||||||
you will not use, or allow others to use, Llama 3.2 to:\\n\\n\\n\\n1. Violate
|
|
||||||
the law or others\u2019 rights, including to:\\n 1. Engage in, promote, generate,
|
|
||||||
contribute to, encourage, plan, incite, or further illegal or unlawful activity
|
|
||||||
or content, such as:\\n 1. Violence or terrorism\\n 2. Exploitation
|
|
||||||
or harm to children, including the solicitation, creation, acquisition, or dissemination
|
|
||||||
of child exploitative content or failure to report Child Sexual Abuse Material\\n
|
|
||||||
\ 3. Human trafficking, exploitation, and sexual violence\\n 4.
|
|
||||||
The illegal distribution of information or materials to minors, including obscene
|
|
||||||
materials, or failure to employ legally required age-gating in connection with
|
|
||||||
such information or materials.\\n 5. Sexual solicitation\\n 6.
|
|
||||||
Any other criminal activity\\n 1. Engage in, promote, incite, or facilitate
|
|
||||||
the harassment, abuse, threatening, or bullying of individuals or groups of
|
|
||||||
individuals\\n 2. Engage in, promote, incite, or facilitate discrimination
|
|
||||||
or other unlawful or harmful conduct in the provision of employment, employment
|
|
||||||
benefits, credit, housing, other economic benefits, or other essential goods
|
|
||||||
and services\\n 3. Engage in the unauthorized or unlicensed practice of any
|
|
||||||
profession including, but not limited to, financial, legal, medical/health,
|
|
||||||
or related professional practices\\n 4. Collect, process, disclose, generate,
|
|
||||||
or infer private or sensitive information about individuals, including information
|
|
||||||
about individuals\u2019 identity, health, or demographic information, unless
|
|
||||||
you have obtained the right to do so in accordance with applicable law\\n 5.
|
|
||||||
Engage in or facilitate any action or generate any content that infringes, misappropriates,
|
|
||||||
or otherwise violates any third-party rights, including the outputs or results
|
|
||||||
of any products or services using the Llama Materials\\n 6. Create, generate,
|
|
||||||
or facilitate the creation of malicious code, malware, computer viruses or do
|
|
||||||
anything else that could disable, overburden, interfere with or impair the proper
|
|
||||||
working, integrity, operation or appearance of a website or computer system\\n
|
|
||||||
\ 7. Engage in any action, or facilitate any action, to intentionally circumvent
|
|
||||||
or remove usage restrictions or other safety measures, or to enable functionality
|
|
||||||
disabled by Meta\\n2. Engage in, promote, incite, facilitate, or assist in the
|
|
||||||
planning or development of activities that present a risk of death or bodily
|
|
||||||
harm to individuals, including use of Llama 3.2 related to the following:\\n
|
|
||||||
\ 8. Military, warfare, nuclear industries or applications, espionage, use
|
|
||||||
for materials or activities that are subject to the International Traffic Arms
|
|
||||||
Regulations (ITAR) maintained by the United States Department of State or to
|
|
||||||
the U.S. Biological Weapons Anti-Terrorism Act of 1989 or the Chemical Weapons
|
|
||||||
Convention Implementation Act of 1997\\n 9. Guns and illegal weapons (including
|
|
||||||
weapon development)\\n 10. Illegal drugs and regulated/controlled substances\\n
|
|
||||||
\ 11. Operation of critical infrastructure, transportation technologies, or
|
|
||||||
heavy machinery\\n 12. Self-harm or harm to others, including suicide, cutting,
|
|
||||||
and eating disorders\\n 13. Any content intended to incite or promote violence,
|
|
||||||
abuse, or any infliction of bodily harm to an individual\\n3. Intentionally
|
|
||||||
deceive or mislead others, including use of Llama 3.2 related to the following:\\n
|
|
||||||
\ 14. Generating, promoting, or furthering fraud or the creation or promotion
|
|
||||||
of disinformation\\n 15. Generating, promoting, or furthering defamatory
|
|
||||||
content, including the creation of defamatory statements, images, or other content\\n
|
|
||||||
\ 16. Generating, promoting, or further distributing spam\\n 17. Impersonating
|
|
||||||
another individual without consent, authorization, or legal right\\n 18.
|
|
||||||
Representing that the use of Llama 3.2 or outputs are human-generated\\n 19.
|
|
||||||
Generating or facilitating false online engagement, including fake reviews and
|
|
||||||
other means of fake online engagement\\n4. Fail to appropriately disclose to
|
|
||||||
end users any known dangers of your AI system\\n5. Interact with third party
|
|
||||||
tools, models, or software designed to generate unlawful content or engage in
|
|
||||||
unlawful or harmful conduct and/or represent that the outputs of such tools,
|
|
||||||
models, or software are associated with Meta or Llama 3.2\\n\\nWith respect
|
|
||||||
to any multimodal models included in Llama 3.2, the rights granted under Section
|
|
||||||
1(a) of the Llama 3.2 Community License Agreement are not being granted to you
|
|
||||||
if you are an individual domiciled in, or a company with a principal place of
|
|
||||||
business in, the European Union. This restriction does not apply to end users
|
|
||||||
of a product or service that incorporates any such multimodal models.\\n\\nPlease
|
|
||||||
report any violation of this Policy, software \u201Cbug,\u201D or other problems
|
|
||||||
that could lead to a violation of this Policy through one of the following means:\\n\\n\\n\\n*
|
|
||||||
Reporting issues with the model: [https://github.com/meta-llama/llama-models/issues](https://l.workplace.com/l.php?u=https%3A%2F%2Fgithub.com%2Fmeta-llama%2Fllama-models%2Fissues\\u0026h=AT0qV8W9BFT6NwihiOHRuKYQM_UnkzN_NmHMy91OT55gkLpgi4kQupHUl0ssR4dQsIQ8n3tfd0vtkobvsEvt1l4Ic6GXI2EeuHV8N08OG2WnbAmm0FL4ObkazC6G_256vN0lN9DsykCvCqGZ)\\n*
|
|
||||||
Reporting risky content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)\\n*
|
|
||||||
Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)\\n*
|
|
||||||
Reporting violations of the Acceptable Use Policy or unlicensed uses of Llama
|
|
||||||
3.2: LlamaUseReport@meta.com\\\"\\n\",\"parameters\":\"stop \\\"\\u003c|start_header_id|\\u003e\\\"\\nstop
|
|
||||||
\ \\\"\\u003c|end_header_id|\\u003e\\\"\\nstop \\\"\\u003c|eot_id|\\u003e\\\"\",\"template\":\"\\u003c|start_header_id|\\u003esystem\\u003c|end_header_id|\\u003e\\n\\nCutting
|
|
||||||
Knowledge Date: December 2023\\n\\n{{ if .System }}{{ .System }}\\n{{- end }}\\n{{-
|
|
||||||
if .Tools }}When you receive a tool call response, use the output to format
|
|
||||||
an answer to the orginal user question.\\n\\nYou are a helpful assistant with
|
|
||||||
tool calling capabilities.\\n{{- end }}\\u003c|eot_id|\\u003e\\n{{- range $i,
|
|
||||||
$_ := .Messages }}\\n{{- $last := eq (len (slice $.Messages $i)) 1 }}\\n{{-
|
|
||||||
if eq .Role \\\"user\\\" }}\\u003c|start_header_id|\\u003euser\\u003c|end_header_id|\\u003e\\n{{-
|
|
||||||
if and $.Tools $last }}\\n\\nGiven the following functions, please respond with
|
|
||||||
a JSON for a function call with its proper arguments that best answers the given
|
|
||||||
prompt.\\n\\nRespond in the format {\\\"name\\\": function name, \\\"parameters\\\":
|
|
||||||
dictionary of argument name and its value}. Do not use variables.\\n\\n{{ range
|
|
||||||
$.Tools }}\\n{{- . }}\\n{{ end }}\\n{{ .Content }}\\u003c|eot_id|\\u003e\\n{{-
|
|
||||||
else }}\\n\\n{{ .Content }}\\u003c|eot_id|\\u003e\\n{{- end }}{{ if $last }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
end }}\\n{{- else if eq .Role \\\"assistant\\\" }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n{{-
|
|
||||||
if .ToolCalls }}\\n{{ range .ToolCalls }}\\n{\\\"name\\\": \\\"{{ .Function.Name
|
|
||||||
}}\\\", \\\"parameters\\\": {{ .Function.Arguments }}}{{ end }}\\n{{- else }}\\n\\n{{
|
|
||||||
.Content }}\\n{{- end }}{{ if not $last }}\\u003c|eot_id|\\u003e{{ end }}\\n{{-
|
|
||||||
else if eq .Role \\\"tool\\\" }}\\u003c|start_header_id|\\u003eipython\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
.Content }}\\u003c|eot_id|\\u003e{{ if $last }}\\u003c|start_header_id|\\u003eassistant\\u003c|end_header_id|\\u003e\\n\\n{{
|
|
||||||
end }}\\n{{- end }}\\n{{- end }}\",\"details\":{\"parent_model\":\"\",\"format\":\"gguf\",\"family\":\"llama\",\"families\":[\"llama\"],\"parameter_size\":\"3.2B\",\"quantization_level\":\"Q4_K_M\"},\"model_info\":{\"general.architecture\":\"llama\",\"general.basename\":\"Llama-3.2\",\"general.file_type\":15,\"general.finetune\":\"Instruct\",\"general.languages\":[\"en\",\"de\",\"fr\",\"it\",\"pt\",\"hi\",\"es\",\"th\"],\"general.parameter_count\":3212749888,\"general.quantization_version\":2,\"general.size_label\":\"3B\",\"general.tags\":[\"facebook\",\"meta\",\"pytorch\",\"llama\",\"llama-3\",\"text-generation\"],\"general.type\":\"model\",\"llama.attention.head_count\":24,\"llama.attention.head_count_kv\":8,\"llama.attention.key_length\":128,\"llama.attention.layer_norm_rms_epsilon\":0.00001,\"llama.attention.value_length\":128,\"llama.block_count\":28,\"llama.context_length\":131072,\"llama.embedding_length\":3072,\"llama.feed_forward_length\":8192,\"llama.rope.dimension_count\":128,\"llama.rope.freq_base\":500000,\"llama.vocab_size\":128256,\"tokenizer.ggml.bos_token_id\":128000,\"tokenizer.ggml.eos_token_id\":128009,\"tokenizer.ggml.merges\":null,\"tokenizer.ggml.model\":\"gpt2\",\"tokenizer.ggml.pre\":\"llama-bpe\",\"tokenizer.ggml.token_type\":null,\"tokenizer.ggml.tokens\":null},\"modified_at\":\"2024-12-31T11:53:14.529771974-05:00\"}"
headers:
Content-Type:
- application/json; charset=utf-8
Date:
- Tue, 31 Dec 2024 17:00:06 GMT
Transfer-Encoding:
- chunked
http_version: HTTP/1.1
status_code: 200
version: 1
@@ -1,481 +0,0 @@
interactions:
- request:
body: '{"messages": [{"role": "system", "content": "You are Image Analyst. You''re
|
|
||||||
an expert at visual analysis, trained to notice and describe details in images.\nYour
|
|
||||||
personal goal is: Analyze images with high attention to detail\nYou ONLY have
|
|
||||||
access to the following tools, and should NEVER make up tools that are not listed
|
|
||||||
here:\n\nTool Name: Add image to content\nTool Arguments: {''image_url'': {''description'':
|
|
||||||
''The URL or path of the image to add'', ''type'': ''str''}, ''action'': {''description'':
|
|
||||||
''Optional context or question about the image'', ''type'': ''str''}}\nTool
|
|
||||||
Description: See image to understand it''s content, you can optionally ask a
|
|
||||||
question about the image\n\nUse the following format:\n\nThought: you should
|
|
||||||
always think about what to do\nAction: the action to take, only one name of
|
|
||||||
[Add image to content], just the name, exactly as it''s written.\nAction Input:
|
|
||||||
the input to the action, just a simple python dictionary, enclosed in curly
|
|
||||||
braces, using \" to wrap keys and values.\nObservation: the result of the action\n\nOnce
|
|
||||||
all necessary information is gathered:\n\nThought: I now know the final answer\nFinal
|
|
||||||
Answer: the final answer to the original input question"}, {"role": "user",
|
|
||||||
"content": "\nCurrent Task: \n Analyze the provided image and describe
|
|
||||||
what you see in detail.\n Focus on main elements, colors, composition,
|
|
||||||
and any notable details.\n Image: https://media.istockphoto.com/id/946087016/photo/aerial-view-of-lower-manhattan-new-york.jpg?s=612x612&w=0&k=20&c=viLiMRznQ8v5LzKTt_LvtfPFUVl1oiyiemVdSlm29_k=\n \n\nThis
|
|
||||||
is the expect criteria for your final answer: A comprehensive description of
|
|
||||||
the image contents.\nyou MUST return the actual complete content as the final
|
|
||||||
answer, not a summary.\n\nBegin! This is VERY important to you, use the tools
|
|
||||||
available and give your best Final Answer, your job depends on it!\n\nThought:"}],
|
|
||||||
"model": "gpt-4o", "stop": ["\nObservation:"], "stream": false}'
headers:
accept:
- application/json
accept-encoding:
- gzip, deflate
connection:
- keep-alive
content-length:
- '1948'
content-type:
- application/json
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.52.1
x-stainless-arch:
- arm64
x-stainless-async:
- 'false'
x-stainless-lang:
- python
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.52.1
x-stainless-raw-response:
- 'true'
x-stainless-retry-count:
- '0'
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
- 3.11.7
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
|
|
||||||
content: "{\n \"id\": \"chatcmpl-AiuIfzzcje5KdvKIG5CkFeORroiKk\",\n \"object\":
|
|
||||||
\"chat.completion\",\n \"created\": 1735266213,\n \"model\": \"gpt-4o-2024-08-06\",\n
|
|
||||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
|
||||||
\"assistant\",\n \"content\": \"Action: Add image to content\\nAction
|
|
||||||
Input: {\\\"image_url\\\": \\\"https://media.istockphoto.com/id/946087016/photo/aerial-view-of-lower-manhattan-new-york.jpg?s=612x612&w=0&k=20&c=viLiMRznQ8v5LzKTt_LvtfPFUVl1oiyiemVdSlm29_k=\\\",
|
|
||||||
\\\"action\\\": \\\"Analyze the provided image and describe what you see in
|
|
||||||
detail.\\\"}\",\n \"refusal\": null\n },\n \"logprobs\": null,\n
|
|
||||||
\ \"finish_reason\": \"stop\"\n }\n ],\n \"usage\": {\n \"prompt_tokens\":
|
|
||||||
417,\n \"completion_tokens\": 103,\n \"total_tokens\": 520,\n \"prompt_tokens_details\":
|
|
||||||
{\n \"cached_tokens\": 0,\n \"audio_tokens\": 0\n },\n \"completion_tokens_details\":
|
|
||||||
{\n \"reasoning_tokens\": 0,\n \"audio_tokens\": 0,\n \"accepted_prediction_tokens\":
|
|
||||||
0,\n \"rejected_prediction_tokens\": 0\n }\n },\n \"system_fingerprint\":
|
|
||||||
\"fp_5f20662549\"\n}\n"
|
|
||||||
headers:
|
|
||||||
CF-Cache-Status:
|
|
||||||
- DYNAMIC
|
|
||||||
CF-RAY:
|
|
||||||
- 8f85d96b280df217-GRU
|
|
||||||
Connection:
|
|
||||||
- keep-alive
|
|
||||||
Content-Encoding:
|
|
||||||
- gzip
|
|
||||||
Content-Type:
|
|
||||||
- application/json
|
|
||||||
Date:
|
|
||||||
- Fri, 27 Dec 2024 02:23:35 GMT
|
|
||||||
Server:
|
|
||||||
- cloudflare
|
|
||||||
Set-Cookie:
|
|
||||||
- __cf_bm=kJ1pw1xjCMSxjHSS8iJC5z_j2PZxl.i387KCpj9xNZU-1735266215-1.0.1.1-Ybg0wVTsrBlpVZmtQyA1ullY8m3v2Ix0N_SYlhr9z7zKfbLeqGZEVL37YSY.dvIiLVY3XPZzMtG8Xwo6UucW6A;
|
|
||||||
path=/; expires=Fri, 27-Dec-24 02:53:35 GMT; domain=.api.openai.com; HttpOnly;
|
|
||||||
Secure; SameSite=None
|
|
||||||
- _cfuvid=v_wJZ5m7qCjrnRfks0gT2GAk9yR14BdIDAQiQR7xxI8-1735266215000-0.0.1.1-604800000;
|
|
||||||
path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None
|
|
||||||
Transfer-Encoding:
|
|
||||||
- chunked
|
|
||||||
X-Content-Type-Options:
|
|
||||||
- nosniff
|
|
||||||
access-control-expose-headers:
|
|
||||||
- X-Request-ID
|
|
||||||
alt-svc:
|
|
||||||
- h3=":443"; ma=86400
|
|
||||||
openai-organization:
|
|
||||||
- crewai-iuxna1
|
|
||||||
openai-processing-ms:
|
|
||||||
- '1212'
|
|
||||||
openai-version:
|
|
||||||
- '2020-10-01'
|
|
||||||
strict-transport-security:
|
|
||||||
- max-age=31536000; includeSubDomains; preload
|
|
||||||
x-ratelimit-limit-requests:
|
|
||||||
- '10000'
|
|
||||||
x-ratelimit-limit-tokens:
|
|
||||||
- '30000000'
|
|
||||||
x-ratelimit-remaining-requests:
|
|
||||||
- '9999'
|
|
||||||
x-ratelimit-remaining-tokens:
|
|
||||||
- '29999539'
|
|
||||||
x-ratelimit-reset-requests:
|
|
||||||
- 6ms
|
|
||||||
x-ratelimit-reset-tokens:
|
|
||||||
- 0s
|
|
||||||
x-request-id:
|
|
||||||
- req_663a2b18099a18361d6b02befc175289
|
|
||||||
http_version: HTTP/1.1
|
|
||||||
status_code: 200
|
|
||||||
- request:
|
|
||||||
body: !!binary |
|
|
||||||
Co4LCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkS5QoKEgoQY3Jld2FpLnRl
|
|
||||||
bGVtZXRyeRKjBwoQHmzzumMNXHOgpJ4zCIxJSxII72WnLlLfRyYqDENyZXcgQ3JlYXRlZDABOQjB
|
|
||||||
gFxt5xQYQYhMiVxt5xQYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODYuMEoaCg5weXRob25fdmVy
|
|
||||||
c2lvbhIICgYzLjExLjdKLgoIY3Jld19rZXkSIgogZTM5NTY3YjUwNTI5MDljYTMzNDA5ODRiODM4
|
|
||||||
OTgwZWFKMQoHY3Jld19pZBImCiQ4MDA0YTA1NC0zYjNkLTQ4OGEtYTlkNC1kZWQzMDVhMDIxY2FK
|
|
||||||
HAoMY3Jld19wcm9jZXNzEgwKCnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdf
|
|
||||||
bnVtYmVyX29mX3Rhc2tzEgIYAUobChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgBSs4CCgtjcmV3
|
|
||||||
X2FnZW50cxK+Agq7Alt7ImtleSI6ICI5ZGM4Y2NlMDMwNDY4MTk2MDQxYjRjMzgwYjYxN2NiMCIs
|
|
||||||
ICJpZCI6ICJjNTZhZGI2Mi1lMGIwLTQzYzAtYmQ4OC0xYzEwYTNhNmU5NDQiLCAicm9sZSI6ICJJ
|
|
||||||
bWFnZSBBbmFseXN0IiwgInZlcmJvc2U/IjogdHJ1ZSwgIm1heF9pdGVyIjogMjAsICJtYXhfcnBt
|
|
||||||
IjogbnVsbCwgImZ1bmN0aW9uX2NhbGxpbmdfbGxtIjogIiIsICJsbG0iOiAiZ3B0LTRvIiwgImRl
|
|
||||||
bGVnYXRpb25fZW5hYmxlZD8iOiBmYWxzZSwgImFsbG93X2NvZGVfZXhlY3V0aW9uPyI6IGZhbHNl
|
|
||||||
LCAibWF4X3JldHJ5X2xpbWl0IjogMiwgInRvb2xzX25hbWVzIjogW119XUqCAgoKY3Jld190YXNr
|
|
||||||
cxLzAQrwAVt7ImtleSI6ICJhOWE3NmNhNjk1N2QwYmZmYTY5ZWFiMjBiNjY0ODIyYiIsICJpZCI6
|
|
||||||
ICJhNzFiZDllNC0wNzdkLTRmMTQtODg0MS03MGMwZWM4MGZkMmMiLCAiYXN5bmNfZXhlY3V0aW9u
|
|
||||||
PyI6IGZhbHNlLCAiaHVtYW5faW5wdXQ/IjogZmFsc2UsICJhZ2VudF9yb2xlIjogIkltYWdlIEFu
|
|
||||||
YWx5c3QiLCAiYWdlbnRfa2V5IjogIjlkYzhjY2UwMzA0NjgxOTYwNDFiNGMzODBiNjE3Y2IwIiwg
|
|
||||||
InRvb2xzX25hbWVzIjogW119XXoCGAGFAQABAAASjgIKEOZ5pMdq9ep85DrP1Vv8Y8MSCE7ahOkm
|
|
||||||
2IDHKgxUYXNrIENyZWF0ZWQwATlIg85cbecUGEGQ9M5cbecUGEouCghjcmV3X2tleRIiCiBlMzk1
|
|
||||||
NjdiNTA1MjkwOWNhMzM0MDk4NGI4Mzg5ODBlYUoxCgdjcmV3X2lkEiYKJDgwMDRhMDU0LTNiM2Qt
|
|
||||||
NDg4YS1hOWQ0LWRlZDMwNWEwMjFjYUouCgh0YXNrX2tleRIiCiBhOWE3NmNhNjk1N2QwYmZmYTY5
|
|
||||||
ZWFiMjBiNjY0ODIyYkoxCgd0YXNrX2lkEiYKJGE3MWJkOWU0LTA3N2QtNGYxNC04ODQxLTcwYzBl
|
|
||||||
YzgwZmQyY3oCGAGFAQABAAASlwEKECyaQQK8JkKLh6S2mWHTeDgSCPWCpr7v9CQZKgpUb29sIFVz
|
|
||||||
YWdlMAE5MLyst23nFBhBOJy/t23nFBhKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC44Ni4wSiMKCXRv
|
|
||||||
b2xfbmFtZRIWChRBZGQgaW1hZ2UgdG8gY29udGVudEoOCghhdHRlbXB0cxICGAF6AhgBhQEAAQAA
headers:
Accept:
- '*/*'
Accept-Encoding:
- gzip, deflate
Connection:
- keep-alive
Content-Length:
- '1425'
Content-Type:
- application/x-protobuf
User-Agent:
- OTel-OTLP-Exporter-Python/1.27.0
method: POST
uri: https://telemetry.crewai.com:4319/v1/traces
response:
body:
string: "\n\0"
headers:
Content-Length:
- '2'
Content-Type:
- application/x-protobuf
Date:
- Fri, 27 Dec 2024 02:23:39 GMT
status:
code: 200
message: OK
- request:
|
|
||||||
body: '{"messages": [{"role": "system", "content": "You are Image Analyst. You''re
|
|
||||||
an expert at visual analysis, trained to notice and describe details in images.\nYour
|
|
||||||
personal goal is: Analyze images with high attention to detail\nYou ONLY have
|
|
||||||
access to the following tools, and should NEVER make up tools that are not listed
|
|
||||||
here:\n\nTool Name: Add image to content\nTool Arguments: {''image_url'': {''description'':
|
|
||||||
''The URL or path of the image to add'', ''type'': ''str''}, ''action'': {''description'':
|
|
||||||
''Optional context or question about the image'', ''type'': ''str''}}\nTool
|
|
||||||
Description: See image to understand it''s content, you can optionally ask a
|
|
||||||
question about the image\n\nUse the following format:\n\nThought: you should
|
|
||||||
always think about what to do\nAction: the action to take, only one name of
|
|
||||||
[Add image to content], just the name, exactly as it''s written.\nAction Input:
|
|
||||||
the input to the action, just a simple python dictionary, enclosed in curly
|
|
||||||
braces, using \" to wrap keys and values.\nObservation: the result of the action\n\nOnce
|
|
||||||
all necessary information is gathered:\n\nThought: I now know the final answer\nFinal
|
|
||||||
Answer: the final answer to the original input question"}, {"role": "user",
|
|
||||||
"content": "\nCurrent Task: \n Analyze the provided image and describe
|
|
||||||
what you see in detail.\n Focus on main elements, colors, composition,
|
|
||||||
and any notable details.\n Image: https://media.istockphoto.com/id/946087016/photo/aerial-view-of-lower-manhattan-new-york.jpg?s=612x612&w=0&k=20&c=viLiMRznQ8v5LzKTt_LvtfPFUVl1oiyiemVdSlm29_k=\n \n\nThis
|
|
||||||
is the expect criteria for your final answer: A comprehensive description of
|
|
||||||
the image contents.\nyou MUST return the actual complete content as the final
|
|
||||||
answer, not a summary.\n\nBegin! This is VERY important to you, use the tools
|
|
||||||
available and give your best Final Answer, your job depends on it!\n\nThought:"},
|
|
||||||
{"role": "user", "content": [{"type": "text", "text": "Analyze the provided
|
|
||||||
image and describe what you see in detail."}, {"type": "image_url", "image_url":
|
|
||||||
{"url": "https://media.istockphoto.com/id/946087016/photo/aerial-view-of-lower-manhattan-new-york.jpg?s=612x612&w=0&k=20&c=viLiMRznQ8v5LzKTt_LvtfPFUVl1oiyiemVdSlm29_k="}}]}],
|
|
||||||
"model": "gpt-4o", "stop": ["\nObservation:"], "stream": false}'
|
|
||||||
headers:
|
|
||||||
accept:
|
|
||||||
- application/json
|
|
||||||
accept-encoding:
|
|
||||||
- gzip, deflate
|
|
||||||
connection:
|
|
||||||
- keep-alive
|
|
||||||
content-length:
|
|
||||||
- '2279'
|
|
||||||
content-type:
|
|
||||||
- application/json
|
|
||||||
cookie:
|
|
||||||
- __cf_bm=kJ1pw1xjCMSxjHSS8iJC5z_j2PZxl.i387KCpj9xNZU-1735266215-1.0.1.1-Ybg0wVTsrBlpVZmtQyA1ullY8m3v2Ix0N_SYlhr9z7zKfbLeqGZEVL37YSY.dvIiLVY3XPZzMtG8Xwo6UucW6A;
|
|
||||||
_cfuvid=v_wJZ5m7qCjrnRfks0gT2GAk9yR14BdIDAQiQR7xxI8-1735266215000-0.0.1.1-604800000
|
|
||||||
host:
|
|
||||||
- api.openai.com
|
|
||||||
user-agent:
|
|
||||||
- OpenAI/Python 1.52.1
|
|
||||||
x-stainless-arch:
|
|
||||||
- arm64
|
|
||||||
x-stainless-async:
|
|
||||||
- 'false'
|
|
||||||
x-stainless-lang:
|
|
||||||
- python
|
|
||||||
x-stainless-os:
|
|
||||||
- MacOS
|
|
||||||
x-stainless-package-version:
|
|
||||||
- 1.52.1
|
|
||||||
x-stainless-raw-response:
|
|
||||||
- 'true'
|
|
||||||
x-stainless-retry-count:
|
|
||||||
- '0'
|
|
||||||
x-stainless-runtime:
|
|
||||||
- CPython
|
|
||||||
x-stainless-runtime-version:
|
|
||||||
- 3.11.7
|
|
||||||
method: POST
|
|
||||||
uri: https://api.openai.com/v1/chat/completions
|
|
||||||
response:
|
|
||||||
content: "{\n \"id\": \"chatcmpl-AiuIiqT33ROFMdw1gNmqH9jiw6PfF\",\n \"object\":
|
|
||||||
\"chat.completion\",\n \"created\": 1735266216,\n \"model\": \"gpt-4o-2024-08-06\",\n
|
|
||||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
|
||||||
\"assistant\",\n \"content\": \"The image is an aerial view of Lower
|
|
||||||
Manhattan in New York City. \\n\\nMain Elements:\\n- The One World Trade Center
|
|
||||||
tower stands prominently, distinguishable by its sleek, tapering structure reaching
|
|
||||||
into the sky, surrounded by other skyscrapers.\\n- Skyscrapers in varying heights
|
|
||||||
and architectural styles, fill the densely packed urban landscape.\\n- A waterfront
|
|
||||||
is visible at the edges, with docks and piers extending into the water.\\n\\nColors:\\n-
|
|
||||||
The buildings exhibit a mix of colors, predominantly grays, whites, and browns,
|
|
||||||
against the blues of the sky and water.\\n- There's a section of greenery visible,
|
|
||||||
likely a park or recreational space, offering a contrast with its vibrant green
|
|
||||||
hues.\\n\\nComposition:\\n- The angle of the photograph showcases the expanse
|
|
||||||
of the city, highlighting the density and scale of the buildings.\\n- Water
|
|
||||||
borders the city on two prominent sides, creating a natural boundary and enhancing
|
|
||||||
the island's urban island feel.\\n\\nNotable Details:\\n- The image captures
|
|
||||||
the iconic layout of Manhattan, with the surrounding Hudson River and New York
|
|
||||||
Harbor visible in the background.\\n- Beyond Lower Manhattan, more of the cityscape
|
|
||||||
stretches into the distance, illustrating the vastness of New York City.\\n-
|
|
||||||
The day appears clear and sunny, with shadows casting from the buildings, indicating
|
|
||||||
time in the morning or late afternoon.\\n\\nOverall, the image is a striking
|
|
||||||
depiction of the dynamic and bustling environment of New York's Lower Manhattan,
|
|
||||||
encapsulating its urban character and proximity to the water.\",\n \"refusal\":
|
|
||||||
null\n },\n \"logprobs\": null,\n \"finish_reason\": \"stop\"\n
|
|
||||||
\ }\n ],\n \"usage\": {\n \"prompt_tokens\": 858,\n \"completion_tokens\":
|
|
||||||
295,\n \"total_tokens\": 1153,\n \"prompt_tokens_details\": {\n \"cached_tokens\":
|
|
||||||
0,\n \"audio_tokens\": 0\n },\n \"completion_tokens_details\": {\n
|
|
||||||
\ \"reasoning_tokens\": 0,\n \"audio_tokens\": 0,\n \"accepted_prediction_tokens\":
|
|
||||||
0,\n \"rejected_prediction_tokens\": 0\n }\n },\n \"system_fingerprint\":
|
|
||||||
\"fp_5f20662549\"\n}\n"
|
|
||||||
headers:
|
|
||||||
CF-Cache-Status:
|
|
||||||
- DYNAMIC
|
|
||||||
CF-RAY:
|
|
||||||
- 8f85d9741d0cf217-GRU
|
|
||||||
Connection:
|
|
||||||
- keep-alive
|
|
||||||
Content-Encoding:
|
|
||||||
- gzip
|
|
||||||
Content-Type:
|
|
||||||
- application/json
|
|
||||||
Date:
|
|
||||||
- Fri, 27 Dec 2024 02:23:40 GMT
|
|
||||||
Server:
|
|
||||||
- cloudflare
|
|
||||||
Transfer-Encoding:
|
|
||||||
- chunked
|
|
||||||
X-Content-Type-Options:
|
|
||||||
- nosniff
|
|
||||||
access-control-expose-headers:
|
|
||||||
- X-Request-ID
|
|
||||||
alt-svc:
|
|
||||||
- h3=":443"; ma=86400
|
|
||||||
openai-organization:
|
|
||||||
- crewai-iuxna1
|
|
||||||
openai-processing-ms:
|
|
||||||
- '5136'
|
|
||||||
openai-version:
|
|
||||||
- '2020-10-01'
|
|
||||||
strict-transport-security:
|
|
||||||
- max-age=31536000; includeSubDomains; preload
|
|
||||||
x-ratelimit-limit-input-images:
|
|
||||||
- '50000'
|
|
||||||
x-ratelimit-limit-requests:
|
|
||||||
- '10000'
|
|
||||||
x-ratelimit-limit-tokens:
|
|
||||||
- '30000000'
|
|
||||||
x-ratelimit-remaining-input-images:
|
|
||||||
- '49999'
|
|
||||||
x-ratelimit-remaining-requests:
|
|
||||||
- '9999'
|
|
||||||
x-ratelimit-remaining-tokens:
|
|
||||||
- '29998756'
|
|
||||||
x-ratelimit-reset-input-images:
|
|
||||||
- 1ms
|
|
||||||
x-ratelimit-reset-requests:
|
|
||||||
- 6ms
|
|
||||||
x-ratelimit-reset-tokens:
|
|
||||||
- 2ms
|
|
||||||
x-request-id:
|
|
||||||
- req_57a7430712d4ff4a81f600ffb94d3b6e
|
|
||||||
http_version: HTTP/1.1
|
|
||||||
status_code: 200
|
|
||||||
- request:
|
|
||||||
body: '{"messages": [{"role": "system", "content": "You are Image Analyst. You''re
|
|
||||||
an expert at visual analysis, trained to notice and describe details in images.\nYour
|
|
||||||
personal goal is: Analyze images with high attention to detail\nYou ONLY have
|
|
||||||
access to the following tools, and should NEVER make up tools that are not listed
|
|
||||||
here:\n\nTool Name: Add image to content\nTool Arguments: {''image_url'': {''description'':
|
|
||||||
''The URL or path of the image to add'', ''type'': ''str''}, ''action'': {''description'':
|
|
||||||
''Optional context or question about the image'', ''type'': ''str''}}\nTool
|
|
||||||
Description: See image to understand it''s content, you can optionally ask a
|
|
||||||
question about the image\n\nUse the following format:\n\nThought: you should
|
|
||||||
always think about what to do\nAction: the action to take, only one name of
|
|
||||||
[Add image to content], just the name, exactly as it''s written.\nAction Input:
|
|
||||||
the input to the action, just a simple python dictionary, enclosed in curly
|
|
||||||
braces, using \" to wrap keys and values.\nObservation: the result of the action\n\nOnce
|
|
||||||
all necessary information is gathered:\n\nThought: I now know the final answer\nFinal
|
|
||||||
Answer: the final answer to the original input question"}, {"role": "user",
|
|
||||||
"content": "\nCurrent Task: \n Analyze the provided image and describe
|
|
||||||
what you see in detail.\n Focus on main elements, colors, composition,
|
|
||||||
and any notable details.\n Image: https://media.istockphoto.com/id/946087016/photo/aerial-view-of-lower-manhattan-new-york.jpg?s=612x612&w=0&k=20&c=viLiMRznQ8v5LzKTt_LvtfPFUVl1oiyiemVdSlm29_k=\n \n\nThis
|
|
||||||
is the expect criteria for your final answer: A comprehensive description of
|
|
||||||
the image contents.\nyou MUST return the actual complete content as the final
|
|
||||||
answer, not a summary.\n\nBegin! This is VERY important to you, use the tools
|
|
||||||
available and give your best Final Answer, your job depends on it!\n\nThought:"},
|
|
||||||
{"role": "user", "content": [{"type": "text", "text": "Analyze the provided
|
|
||||||
image and describe what you see in detail."}, {"type": "image_url", "image_url":
|
|
||||||
{"url": "https://media.istockphoto.com/id/946087016/photo/aerial-view-of-lower-manhattan-new-york.jpg?s=612x612&w=0&k=20&c=viLiMRznQ8v5LzKTt_LvtfPFUVl1oiyiemVdSlm29_k="}}]},
|
|
||||||
{"role": "user", "content": "I did it wrong. Invalid Format: I missed the ''Action:''
|
|
||||||
after ''Thought:''. I will do right next, and don''t use a tool I have already
|
|
||||||
used.\n\nIf you don''t need to use any more tools, you must give your best complete
|
|
||||||
final answer, make sure it satisfies the expected criteria, use the EXACT format
|
|
||||||
below:\n\nThought: I now can give a great answer\nFinal Answer: my best complete
|
|
||||||
final answer to the task.\n\n"}], "model": "gpt-4o", "stop": ["\nObservation:"],
|
|
||||||
"stream": false}'
|
|
||||||
headers:
|
|
||||||
accept:
|
|
||||||
- application/json
|
|
||||||
accept-encoding:
|
|
||||||
- gzip, deflate
|
|
||||||
connection:
|
|
||||||
- keep-alive
|
|
||||||
content-length:
|
|
||||||
- '2717'
|
|
||||||
content-type:
|
|
||||||
- application/json
|
|
||||||
cookie:
|
|
||||||
- __cf_bm=kJ1pw1xjCMSxjHSS8iJC5z_j2PZxl.i387KCpj9xNZU-1735266215-1.0.1.1-Ybg0wVTsrBlpVZmtQyA1ullY8m3v2Ix0N_SYlhr9z7zKfbLeqGZEVL37YSY.dvIiLVY3XPZzMtG8Xwo6UucW6A;
|
|
||||||
_cfuvid=v_wJZ5m7qCjrnRfks0gT2GAk9yR14BdIDAQiQR7xxI8-1735266215000-0.0.1.1-604800000
|
|
||||||
host:
|
|
||||||
- api.openai.com
|
|
||||||
user-agent:
|
|
||||||
- OpenAI/Python 1.52.1
|
|
||||||
x-stainless-arch:
|
|
||||||
- arm64
|
|
||||||
x-stainless-async:
|
|
||||||
- 'false'
|
|
||||||
x-stainless-lang:
|
|
||||||
- python
|
|
||||||
x-stainless-os:
|
|
||||||
- MacOS
|
|
||||||
x-stainless-package-version:
|
|
||||||
- 1.52.1
|
|
||||||
x-stainless-raw-response:
|
|
||||||
- 'true'
|
|
||||||
x-stainless-retry-count:
|
|
||||||
- '0'
|
|
||||||
x-stainless-runtime:
|
|
||||||
- CPython
|
|
||||||
x-stainless-runtime-version:
|
|
||||||
- 3.11.7
|
|
||||||
method: POST
|
|
||||||
uri: https://api.openai.com/v1/chat/completions
|
|
||||||
response:
|
|
||||||
content: "{\n \"id\": \"chatcmpl-AiuInuYNldaQVo6B1EsEquT1VFMN7\",\n \"object\":
|
|
||||||
\"chat.completion\",\n \"created\": 1735266221,\n \"model\": \"gpt-4o-2024-08-06\",\n
|
|
||||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
|
||||||
\"assistant\",\n \"content\": \"Thought: I now can give a great answer\\nFinal
|
|
||||||
Answer: The image is an aerial view of Lower Manhattan in New York City. The
|
|
||||||
photograph prominently features the cluster of skyscrapers that characterizes
|
|
||||||
the area, with One World Trade Center standing out as a particularly tall and
|
|
||||||
iconic structure. The buildings vary in color, with shades of glassy blue, grey,
|
|
||||||
and natural stone dominating the skyline. In the bottom part of the image, there
|
|
||||||
is a green space, likely Battery Park, providing a stark contrast to the dense
|
|
||||||
urban environment, with trees and pathways visible. The water surrounding Manhattan
|
|
||||||
is a deep blue, and several piers jut into the harbor. The Hudson River is visible
|
|
||||||
on the left, and the East River can be seen on the right, framing the island.
|
|
||||||
The overall composition captures the bustling and vibrant nature of New York\u2019s
|
|
||||||
financial hub, with bright sunlight illuminating the buildings, casting sharp
|
|
||||||
shadows and enhancing the depth of the cityscape. The sky is clear, suggesting
|
|
||||||
a sunny day with good visibility.\",\n \"refusal\": null\n },\n
|
|
||||||
\ \"logprobs\": null,\n \"finish_reason\": \"stop\"\n }\n ],\n
|
|
||||||
\ \"usage\": {\n \"prompt_tokens\": 952,\n \"completion_tokens\": 203,\n
|
|
||||||
\ \"total_tokens\": 1155,\n \"prompt_tokens_details\": {\n \"cached_tokens\":
|
|
||||||
0,\n \"audio_tokens\": 0\n },\n \"completion_tokens_details\": {\n
|
|
||||||
\ \"reasoning_tokens\": 0,\n \"audio_tokens\": 0,\n \"accepted_prediction_tokens\":
|
|
||||||
0,\n \"rejected_prediction_tokens\": 0\n }\n },\n \"system_fingerprint\":
|
|
||||||
\"fp_5f20662549\"\n}\n"
|
|
||||||
headers:
|
|
||||||
CF-Cache-Status:
|
|
||||||
- DYNAMIC
|
|
||||||
CF-RAY:
|
|
||||||
- 8f85d995ad1ef217-GRU
|
|
||||||
Connection:
|
|
||||||
- keep-alive
|
|
||||||
Content-Encoding:
|
|
||||||
- gzip
|
|
||||||
Content-Type:
|
|
||||||
- application/json
|
|
||||||
Date:
|
|
||||||
- Fri, 27 Dec 2024 02:23:43 GMT
|
|
||||||
Server:
|
|
||||||
- cloudflare
|
|
||||||
Transfer-Encoding:
|
|
||||||
- chunked
|
|
||||||
X-Content-Type-Options:
|
|
||||||
- nosniff
|
|
||||||
access-control-expose-headers:
|
|
||||||
- X-Request-ID
|
|
||||||
alt-svc:
|
|
||||||
- h3=":443"; ma=86400
|
|
||||||
openai-organization:
|
|
||||||
- crewai-iuxna1
|
|
||||||
openai-processing-ms:
|
|
||||||
- '3108'
|
|
||||||
openai-version:
|
|
||||||
- '2020-10-01'
|
|
||||||
strict-transport-security:
|
|
||||||
- max-age=31536000; includeSubDomains; preload
|
|
||||||
x-ratelimit-limit-input-images:
|
|
||||||
- '50000'
|
|
||||||
x-ratelimit-limit-requests:
|
|
||||||
- '10000'
|
|
||||||
x-ratelimit-limit-tokens:
|
|
||||||
- '30000000'
|
|
||||||
x-ratelimit-remaining-input-images:
|
|
||||||
- '49999'
|
|
||||||
x-ratelimit-remaining-requests:
|
|
||||||
- '9999'
|
|
||||||
x-ratelimit-remaining-tokens:
|
|
||||||
- '29998656'
|
|
||||||
x-ratelimit-reset-input-images:
|
|
||||||
- 1ms
|
|
||||||
x-ratelimit-reset-requests:
|
|
||||||
- 6ms
|
|
||||||
x-ratelimit-reset-tokens:
|
|
||||||
- 2ms
|
|
||||||
x-request-id:
|
|
||||||
- req_45f0e3d457a18f973a59074d16f137b6
http_version: HTTP/1.1
status_code: 200
version: 1
@@ -1,569 +0,0 @@
interactions:
- request:
body: '{"messages": [{"role": "system", "content": "You are Researcher. You''re
|
|
||||||
an expert researcher, specialized in technology, software engineering, AI and
|
|
||||||
startups. You work as a freelancer and is now working on doing research and
|
|
||||||
analysis for a new customer.\nYour personal goal is: Make the best research
|
|
||||||
and analysis on content about AI and AI agents\nYou ONLY have access to the
|
|
||||||
following tools, and should NEVER make up tools that are not listed here:\n\nTool
|
|
||||||
Name: Test Tool\nTool Arguments: {''query'': {''description'': ''Query to process'',
|
|
||||||
''type'': ''str''}}\nTool Description: A test tool that just returns the input\n\nUse
|
|
||||||
the following format:\n\nThought: you should always think about what to do\nAction:
|
|
||||||
the action to take, only one name of [Test Tool], just the name, exactly as
|
|
||||||
it''s written.\nAction Input: the input to the action, just a simple python
|
|
||||||
dictionary, enclosed in curly braces, using \" to wrap keys and values.\nObservation:
|
|
||||||
the result of the action\n\nOnce all necessary information is gathered:\n\nThought:
|
|
||||||
I now know the final answer\nFinal Answer: the final answer to the original
|
|
||||||
input question"}, {"role": "user", "content": "\nCurrent Task: Write a test
|
|
||||||
task\n\nThis is the expect criteria for your final answer: Test output\nyou
|
|
||||||
MUST return the actual complete content as the final answer, not a summary.\n\nBegin!
|
|
||||||
This is VERY important to you, use the tools available and give your best Final
|
|
||||||
Answer, your job depends on it!\n\nThought:"}], "model": "gpt-4o-mini", "stop":
|
|
||||||
["\nObservation:"], "stream": false}'
headers:
accept:
- application/json
accept-encoding:
- gzip, deflate
connection:
- keep-alive
content-length:
- '1536'
content-type:
- application/json
cookie:
- _cfuvid=2u_Xw.i716TDjD2vb2mvMyWxhA4q1MM1JvbrA8CNZpI-1734895557894-0.0.1.1-604800000
host:
- api.openai.com
user-agent:
- OpenAI/Python 1.52.1
x-stainless-arch:
- arm64
x-stainless-async:
- 'false'
x-stainless-lang:
- python
x-stainless-os:
- MacOS
x-stainless-package-version:
- 1.52.1
x-stainless-raw-response:
- 'true'
x-stainless-retry-count:
- '0'
x-stainless-runtime:
- CPython
x-stainless-runtime-version:
- 3.11.7
method: POST
uri: https://api.openai.com/v1/chat/completions
response:
|
|
||||||
content: "{\n \"id\": \"chatcmpl-AhQfznhDMtsr58XvTuRDZoB1kxwfK\",\n \"object\":
|
|
||||||
\"chat.completion\",\n \"created\": 1734914011,\n \"model\": \"gpt-4o-mini-2024-07-18\",\n
|
|
||||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
|
||||||
\"assistant\",\n \"content\": \"I need to come up with a suitable test
|
|
||||||
task that meets the criteria provided. I will focus on outlining a clear and
|
|
||||||
effective test task related to AI and AI agents.\\n\\nAction: Test Tool\\nAction
|
|
||||||
Input: {\\\"query\\\": \\\"Create a test task that involves evaluating the performance
|
|
||||||
of an AI agent in a given scenario, including criteria for success, tools required,
|
|
||||||
and process for assessment.\\\"}\",\n \"refusal\": null\n },\n \"logprobs\":
|
|
||||||
null,\n \"finish_reason\": \"stop\"\n }\n ],\n \"usage\": {\n \"prompt_tokens\":
|
|
||||||
298,\n \"completion_tokens\": 78,\n \"total_tokens\": 376,\n \"prompt_tokens_details\":
|
|
||||||
{\n \"cached_tokens\": 0,\n \"audio_tokens\": 0\n },\n \"completion_tokens_details\":
|
|
||||||
{\n \"reasoning_tokens\": 0,\n \"audio_tokens\": 0,\n \"accepted_prediction_tokens\":
|
|
||||||
0,\n \"rejected_prediction_tokens\": 0\n }\n },\n \"system_fingerprint\":
|
|
||||||
\"fp_d02d531b47\"\n}\n"
|
|
||||||
headers:
|
|
||||||
CF-RAY:
|
|
||||||
- 8f6442b868fda486-GRU
|
|
||||||
Connection:
|
|
||||||
- keep-alive
|
|
||||||
Content-Encoding:
|
|
||||||
- gzip
|
|
||||||
Content-Type:
|
|
||||||
- application/json
|
|
||||||
Date:
|
|
||||||
- Mon, 23 Dec 2024 00:33:32 GMT
|
|
||||||
Server:
|
|
||||||
- cloudflare
|
|
||||||
Set-Cookie:
|
|
||||||
- __cf_bm=i6jvNjhsDne300GPAeEmyiJJKYqy7OPuamFG_kht3KE-1734914012-1.0.1.1-tCeVANAF521vkXpBdgYw.ov.fYUr6t5QC4LG_DugWyzu4C60Pi2CruTVniUgfCvkcu6rdHA5DwnaEZf2jFaRCQ;
|
|
||||||
path=/; expires=Mon, 23-Dec-24 01:03:32 GMT; domain=.api.openai.com; HttpOnly;
|
|
||||||
Secure; SameSite=None
|
|
||||||
Transfer-Encoding:
|
|
||||||
- chunked
|
|
||||||
X-Content-Type-Options:
|
|
||||||
- nosniff
|
|
||||||
access-control-expose-headers:
|
|
||||||
- X-Request-ID
|
|
||||||
alt-svc:
|
|
||||||
- h3=":443"; ma=86400
|
|
||||||
cf-cache-status:
|
|
||||||
- DYNAMIC
|
|
||||||
openai-organization:
|
|
||||||
- crewai-iuxna1
|
|
||||||
openai-processing-ms:
|
|
||||||
- '1400'
|
|
||||||
openai-version:
|
|
||||||
- '2020-10-01'
|
|
||||||
strict-transport-security:
|
|
||||||
- max-age=31536000; includeSubDomains; preload
|
|
||||||
x-ratelimit-limit-requests:
|
|
||||||
- '30000'
|
|
||||||
x-ratelimit-limit-tokens:
|
|
||||||
- '150000000'
|
|
||||||
x-ratelimit-remaining-requests:
|
|
||||||
- '29999'
|
|
||||||
x-ratelimit-remaining-tokens:
|
|
||||||
- '149999642'
|
|
||||||
x-ratelimit-reset-requests:
|
|
||||||
- 2ms
|
|
||||||
x-ratelimit-reset-tokens:
|
|
||||||
- 0s
|
|
||||||
x-request-id:
|
|
||||||
- req_c3e50e9ca9dc22de5572692e1a9c0f16
|
|
||||||
http_version: HTTP/1.1
|
|
||||||
status_code: 200
|
|
||||||
- request:
|
|
||||||
body: !!binary |
|
|
||||||
CrBzCiQKIgoMc2VydmljZS5uYW1lEhIKEGNyZXdBSS10ZWxlbWV0cnkSh3MKEgoQY3Jld2FpLnRl
|
|
||||||
bGVtZXRyeRLUCwoQEr8cFisEEEEUtXBvovq6lhIIYdkQ+ekBh3wqDENyZXcgQ3JlYXRlZDABOThc
|
|
||||||
YLAZpxMYQfCuabAZpxMYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODYuMEoaCg5weXRob25fdmVy
|
|
||||||
c2lvbhIICgYzLjExLjdKLgoIY3Jld19rZXkSIgogZGUxMDFkODU1M2VhMDI0NTM3YTA4ZjgxMmVl
|
|
||||||
NmI3NGFKMQoHY3Jld19pZBImCiRmNTc2MjViZC1jZmY3LTRlNGMtYWM1Zi0xZWFiNjQyMzJjMmRK
|
|
||||||
HAoMY3Jld19wcm9jZXNzEgwKCnNlcXVlbnRpYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdf
|
|
||||||
bnVtYmVyX29mX3Rhc2tzEgIYAkobChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgCSpIFCgtjcmV3
|
|
||||||
X2FnZW50cxKCBQr/BFt7ImtleSI6ICI4YmQyMTM5YjU5NzUxODE1MDZlNDFmZDljNDU2M2Q3NSIs
|
|
||||||
ICJpZCI6ICI1Y2Y0OWVjNy05NWYzLTRkZDctODU3Mi1mODAwNDA4NjBiMjgiLCAicm9sZSI6ICJS
|
|
||||||
ZXNlYXJjaGVyIiwgInZlcmJvc2U/IjogZmFsc2UsICJtYXhfaXRlciI6IDIwLCAibWF4X3JwbSI6
|
|
||||||
IG51bGwsICJmdW5jdGlvbl9jYWxsaW5nX2xsbSI6ICIiLCAibGxtIjogImdwdC00by1taW5pIiwg
|
|
||||||
ImRlbGVnYXRpb25fZW5hYmxlZD8iOiBmYWxzZSwgImFsbG93X2NvZGVfZXhlY3V0aW9uPyI6IGZh
|
|
||||||
bHNlLCAibWF4X3JldHJ5X2xpbWl0IjogMiwgInRvb2xzX25hbWVzIjogW119LCB7ImtleSI6ICI5
|
|
||||||
YTUwMTVlZjQ4OTVkYzYyNzhkNTQ4MThiYTQ0NmFmNyIsICJpZCI6ICI0MTEyM2QzZC01NmEwLTRh
|
|
||||||
NTgtYTljNi1mZjUwNjRmZjNmNTEiLCAicm9sZSI6ICJTZW5pb3IgV3JpdGVyIiwgInZlcmJvc2U/
|
|
||||||
IjogZmFsc2UsICJtYXhfaXRlciI6IDIwLCAibWF4X3JwbSI6IG51bGwsICJmdW5jdGlvbl9jYWxs
|
|
||||||
aW5nX2xsbSI6ICIiLCAibGxtIjogImdwdC00by1taW5pIiwgImRlbGVnYXRpb25fZW5hYmxlZD8i
|
|
||||||
OiBmYWxzZSwgImFsbG93X2NvZGVfZXhlY3V0aW9uPyI6IGZhbHNlLCAibWF4X3JldHJ5X2xpbWl0
|
|
||||||
IjogMiwgInRvb2xzX25hbWVzIjogW119XUrvAwoKY3Jld190YXNrcxLgAwrdA1t7ImtleSI6ICI5
|
|
||||||
NDRhZWYwYmFjODQwZjFjMjdiZDgzYTkzN2JjMzYxYiIsICJpZCI6ICI3ZDM2NDFhNi1hZmM4LTRj
|
|
||||||
NmMtYjkzMy0wNGZlZjY2NjUxN2MiLCAiYXN5bmNfZXhlY3V0aW9uPyI6IGZhbHNlLCAiaHVtYW5f
|
|
||||||
aW5wdXQ/IjogZmFsc2UsICJhZ2VudF9yb2xlIjogIlJlc2VhcmNoZXIiLCAiYWdlbnRfa2V5Ijog
|
|
||||||
IjhiZDIxMzliNTk3NTE4MTUwNmU0MWZkOWM0NTYzZDc1IiwgInRvb2xzX25hbWVzIjogW119LCB7
|
|
||||||
ImtleSI6ICI5ZjJkNGU5M2FiNTkwYzcyNTg4NzAyNzUwOGFmOTI3OCIsICJpZCI6ICIzNTVjZjFh
|
|
||||||
OS1lOTkzLTQxMTQtOWM0NC0yZDM5MDlhMDljNWYiLCAiYXN5bmNfZXhlY3V0aW9uPyI6IGZhbHNl
|
|
||||||
LCAiaHVtYW5faW5wdXQ/IjogZmFsc2UsICJhZ2VudF9yb2xlIjogIlNlbmlvciBXcml0ZXIiLCAi
|
|
||||||
YWdlbnRfa2V5IjogIjlhNTAxNWVmNDg5NWRjNjI3OGQ1NDgxOGJhNDQ2YWY3IiwgInRvb2xzX25h
|
|
||||||
bWVzIjogW119XXoCGAGFAQABAAASjgIKEHbV3nDt+ndNQNix1f+5+cASCL+l6KV3+FEpKgxUYXNr
|
|
||||||
IENyZWF0ZWQwATmgfo+wGacTGEEQE5CwGacTGEouCghjcmV3X2tleRIiCiBkZTEwMWQ4NTUzZWEw
|
|
||||||
MjQ1MzdhMDhmODEyZWU2Yjc0YUoxCgdjcmV3X2lkEiYKJGY1NzYyNWJkLWNmZjctNGU0Yy1hYzVm
|
|
||||||
LTFlYWI2NDIzMmMyZEouCgh0YXNrX2tleRIiCiA5NDRhZWYwYmFjODQwZjFjMjdiZDgzYTkzN2Jj
|
|
||||||
MzYxYkoxCgd0YXNrX2lkEiYKJDdkMzY0MWE2LWFmYzgtNGM2Yy1iOTMzLTA0ZmVmNjY2NTE3Y3oC
|
|
||||||
GAGFAQABAAASjgIKECqDENVoAz+3ybVKR/wz7dMSCKI9ILLFYx8SKgxUYXNrIENyZWF0ZWQwATng
|
|
||||||
63CzGacTGEE4AXKzGacTGEouCghjcmV3X2tleRIiCiBkZTEwMWQ4NTUzZWEwMjQ1MzdhMDhmODEy
|
|
||||||
ZWU2Yjc0YUoxCgdjcmV3X2lkEiYKJGY1NzYyNWJkLWNmZjctNGU0Yy1hYzVmLTFlYWI2NDIzMmMy
|
|
||||||
ZEouCgh0YXNrX2tleRIiCiA5ZjJkNGU5M2FiNTkwYzcyNTg4NzAyNzUwOGFmOTI3OEoxCgd0YXNr
|
|
||||||
X2lkEiYKJDM1NWNmMWE5LWU5OTMtNDExNC05YzQ0LTJkMzkwOWEwOWM1ZnoCGAGFAQABAAAS1AsK
|
|
||||||
EOofSLF1HDmhYMt7eIAeFo8SCCaKUQMuWNdnKgxDcmV3IENyZWF0ZWQwATkYKA62GacTGEFwlhW2
|
|
||||||
GacTGEoaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjg2LjBKGgoOcHl0aG9uX3ZlcnNpb24SCAoGMy4x
|
|
||||||
MS43Si4KCGNyZXdfa2V5EiIKIDRlOGU0MmNmMWVhN2U2NjhhMGU5MzJhNzAyMDY1NzQ5SjEKB2Ny
|
|
||||||
ZXdfaWQSJgokMmIzNTVjZDMtY2MwNi00Y2QxLTk0YjgtZTU5YjM5OGI3MjEzShwKDGNyZXdfcHJv
|
|
||||||
Y2VzcxIMCgpzZXF1ZW50aWFsShEKC2NyZXdfbWVtb3J5EgIQAEoaChRjcmV3X251bWJlcl9vZl90
|
|
||||||
YXNrcxICGAJKGwoVY3Jld19udW1iZXJfb2ZfYWdlbnRzEgIYAkqSBQoLY3Jld19hZ2VudHMSggUK
|
|
||||||
/wRbeyJrZXkiOiAiOGJkMjEzOWI1OTc1MTgxNTA2ZTQxZmQ5YzQ1NjNkNzUiLCAiaWQiOiAiNWNm
|
|
||||||
NDllYzctOTVmMy00ZGQ3LTg1NzItZjgwMDQwODYwYjI4IiwgInJvbGUiOiAiUmVzZWFyY2hlciIs
|
|
||||||
ICJ2ZXJib3NlPyI6IGZhbHNlLCAibWF4X2l0ZXIiOiAyMCwgIm1heF9ycG0iOiBudWxsLCAiZnVu
|
|
||||||
Y3Rpb25fY2FsbGluZ19sbG0iOiAiIiwgImxsbSI6ICJncHQtNG8tbWluaSIsICJkZWxlZ2F0aW9u
|
|
||||||
X2VuYWJsZWQ/IjogZmFsc2UsICJhbGxvd19jb2RlX2V4ZWN1dGlvbj8iOiBmYWxzZSwgIm1heF9y
|
|
||||||
ZXRyeV9saW1pdCI6IDIsICJ0b29sc19uYW1lcyI6IFtdfSwgeyJrZXkiOiAiOWE1MDE1ZWY0ODk1
|
|
||||||
ZGM2Mjc4ZDU0ODE4YmE0NDZhZjciLCAiaWQiOiAiNDExMjNkM2QtNTZhMC00YTU4LWE5YzYtZmY1
|
|
||||||
MDY0ZmYzZjUxIiwgInJvbGUiOiAiU2VuaW9yIFdyaXRlciIsICJ2ZXJib3NlPyI6IGZhbHNlLCAi
|
|
||||||
bWF4X2l0ZXIiOiAyMCwgIm1heF9ycG0iOiBudWxsLCAiZnVuY3Rpb25fY2FsbGluZ19sbG0iOiAi
|
|
||||||
IiwgImxsbSI6ICJncHQtNG8tbWluaSIsICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogZmFsc2UsICJh
|
|
||||||
bGxvd19jb2RlX2V4ZWN1dGlvbj8iOiBmYWxzZSwgIm1heF9yZXRyeV9saW1pdCI6IDIsICJ0b29s
|
|
||||||
c19uYW1lcyI6IFtdfV1K7wMKCmNyZXdfdGFza3MS4AMK3QNbeyJrZXkiOiAiNjc4NDlmZjcxN2Ri
|
|
||||||
YWRhYmExYjk1ZDVmMmRmY2VlYTEiLCAiaWQiOiAiOGE5OTgxMDYtZjg5Zi00YTQ5LThjZjEtYjk4
|
|
||||||
MzQ5ZDE1NDRmIiwgImFzeW5jX2V4ZWN1dGlvbj8iOiBmYWxzZSwgImh1bWFuX2lucHV0PyI6IGZh
|
|
||||||
bHNlLCAiYWdlbnRfcm9sZSI6ICJSZXNlYXJjaGVyIiwgImFnZW50X2tleSI6ICI4YmQyMTM5YjU5
|
|
||||||
NzUxODE1MDZlNDFmZDljNDU2M2Q3NSIsICJ0b29sc19uYW1lcyI6IFtdfSwgeyJrZXkiOiAiODRh
|
|
||||||
ZjlmYzFjZDMzMTk5Y2ViYjlkNDE0MjE4NWY4MDIiLCAiaWQiOiAiYTViMTg0MDgtYjA1OC00ZDE1
|
|
||||||
LTkyMmUtNDJkN2M5Y2ViYjFhIiwgImFzeW5jX2V4ZWN1dGlvbj8iOiBmYWxzZSwgImh1bWFuX2lu
|
|
||||||
cHV0PyI6IGZhbHNlLCAiYWdlbnRfcm9sZSI6ICJTZW5pb3IgV3JpdGVyIiwgImFnZW50X2tleSI6
|
|
||||||
ICI5YTUwMTVlZjQ4OTVkYzYyNzhkNTQ4MThiYTQ0NmFmNyIsICJ0b29sc19uYW1lcyI6IFtdfV16
|
|
||||||
AhgBhQEAAQAAEsIJChDCLrcWQ+nu3SxOgnq50XhSEghjozRtuCFA0SoMQ3JldyBDcmVhdGVkMAE5
|
|
||||||
CDeCthmnExhBmHiIthmnExhKGgoOY3Jld2FpX3ZlcnNpb24SCAoGMC44Ni4wShoKDnB5dGhvbl92
|
|
||||||
ZXJzaW9uEggKBjMuMTEuN0ouCghjcmV3X2tleRIiCiBlM2ZkYTBmMzExMGZlODBiMTg5NDdjMDE0
|
|
||||||
NzE0MzBhNEoxCgdjcmV3X2lkEiYKJGM1ZDQ0YjY5LTRhNzMtNDA3Zi1iY2RhLTUzZmUxZTQ3YTU3
|
|
||||||
M0oeCgxjcmV3X3Byb2Nlc3MSDgoMaGllcmFyY2hpY2FsShEKC2NyZXdfbWVtb3J5EgIQAEoaChRj
|
|
||||||
cmV3X251bWJlcl9vZl90YXNrcxICGAFKGwoVY3Jld19udW1iZXJfb2ZfYWdlbnRzEgIYAkqSBQoL
|
|
||||||
Y3Jld19hZ2VudHMSggUK/wRbeyJrZXkiOiAiOGJkMjEzOWI1OTc1MTgxNTA2ZTQxZmQ5YzQ1NjNk
|
|
||||||
NzUiLCAiaWQiOiAiNWNmNDllYzctOTVmMy00ZGQ3LTg1NzItZjgwMDQwODYwYjI4IiwgInJvbGUi
|
|
||||||
OiAiUmVzZWFyY2hlciIsICJ2ZXJib3NlPyI6IGZhbHNlLCAibWF4X2l0ZXIiOiAyMCwgIm1heF9y
|
|
||||||
cG0iOiBudWxsLCAiZnVuY3Rpb25fY2FsbGluZ19sbG0iOiAiIiwgImxsbSI6ICJncHQtNG8tbWlu
|
|
||||||
aSIsICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogZmFsc2UsICJhbGxvd19jb2RlX2V4ZWN1dGlvbj8i
|
|
||||||
OiBmYWxzZSwgIm1heF9yZXRyeV9saW1pdCI6IDIsICJ0b29sc19uYW1lcyI6IFtdfSwgeyJrZXki
|
|
||||||
OiAiOWE1MDE1ZWY0ODk1ZGM2Mjc4ZDU0ODE4YmE0NDZhZjciLCAiaWQiOiAiNDExMjNkM2QtNTZh
|
|
||||||
MC00YTU4LWE5YzYtZmY1MDY0ZmYzZjUxIiwgInJvbGUiOiAiU2VuaW9yIFdyaXRlciIsICJ2ZXJi
|
|
||||||
b3NlPyI6IGZhbHNlLCAibWF4X2l0ZXIiOiAyMCwgIm1heF9ycG0iOiBudWxsLCAiZnVuY3Rpb25f
|
|
||||||
Y2FsbGluZ19sbG0iOiAiIiwgImxsbSI6ICJncHQtNG8tbWluaSIsICJkZWxlZ2F0aW9uX2VuYWJs
|
|
||||||
ZWQ/IjogZmFsc2UsICJhbGxvd19jb2RlX2V4ZWN1dGlvbj8iOiBmYWxzZSwgIm1heF9yZXRyeV9s
|
|
||||||
aW1pdCI6IDIsICJ0b29sc19uYW1lcyI6IFtdfV1K2wEKCmNyZXdfdGFza3MSzAEKyQFbeyJrZXki
|
|
||||||
OiAiNWZhNjVjMDZhOWUzMWYyYzY5NTQzMjY2OGFjZDYyZGQiLCAiaWQiOiAiNjNhYTVlOTYtYTM4
|
|
||||||
Yy00YjcyLWJiZDQtYjM2NmU5NTlhOWZhIiwgImFzeW5jX2V4ZWN1dGlvbj8iOiBmYWxzZSwgImh1
|
|
||||||
bWFuX2lucHV0PyI6IGZhbHNlLCAiYWdlbnRfcm9sZSI6ICJOb25lIiwgImFnZW50X2tleSI6IG51
|
|
||||||
bGwsICJ0b29sc19uYW1lcyI6IFtdfV16AhgBhQEAAQAAEuYJChA8kiyQ+AFdDSYkp0+TUWKvEgjW
|
|
||||||
0grLw8r5KioMQ3JldyBDcmVhdGVkMAE5iLivvhmnExhBeG21vhmnExhKGgoOY3Jld2FpX3ZlcnNp
|
|
||||||
b24SCAoGMC44Ni4wShoKDnB5dGhvbl92ZXJzaW9uEggKBjMuMTEuN0ouCghjcmV3X2tleRIiCiBl
|
|
||||||
M2ZkYTBmMzExMGZlODBiMTg5NDdjMDE0NzE0MzBhNEoxCgdjcmV3X2lkEiYKJGIzZGQ1MGYxLTI0
|
|
||||||
YWQtNDE5OC04ZGFhLTMwZTU0OTQ3MTlhMEoeCgxjcmV3X3Byb2Nlc3MSDgoMaGllcmFyY2hpY2Fs
|
|
||||||
ShEKC2NyZXdfbWVtb3J5EgIQAEoaChRjcmV3X251bWJlcl9vZl90YXNrcxICGAFKGwoVY3Jld19u
|
|
||||||
dW1iZXJfb2ZfYWdlbnRzEgIYAkqSBQoLY3Jld19hZ2VudHMSggUK/wRbeyJrZXkiOiAiOGJkMjEz
|
|
||||||
OWI1OTc1MTgxNTA2ZTQxZmQ5YzQ1NjNkNzUiLCAiaWQiOiAiNWNmNDllYzctOTVmMy00ZGQ3LTg1
|
|
||||||
NzItZjgwMDQwODYwYjI4IiwgInJvbGUiOiAiUmVzZWFyY2hlciIsICJ2ZXJib3NlPyI6IGZhbHNl
|
|
||||||
LCAibWF4X2l0ZXIiOiAyMCwgIm1heF9ycG0iOiBudWxsLCAiZnVuY3Rpb25fY2FsbGluZ19sbG0i
|
|
||||||
OiAiIiwgImxsbSI6ICJncHQtNG8tbWluaSIsICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogZmFsc2Us
|
|
||||||
ICJhbGxvd19jb2RlX2V4ZWN1dGlvbj8iOiBmYWxzZSwgIm1heF9yZXRyeV9saW1pdCI6IDIsICJ0
|
|
||||||
b29sc19uYW1lcyI6IFtdfSwgeyJrZXkiOiAiOWE1MDE1ZWY0ODk1ZGM2Mjc4ZDU0ODE4YmE0NDZh
|
|
||||||
ZjciLCAiaWQiOiAiNDExMjNkM2QtNTZhMC00YTU4LWE5YzYtZmY1MDY0ZmYzZjUxIiwgInJvbGUi
|
|
||||||
OiAiU2VuaW9yIFdyaXRlciIsICJ2ZXJib3NlPyI6IGZhbHNlLCAibWF4X2l0ZXIiOiAyMCwgIm1h
|
|
||||||
eF9ycG0iOiBudWxsLCAiZnVuY3Rpb25fY2FsbGluZ19sbG0iOiAiIiwgImxsbSI6ICJncHQtNG8t
|
|
||||||
bWluaSIsICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogZmFsc2UsICJhbGxvd19jb2RlX2V4ZWN1dGlv
|
|
||||||
bj8iOiBmYWxzZSwgIm1heF9yZXRyeV9saW1pdCI6IDIsICJ0b29sc19uYW1lcyI6IFtdfV1K/wEK
|
|
||||||
CmNyZXdfdGFza3MS8AEK7QFbeyJrZXkiOiAiNWZhNjVjMDZhOWUzMWYyYzY5NTQzMjY2OGFjZDYy
|
|
||||||
ZGQiLCAiaWQiOiAiNzEyODlkZTAtODQ4My00NDM2LWI2OGMtNDc1MWIzNTU0ZmUzIiwgImFzeW5j
|
|
||||||
X2V4ZWN1dGlvbj8iOiBmYWxzZSwgImh1bWFuX2lucHV0PyI6IGZhbHNlLCAiYWdlbnRfcm9sZSI6
|
|
||||||
ICJSZXNlYXJjaGVyIiwgImFnZW50X2tleSI6ICI4YmQyMTM5YjU5NzUxODE1MDZlNDFmZDljNDU2
|
|
||||||
M2Q3NSIsICJ0b29sc19uYW1lcyI6IFtdfV16AhgBhQEAAQAAEo4CChCTiJL+KK5ff9xnie6eZbEc
|
|
||||||
EghbtQixNaG5DioMVGFzayBDcmVhdGVkMAE5cIXNvhmnExhBuPbNvhmnExhKLgoIY3Jld19rZXkS
|
|
||||||
IgogZTNmZGEwZjMxMTBmZTgwYjE4OTQ3YzAxNDcxNDMwYTRKMQoHY3Jld19pZBImCiRiM2RkNTBm
|
|
||||||
MS0yNGFkLTQxOTgtOGRhYS0zMGU1NDk0NzE5YTBKLgoIdGFza19rZXkSIgogNWZhNjVjMDZhOWUz
|
|
||||||
MWYyYzY5NTQzMjY2OGFjZDYyZGRKMQoHdGFza19pZBImCiQ3MTI4OWRlMC04NDgzLTQ0MzYtYjY4
|
|
||||||
Yy00NzUxYjM1NTRmZTN6AhgBhQEAAQAAEpwBChBCdDi/i+SH0kHHlJKQjmYgEgiemV9jVU5fQSoK
|
|
||||||
VG9vbCBVc2FnZTABOVj/YL8ZpxMYQWCwZr8ZpxMYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODYu
|
|
||||||
MEooCgl0b29sX25hbWUSGwoZRGVsZWdhdGUgd29yayB0byBjb3dvcmtlckoOCghhdHRlbXB0cxIC
|
|
||||||
GAF6AhgBhQEAAQAAEqUBChBRuZ6Z/nNag4ubLeZ8L/8pEghCX4biKNFb6SoTVG9vbCBSZXBlYXRl
|
|
||||||
ZCBVc2FnZTABOUj9wr8ZpxMYQdg+yb8ZpxMYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODYuMEoo
|
|
||||||
Cgl0b29sX25hbWUSGwoZRGVsZWdhdGUgd29yayB0byBjb3dvcmtlckoOCghhdHRlbXB0cxICGAF6
|
|
||||||
AhgBhQEAAQAAEpwBChDnt1bxQsOb0LVscG9GDYVtEgjf62keNMl5ZyoKVG9vbCBVc2FnZTABOdha
|
|
||||||
6MAZpxMYQWii7cAZpxMYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODYuMEooCgl0b29sX25hbWUS
|
|
||||||
GwoZRGVsZWdhdGUgd29yayB0byBjb3dvcmtlckoOCghhdHRlbXB0cxICGAF6AhgBhQEAAQAAEpsB
|
|
||||||
ChDFqFA9b42EIwUxeNLTeScxEgiGFk7FwiNxVioKVG9vbCBVc2FnZTABObDAY8EZpxMYQdhIaMEZ
|
|
||||||
pxMYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODYuMEonCgl0b29sX25hbWUSGgoYQXNrIHF1ZXN0
|
|
||||||
aW9uIHRvIGNvd29ya2VySg4KCGF0dGVtcHRzEgIYAXoCGAGFAQABAAASwgkKEHpB0rbuWbSXijzV
|
|
||||||
QdTa3oQSCNSPnbmqe2PfKgxDcmV3IENyZWF0ZWQwATmIXxTCGacTGEF4GhnCGacTGEoaCg5jcmV3
|
|
||||||
YWlfdmVyc2lvbhIICgYwLjg2LjBKGgoOcHl0aG9uX3ZlcnNpb24SCAoGMy4xMS43Si4KCGNyZXdf
|
|
||||||
a2V5EiIKIGUzZmRhMGYzMTEwZmU4MGIxODk0N2MwMTQ3MTQzMGE0SjEKB2NyZXdfaWQSJgokZGJm
|
|
||||||
YzNjMjctMmRjZS00MjIyLThiYmQtYmMxMjU3OTVlNWI1Sh4KDGNyZXdfcHJvY2VzcxIOCgxoaWVy
|
|
||||||
YXJjaGljYWxKEQoLY3Jld19tZW1vcnkSAhAAShoKFGNyZXdfbnVtYmVyX29mX3Rhc2tzEgIYAUob
|
|
||||||
ChVjcmV3X251bWJlcl9vZl9hZ2VudHMSAhgCSpIFCgtjcmV3X2FnZW50cxKCBQr/BFt7ImtleSI6
|
|
||||||
ICI4YmQyMTM5YjU5NzUxODE1MDZlNDFmZDljNDU2M2Q3NSIsICJpZCI6ICI1Y2Y0OWVjNy05NWYz
|
|
||||||
LTRkZDctODU3Mi1mODAwNDA4NjBiMjgiLCAicm9sZSI6ICJSZXNlYXJjaGVyIiwgInZlcmJvc2U/
|
|
||||||
IjogZmFsc2UsICJtYXhfaXRlciI6IDIwLCAibWF4X3JwbSI6IG51bGwsICJmdW5jdGlvbl9jYWxs
|
|
||||||
aW5nX2xsbSI6ICIiLCAibGxtIjogImdwdC00by1taW5pIiwgImRlbGVnYXRpb25fZW5hYmxlZD8i
|
|
||||||
OiBmYWxzZSwgImFsbG93X2NvZGVfZXhlY3V0aW9uPyI6IGZhbHNlLCAibWF4X3JldHJ5X2xpbWl0
|
|
||||||
IjogMiwgInRvb2xzX25hbWVzIjogW119LCB7ImtleSI6ICI5YTUwMTVlZjQ4OTVkYzYyNzhkNTQ4
|
|
||||||
MThiYTQ0NmFmNyIsICJpZCI6ICI0MTEyM2QzZC01NmEwLTRhNTgtYTljNi1mZjUwNjRmZjNmNTEi
|
|
||||||
LCAicm9sZSI6ICJTZW5pb3IgV3JpdGVyIiwgInZlcmJvc2U/IjogZmFsc2UsICJtYXhfaXRlciI6
|
|
||||||
IDIwLCAibWF4X3JwbSI6IG51bGwsICJmdW5jdGlvbl9jYWxsaW5nX2xsbSI6ICIiLCAibGxtIjog
|
|
||||||
ImdwdC00by1taW5pIiwgImRlbGVnYXRpb25fZW5hYmxlZD8iOiBmYWxzZSwgImFsbG93X2NvZGVf
|
|
||||||
ZXhlY3V0aW9uPyI6IGZhbHNlLCAibWF4X3JldHJ5X2xpbWl0IjogMiwgInRvb2xzX25hbWVzIjog
|
|
||||||
W119XUrbAQoKY3Jld190YXNrcxLMAQrJAVt7ImtleSI6ICI1ZmE2NWMwNmE5ZTMxZjJjNjk1NDMy
|
|
||||||
NjY4YWNkNjJkZCIsICJpZCI6ICIyYWFjOTllMC0yNWVmLTQzN2MtYTJmZi1jZGFlMjg2ZWU2MzQi
|
|
||||||
LCAiYXN5bmNfZXhlY3V0aW9uPyI6IGZhbHNlLCAiaHVtYW5faW5wdXQ/IjogZmFsc2UsICJhZ2Vu
|
|
||||||
dF9yb2xlIjogIk5vbmUiLCAiYWdlbnRfa2V5IjogbnVsbCwgInRvb2xzX25hbWVzIjogW119XXoC
|
|
||||||
GAGFAQABAAAS1QkKEM6Xt0BvAHy+TI7iLC6ovN0SCEfHP30NZESSKgxDcmV3IENyZWF0ZWQwATkg
|
|
||||||
PdnDGacTGEFIPN/DGacTGEoaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjg2LjBKGgoOcHl0aG9uX3Zl
|
|
||||||
cnNpb24SCAoGMy4xMS43Si4KCGNyZXdfa2V5EiIKIGU2NDk1NzNhMjZlNTg3OTBjYWMyMWEzN2Nk
|
|
||||||
NDQ0MzdhSjEKB2NyZXdfaWQSJgokNjE3MDA3NGMtYzU5OS00ODkyLTkwYzYtMTcxYjhkM2Y1OTRh
|
|
||||||
ShwKDGNyZXdfcHJvY2VzcxIMCgpzZXF1ZW50aWFsShEKC2NyZXdfbWVtb3J5EgIQAEoaChRjcmV3
|
|
||||||
X251bWJlcl9vZl90YXNrcxICGAFKGwoVY3Jld19udW1iZXJfb2ZfYWdlbnRzEgIYAkqKBQoLY3Jl
|
|
||||||
d19hZ2VudHMS+gQK9wRbeyJrZXkiOiAiMzI4MjE3YjZjMjk1OWJkZmM0N2NhZDAwZTg0ODkwZDAi
|
|
||||||
LCAiaWQiOiAiYjNmMTczZTktNjY3NS00OTFkLTgyYjctODM4NmRkMjExMDM1IiwgInJvbGUiOiAi
|
|
||||||
Q0VPIiwgInZlcmJvc2U/IjogZmFsc2UsICJtYXhfaXRlciI6IDIwLCAibWF4X3JwbSI6IG51bGws
|
|
||||||
ICJmdW5jdGlvbl9jYWxsaW5nX2xsbSI6ICIiLCAibGxtIjogImdwdC00by1taW5pIiwgImRlbGVn
|
|
||||||
YXRpb25fZW5hYmxlZD8iOiB0cnVlLCAiYWxsb3dfY29kZV9leGVjdXRpb24/IjogZmFsc2UsICJt
|
|
||||||
YXhfcmV0cnlfbGltaXQiOiAyLCAidG9vbHNfbmFtZXMiOiBbXX0sIHsia2V5IjogIjlhNTAxNWVm
|
|
||||||
NDg5NWRjNjI3OGQ1NDgxOGJhNDQ2YWY3IiwgImlkIjogIjQxMTIzZDNkLTU2YTAtNGE1OC1hOWM2
|
|
||||||
LWZmNTA2NGZmM2Y1MSIsICJyb2xlIjogIlNlbmlvciBXcml0ZXIiLCAidmVyYm9zZT8iOiBmYWxz
|
|
||||||
ZSwgIm1heF9pdGVyIjogMjAsICJtYXhfcnBtIjogbnVsbCwgImZ1bmN0aW9uX2NhbGxpbmdfbGxt
|
|
||||||
IjogIiIsICJsbG0iOiAiZ3B0LTRvLW1pbmkiLCAiZGVsZWdhdGlvbl9lbmFibGVkPyI6IGZhbHNl
|
|
||||||
LCAiYWxsb3dfY29kZV9leGVjdXRpb24/IjogZmFsc2UsICJtYXhfcmV0cnlfbGltaXQiOiAyLCAi
|
|
||||||
dG9vbHNfbmFtZXMiOiBbXX1dSvgBCgpjcmV3X3Rhc2tzEukBCuYBW3sia2V5IjogIjBiOWQ2NWRi
|
|
||||||
NmI3YWVkZmIzOThjNTllMmE5ZjcxZWM1IiwgImlkIjogImJiNmI1Njg3LTg5NGMtNDAyNS05M2My
|
|
||||||
LTMyYjdkZmEwZTUxMyIsICJhc3luY19leGVjdXRpb24/IjogZmFsc2UsICJodW1hbl9pbnB1dD8i
|
|
||||||
OiBmYWxzZSwgImFnZW50X3JvbGUiOiAiQ0VPIiwgImFnZW50X2tleSI6ICIzMjgyMTdiNmMyOTU5
|
|
||||||
YmRmYzQ3Y2FkMDBlODQ4OTBkMCIsICJ0b29sc19uYW1lcyI6IFtdfV16AhgBhQEAAQAAEo4CChCK
|
|
||||||
KIL9w7sqoMzG3JItjK8eEgiR4RSmJw+SMSoMVGFzayBDcmVhdGVkMAE5CCjywxmnExhByIXywxmn
|
|
||||||
ExhKLgoIY3Jld19rZXkSIgogZTY0OTU3M2EyNmU1ODc5MGNhYzIxYTM3Y2Q0NDQzN2FKMQoHY3Jl
|
|
||||||
d19pZBImCiQ2MTcwMDc0Yy1jNTk5LTQ4OTItOTBjNi0xNzFiOGQzZjU5NGFKLgoIdGFza19rZXkS
|
|
||||||
IgogMGI5ZDY1ZGI2YjdhZWRmYjM5OGM1OWUyYTlmNzFlYzVKMQoHdGFza19pZBImCiRiYjZiNTY4
|
|
||||||
Ny04OTRjLTQwMjUtOTNjMi0zMmI3ZGZhMGU1MTN6AhgBhQEAAQAAEpwBChD+/zv5udkceIEyIb7d
|
|
||||||
ne5vEgj1My75q1O7UCoKVG9vbCBVc2FnZTABOThPfMQZpxMYQcA4g8QZpxMYShoKDmNyZXdhaV92
|
|
||||||
ZXJzaW9uEggKBjAuODYuMEooCgl0b29sX25hbWUSGwoZRGVsZWdhdGUgd29yayB0byBjb3dvcmtl
|
|
||||||
ckoOCghhdHRlbXB0cxICGAF6AhgBhQEAAQAAEuAJChBIzM1Xa9IhegFDHxt6rj3eEgj9z56V1hXk
|
|
||||||
aCoMQ3JldyBDcmVhdGVkMAE5mEoMxRmnExhBoPsRxRmnExhKGgoOY3Jld2FpX3ZlcnNpb24SCAoG
|
|
||||||
MC44Ni4wShoKDnB5dGhvbl92ZXJzaW9uEggKBjMuMTEuN0ouCghjcmV3X2tleRIiCiBlNjQ5NTcz
|
|
||||||
YTI2ZTU4NzkwY2FjMjFhMzdjZDQ0NDM3YUoxCgdjcmV3X2lkEiYKJGQ4MjhhZWM2LTg2N2MtNDdh
|
|
||||||
YS04ODY4LWQwMWYwNGM0MGE0MUocCgxjcmV3X3Byb2Nlc3MSDAoKc2VxdWVudGlhbEoRCgtjcmV3
|
|
||||||
X21lbW9yeRICEABKGgoUY3Jld19udW1iZXJfb2ZfdGFza3MSAhgBShsKFWNyZXdfbnVtYmVyX29m
|
|
||||||
X2FnZW50cxICGAJKigUKC2NyZXdfYWdlbnRzEvoECvcEW3sia2V5IjogIjMyODIxN2I2YzI5NTli
|
|
||||||
ZGZjNDdjYWQwMGU4NDg5MGQwIiwgImlkIjogImIzZjE3M2U5LTY2NzUtNDkxZC04MmI3LTgzODZk
|
|
||||||
ZDIxMTAzNSIsICJyb2xlIjogIkNFTyIsICJ2ZXJib3NlPyI6IGZhbHNlLCAibWF4X2l0ZXIiOiAy
|
|
||||||
MCwgIm1heF9ycG0iOiBudWxsLCAiZnVuY3Rpb25fY2FsbGluZ19sbG0iOiAiIiwgImxsbSI6ICJn
|
|
||||||
cHQtNG8tbWluaSIsICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogdHJ1ZSwgImFsbG93X2NvZGVfZXhl
|
|
||||||
Y3V0aW9uPyI6IGZhbHNlLCAibWF4X3JldHJ5X2xpbWl0IjogMiwgInRvb2xzX25hbWVzIjogW119
|
|
||||||
LCB7ImtleSI6ICI5YTUwMTVlZjQ4OTVkYzYyNzhkNTQ4MThiYTQ0NmFmNyIsICJpZCI6ICI0MTEy
|
|
||||||
M2QzZC01NmEwLTRhNTgtYTljNi1mZjUwNjRmZjNmNTEiLCAicm9sZSI6ICJTZW5pb3IgV3JpdGVy
|
|
||||||
IiwgInZlcmJvc2U/IjogZmFsc2UsICJtYXhfaXRlciI6IDIwLCAibWF4X3JwbSI6IG51bGwsICJm
|
|
||||||
dW5jdGlvbl9jYWxsaW5nX2xsbSI6ICIiLCAibGxtIjogImdwdC00by1taW5pIiwgImRlbGVnYXRp
|
|
||||||
b25fZW5hYmxlZD8iOiBmYWxzZSwgImFsbG93X2NvZGVfZXhlY3V0aW9uPyI6IGZhbHNlLCAibWF4
|
|
||||||
X3JldHJ5X2xpbWl0IjogMiwgInRvb2xzX25hbWVzIjogW119XUqDAgoKY3Jld190YXNrcxL0AQrx
|
|
||||||
AVt7ImtleSI6ICIwYjlkNjVkYjZiN2FlZGZiMzk4YzU5ZTJhOWY3MWVjNSIsICJpZCI6ICI5YTBj
|
|
||||||
ODZhZi0wYTE0LTQ4MzgtOTJmZC02NDhhZGM1NzJlMDMiLCAiYXN5bmNfZXhlY3V0aW9uPyI6IGZh
|
|
||||||
bHNlLCAiaHVtYW5faW5wdXQ/IjogZmFsc2UsICJhZ2VudF9yb2xlIjogIkNFTyIsICJhZ2VudF9r
|
|
||||||
ZXkiOiAiMzI4MjE3YjZjMjk1OWJkZmM0N2NhZDAwZTg0ODkwZDAiLCAidG9vbHNfbmFtZXMiOiBb
|
|
||||||
InRlc3QgdG9vbCJdfV16AhgBhQEAAQAAEo4CChDl0EBv/8sdeV8eJ45EUBpxEgj+C7UlokySqSoM
|
|
||||||
VGFzayBDcmVhdGVkMAE5oI8jxRmnExhBYO0jxRmnExhKLgoIY3Jld19rZXkSIgogZTY0OTU3M2Ey
|
|
||||||
NmU1ODc5MGNhYzIxYTM3Y2Q0NDQzN2FKMQoHY3Jld19pZBImCiRkODI4YWVjNi04NjdjLTQ3YWEt
|
|
||||||
ODg2OC1kMDFmMDRjNDBhNDFKLgoIdGFza19rZXkSIgogMGI5ZDY1ZGI2YjdhZWRmYjM5OGM1OWUy
|
|
||||||
YTlmNzFlYzVKMQoHdGFza19pZBImCiQ5YTBjODZhZi0wYTE0LTQ4MzgtOTJmZC02NDhhZGM1NzJl
|
|
||||||
MDN6AhgBhQEAAQAAEpsBChArkcRTKJCaWLUYbx8DLyvTEgikYuS5tmbKNioKVG9vbCBVc2FnZTAB
|
|
||||||
OSh+MscZpxMYQdgTOMcZpxMYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODYuMEonCgl0b29sX25h
|
|
||||||
bWUSGgoYQXNrIHF1ZXN0aW9uIHRvIGNvd29ya2VySg4KCGF0dGVtcHRzEgIYAXoCGAGFAQABAAAS
|
|
||||||
6wkKEHxFJsjiUgQromzfQHpYYMISCBkGairjk9kkKgxDcmV3IENyZWF0ZWQwATk4/rXHGacTGEGY
|
|
||||||
yrvHGacTGEoaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjg2LjBKGgoOcHl0aG9uX3ZlcnNpb24SCAoG
|
|
||||||
My4xMS43Si4KCGNyZXdfa2V5EiIKIGU2NDk1NzNhMjZlNTg3OTBjYWMyMWEzN2NkNDQ0MzdhSjEK
|
|
||||||
B2NyZXdfaWQSJgokMjY3NzEyNzItOTRlZC00NDVkLTg1MGEtYTkyYTZjOWI5YmJkShwKDGNyZXdf
|
|
||||||
cHJvY2VzcxIMCgpzZXF1ZW50aWFsShEKC2NyZXdfbWVtb3J5EgIQAEoaChRjcmV3X251bWJlcl9v
|
|
||||||
Zl90YXNrcxICGAFKGwoVY3Jld19udW1iZXJfb2ZfYWdlbnRzEgIYAkqVBQoLY3Jld19hZ2VudHMS
|
|
||||||
hQUKggVbeyJrZXkiOiAiMzI4MjE3YjZjMjk1OWJkZmM0N2NhZDAwZTg0ODkwZDAiLCAiaWQiOiAi
|
|
||||||
YjNmMTczZTktNjY3NS00OTFkLTgyYjctODM4NmRkMjExMDM1IiwgInJvbGUiOiAiQ0VPIiwgInZl
|
|
||||||
cmJvc2U/IjogZmFsc2UsICJtYXhfaXRlciI6IDIwLCAibWF4X3JwbSI6IG51bGwsICJmdW5jdGlv
|
|
||||||
bl9jYWxsaW5nX2xsbSI6ICIiLCAibGxtIjogImdwdC00by1taW5pIiwgImRlbGVnYXRpb25fZW5h
|
|
||||||
YmxlZD8iOiB0cnVlLCAiYWxsb3dfY29kZV9leGVjdXRpb24/IjogZmFsc2UsICJtYXhfcmV0cnlf
|
|
||||||
bGltaXQiOiAyLCAidG9vbHNfbmFtZXMiOiBbInRlc3QgdG9vbCJdfSwgeyJrZXkiOiAiOWE1MDE1
|
|
||||||
ZWY0ODk1ZGM2Mjc4ZDU0ODE4YmE0NDZhZjciLCAiaWQiOiAiNDExMjNkM2QtNTZhMC00YTU4LWE5
|
|
||||||
YzYtZmY1MDY0ZmYzZjUxIiwgInJvbGUiOiAiU2VuaW9yIFdyaXRlciIsICJ2ZXJib3NlPyI6IGZh
|
|
||||||
bHNlLCAibWF4X2l0ZXIiOiAyMCwgIm1heF9ycG0iOiBudWxsLCAiZnVuY3Rpb25fY2FsbGluZ19s
|
|
||||||
bG0iOiAiIiwgImxsbSI6ICJncHQtNG8tbWluaSIsICJkZWxlZ2F0aW9uX2VuYWJsZWQ/IjogZmFs
|
|
||||||
c2UsICJhbGxvd19jb2RlX2V4ZWN1dGlvbj8iOiBmYWxzZSwgIm1heF9yZXRyeV9saW1pdCI6IDIs
|
|
||||||
ICJ0b29sc19uYW1lcyI6IFtdfV1KgwIKCmNyZXdfdGFza3MS9AEK8QFbeyJrZXkiOiAiMGI5ZDY1
|
|
||||||
ZGI2YjdhZWRmYjM5OGM1OWUyYTlmNzFlYzUiLCAiaWQiOiAiNjYzOTEwZjYtNTlkYS00NjE3LTli
|
|
||||||
ZTMtNTBmMDdhNmQ5N2U3IiwgImFzeW5jX2V4ZWN1dGlvbj8iOiBmYWxzZSwgImh1bWFuX2lucHV0
|
|
||||||
PyI6IGZhbHNlLCAiYWdlbnRfcm9sZSI6ICJDRU8iLCAiYWdlbnRfa2V5IjogIjMyODIxN2I2YzI5
|
|
||||||
NTliZGZjNDdjYWQwMGU4NDg5MGQwIiwgInRvb2xzX25hbWVzIjogWyJ0ZXN0IHRvb2wiXX1degIY
|
|
||||||
AYUBAAEAABKOAgoQ1qBlNY8Yu1muyMaMnchyJBII0vE2y9FMwz0qDFRhc2sgQ3JlYXRlZDABObDR
|
|
||||||
zscZpxMYQah5z8cZpxMYSi4KCGNyZXdfa2V5EiIKIGU2NDk1NzNhMjZlNTg3OTBjYWMyMWEzN2Nk
|
|
||||||
NDQ0MzdhSjEKB2NyZXdfaWQSJgokMjY3NzEyNzItOTRlZC00NDVkLTg1MGEtYTkyYTZjOWI5YmJk
|
|
||||||
Si4KCHRhc2tfa2V5EiIKIDBiOWQ2NWRiNmI3YWVkZmIzOThjNTllMmE5ZjcxZWM1SjEKB3Rhc2tf
|
|
||||||
aWQSJgokNjYzOTEwZjYtNTlkYS00NjE3LTliZTMtNTBmMDdhNmQ5N2U3egIYAYUBAAEAABKMAQoQ
|
|
||||||
a8ZDV3ZaBmcOZE5dJ87f1hII7iBRAQfEmdAqClRvb2wgVXNhZ2UwATmYcwjIGacTGEE4RxLIGacT
|
|
||||||
GEoaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjg2LjBKGAoJdG9vbF9uYW1lEgsKCVRlc3QgVG9vbEoO
|
|
||||||
CghhdHRlbXB0cxICGAF6AhgBhQEAAQAAEowBChBqK4036ypaH1gZ3OIOE/0HEgiF8wTQDQGRlSoK
|
|
||||||
VG9vbCBVc2FnZTABOYBiSsgZpxMYQRCYUsgZpxMYShoKDmNyZXdhaV92ZXJzaW9uEggKBjAuODYu
|
|
||||||
MEoYCgl0b29sX25hbWUSCwoJVGVzdCBUb29sSg4KCGF0dGVtcHRzEgIYAXoCGAGFAQABAAASwQcK
|
|
||||||
EIWSiNjtKgeNQ6oIv8gjJ+MSCG8YnypCXfw1KgxDcmV3IENyZWF0ZWQwATnYUW/KGacTGEEoenTK
|
|
||||||
GacTGEoaCg5jcmV3YWlfdmVyc2lvbhIICgYwLjg2LjBKGgoOcHl0aG9uX3ZlcnNpb24SCAoGMy4x
|
|
||||||
MS43Si4KCGNyZXdfa2V5EiIKIDk4MjQ2MGVlMmRkMmNmMTJhNzEzOGI3MDg1OWZlODE3SjEKB2Ny
|
|
||||||
ZXdfaWQSJgokZDNkODZjNmEtNWNmMi00MGI0LWExZGQtMzA5NTYyODdjNWE3ShwKDGNyZXdfcHJv
|
|
||||||
Y2VzcxIMCgpzZXF1ZW50aWFsShEKC2NyZXdfbWVtb3J5EgIQAEoaChRjcmV3X251bWJlcl9vZl90
|
|
||||||
YXNrcxICGAFKGwoVY3Jld19udW1iZXJfb2ZfYWdlbnRzEgIYAUrcAgoLY3Jld19hZ2VudHMSzAIK
|
|
||||||
yQJbeyJrZXkiOiAiOGJkMjEzOWI1OTc1MTgxNTA2ZTQxZmQ5YzQ1NjNkNzUiLCAiaWQiOiAiNWNm
|
|
||||||
NDllYzctOTVmMy00ZGQ3LTg1NzItZjgwMDQwODYwYjI4IiwgInJvbGUiOiAiUmVzZWFyY2hlciIs
|
|
||||||
ICJ2ZXJib3NlPyI6IGZhbHNlLCAibWF4X2l0ZXIiOiAyMCwgIm1heF9ycG0iOiBudWxsLCAiZnVu
|
|
||||||
Y3Rpb25fY2FsbGluZ19sbG0iOiAiIiwgImxsbSI6ICJncHQtNG8tbWluaSIsICJkZWxlZ2F0aW9u
|
|
||||||
X2VuYWJsZWQ/IjogZmFsc2UsICJhbGxvd19jb2RlX2V4ZWN1dGlvbj8iOiBmYWxzZSwgIm1heF9y
|
|
||||||
ZXRyeV9saW1pdCI6IDIsICJ0b29sc19uYW1lcyI6IFsidGVzdCB0b29sIl19XUqSAgoKY3Jld190
|
|
||||||
YXNrcxKDAgqAAlt7ImtleSI6ICJmODM5Yzg3YzNkNzU3Yzg4N2Y0Y2U3NGQxODY0YjAyYSIsICJp
|
|
||||||
ZCI6ICJjM2Y2NjY2MS00YWNjLTQ5OWQtYjJkNC1kZjI0Nzg1MTJhZGYiLCAiYXN5bmNfZXhlY3V0
|
|
||||||
aW9uPyI6IGZhbHNlLCAiaHVtYW5faW5wdXQ/IjogZmFsc2UsICJhZ2VudF9yb2xlIjogIlJlc2Vh
|
|
||||||
cmNoZXIiLCAiYWdlbnRfa2V5IjogIjhiZDIxMzliNTk3NTE4MTUwNmU0MWZkOWM0NTYzZDc1Iiwg
|
|
||||||
InRvb2xzX25hbWVzIjogWyJhbm90aGVyIHRlc3QgdG9vbCJdfV16AhgBhQEAAQAAEo4CChD8dNvp
|
|
||||||
UItERukk59GnvESYEghtjirHyG3B3SoMVGFzayBDcmVhdGVkMAE5MAGByhmnExhBIFeByhmnExhK
|
|
||||||
LgoIY3Jld19rZXkSIgogOTgyNDYwZWUyZGQyY2YxMmE3MTM4YjcwODU5ZmU4MTdKMQoHY3Jld19p
|
|
||||||
ZBImCiRkM2Q4NmM2YS01Y2YyLTQwYjQtYTFkZC0zMDk1NjI4N2M1YTdKLgoIdGFza19rZXkSIgog
|
|
||||||
ZjgzOWM4N2MzZDc1N2M4ODdmNGNlNzRkMTg2NGIwMmFKMQoHdGFza19pZBImCiRjM2Y2NjY2MS00
|
|
||||||
YWNjLTQ5OWQtYjJkNC1kZjI0Nzg1MTJhZGZ6AhgBhQEAAQAAEowBChDdoNfQMW/Om7LQU9gZGDrl
|
|
||||||
Egjw71DM3bnOWCoKVG9vbCBVc2FnZTABOUgPFC8apxMYQdhtKi8apxMYShoKDmNyZXdhaV92ZXJz
|
|
||||||
aW9uEggKBjAuODYuMEoYCgl0b29sX25hbWUSCwoJVGVzdCBUb29sSg4KCGF0dGVtcHRzEgIYAXoC
|
|
||||||
GAGFAQABAAA=
|
|
||||||
headers:
|
|
||||||
Accept:
|
|
||||||
- '*/*'
|
|
||||||
Accept-Encoding:
|
|
||||||
- gzip, deflate
|
|
||||||
Connection:
|
|
||||||
- keep-alive
|
|
||||||
Content-Length:
|
|
||||||
- '14771'
|
|
||||||
Content-Type:
|
|
||||||
- application/x-protobuf
|
|
||||||
User-Agent:
|
|
||||||
- OTel-OTLP-Exporter-Python/1.27.0
|
|
||||||
method: POST
|
|
||||||
uri: https://telemetry.crewai.com:4319/v1/traces
|
|
||||||
response:
|
|
||||||
body:
|
|
||||||
string: "\n\0"
|
|
||||||
headers:
|
|
||||||
Content-Length:
|
|
||||||
- '2'
|
|
||||||
Content-Type:
|
|
||||||
- application/x-protobuf
|
|
||||||
Date:
|
|
||||||
- Mon, 23 Dec 2024 00:33:37 GMT
|
|
||||||
status:
|
|
||||||
code: 200
|
|
||||||
message: OK
|
|
||||||
- request:
|
|
||||||
body: '{"messages": [{"role": "system", "content": "You are Researcher. You''re
|
|
||||||
an expert researcher, specialized in technology, software engineering, AI and
|
|
||||||
startups. You work as a freelancer and is now working on doing research and
|
|
||||||
analysis for a new customer.\nYour personal goal is: Make the best research
|
|
||||||
and analysis on content about AI and AI agents\nYou ONLY have access to the
|
|
||||||
following tools, and should NEVER make up tools that are not listed here:\n\nTool
|
|
||||||
Name: Test Tool\nTool Arguments: {''query'': {''description'': ''Query to process'',
|
|
||||||
''type'': ''str''}}\nTool Description: A test tool that just returns the input\n\nUse
|
|
||||||
the following format:\n\nThought: you should always think about what to do\nAction:
|
|
||||||
the action to take, only one name of [Test Tool], just the name, exactly as
|
|
||||||
it''s written.\nAction Input: the input to the action, just a simple python
|
|
||||||
dictionary, enclosed in curly braces, using \" to wrap keys and values.\nObservation:
|
|
||||||
the result of the action\n\nOnce all necessary information is gathered:\n\nThought:
|
|
||||||
I now know the final answer\nFinal Answer: the final answer to the original
|
|
||||||
input question"}, {"role": "user", "content": "\nCurrent Task: Write a test
|
|
||||||
task\n\nThis is the expect criteria for your final answer: Test output\nyou
|
|
||||||
MUST return the actual complete content as the final answer, not a summary.\n\nBegin!
|
|
||||||
This is VERY important to you, use the tools available and give your best Final
|
|
||||||
Answer, your job depends on it!\n\nThought:"}, {"role": "assistant", "content":
|
|
||||||
"I need to come up with a suitable test task that meets the criteria provided.
|
|
||||||
I will focus on outlining a clear and effective test task related to AI and
|
|
||||||
AI agents.\n\nAction: Test Tool\nAction Input: {\"query\": \"Create a test task
|
|
||||||
that involves evaluating the performance of an AI agent in a given scenario,
|
|
||||||
including criteria for success, tools required, and process for assessment.\"}\nObservation:
|
|
||||||
Processed: Create a test task that involves evaluating the performance of an
|
|
||||||
AI agent in a given scenario, including criteria for success, tools required,
|
|
||||||
and process for assessment."}], "model": "gpt-4o-mini", "stop": ["\nObservation:"],
|
|
||||||
"stream": false}'
|
|
||||||
headers:
|
|
||||||
accept:
|
|
||||||
- application/json
|
|
||||||
accept-encoding:
|
|
||||||
- gzip, deflate
|
|
||||||
connection:
|
|
||||||
- keep-alive
|
|
||||||
content-length:
|
|
||||||
- '2160'
|
|
||||||
content-type:
|
|
||||||
- application/json
|
|
||||||
cookie:
|
|
||||||
- _cfuvid=2u_Xw.i716TDjD2vb2mvMyWxhA4q1MM1JvbrA8CNZpI-1734895557894-0.0.1.1-604800000;
|
|
||||||
__cf_bm=i6jvNjhsDne300GPAeEmyiJJKYqy7OPuamFG_kht3KE-1734914012-1.0.1.1-tCeVANAF521vkXpBdgYw.ov.fYUr6t5QC4LG_DugWyzu4C60Pi2CruTVniUgfCvkcu6rdHA5DwnaEZf2jFaRCQ
|
|
||||||
host:
|
|
||||||
- api.openai.com
|
|
||||||
user-agent:
|
|
||||||
- OpenAI/Python 1.52.1
|
|
||||||
x-stainless-arch:
|
|
||||||
- arm64
|
|
||||||
x-stainless-async:
|
|
||||||
- 'false'
|
|
||||||
x-stainless-lang:
|
|
||||||
- python
|
|
||||||
x-stainless-os:
|
|
||||||
- MacOS
|
|
||||||
x-stainless-package-version:
|
|
||||||
- 1.52.1
|
|
||||||
x-stainless-raw-response:
|
|
||||||
- 'true'
|
|
||||||
x-stainless-retry-count:
|
|
||||||
- '0'
|
|
||||||
x-stainless-runtime:
|
|
||||||
- CPython
|
|
||||||
x-stainless-runtime-version:
|
|
||||||
- 3.11.7
|
|
||||||
method: POST
|
|
||||||
uri: https://api.openai.com/v1/chat/completions
|
|
||||||
response:
|
|
||||||
content: "{\n \"id\": \"chatcmpl-AhQg0Cx1FED7gOwMpJUD5cQxVzQBK\",\n \"object\":
|
|
||||||
\"chat.completion\",\n \"created\": 1734914012,\n \"model\": \"gpt-4o-mini-2024-07-18\",\n
|
|
||||||
\ \"choices\": [\n {\n \"index\": 0,\n \"message\": {\n \"role\":
|
|
||||||
\"assistant\",\n \"content\": \"Thought: I need to compile a complete
|
|
||||||
and detailed test task based on the processed input, ensuring it includes all
|
|
||||||
necessary components as per the expect criteria.\\n\\nFinal Answer: \\n### Test
|
|
||||||
Task: Evaluating the Performance of an AI Agent\\n\\n**Objective:** \\nTo evaluate
|
|
||||||
the performance of an AI agent in simulating conversation with users in a customer
|
|
||||||
service context.\\n\\n**Scenario:**\\nAn AI agent is tasked with handling customer
|
|
||||||
inquiries on an e-commerce platform. The agent must respond to questions about
|
|
||||||
product availability, order tracking, returns, and other common queries.\\n\\n**Criteria
|
|
||||||
for Success:**\\n1. **Accuracy:** The AI should provide correct information
|
|
||||||
in at least 90% of the interactions.\\n2. **Response Time:** The average response
|
|
||||||
time should be under 2 seconds.\\n3. **User Satisfaction:** Aim for a user satisfaction
|
|
||||||
score of 85% or higher based on follow-up surveys after interactions.\\n4. **Fallback
|
|
||||||
Rate:** The AI should not default to a human agent more than 10% of the time.\\n\\n**Tools
|
|
||||||
Required:**\\n- Chatbot development platform (e.g., Dialogflow, Rasa)\\n- Metrics
|
|
||||||
tracking software (to measure accuracy, response times, and user satisfaction)\\n-
|
|
||||||
Survey tool (e.g., Google Forms, SurveyMonkey) for feedback collection\\n\\n**Process
|
|
||||||
for Assessment:**\\n1. **Setup:** Deploy the AI agent on a testing environment
|
|
||||||
simulating real customer inquiries.\\n2. **Data Collection:** Run the test for
|
|
||||||
a predetermined period (e.g., one week) or until a set number of interactions
|
|
||||||
(e.g., 1000).\\n3. **Measurement:**\\n - Record the interactions and analyze
|
|
||||||
the accuracy of the AI's responses.\\n - Measure the average response time
|
|
||||||
for each interaction.\\n - Collect user satisfaction scores via surveys sent
|
|
||||||
after the interaction.\\n4. **Analysis:** Compile the data to see if the AI
|
|
||||||
met the success criteria. Identify strengths and weaknesses in the responses.\\n5.
|
|
||||||
**Review:** Share findings with the development team to strategize improvements.\\n\\nThis
|
|
||||||
detailed task will help assess the AI agent\u2019s capabilities and provide
|
|
||||||
insights for further enhancements.\",\n \"refusal\": null\n },\n
|
|
||||||
\ \"logprobs\": null,\n \"finish_reason\": \"stop\"\n }\n ],\n
|
|
||||||
\ \"usage\": {\n \"prompt_tokens\": 416,\n \"completion_tokens\": 422,\n
|
|
||||||
\ \"total_tokens\": 838,\n \"prompt_tokens_details\": {\n \"cached_tokens\":
|
|
||||||
0,\n \"audio_tokens\": 0\n },\n \"completion_tokens_details\": {\n
|
|
||||||
\ \"reasoning_tokens\": 0,\n \"audio_tokens\": 0,\n \"accepted_prediction_tokens\":
|
|
||||||
0,\n \"rejected_prediction_tokens\": 0\n }\n },\n \"system_fingerprint\":
|
|
||||||
\"fp_d02d531b47\"\n}\n"
|
|
||||||
headers:
|
|
||||||
CF-Cache-Status:
|
|
||||||
- DYNAMIC
|
|
||||||
CF-RAY:
|
|
||||||
- 8f6442c2ba15a486-GRU
|
|
||||||
Connection:
|
|
||||||
- keep-alive
|
|
||||||
Content-Encoding:
|
|
||||||
- gzip
|
|
||||||
Content-Type:
|
|
||||||
- application/json
|
|
||||||
Date:
|
|
||||||
- Mon, 23 Dec 2024 00:33:39 GMT
|
|
||||||
Server:
|
|
||||||
- cloudflare
|
|
||||||
Transfer-Encoding:
|
|
||||||
- chunked
|
|
||||||
X-Content-Type-Options:
|
|
||||||
- nosniff
|
|
||||||
access-control-expose-headers:
|
|
||||||
- X-Request-ID
|
|
||||||
alt-svc:
|
|
||||||
- h3=":443"; ma=86400
|
|
||||||
openai-organization:
|
|
||||||
- crewai-iuxna1
|
|
||||||
openai-processing-ms:
|
|
||||||
- '6734'
|
|
||||||
openai-version:
|
|
||||||
- '2020-10-01'
|
|
||||||
strict-transport-security:
|
|
||||||
- max-age=31536000; includeSubDomains; preload
|
|
||||||
x-ratelimit-limit-requests:
|
|
||||||
- '30000'
|
|
||||||
x-ratelimit-limit-tokens:
|
|
||||||
- '150000000'
|
|
||||||
x-ratelimit-remaining-requests:
|
|
||||||
- '29999'
|
|
||||||
x-ratelimit-remaining-tokens:
|
|
||||||
- '149999497'
|
|
||||||
x-ratelimit-reset-requests:
|
|
||||||
- 2ms
|
|
||||||
x-ratelimit-reset-tokens:
|
|
||||||
- 0s
|
|
||||||
x-request-id:
|
|
||||||
- req_7d8df8b840e279bd64280d161d854161
|
|
||||||
http_version: HTTP/1.1
|
|
||||||
status_code: 200
|
|
||||||
version: 1
|
|
||||||
@@ -2,7 +2,6 @@ import unittest
|
|||||||
from unittest.mock import MagicMock, patch
|
from unittest.mock import MagicMock, patch
|
||||||
|
|
||||||
import requests
|
import requests
|
||||||
|
|
||||||
from crewai.cli.authentication.main import AuthenticationCommand
|
from crewai.cli.authentication.main import AuthenticationCommand
|
||||||
|
|
||||||
|
|
||||||
@@ -48,9 +47,7 @@ class TestAuthenticationCommand(unittest.TestCase):
|
|||||||
@patch("crewai.cli.authentication.main.requests.post")
|
@patch("crewai.cli.authentication.main.requests.post")
|
||||||
@patch("crewai.cli.authentication.main.validate_token")
|
@patch("crewai.cli.authentication.main.validate_token")
|
||||||
@patch("crewai.cli.authentication.main.console.print")
|
@patch("crewai.cli.authentication.main.console.print")
|
||||||
def test_poll_for_token_success(
|
def test_poll_for_token_success(self, mock_print, mock_validate_token, mock_post, mock_tool):
|
||||||
self, mock_print, mock_validate_token, mock_post, mock_tool
|
|
||||||
):
|
|
||||||
mock_response = MagicMock()
|
mock_response = MagicMock()
|
||||||
mock_response.status_code = 200
|
mock_response.status_code = 200
|
||||||
mock_response.json.return_value = {
|
mock_response.json.return_value = {
|
||||||
@@ -65,9 +62,7 @@ class TestAuthenticationCommand(unittest.TestCase):
|
|||||||
self.auth_command._poll_for_token({"device_code": "123456"})
|
self.auth_command._poll_for_token({"device_code": "123456"})
|
||||||
|
|
||||||
mock_validate_token.assert_called_once_with("TOKEN")
|
mock_validate_token.assert_called_once_with("TOKEN")
|
||||||
mock_print.assert_called_once_with(
|
mock_print.assert_called_once_with("\n[bold green]Welcome to CrewAI Enterprise![/bold green]\n")
|
||||||
"\n[bold green]Welcome to CrewAI Enterprise![/bold green]\n"
|
|
||||||
)
|
|
||||||
|
|
||||||
@patch("crewai.cli.authentication.main.requests.post")
|
@patch("crewai.cli.authentication.main.requests.post")
|
||||||
@patch("crewai.cli.authentication.main.console.print")
|
@patch("crewai.cli.authentication.main.console.print")
|
||||||
|
|||||||
@@ -3,9 +3,8 @@ import unittest
|
|||||||
from datetime import datetime, timedelta
|
from datetime import datetime, timedelta
|
||||||
from unittest.mock import MagicMock, patch
|
from unittest.mock import MagicMock, patch
|
||||||
|
|
||||||
from cryptography.fernet import Fernet
|
|
||||||
|
|
||||||
from crewai.cli.authentication.utils import TokenManager, validate_token
|
from crewai.cli.authentication.utils import TokenManager, validate_token
|
||||||
|
from cryptography.fernet import Fernet
|
||||||
|
|
||||||
|
|
||||||
class TestValidateToken(unittest.TestCase):
|
class TestValidateToken(unittest.TestCase):
|
||||||
|
|||||||
Some files were not shown because too many files have changed in this diff Show More
Reference in New Issue
Block a user