Compare commits


5 Commits

Author SHA1 Message Date
Greyson LaLonde
0bb6faa9d3 docs: update changelog and version for v1.14.2rc1
2026-04-16 05:24:57 +08:00
Greyson LaLonde
aa28eeab6a feat: bump versions to 1.14.2rc1 2026-04-16 05:18:24 +08:00
Greyson LaLonde
29b5531f78 fix: handle cyclic JSON schemas in MCP tool resolution 2026-04-16 05:03:00 +08:00
Greyson LaLonde
74d061e994 fix: bump python-multipart to 0.0.26 to patch GHSA-mj87-hwqh-73pj
Fixes GHSA-mj87-hwqh-73pj
2026-04-16 04:25:35 +08:00
Greyson LaLonde
18d0fd6b80 fix: bump pypdf to 6.10.1 to patch GHSA-jj6c-8h6c-hppx
Fixes GHSA-jj6c-8h6c-hppx
2026-04-16 04:11:08 +08:00
26 changed files with 367 additions and 495 deletions

View File

@@ -4,6 +4,27 @@ description: "تحديثات المنتج والتحسينات وإصلاحات
icon: "clock"
mode: "wide"
---
<Update label="16 أبريل 2026">
## v1.14.2rc1
[عرض الإصدار على GitHub](https://github.com/crewAIInc/crewAI/releases/tag/1.14.2rc1)
## ما الذي تغير
### إصلاحات الأخطاء
- إصلاح معالجة مخططات JSON الدائرية في أداة MCP
- إصلاح ثغرة أمنية من خلال تحديث python-multipart إلى 0.0.26
- إصلاح ثغرة أمنية من خلال تحديث pypdf إلى 6.10.1
### الوثائق
- تحديث سجل التغييرات والإصدار لـ v1.14.2a5
## المساهمون
@greysonlalonde
</Update>
<Update label="15 أبريل 2026">
## v1.14.2a5

View File

@@ -4,6 +4,27 @@ description: "Product updates, improvements, and bug fixes for CrewAI"
icon: "clock"
mode: "wide"
---
<Update label="Apr 16, 2026">
## v1.14.2rc1
[View release on GitHub](https://github.com/crewAIInc/crewAI/releases/tag/1.14.2rc1)
## What's Changed
### Bug Fixes
- Fix handling of cyclic JSON schemas in MCP tool resolution
- Fix vulnerability by bumping python-multipart to 0.0.26
- Fix vulnerability by bumping pypdf to 6.10.1
### Documentation
- Update changelog and version for v1.14.2a5
## Contributors
@greysonlalonde
</Update>
<Update label="Apr 15, 2026">
## v1.14.2a5

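The cyclic-schema fix listed above can be condensed into a runnable sketch. A `$defs` entry that references itself makes a naive `$ref` expander recurse forever; the fix (mirroring the `resolve_refs` change in the `types.py` diff further down) tracks definitions currently being expanded and collapses a back-edge to an empty schema. The `Node` schema here is illustrative, not taken from the repo:

```python
from copy import deepcopy
from typing import Any


def resolve_refs(schema: dict[str, Any]) -> dict[str, Any]:
    """Expand local #/$defs/ references, collapsing cycles to {}."""
    defs = schema.get("$defs", {})
    expanding: set[str] = set()  # definitions currently on the expansion stack

    def _resolve(node: Any) -> Any:
        if isinstance(node, dict):
            ref = node.get("$ref")
            if isinstance(ref, str) and ref.startswith("#/$defs/"):
                name = ref.replace("#/$defs/", "")
                if name in expanding:  # back-edge: cut the cycle here
                    return {}
                expanding.add(name)
                try:
                    return _resolve(deepcopy(defs[name]))
                finally:
                    expanding.discard(name)
            return {k: _resolve(v) for k, v in node.items()}
        if isinstance(node, list):
            return [_resolve(i) for i in node]
        return node

    return _resolve(deepcopy(schema))


# A self-referential schema: Node.children is a list of Node.
cyclic = {
    "$defs": {
        "Node": {
            "type": "object",
            "properties": {
                "children": {"type": "array", "items": {"$ref": "#/$defs/Node"}}
            },
        }
    },
    "$ref": "#/$defs/Node",
}

resolved = resolve_refs(cyclic)
```

Without the `expanding` guard, the recursive `$ref` would re-expand `Node` inside itself indefinitely and hit Python's recursion limit.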
View File

@@ -4,6 +4,27 @@ description: "CrewAI의 제품 업데이트, 개선 사항 및 버그 수정"
icon: "clock"
mode: "wide"
---
<Update label="2026년 4월 16일">
## v1.14.2rc1
[GitHub 릴리스 보기](https://github.com/crewAIInc/crewAI/releases/tag/1.14.2rc1)
## 변경 사항
### 버그 수정
- MCP 도구 해석에서 순환 JSON 스키마 처리 수정
- python-multipart를 0.0.26으로 업데이트하여 취약점 수정
- pypdf를 6.10.1로 업데이트하여 취약점 수정
### 문서
- v1.14.2a5에 대한 변경 로그 및 버전 업데이트
## 기여자
@greysonlalonde
</Update>
<Update label="2026년 4월 15일">
## v1.14.2a5

View File

@@ -4,6 +4,27 @@ description: "Atualizações de produto, melhorias e correções do CrewAI"
icon: "clock"
mode: "wide"
---
<Update label="16 abr 2026">
## v1.14.2rc1
[Ver release no GitHub](https://github.com/crewAIInc/crewAI/releases/tag/1.14.2rc1)
## O que Mudou
### Correções de Bugs
- Corrigir o tratamento de esquemas JSON cíclicos na resolução de ferramentas MCP
- Corrigir vulnerabilidade atualizando python-multipart para 0.0.26
- Corrigir vulnerabilidade atualizando pypdf para 6.10.1
### Documentação
- Atualizar o changelog e a versão para v1.14.2a5
## Contribuidores
@greysonlalonde
</Update>
<Update label="15 abr 2026">
## v1.14.2a5

View File

@@ -152,4 +152,4 @@ __all__ = [
"wrap_file_source",
]
-__version__ = "1.14.2a5"
+__version__ = "1.14.2rc1"

View File

@@ -10,7 +10,7 @@ requires-python = ">=3.10, <3.14"
dependencies = [
"pytube~=15.0.0",
"requests>=2.33.0,<3",
-"crewai==1.14.2a5",
+"crewai==1.14.2rc1",
"tiktoken~=0.8.0",
"beautifulsoup4~=4.13.4",
"python-docx~=1.2.0",

View File

@@ -305,4 +305,4 @@ __all__ = [
"ZapierActionTools",
]
-__version__ = "1.14.2a5"
+__version__ = "1.14.2rc1"

View File

@@ -55,7 +55,7 @@ Repository = "https://github.com/crewAIInc/crewAI"
[project.optional-dependencies]
tools = [
-"crewai-tools==1.14.2a5",
+"crewai-tools==1.14.2rc1",
]
embeddings = [
"tiktoken~=0.8.0"

View File

@@ -46,7 +46,7 @@ def _suppress_pydantic_deprecation_warnings() -> None:
_suppress_pydantic_deprecation_warnings()
-__version__ = "1.14.2a5"
+__version__ = "1.14.2rc1"
_telemetry_submitted = False

View File

@@ -1091,10 +1091,8 @@ class Agent(BaseAgent):
)
)
-def get_delegation_tools(
-self, agents: Sequence[BaseAgent], task: Task | None = None
-) -> list[BaseTool]:
-agent_tools = AgentTools(agents=agents, task=task)
+def get_delegation_tools(self, agents: Sequence[BaseAgent]) -> list[BaseTool]:
+agent_tools = AgentTools(agents=agents)
return agent_tools.tools()
def get_platform_tools(self, apps: list[PlatformAppOrAction]) -> list[BaseTool]:

View File

@@ -274,22 +274,18 @@ class LangGraphAgentAdapter(BaseAgentAdapter):
available_tools: list[Any] = self._tool_adapter.tools()
self._graph.tools = available_tools
-def get_delegation_tools(
-self, agents: Sequence[BaseAgent], task: Any | None = None
-) -> list[BaseTool]:
+def get_delegation_tools(self, agents: Sequence[BaseAgent]) -> list[BaseTool]:
"""Implement delegation tools support for LangGraph.
Creates delegation tools that allow this agent to delegate tasks to other agents.
-When a task is provided, its constraints are propagated to the delegation tools.
Args:
agents: List of agents available for delegation.
-task: Optional task whose constraints should be propagated.
Returns:
List of delegation tools.
"""
-agent_tools: AgentTools = AgentTools(agents=agents, task=task)
+agent_tools: AgentTools = AgentTools(agents=agents)
return agent_tools.tools()
@staticmethod

View File

@@ -223,22 +223,18 @@ class OpenAIAgentAdapter(BaseAgentAdapter):
"""
return self._converter_adapter.post_process_result(result.final_output)
-def get_delegation_tools(
-self, agents: Sequence[BaseAgent], task: Any | None = None
-) -> list[BaseTool]:
+def get_delegation_tools(self, agents: Sequence[BaseAgent]) -> list[BaseTool]:
"""Implement delegation tools support.
Creates delegation tools that allow this agent to delegate tasks to other agents.
-When a task is provided, its constraints are propagated to the delegation tools.
Args:
agents: List of agents available for delegation.
-task: Optional task whose constraints should be propagated.
Returns:
List of delegation tools.
"""
-agent_tools: AgentTools = AgentTools(agents=agents, task=task)
+agent_tools: AgentTools = AgentTools(agents=agents)
return agent_tools.tools()
def configure_structured_output(self, task: Any) -> None:

View File

@@ -530,9 +530,7 @@ class BaseAgent(BaseModel, ABC, metaclass=AgentMeta):
pass
@abstractmethod
-def get_delegation_tools(
-self, agents: Sequence[BaseAgent], task: Any | None = None
-) -> list[BaseTool]:
+def get_delegation_tools(self, agents: Sequence[BaseAgent]) -> list[BaseTool]:
"""Set the task tools that init BaseAgenTools class."""
@abstractmethod

View File

@@ -5,7 +5,7 @@ description = "{{name}} using crewAI"
authors = [{ name = "Your Name", email = "you@example.com" }]
requires-python = ">=3.10,<3.14"
dependencies = [
-"crewai[tools]==1.14.2a5"
+"crewai[tools]==1.14.2rc1"
]
[project.scripts]

View File

@@ -5,7 +5,7 @@ description = "{{name}} using crewAI"
authors = [{ name = "Your Name", email = "you@example.com" }]
requires-python = ">=3.10,<3.14"
dependencies = [
-"crewai[tools]==1.14.2a5"
+"crewai[tools]==1.14.2rc1"
]
[project.scripts]

View File

@@ -5,7 +5,7 @@ description = "Power up your crews with {{folder_name}}"
readme = "README.md"
requires-python = ">=3.10,<3.14"
dependencies = [
-"crewai[tools]==1.14.2a5"
+"crewai[tools]==1.14.2rc1"
]
[tool.crewai]

View File

@@ -1608,10 +1608,9 @@ class Crew(FlowTrackable, BaseModel):
tools: list[BaseTool],
task_agent: BaseAgent,
agents: Sequence[BaseAgent],
-task: Task | None = None,
) -> list[BaseTool]:
if hasattr(task_agent, "get_delegation_tools"):
-delegation_tools = task_agent.get_delegation_tools(agents, task=task)
+delegation_tools = task_agent.get_delegation_tools(agents)
# Cast delegation_tools to the expected type for _merge_tools
return self._merge_tools(tools, delegation_tools)
return tools
@@ -1694,7 +1693,7 @@ class Crew(FlowTrackable, BaseModel):
if not tools:
tools = []
tools = self._inject_delegation_tools(
-tools, task.agent, agents_for_delegation, task=task
+tools, task.agent, agents_for_delegation
)
return tools
@@ -1724,12 +1723,10 @@ class Crew(FlowTrackable, BaseModel):
) -> list[BaseTool]:
if self.manager_agent:
if task.agent:
-tools = self._inject_delegation_tools(
-tools, task.agent, [task.agent], task=task
-)
+tools = self._inject_delegation_tools(tools, task.agent, [task.agent])
else:
tools = self._inject_delegation_tools(
-tools, self.manager_agent, self.agents, task=task
+tools, self.manager_agent, self.agents
)
return tools

View File

@@ -417,9 +417,18 @@ class MCPToolResolver:
args_schema = None
if tool_def.get("inputSchema"):
-args_schema = self._json_schema_to_pydantic(
-tool_name, tool_def["inputSchema"]
-)
+try:
+args_schema = self._json_schema_to_pydantic(
+tool_name, tool_def["inputSchema"]
+)
+except Exception as e:
+self._logger.log(
+"warning",
+f"Failed to build args schema for MCP tool "
+f"'{tool_name}': {e}. Registering tool without a "
+"typed schema.",
+)
+args_schema = None
tool_schema = {
"description": tool_def.get("description", ""),

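The resolver change above degrades gracefully: when schema construction blows up, the tool is still registered, just without a typed args schema. A minimal sketch of the same pattern, with a stubbed schema builder standing in for `_json_schema_to_pydantic` (the names `build_schema` and `resolve_tool` are illustrative, not the crewAI API):

```python
import logging
from typing import Any

logger = logging.getLogger("mcp_resolver_sketch")


def build_schema(tool_name: str, input_schema: dict[str, Any]) -> Any:
    """Stand-in for the real schema builder; rejects one schema to force the fallback."""
    if input_schema.get("broken"):
        raise ValueError("cyclic or otherwise unbuildable schema")
    return {"model_for": tool_name}  # placeholder for a generated Pydantic model


def resolve_tool(tool_def: dict[str, Any]) -> dict[str, Any]:
    tool_name = tool_def["name"]
    args_schema = None
    if tool_def.get("inputSchema"):
        try:
            args_schema = build_schema(tool_name, tool_def["inputSchema"])
        except Exception as e:
            # Register the tool anyway, just without a typed args schema.
            logger.warning("Failed to build args schema for '%s': %s", tool_name, e)
            args_schema = None
    return {"name": tool_name, "args_schema": args_schema}


ok = resolve_tool({"name": "search", "inputSchema": {"type": "object"}})
bad = resolve_tool({"name": "cyclic", "inputSchema": {"broken": True}})
```

The broad `except Exception` is deliberate here: losing a typed schema for one tool is preferable to aborting registration of the whole MCP tool set.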
View File

@@ -193,13 +193,6 @@ class Task(BaseModel):
description="A converter class used to export structured output",
default=None,
)
-constraints: list[str] = Field(
-default_factory=list,
-description="Structured constraints that must be preserved during task delegation. "
-"Each constraint is a string describing a specific requirement (e.g., domain scope, "
-"quality specs, temporal or geographic limits). These are automatically propagated "
-"to delegated tasks so worker agents are aware of all original constraints.",
-)
processed_by_agents: set[str] = Field(default_factory=set)
guardrail: GuardrailType | None = Field(
default=None,
@@ -908,17 +901,10 @@ class Task(BaseModel):
-tasks_slices = [description]
-if self.constraints:
-constraints_text = (
-"\n\nTask Constraints (MUST be respected):\n"
-+ "\n".join(f"- {constraint}" for constraint in self.constraints)
-)
-tasks_slices.append(constraints_text)
output = I18N_DEFAULT.slice("expected_output").format(
expected_output=self.expected_output
)
-tasks_slices.append(output)
+tasks_slices = [description, output]
if self.markdown:
markdown_instruction = """Your final answer MUST be formatted in Markdown syntax.

View File

@@ -10,34 +10,26 @@ from crewai.utilities.i18n import I18N_DEFAULT
if TYPE_CHECKING:
from crewai.agents.agent_builder.base_agent import BaseAgent
-from crewai.task import Task
from crewai.tools.base_tool import BaseTool
class AgentTools:
"""Manager class for agent-related tools"""
-def __init__(self, agents: Sequence[BaseAgent], task: Task | None = None) -> None:
+def __init__(self, agents: Sequence[BaseAgent]) -> None:
self.agents = agents
-self.task = task
def tools(self) -> list[BaseTool]:
-"""Get all available agent tools.
-When a task is provided, its constraints are automatically propagated
-to the delegation tools so that worker agents receive them.
-"""
+"""Get all available agent tools"""
coworkers = ", ".join([f"{agent.role}" for agent in self.agents])
delegate_tool = DelegateWorkTool(
agents=self.agents,
-original_task=self.task,
description=I18N_DEFAULT.tools("delegate_work").format(coworkers=coworkers), # type: ignore
)
ask_tool = AskQuestionTool(
agents=self.agents,
-original_task=self.task,
description=I18N_DEFAULT.tools("ask_question").format(coworkers=coworkers), # type: ignore
)

View File

@@ -16,10 +16,6 @@ class BaseAgentTool(BaseTool):
"""Base class for agent-related tools"""
agents: list[BaseAgent] = Field(description="List of available agents")
-original_task: Task | None = Field(
-default=None,
-description="The original task being delegated, used to propagate constraints",
-)
def sanitize_agent_name(self, name: str) -> str:
"""
@@ -55,10 +51,6 @@ class BaseAgentTool(BaseTool):
"""
Execute delegation to an agent with case-insensitive and whitespace-tolerant matching.
-When the original_task has constraints defined, they are automatically
-propagated to the delegated Task object. The constraints are then
-rendered by Task.prompt() so the worker agent sees them.
Args:
agent_name: Name/role of the agent to delegate to (case-insensitive)
task: The specific question or task to delegate
@@ -122,25 +114,10 @@ class BaseAgentTool(BaseTool):
selected_agent = agent[0]
try:
-# Propagate constraints from the original task to the delegated task.
-# Constraints are set on the Task object so that Task.prompt() renders
-# them for the worker agent — no need to also inject them into `context`,
-# which would cause duplication.
-constraints: list[str] = []
-if self.original_task and self.original_task.constraints:
-constraints = list(self.original_task.constraints)
-logger.info(
-"Propagating %d constraint(s) from original task to delegated task for agent '%s': %s",
-len(constraints),
-self.sanitize_agent_name(selected_agent.role),
-constraints,
-)
task_with_assigned_agent = Task(
description=task,
agent=selected_agent,
expected_output=I18N_DEFAULT.slice("manager_request"),
-constraints=constraints,
)
logger.debug(
f"Created task for agent '{self.sanitize_agent_name(selected_agent.role)}': {task}"

View File

@@ -19,7 +19,18 @@ from collections.abc import Callable
from copy import deepcopy
import datetime
import logging
-from typing import TYPE_CHECKING, Annotated, Any, Final, Literal, TypedDict, Union, cast
+from typing import (
+TYPE_CHECKING,
+Annotated,
+Any,
+Final,
+ForwardRef,
+Literal,
+Optional,
+TypedDict,
+Union,
+cast,
+)
import uuid
import jsonref # type: ignore[import-untyped]
@@ -99,15 +110,22 @@ def resolve_refs(schema: dict[str, Any]) -> dict[str, Any]:
"""
defs = schema.get("$defs", {})
schema_copy = deepcopy(schema)
+expanding: set[str] = set()
def _resolve(node: Any) -> Any:
if isinstance(node, dict):
ref = node.get("$ref")
if isinstance(ref, str) and ref.startswith("#/$defs/"):
def_name = ref.replace("#/$defs/", "")
-if def_name in defs:
+if def_name not in defs:
+raise KeyError(f"Definition '{def_name}' not found in $defs.")
+if def_name in expanding:
+return {}
+expanding.add(def_name)
+try:
return _resolve(deepcopy(defs[def_name]))
-raise KeyError(f"Definition '{def_name}' not found in $defs.")
+finally:
+expanding.discard(def_name)
return {k: _resolve(v) for k, v in node.items()}
if isinstance(node, list):
@@ -119,7 +137,11 @@ def resolve_refs(schema: dict[str, Any]) -> dict[str, Any]:
def add_key_in_dict_recursively(
-d: dict[str, Any], key: str, value: Any, criteria: Callable[[dict[str, Any]], bool]
+d: dict[str, Any],
+key: str,
+value: Any,
+criteria: Callable[[dict[str, Any]], bool],
+_seen: set[int] | None = None,
) -> dict[str, Any]:
"""Recursively adds a key/value pair to all nested dicts matching `criteria`.
@@ -128,22 +150,31 @@ def add_key_in_dict_recursively(
key: The key to add.
value: The value to add.
criteria: A function that returns True for dicts that should receive the key.
_seen: Internal set of visited ``id()``s, used to guard cyclic schemas.
Returns:
The modified dictionary.
"""
if _seen is None:
_seen = set()
if isinstance(d, dict):
if id(d) in _seen:
return d
_seen.add(id(d))
if criteria(d) and key not in d:
d[key] = value
for v in d.values():
-add_key_in_dict_recursively(v, key, value, criteria)
+add_key_in_dict_recursively(v, key, value, criteria, _seen)
elif isinstance(d, list):
if id(d) in _seen:
return d
_seen.add(id(d))
for i in d:
-add_key_in_dict_recursively(i, key, value, criteria)
+add_key_in_dict_recursively(i, key, value, criteria, _seen)
return d
-def force_additional_properties_false(d: Any) -> Any:
+def force_additional_properties_false(d: Any, _seen: set[int] | None = None) -> Any:
"""Force additionalProperties=false on all object-type dicts recursively.
OpenAI strict mode requires all objects to have additionalProperties=false.
@@ -154,11 +185,17 @@ def force_additional_properties_false(d: Any) -> Any:
Args:
d: The dictionary/list to modify.
_seen: Internal set of visited ``id()``s, used to guard cyclic schemas.
Returns:
The modified dictionary/list.
"""
if _seen is None:
_seen = set()
if isinstance(d, dict):
if id(d) in _seen:
return d
_seen.add(id(d))
if d.get("type") == "object":
d["additionalProperties"] = False
if "properties" not in d:
@@ -166,10 +203,13 @@ def force_additional_properties_false(d: Any) -> Any:
if "required" not in d:
d["required"] = []
for v in d.values():
-force_additional_properties_false(v)
+force_additional_properties_false(v, _seen)
elif isinstance(d, list):
if id(d) in _seen:
return d
_seen.add(id(d))
for i in d:
-force_additional_properties_false(i)
+force_additional_properties_false(i, _seen)
return d
@@ -183,7 +223,7 @@ OPENAI_SUPPORTED_FORMATS: Final[
}
-def strip_unsupported_formats(d: Any) -> Any:
+def strip_unsupported_formats(d: Any, _seen: set[int] | None = None) -> Any:
"""Remove format annotations that OpenAI strict mode doesn't support.
OpenAI only supports: date-time, date, time, duration.
@@ -191,11 +231,17 @@ def strip_unsupported_formats(d: Any) -> Any:
Args:
d: The dictionary/list to modify.
_seen: Internal set of visited ``id()``s, used to guard cyclic schemas.
Returns:
The modified dictionary/list.
"""
if _seen is None:
_seen = set()
if isinstance(d, dict):
if id(d) in _seen:
return d
_seen.add(id(d))
format_value = d.get("format")
if (
isinstance(format_value, str)
@@ -203,14 +249,17 @@ def strip_unsupported_formats(d: Any) -> Any:
):
del d["format"]
for v in d.values():
-strip_unsupported_formats(v)
+strip_unsupported_formats(v, _seen)
elif isinstance(d, list):
if id(d) in _seen:
return d
_seen.add(id(d))
for i in d:
-strip_unsupported_formats(i)
+strip_unsupported_formats(i, _seen)
return d
-def ensure_type_in_schemas(d: Any) -> Any:
+def ensure_type_in_schemas(d: Any, _seen: set[int] | None = None) -> Any:
"""Ensure all schema objects in anyOf/oneOf have a 'type' key.
OpenAI strict mode requires every schema to have a 'type' key.
@@ -218,11 +267,17 @@ def ensure_type_in_schemas(d: Any) -> Any:
Args:
d: The dictionary/list to modify.
_seen: Internal set of visited ``id()``s, used to guard cyclic schemas.
Returns:
The modified dictionary/list.
"""
if _seen is None:
_seen = set()
if isinstance(d, dict):
if id(d) in _seen:
return d
_seen.add(id(d))
for key in ("anyOf", "oneOf"):
if key in d:
schema_list = d[key]
@@ -230,12 +285,15 @@ def ensure_type_in_schemas(d: Any) -> Any:
if isinstance(schema, dict) and schema == {}:
schema_list[i] = {"type": "object"}
else:
-ensure_type_in_schemas(schema)
+ensure_type_in_schemas(schema, _seen)
for v in d.values():
-ensure_type_in_schemas(v)
+ensure_type_in_schemas(v, _seen)
elif isinstance(d, list):
if id(d) in _seen:
return d
_seen.add(id(d))
for item in d:
-ensure_type_in_schemas(item)
+ensure_type_in_schemas(item, _seen)
return d
@@ -318,7 +376,9 @@ def add_const_to_oneof_variants(schema: dict[str, Any]) -> dict[str, Any]:
return _process_oneof(deepcopy(schema))
-def convert_oneof_to_anyof(schema: dict[str, Any]) -> dict[str, Any]:
+def convert_oneof_to_anyof(
+schema: dict[str, Any], _seen: set[int] | None = None
+) -> dict[str, Any]:
"""Convert oneOf to anyOf for OpenAI compatibility.
OpenAI's Structured Outputs support anyOf better than oneOf.
@@ -326,26 +386,37 @@ def convert_oneof_to_anyof(schema: dict[str, Any]) -> dict[str, Any]:
Args:
schema: JSON schema dictionary.
_seen: Internal set of visited ``id()``s, used to guard cyclic schemas.
Returns:
Modified schema with anyOf instead of oneOf.
"""
if _seen is None:
_seen = set()
if isinstance(schema, dict):
if id(schema) in _seen:
return schema
_seen.add(id(schema))
if "oneOf" in schema:
schema["anyOf"] = schema.pop("oneOf")
for value in schema.values():
if isinstance(value, dict):
-convert_oneof_to_anyof(value)
+convert_oneof_to_anyof(value, _seen)
elif isinstance(value, list):
if id(value) in _seen:
continue
_seen.add(id(value))
for item in value:
if isinstance(item, dict):
-convert_oneof_to_anyof(item)
+convert_oneof_to_anyof(item, _seen)
return schema
-def ensure_all_properties_required(schema: dict[str, Any]) -> dict[str, Any]:
+def ensure_all_properties_required(
+schema: dict[str, Any], _seen: set[int] | None = None
+) -> dict[str, Any]:
"""Ensure all properties are in the required array for OpenAI strict mode.
OpenAI's strict structured outputs require all properties to be listed
@@ -354,11 +425,17 @@ def ensure_all_properties_required(schema: dict[str, Any]) -> dict[str, Any]:
Args:
schema: JSON schema dictionary.
_seen: Internal set of visited ``id()``s, used to guard cyclic schemas.
Returns:
Modified schema with all properties marked as required.
"""
if _seen is None:
_seen = set()
if isinstance(schema, dict):
if id(schema) in _seen:
return schema
_seen.add(id(schema))
if schema.get("type") == "object" and "properties" in schema:
properties = schema["properties"]
if properties:
@@ -366,16 +443,21 @@ def ensure_all_properties_required(schema: dict[str, Any]) -> dict[str, Any]:
for value in schema.values():
if isinstance(value, dict):
-ensure_all_properties_required(value)
+ensure_all_properties_required(value, _seen)
elif isinstance(value, list):
if id(value) in _seen:
continue
_seen.add(id(value))
for item in value:
if isinstance(item, dict):
-ensure_all_properties_required(item)
+ensure_all_properties_required(item, _seen)
return schema
-def strip_null_from_types(schema: dict[str, Any]) -> dict[str, Any]:
+def strip_null_from_types(
+schema: dict[str, Any], _seen: set[int] | None = None
+) -> dict[str, Any]:
"""Remove null type from anyOf/type arrays.
Pydantic generates `T | None` for optional fields, which creates schemas with
@@ -384,11 +466,17 @@ def strip_null_from_types(schema: dict[str, Any]) -> dict[str, Any]:
Args:
schema: JSON schema dictionary.
_seen: Internal set of visited ``id()``s, used to guard cyclic schemas.
Returns:
Modified schema with null types removed.
"""
if _seen is None:
_seen = set()
if isinstance(schema, dict):
if id(schema) in _seen:
return schema
_seen.add(id(schema))
if "anyOf" in schema:
any_of = schema["anyOf"]
non_null = [opt for opt in any_of if opt.get("type") != "null"]
@@ -408,11 +496,14 @@ def strip_null_from_types(schema: dict[str, Any]) -> dict[str, Any]:
for value in schema.values():
if isinstance(value, dict):
-strip_null_from_types(value)
+strip_null_from_types(value, _seen)
elif isinstance(value, list):
if id(value) in _seen:
continue
_seen.add(id(value))
for item in value:
if isinstance(item, dict):
-strip_null_from_types(item)
+strip_null_from_types(item, _seen)
return schema
@@ -451,16 +542,26 @@ _CLAUDE_STRICT_UNSUPPORTED: Final[tuple[str, ...]] = (
)
-def _strip_keys_recursive(d: Any, keys: tuple[str, ...]) -> Any:
+def _strip_keys_recursive(
+d: Any, keys: tuple[str, ...], _seen: set[int] | None = None
+) -> Any:
"""Recursively delete a fixed set of keys from a schema."""
if _seen is None:
_seen = set()
if isinstance(d, dict):
if id(d) in _seen:
return d
_seen.add(id(d))
for key in keys:
d.pop(key, None)
for v in d.values():
-_strip_keys_recursive(v, keys)
+_strip_keys_recursive(v, keys, _seen)
elif isinstance(d, list):
if id(d) in _seen:
return d
_seen.add(id(d))
for i in d:
-_strip_keys_recursive(i, keys)
+_strip_keys_recursive(i, keys, _seen)
return d
@@ -719,12 +820,70 @@ def create_model_from_schema( # type: ignore[no-any-unimported]
json_schema = force_additional_properties_false(json_schema)
effective_root = force_additional_properties_false(effective_root)
in_progress: dict[int, Any] = {}
model = _build_model_from_schema(
json_schema,
effective_root,
model_name=model_name,
enrich_descriptions=enrich_descriptions,
in_progress=in_progress,
__config__=__config__,
__base__=__base__,
__module__=__module__,
__validators__=__validators__,
__cls_kwargs__=__cls_kwargs__,
)
types_namespace: dict[str, Any] = {
entry.__name__: entry
for entry in in_progress.values()
if isinstance(entry, type) and issubclass(entry, BaseModel)
}
for entry in in_progress.values():
if (
isinstance(entry, type)
and issubclass(entry, BaseModel)
and not getattr(entry, "__pydantic_complete__", True)
):
try:
entry.model_rebuild(_types_namespace=types_namespace)
except Exception as e:
logger.debug("model_rebuild failed for %s: %s", entry.__name__, e)
return model
def _build_model_from_schema( # type: ignore[no-any-unimported]
json_schema: dict[str, Any],
effective_root: dict[str, Any],
*,
model_name: str | None,
enrich_descriptions: bool,
in_progress: dict[int, Any],
__config__: ConfigDict | None = None,
__base__: type[BaseModel] | None = None,
__module__: str = __name__,
__validators__: dict[str, AnyClassMethod] | None = None,
__cls_kwargs__: dict[str, Any] | None = None,
) -> type[BaseModel]:
"""Inner builder shared by the public entry point and recursive nested-object creation.
Preprocessing via ``jsonref.replace_refs`` and the sanitization walkers is
run once by the public entry; this helper walks the already-normalized
schema and emits Pydantic models. ``in_progress`` maps ``id(schema)`` to
the model being built for that schema, so a cyclic ``$ref`` graph
degrades to a ``ForwardRef`` back-edge instead of blowing the stack.
"""
original_id = id(json_schema)
if "allOf" in json_schema:
json_schema = _merge_all_of_schemas(json_schema["allOf"], effective_root)
if "title" not in json_schema and "title" in (root_schema or {}):
json_schema["title"] = (root_schema or {}).get("title")
effective_name = model_name or json_schema.get("title") or "DynamicModel"
schema_id = id(json_schema)
in_progress[original_id] = effective_name
if schema_id != original_id:
in_progress[schema_id] = effective_name
field_definitions = {
name: _json_schema_to_pydantic_field(
name,
@@ -732,13 +891,14 @@ def create_model_from_schema( # type: ignore[no-any-unimported]
json_schema.get("required", []),
effective_root,
enrich_descriptions=enrich_descriptions,
in_progress=in_progress,
)
for name, prop in (json_schema.get("properties", {}) or {}).items()
}
effective_config = __config__ or ConfigDict(extra="forbid")
-return create_model_base(
+model = create_model_base(
effective_name,
__config__=effective_config,
__base__=__base__,
@@ -747,6 +907,10 @@ def create_model_from_schema( # type: ignore[no-any-unimported]
__cls_kwargs__=__cls_kwargs__,
**field_definitions,
)
in_progress[original_id] = model
if schema_id != original_id:
in_progress[schema_id] = model
return model
def _json_schema_to_pydantic_field(
@@ -756,6 +920,7 @@ def _json_schema_to_pydantic_field(
root_schema: dict[str, Any],
*,
enrich_descriptions: bool = False,
in_progress: dict[int, Any] | None = None,
) -> Any:
"""Convert a JSON schema property to a Pydantic field definition.
@@ -774,6 +939,7 @@ def _json_schema_to_pydantic_field(
root_schema,
name_=name.title(),
enrich_descriptions=enrich_descriptions,
in_progress=in_progress,
)
is_required = name in required
@@ -833,7 +999,7 @@ def _json_schema_to_pydantic_field(
field_params["pattern"] = json_schema["pattern"]
if not is_required:
-type_ = type_ | None
+type_ = Optional[type_]  # noqa: UP045 - ForwardRef does not support `|`
if schema_extra:
field_params["json_schema_extra"] = schema_extra
@@ -906,6 +1072,7 @@ def _json_schema_to_pydantic_type(
*,
name_: str | None = None,
enrich_descriptions: bool = False,
in_progress: dict[int, Any] | None = None,
) -> Any:
"""Convert a JSON schema to a Python/Pydantic type.
@@ -914,10 +1081,23 @@ def _json_schema_to_pydantic_type(
root_schema: The root schema for resolving $ref.
name_: Optional name for nested models.
enrich_descriptions: Propagated to nested model creation.
in_progress: Map of ``id(schema_dict)`` to the Pydantic model
currently being built for that schema, or to a placeholder name
as a plain ``str`` while the model is still being constructed.
Populated by :func:`_build_model_from_schema`. Enables cycle
detection so a self-referential ``$ref`` graph resolves to a
:class:`ForwardRef` back-edge rather than recursing forever.
Returns:
A Python type corresponding to the JSON schema.
"""
if in_progress is not None:
cached = in_progress.get(id(json_schema))
if isinstance(cached, str):
return ForwardRef(cached)
if cached is not None:
return cached
ref = json_schema.get("$ref")
if ref:
ref_schema = _resolve_ref(ref, root_schema)
@@ -926,6 +1106,7 @@ def _json_schema_to_pydantic_type(
root_schema,
name_=name_,
enrich_descriptions=enrich_descriptions,
in_progress=in_progress,
)
enum_values = json_schema.get("enum")
@@ -945,6 +1126,7 @@ def _json_schema_to_pydantic_type(
root_schema,
name_=f"{name_ or 'Union'}Option{i}",
enrich_descriptions=enrich_descriptions,
in_progress=in_progress,
)
for i, schema in enumerate(any_of_schemas)
]
@@ -958,6 +1140,15 @@ def _json_schema_to_pydantic_type(
root_schema,
name_=name_,
enrich_descriptions=enrich_descriptions,
in_progress=in_progress,
)
if in_progress is not None:
return _build_model_from_schema(
json_schema,
root_schema,
model_name=name_,
enrich_descriptions=enrich_descriptions,
in_progress=in_progress,
)
merged = _merge_all_of_schemas(all_of_schemas, root_schema)
return _json_schema_to_pydantic_type(
@@ -965,6 +1156,7 @@ def _json_schema_to_pydantic_type(
root_schema,
name_=name_,
enrich_descriptions=enrich_descriptions,
in_progress=in_progress,
)
type_ = json_schema.get("type")
@@ -985,12 +1177,21 @@ def _json_schema_to_pydantic_type(
root_schema,
name_=name_,
enrich_descriptions=enrich_descriptions,
in_progress=in_progress,
)
return list[item_type] # type: ignore[valid-type]
return list
if type_ == "object":
properties = json_schema.get("properties")
if properties:
if in_progress is not None:
return _build_model_from_schema(
json_schema,
root_schema,
model_name=name_,
enrich_descriptions=enrich_descriptions,
in_progress=in_progress,
)
json_schema_ = json_schema.copy()
if json_schema_.get("title") is None:
json_schema_["title"] = name_ or "DynamicModel"

View File

@@ -1,365 +0,0 @@
"""Tests for constraint propagation during task delegation.
These tests verify that when a Task has structured constraints defined,
they are properly propagated to delegated tasks through the DelegateWorkTool
and AskQuestionTool, ensuring worker agents receive the original requirements.
See: https://github.com/crewAIInc/crewAI/issues/5476
"""
import logging
from unittest.mock import patch
import pytest
from crewai.agent import Agent
from crewai.task import Task
from crewai.tools.agent_tools.agent_tools import AgentTools
from crewai.tools.agent_tools.base_agent_tools import BaseAgentTool
from crewai.tools.agent_tools.delegate_work_tool import DelegateWorkTool
from crewai.tools.agent_tools.ask_question_tool import AskQuestionTool
@pytest.fixture
def researcher():
return Agent(
role="researcher",
goal="Research AI topics",
backstory="Expert researcher in AI",
allow_delegation=False,
)
@pytest.fixture
def writer():
return Agent(
role="writer",
goal="Write articles about AI",
backstory="Expert technical writer",
allow_delegation=False,
)
@pytest.fixture
def task_with_constraints(researcher):
return Task(
description="Find the best open-source ML frameworks from 2024 in Europe",
expected_output="A list of ML frameworks",
agent=researcher,
constraints=[
"Only open-source frameworks",
"Must be from 2024",
"Only frameworks available in Europe",
],
)
@pytest.fixture
def task_without_constraints(researcher):
return Task(
description="Find ML frameworks",
expected_output="A list of ML frameworks",
agent=researcher,
)


class TestTaskConstraintsField:
    """Tests for the constraints field on the Task model."""

    def test_task_has_constraints_field(self):
        """A Task can be created with a constraints field."""
        task = Task(
            description="Test task",
            expected_output="Test output",
            constraints=["constraint1", "constraint2"],
        )
        assert task.constraints == ["constraint1", "constraint2"]

    def test_task_constraints_default_empty(self):
        """A Task without constraints has an empty list by default."""
        task = Task(
            description="Test task",
            expected_output="Test output",
        )
        assert task.constraints == []

    def test_task_prompt_includes_constraints(self):
        """Task.prompt() includes constraints when they are set."""
        task = Task(
            description="Find ML frameworks",
            expected_output="A list of frameworks",
            constraints=["Only open-source", "From 2024 only"],
        )
        prompt = task.prompt()
        assert "Task Constraints (MUST be respected):" in prompt
        assert "- Only open-source" in prompt
        assert "- From 2024 only" in prompt

    def test_task_prompt_excludes_constraints_when_empty(self):
        """Task.prompt() does not include the constraints section when constraints are empty."""
        task = Task(
            description="Find ML frameworks",
            expected_output="A list of frameworks",
        )
        prompt = task.prompt()
        assert "Task Constraints" not in prompt


class TestConstraintPropagationInDelegation:
    """Tests for constraint propagation through delegation tools."""

    def test_delegate_tool_receives_original_task(self, researcher, writer, task_with_constraints):
        """DelegateWorkTool is initialized with the original task reference."""
        tools = AgentTools(agents=[writer], task=task_with_constraints).tools()
        delegate_tool = tools[0]
        assert isinstance(delegate_tool, DelegateWorkTool)
        assert delegate_tool.original_task is task_with_constraints

    def test_ask_tool_receives_original_task(self, researcher, writer, task_with_constraints):
        """AskQuestionTool is initialized with the original task reference."""
        tools = AgentTools(agents=[writer], task=task_with_constraints).tools()
        ask_tool = tools[1]
        assert isinstance(ask_tool, AskQuestionTool)
        assert ask_tool.original_task is task_with_constraints

    def test_delegate_tool_without_task_has_none(self, writer):
        """When no task is provided, original_task is None."""
        tools = AgentTools(agents=[writer]).tools()
        delegate_tool = tools[0]
        assert delegate_tool.original_task is None

    @patch.object(Agent, "execute_task")
    def test_constraints_propagated_to_delegated_task(
        self, mock_execute, researcher, writer, task_with_constraints
    ):
        """Constraints from the original task are propagated to the delegated task."""
        mock_execute.return_value = "result"
        tools = AgentTools(agents=[researcher], task=task_with_constraints).tools()
        delegate_tool = tools[0]
        delegate_tool.run(
            coworker="researcher",
            task="Find ML frameworks",
            context="Need a comprehensive list",
        )
        # Verify execute_task was called
        mock_execute.assert_called_once()
        delegated_task = mock_execute.call_args[0][0]
        delegated_context = mock_execute.call_args[0][1]
        # The delegated task should have the constraints from the original task
        assert delegated_task.constraints == [
            "Only open-source frameworks",
            "Must be from 2024",
            "Only frameworks available in Europe",
        ]
        # Context should NOT be modified; constraints are rendered via Task.prompt()
        assert delegated_context == "Need a comprehensive list"

    @patch.object(Agent, "execute_task")
    def test_context_not_modified_by_constraints(
        self, mock_execute, researcher, writer, task_with_constraints
    ):
        """Context is passed through unchanged; constraints live on the Task object."""
        mock_execute.return_value = "result"
        tools = AgentTools(agents=[researcher], task=task_with_constraints).tools()
        delegate_tool = tools[0]
        delegate_tool.run(
            coworker="researcher",
            task="Find ML frameworks",
            context="Previous context here",
        )
        mock_execute.assert_called_once()
        delegated_task = mock_execute.call_args[0][0]
        delegated_context = mock_execute.call_args[0][1]
        # Context should be unchanged
        assert delegated_context == "Previous context here"
        # Constraints should be on the task object
        assert len(delegated_task.constraints) == 3

    @patch.object(Agent, "execute_task")
    def test_no_constraints_no_modification(
        self, mock_execute, researcher, writer, task_without_constraints
    ):
        """When the original task has no constraints, context is not modified."""
        mock_execute.return_value = "result"
        tools = AgentTools(agents=[researcher], task=task_without_constraints).tools()
        delegate_tool = tools[0]
        delegate_tool.run(
            coworker="researcher",
            task="Find ML frameworks",
            context="Just context",
        )
        mock_execute.assert_called_once()
        delegated_task = mock_execute.call_args[0][0]
        delegated_context = mock_execute.call_args[0][1]
        assert delegated_task.constraints == []
        assert delegated_context == "Just context"

    @patch.object(Agent, "execute_task")
    def test_ask_question_propagates_constraints(
        self, mock_execute, researcher, writer, task_with_constraints
    ):
        """AskQuestionTool also propagates constraints to the delegated task."""
        mock_execute.return_value = "answer"
        tools = AgentTools(agents=[researcher], task=task_with_constraints).tools()
        ask_tool = tools[1]
        ask_tool.run(
            coworker="researcher",
            question="What are the best frameworks?",
            context="Need details",
        )
        mock_execute.assert_called_once()
        delegated_task = mock_execute.call_args[0][0]
        delegated_context = mock_execute.call_args[0][1]
        assert delegated_task.constraints == task_with_constraints.constraints
        # Context should be unchanged; constraints live on the task
        assert delegated_context == "Need details"

    @patch.object(Agent, "execute_task")
    def test_constraints_propagated_when_no_original_context(
        self, mock_execute, researcher, writer, task_with_constraints
    ):
        """Even with empty context, constraints are on the task, not injected into context."""
        mock_execute.return_value = "result"
        tools = AgentTools(agents=[researcher], task=task_with_constraints).tools()
        delegate_tool = tools[0]
        delegate_tool.run(
            coworker="researcher",
            task="Find ML frameworks",
            context="",
        )
        mock_execute.assert_called_once()
        delegated_task = mock_execute.call_args[0][0]
        delegated_context = mock_execute.call_args[0][1]
        # Context should remain empty
        assert delegated_context == ""
        # Constraints are on the task object
        assert delegated_task.constraints == task_with_constraints.constraints

    @patch.object(Agent, "execute_task")
    def test_delegation_without_original_task_works(
        self, mock_execute, researcher, writer
    ):
        """Delegation still works when no original task is set (backward compatible)."""
        mock_execute.return_value = "result"
        tools = AgentTools(agents=[researcher]).tools()
        delegate_tool = tools[0]
        delegate_tool.run(
            coworker="researcher",
            task="Find ML frameworks",
            context="Some context",
        )
        mock_execute.assert_called_once()
        delegated_task = mock_execute.call_args[0][0]
        delegated_context = mock_execute.call_args[0][1]
        # Should work normally without constraints
        assert delegated_task.constraints == []
        assert delegated_context == "Some context"


class TestConstraintPropagationLogging:
    """Tests for logging during constraint propagation."""

    @patch.object(Agent, "execute_task")
    def test_constraint_propagation_logs_info(
        self, mock_execute, researcher, writer, task_with_constraints, caplog
    ):
        """An info log is emitted when constraints are propagated."""
        mock_execute.return_value = "result"
        tools = AgentTools(agents=[researcher], task=task_with_constraints).tools()
        delegate_tool = tools[0]
        with caplog.at_level(logging.INFO, logger="crewai.tools.agent_tools.base_agent_tools"):
            delegate_tool.run(
                coworker="researcher",
                task="Find ML frameworks",
                context="Context",
            )
        assert any("Propagating 3 constraint(s)" in record.message for record in caplog.records)

    @patch.object(Agent, "execute_task")
    def test_no_log_when_no_constraints(
        self, mock_execute, researcher, writer, task_without_constraints, caplog
    ):
        """No constraint propagation log when there are no constraints."""
        mock_execute.return_value = "result"
        tools = AgentTools(agents=[researcher], task=task_without_constraints).tools()
        delegate_tool = tools[0]
        with caplog.at_level(logging.INFO, logger="crewai.tools.agent_tools.base_agent_tools"):
            delegate_tool.run(
                coworker="researcher",
                task="Find ML frameworks",
                context="Context",
            )
        assert not any("Propagating" in record.message for record in caplog.records)


class TestAgentToolsTaskPassThrough:
    """Tests that AgentTools passes the task to the underlying tools."""

    def test_agent_tools_with_task(self, researcher, task_with_constraints):
        """AgentTools passes the task to both delegate and ask tools."""
        agent_tools = AgentTools(agents=[researcher], task=task_with_constraints)
        tools = agent_tools.tools()
        assert len(tools) == 2
        for tool in tools:
            assert isinstance(tool, BaseAgentTool)
            assert tool.original_task is task_with_constraints

    def test_agent_tools_without_task(self, researcher):
        """AgentTools without a task sets original_task to None on tools."""
        agent_tools = AgentTools(agents=[researcher])
        tools = agent_tools.tools()
        assert len(tools) == 2
        for tool in tools:
            assert isinstance(tool, BaseAgentTool)
            assert tool.original_task is None

    def test_agent_get_delegation_tools_passes_task(self, researcher, task_with_constraints):
        """Agent.get_delegation_tools passes the task through to AgentTools."""
        tools = researcher.get_delegation_tools(agents=[researcher], task=task_with_constraints)
        assert len(tools) == 2
        for tool in tools:
            assert isinstance(tool, BaseAgentTool)
            assert tool.original_task is task_with_constraints

    def test_agent_get_delegation_tools_without_task(self, researcher):
        """Agent.get_delegation_tools without a task still works (backward compatible)."""
        tools = researcher.get_delegation_tools(agents=[researcher])
        assert len(tools) == 2
        for tool in tools:
            assert isinstance(tool, BaseAgentTool)
            assert tool.original_task is None
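The contract these tests pin down is small enough to sketch in isolation. The following is a minimal, hypothetical stand-in for the Task model (not the real `crewai.task.Task`, which has many more fields) showing the prompt rendering that `TestTaskConstraintsField` asserts on:

```python
# Illustrative sketch only: a stand-in mirroring the behavior the tests
# above assert on, under the assumption that Task.prompt() appends a
# bulleted "Task Constraints" section when constraints are present.
from dataclasses import dataclass, field


@dataclass
class SketchTask:
    description: str
    expected_output: str
    constraints: list[str] = field(default_factory=list)

    def prompt(self) -> str:
        parts = [self.description, f"Expected output: {self.expected_output}"]
        if self.constraints:
            bullets = "\n".join(f"- {c}" for c in self.constraints)
            parts.append(f"Task Constraints (MUST be respected):\n{bullets}")
        return "\n\n".join(parts)


task = SketchTask(
    description="Find ML frameworks",
    expected_output="A list of frameworks",
    constraints=["Only open-source", "From 2024 only"],
)
print(task.prompt())
```

Because constraints are rendered by `prompt()` rather than appended to the delegation context string, a delegated task built with the same constraints list restates them to the worker agent on its own, which is exactly what the propagation tests verify by asserting the context passes through unchanged.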


@@ -1,3 +1,3 @@
"""CrewAI development tools."""
__version__ = "1.14.2a5"
__version__ = "1.14.2rc1"


@@ -162,7 +162,7 @@ info = "Commits must follow Conventional Commits 1.0.0."
 [tool.uv]
-exclude-newer = "3 days"
+exclude-newer = "1 day"
 # composio-core pins rich<14 but textual requires rich>=14.
 # onnxruntime 1.24+ dropped Python 3.10 wheels; cap it so qdrant[fastembed] resolves on 3.10.
@@ -170,8 +170,9 @@ exclude-newer = "3 days"
 # langchain-core <1.2.28 has GHSA-926x-3r5x-gfhw (incomplete f-string validation).
 # transformers 4.57.6 has CVE-2026-1839; force 5.4+ (docling 2.84 allows huggingface-hub>=1).
 # cryptography 46.0.6 has CVE-2026-39892; force 46.0.7+.
-# pypdf <6.10.0 has CVE-2026-40260; force 6.10.0+.
+# pypdf <6.10.1 has CVE-2026-40260 and GHSA-jj6c-8h6c-hppx; force 6.10.1+.
 # uv <0.11.6 has GHSA-pjjw-68hj-v9mw; force 0.11.6+.
+# python-multipart <0.0.26 has GHSA-mj87-hwqh-73pj; force 0.0.26+.
 override-dependencies = [
     "rich>=13.7.1",
     "onnxruntime<1.24; python_version < '3.11'",
@@ -180,8 +181,9 @@ override-dependencies = [
     "urllib3>=2.6.3",
     "transformers>=5.4.0; python_version >= '3.10'",
     "cryptography>=46.0.7",
-    "pypdf>=6.10.0,<7",
+    "pypdf>=6.10.1,<7",
     "uv>=0.11.6,<1",
+    "python-multipart>=0.0.26,<1",
 ]
 [tool.uv.workspace]
uv.lock generated

@@ -13,8 +13,8 @@ resolution-markers = [
 ]
 [options]
-exclude-newer = "2026-04-10T18:30:59.748668Z"
-exclude-newer-span = "P3D"
+exclude-newer = "2026-04-14T20:20:18.36862Z"
+exclude-newer-span = "P1D"
 [manifest]
 members = [
@@ -28,7 +28,8 @@ overrides = [
     { name = "langchain-core", specifier = ">=1.2.28,<2" },
     { name = "onnxruntime", marker = "python_full_version < '3.11'", specifier = "<1.24" },
     { name = "pillow", specifier = ">=12.1.1" },
-    { name = "pypdf", specifier = ">=6.10.0,<7" },
+    { name = "pypdf", specifier = ">=6.10.1,<7" },
+    { name = "python-multipart", specifier = ">=0.0.26,<1" },
     { name = "rich", specifier = ">=13.7.1" },
     { name = "transformers", marker = "python_full_version >= '3.10'", specifier = ">=5.4.0" },
     { name = "urllib3", specifier = ">=2.6.3" },
@@ -6727,14 +6728,14 @@ wheels = [
 [[package]]
 name = "pypdf"
-version = "6.10.0"
+version = "6.10.1"
 source = { registry = "https://pypi.org/simple" }
 dependencies = [
     { name = "typing-extensions", marker = "python_full_version < '3.11'" },
 ]
-sdist = { url = "https://files.pythonhosted.org/packages/b8/9f/ca96abf18683ca12602065e4ed2bec9050b672c87d317f1079abc7b6d993/pypdf-6.10.0.tar.gz", hash = "sha256:4c5a48ba258c37024ec2505f7e8fd858525f5502784a2e1c8d415604af29f6ef", size = 5314833, upload-time = "2026-04-10T09:34:57.102Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/66/79/f2730c42ec7891a75a2fcea2eb4f356872bcbc671b711418060424796612/pypdf-6.10.1.tar.gz", hash = "sha256:62e6ca7f65aaa28b3d192addb44f97296e4be1748f57ed0f4efb2d4915841880", size = 5315704, upload-time = "2026-04-14T12:55:20.996Z" }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/55/f2/7ebe366f633f30a6ad105f650f44f24f98cb1335c4157d21ae47138b3482/pypdf-6.10.0-py3-none-any.whl", hash = "sha256:90005e959e1596c6e6c84c8b0ad383285b3e17011751cedd17f2ce8fcdfc86de", size = 334459, upload-time = "2026-04-10T09:34:54.966Z" },
+    { url = "https://files.pythonhosted.org/packages/f0/04/e3aa7f1f14dbc53429cae34666261eb935d99bd61d24756ab94d7e0309da/pypdf-6.10.1-py3-none-any.whl", hash = "sha256:6331940d3bfe75b7e6601d35db7adabab5fc1d716efaeb384e3c0c3957d033de", size = 335606, upload-time = "2026-04-14T12:55:18.941Z" },
 ]
 [[package]]
@@ -6988,11 +6989,11 @@ wheels = [
 [[package]]
 name = "python-multipart"
-version = "0.0.24"
+version = "0.0.26"
 source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/8a/45/e23b5dc14ddb9918ae4a625379506b17b6f8fc56ca1d82db62462f59aea6/python_multipart-0.0.24.tar.gz", hash = "sha256:9574c97e1c026e00bc30340ef7c7d76739512ab4dfd428fec8c330fa6a5cc3c8", size = 37695, upload-time = "2026-04-05T20:49:13.829Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/88/71/b145a380824a960ebd60e1014256dbb7d2253f2316ff2d73dfd8928ec2c3/python_multipart-0.0.26.tar.gz", hash = "sha256:08fadc45918cd615e26846437f50c5d6d23304da32c341f289a617127b081f17", size = 43501, upload-time = "2026-04-10T14:09:59.473Z" }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/a3/73/89930efabd4da63cea44a3f438aeb753d600123570e6d6264e763617a9ce/python_multipart-0.0.24-py3-none-any.whl", hash = "sha256:9b110a98db707df01a53c194f0af075e736a770dc5058089650d70b4a182f950", size = 24420, upload-time = "2026-04-05T20:49:12.555Z" },
+    { url = "https://files.pythonhosted.org/packages/9a/22/f1925cdda983ab66fc8ec6ec8014b959262747e58bdca26a4e3d1da29d56/python_multipart-0.0.26-py3-none-any.whl", hash = "sha256:c0b169f8c4484c13b0dcf2ef0ec3a4adb255c4b7d18d8e420477d2b1dd03f185", size = 28847, upload-time = "2026-04-10T14:09:58.131Z" },
 ]