Compare commits


8 Commits

Author SHA1 Message Date
dependabot[bot]
07eef79c19 chore(deps): bump langsmith
Bumps the security-updates group with 1 update in the / directory: [langsmith](https://github.com/langchain-ai/langsmith-sdk).


Updates `langsmith` from 0.7.30 to 0.7.31
- [Release notes](https://github.com/langchain-ai/langsmith-sdk/releases)
- [Commits](https://github.com/langchain-ai/langsmith-sdk/compare/v0.7.30...v0.7.31)

---
updated-dependencies:
- dependency-name: langsmith
  dependency-version: 0.7.31
  dependency-type: indirect
  dependency-group: security-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-16 02:49:23 +00:00
Greyson LaLonde
0bb6faa9d3 docs: update changelog and version for v1.14.2rc1
2026-04-16 05:24:57 +08:00
Greyson LaLonde
aa28eeab6a feat: bump versions to 1.14.2rc1 2026-04-16 05:18:24 +08:00
Greyson LaLonde
29b5531f78 fix: handle cyclic JSON schemas in MCP tool resolution 2026-04-16 05:03:00 +08:00
Greyson LaLonde
74d061e994 fix: bump python-multipart to 0.0.26 to patch GHSA-mj87-hwqh-73pj
Fixes GHSA-mj87-hwqh-73pj
2026-04-16 04:25:35 +08:00
Greyson LaLonde
18d0fd6b80 fix: bump pypdf to 6.10.1 to patch GHSA-jj6c-8h6c-hppx
Fixes GHSA-jj6c-8h6c-hppx
2026-04-16 04:11:08 +08:00
Greyson LaLonde
1c90d574ab docs: update changelog and version for v1.14.2a5
2026-04-15 22:45:15 +08:00
Greyson LaLonde
3a7c550512 feat: bump versions to 1.14.2a5 2026-04-15 22:40:48 +08:00
17 changed files with 420 additions and 59 deletions

View File

@@ -4,6 +4,43 @@ description: "تحديثات المنتج والتحسينات وإصلاحات
 icon: "clock"
 mode: "wide"
 ---
+<Update label="16 أبريل 2026">
+## v1.14.2rc1
+[عرض الإصدار على GitHub](https://github.com/crewAIInc/crewAI/releases/tag/1.14.2rc1)
+## ما الذي تغير
+### إصلاحات الأخطاء
+- إصلاح معالجة مخططات JSON الدائرية في تحليل أدوات MCP
+- إصلاح ثغرة أمنية من خلال تحديث python-multipart إلى 0.0.26
+- إصلاح ثغرة أمنية من خلال تحديث pypdf إلى 6.10.1
+### الوثائق
+- تحديث سجل التغييرات والإصدار لـ v1.14.2a5
+## المساهمون
+@greysonlalonde
+</Update>
+<Update label="15 أبريل 2026">
+## v1.14.2a5
+[عرض الإصدار على GitHub](https://github.com/crewAIInc/crewAI/releases/tag/1.14.2a5)
+## ما الذي تغير
+### الوثائق
+- تحديث سجل التغييرات والإصدار لـ v1.14.2a4
+## المساهمون
+@greysonlalonde
+</Update>
 <Update label="15 أبريل 2026">
 ## v1.14.2a4

View File

@@ -4,6 +4,43 @@ description: "Product updates, improvements, and bug fixes for CrewAI"
 icon: "clock"
 mode: "wide"
 ---
+<Update label="Apr 16, 2026">
+## v1.14.2rc1
+[View release on GitHub](https://github.com/crewAIInc/crewAI/releases/tag/1.14.2rc1)
+## What's Changed
+### Bug Fixes
+- Fix handling of cyclic JSON schemas in MCP tool resolution
+- Fix vulnerability by bumping python-multipart to 0.0.26
+- Fix vulnerability by bumping pypdf to 6.10.1
+### Documentation
+- Update changelog and version for v1.14.2a5
+## Contributors
+@greysonlalonde
+</Update>
+<Update label="Apr 15, 2026">
+## v1.14.2a5
+[View release on GitHub](https://github.com/crewAIInc/crewAI/releases/tag/1.14.2a5)
+## What's Changed
+### Documentation
+- Update changelog and version for v1.14.2a4
+## Contributors
+@greysonlalonde
+</Update>
 <Update label="Apr 15, 2026">
 ## v1.14.2a4

View File

@@ -4,6 +4,43 @@ description: "CrewAI의 제품 업데이트, 개선 사항 및 버그 수정"
 icon: "clock"
 mode: "wide"
 ---
+<Update label="2026년 4월 16일">
+## v1.14.2rc1
+[GitHub 릴리스 보기](https://github.com/crewAIInc/crewAI/releases/tag/1.14.2rc1)
+## 변경 사항
+### 버그 수정
+- MCP 도구 해석에서 순환 JSON 스키마 처리 수정
+- python-multipart를 0.0.26으로 업데이트하여 취약점 수정
+- pypdf를 6.10.1로 업데이트하여 취약점 수정
+### 문서
+- v1.14.2a5에 대한 변경 로그 및 버전 업데이트
+## 기여자
+@greysonlalonde
+</Update>
+<Update label="2026년 4월 15일">
+## v1.14.2a5
+[GitHub 릴리스 보기](https://github.com/crewAIInc/crewAI/releases/tag/1.14.2a5)
+## 변경 사항
+### 문서
+- v1.14.2a4의 변경 로그 및 버전 업데이트
+## 기여자
+@greysonlalonde
+</Update>
 <Update label="2026년 4월 15일">
 ## v1.14.2a4

View File

@@ -4,6 +4,43 @@ description: "Atualizações de produto, melhorias e correções do CrewAI"
 icon: "clock"
 mode: "wide"
 ---
+<Update label="16 abr 2026">
+## v1.14.2rc1
+[Ver release no GitHub](https://github.com/crewAIInc/crewAI/releases/tag/1.14.2rc1)
+## O que Mudou
+### Correções de Bugs
+- Corrigir o manuseio de esquemas JSON cíclicos na resolução de ferramentas MCP
+- Corrigir vulnerabilidade atualizando python-multipart para 0.0.26
+- Corrigir vulnerabilidade atualizando pypdf para 6.10.1
+### Documentação
+- Atualizar o changelog e a versão para v1.14.2a5
+## Contribuidores
+@greysonlalonde
+</Update>
+<Update label="15 abr 2026">
+## v1.14.2a5
+[Ver release no GitHub](https://github.com/crewAIInc/crewAI/releases/tag/1.14.2a5)
+## O que Mudou
+### Documentação
+- Atualizar changelog e versão para v1.14.2a4
+## Contribuidores
+@greysonlalonde
+</Update>
 <Update label="15 abr 2026">
 ## v1.14.2a4

View File

@@ -152,4 +152,4 @@ __all__ = [
     "wrap_file_source",
 ]
-__version__ = "1.14.2a4"
+__version__ = "1.14.2rc1"

View File

@@ -10,7 +10,7 @@ requires-python = ">=3.10, <3.14"
 dependencies = [
     "pytube~=15.0.0",
     "requests>=2.33.0,<3",
-    "crewai==1.14.2a4",
+    "crewai==1.14.2rc1",
     "tiktoken~=0.8.0",
     "beautifulsoup4~=4.13.4",
     "python-docx~=1.2.0",

View File

@@ -305,4 +305,4 @@ __all__ = [
     "ZapierActionTools",
 ]
-__version__ = "1.14.2a4"
+__version__ = "1.14.2rc1"

View File

@@ -55,7 +55,7 @@ Repository = "https://github.com/crewAIInc/crewAI"
 [project.optional-dependencies]
 tools = [
-    "crewai-tools==1.14.2a4",
+    "crewai-tools==1.14.2rc1",
 ]
 embeddings = [
     "tiktoken~=0.8.0"

View File

@@ -46,7 +46,7 @@ def _suppress_pydantic_deprecation_warnings() -> None:
 _suppress_pydantic_deprecation_warnings()
-__version__ = "1.14.2a4"
+__version__ = "1.14.2rc1"
 _telemetry_submitted = False

View File

@@ -5,7 +5,7 @@ description = "{{name}} using crewAI"
 authors = [{ name = "Your Name", email = "you@example.com" }]
 requires-python = ">=3.10,<3.14"
 dependencies = [
-    "crewai[tools]==1.14.2a4"
+    "crewai[tools]==1.14.2rc1"
 ]
 [project.scripts]

View File

@@ -5,7 +5,7 @@ description = "{{name}} using crewAI"
 authors = [{ name = "Your Name", email = "you@example.com" }]
 requires-python = ">=3.10,<3.14"
 dependencies = [
-    "crewai[tools]==1.14.2a4"
+    "crewai[tools]==1.14.2rc1"
 ]
 [project.scripts]

View File

@@ -5,7 +5,7 @@ description = "Power up your crews with {{folder_name}}"
 readme = "README.md"
 requires-python = ">=3.10,<3.14"
 dependencies = [
-    "crewai[tools]==1.14.2a4"
+    "crewai[tools]==1.14.2rc1"
 ]
 [tool.crewai]

View File

@@ -417,9 +417,18 @@ class MCPToolResolver:
         args_schema = None
         if tool_def.get("inputSchema"):
-            args_schema = self._json_schema_to_pydantic(
-                tool_name, tool_def["inputSchema"]
-            )
+            try:
+                args_schema = self._json_schema_to_pydantic(
+                    tool_name, tool_def["inputSchema"]
+                )
+            except Exception as e:
+                self._logger.log(
+                    "warning",
+                    f"Failed to build args schema for MCP tool "
+                    f"'{tool_name}': {e}. Registering tool without a "
+                    "typed schema.",
+                )
+                args_schema = None
         tool_schema = {
             "description": tool_def.get("description", ""),

View File

@@ -19,7 +19,18 @@ from collections.abc import Callable
 from copy import deepcopy
 import datetime
 import logging
-from typing import TYPE_CHECKING, Annotated, Any, Final, Literal, TypedDict, Union, cast
+from typing import (
+    TYPE_CHECKING,
+    Annotated,
+    Any,
+    Final,
+    ForwardRef,
+    Literal,
+    Optional,
+    TypedDict,
+    Union,
+    cast,
+)
 import uuid
 import jsonref  # type: ignore[import-untyped]
@@ -99,15 +110,22 @@ def resolve_refs(schema: dict[str, Any]) -> dict[str, Any]:
     """
     defs = schema.get("$defs", {})
     schema_copy = deepcopy(schema)
+    expanding: set[str] = set()
     def _resolve(node: Any) -> Any:
         if isinstance(node, dict):
             ref = node.get("$ref")
             if isinstance(ref, str) and ref.startswith("#/$defs/"):
                 def_name = ref.replace("#/$defs/", "")
-                if def_name in defs:
+                if def_name not in defs:
+                    raise KeyError(f"Definition '{def_name}' not found in $defs.")
+                if def_name in expanding:
+                    return {}
+                expanding.add(def_name)
+                try:
                     return _resolve(deepcopy(defs[def_name]))
-                raise KeyError(f"Definition '{def_name}' not found in $defs.")
+                finally:
+                    expanding.discard(def_name)
             return {k: _resolve(v) for k, v in node.items()}
         if isinstance(node, list):
@@ -119,7 +137,11 @@ def resolve_refs(schema: dict[str, Any]) -> dict[str, Any]:
 def add_key_in_dict_recursively(
-    d: dict[str, Any], key: str, value: Any, criteria: Callable[[dict[str, Any]], bool]
+    d: dict[str, Any],
+    key: str,
+    value: Any,
+    criteria: Callable[[dict[str, Any]], bool],
+    _seen: set[int] | None = None,
 ) -> dict[str, Any]:
     """Recursively adds a key/value pair to all nested dicts matching `criteria`.
@@ -128,22 +150,31 @@ def add_key_in_dict_recursively(
         key: The key to add.
         value: The value to add.
         criteria: A function that returns True for dicts that should receive the key.
+        _seen: Internal set of visited ``id()``s, used to guard cyclic schemas.
     Returns:
         The modified dictionary.
     """
+    if _seen is None:
+        _seen = set()
     if isinstance(d, dict):
+        if id(d) in _seen:
+            return d
+        _seen.add(id(d))
         if criteria(d) and key not in d:
            d[key] = value
         for v in d.values():
-            add_key_in_dict_recursively(v, key, value, criteria)
+            add_key_in_dict_recursively(v, key, value, criteria, _seen)
     elif isinstance(d, list):
+        if id(d) in _seen:
+            return d
+        _seen.add(id(d))
         for i in d:
-            add_key_in_dict_recursively(i, key, value, criteria)
+            add_key_in_dict_recursively(i, key, value, criteria, _seen)
     return d
-def force_additional_properties_false(d: Any) -> Any:
+def force_additional_properties_false(d: Any, _seen: set[int] | None = None) -> Any:
     """Force additionalProperties=false on all object-type dicts recursively.
     OpenAI strict mode requires all objects to have additionalProperties=false.
@@ -154,11 +185,17 @@ def force_additional_properties_false(d: Any) -> Any:
     Args:
         d: The dictionary/list to modify.
+        _seen: Internal set of visited ``id()``s, used to guard cyclic schemas.
     Returns:
         The modified dictionary/list.
     """
+    if _seen is None:
+        _seen = set()
     if isinstance(d, dict):
+        if id(d) in _seen:
+            return d
+        _seen.add(id(d))
         if d.get("type") == "object":
             d["additionalProperties"] = False
             if "properties" not in d:
@@ -166,10 +203,13 @@ def force_additional_properties_false(d: Any) -> Any:
             if "required" not in d:
                 d["required"] = []
         for v in d.values():
-            force_additional_properties_false(v)
+            force_additional_properties_false(v, _seen)
     elif isinstance(d, list):
+        if id(d) in _seen:
+            return d
+        _seen.add(id(d))
         for i in d:
-            force_additional_properties_false(i)
+            force_additional_properties_false(i, _seen)
     return d
@@ -183,7 +223,7 @@ OPENAI_SUPPORTED_FORMATS: Final[
 }
-def strip_unsupported_formats(d: Any) -> Any:
+def strip_unsupported_formats(d: Any, _seen: set[int] | None = None) -> Any:
     """Remove format annotations that OpenAI strict mode doesn't support.
     OpenAI only supports: date-time, date, time, duration.
@@ -191,11 +231,17 @@ def strip_unsupported_formats(d: Any) -> Any:
     Args:
         d: The dictionary/list to modify.
+        _seen: Internal set of visited ``id()``s, used to guard cyclic schemas.
     Returns:
         The modified dictionary/list.
     """
+    if _seen is None:
+        _seen = set()
     if isinstance(d, dict):
+        if id(d) in _seen:
+            return d
+        _seen.add(id(d))
         format_value = d.get("format")
         if (
             isinstance(format_value, str)
@@ -203,14 +249,17 @@ def strip_unsupported_formats(d: Any) -> Any:
         ):
             del d["format"]
         for v in d.values():
-            strip_unsupported_formats(v)
+            strip_unsupported_formats(v, _seen)
     elif isinstance(d, list):
+        if id(d) in _seen:
+            return d
+        _seen.add(id(d))
         for i in d:
-            strip_unsupported_formats(i)
+            strip_unsupported_formats(i, _seen)
     return d
-def ensure_type_in_schemas(d: Any) -> Any:
+def ensure_type_in_schemas(d: Any, _seen: set[int] | None = None) -> Any:
     """Ensure all schema objects in anyOf/oneOf have a 'type' key.
     OpenAI strict mode requires every schema to have a 'type' key.
@@ -218,11 +267,17 @@ def ensure_type_in_schemas(d: Any) -> Any:
     Args:
         d: The dictionary/list to modify.
+        _seen: Internal set of visited ``id()``s, used to guard cyclic schemas.
     Returns:
         The modified dictionary/list.
     """
+    if _seen is None:
+        _seen = set()
     if isinstance(d, dict):
+        if id(d) in _seen:
+            return d
+        _seen.add(id(d))
         for key in ("anyOf", "oneOf"):
             if key in d:
                 schema_list = d[key]
@@ -230,12 +285,15 @@ def ensure_type_in_schemas(d: Any) -> Any:
                     if isinstance(schema, dict) and schema == {}:
                         schema_list[i] = {"type": "object"}
                     else:
-                        ensure_type_in_schemas(schema)
+                        ensure_type_in_schemas(schema, _seen)
         for v in d.values():
-            ensure_type_in_schemas(v)
+            ensure_type_in_schemas(v, _seen)
     elif isinstance(d, list):
+        if id(d) in _seen:
+            return d
+        _seen.add(id(d))
         for item in d:
-            ensure_type_in_schemas(item)
+            ensure_type_in_schemas(item, _seen)
     return d
@@ -318,7 +376,9 @@ def add_const_to_oneof_variants(schema: dict[str, Any]) -> dict[str, Any]:
     return _process_oneof(deepcopy(schema))
-def convert_oneof_to_anyof(schema: dict[str, Any]) -> dict[str, Any]:
+def convert_oneof_to_anyof(
+    schema: dict[str, Any], _seen: set[int] | None = None
+) -> dict[str, Any]:
     """Convert oneOf to anyOf for OpenAI compatibility.
     OpenAI's Structured Outputs support anyOf better than oneOf.
@@ -326,26 +386,37 @@ def convert_oneof_to_anyof(schema: dict[str, Any]) -> dict[str, Any]:
     Args:
         schema: JSON schema dictionary.
+        _seen: Internal set of visited ``id()``s, used to guard cyclic schemas.
     Returns:
         Modified schema with anyOf instead of oneOf.
     """
+    if _seen is None:
+        _seen = set()
     if isinstance(schema, dict):
+        if id(schema) in _seen:
+            return schema
+        _seen.add(id(schema))
         if "oneOf" in schema:
             schema["anyOf"] = schema.pop("oneOf")
         for value in schema.values():
             if isinstance(value, dict):
-                convert_oneof_to_anyof(value)
+                convert_oneof_to_anyof(value, _seen)
             elif isinstance(value, list):
+                if id(value) in _seen:
+                    continue
+                _seen.add(id(value))
                 for item in value:
                     if isinstance(item, dict):
-                        convert_oneof_to_anyof(item)
+                        convert_oneof_to_anyof(item, _seen)
     return schema
-def ensure_all_properties_required(schema: dict[str, Any]) -> dict[str, Any]:
+def ensure_all_properties_required(
+    schema: dict[str, Any], _seen: set[int] | None = None
+) -> dict[str, Any]:
     """Ensure all properties are in the required array for OpenAI strict mode.
     OpenAI's strict structured outputs require all properties to be listed
@@ -354,11 +425,17 @@ def ensure_all_properties_required(schema: dict[str, Any]) -> dict[str, Any]:
     Args:
         schema: JSON schema dictionary.
+        _seen: Internal set of visited ``id()``s, used to guard cyclic schemas.
     Returns:
         Modified schema with all properties marked as required.
     """
+    if _seen is None:
+        _seen = set()
     if isinstance(schema, dict):
+        if id(schema) in _seen:
+            return schema
+        _seen.add(id(schema))
         if schema.get("type") == "object" and "properties" in schema:
             properties = schema["properties"]
             if properties:
@@ -366,16 +443,21 @@ def ensure_all_properties_required(schema: dict[str, Any]) -> dict[str, Any]:
         for value in schema.values():
             if isinstance(value, dict):
-                ensure_all_properties_required(value)
+                ensure_all_properties_required(value, _seen)
             elif isinstance(value, list):
+                if id(value) in _seen:
+                    continue
+                _seen.add(id(value))
                 for item in value:
                     if isinstance(item, dict):
-                        ensure_all_properties_required(item)
+                        ensure_all_properties_required(item, _seen)
     return schema
-def strip_null_from_types(schema: dict[str, Any]) -> dict[str, Any]:
+def strip_null_from_types(
+    schema: dict[str, Any], _seen: set[int] | None = None
+) -> dict[str, Any]:
     """Remove null type from anyOf/type arrays.
     Pydantic generates `T | None` for optional fields, which creates schemas with
@@ -384,11 +466,17 @@ def strip_null_from_types(schema: dict[str, Any]) -> dict[str, Any]:
     Args:
         schema: JSON schema dictionary.
+        _seen: Internal set of visited ``id()``s, used to guard cyclic schemas.
     Returns:
         Modified schema with null types removed.
     """
+    if _seen is None:
+        _seen = set()
     if isinstance(schema, dict):
+        if id(schema) in _seen:
+            return schema
+        _seen.add(id(schema))
         if "anyOf" in schema:
             any_of = schema["anyOf"]
             non_null = [opt for opt in any_of if opt.get("type") != "null"]
@@ -408,11 +496,14 @@ def strip_null_from_types(schema: dict[str, Any]) -> dict[str, Any]:
         for value in schema.values():
             if isinstance(value, dict):
-                strip_null_from_types(value)
+                strip_null_from_types(value, _seen)
             elif isinstance(value, list):
+                if id(value) in _seen:
+                    continue
+                _seen.add(id(value))
                 for item in value:
                     if isinstance(item, dict):
-                        strip_null_from_types(item)
+                        strip_null_from_types(item, _seen)
     return schema
@@ -451,16 +542,26 @@ _CLAUDE_STRICT_UNSUPPORTED: Final[tuple[str, ...]] = (
 )
-def _strip_keys_recursive(d: Any, keys: tuple[str, ...]) -> Any:
+def _strip_keys_recursive(
+    d: Any, keys: tuple[str, ...], _seen: set[int] | None = None
+) -> Any:
     """Recursively delete a fixed set of keys from a schema."""
+    if _seen is None:
+        _seen = set()
     if isinstance(d, dict):
+        if id(d) in _seen:
+            return d
+        _seen.add(id(d))
         for key in keys:
             d.pop(key, None)
         for v in d.values():
-            _strip_keys_recursive(v, keys)
+            _strip_keys_recursive(v, keys, _seen)
     elif isinstance(d, list):
+        if id(d) in _seen:
+            return d
+        _seen.add(id(d))
         for i in d:
-            _strip_keys_recursive(i, keys)
+            _strip_keys_recursive(i, keys, _seen)
     return d
@@ -719,12 +820,70 @@ def create_model_from_schema(  # type: ignore[no-any-unimported]
     json_schema = force_additional_properties_false(json_schema)
     effective_root = force_additional_properties_false(effective_root)
+    in_progress: dict[int, Any] = {}
+    model = _build_model_from_schema(
+        json_schema,
+        effective_root,
+        model_name=model_name,
+        enrich_descriptions=enrich_descriptions,
+        in_progress=in_progress,
+        __config__=__config__,
+        __base__=__base__,
+        __module__=__module__,
+        __validators__=__validators__,
+        __cls_kwargs__=__cls_kwargs__,
+    )
+    types_namespace: dict[str, Any] = {
+        entry.__name__: entry
+        for entry in in_progress.values()
+        if isinstance(entry, type) and issubclass(entry, BaseModel)
+    }
+    for entry in in_progress.values():
+        if (
+            isinstance(entry, type)
+            and issubclass(entry, BaseModel)
+            and not getattr(entry, "__pydantic_complete__", True)
+        ):
+            try:
+                entry.model_rebuild(_types_namespace=types_namespace)
+            except Exception as e:
+                logger.debug("model_rebuild failed for %s: %s", entry.__name__, e)
+    return model
+def _build_model_from_schema(  # type: ignore[no-any-unimported]
+    json_schema: dict[str, Any],
+    effective_root: dict[str, Any],
+    *,
+    model_name: str | None,
+    enrich_descriptions: bool,
+    in_progress: dict[int, Any],
+    __config__: ConfigDict | None = None,
+    __base__: type[BaseModel] | None = None,
+    __module__: str = __name__,
+    __validators__: dict[str, AnyClassMethod] | None = None,
+    __cls_kwargs__: dict[str, Any] | None = None,
+) -> type[BaseModel]:
+    """Inner builder shared by the public entry point and recursive nested-object creation.
+    Preprocessing via ``jsonref.replace_refs`` and the sanitization walkers is
+    run once by the public entry; this helper walks the already-normalized
+    schema and emits Pydantic models. ``in_progress`` maps ``id(schema)`` to
+    the model being built for that schema, so a cyclic ``$ref`` graph
+    degrades to a ``ForwardRef`` back-edge instead of blowing the stack.
+    """
+    original_id = id(json_schema)
     if "allOf" in json_schema:
         json_schema = _merge_all_of_schemas(json_schema["allOf"], effective_root)
     if "title" not in json_schema and "title" in (root_schema or {}):
         json_schema["title"] = (root_schema or {}).get("title")
     effective_name = model_name or json_schema.get("title") or "DynamicModel"
+    schema_id = id(json_schema)
+    in_progress[original_id] = effective_name
+    if schema_id != original_id:
+        in_progress[schema_id] = effective_name
     field_definitions = {
         name: _json_schema_to_pydantic_field(
             name,
@@ -732,13 +891,14 @@ def create_model_from_schema(  # type: ignore[no-any-unimported]
             json_schema.get("required", []),
             effective_root,
             enrich_descriptions=enrich_descriptions,
+            in_progress=in_progress,
         )
         for name, prop in (json_schema.get("properties", {}) or {}).items()
     }
     effective_config = __config__ or ConfigDict(extra="forbid")
-    return create_model_base(
+    model = create_model_base(
         effective_name,
         __config__=effective_config,
         __base__=__base__,
@@ -747,6 +907,10 @@ def create_model_from_schema(  # type: ignore[no-any-unimported]
         __cls_kwargs__=__cls_kwargs__,
         **field_definitions,
     )
+    in_progress[original_id] = model
+    if schema_id != original_id:
+        in_progress[schema_id] = model
+    return model
 def _json_schema_to_pydantic_field(
@@ -756,6 +920,7 @@ def _json_schema_to_pydantic_field(
     root_schema: dict[str, Any],
     *,
     enrich_descriptions: bool = False,
+    in_progress: dict[int, Any] | None = None,
 ) -> Any:
     """Convert a JSON schema property to a Pydantic field definition.
@@ -774,6 +939,7 @@ def _json_schema_to_pydantic_field(
         root_schema,
         name_=name.title(),
         enrich_descriptions=enrich_descriptions,
+        in_progress=in_progress,
     )
     is_required = name in required
@@ -833,7 +999,7 @@ def _json_schema_to_pydantic_field(
         field_params["pattern"] = json_schema["pattern"]
     if not is_required:
-        type_ = type_ | None
+        type_ = Optional[type_]  # noqa: UP045 - ForwardRef does not support `|`
     if schema_extra:
         field_params["json_schema_extra"] = schema_extra
@@ -906,6 +1072,7 @@ def _json_schema_to_pydantic_type(
     *,
     name_: str | None = None,
     enrich_descriptions: bool = False,
+    in_progress: dict[int, Any] | None = None,
 ) -> Any:
     """Convert a JSON schema to a Python/Pydantic type.
@@ -914,10 +1081,23 @@ def _json_schema_to_pydantic_type(
         root_schema: The root schema for resolving $ref.
         name_: Optional name for nested models.
        enrich_descriptions: Propagated to nested model creation.
+        in_progress: Map of ``id(schema_dict)`` to the Pydantic model
+            currently being built for that schema, or to a placeholder name
+            as a plain ``str`` while the model is still being constructed.
+            Populated by :func:`_build_model_from_schema`. Enables cycle
+            detection so a self-referential ``$ref`` graph resolves to a
+            :class:`ForwardRef` back-edge rather than recursing forever.
     Returns:
         A Python type corresponding to the JSON schema.
     """
+    if in_progress is not None:
+        cached = in_progress.get(id(json_schema))
+        if isinstance(cached, str):
+            return ForwardRef(cached)
+        if cached is not None:
+            return cached
     ref = json_schema.get("$ref")
     if ref:
         ref_schema = _resolve_ref(ref, root_schema)
@@ -926,6 +1106,7 @@ def _json_schema_to_pydantic_type(
             root_schema,
             name_=name_,
             enrich_descriptions=enrich_descriptions,
+            in_progress=in_progress,
         )
     enum_values = json_schema.get("enum")
@@ -945,6 +1126,7 @@ def _json_schema_to_pydantic_type(
                 root_schema,
                 name_=f"{name_ or 'Union'}Option{i}",
                 enrich_descriptions=enrich_descriptions,
+                in_progress=in_progress,
             )
             for i, schema in enumerate(any_of_schemas)
         ]
@@ -958,6 +1140,15 @@ def _json_schema_to_pydantic_type(
                 root_schema,
                 name_=name_,
                 enrich_descriptions=enrich_descriptions,
+                in_progress=in_progress,
             )
+        if in_progress is not None:
+            return _build_model_from_schema(
+                json_schema,
+                root_schema,
+                model_name=name_,
+                enrich_descriptions=enrich_descriptions,
+                in_progress=in_progress,
+            )
         merged = _merge_all_of_schemas(all_of_schemas, root_schema)
         return _json_schema_to_pydantic_type(
@@ -965,6 +1156,7 @@ def _json_schema_to_pydantic_type(
             root_schema,
             name_=name_,
            enrich_descriptions=enrich_descriptions,
+            in_progress=in_progress,
         )
     type_ = json_schema.get("type")
@@ -985,12 +1177,21 @@ def _json_schema_to_pydantic_type(
                 root_schema,
                 name_=name_,
                 enrich_descriptions=enrich_descriptions,
+                in_progress=in_progress,
             )
             return list[item_type]  # type: ignore[valid-type]
         return list
     if type_ == "object":
         properties = json_schema.get("properties")
         if properties:
+            if in_progress is not None:
+                return _build_model_from_schema(
+                    json_schema,
+                    root_schema,
+                    model_name=name_,
+                    enrich_descriptions=enrich_descriptions,
+                    in_progress=in_progress,
+                )
             json_schema_ = json_schema.copy()
             if json_schema_.get("title") is None:
                 json_schema_["title"] = name_ or "DynamicModel"

View File

@@ -1,3 +1,3 @@
 """CrewAI development tools."""
-__version__ = "1.14.2a4"
+__version__ = "1.14.2rc1"

View File

@@ -162,7 +162,7 @@ info = "Commits must follow Conventional Commits 1.0.0."
 [tool.uv]
-exclude-newer = "3 days"
+exclude-newer = "1 day"
 # composio-core pins rich<14 but textual requires rich>=14.
 # onnxruntime 1.24+ dropped Python 3.10 wheels; cap it so qdrant[fastembed] resolves on 3.10.
@@ -170,8 +170,9 @@ exclude-newer = "3 days"
 # langchain-core <1.2.28 has GHSA-926x-3r5x-gfhw (incomplete f-string validation).
 # transformers 4.57.6 has CVE-2026-1839; force 5.4+ (docling 2.84 allows huggingface-hub>=1).
 # cryptography 46.0.6 has CVE-2026-39892; force 46.0.7+.
-# pypdf <6.10.0 has CVE-2026-40260; force 6.10.0+.
+# pypdf <6.10.1 has CVE-2026-40260 and GHSA-jj6c-8h6c-hppx; force 6.10.1+.
 # uv <0.11.6 has GHSA-pjjw-68hj-v9mw; force 0.11.6+.
+# python-multipart <0.0.26 has GHSA-mj87-hwqh-73pj; force 0.0.26+.
 override-dependencies = [
     "rich>=13.7.1",
     "onnxruntime<1.24; python_version < '3.11'",
@@ -180,8 +181,9 @@ override-dependencies = [
     "urllib3>=2.6.3",
     "transformers>=5.4.0; python_version >= '3.10'",
     "cryptography>=46.0.7",
-    "pypdf>=6.10.0,<7",
+    "pypdf>=6.10.1,<7",
     "uv>=0.11.6,<1",
+    "python-multipart>=0.0.26,<1",
 ]
 [tool.uv.workspace]
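The security floors pinned above can be spot-checked at runtime. Below is a hedged sketch using only the standard library: the floor tuples are copied from the overrides, while `numeric_prefix` and `audit` are hypothetical helpers, not part of the repo, and no substitute for the pip-audit run in CI.

```python
from importlib.metadata import PackageNotFoundError, version

# Security floors copied from the override-dependencies above.
FLOORS = {"pypdf": (6, 10, 1), "python-multipart": (0, 0, 26)}


def numeric_prefix(v: str) -> tuple[int, ...]:
    """Parse the leading numeric dotted components of a version string.

    Deliberately naive: stops at the first non-numeric component, so
    pre-release suffixes are ignored. Good enough for a spot check,
    not a replacement for a real version parser.
    """
    parts: list[int] = []
    for piece in v.split("."):
        digits = ""
        for ch in piece:
            if ch.isdigit():
                digits += ch
            else:
                break
        if not digits:
            break
        parts.append(int(digits))
    return tuple(parts)


def audit(floors=FLOORS):
    """Return the names of installed packages below their security floor."""
    below = []
    for pkg, floor in floors.items():
        try:
            installed = numeric_prefix(version(pkg))
        except PackageNotFoundError:
            continue  # package not installed in this environment
        if installed < floor:
            below.append(pkg)
    return below
```

Tuple comparison gives the expected ordering for dotted versions, e.g. `numeric_prefix("0.0.24") < (0, 0, 26)` holds, which is exactly why the vulnerable python-multipart release would be flagged.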

uv.lock generated
View File

@@ -13,8 +13,8 @@ resolution-markers = [
 ]
 [options]
-exclude-newer = "2026-04-10T18:30:59.748668Z"
-exclude-newer-span = "P3D"
+exclude-newer = "2026-04-15T02:49:11.767508106Z"
+exclude-newer-span = "P1D"
 [manifest]
 members = [
@@ -28,7 +28,8 @@ overrides = [
     { name = "langchain-core", specifier = ">=1.2.28,<2" },
     { name = "onnxruntime", marker = "python_full_version < '3.11'", specifier = "<1.24" },
     { name = "pillow", specifier = ">=12.1.1" },
-    { name = "pypdf", specifier = ">=6.10.0,<7" },
+    { name = "pypdf", specifier = ">=6.10.1,<7" },
+    { name = "python-multipart", specifier = ">=0.0.26,<1" },
     { name = "rich", specifier = ">=13.7.1" },
     { name = "transformers", marker = "python_full_version >= '3.10'", specifier = ">=5.4.0" },
     { name = "urllib3", specifier = ">=2.6.3" },
@@ -3595,7 +3596,7 @@ sdist = { url = "https://files.pythonhosted.org/packages/0e/72/a3add0e4eec4eb9e2
 [[package]]
 name = "langsmith"
-version = "0.7.30"
+version = "0.7.31"
 source = { registry = "https://pypi.org/simple" }
 dependencies = [
     { name = "httpx" },
@@ -3608,9 +3609,9 @@ dependencies = [
     { name = "xxhash" },
     { name = "zstandard" },
 ]
-sdist = { url = "https://files.pythonhosted.org/packages/46/e7/d27d952ce9824d684a3bb500a06541a2d55734bc4d849cdfcca2dfd4d93a/langsmith-0.7.30.tar.gz", hash = "sha256:d9df7ba5e42f818b63bda78776c8f2fc853388be3ae77b117e5d183a149321a2", size = 1106040, upload-time = "2026-04-09T21:12:01.892Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/e6/11/696019490992db5c87774dc20515529ef42a01e1d770fb754ed6d9b12fb0/langsmith-0.7.31.tar.gz", hash = "sha256:331ee4f7c26bb5be4022b9859b7d7b122cbf8c9d01d9f530114c1914b0349ffb", size = 1178480, upload-time = "2026-04-14T17:55:41.242Z" }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/37/19/96250cf58070c5563446651b03bb76c2eb5afbf08e754840ab639532d8c6/langsmith-0.7.30-py3-none-any.whl", hash = "sha256:43dd9f8d290e4d406606d6cc0bd62f5d1050963f05fe0ab6ffe50acf41f2f55a", size = 372682, upload-time = "2026-04-09T21:12:00.481Z" },
+    { url = "https://files.pythonhosted.org/packages/1d/a1/a013cf458c301cda86a213dd153ce0a01c93f1ab5833f951e6a44c9763ce/langsmith-0.7.31-py3-none-any.whl", hash = "sha256:0291d49203f6e80dda011af1afda61eb0595a4d697adb684590a8805e1d61fb6", size = 373276, upload-time = "2026-04-14T17:55:39.677Z" },
 ]
 [[package]]
@@ -6727,14 +6728,14 @@ wheels = [
 [[package]]
 name = "pypdf"
-version = "6.10.0"
+version = "6.10.1"
 source = { registry = "https://pypi.org/simple" }
 dependencies = [
     { name = "typing-extensions", marker = "python_full_version < '3.11'" },
 ]
-sdist = { url = "https://files.pythonhosted.org/packages/b8/9f/ca96abf18683ca12602065e4ed2bec9050b672c87d317f1079abc7b6d993/pypdf-6.10.0.tar.gz", hash = "sha256:4c5a48ba258c37024ec2505f7e8fd858525f5502784a2e1c8d415604af29f6ef", size = 5314833, upload-time = "2026-04-10T09:34:57.102Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/66/79/f2730c42ec7891a75a2fcea2eb4f356872bcbc671b711418060424796612/pypdf-6.10.1.tar.gz", hash = "sha256:62e6ca7f65aaa28b3d192addb44f97296e4be1748f57ed0f4efb2d4915841880", size = 5315704, upload-time = "2026-04-14T12:55:20.996Z" }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/55/f2/7ebe366f633f30a6ad105f650f44f24f98cb1335c4157d21ae47138b3482/pypdf-6.10.0-py3-none-any.whl", hash = "sha256:90005e959e1596c6e6c84c8b0ad383285b3e17011751cedd17f2ce8fcdfc86de", size = 334459, upload-time = "2026-04-10T09:34:54.966Z" },
+    { url = "https://files.pythonhosted.org/packages/f0/04/e3aa7f1f14dbc53429cae34666261eb935d99bd61d24756ab94d7e0309da/pypdf-6.10.1-py3-none-any.whl", hash = "sha256:6331940d3bfe75b7e6601d35db7adabab5fc1d716efaeb384e3c0c3957d033de", size = 335606, upload-time = "2026-04-14T12:55:18.941Z" },
 ]
 [[package]]
@@ -6988,11 +6989,11 @@ wheels = [
 [[package]]
 name = "python-multipart"
-version = "0.0.24"
+version = "0.0.26"
 source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/8a/45/e23b5dc14ddb9918ae4a625379506b17b6f8fc56ca1d82db62462f59aea6/python_multipart-0.0.24.tar.gz", hash = "sha256:9574c97e1c026e00bc30340ef7c7d76739512ab4dfd428fec8c330fa6a5cc3c8", size = 37695, upload-time = "2026-04-05T20:49:13.829Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/88/71/b145a380824a960ebd60e1014256dbb7d2253f2316ff2d73dfd8928ec2c3/python_multipart-0.0.26.tar.gz", hash = "sha256:08fadc45918cd615e26846437f50c5d6d23304da32c341f289a617127b081f17", size = 43501, upload-time = "2026-04-10T14:09:59.473Z" }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/a3/73/89930efabd4da63cea44a3f438aeb753d600123570e6d6264e763617a9ce/python_multipart-0.0.24-py3-none-any.whl", hash = "sha256:9b110a98db707df01a53c194f0af075e736a770dc5058089650d70b4a182f950", size = 24420, upload-time = "2026-04-05T20:49:12.555Z" },
+    { url = "https://files.pythonhosted.org/packages/9a/22/f1925cdda983ab66fc8ec6ec8014b959262747e58bdca26a4e3d1da29d56/python_multipart-0.0.26-py3-none-any.whl", hash = "sha256:c0b169f8c4484c13b0dcf2ef0ec3a4adb255c4b7d18d8e420477d2b1dd03f185", size = 28847, upload-time = "2026-04-10T14:09:58.131Z" },
 ]
[[package]]