Default Branch

main · d165bcb65f · fix(deps): move textual to crewai-cli and add certifi · Updated 2026-05-06 20:40:08 +00:00

Pending checks on main (push, waiting to run):
- Build uv cache / build-cache (3.10, 3.11, 3.12, 3.13)
- CodeQL Advanced / Analyze (actions, python)
- Check Documentation Broken Links / Check broken links
- Vulnerability Scan / pip-audit

Branches

Each entry lists commit · message · last update · commits behind / ahead of main.

- 2df8973658 · fix: use the internal token usage tracker when using litellm · Updated 2026-01-06 15:06:25 +00:00 · 496 behind / 1 ahead
- 83b07b9d23 · fix: prevent LLM observation hallucination by properly attributing tool results · Updated 2026-01-06 06:57:16 +00:00 · 496 behind / 1 ahead
- 0976c42c6b · fix: track token usage in litellm non-streaming and async calls · Updated 2026-01-03 19:28:08 +00:00 · 498 behind / 1 ahead
- e022caae6c · Fix W293 lint errors in usage_metrics.py docstrings · Updated 2026-01-03 17:37:00 +00:00 · 498 behind / 11 ahead
- 4f13153483 · fix: handle None value for a2a_task_ids_by_endpoint in config · Updated 2026-01-01 08:17:38 +00:00 · 498 behind / 3 ahead
- 4f0b6f6427 · feat(a2a): add async execution support for A2A delegation · Updated 2025-12-30 07:54:43 +00:00 · 500 behind / 1 ahead
- 1c59ff8b96 · Fix output file writes outdated task result after guardrail execution · Updated 2025-12-29 17:37:07 +00:00 · 500 behind / 1 ahead
- 9c5b68f1c5 · feat: Add OpenAI Responses API integration · Updated 2025-12-27 06:19:28 +00:00 · 501 behind / 1 ahead
- 06c4974cb3 · fix: exclude empty stop parameter from LLM completion params · Updated 2025-12-24 07:12:18 +00:00 · 502 behind / 1 ahead
- 9460e5e182 · fix: use huggingface_hub InferenceClient for HuggingFace embeddings · Updated 2025-12-22 22:44:13 +00:00 · 503 behind / 1 ahead
- b8eb7dd294 · fix import · Updated 2025-12-22 06:38:24 +00:00 · 503 behind / 3 ahead
- 029eedfddb · fix: preserve task outputs when mixing sync and async tasks (#4137) · Updated 2025-12-20 15:09:38 +00:00 · 503 behind / 1 ahead
- 47c698dc86 · fix: implement cooperative cancellation for timeout handling · Updated 2025-12-20 15:01:11 +00:00 · 503 behind / 1 ahead
- db4cb93770 · fix: inject MCP tools in standalone agent execution (fixes #4133) · Updated 2025-12-20 13:00:47 +00:00 · 503 behind / 1 ahead
- 52862b4f0f · fix: handle ValidationError in guardrail retry loop (fixes #4126) · Updated 2025-12-19 02:23:57 +00:00 · 506 behind / 1 ahead
- 86523dc306 · fix: support 'embedder' key as alias for 'embedding_model' in RagTool config · Updated 2025-12-18 15:45:05 +00:00 · 507 behind / 1 ahead
- 2fd5c46b42 · Emit FlowFailedEvent and finalise batch via added event listener · Updated 2025-12-18 14:04:46 +00:00 · 507 behind / 4 ahead
- be9dfbc9de · Add todo on TaskStartedEvent model · Updated 2025-12-18 13:58:54 +00:00 · 507 behind / 3 ahead
- c97c5d27de · Prevent crew inside flow to finish trace batch prematurely on failure · Updated 2025-12-18 12:22:37 +00:00 · 507 behind / 1 ahead
- be3e7998ae · fix: merge system messages into user messages for Ollama models · Updated 2025-12-18 11:39:51 +00:00 · 507 behind / 1 ahead