Commit Graph

86 Commits

Author SHA1 Message Date
Lorenze Jay
b1fdcdfa6e chore: update dependencies and version in project files (#3212)
- Updated `crewai-tools` dependency from `0.55.0` to `0.58.0` in `pyproject.toml` and `uv.lock`.
- Added new packages `anthropic`, `browserbase`, `playwright`, `pyee`, and `stagehand` with their respective versions in `uv.lock`.
- Bumped the version of the CrewAI library from `0.148.0` to `0.150.0` in `__init__.py`.
- Updated dependency versions in CLI templates for crew, flow, and tool projects to reflect the new CrewAI version.
2025-07-23 11:03:50 -07:00
Lucas Gomide
2fd99503ed build: upgrade LiteLLM to 1.74.3 (#3199) 2025-07-21 09:58:47 -04:00
Vini Brasil
9737333ffd Use file lock around Chroma client initialization (#3181)
This commit fixes a bug with concurrent processes and Chroma where
errors such as `table collections already exists` were raised.

https://cookbook.chromadb.dev/core/system_constraints/
2025-07-17 11:50:45 -03:00
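A minimal sketch of the pattern described above, assuming the `filelock` package and Chroma's `PersistentClient`; the lock path and helper are illustrative rather than CrewAI's actual implementation:

```python
import chromadb
from filelock import FileLock


def get_chroma_client(persist_dir: str):
    """Create the Chroma client while holding a cross-process file lock.

    Serializing initialization avoids races such as
    "table collections already exists" when several workers start at once.
    """
    lock = FileLock(f"{persist_dir}/.chroma_init.lock")
    with lock:  # only one process initializes Chroma at a time
        return chromadb.PersistentClient(path=persist_dir)
```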
Lorenze Jay
2490e8cd46 Update CrewAI version to 0.148.0 in project templates and dependencies (#3172)
* Update CrewAI version to 0.148.0 in project templates and dependencies

* Update crewai-tools dependency to version 0.55.0 in pyproject.toml and uv.lock for improved functionality and performance.
2025-07-16 12:36:43 -07:00
Lorenze Jay
7b0f3aabd9 chore: update crewAI and dependencies to version 0.141.0 and 0.51.0 (#3128)
- Bump crewAI version to 0.141.0 in __init__.py for alignment with updated dependencies.
- Update `crewai-tools` dependency version to 0.51.0 in pyproject.toml and related template files.
- Add new testing dependencies: pytest-split and pytest-xdist for improved test execution.
- Ensure compatibility with the latest package versions in uv.lock and template files.
2025-07-09 10:37:06 -07:00
Lorenze Jay
748c25451c Lorenze/new version 0.140.0 (#3106)
* fix: clean up whitespace and update dependencies

* Removed unnecessary whitespace in multiple files for consistency.
* Updated `crewai-tools` dependency version to `0.49.0` in `pyproject.toml` and related template files.
* Bumped CrewAI version to `0.140.0` in `__init__.py` for alignment with updated dependencies.

* chore: update pyproject.toml to exclude documentation from build targets

* Added exclusions for the `docs` directory in both wheel and sdist build targets to streamline the build process and reduce unnecessary file inclusion.

* chore: update uv.lock for dependency resolution and Python version compatibility

* Incremented revision to 2.
* Updated resolution markers to include support for Python 3.13 and adjusted platform checks for better compatibility.
* Added new wheel URLs for zstandard version 0.23.0 to ensure availability across various platforms.

* chore: pin json-repair dependency version in pyproject.toml and uv.lock

* Updated json-repair dependency from a range to a specific version (0.25.2) for consistency and to avoid potential compatibility issues.
* Adjusted related entries in uv.lock to reflect the pinned version, ensuring alignment across project files.

* chore: pin agentops dependency version in pyproject.toml and uv.lock

* Updated agentops dependency from a range to a specific version (0.3.18) for consistency and to avoid potential compatibility issues.
* Adjusted related entries in uv.lock to reflect the pinned version, ensuring alignment across project files.

* test: enhance cache call assertions in crew tests

* Improved the test for cache hits between agents by filtering mock calls to ensure they include the expected 'tool' and 'input' keywords.
* Added assertions to verify the number of cache calls and their expected arguments, enhancing the reliability of the test.
* Cleaned up whitespace and improved readability in various test cases for better maintainability.
2025-07-02 15:22:18 -07:00
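As a rough illustration of the cache-call filtering mentioned in the test bullet above (the mock, tool name, and expected count are hypothetical, not the merged test):

```python
from unittest.mock import MagicMock

cache = MagicMock()
# Stand-in for the crew run: in the real test both agents hit the cached tool.
cache.add(tool="multiplier", input="2,6", output="12")  # hypothetical call

# Keep only the calls that carry the expected 'tool' and 'input' keywords.
cache_calls = [
    call for call in cache.add.call_args_list
    if "tool" in call.kwargs and "input" in call.kwargs
]
assert len(cache_calls) == 1
assert cache_calls[0].kwargs["tool"] == "multiplier"
```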
Heitor Carvalho
a77dcdd419 feat: add multiple provider support (#3089)
* Remove `crewai signup` command, update docs

* Add `Settings.clear()` and clear settings before each login

* Add pyjwt

* Remove print statement from ToolCommand.login()

* Remove auth0 dependency

* Update docs
2025-07-02 16:44:47 -04:00
Michael Juliano
576b8ff836 Updated LiteLLM dependency. (#3047)
* Updated LiteLLM dependency.

This moves to the latest stable release. Critically, this includes a fix
from https://github.com/BerriAI/litellm/pull/11563 which is required to
use grok-3-mini with crewAI.

* Ran `uv sync` as requested.
2025-06-27 09:54:12 -04:00
Lorenze Jay
0f861338ef chore: update version to 0.134.0 across project files (#3067)
2025-06-25 16:06:43 -07:00
Lucas Gomide
4d1aabf620 feat: enhance CrewBase MCP tools support to allow selecting multiple tools per agent (#3065)
* feat: enhance CrewBase MCP tools support to allow selecting multiple tools per agent

* docs: clarify how to access MCP tools

* build: upgrade crewai-tools
2025-06-25 14:59:55 -04:00
Daniel Barreto
4364585ebc Remove mkdocs from project dependencies (#3036)
CrewAI has been using https://mintlify.com/ to serve its docs.
2025-06-19 11:21:08 -04:00
Lorenze Jay
99133104dd Update version to 0.130.0 and dependencies in pyproject.toml and uv.lock (#3002)
- Bump CrewAI version from 0.126.0 to 0.130.0 in pyproject.toml and uv.lock.
- Update optional dependency 'crewai-tools' version from 0.46.0 to 0.47.1.
- Adjust dependency specifications in CLI templates to reflect the new version.
2025-06-11 17:01:11 -07:00
Lucas Gomide
e6ac1311e7 build: upgrade LiteLLM to support latest Openai version (#2963)
Co-authored-by: Tony Kipkemboi <iamtonykipkemboi@gmail.com>
2025-06-09 08:55:12 -04:00
Lorenze Jay
ba740c6157 Update version to 0.126.0 and dependencies in pyproject.toml and lock files (#2961)
2025-06-04 17:49:07 -07:00
Lucas Gomide
66b7628972 Support Python 3.13 (#2844)
* ci: support python 3.13 on CI

* docs: update docs about support python version

* build: adds requires python <3.14

* build: explicit tokenizers dependency

Added an explicit tokenizers>=0.20.3 dependency to ensure a version compatible with Python 3.13 is used.

* build: drop fastembed since it is no longer used

* build: attempt to build PyTorch on Python 3.13

* feat: upgrade fastavro, pyarrow and lancedb

* build: ensure tiktoken greater than 0.8.0 due to Python 3.13 compatibility
2025-06-02 18:12:24 -04:00
Lorenze Jay
bcc694348e chore: Bump version to 0.121.1 in project files and update dependencies (#2912)
2025-05-27 10:46:20 -07:00
Lorenze Jay
e59627adf2 Update version to 0.121.0 across project files and dependencies (#2879)
2025-05-21 18:17:19 -07:00
Lorenze Jay
c566747d4a patch version 0.120.1
2025-05-14 17:34:07 -07:00
Lorenze Jay
3a114463f9 Update version to 0.120.0 and dependencies in pyproject.toml and uv.lock files (#2835) 2025-05-14 16:48:21 -07:00
Lorenze Jay
516d45deaa chore: bump version to 0.119.0 and update dependencies (#2778)
This commit updates the project version to 0.119.0 and modifies the required version of the `crewai-tools` dependency to 0.44.0 across various configuration files. Additionally, the version number is reflected in the `__init__.py` file and the CLI templates for crew, flow, and tool projects.
2025-05-07 17:29:41 -07:00
Lucas Gomide
dabf02a90d build(LiteLLM): upgrade LiteLLM version (#2757)
2025-05-05 17:07:29 -04:00
Lucas Gomide
2902201bfa pytest improvements to handle flaky test (#2726)
* build(dev): add pytest-randomly dependency

By randomizing the test execution order, this helps identify tests
that unintentionally depend on shared state or specific execution
order, which can lead to flaky or unreliable test behavior.

* build(dev): add pytest-timeout

This will prevent a test from running indefinitely

* test: block external requests in CI and set default 10s timeout per test

* test: adding missing cassettes

We noticed that these cassettes were missing after enabling block-network on CI.

* test: increase tests timeout on CI

* test: fix flaky test ValueError: Circular reference detected (id repeated)

* fix: prevent crash when event handler raises exception

Previously, if a registered event handler raised an exception during execution,
it could crash the entire application or interrupt the event dispatch process.
This change wraps handler execution in a try/except block within the `emit` method,
ensuring that exceptions are caught and logged without affecting other handlers or flow.

This improves the resilience of the event bus, especially when handling third-party
or temporary listeners.
2025-05-01 15:48:29 -04:00
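A hedged sketch of the try/except wrapping described in the last item above; the class, method, and logger names are illustrative, not CrewAI's actual event bus:

```python
import logging
from collections import defaultdict
from typing import Any, Callable

logger = logging.getLogger(__name__)


class EventBus:
    def __init__(self) -> None:
        self._handlers: dict[str, list[Callable[..., Any]]] = defaultdict(list)

    def on(self, event: str, handler: Callable[..., Any]) -> None:
        self._handlers[event].append(handler)

    def emit(self, event: str, **payload: Any) -> None:
        for handler in self._handlers[event]:
            try:
                handler(**payload)
            except Exception:
                # A failing handler is logged but does not interrupt
                # dispatch to the remaining handlers.
                logger.exception("Event handler %r failed for %r", handler, event)
```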
Lorenze Jay
378dcc79bb prepping new version (#2733)
2025-04-30 14:57:54 -04:00
Lucas Gomide
1c2976c4d1 build: downgrade litellm to 1.167.1 (#2711)
The version 1.167.2 is not compatible with Windows
2025-04-30 09:23:14 -04:00
João Moura
4f6054d439 new version
2025-04-28 07:39:38 -07:00
Dev Khant
a86a1213c7 Fix Mem0 OSS (#2604)
* Fix Mem0 OSS

* add test

* fix lint and tests

* fix

* add tests

* drop test

* changed to class comparison

* fixed test cases

* Update src/crewai/memory/storage/mem0_storage.py

* Update src/crewai/memory/storage/mem0_storage.py

* fix

* fix lock file

---------

Co-authored-by: Vidit-Ostwal <viditostwal@gmail.com>
2025-04-28 10:37:31 -04:00
Lucas Gomide
566935fb94 upgrade liteLLM to latest version (#2684)
* build(litellm): upgrade LiteLLM to latest version

* fix: update filtered logs from LiteLLM

* Fix for a missing backtick

---------

Co-authored-by: Mike Plachta <mike@crewai.com>
Co-authored-by: Lorenze Jay <63378463+lorenzejay@users.noreply.github.com>
2025-04-28 09:46:40 -04:00
Lucas Gomide
3a66746a99 build: upgrade crewai-tools (#2705)
* build: upgrade crewai-tools

* build: prepare new version
2025-04-28 06:38:56 -07:00
João Moura
337a6d5719 preparing new version
2025-04-27 23:56:22 -07:00
Lucas Gomide
ced3c8f0e0 Unblock LLM(stream=True) to work with tools (#2582)
* feat: unblock LLM(stream=True) to work with tools

* feat: replace pytest-vcr by pytest-recording

1. pytest-vcr does not support httpx - which LiteLLM uses for streaming responses.
2. pytest-vcr is no longer maintained; its last commit was 6 years ago.
3. pytest-recording supports modern request libraries (including httpx) and is actively maintained.

* refactor: remove @skip_streaming_in_ci

Since we have fixed streaming response issue we can remove this @skip_streaming_in_ci

---------

Co-authored-by: Lorenze Jay <63378463+lorenzejay@users.noreply.github.com>
2025-04-17 11:58:52 -04:00
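A minimal usage sketch of the now-unblocked combination; `LLM(stream=True)` comes from the commit title, while the model name, tool decorator import, and agent fields are assumptions:

```python
from crewai import Agent, LLM
from crewai.tools import tool  # decorator location is an assumption


@tool("Adder")
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b


streaming_llm = LLM(model="openai/gpt-4o-mini", stream=True)
agent = Agent(
    role="Calculator",
    goal="Answer arithmetic questions with the Adder tool",
    backstory="A precise assistant",
    llm=streaming_llm,
    tools=[add],
)
```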
João Moura
98ccbeb4bd new version 2025-04-09 18:13:41 -07:00
Vini Brasil
97d4439872 Bump crewai-tools to v0.40.1 (#2554) 2025-04-09 11:24:43 -04:00
lucasgomide
6e209d5d77 chore(deps): pin crewai-tools to compatible version ~=0.38.0
fixes [issue](https://github.com/crewAIInc/crewAI/issues/2390)
2025-03-27 16:36:08 -03:00
Devin AI
12a815e5db Fix #2351: Sanitize collection names to meet ChromaDB requirements
Co-Authored-By: Joe Moura <joao@crewai.com>
2025-03-26 12:02:17 -03:00
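One possible sanitizer in the spirit of that fix, assuming the commonly documented ChromaDB constraints (3-63 characters, alphanumeric start and end, and only alphanumerics, dots, underscores, or hyphens); this is an illustration, not the merged code:

```python
import re


def sanitize_collection_name(name: str) -> str:
    """Best-effort normalization toward ChromaDB's collection-name rules."""
    # Replace disallowed characters with underscores.
    cleaned = re.sub(r"[^a-zA-Z0-9._-]", "_", name)
    # Enforce the 63-character maximum and alphanumeric start/end.
    cleaned = cleaned[:63].strip("._-")
    # Pad very short results so the 3-character minimum is met.
    if len(cleaned) < 3:
        cleaned = (cleaned + "col")[:3]
    return cleaned


print(sanitize_collection_name("my crew/agent memory"))  # my_crew_agent_memory
```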
devin-ai-integration[bot]
807c13e144 Add support for custom LLM implementations (#2277)
* Add support for custom LLM implementations

Co-Authored-By: Joe Moura <joao@crewai.com>

* Fix import sorting and type annotations

Co-Authored-By: Joe Moura <joao@crewai.com>

* Fix linting issues with import sorting

Co-Authored-By: Joe Moura <joao@crewai.com>

* Fix type errors in crew.py by updating tool-related methods to return List[BaseTool]

Co-Authored-By: Joe Moura <joao@crewai.com>

* Enhance custom LLM implementation with better error handling, documentation, and test coverage

Co-Authored-By: Joe Moura <joao@crewai.com>

* Refactor LLM module by extracting BaseLLM to a separate file

This commit moves the BaseLLM abstract base class from llm.py to a new file llms/base_llm.py to improve code organization. The changes include:

- Creating a new file src/crewai/llms/base_llm.py
- Moving the BaseLLM class to the new file
- Updating imports in __init__.py and llm.py to reflect the new location
- Updating test cases to use the new import path

The refactoring maintains the existing functionality while improving the project's module structure.

* Add AISuite LLM support and update dependencies

- Integrate AISuite as a new third-party LLM option
- Update pyproject.toml and uv.lock to include aisuite package
- Modify BaseLLM to support more flexible initialization
- Remove unnecessary LLM imports across multiple files
- Implement AISuiteLLM with basic chat completion functionality

* Update AISuiteLLM and LLM utility type handling

- Modify AISuiteLLM to support more flexible input types for messages
- Update type hints in AISuiteLLM to allow string or list of message dictionaries
- Enhance LLM utility function to support broader LLM type annotations
- Remove default `self.stop` attribute from BaseLLM initialization

* Update LLM imports and type hints across multiple files

- Modify imports in crew_chat.py to use LLM instead of BaseLLM
- Update type hints in llm_utils.py to use LLM type
- Add optional `stop` parameter to BaseLLM initialization
- Refactor type handling for LLM creation and usage

* Improve stop words handling in CrewAgentExecutor

- Add support for handling existing stop words in LLM configuration
- Ensure stop words are correctly merged and deduplicated
- Update type hints to support both LLM and BaseLLM types

* Remove abstract method set_callbacks from BaseLLM class

* Enhance CustomLLM and JWTAuthLLM initialization with model parameter

- Update CustomLLM to accept a model parameter during initialization
- Modify test cases to include the new model argument
- Ensure JWTAuthLLM and TimeoutHandlingLLM also utilize the model parameter in their constructors
- Update type hints in create_llm function to support both LLM and BaseLLM types

* Enhance create_llm function to support BaseLLM type

- Update the create_llm function to accept both LLM and BaseLLM instances
- Ensure compatibility with existing LLM handling logic

* Update type hint for initialize_chat_llm to support BaseLLM

- Modify the return type of initialize_chat_llm function to allow for both LLM and BaseLLM instances
- Ensure compatibility with recent changes in create_llm function

* Refactor AISuiteLLM to include tools parameter in completion methods

- Update the _prepare_completion_params method to accept an optional tools parameter
- Modify the chat completion method to utilize the new tools parameter for enhanced functionality
- Clean up print statements for better code clarity

* Remove unused tool_calls handling in AISuiteLLM chat completion method for cleaner code.

* Refactor Crew class and LLM hierarchy for improved type handling and code clarity

- Update Crew class methods to enhance readability with consistent formatting and type hints.
- Change LLM class to inherit from BaseLLM for better structure.
- Remove unnecessary type checks and streamline tool handling in CrewAgentExecutor.
- Adjust BaseLLM to provide default implementations for stop words and context window size methods.
- Clean up AISuiteLLM by removing unused methods related to stop words and context window size.

* Remove unused `stream` method from `BaseLLM` class to enhance code clarity and maintainability.

---------

Co-authored-by: Devin AI <158243242+devin-ai-integration[bot]@users.noreply.github.com>
Co-authored-by: Joe Moura <joao@crewai.com>
Co-authored-by: Lorenze Jay <lorenzejaytech@gmail.com>
Co-authored-by: João Moura <joaomdmoura@gmail.com>
Co-authored-by: Brandon Hancock (bhancock_ai) <109994880+bhancockio@users.noreply.github.com>
2025-03-25 12:39:08 -04:00
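A hedged sketch of a custom LLM built on the extracted `BaseLLM`; the import path follows the refactor described above, but the constructor, the `call` signature, and the HTTP request shape are assumptions:

```python
from typing import Any, Optional

import requests

from crewai.llms.base_llm import BaseLLM  # new location per the refactor above


class ProxyLLM(BaseLLM):
    """Illustrative custom LLM that forwards prompts to an internal gateway."""

    def __init__(self, model: str, endpoint: str, stop: Optional[list[str]] = None):
        super().__init__(model=model, stop=stop)
        self.endpoint = endpoint

    def call(self, messages: Any, **kwargs: Any) -> str:
        # Assumed request/response shape; a real gateway will differ.
        response = requests.post(
            self.endpoint,
            json={"model": self.model, "messages": messages},
            timeout=30,
        )
        response.raise_for_status()
        return response.json()["text"]
```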
ayulockin
d2a9a4a4e4 Revert "remove uv.lock"
This reverts commit e62e9c7401.
2025-03-25 19:05:58 +05:30
ayulockin
e62e9c7401 remove uv.lock 2025-03-25 19:04:51 +05:30
ayulockin
2c550dc175 add weave docs 2025-03-25 15:46:41 +05:30
João Moura
e723e5ca3f preparing new version 2025-03-17 09:13:21 -07:00
João Moura
435bfca186 preparing new version 2025-03-09 04:24:05 -07:00
João Moura
d3b398ed52 preparing new version 2025-02-12 18:16:48 -05:00
João Moura
abee94d056 fix version 2025-02-05 21:19:28 -08:00
Brandon Hancock
748383d74c update litellm to support o3-mini and deepseek. Update docs. 2025-02-04 10:58:34 -05:00
Brandon Hancock
cba8c9faec update litellm 2025-01-28 12:23:06 -05:00
João Moura
c310044bec preparing new version 2025-01-28 10:29:53 -03:00
Brandon Hancock (bhancock_ai)
dea6ed7ef0 fix issue pointed out by mike (#1986)
* fix issue pointed out by mike

* clean up

* Drop logger

* drop unused imports
2025-01-27 17:35:17 -05:00
João Moura
cc018bf128 updating tools version 2025-01-19 00:36:19 -08:00
João Moura
4a44245de9 preparing new version 2025-01-18 10:18:56 -08:00
devin-ai-integration[bot]
294f2cc3a9 Add @persist decorator with FlowPersistence interface (#1892)
* Add @persist decorator with SQLite persistence

- Add FlowPersistence abstract base class
- Implement SQLiteFlowPersistence backend
- Add @persist decorator for flow state persistence
- Add tests for flow persistence functionality

Co-Authored-By: Joe Moura <joao@crewai.com>

* Fix remaining merge conflicts in uv.lock

- Remove stray merge conflict markers
- Keep main's comprehensive platform-specific resolution markers
- Preserve all required dependencies for persistence functionality

Co-Authored-By: Joe Moura <joao@crewai.com>

* Fix final CUDA dependency conflicts in uv.lock

- Resolve NVIDIA CUDA solver dependency conflicts
- Use main's comprehensive platform checks
- Ensure all merge conflict markers are removed
- Preserve persistence-related dependencies

Co-Authored-By: Joe Moura <joao@crewai.com>

* Fix nvidia-cusparse-cu12 dependency conflicts in uv.lock

- Resolve NVIDIA CUSPARSE dependency conflicts
- Use main's comprehensive platform checks
- Complete systematic check of entire uv.lock file
- Ensure all merge conflict markers are removed

Co-Authored-By: Joe Moura <joao@crewai.com>

* Fix triton filelock dependency conflicts in uv.lock

- Resolve triton package filelock dependency conflict
- Use main's comprehensive platform checks
- Complete final systematic check of entire uv.lock file
- Ensure TOML file structure is valid

Co-Authored-By: Joe Moura <joao@crewai.com>

* Fix merge conflict in crew_test.py

- Remove duplicate assertion in test_multimodal_agent_live_image_analysis
- Clean up conflict markers
- Preserve test functionality

Co-Authored-By: Joe Moura <joao@crewai.com>

* Clean up trailing merge conflict marker in crew_test.py

- Remove remaining conflict marker at end of file
- Preserve test functionality
- Complete conflict resolution

Co-Authored-By: Joe Moura <joao@crewai.com>

* Improve type safety in persistence implementation and resolve merge conflicts

Co-Authored-By: Joe Moura <joao@crewai.com>

* fix: Add explicit type casting in _create_initial_state method

Co-Authored-By: Joe Moura <joao@crewai.com>

* fix: Improve type safety in flow state handling with proper validation

Co-Authored-By: Joe Moura <joao@crewai.com>

* fix: Improve type system with proper TypeVar scoping and validation

Co-Authored-By: Joe Moura <joao@crewai.com>

* fix: Improve state restoration logic and add comprehensive tests

Co-Authored-By: Joe Moura <joao@crewai.com>

* fix: Initialize FlowState instances without passing id to constructor

Co-Authored-By: Joe Moura <joao@crewai.com>

* feat: Add class-level flow persistence decorator with SQLite default

- Add class-level @persist decorator support
- Set SQLiteFlowPersistence as default backend
- Use db_storage_path for consistent database location
- Improve async method handling and type safety
- Add comprehensive docstrings and examples

Co-Authored-By: Joe Moura <joao@crewai.com>

* fix: Sort imports in decorators.py to fix lint error

Co-Authored-By: Joe Moura <joao@crewai.com>

* style: Organize imports according to PEP 8 standard

Co-Authored-By: Joe Moura <joao@crewai.com>

* style: Format typing imports with line breaks for better readability

Co-Authored-By: Joe Moura <joao@crewai.com>

* style: Simplify import organization to fix lint error

Co-Authored-By: Joe Moura <joao@crewai.com>

* style: Fix import sorting using Ruff auto-fix

Co-Authored-By: Joe Moura <joao@crewai.com>

---------

Co-authored-by: Devin AI <158243242+devin-ai-integration[bot]@users.noreply.github.com>
Co-authored-by: Joe Moura <joao@crewai.com>
Co-authored-by: João Moura <joaomdmoura@gmail.com>
2025-01-16 10:23:46 -03:00
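A brief usage sketch of the class-level decorator described above; the import paths and dict-style state access are assumptions based on the commit description, with SQLite as the stated default backend:

```python
from crewai.flow.flow import Flow, start
from crewai.flow.persistence import persist  # path is an assumption


@persist()  # class-level: state changes are saved to the default SQLite backend
class CounterFlow(Flow):
    @start()
    def bump(self):
        # Unstructured state behaves like a dict and is restored across runs.
        self.state["count"] = self.state.get("count", 0) + 1
        return self.state["count"]


if __name__ == "__main__":
    print(CounterFlow().kickoff())
```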
Brandon Hancock (bhancock_ai)
8ceeec7d36 drop litellm version to prevent windows issue (#1878)
* drop litellm version to prevent windows issue

* Fix failing tests

* Trying to fix tests

* clean up

* Trying to fix tests

* Drop token calc handler changes

* fix failing test

* Fix failing test

---------

Co-authored-by: João Moura <joaomdmoura@gmail.com>
2025-01-14 13:06:47 -05:00