diff --git a/docs/en/guides/migration/upgrading-crewai.mdx b/docs/en/guides/migration/upgrading-crewai.mdx
index 4f9c63f99..a845be8a0 100644
--- a/docs/en/guides/migration/upgrading-crewai.mdx
+++ b/docs/en/guides/migration/upgrading-crewai.mdx
@@ -1,9 +1,22 @@
---
-title: "Upgrading CrewAI in Your Project"
-description: "How to upgrade the crewai version inside your project's virtual environment — and why crewai install alone won't do it."
+title: "Upgrading & Migrating CrewAI"
+description: "How to upgrade CrewAI in your project, migrate around breaking changes, and move standalone Crews onto Flows."
icon: "arrow-up-circle"
---
+## Overview
+
+CrewAI moves quickly. New releases regularly tighten import paths, change defaults on `Agent`, `Crew`, and `Task`, and introduce new orchestration primitives like `Flow` and checkpointing. This guide collects the practical steps needed to:
+
+- Upgrade the global `crewai` CLI and your project's pinned dependency
+- Adapt to breaking changes in imports and parameters
+- Migrate a standalone `Crew` to a typed `Flow`
+- Avoid the gotchas that show up the first time you re-run an upgraded project
+
+If you're starting fresh, see [Installation](/en/installation). If you're coming from another framework, see [Migrating from LangGraph](/en/guides/migration/migrating-from-langgraph).
+
+---
+
## The Two Things You Might Want to Upgrade
CrewAI lives in two places on your machine, and they upgrade independently:
@@ -15,13 +28,14 @@ CrewAI lives in two places on your machine, and they upgrade independently:
These can — and often do — get out of sync. Running `crewai --version` tells you the CLI version. Running `uv pip show crewai` inside your project tells you the venv version. If they differ, that's normal; what matters for your running code is the venv version.
-## Why `crewai install` Doesn't Upgrade
+## Why `crewai install` Alone Doesn't Upgrade
`crewai install` is a thin wrapper around `uv sync`. It installs exactly what the current `uv.lock` file says — it does **not** bump any version constraints.
If your `pyproject.toml` says `crewai>=1.11.1` and the lock file resolved to `1.11.1`, running `crewai install` will keep you on `1.11.1` forever, even if `1.14.4` is available.
To actually upgrade, you need to:
+
1. Update the version constraint in `pyproject.toml`
2. Re-solve the lock file
3. Sync the venv
@@ -48,6 +62,22 @@ Replace `[tools]` with whatever extras your project uses (e.g. `[tools,anthropic
`uv add` updates both `pyproject.toml` **and** `uv.lock` atomically. If you edit `pyproject.toml` manually, you still need to run `uv lock --upgrade-package crewai` to re-solve the lock file before `crewai install` will pick up the new version.
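+
+If you prefer to edit `pyproject.toml` by hand, the equivalent sequence is, as a sketch:
+
+```bash
+# After bumping the constraint in pyproject.toml manually:
+uv lock --upgrade-package crewai   # re-solve the lock file
+crewai install                     # sync the venv against the new lock
+```
+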
+## Upgrading the Global CLI
+
+The global CLI is separate from your project. Upgrade it with:
+
+```bash
+uv tool install crewai --upgrade
+```
+
+If your shell warns about `PATH` after the upgrade, refresh it:
+
+```bash
+uv tool update-shell
+```
+
+This does **not** touch your project's venv — you still need `uv add` + `crewai install` inside the project.
+
## Verify Both Are in Sync
```bash
@@ -60,22 +90,270 @@ uv pip show crewai | grep Version
They don't need to match — but your project venv version is what matters for runtime behavior.
-## Common Gotchas
+
+<Note>
+  CrewAI requires `Python >=3.10, <3.14`. If the project venv was created with an unsupported interpreter, recreate it with a supported Python before running `crewai install`.
+</Note>
+
-**The constraint is a floor, not a pin.** `crewai>=1.11.1` means "any version at or above 1.11.1." uv will pick the highest compatible version when re-locking — but only if you explicitly re-lock with `uv add` or `uv lock --upgrade-package crewai`.
+---
-**You have a `uv.lock` file.** If you commit `uv.lock`, your teammates get the exact same versions. After bumping with `uv add`, commit the updated `uv.lock` too.
+## Breaking Changes & Migration Notes
-**Extras must be consistent.** If you run `uv add "crewai>=1.14.4"` without extras, uv may drop `[tools]` from the resolved set. Always include the extras you need: `uv add "crewai[tools]>=1.14.4"`.
+Most upgrades only require small adjustments. The areas below are the ones that break silently or with confusing tracebacks.
-**Dependency conflicts.** If uv can't resolve the new version against your other dependencies, it will tell you which package conflicts. Pin or upgrade the conflicting package explicitly.
+### Import paths: tools and `BaseTool`
-## Upgrading the Global CLI
+The canonical import location for tools is `crewai.tools`. Older paths still surface in tutorials but should be updated.
-The global CLI is separate from your project. Upgrade it with:
+```python
+# Before
+from crewai_tools import BaseTool
+from crewai.agents.tools import tool
-```bash
-uv tool install crewai --upgrade
+# After
+from crewai.tools import BaseTool, tool
```
-This does **not** touch your project's venv.
+The `@tool` decorator and the `BaseTool` subclass both live in `crewai.tools`. `AgentFinish` and other agent-internal symbols are no longer part of the public surface; if you were importing them, switch to event listeners or `Task` callbacks instead.
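+
+As a sketch of the canonical pattern (the tool name and body here are illustrative, assuming a recent `crewai` release):
+
+```python
+from crewai.tools import BaseTool, tool
+
+# Function-style tool via the decorator
+@tool("Word Counter")
+def count_words(text: str) -> str:
+    """Count the words in a piece of text."""
+    return str(len(text.split()))
+
+# Class-style tool via subclassing
+class WordCounterTool(BaseTool):
+    name: str = "Word Counter"
+    description: str = "Count the words in a piece of text."
+
+    def _run(self, text: str) -> str:
+        return str(len(text.split()))
+```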
+
+### `Agent` parameter changes
+
+```python
+from crewai import Agent
+
+agent = Agent(
+ role="Researcher",
+ goal="Find authoritative sources on {topic}",
+ backstory="You are a careful, source-driven researcher.",
+ llm="gpt-4o-mini", # string model name OR an LLM object
+ verbose=True, # bool, not an int level
+ max_iter=15, # default has changed across versions — set explicitly
+ allow_delegation=False,
+)
+```
+
+- `llm` accepts either a string model name (resolved via the configured provider) or an `LLM` object for fine-grained control.
+- `verbose` is a plain `bool`. Passing an integer no longer toggles log levels.
+- `max_iter` defaults have shifted between releases. If your agent silently stops looping after the first tool call, set `max_iter` explicitly.
+
+### `Crew` parameters
+
+```python
+from crewai import Crew, Process
+
+crew = Crew(
+ agents=[...],
+ tasks=[...],
+ process=Process.sequential, # or Process.hierarchical
+ memory=True,
+ cache=True,
+ embedder={"provider": "openai", "config": {"model": "text-embedding-3-small"}},
+)
+```
+
+- `process=Process.hierarchical` requires either `manager_llm=` or `manager_agent=`. Without one, kickoff raises at validation time.
+- `memory=True` with a non-default embedding provider needs an `embedder` dict — see [Memory & embedder config](#memory-embedder-config) below.
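+
+A minimal hierarchical setup, as a sketch (the agent, task, and model names are illustrative):
+
+```python
+from crewai import Crew, Process
+
+crew = Crew(
+    agents=[researcher, writer],       # worker agents; the manager is separate
+    tasks=[research_task, write_task],
+    process=Process.hierarchical,
+    manager_llm="gpt-4o",              # or pass manager_agent=<an Agent> instead
+)
+```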
+
+### `Task` structured output
+
+Use `output_pydantic`, `output_json`, or `output_file` to coerce a task's result into a typed shape:
+
+```python
+from pydantic import BaseModel
+from crewai import Task
+
+class Article(BaseModel):
+ title: str
+ body: str
+
+write = Task(
+ description="Write an article about {topic}",
+ expected_output="A short article with a title and body",
+ agent=writer,
+ output_pydantic=Article, # the class, NOT an instance
+ output_file="output/article.md",
+)
+```
+
+`output_pydantic` takes the **class** itself. Passing `Article(title="", body="")` is a common mistake and fails with a confusing validation error.
+
+### Memory & embedder config
+
+If `memory=True` and you're not using the default OpenAI embeddings, you must pass an `embedder`:
+
+```python
+crew = Crew(
+ agents=[...],
+ tasks=[...],
+ memory=True,
+ embedder={
+ "provider": "ollama",
+ "config": {"model": "nomic-embed-text"},
+ },
+)
+```
+
+Set the relevant provider credentials (`OPENAI_API_KEY`, `OLLAMA_HOST`, etc.) in your `.env` file. Memory storage paths are project-local by default — delete the project's memory directory if you change embedders, since dimensions don't mix.
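+
+A hedged sketch of the reset (flags vary across versions; check `crewai reset-memories --help`):
+
+```bash
+# Clear this project's stored memories before re-running with the new embedder
+crewai reset-memories --all
+```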
+
+---
+
+## Migrating a Crew to a Flow
+
+`Crew` is the right primitive when you have a single team of agents executing one workflow. Once you need branching, multiple crews, or persistent state across runs, reach for `Flow`.
+
+### When to use Flows vs standalone Crews
+
+| Situation | Use |
+| --- | --- |
+| Single team, single linear/hierarchical workflow | `Crew` |
+| Conditional branches, retries, routing on results | `Flow` |
+| Multiple specialized crews chained together | `Flow` |
+| State that must persist between steps or runs | `Flow` (with checkpointing) |
+| You want typed, IDE-friendly state | `Flow[MyState]` with a Pydantic model |
+
+If you need even one of branching, multiple crews, or persistent state, start with a `Flow`. The boilerplate is small and you won't have to rewrite later.
+
+### Step-by-step migration
+
+**Before — standalone crew:**
+
+```python
+from crewai import Crew
+
+crew = Crew(agents=[researcher, writer], tasks=[research_task, write_task])
+result = crew.kickoff(inputs={"topic": "vector databases"})
+print(result)
+```
+
+**After — crew inside a typed Flow:**
+
+```python
+from crewai.flow.flow import Flow, start, listen
+from pydantic import BaseModel
+
+class MyState(BaseModel):
+ input_data: str = ""
+ result: str = ""
+
+class MyFlow(Flow[MyState]):
+ @start()
+ def run_crew(self):
+ result = MyCrew().crew().kickoff(inputs={"topic": self.state.input_data})
+ self.state.result = str(result)
+ return self.state.result
+
+flow = MyFlow()
+flow.kickoff(inputs={"input_data": "vector databases"})
+```
+
+What changed:
+
+1. The crew is constructed inside a method, not at module load.
+2. Inputs flow through `self.state` instead of being threaded as kwargs.
+3. The entry point is marked with `@start()`. Subsequent steps use `@listen(run_crew)` to chain.
+
+### Structured state setup
+
+Prefer typed state (`Flow[MyState]`) over the untyped dict variant. You get autocompletion, validation at the boundary, and serializable state for checkpointing:
+
+```python
+from pydantic import BaseModel, Field
+
+class ResearchState(BaseModel):
+ topic: str = ""
+ sources: list[str] = Field(default_factory=list)
+ draft: str = ""
+ final: str = ""
+```
+
+Untyped state (`Flow()` with no generic) still works, but you lose static checks and checkpointing fidelity.
+
+### Multi-crew Flow pattern
+
+Chaining two crews — research, then writing — is the canonical reason to adopt Flows:
+
+```python
+from crewai.flow.flow import Flow, start, listen, router
+from pydantic import BaseModel
+
+class PipelineState(BaseModel):
+ topic: str = ""
+ research: str = ""
+ article: str = ""
+
+class ContentPipeline(Flow[PipelineState]):
+ @start()
+ def research(self):
+ out = ResearchCrew().crew().kickoff(inputs={"topic": self.state.topic})
+ self.state.research = str(out)
+ return self.state.research
+
+ @router(research)
+ def gate(self):
+ return "write" if len(self.state.research) > 200 else "abort"
+
+ @listen("write")
+ def write(self):
+ out = WritingCrew().crew().kickoff(
+ inputs={"topic": self.state.topic, "notes": self.state.research}
+ )
+ self.state.article = str(out)
+ return self.state.article
+
+ @listen("abort")
+ def bail(self):
+ self.state.article = "Insufficient research."
+ return self.state.article
+
+ContentPipeline().kickoff(inputs={"topic": "vector databases"})
+```
+
+`@start()`, `@listen()`, and `@router()` are the three decorators you'll use 95% of the time. See [Flows](/en/concepts/flows) for the full reference.
+
+---
+
+## Common Gotchas
+
+1. **Running `crewai install` and expecting an upgrade.** `crewai install` syncs against the existing `uv.lock`. To bump versions, run `uv add "crewai[tools]>=X.Y.Z"` first, then `crewai install`.
+2. **The constraint is a floor, not a pin.** `crewai>=1.11.1` means "any version at or above 1.11.1." `uv` only re-resolves when you explicitly run `uv add` or `uv lock --upgrade-package crewai`.
+3. **Extras dropped during re-lock.** If you run `uv add "crewai>=1.14.4"` without extras, `uv` may drop `[tools]` from the resolved set. Always include the extras you need: `uv add "crewai[tools]>=1.14.4"`.
+4. **Forgetting to commit `uv.lock`.** After bumping with `uv add`, commit the updated `uv.lock` so teammates get the same versions.
+5. **`pip install` instead of `uv tool install`.** Mixing pip-installed and uv-installed `crewai` leads to two binaries on `PATH` and confusing version skew. Pick one — the supported one is `uv`.
+6. **Passing a Pydantic instance to `output_pydantic`.** It expects the class. `output_pydantic=Article`, not `output_pydantic=Article(...)`.
+7. **Hierarchical process with no manager.** `process=Process.hierarchical` requires `manager_llm=` or `manager_agent=`.
+8. **Memory enabled with the wrong embedder.** Switching embedders without clearing the on-disk memory directory causes dimension mismatches. Delete the project's memory store after changing providers.
+9. **Dict state when you wanted typed state.** `Flow()` with no generic gives you a dict. For type checking and clean checkpointing, use `Flow[MyState]` with a `BaseModel`.
+10. **Stale tool imports.** `from crewai_tools import BaseTool` works in some versions but is not the canonical path. Standardize on `from crewai.tools import BaseTool, tool`.
+11. **Python version drift.** CrewAI requires `>=3.10, <3.14`. `uv` will happily build a venv against 3.14+ if that's the system default; pin the range with `requires-python` in `pyproject.toml`.
+12. **`verbose=2` and similar integer flags.** `verbose` is a `bool`. Use event listeners for finer-grained logging.
+13. **Calling `crew.kickoff()` inside a Flow with bare keyword arguments.** Flow state is not threaded into the crew automatically; pass values explicitly with `inputs={...}`, reading them from `self.state`.
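+
+The Python pin from the gotchas above is one line in `pyproject.toml`, as a sketch (adjust the bounds to your release):
+
+```toml
+[project]
+requires-python = ">=3.10,<3.14"
+```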
+
+---
+
+## Checkpointing
+
+Checkpointing is a newer addition that persists agent, crew, and flow state between runs. It lets long-running workflows resume after a crash, a manual stop, or a deploy.
+
+```python
+crew = Crew(
+ agents=[...],
+ tasks=[...],
+ checkpoint=True,
+)
+```
+
+The same flag is supported on `Flow` and `Agent`. State is written to the project's local store and replayed on the next `kickoff()` with the same identifier.
+
+<Warning>
+  Checkpointing is in early release. APIs around resume semantics, storage backends, and identifiers may still shift between minor versions. Pin your `crewai` version if you depend on it in production.
+</Warning>
+
+See [Checkpointing](/en/concepts/checkpointing) for the full feature reference.
+
+---
+
+## Getting Help
+
+- **Changelog** — every breaking change is noted in the [release notes](/en/changelog).
+- **GitHub Issues** — open one at [github.com/crewAIInc/crewAI/issues](https://github.com/crewAIInc/crewAI/issues) with a minimal repro and your `crewai --version` output.
+- **Discord** — the CrewAI community Discord is the fastest path to debugging help: [community.crewai.com](https://community.crewai.com).
+- **Migration guides** — if you're moving from another framework, start at [Migrating from LangGraph](/en/guides/migration/migrating-from-langgraph).
diff --git a/docs/upgrade-crewai.mdx b/docs/upgrade-crewai.mdx
deleted file mode 100644
index e1f0eb2ea..000000000
--- a/docs/upgrade-crewai.mdx
+++ /dev/null
@@ -1,329 +0,0 @@
----
-title: "Upgrading & Migrating CrewAI"
-description: How to upgrade CrewAI, migrate around breaking changes, and move standalone Crews onto Flows.
-icon: arrow-up-right-dots
----
-
-## Overview
-
-CrewAI moves quickly. New releases regularly tighten import paths, change defaults on `Agent`, `Crew`, and `Task`, and introduce new orchestration primitives like `Flow` and checkpointing. This guide collects the practical steps needed to:
-
-- Upgrade the `crewai` CLI and your project dependencies
-- Adapt to breaking changes in imports and parameters
-- Migrate a standalone `Crew` to a typed `Flow`
-- Avoid the gotchas that show up the first time you re-run an upgraded project
-
-If you're starting fresh, see [Installation](/en/installation). If you're coming from another framework, see the [migration guides](/en/guides/migration/migrating-from-langgraph).
-
----
-
-## Upgrading CrewAI
-
-CrewAI is distributed as a `uv` tool. `pip install -U crewai` works in a pinch, but the supported upgrade path is `uv`.
-
-### 1. Check your current version
-
-```bash
-uv tool list
-```
-
-You should see a line like:
-
-```
-crewai v0.102.0
-- crewai
-```
-
-### 2. Upgrade the CLI
-
-```bash
-uv tool install crewai --upgrade
-```
-
-If your shell warns about `PATH` after the upgrade, refresh it:
-
-```bash
-uv tool update-shell
-```
-
-### 3. Verify the upgrade
-
-```bash
-uv tool list
-crewai --version
-```
-
-### 4. Update project dependencies
-
-The CLI upgrade only updates the global tool. Each project pins its own `crewai` dependency in `pyproject.toml`. Inside the project directory, run:
-
-```bash
-crewai install
-```
-
-This re-syncs your project's lockfile and virtual environment against the new version. Run your tests or `crewai run` afterwards to surface any breakage early.
-
-
- CrewAI requires `Python >=3.10, <3.14`. If `uv` was installed against an older interpreter, recreate the project venv with a supported Python before running `crewai install`.
-
-
----
-
-## Breaking Changes & Migration Notes
-
-Most upgrades only require small adjustments. The areas below are the ones that break silently or with confusing tracebacks.
-
-### Import paths: tools and `BaseTool`
-
-The canonical import location for tools is `crewai.tools`. Older paths still surface in tutorials but should be updated.
-
-```python
-# Before
-from crewai_tools import BaseTool
-from crewai.agents.tools import tool
-
-# After
-from crewai.tools import BaseTool, tool
-```
-
-The `@tool` decorator and `BaseTool` subclass both live in `crewai.tools`. `AgentFinish` and other internal-agent symbols are no longer part of the public surface — if you were importing them, switch to event listeners or `Task` callbacks instead.
-
-### `Agent` parameter changes
-
-```python
-from crewai import Agent
-
-agent = Agent(
- role="Researcher",
- goal="Find authoritative sources on {topic}",
- backstory="You are a careful, source-driven researcher.",
- llm="gpt-4o-mini", # string model name OR an LLM object
- verbose=True, # bool, not an int level
- max_iter=15, # default has changed across versions — set explicitly
- allow_delegation=False,
-)
-```
-
-- `llm` accepts either a string model name (resolved via the configured provider) or an `LLM` object for fine-grained control.
-- `verbose` is a plain `bool`. Passing an integer no longer toggles log levels.
-- `max_iter` defaults have shifted between releases. If your agent silently stops looping after the first tool call, set `max_iter` explicitly.
-
-### `Crew` parameters
-
-```python
-from crewai import Crew, Process
-
-crew = Crew(
- agents=[...],
- tasks=[...],
- process=Process.sequential, # or Process.hierarchical
- memory=True,
- cache=True,
- embedder={"provider": "openai", "config": {"model": "text-embedding-3-small"}},
-)
-```
-
-- `process=Process.hierarchical` requires either `manager_llm=` or `manager_agent=`. Without one, kickoff raises at validation time.
-- `memory=True` with a non-default embedding provider needs an `embedder` dict — see [Memory & embedder config](#memory-embedder-config) below.
-
-### `Task` structured output
-
-Use `output_pydantic`, `output_json`, or `output_file` to coerce a task's result into a typed shape:
-
-```python
-from pydantic import BaseModel
-from crewai import Task
-
-class Article(BaseModel):
- title: str
- body: str
-
-write = Task(
- description="Write an article about {topic}",
- expected_output="A short article with a title and body",
- agent=writer,
- output_pydantic=Article, # the class, NOT an instance
- output_file="output/article.md",
-)
-```
-
-`output_pydantic` takes the **class** itself. Passing `Article(title="", body="")` is a common mistake and fails with a confusing validation error.
-
-### Memory & embedder config
-
-If `memory=True` and you're not using the default OpenAI embeddings, you must pass an `embedder`:
-
-```python
-crew = Crew(
- agents=[...],
- tasks=[...],
- memory=True,
- embedder={
- "provider": "ollama",
- "config": {"model": "nomic-embed-text"},
- },
-)
-```
-
-Set the relevant provider credentials (`OPENAI_API_KEY`, `OLLAMA_HOST`, etc.) in your `.env` file. Memory storage paths are project-local by default — delete the project's memory directory if you change embedders, since dimensions don't mix.
-
----
-
-## Migrating a Crew to a Flow
-
-`Crew` is the right primitive when you have a single team of agents executing one workflow. Once you need branching, multiple crews, or persistent state across runs, reach for `Flow`.
-
-### When to use Flows vs standalone Crews
-
-| Situation | Use |
-| --- | --- |
-| Single team, single linear/hierarchical workflow | `Crew` |
-| Conditional branches, retries, routing on results | `Flow` |
-| Multiple specialized crews chained together | `Flow` |
-| State that must persist between steps or runs | `Flow` (with checkpointing) |
-| You want typed, IDE-friendly state | `Flow[MyState]` with a Pydantic model |
-
-If you only need one of: branching, multi-crew, or persistent state — start with a `Flow`. The boilerplate is small and you won't have to rewrite later.
-
-### Step-by-step migration
-
-**Before — standalone crew:**
-
-```python
-from crewai import Crew
-
-crew = Crew(agents=[researcher, writer], tasks=[research_task, write_task])
-result = crew.kickoff(inputs={"topic": "vector databases"})
-print(result)
-```
-
-**After — crew inside a typed Flow:**
-
-```python
-from crewai.flow.flow import Flow, start, listen
-from pydantic import BaseModel
-
-class MyState(BaseModel):
- input_data: str = ""
- result: str = ""
-
-class MyFlow(Flow[MyState]):
- @start()
- def run_crew(self):
- result = MyCrew().crew().kickoff(inputs={"topic": self.state.input_data})
- self.state.result = str(result)
- return self.state.result
-
-flow = MyFlow()
-flow.kickoff(inputs={"input_data": "vector databases"})
-```
-
-What changed:
-
-1. The crew is constructed inside a method, not at module load.
-2. Inputs flow through `self.state` instead of being threaded as kwargs.
-3. The entry point is marked with `@start()`. Subsequent steps use `@listen(run_crew)` to chain.
-
-### Structured state setup
-
-Prefer typed state (`Flow[MyState]`) over the untyped dict variant. You get autocompletion, validation at the boundary, and serializable state for checkpointing:
-
-```python
-from pydantic import BaseModel, Field
-
-class ResearchState(BaseModel):
- topic: str = ""
- sources: list[str] = Field(default_factory=list)
- draft: str = ""
- final: str = ""
-```
-
-Untyped state (`Flow()` with no generic) still works, but you lose static checks and checkpointing fidelity.
-
-### Multi-crew Flow pattern
-
-Chaining two crews — research, then writing — is the canonical reason to adopt Flows:
-
-```python
-from crewai.flow.flow import Flow, start, listen, router
-from pydantic import BaseModel
-
-class PipelineState(BaseModel):
- topic: str = ""
- research: str = ""
- article: str = ""
-
-class ContentPipeline(Flow[PipelineState]):
- @start()
- def research(self):
- out = ResearchCrew().crew().kickoff(inputs={"topic": self.state.topic})
- self.state.research = str(out)
- return self.state.research
-
- @router(research)
- def gate(self):
- return "write" if len(self.state.research) > 200 else "abort"
-
- @listen("write")
- def write(self):
- out = WritingCrew().crew().kickoff(
- inputs={"topic": self.state.topic, "notes": self.state.research}
- )
- self.state.article = str(out)
- return self.state.article
-
- @listen("abort")
- def bail(self):
- self.state.article = "Insufficient research."
- return self.state.article
-
-ContentPipeline().kickoff(inputs={"topic": "vector databases"})
-```
-
-`@start()`, `@listen()`, and `@router()` are the three decorators you'll use 95% of the time. See [Flows](/en/concepts/flows) for the full reference.
-
----
-
-## Common Gotchas
-
-1. **`pip install` instead of `uv tool install`.** The CLI is a `uv` tool. Mixing pip-installed and uv-installed `crewai` leads to two binaries on `PATH` and confusing version skew. Pick one — the supported one is `uv`.
-2. **Forgetting `crewai install` after upgrading.** The CLI upgrade is global; your project venv still pins the old version until you re-sync.
-3. **Passing a Pydantic instance to `output_pydantic`.** It expects the class. `output_pydantic=Article`, not `output_pydantic=Article(...)`.
-4. **Hierarchical process with no manager.** `process=Process.hierarchical` requires `manager_llm=` or `manager_agent=`.
-5. **Memory enabled with the wrong embedder.** Switching embedders without clearing the on-disk memory directory causes dimension mismatches. Delete the project's memory store after changing providers.
-6. **Dict state when you wanted typed state.** `Flow()` with no generic gives you a dict. For type checking and clean checkpointing, use `Flow[MyState]` with a `BaseModel`.
-7. **Stale tool imports.** `from crewai_tools import BaseTool` works in some versions but is not the canonical path. Standardize on `from crewai.tools import BaseTool, tool`.
-8. **Python version drift.** CrewAI requires `>=3.10, <3.14`. `uv` will happily build a venv against 3.14+ if it's the default; pin the Python version in `pyproject.toml`.
-9. **`verbose=2` and similar integer flags.** `verbose` is a `bool`. Use event listeners for finer-grained logging.
-10. **Calling `crew.kickoff()` from inside a Flow without wrapping in `inputs={}`.** Flows pass state, not kwargs. The crew still expects `inputs={...}`.
-
----
-
-## Checkpointing
-
-Checkpointing is a newer addition that persists agent, crew, and flow state between runs. It lets long-running workflows resume after a crash, a manual stop, or a deploy.
-
-```python
-crew = Crew(
- agents=[...],
- tasks=[...],
- checkpoint=True,
-)
-```
-
-The same flag is supported on `Flow` and `Agent`. State is written to the project's local store and replayed on the next `kickoff()` with the same identifier.
-
-
- Checkpointing is in early release. APIs around resume semantics, storage backends, and identifiers may still shift between minor versions — pin your `crewai` version if you depend on it in production.
-
-
-See [Checkpointing](/en/concepts/checkpointing) for the full feature reference.
-
----
-
-## Getting Help
-
-- **Changelog** — every breaking change is noted in the [release notes](/en/changelog).
-- **GitHub Issues** — open one at [github.com/crewAIInc/crewAI/issues](https://github.com/crewAIInc/crewAI/issues) with a minimal repro and your `crewai --version` output.
-- **Discord** — the CrewAI community Discord is the fastest path to debugging help: [community.crewai.com](https://community.crewai.com).
-- **Migration guides** — if you're moving from another framework, start at [Migrating from LangGraph](/en/guides/migration/migrating-from-langgraph).