crewAI/docs/memory.md
Commit 636dac6efb — "fix: Update embedding configuration and fix type errors" (Devin AI, 2025-02-09)

- Add configurable embedding providers (OpenAI, Ollama)
- Fix type hints in base_tool and structured_tool
- Add proper json property implementations
- Update documentation for memory configuration
- Add environment variables for embedding configuration
- Fix type errors in task and crew output classes

Co-Authored-By: Joe Moura <joao@crewai.com>


# Memory in CrewAI

CrewAI provides a robust memory system that allows agents to retain and recall information from previous interactions.

## Configuring Embedding Providers

CrewAI supports multiple embedding providers for memory functionality:

### Environment Variables

Configure the embedding provider using these environment variables:

- `CREWAI_EMBEDDING_PROVIDER`: Provider name (default: `"openai"`)
- `CREWAI_EMBEDDING_MODEL`: Model name (default: `"text-embedding-3-small"`)
- `CREWAI_OLLAMA_URL`: URL for the Ollama API (used when the Ollama provider is selected)

### Example Configuration

```python
import os

# Using OpenAI (default)
os.environ["OPENAI_API_KEY"] = "your-api-key"

# Using Ollama
os.environ["CREWAI_EMBEDDING_PROVIDER"] = "ollama"
os.environ["CREWAI_EMBEDDING_MODEL"] = "llama2"  # or any other model supported by your Ollama instance
os.environ["CREWAI_OLLAMA_URL"] = "http://localhost:11434/api/embeddings"  # optional; this is the default
```
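The provider selection above can be sketched as a small resolver that reads these variables and applies the documented defaults. The `resolve_embedding_config` helper below is illustrative only, not part of the CrewAI API:

```python
import os

def resolve_embedding_config() -> dict:
    """Illustrative sketch: read the embedding-related environment
    variables and fall back to the documented defaults."""
    provider = os.environ.get("CREWAI_EMBEDDING_PROVIDER", "openai")
    config = {
        "provider": provider,
        "model": os.environ.get("CREWAI_EMBEDDING_MODEL", "text-embedding-3-small"),
    }
    if provider == "ollama":
        # The Ollama URL is only relevant when the Ollama provider is selected.
        config["url"] = os.environ.get(
            "CREWAI_OLLAMA_URL", "http://localhost:11434/api/embeddings"
        )
    return config
```

With no variables set, this resolves to the OpenAI provider and the `text-embedding-3-small` model, matching the defaults listed above.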

## Memory Usage

When an agent has memory enabled, it can access and store information from previous interactions:

```python
from crewai import Agent

agent = Agent(
    role="Researcher",
    goal="Research AI topics",
    backstory="You're an AI researcher",
    memory=True,  # Enable memory for this agent
)
```

The memory system uses embeddings to store and retrieve relevant information, allowing agents to maintain context across multiple interactions and tasks.
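To illustrate the retrieval side, here is a minimal, self-contained sketch of embedding-based lookup using cosine similarity over toy vectors. This is a conceptual illustration of how such a memory store works, not CrewAI's internal implementation:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

class ToyMemory:
    """Store (text, embedding) pairs and return the most similar entries."""

    def __init__(self) -> None:
        self._entries: list[tuple[str, list[float]]] = []

    def save(self, text: str, embedding: list[float]) -> None:
        self._entries.append((text, embedding))

    def search(self, query_embedding: list[float], top_k: int = 1) -> list[str]:
        # Rank stored entries by similarity to the query and keep the top k.
        ranked = sorted(
            self._entries,
            key=lambda entry: cosine_similarity(entry[1], query_embedding),
            reverse=True,
        )
        return [text for text, _ in ranked[:top_k]]

memory = ToyMemory()
memory.save("Paris is the capital of France", [0.9, 0.1, 0.0])
memory.save("Transformers use self-attention", [0.1, 0.9, 0.2])
print(memory.search([0.2, 0.8, 0.1]))  # returns the closest stored fact
```

In a real setup the vectors would come from the configured embedding provider (OpenAI or Ollama, as shown above) rather than being written by hand.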