---
title: "LLMs: Reference"
description: "Provider-agnostic LLM configuration reference for CrewAI projects."
icon: "book"
mode: "wide"
---

## Common parameters

- `model`
- `temperature`
- `max_tokens`
- `timeout`
- `max_retries`
- `response_format`

## Contract guidance

- Set a low temperature for extraction and classification tasks.
- Use structured outputs when results feed downstream automation.
- Set an explicit timeout and retry policy in production.

## Canonical source

Primary API details live in [/en/concepts/llms](/en/concepts/llms).
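
The guidance above can be sketched as a small helper that picks parameter values by task type. This is a minimal illustration, not CrewAI API code: the helper name `llm_config`, the model string, and the concrete values are assumptions; in a CrewAI project the resulting parameters would typically be passed to the LLM constructor documented at the canonical source above.

```python
def llm_config(task: str) -> dict:
    """Build a provider-agnostic parameter dict for a task type.

    Hypothetical helper illustrating the contract guidance;
    keys mirror the common parameters listed on this page.
    """
    base = {
        "model": "openai/gpt-4o-mini",  # assumed model id; any provider-prefixed id works
        "max_tokens": 1024,
        "timeout": 60,       # explicit timeout for production
        "max_retries": 2,    # explicit retry policy for production
    }
    if task in ("extraction", "classification"):
        # Low temperature for deterministic, repeatable output
        base["temperature"] = 0.0
        # Structured output so downstream automation can parse reliably
        base["response_format"] = {"type": "json_object"}
    else:
        base["temperature"] = 0.7
    return base
```

Keeping the task-dependent parameters (`temperature`, `response_format`) separate from the operational ones (`timeout`, `max_retries`) makes it easy to enforce the production defaults everywhere while varying only the generation behavior.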