Mirror of https://github.com/crewAIInc/crewAI.git — synced 2026-05-05 01:02:37 +00:00
* feat(azure): forward credential_scopes to Azure AI Inference client

  Adds a credential_scopes field to the native Azure AI Inference provider and a
  matching AZURE_CREDENTIAL_SCOPES env var (comma-separated). The value is
  forwarded to ChatCompletionsClient / AsyncChatCompletionsClient when set,
  letting keyless / Entra-based callers target a specific Azure AD audience
  (e.g. https://cognitiveservices.azure.com/.default) without subclassing the
  provider. Matches the upstream azure.ai.inference SDK kwarg of the same name.

  Lazy build re-reads the env var so an LLM constructed at module import
  (before deployment env vars are set) still picks up scopes — same pattern as
  the existing AZURE_API_KEY / AZURE_ENDPOINT lazy reads.

  to_config_dict round-trips the field.

* refactor(azure): tighten credential_scopes env handling

  Address review feedback:
  - Move os.getenv into the helper so AZURE_CREDENTIAL_SCOPES appears once
  - Match the surrounding api_key/endpoint `or` style in the validator
  - Drop the list() defensive copy in to_config_dict — every other field in
    that method (and the base class's `stop`) is assigned by reference