crewAI/lib
Greyson LaLonde 60e8a4a364 fix: recompute is_azure_openai_endpoint after lazy endpoint resolve
`_prepare_completion_params` uses `is_azure_openai_endpoint` to decide
whether to include the `model` parameter in requests — Azure OpenAI
endpoints embed the deployment name in the URL and reject a `model`
field. When the endpoint was resolved lazily from env vars, the flag
kept its pre-resolve `False` value, so every lazily initialized
Azure OpenAI request included `model` and failed.

Factor the classification into `_is_azure_openai_endpoint` and call
it from both `_normalize_azure_fields` and `_make_client_kwargs`.
Extend the lazy-build regression test to assert the flag flips to
`True` once the endpoint is resolved.
2026-04-12 04:59:18 +08:00