Files
crewAI/lib/crewai/tests/llms/bedrock/test_bedrock.py
Devin AI e4b7ccf717
fix: add drop_params support to BedrockCompletion for unsupported parameters
This fixes GitHub issue #4046, where certain Bedrock models don't support
the stopSequences field. The fix adds drop_params and
additional_drop_params parameters to BedrockCompletion._get_inference_config()
so that users can drop unsupported parameters from the inference config.
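The dropping logic can be pictured roughly as in the sketch below. The
helper name filter_inference_config and its signature are illustrative
assumptions for this note, not the actual internals of
BedrockCompletion._get_inference_config():

from typing import Any, Optional

def filter_inference_config(
    config: dict[str, Any],
    drop_params: bool = False,
    additional_drop_params: Optional[list[str]] = None,
) -> dict[str, Any]:
    """Return a copy of the inference config with dropped keys removed.

    When drop_params is True, every key named in additional_drop_params
    (for example "stopSequences") is stripped before the request is sent.
    """
    if not drop_params or not additional_drop_params:
        return dict(config)
    return {k: v for k, v in config.items() if k not in set(additional_drop_params)}

# A model that rejects stopSequences keeps the rest of its config intact:
config = {"maxTokens": 1024, "temperature": 0.7, "stopSequences": ["\n\n"]}
filtered = filter_inference_config(
    config, drop_params=True, additional_drop_params=["stopSequences"]
)
assert filtered == {"maxTokens": 1024, "temperature": 0.7}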

Example usage:
llm = LLM(
    model="bedrock/openai.gpt-oss-safeguard-120b",
    drop_params=True,
    additional_drop_params=["stopSequences"]
)

This follows the same pattern as the Azure provider implementation.

Co-Authored-By: João <joao@crewai.com>
2025-12-08 15:33:26 +00:00
