This fixes GitHub issue #4046, where certain Bedrock models do not support
the stopSequences field. The fix adds support for the drop_params and
additional_drop_params parameters in BedrockCompletion._get_inference_config(),
allowing users to drop unsupported parameters from the inference config.
Example usage:

    llm = LLM(
        model="bedrock/openai.gpt-oss-safeguard-120b",
        drop_params=True,
        additional_drop_params=["stopSequences"],
    )
This follows the same pattern as the Azure provider implementation.
Co-Authored-By: João <joao@crewai.com>