From e27a15023c819f69bb7f8d1ebf7e238781ac59d1 Mon Sep 17 00:00:00 2001
From: Bobby Lindsey
Date: Wed, 22 Jan 2025 12:55:24 -0700
Subject: [PATCH] Add SageMaker as an LLM provider (#1947)

* Add SageMaker as an LLM provider

* Removed unnecessary constants; updated docs to align with bootstrap naming
  convention

---------

Co-authored-by: Brandon Hancock (bhancock_ai) <109994880+bhancockio@users.noreply.github.com>
---
 docs/concepts/llms.mdx | 18 ++++++++++++++++++
 1 file changed, 18 insertions(+)

diff --git a/docs/concepts/llms.mdx b/docs/concepts/llms.mdx
index 851e93085..261a1fdd8 100644
--- a/docs/concepts/llms.mdx
+++ b/docs/concepts/llms.mdx
@@ -243,6 +243,9 @@ There are three ways to configure LLMs in CrewAI. Choose the method that best fi
       # llm: bedrock/amazon.titan-text-express-v1
       # llm: bedrock/meta.llama2-70b-chat-v1
 
+      # Amazon SageMaker Models - Enterprise-grade
+      # llm: sagemaker/<my-endpoint>
+
       # Mistral Models - Open source alternative
       # llm: mistral/mistral-large-latest
       # llm: mistral/mistral-medium-latest
@@ -506,6 +509,21 @@ Learn how to get the most out of your LLM configuration:
         )
         ```
 
+    <Accordion title="Amazon SageMaker">
+        ```python Code
+        AWS_ACCESS_KEY_ID=<your-access-key>
+        AWS_SECRET_ACCESS_KEY=<your-secret-key>
+        AWS_DEFAULT_REGION=<your-region>
+        ```
+
+        Example usage:
+        ```python Code
+        llm = LLM(
+            model="sagemaker/<my-endpoint-name>"
+        )
+        ```
+    </Accordion>
+
         ```python Code
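For context on the naming convention the patch documents: the model string combines a provider prefix with the SageMaker endpoint name, as in `sagemaker/<my-endpoint>`. The sketch below is illustrative only; `build_model_id` is a hypothetical helper, not part of CrewAI or this patch, and simply shows how such an identifier is assembled before it is handed to `LLM(model=...)`.

```python
# Hypothetical helper (not part of CrewAI): joins a provider prefix and a
# SageMaker endpoint name into the "provider/endpoint" model identifier
# format shown in the docs added by this patch.
def build_model_id(endpoint_name: str, provider: str = "sagemaker") -> str:
    """Return a model id like 'sagemaker/<endpoint-name>'."""
    if not endpoint_name:
        raise ValueError("endpoint_name must be non-empty")
    return f"{provider}/{endpoint_name}"


print(build_model_id("my-llama-endpoint"))  # sagemaker/my-llama-endpoint
```

With a real endpoint, the resulting string would be passed as the `model` argument when constructing the `LLM`, alongside the `AWS_*` environment variables listed in the docs.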