mirror of
https://github.com/crewAIInc/crewAI.git
synced 2026-01-10 00:28:31 +00:00
Fix indentation in llm-connections.mdx code block (#1573)
Co-authored-by: Brandon Hancock (bhancock_ai) <109994880+bhancockio@users.noreply.github.com>
@@ -125,10 +125,10 @@ You can connect to OpenAI-compatible LLMs using either environment variables or
    </Tab>
    <Tab title="Using LLM Class Attributes">
        <CodeGroup>
        ```python Code
        llm = LLM(
            model="custom-model-name",
            api_key="your-api-key",
            base_url="https://api.your-provider.com/v1"
        )
        agent = Agent(llm=llm, ...)
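The hunk above configures an OpenAI-compatible endpoint through `LLM` class attributes; per the hunk header, the same connection can instead be driven by environment variables. A minimal sketch of how the two routes carry the same three settings — the `OPENAI_*` variable names are an assumption from common OpenAI-compatible conventions, and `crewai` itself is deliberately not imported here:

```python
import os

# Assumed env var names (not taken from the diff itself): many
# OpenAI-compatible clients read these two variables.
os.environ["OPENAI_API_KEY"] = "your-api-key"
os.environ["OPENAI_API_BASE"] = "https://api.your-provider.com/v1"

# The explicit-attribute route, mirroring the LLM(...) call in the diff:
# each keyword argument corresponds to one of the env vars above.
llm_config = {
    "model": "custom-model-name",
    "api_key": os.environ["OPENAI_API_KEY"],
    "base_url": os.environ["OPENAI_API_BASE"],
}

print(llm_config["base_url"])
```

Whichever route you pick, the class-attribute form wins on precedence in most clients, so it is the safer choice when both are set.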
@@ -179,4 +179,4 @@ This is particularly useful when working with OpenAI-compatible APIs or when you

## Conclusion

By leveraging LiteLLM, CrewAI offers seamless integration with a vast array of LLMs. This flexibility allows you to choose the most suitable model for your specific needs, whether you prioritize performance, cost-efficiency, or local deployment. Remember to consult the [LiteLLM documentation](https://docs.litellm.ai/docs/) for the most up-to-date information on supported models and configuration options.