From b09796cd3ff4d5ec3c1870ca6c516b7981d6778a Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Mr=2E=20=C3=85nand?= <73425223+Astrodevil@users.noreply.github.com>
Date: Thu, 26 Jun 2025 20:40:19 +0530
Subject: [PATCH] Adding Nebius to docs (#3070)

* Adding Nebius to docs

Submitting this PR on behalf of Nebius AI Studio to add Nebius models to
the CrewAI documentation. I tested with the latest CrewAI + Nebius setup
to ensure compatibility.

cc @tonykipkemboi

* updated LiteLLM page

---------

Co-authored-by: Lucas Gomide
---
 docs/en/concepts/llms.mdx         | 22 ++++++++++++++++++++++
 docs/en/learn/llm-connections.mdx |  1 +
 2 files changed, 23 insertions(+)

diff --git a/docs/en/concepts/llms.mdx b/docs/en/concepts/llms.mdx
index 80b69ce11..332d48576 100644
--- a/docs/en/concepts/llms.mdx
+++ b/docs/en/concepts/llms.mdx
@@ -684,6 +684,28 @@ In this section, you'll find detailed examples that help you select, configure, and
       - openrouter/deepseek/deepseek-chat
+
+  <Accordion title="Nebius AI Studio">
+    Set the following environment variable in your `.env` file:
+    ```toml Code
+    NEBIUS_API_KEY=
+    ```
+
+    Example usage in your CrewAI project:
+    ```python Code
+    llm = LLM(
+        model="nebius/Qwen/Qwen3-30B-A3B"
+    )
+    ```
+
+    <Tip>
+      Nebius AI Studio features:
+      - Large collection of open source models
+      - Higher rate limits
+      - Competitive pricing
+      - Good balance of speed and quality
+    </Tip>
+  </Accordion>

 ## Streaming Responses

diff --git a/docs/en/learn/llm-connections.mdx b/docs/en/learn/llm-connections.mdx
index fcc264f09..c70577887 100644
--- a/docs/en/learn/llm-connections.mdx
+++ b/docs/en/learn/llm-connections.mdx
@@ -34,6 +34,7 @@ LiteLLM supports a wide range of providers, including but not limited to:
 - DeepInfra
 - Groq
 - SambaNova
+- Nebius AI Studio
 - [NVIDIA NIMs](https://docs.api.nvidia.com/nim/reference/models-1)
 - And many more!
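The docs snippet in this patch relies on two conventions: the API key is read from the `NEBIUS_API_KEY` environment variable, and the model string carries a `nebius/` provider prefix followed by the model path on Nebius AI Studio (LiteLLM routes on that prefix). A minimal sketch of those two pieces, without touching CrewAI itself; the `nebius_model` helper and the placeholder key value are illustrative, not part of CrewAI's API:

```python
import os

# Placeholder value for illustration only; in a real project the key lives
# in your .env file and is never hard-coded.
os.environ.setdefault("NEBIUS_API_KEY", "your-api-key")

def nebius_model(model_path: str) -> str:
    """Hypothetical helper: build the LiteLLM-style provider-prefixed
    model id from a model path on Nebius AI Studio."""
    return f"nebius/{model_path}"

# The resulting string is what you would pass as `model=` to CrewAI's LLM class.
model_id = nebius_model("Qwen/Qwen3-30B-A3B")
print(model_id)  # nebius/Qwen/Qwen3-30B-A3B
```

With `NEBIUS_API_KEY` set, `LLM(model=model_id)` in a CrewAI project should resolve to Nebius through LiteLLM, matching the example added to `docs/en/concepts/llms.mdx` above.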