feat: enhance InternalInstructor to support multiple LLM providers (#3767)

* feat: enhance InternalInstructor to support multiple LLM providers

- Updated InternalInstructor to conditionally create an instructor client based on the LLM provider.
- Introduced a new method _create_instructor_client to handle client creation using the modern from_provider pattern.
- Added functionality to extract the provider from the LLM model name.
- Implemented tests for InternalInstructor with various LLM providers including OpenAI, Anthropic, Gemini, and Azure, ensuring robust integration and error handling.

This update improves flexibility and extensibility for different LLM integrations.

* fix test
Author: Lorenze Jay (committed by GitHub)
Date: 2025-10-21 15:24:59 -07:00
Parent: 5944a39629
Commit: a850813f2b

3 changed files with 442 additions and 53 deletions

@@ -902,7 +902,8 @@ def test_agent_step_callback():
 @pytest.mark.vcr(filter_headers=["authorization"])
 def test_agent_function_calling_llm():
-    llm = "gpt-4o"
+    from crewai.llm import LLM
+
+    llm = LLM(model="gpt-4o", is_litellm=True)
     @tool
     def learn_about_ai() -> str: