Bug 1: InternalInstructor.to_pydantic() now forwards api_base/base_url/api_key
to litellm so that remote Ollama servers are reachable during structured output
parsing.
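A minimal sketch of the forwarding logic described above, assuming the LLM object carries `api_base`/`base_url`/`api_key` attributes (the helper name `build_completion_kwargs` and the attribute handling are illustrative, not the actual CrewAI code):

```python
from types import SimpleNamespace

def build_completion_kwargs(llm):
    """Collect kwargs for litellm, forwarding remote-endpoint settings.

    Previously these settings were dropped, so structured-output calls
    always targeted the default localhost endpoint.
    """
    kwargs = {"model": llm.model}
    for attr in ("api_base", "base_url", "api_key"):
        value = getattr(llm, attr, None)
        if value is not None:
            kwargs[attr] = value
    return kwargs

# Example: an LLM configured against a remote Ollama server.
remote = SimpleNamespace(
    model="ollama/llama3",
    api_base="http://gpu-box:11434",
    base_url=None,
    api_key=None,
)
```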
Bug 2: LLM.supports_function_calling() now falls back to querying the remote
Ollama /api/show endpoint when litellm returns False for Ollama models with a
non-localhost base URL.
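The fallback might look roughly like the sketch below: POST the model name to the server's /api/show endpoint and check the reported capabilities. The function names and the exact response shape are assumptions here; `/api/show` is the Ollama endpoint named above.

```python
import json
import urllib.request

def ollama_supports_tools(show_response: dict) -> bool:
    """Interpret an Ollama /api/show response.

    Recent Ollama versions report a "capabilities" list; the presence
    of "tools" indicates function-calling support (field names assumed
    from the Ollama REST API).
    """
    return "tools" in show_response.get("capabilities", [])

def query_ollama_show(base_url: str, model: str) -> dict:
    """POST {"model": ...} to <base_url>/api/show and return the JSON body."""
    req = urllib.request.Request(
        f"{base_url.rstrip('/')}/api/show",
        data=json.dumps({"model": model}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return json.load(resp)
```

The network call is only attempted when litellm returns False and the base URL is not localhost, so local setups keep the existing fast path.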
Fixes #4694
Co-Authored-By: João <joao@crewai.com>