mirror of
https://github.com/crewAIInc/crewAI.git
synced 2026-04-07 03:28:29 +00:00
When a flow with @human_feedback(llm=create_llm()) pauses for HITL and later resumes:

1. The LLM object was being serialized to just a model string via _serialize_llm_for_context() (e.g. 'gemini/gemini-3.1-flash-lite-preview').
2. On resume, resume_async() was creating LLM(model=string) with NO credentials, project, location, safety_settings, or client_params.
3. OpenAI worked by accident (OPENAI_API_KEY from env), but Gemini with service accounts broke.

This fix:

- Stashes the live LLM object on the wrapper as a _hf_llm attribute
- On resume, looks up the method and retrieves the live LLM if available
- Falls back to the serialized string for backward compatibility
- Preserves _hf_llm through FlowMethod wrapper decorators

Co-authored-by: Joao Moura <joao@crewai.com>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>