crewAI/lib
Joao Moura 3a7e499a31 fix: enhance LLM response handling and serialization
* Updated the Flow class to improve error handling when both structured and simple prompting fail, ensuring the first outcome is returned as a fallback rather than raising.
* Introduced a new function, _serialize_llm_for_context, to serialize LLM objects with provider prefixes for better context management.
* Added tests to validate the new serialization logic and ensure correct behavior when LLM calls fail.

This update enhances the robustness of LLM interactions and improves the overall flow of handling outcomes.
2026-03-16 13:37:20 -07:00