Fixes issues with result as answer not properly exiting LLM loop (#1689)

* v1 of fix implemented. Need to confirm with tokens.

* remove print statements
Brandon Hancock (bhancock_ai)
2024-12-02 13:38:17 -05:00
committed by GitHub
parent 4bc23affe0
commit 3285c1b196
4 changed files with 44 additions and 15 deletions


@@ -73,6 +73,7 @@ class BaseTool(BaseModel, ABC):
             description=self.description,
             args_schema=self.args_schema,
             func=self._run,
+            result_as_answer=self.result_as_answer,
         )

     @classmethod
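The one-line fix above propagates the tool's `result_as_answer` flag when the tool is converted, so the agent loop can see it. The sketch below (hypothetical names and loop shape, not CrewAI's actual internals) illustrates the behavior the flag enables: when an executed tool is marked `result_as_answer=True`, its raw output is returned as the final answer immediately, instead of being fed back to the LLM for another iteration.

```python
# Minimal sketch of an agent loop honoring a result_as_answer flag.
# All names here (Tool, run_agent, llm_step) are illustrative, not CrewAI's API.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    func: Callable[[str], str]
    result_as_answer: bool = False  # the flag the commit propagates during conversion

def run_agent(llm_step: Callable[[str], str], tools: dict[str, Tool],
              task: str, max_iters: int = 5) -> str:
    context = task
    for _ in range(max_iters):
        action = llm_step(context)  # e.g. "lookup: query" or "final: answer"
        if action.startswith("final:"):
            return action[len("final:"):].strip()
        name, _, arg = action.partition(":")
        observation = tools[name.strip()].func(arg.strip())
        if tools[name.strip()].result_as_answer:
            # Exit the LLM loop: the tool's output IS the final answer.
            return observation
        context += f"\nObservation: {observation}"
    return context
```

Without the propagation fixed by this commit, the converted tool would always carry the default `False`, so the early return would never trigger and the loop would keep consulting the LLM.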