From be2def3fc8e2cb6a02bafaa9f97721fce346dfe9 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Jo=C3=A3o=20Moura?=
Date: Tue, 19 Mar 2024 08:30:59 -0300
Subject: [PATCH] Adding HuggingFace docs

---
 docs/how-to/LLM-Connections.md | 34 ++++++++++++++++++++++++++++++++++
 1 file changed, 34 insertions(+)

diff --git a/docs/how-to/LLM-Connections.md b/docs/how-to/LLM-Connections.md
index fad9f42b5..aa6cab77d 100644
--- a/docs/how-to/LLM-Connections.md
+++ b/docs/how-to/LLM-Connections.md
@@ -50,6 +50,40 @@ OPENAI_MODEL_NAME='openhermes' # Adjust based on available model
 OPENAI_API_KEY=''
 ```
 
+## HuggingFace Integration
+There are a couple of different ways you can use HuggingFace to host your LLM.
+
+### Your own HuggingFace endpoint
+```python
+from crewai import Agent
+from langchain_community.llms import HuggingFaceEndpoint
+
+llm = HuggingFaceEndpoint(
+    endpoint_url="",  # URL of your dedicated inference endpoint
+    huggingfacehub_api_token="",  # your HuggingFace API token
+    task="text-generation",
+    max_new_tokens=512
+)
+
+agent = Agent(
+    role="HuggingFace Agent",
+    goal="Generate text using HuggingFace",
+    backstory="A diligent explorer of GitHub docs.",
+    llm=llm
+)
+```
+
+### From the HuggingFaceHub endpoint
+```python
+from langchain_community.llms import HuggingFaceHub
+
+llm = HuggingFaceHub(
+    repo_id="HuggingFaceH4/zephyr-7b-beta",
+    huggingfacehub_api_token="",  # your HuggingFace API token
+    task="text-generation",
+)
+```
+
 ## OpenAI Compatible API Endpoints
 Switch between APIs and models seamlessly using environment variables, supporting platforms like FastChat, LM Studio, and Mistral AI.
 