How can I avoid this prompt when I use a custom model from HuggingFace?
(base) andrey@m2 current % poetry run app
Starting semantic-router...
The repository for nomic-ai/nomic-embed-text-v1.5 contains custom code which must be executed to correctly load the model. You can inspect the repository content at https://hf.co/nomic-ai/nomic-embed-text-v1.5.
You can avoid this prompt in future by passing the argument `trust_remote_code=True`.
Do you wish to run the custom code? [y/N] y
The repository for nomic-ai/nomic-embed-text-v1.5 contains custom code which must be executed to correctly load the model. You can inspect the repository content at https://hf.co/nomic-ai/nomic-embed-text-v1.5.
You can avoid this prompt in future by passing the argument `trust_remote_code=True`.
Do you wish to run the custom code? [y/N] y
<All keys matched successfully>
2024-06-05 18:16:17 INFO semantic_router.utils.logger local
name='politics' function_call=None similarity_score=None
app.py
import os
import tiktoken
from semantic_router import Route, RouteLayer
from semantic_router.encoders import OpenAIEncoder, HuggingFaceEncoder
from dotenv import load_dotenv
from semantic_router.llms.ollama import OllamaLLM


def main():
    load_dotenv()  # take environment variables from .env

    # @see https://github.com/openai/tiktoken/blob/main/tiktoken/model.py#L20
    tiktoken.model.MODEL_TO_ENCODING["nomic_embed_text"] = "cl100k_base"

    politics = Route(
        name="politics",
        utterances=[
            "isn't politics the best thing ever",
            "why don't you tell me about your political opinions",
            "don't you just love the president",
            "don't you just hate the president",
            "they're going to destroy this country!",
            "they will save the country!",
        ],
    )
    chitchat = Route(
        name="chitchat",
        utterances=[
            "how's the weather today?",
            "how are things going?",
            "lovely weather today",
            "the weather is horrendous",
            "let's go to the chippy",
        ],
    )
    routes = [politics, chitchat]

    print("Starting semantic-router...")

    # Local with Ollama embeddings
    # encoder = OpenAIEncoder(name="nomic_embed_text")
    # rl = RouteLayer(encoder=encoder, routes=routes)

    # Local
    encoder = HuggingFaceEncoder(
        name="nomic-ai/nomic-embed-text-v1.5", trust_remote_code=True
    )
    rl = RouteLayer(encoder=encoder, routes=routes)

    # llm = OllamaLLM(
    #     llm_name="nomic_embed_text"  # openhermes
    # )
    # rl = RouteLayer(encoder=encoder, routes=routes, llm=llm)

    result = rl("don't you love politics?")
    print(result)
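For context, one likely reason the prompt still appears despite `trust_remote_code=True`: `HuggingFaceEncoder` is a pydantic model, and pydantic ignores unknown keyword arguments by default, so a bare `trust_remote_code=True` may never reach transformers. The encoder does (in the versions I have looked at) expose `model_kwargs` and `tokenizer_kwargs` dicts that are forwarded to `AutoModel.from_pretrained` / `AutoTokenizer.from_pretrained`. A minimal sketch under that assumption — check the encoder signature in your installed semantic-router version:

```python
def remote_code_kwargs() -> dict:
    """Kwargs intended for transformers' from_pretrained calls,
    which is where trust_remote_code is actually consumed."""
    return {"trust_remote_code": True}


def build_encoder():
    # Imports kept local so this sketch stays importable even
    # without semantic-router installed.
    from semantic_router.encoders import HuggingFaceEncoder

    # Assumption: HuggingFaceEncoder forwards these dicts to
    # AutoModel.from_pretrained / AutoTokenizer.from_pretrained,
    # so the "Do you wish to run the custom code?" prompt is skipped.
    return HuggingFaceEncoder(
        name="nomic-ai/nomic-embed-text-v1.5",
        model_kwargs=remote_code_kwargs(),
        tokenizer_kwargs=remote_code_kwargs(),
    )
```

If your version has no such parameters, the fallback is to load the model yourself with `transformers.AutoModel.from_pretrained(..., trust_remote_code=True)` once so it is cached, or to answer the prompt interactively as in the log above.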