Hugo-Persson / texify-wep-api

A simple wrapper around texify that exposes a web API
MIT License

Please add an offline version to use #1

Open math-feng opened 7 months ago

math-feng commented 7 months ago

Because there will be network restrictions in China

Hugo-Persson commented 7 months ago

You can run Texify and Pix2tex locally. Follow the documentation and point Obsidian to 127.0.0.1.

Let me know if you encounter any issues. I am running Texify locally and it works great.

Hugo-Persson commented 7 months ago

Oh, I saw your other issue. Maybe use a VPN?

math-feng commented 7 months ago

It does run well locally, but it uses far too much of my PC's resources!

math-feng commented 7 months ago

https://huggingface.co/docs/transformers/installation#offline-mode

This is a tutorial, but I don't know how to program.

math-feng commented 7 months ago
import os
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Make sure Transformers runs in offline mode
os.environ["TRANSFORMERS_OFFLINE"] = "1"

processor = None
model = None

def get_processor():
    global processor
    if processor is None:
        # Local path to the tokenizer/processor files
        processor = AutoTokenizer.from_pretrained("/local/path/to/processor", local_files_only=True)
    return processor

def get_model():
    global model
    if model is None:
        # Local path to the model files
        model = AutoModelForSeq2SeqLM.from_pretrained("/local/path/to/model", local_files_only=True)
    return model

# The rest of the Flask app stays unchanged

I asked ChatGPT and the above is the advice it gave. Is this OK, please?
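For what it's worth, the lazy "load once, then reuse" pattern in the snippet above can be sketched in plain Python, independent of transformers. This is only an illustration of the caching idea; `load_once` and `fake_model_loader` are hypothetical names, standing in for the `AutoTokenizer` / `AutoModelForSeq2SeqLM` loaders:

```python
# Sketch of the lazy-loading pattern: the expensive loader runs only
# on the first call, and later calls reuse the cached object.
_cache = {}

def load_once(name, loader):
    """Call loader() the first time `name` is requested, then cache it."""
    if name not in _cache:
        _cache[name] = loader()
    return _cache[name]

calls = []

def fake_model_loader():
    # Stands in for an expensive from_pretrained(...) call.
    calls.append(1)
    return "model-object"

m1 = load_once("model", fake_model_loader)
m2 = load_once("model", fake_model_loader)
# The loader ran exactly once; both calls returned the cached object.
```

The ChatGPT snippet does the same thing with module-level globals, which works fine for a single-process Flask app; the important part for offline use is `local_files_only=True` plus having the model files already downloaded to the local path.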

Hugo-Persson commented 7 months ago

I will look into it when I have some time. If you want, you could create a PR @math-feng