[On 2023-10-18] Instructions for using the FastAPI app in the OpenAlex\fastapi directory
Go to the folder:
cd OpenAlex\fastapi
Create a virtual environment, activate it, and install the dependencies:
python -m venv venv
venv\Scripts\activate
pip install -r requirements.txt
After the environment is correctly installed and activated, run the FastAPI server:
uvicorn main:app --reload
Now you can explore the API at http://127.0.0.1:8000/docs. Currently, the app supports
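The main:app argument tells uvicorn to import the module main and serve its app object. A rough sketch of that module:attribute convention (illustrative only; uvicorn's real loader handles more cases, such as factories and dotted paths):

```python
# Illustrative sketch of how uvicorn interprets the "main:app" target:
# the part before the colon names a module to import, the part after
# names the attribute (the FastAPI instance) to serve.
import importlib

def resolve_app(target: str):
    """Split 'module:attribute' and return the imported attribute."""
    module_name, _, attr = target.partition(":")
    module = importlib.import_module(module_name)
    return getattr(module, attr)

# With a stdlib module standing in for main.py:
# resolve_app("json:dumps") imports json and returns json.dumps.
```

This is why the command must be run from the directory that contains main.py: the module has to be importable from the current working directory.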
For the graph construction function to work properly, create a Neo4j instance on the Neo4j web console and put the configuration values (username, password, and URI) in a .env file. The .env file should be in the same directory as main.py. You can check the format of the .env file in .env.example. After the Neo4j instance is running, the application should work properly.
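As a sketch (the exact variable names below are assumptions; .env.example is authoritative), a .env file is just a set of KEY=value lines, which can be loaded with a few lines of standard-library Python if a package such as python-dotenv is not used:

```python
from pathlib import Path

def load_env(path: str = ".env") -> dict:
    """Parse simple KEY=value lines from a .env file.

    Ignores blank lines and # comments; a minimal stand-in for
    python-dotenv, sufficient for simple connection settings.
    """
    env = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

# Hypothetical contents of .env (check .env.example for the real keys):
#   NEO4J_URI=neo4j+s://xxxx.databases.neo4j.io
#   NEO4J_USERNAME=neo4j
#   NEO4J_PASSWORD=your-password
```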
This is a Flask app that uses an LLM to extract chemicals from a PDF file input.
Clone the repository: git clone https://github.com/wesharetechnology/flask-llm-pdf-analyzer.git
The project uses a conda environment for running large language models. To create the same conda environment, run: conda env create -f llm_env.yml
Activate the conda environment: conda activate llm_env
Tutorial Source: How to build a web application using Flask and deploy it to the cloud
Activate the environment (on Windows PowerShell, you may need to allow script execution first):
Set-ExecutionPolicy -ExecutionPolicy Unrestricted -Scope CurrentUser
Scripts\activate
Install the pip requirements:
pip install -r pip_requirements.txt
In main.py, find LLM_MODEL_PATH and change it to the correct local path of the model, or change it to THUDM/chatglm-6b to use the online model on Hugging Face. Then run the app from the terminal:
cd flask-llm-pdf-analyzer
python main.py
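The model-path setting presumably has a shape like the following (the variable name LLM_MODEL_PATH comes from the instructions above; the default value and the environment-variable override are illustrative assumptions, not the repository's actual code):

```python
import os

# Hypothetical shape of the setting in main.py. A local directory such as
# r"C:\models\chatglm-6b" loads weights from disk, while the bare repo id
# "THUDM/chatglm-6b" makes the transformers loader download the model
# from Hugging Face on first use.
LLM_MODEL_PATH = os.environ.get("LLM_MODEL_PATH", "THUDM/chatglm-6b")
```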
ChatGLM-6B license: here. Cite the article:
@article{zeng2022glm,
title={GLM-130B: An Open Bilingual Pre-trained Model},
author={Zeng, Aohan and Liu, Xiao and Du, Zhengxiao and Wang, Zihan and Lai, Hanyu and Ding, Ming and Yang, Zhuoyi and Xu, Yifan and Zheng, Wendi and Xia, Xiao and others},
journal={arXiv preprint arXiv:2210.02414},
year={2022}
}
Contributor: Wang Yumeng