Closed: gibbs-shih closed this 3 months ago
Put the chain info values into `.env`
Add a `.env.sample` example
In `main.ts`, load the chain info values from `.env`, assemble them into `chainInfo`, and export it; the default is the iSunCoin chain
Update every place that previously used `CHAIN_INFO`, and its source:
crawler.ts
parser.ts
block.ts
receipt.ts
parse_report_name_address.ts
crawl_report.ts
get_raw_data.ts
parsers.ts
Remove the original `constants/chain_info.ts`: the values could not be read when the logic lived in `main.ts`, so it was moved out of `main.ts` into `lib/chain_info` and the `.env` setup was adjusted to make sure the values load; all paths now import from `lib/chain_info`
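A minimal sketch of what `lib/chain_info` might look like after this change. The field names, the default values, and the placeholder RPC URL below are illustrative assumptions, not the repository's exact code:

```typescript
// Sketch of lib/chain_info: read the chain info from environment variables
// and fall back to iSunCoin defaults. All names and defaults here are
// assumptions for illustration, not the repository's exact code.
export interface ChainInfo {
  chainId: number;
  chainName: string;
  symbol: string;
  decimal: number;
  rpc: string;
}

export const chainInfo: ChainInfo = {
  chainId: Number(process.env.CHAIN_ID ?? "8017"), // assumed default id
  chainName: process.env.CHAIN_NAME ?? "iSunCoin",
  symbol: process.env.SYMBOL ?? "ISC",             // assumed symbol
  decimal: Number(process.env.DECIMAL ?? "18"),
  rpc: process.env.RPC ?? "https://example.invalid/rpc", // placeholder URL
};
```

Consumers such as `crawler.ts` would then import `chainInfo` from `lib/chain_info` instead of reading `CHAIN_INFO` from `constants/chain_info.ts`.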
This guide will walk you through the steps to set up the BAIFA web crawler on a remote server.
Here are the main steps to set up the compilation environment and run the crawler:
1. Set up the environment
2. Clone the repository and configure files
3. Install dependencies and set up the database
4. Run the crawler using PM2
Connect to the remote server via SSH: `ssh [user_name]@[IP_address]`
Create the `/workspace` directory: `sudo mkdir /workspace`
Change the owner of `/workspace` to the current user: `sudo chown -R ${user} /workspace`
Enter the `/workspace` directory: `cd /workspace`
Clone the repository: `git clone https://github.com/CAFECA-IO/BAIFA-web-crawling.git`
Enter the `BAIFA-web-crawling` directory and list the files: `cd BAIFA-web-crawling`, then `ls -al`
Copy the `.env.sample` file to `.env`: `cp .env.sample .env`
Edit the `.env` file with `vi .env`, press `i` to enter edit mode, and fill in the following values:
DATABASE_URL="postgresql://[name]:[password]@[host]:5432/[database_name]"
CHAIN_ID=[chain number, type: number]
CHAIN_NAME="[chain name, type: string]"
SYMBOL="[chain symbol, type: string]"
DECIMAL=[chain decimal, type: number]
RPC="[chain rpc url, type: string]"
Press `Esc` to exit edit mode, type `:wq`, and press Enter to save and exit. Then install the dependencies: `npm install`
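Before starting the crawler, a throwaway script like the one below can confirm that all six variables are present and that the numeric ones actually parse. This check, and the demo values it falls back to, are illustrative assumptions, not part of the repository:

```typescript
// Hypothetical sanity check (not part of the repository): verify the six
// chain-info variables exist and that CHAIN_ID and DECIMAL parse as numbers.
// The fallback values below stand in for a real .env load (e.g. via dotenv).
process.env.DATABASE_URL ??= "postgresql://user:pass@localhost:5432/baifa";
process.env.CHAIN_ID ??= "8017";   // example value, not a real chain id
process.env.CHAIN_NAME ??= "iSunCoin";
process.env.SYMBOL ??= "ISC";      // assumed symbol
process.env.DECIMAL ??= "18";
process.env.RPC ??= "https://example.invalid/rpc";

const required = ["DATABASE_URL", "CHAIN_ID", "CHAIN_NAME", "SYMBOL", "DECIMAL", "RPC"];
const missing = required.filter((key) => !process.env[key]);

// CHAIN_ID and DECIMAL are typed as numbers, so they must not be quoted in .env
const chainId = Number(process.env.CHAIN_ID);
const decimal = Number(process.env.DECIMAL);
const valid = missing.length === 0 && !Number.isNaN(chainId) && !Number.isNaN(decimal);

console.log(valid ? "chain info looks valid" : `problems with: ${missing.join(", ")}`);
```

If `CHAIN_ID` or `DECIMAL` were quoted with stray characters, `Number(...)` would yield `NaN` and the check would flag it before the crawler starts.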
Push the Prisma schema to the database: `npx prisma db push --schema=./prisma/schema.prisma`
Install PM2 globally and start the crawler: `npm install pm2 -g`, then `pm2 start npm --name BAIFA-CRAWLER -- start`
Useful PM2 commands: `pm2 ls` to list the running processes, `pm2 kill` to stop them, and `pm2 log 0` to follow the logs of process 0. Press `Ctrl + C` to exit log observation.
take 3 hr
Adjusted `.env.sample`, updated `README.md` so that the chain info settings live in `.env`, and redeployed.