xenova / transformers.js

State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
https://huggingface.co/docs/transformers.js
Apache License 2.0

fetch failed when running the demo with Node #709

Open xiaobaichiliangpi opened 2 months ago

xiaobaichiliangpi commented 2 months ago

System Info

- transformers.js version: 2.16.1
- System: macOS
- Node version: v18.20.1

Environment/Platform

Description

```js
import { pipeline, env } from '@xenova/transformers';

env.allowLocalModels = false;

process.env.HTTP_PROXY = 'http://your.proxy.server:port';

// Create a feature extraction pipeline
const extractor = await pipeline('feature-extraction', 'nomic-ai/nomic-embed-text-v1', {
    quantized: false, // Comment out this line to use the quantized version
});

// Compute sentence embeddings
const texts = ['search_query: What is TSNE?', 'search_query: Who is Laurens van der Maaten?'];
const embeddings = await extractor(texts, { pooling: 'mean', normalize: true });
console.log(embeddings);
```

It returns the following error when I run `node index.js`. It seems the model cannot be fetched because of network restrictions. How can I set an internet proxy?

```
node:internal/deps/undici/undici:12618
      Error.captureStackTrace(err, this);
            ^

TypeError: fetch failed
    at node:internal/deps/undici/undici:12618:11
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async getModelFile (file:///Users/wangwen/Desktop/program/ai/vue-translator/node_modules/@xenova/transformers/src/utils/hub.js:471:24)
    at async getModelJSON (file:///Users/wangwen/Desktop/program/ai/vue-translator/node_modules/@xenova/transformers/src/utils/hub.js:575:18)
    at async Promise.all (index 1)
    at async loadTokenizer (file:///Users/wangwen/Desktop/program/ai/vue-translator/node_modules/@xenova/transformers/src/tokenizers.js:61:18)
    at async AutoTokenizer.from_pretrained (file:///Users/wangwen/Desktop/program/ai/vue-translator/node_modules/@xenova/transformers/src/tokenizers.js:4398:50)
    at async Promise.all (index 0)
    at async loadItems (file:///Users/wangwen/Desktop/program/ai/vue-translator/node_modules/@xenova/transformers/src/pipelines.js:3206:5)
    at async pipeline (file:///Users/wangwen/Desktop/program/ai/vue-translator/node_modules/@xenova/transformers/src/pipelines.js:3146:21) {
  cause: Error: read ECONNRESET
      at TLSWrap.onStreamRead (node:internal/stream_base_commons:217:20) {
    errno: -54,
    code: 'ECONNRESET',
    syscall: 'read'
  }
}

Node.js v18.20.1
```

Reproduction

Run the code above with `node index.js`.
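
A note on the proxy question: setting `process.env.HTTP_PROXY` inside the script has no effect, because Node 18's built-in `fetch` (undici) does not read proxy environment variables. One possible workaround, sketched below under the assumption that the `undici` package is installed and using the placeholder proxy URL from the snippet above, is to register a `ProxyAgent` as the global dispatcher before creating the pipeline:

```js
import { ProxyAgent, setGlobalDispatcher } from 'undici';
import { pipeline, env } from '@xenova/transformers';

// Route all fetch() calls in this process (including the model downloads made by
// transformers.js) through the proxy. The URL below is a placeholder.
setGlobalDispatcher(new ProxyAgent('http://your.proxy.server:port'));

env.allowLocalModels = false;

const extractor = await pipeline('feature-extraction', 'nomic-ai/nomic-embed-text-v1', {
    quantized: false,
});
```

This is not a transformers.js feature; it relies on Node's global `fetch` picking up the undici global dispatcher, so behaviour may vary across Node versions.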

JP-HoneyBadger commented 2 months ago

Following, similar issue.

honorsuper commented 2 months ago

Following, similar issue.

AngeloCarnevale commented 2 months ago

Following, similar issue.

BInyLU commented 1 week ago

Following, similar issue.
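
For anyone hitting the same network restriction, an alternative that avoids the proxy entirely is to download the model files once from a machine with access and serve them from disk via `env.localModelPath`. A rough sketch, assuming the files for `nomic-ai/nomic-embed-text-v1` (config.json, tokenizer.json, tokenizer_config.json and the `onnx/` folder) have been placed under `./models/nomic-ai/nomic-embed-text-v1/`:

```js
import { pipeline, env } from '@xenova/transformers';

// Load the model from the local folder instead of fetching it from the Hugging Face Hub.
env.allowLocalModels = true;
env.allowRemoteModels = false; // fail fast instead of attempting a network request
env.localModelPath = './models';

const extractor = await pipeline('feature-extraction', 'nomic-ai/nomic-embed-text-v1', {
    quantized: false,
});
```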