huggingface / transformers.js

State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
https://huggingface.co/docs/transformers.js

Hard-coded absolute, non-existent cache path in v3 CommonJS build #997

Closed: jens-ghc closed this 11 hours ago

jens-ghc commented 1 week ago

System Info

Transformers.js 3.0.1 running in Node 18 using CommonJS

Description

When using transformers.js in a CommonJS-based Node application, I run into an error when loading a model: the model files cannot be written to the local cache.

The reason is that the file transformers.cjs somehow has the default cache directory hard-coded to an absolute path: DEFAULT_CACHE_DIR is set to /home/runner/work/transformers.js/transformers.js/.cache/.
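
My guess (a rough sketch, not the actual transformers.js source) is that the default is meant to be resolved relative to the installed module at runtime, and that the bundling step evaluated that expression on the build machine, freezing the CI runner's absolute path into transformers.cjs:

// Sketch of the suspected mechanism only, not the real source.
// The ESM build can compute the cache directory relative to the module at runtime:
import * as path from 'node:path';
import { fileURLToPath } from 'node:url';

const moduleDir = path.dirname(fileURLToPath(import.meta.url));
const DEFAULT_CACHE_DIR = path.join(moduleDir, '.cache');

// If the bundler evaluates import.meta.url while producing transformers.cjs on CI,
// the expression collapses to a string literal baked into the output, e.g.
//   const DEFAULT_CACHE_DIR = '/home/runner/work/transformers.js/transformers.js/.cache/';
// which does not exist on the machine that later installs the package.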

Reproduction

// In the CommonJS build of the app, this static import ends up loading transformers.cjs
import { pipeline } from '@huggingface/transformers';
const pipe = await pipeline('zero-shot-classification', 'MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7', { dtype: 'q8' });

Getting this error message:

  errno: -2,
  code: 'ENOENT',
  syscall: 'mkdir',
  path: '/home/runner'
}
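
For reference, the same failure can be reproduced from a plain .cjs file; a rough equivalent (assuming the package's exports map routes require() to the transformers.cjs file that contains the hard-coded path):

// repro.cjs -- same reproduction, written as plain CommonJS
const { pipeline } = require('@huggingface/transformers');

async function main() {
  // The ENOENT error above is thrown while the model files are being downloaded and cached.
  const pipe = await pipeline(
    'zero-shot-classification',
    'MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7',
    { dtype: 'q8' },
  );
}

main().catch(console.error);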

The same error does not happen when using a dynamic import instead:

const pipeline = (await import('@huggingface/transformers')).pipeline;

That's because in that case transformers.mjs is used instead of transformers.cjs, and the former does not have the cache path hard-coded.
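
In the meantime, explicitly overriding the cache location before the first model load also seems to sidestep the broken default. A minimal sketch using env.cacheDir (the target directory below is just an example):

import { env, pipeline } from '@huggingface/transformers';
import * as os from 'node:os';
import * as path from 'node:path';

// Redirect the model cache to a directory that is known to be writable,
// instead of the baked-in /home/runner/... default (example path only).
env.cacheDir = path.join(os.tmpdir(), 'transformers-cache');

const pipe = await pipeline(
  'zero-shot-classification',
  'MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7',
  { dtype: 'q8' },
);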

xenova commented 3 days ago

Hi there! Thanks for the report. This seems to be an issue with our webpack setup for building the project in GitHub Actions (hence the /home/runner path). Will look into this.