withcatai / node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
https://node-llama-cpp.withcat.ai
MIT License
830 stars · 80 forks

feat: React Native support #287

Open raymon-io opened 1 month ago

raymon-io commented 1 month ago

Feature Description

Would it be possible for node-llama-cpp to support React Native projects? I know this project depends on many Node standard libraries, but could changes be made so it runs in the React Native runtime? Especially now that it supports arm64 Mac and Windows machines, could it run in arm64 non-Node environments? Even just importing the necessary functions from node-llama-cpp produces errors.

On Android it gives the error: "The package at "node_modules/node-llama-cpp/dist/evaluator/LlamaModel/LlamaModel.js" attempted to import the Node standard library module "process". It failed because the native React runtime does not include the Node standard library."
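For what it's worth, errors about missing Node standard library modules can sometimes be silenced at the bundler level by aliasing them to browser polyfills via Metro's resolver. A minimal sketch for an Expo project (the polyfill package names here are assumptions, and this only addresses the bundling error — it would not make the library's native llama.cpp bindings load in the React Native runtime):

```javascript
// metro.config.js — hypothetical shim: alias Node core modules
// (e.g. "process") to browser/RN-compatible polyfill packages so
// Metro can resolve them instead of failing at bundle time.
const { getDefaultConfig } = require('expo/metro-config');

const config = getDefaultConfig(__dirname);

config.resolver.extraNodeModules = {
  // These npm polyfill packages must be installed separately:
  //   npm install process path-browserify os-browserify
  process: require.resolve('process/browser'),
  path: require.resolve('path-browserify'),
  os: require.resolve('os-browserify/browser'),
};

module.exports = config;
```

Even with shims like this in place, the core question stands: the actual model evaluation relies on native Node addons, which React Native cannot load.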

On React Native web (using Expo) it gives the following error: "The resource from “http://localhost:8081/node_modules/expo/AppEntry.bundle?platform=web&dev=true&hot=false&lazy=true&transform.engine=hermes&transform.routerRoot=app” was blocked due to MIME type (“application/json”) mismatch."
