microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

[Web] custom wasm model location path for inference #22504

Open sca1235 opened 4 days ago

sca1235 commented 4 days ago

Describe the issue

We are using onnxruntime-web and want to know if there is a configuration option to customize where the runtime loads its WASM files from during inference. Currently they are fetched from the root of the app build, but we want to serve them from a custom path.

Does anyone have an idea how to change this so the .wasm files can live in a custom location?

We are trying to find a way to set custom WASM paths.

So far we have not found a direct configuration for custom WASM file paths in ONNX Runtime Web's API; the only workaround we found is to make sure the files are served from the path that ONNX Runtime expects.

To reproduce

We are bundling the WASM files, but when we call inference the runtime only looks in the root location, even though we placed the WASM files elsewhere.

import cv from "@techstark/opencv-js";
import { Tensor, InferenceSession } from "onnxruntime-web";
import CryptoJS from 'crypto-js';

window.liveIdLibraries = {
    Tensor: Tensor,
    InferenceSession: InferenceSession,
    CryptoJS: CryptoJS,
    cv: cv
};

And then our webpack, we do this:

new CopyPlugin({
    patterns: [
        {
            // copy the onnxruntime-web WASM binaries into our custom assets folder
            from: path.resolve(__dirname, 'node_modules/onnxruntime-web/dist/*.wasm'),
            to: "../html/assets/js/id-models/[name][ext]"
        },
    ],
}),

Urgency

No response

Platform

Windows

OS Version

web

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1

ONNX Runtime API

JavaScript

Architecture

X64

Execution Provider

Default CPU

Execution Provider Library Version

No response

gyagp commented 3 days ago

Are you looking for ort.env.wasm.wasmPaths?
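For context, `ort.env.wasm.wasmPaths` overrides the URL prefix from which onnxruntime-web fetches its .wasm binaries; it must be set before the first session is created. A minimal sketch, assuming the files were copied to `/assets/js/id-models/` as in the webpack config above (the path is an assumption from this issue, not a library default):

```javascript
import * as ort from "onnxruntime-web";

// Point the runtime at the directory holding the copied .wasm files.
// A trailing slash is required when passing a directory prefix;
// this must run before the first InferenceSession is created.
ort.env.wasm.wasmPaths = "/assets/js/id-models/";

// Sessions created after this fetch the WASM binaries from the
// custom path instead of the app root.
const session = await ort.InferenceSession.create("model.onnx");
```

`wasmPaths` also accepts an object mapping individual .wasm file names to URLs if the files are not all served from one directory.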