visheratin/web-ai
Run modern deep learning models in the browser.
Demo: https://web-ai-demo.vercel.app
MIT License · 809 stars · 42 forks
Issues
| #   | Title                                                                                      | Author                  | Status              | Comments |
|-----|--------------------------------------------------------------------------------------------|-------------------------|---------------------|----------|
| #21 | Inference of Text-To-Speech model                                                            | anishhguptaa            | opened 2 months ago | 0        |
| #20 | Failed to resolve '@jimp/types'                                                              | crapthings              | opened 1 year ago   | 1        |
| #19 | Custom Model                                                                                 | edwinjosechittilappilly | opened 1 year ago   | 1        |
| #18 | vite.js complaining that web-ai is missing a dot export in package.json                      | davidtbo                | opened 1 year ago   | 3        |
| #16 | model output undefined with Bert models                                                      | rawsh                   | opened 1 year ago   | 1        |
| #15 | Split utils/prepare to make TextModels only usage at least works without Node.js polyfills   | leaysgur                | closed 1 year ago   | 6        |
| #14 | SyntaxError: Unexpected token 'export'                                                       | rifkyniyas              | closed 1 year ago   | 4        |
| #13 | Create a tensor when token_type_ids is expected                                              | GarciaLnk               | closed 1 year ago   | 3        |
| #12 | token_type_ids input missing on BERT models                                                  | GarciaLnk               | closed 1 year ago   | 0        |
| #11 | Allow changing the URL of wasmPaths and tokenizers .wasm files                               | GarciaLnk               | closed 1 year ago   | 3        |
| #10 | Create table to showcase processing time required for different models                      | shivaylamba             | opened 1 year ago   | 1        |
| #9  | Add resource consumption guardrails                                                          | visheratin              | closed 1 year ago   | 2        |
| #8  | Add generation options                                                                       | visheratin              | opened 1 year ago   | 0        |
| #7  | ORT environment variables are not accessible through this library                           | GarciaLnk               | closed 1 year ago   | 4        |
| #6  | Add {type: 'module'} as a second argument to the Worker constructor                          | GarciaLnk               | closed 1 year ago   | 2        |
| #5  | Error using ONNX T5 model exported from Optimum                                              | GarciaLnk               | closed 1 year ago   | 5        |
| #4  | Onnx Conversion scripts                                                                      | volkancirik             | opened 1 year ago   | 12       |
| #3  | Does this support webGPU?                                                                    | altryne                 | opened 1 year ago   | 1        |
| #2  | A demo site                                                                                  | volkancirik             | closed 1 year ago   | 2        |
| #1  | Typo fixes                                                                                   | maccman                 | closed 1 year ago   | 1        |