huggingface / transformers.js

State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
https://huggingface.co/docs/transformers.js
Apache License 2.0

How can I use this model? #539

Open wfk007 opened 9 months ago

wfk007 commented 9 months ago

Question

How can I use this model? https://huggingface.co/shibing624/macbert4csc-base-chinese

xenova commented 9 months ago

Sure, here's an example (adapted from the model card; the logits match the Python version):

import { BertTokenizer, BertForMaskedLM } from "@xenova/transformers";

const tokenizer = await BertTokenizer.from_pretrained("Xenova/macbert4csc-base-chinese");
const model = await BertForMaskedLM.from_pretrained("Xenova/macbert4csc-base-chinese", {
    quantized: false, // Remove this line to use the quantized version (the default)
});

// Example sentences with typos: "新情" should be "心情" (mood), "高心" should be "高兴" (happy)
const texts = ["今天新情很好", "你找到你最喜欢的工作,我也很高心。"];
const model_inputs = tokenizer(texts, { padding: true });

const outputs = await model(model_inputs);
console.log(outputs);
// MaskedLMOutput {
//   logits: Tensor {
//     dims: [ 2, 19, 21128 ],
//     type: 'float32',
//     data: Float32Array(802864) [
//         -6.916086673736572,  -6.508951187133789,  -6.746380805969238,
//         ...
//     ],
//     size: 802864
//   }
// }

(Note: I'm using a custom-converted version, since the original model does not provide a tokenizer.json file.)
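To turn these logits into predicted tokens, you can take the argmax over the vocabulary dimension at each position. A minimal sketch (the helper name and the synthetic input below are my own, not from the model card):

```javascript
// Argmax over the vocab dimension of flat MaskedLM logits.
// data: Float32Array of length batch * seqLen * vocabSize
// dims: [batch, seqLen, vocabSize], as in outputs.logits.dims
function argmaxTokens(data, dims) {
  const [batch, seqLen, vocabSize] = dims;
  const ids = [];
  for (let b = 0; b < batch; ++b) {
    const row = [];
    for (let t = 0; t < seqLen; ++t) {
      const offset = (b * seqLen + t) * vocabSize;
      let best = 0;
      for (let v = 1; v < vocabSize; ++v) {
        if (data[offset + v] > data[offset + best]) best = v;
      }
      row.push(best);
    }
    ids.push(row);
  }
  return ids;
}

// Tiny synthetic example: batch=1, seqLen=2, vocabSize=3
const ids = argmaxTokens(new Float32Array([0.1, 0.9, 0.2, 0.3, 0.1, 0.8]), [1, 2, 3]);
console.log(ids); // [[1, 2]]
```

For the real model you would pass `outputs.logits.data` and `outputs.logits.dims`, then decode each row back to text (e.g. with `tokenizer.decode(row, { skip_special_tokens: true })`).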

You can then port the rest of the example code (the `get_errors` function) from Python.
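As a starting point, here is a hedged sketch of that port: the model card's Python `get_errors` compares the corrected text with the original character by character and collects the differences. (This simplified version skips the unknown-token and whitespace filtering the original does.)

```javascript
// Compare corrected vs. original text character by character,
// returning [wrong_char, right_char, index] triples.
function getErrors(corrected, original) {
  const errors = [];
  const n = Math.min(corrected.length, original.length);
  for (let i = 0; i < n; ++i) {
    if (corrected[i] !== original[i]) {
      errors.push([original[i], corrected[i], i]);
    }
  }
  return errors;
}

console.log(getErrors("今天心情很好", "今天新情很好")); // [['新', '心', 2]]
```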