xenova / transformers.js

State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
https://huggingface.co/docs/transformers.js
Apache License 2.0
9.71k stars 571 forks

Implement numerically stable log_softmax() #812

Open taha-yassine opened 1 week ago

taha-yassine commented 1 week ago

A numerically stable implementation of log_softmax(). It follows the same approach as the current implementation of softmax(), and is also what PyTorch and SciPy use.

```js
log_softmax([1000, 1]);

// Current implementation
// > Array [ 0, -Infinity ]

// Proposed implementation
// > Array [ 0, -999 ]
```
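For reference, the standard stable recipe subtracts the maximum before exponentiating, so exp() never overflows; a minimal sketch in plain JavaScript (the actual implementation lives in the library's maths module and may differ in details):

```javascript
// Numerically stable log_softmax: shift by the max so the largest
// exponent is exp(0) = 1, then apply log-sum-exp to the shifted values.
// log_softmax(x_i) = (x_i - max) - log(sum_j exp(x_j - max))
function log_softmax(arr) {
    const maxVal = Math.max(...arr);

    let sumExps = 0;
    for (const x of arr) {
        sumExps += Math.exp(x - maxVal); // each term is at most 1, so no overflow
    }
    const logSum = Math.log(sumExps);

    return arr.map(x => x - maxVal - logSum);
}

console.log(log_softmax([1000, 1])); // [ 0, -999 ] instead of [ 0, -Infinity ]
```

For [1000, 1], the shifted values are [0, -999]; exp(-999) underflows harmlessly to 0, so the log-sum-exp term is log(1) = 0 and the result stays finite.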
xenova commented 4 days ago

Thanks so much! 🤗 Can you add a unit test for this to tests/maths.test.js? One normal case and one case where the unstable version will break? 🔥
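A sketch of what those two tests might check, written with plain assertions (the real tests/maths.test.js would use the project's test harness and import log_softmax from the maths module rather than redefining it):

```javascript
// Stand-in for the library's log_softmax (stable max-subtraction variant);
// in the actual test this would be imported, not redefined.
function log_softmax(arr) {
    const maxVal = Math.max(...arr);
    const logSum = Math.log(arr.reduce((acc, x) => acc + Math.exp(x - maxVal), 0));
    return arr.map(x => x - maxVal - logSum);
}

// Normal case: exponentiating the outputs should recover a valid
// probability distribution (sums to 1).
const normal = log_softmax([1, 2, 3]);
const total = normal.reduce((acc, x) => acc + Math.exp(x), 0);
if (Math.abs(total - 1) > 1e-9) throw new Error('normal case failed');

// Stability case: a naive log(softmax(x)) underflows to log(0) = -Infinity
// for large inputs, but the stable version stays finite.
const stable = log_softmax([1000, 1]);
if (stable[0] !== 0 || stable[1] !== -999) throw new Error('stability case failed');
```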

taha-yassine commented 3 days ago

Done :)

HuggingFaceDocBuilderDev commented 2 days ago

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

xenova commented 2 days ago

(ignore failing tests; related to something else)