jemisjoky / TorchMPS

PyTorch toolbox for matrix product state models
MIT License

Embeddings #20

Closed · jemisjoky closed this 3 years ago

jemisjoky commented 3 years ago

Wrote a class that wraps a fixed (i.e. functional) embedding map, allowing probabilistic MPS models to work with either continuous or discrete input data. This is currently integrated into the Prob*MPS classes; potential use cases for this functionality include (a) probabilistic modeling over continuous input data, and (b) language modeling using word embeddings.
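
For illustration, here is a minimal sketch of what such a fixed-embedding wrapper could look like. The class name `FixedEmbedding`, the `embed_fn`/`emb_dim` arguments, and the trigonometric embedding example are assumptions for this sketch, not necessarily the actual TorchMPS API:

```python
import math

import torch
import torch.nn as nn


class FixedEmbedding(nn.Module):
    """Hypothetical wrapper around a fixed (non-trainable) embedding map.

    `embed_fn` maps raw inputs of shape (batch, seq_len) to embedded
    vectors of shape (batch, seq_len, emb_dim), so the same MPS model
    can consume continuous values or discrete tokens.
    """

    def __init__(self, embed_fn, emb_dim):
        super().__init__()
        self.embed_fn = embed_fn
        self.emb_dim = emb_dim

    def forward(self, inputs):
        embedded = self.embed_fn(inputs)
        # Sanity check that the embedding map produced the declared dimension
        assert embedded.shape[-1] == self.emb_dim
        return embedded


# Example: a common trigonometric embedding for continuous inputs in [0, 1]
def trig_embed(x):
    x = x.unsqueeze(-1)
    return torch.cat(
        [torch.cos(math.pi * x / 2), torch.sin(math.pi * x / 2)], dim=-1
    )


embedding = FixedEmbedding(trig_embed, emb_dim=2)
batch = torch.rand(16, 10)        # 16 sequences of 10 continuous values
print(embedding(batch).shape)     # torch.Size([16, 10, 2])
```

The same wrapper pattern would cover the discrete case by passing, e.g., a lookup into a pretrained word-embedding table as `embed_fn`.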