jxmorris12 / vec2text

utilities for decoding deep representations (like sentence embeddings) back to text

Multi-GPU Support #56

Open shrijayan opened 1 month ago

shrijayan commented 1 month ago

First, I want to thank the team for the fantastic work on the vec2text library. It has been incredibly useful for my projects.

I have a feature request regarding the sequence_beam_width parameter in the invert_strings method. I noticed that setting sequence_beam_width=4 significantly increases memory consumption. While this is understandable given the increased computational requirements, it can be challenging to manage on a single GPU, especially for longer sequences.
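
For context, this is roughly the call pattern where the memory pressure shows up (a minimal sketch; the corrector loader name is an assumption and may be `load_corrector` in older releases):

```python
import vec2text

# Load a pretrained corrector (loader name may differ between vec2text versions).
corrector = vec2text.load_pretrained_corrector("gtr-base")

# sequence_beam_width=4 keeps four candidate sequences per input alive during
# inversion, so peak GPU memory grows roughly with the beam width.
recovered = vec2text.invert_strings(
    ["A sample sentence whose embedding we want to invert back to text."],
    corrector=corrector,
    num_steps=20,
    sequence_beam_width=4,
)
print(recovered)
```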

jxmorris12 commented 1 month ago

This is a great idea! I did all the evaluations on a single GPU, so I hadn't thought about this. Is there an easy way to parallelize beam search-style inference over multiple GPUs?
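
One low-effort option might be plain data parallelism rather than splitting a single beam search across devices: shard the input strings, load one corrector per GPU, and run inversion on each shard independently. A rough sketch under those assumptions (the loader name and the per-device placement behavior are assumptions, not something the library documents for multi-GPU use):

```python
import torch
import torch.multiprocessing as mp

import vec2text


def _invert_shard(rank, shards, return_dict):
    # Make cuda:<rank> the current device so the corrector's models land on this
    # GPU (assumes the loader places weights on the current CUDA device).
    torch.cuda.set_device(rank)
    corrector = vec2text.load_pretrained_corrector("gtr-base")
    return_dict[rank] = vec2text.invert_strings(
        shards[rank],
        corrector=corrector,
        num_steps=20,
        sequence_beam_width=4,
    )


def invert_strings_multi_gpu(strings, num_gpus):
    # Strided split: shard i holds strings i, i + num_gpus, i + 2 * num_gpus, ...
    shards = [strings[i::num_gpus] for i in range(num_gpus)]
    manager = mp.Manager()
    return_dict = manager.dict()
    mp.spawn(_invert_shard, args=(shards, return_dict), nprocs=num_gpus, join=True)

    # Re-interleave the per-shard outputs back into the original order.
    results = [None] * len(strings)
    for rank in range(num_gpus):
        for j, text in enumerate(return_dict[rank]):
            results[rank + j * num_gpus] = text
    return results


if __name__ == "__main__":
    sentences = ["first example sentence", "second example sentence"]
    n_gpus = max(1, min(len(sentences), torch.cuda.device_count()))
    print(invert_strings_multi_gpu(sentences, num_gpus=n_gpus))
```

This only shards the batch: each individual beam search still runs on one GPU, so it helps throughput over many inputs but not the memory footprint of a single long sequence at high beam width.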