bigscience-workshop / petals
🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading
https://petals.dev
MIT License · 9.25k stars · 525 forks
running inference session with position getter/setter #594
Closed · justheuristic closed this issue 4 months ago
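For context, the title refers to Petals' client-side inference sessions. A minimal sketch of how a position getter/setter might be used is below; the `session.position` property and the rewind-to-checkpoint behavior are assumptions drawn from the issue title, not a confirmed public API.

```python
# Hypothetical sketch: a Petals inference session whose current token
# position can be read (getter) and set back to an earlier value (setter).
# The `session.position` attribute below is an assumption, not documented API.
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

model_name = "bigscience/bloom-560m"  # placeholder model for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoDistributedModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("A quick test:", return_tensors="pt")["input_ids"]

with model.inference_session(max_length=64) as session:
    # First call processes the prefix and generates a few tokens.
    outputs = model.generate(inputs, max_new_tokens=8, session=session)

    checkpoint = session.position  # assumed getter: tokens processed so far

    # ... generate more tokens, then discard them by rewinding the session:
    session.position = checkpoint  # assumed setter: restore the saved position

print(tokenizer.decode(outputs[0]))
```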