Real-time inference for Stable Diffusion - 0.88s latency. Covers AITemplate, nvFuser, TensorRT, FlashAttention. Join our Discord community: https://discord.com/invite/TgHXuSJEk6
553 stars · 35 forks
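For context, a minimal sketch of the kind of optimized Stable Diffusion inference the project description refers to, using Hugging Face diffusers with fp16 weights and memory-efficient (FlashAttention-style) attention via xFormers. The model ID, prompt, step count, and timing harness here are illustrative assumptions, not this repository's benchmark code.

```python
# Minimal sketch (assumes torch, diffusers, and xformers are installed and a CUDA GPU is available).
# Illustrative only -- not this repo's benchmark; the 0.88s figure comes from the repo's own pipelines.
import time

import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # illustrative model ID
    torch_dtype=torch.float16,         # half precision for faster inference
)
pipe = pipe.to("cuda")
pipe.enable_xformers_memory_efficient_attention()  # FlashAttention-style attention kernels via xFormers

prompt = "a photo of an astronaut riding a horse"

# Warm-up run so kernel compilation and caching are not counted in the timing.
_ = pipe(prompt, num_inference_steps=25)

start = time.perf_counter()
image = pipe(prompt, num_inference_steps=25).images[0]
print(f"latency: {time.perf_counter() - start:.2f}s")
image.save("out.png")
```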
Issue #53: a bug in line 113 (Open), opened by kx-kexi 1 year ago