Real-time inference for Stable Diffusion - 0.88s latency. Covers AITemplate, nvFuser, TensorRT, and FlashAttention. Join our Discord community: https://discord.com/invite/TgHXuSJEk6
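As a rough illustration of the kind of latency measurement the description refers to, the sketch below loads a Stable Diffusion pipeline with Hugging Face diffusers, enables a FlashAttention-style memory-efficient attention kernel via xFormers, and times one 512x512 generation. The model id, prompt, and step count are illustrative assumptions and this is not this repository's actual benchmark code.

```python
# Minimal latency sketch (assumptions: model id, prompt, 25 steps; not the repo's benchmark).
import time

import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumed model id for illustration
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")
# FlashAttention-style memory-efficient attention via xFormers.
pipe.enable_xformers_memory_efficient_attention()

prompt = "a photo of an astronaut riding a horse"

# Warm-up run so CUDA kernel compilation does not skew the measurement.
pipe(prompt, num_inference_steps=25)

torch.cuda.synchronize()
start = time.perf_counter()
image = pipe(prompt, num_inference_steps=25).images[0]
torch.cuda.synchronize()
print(f"latency: {time.perf_counter() - start:.2f}s")
image.save("out.png")
```

Further speedups in the stacks named above (AITemplate, nvFuser, TensorRT) come from compiling the UNet and other submodules into fused kernels rather than running them eagerly.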
553 stars · 35 forks
Fix typo #11 (closed by Toan-Do 2 years ago)