Open microprediction opened 1 month ago
Indeed, it went OOM on my laptop. After switching to a larger server, a simple
```python
import timesfm

tfm = timesfm.TimesFm(
    context_len=128,
    horizon_len=64,
    input_patch_len=32,
    output_patch_len=128,
    num_layers=20,
    model_dims=1280,
    backend="gpu",
)
tfm.load_from_checkpoint(repo_id="google/timesfm-1.0-200m")
```
seems to take around 18 GB of VRAM.

Can you provide approximate memory-requirement guidance for using the model?
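For context, a rough back-of-envelope estimate of the weight footprint (a sketch only, assuming float32 parameters; activation memory and framework overhead are not included and usually dominate in practice):

```python
def weight_memory_gib(num_params: float, bytes_per_param: int = 4) -> float:
    """Approximate weight footprint in GiB, assuming dense float32 params."""
    return num_params * bytes_per_param / 1024**3

# timesfm-1.0-200m has roughly 200 million parameters.
print(f"{weight_memory_gib(200e6):.2f} GiB")  # ~0.75 GiB for weights alone
```

Since the weights alone are well under 1 GiB, most of the 18 GB reading may not be the model itself: JAX by default preallocates about 75% of total GPU memory at startup, so tools like `nvidia-smi` can report far more than the model actually needs. Setting `XLA_PYTHON_CLIENT_PREALLOCATE=false` (or tuning `XLA_PYTHON_CLIENT_MEM_FRACTION`) may give a more realistic picture of actual usage.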