**Closed** · keefeleen closed this issue 1 month ago
Our code (following the sample):

```python
model = timesfm.TimesFm(
    context_len=128,
    horizon_len=5,
    input_patch_len=32,
    output_patch_len=128,
    num_layers=20,
    model_dims=1280,
    backend=backend,
)
model.load_from_checkpoint(repo_id="google/timesfm-1.0-200m")
```
With this, our process shows GPU memory usage of around 12237 MiB, which is a lot.
I have 64 GB of RAM and 64 GB of swap, and it still OOMs.
Can you try setting the environment variable `XLA_PYTHON_CLIENT_PREALLOCATE=false`?
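For context: by default, JAX preallocates a large fraction (about 75%) of the available GPU memory at startup, so the reported 12 GiB is mostly the preallocation, not the model itself. A minimal sketch of how to disable this, assuming the variables are set before `jax` (or `timesfm`) is first imported; the `0.50` memory fraction is a hypothetical value you would tune for your GPU:

```python
import os

# Disable JAX/XLA GPU memory preallocation. This must happen before the first
# import of jax (which timesfm imports internally), because JAX reads the
# variable at initialization time.
os.environ["XLA_PYTHON_CLIENT_PREALLOCATE"] = "false"

# Optionally, cap the fraction of GPU memory JAX may use instead of disabling
# preallocation entirely (hypothetical value; adjust for your setup).
os.environ["XLA_PYTHON_CLIENT_MEM_FRACTION"] = "0.50"

# import timesfm  # import the library only after the variables above are set
```

With preallocation disabled, allocations grow on demand, so `nvidia-smi` should report memory closer to what the model actually uses; note that on-demand allocation can be slower and more prone to fragmentation than preallocation.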