gitbooo / CrossViVit

This repository contains code for the paper "Improving day-ahead Solar Irradiance Time Series Forecasting by Leveraging Spatio-Temporal Context"
https://arxiv.org/abs/2306.01112
MIT License
61 stars 4 forks

Share parameters #2

Closed deep404 closed 8 months ago

deep404 commented 9 months ago

Hi guys, thank you for sharing your amazing work! For training I am using the dataset posted at https://app.activeloop.ai/crossvivit/SunLake, which can be accessed with deeplake. I am also using the parameters you shared in your paper; however, I got errors saying that mat1 could not be multiplied with mat2 because of mismatched matrix sizes. To work around this, I changed ctx_channels from 8 to 23 and ts_channels from 16 to 8 in cross_vivit_bis.yaml, but I don't feel this is the correct solution. Could you please share all your parameters from cross_vivit_bis.yaml, txcontext_datamodule.yaml and cross_vivit.yaml?
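For context, the "mat1 and mat2 shapes cannot be multiplied" error is what PyTorch raises when a layer's configured input width does not match the width of the data actually fed to it. A minimal numpy sketch of the same mismatch, using the channel counts mentioned in this thread (the projection width 64 is made up for illustration):

```python
import numpy as np

# Hypothetical shapes based on the numbers in this thread:
# the dataset's context has 23 channels per token, but a projection
# built with ctx_channels=8 expects only 8 input channels.
ctx = np.random.randn(4, 23)       # 4 tokens, 23 channels from the data
w_wrong = np.random.randn(8, 64)   # weight created for ctx_channels=8
w_right = np.random.randn(23, 64)  # weight created for ctx_channels=23

try:
    ctx @ w_wrong                  # (4, 23) x (8, 64): inner dims differ
except ValueError as e:
    print("shape mismatch:", e)

out = ctx @ w_right                # (4, 23) x (23, 64) -> (4, 64)
print(out.shape)
```

So setting ctx_channels to the dataset's actual channel count makes the shapes line up, which is why changing it from 8 to 23 silences the error even if it is not the configuration used in the paper.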

yy0127-ai commented 9 months ago

Hello friend, I would like to ask how to download the dataset and run the program. I encountered this error: `deeplake.util.exceptions.DatasetHandlerError: A Deep Lake dataset does not exist at the given path (/network/scratch/g/ghait.boukachab/o/EUMETSAT/tsf_eumetsat_bis-64_64). Check the path provided or in case you want to create a new dataset, use deeplake.empty().` Perhaps this is a very simple question, but it has troubled me for a long time. Looking forward to your answer, thank you very much.

deep404 commented 9 months ago

I added the parameter reset=True: `self.deeplake_ds = deeplake.load(self.data_dir, reset=True)`. That solved the problem for me. I used the command `python main.py` to start training, but I am not sure it is correct.

yy0127-ai commented 9 months ago

Thanks for your answer. Sorry to ask such a simple question. Mainly, I don't know how to download the dataset from Deep Lake.

deep404 commented 9 months ago

It is not necessary to download the dataset from deeplake. Just go to configs/paths/default.yaml and change the data_dir param. Put there the link to the deeplake dataset: `data_dir: hub://crossvivit/SunLake`
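For reference, the change would look like this in configs/paths/default.yaml (any other keys in that file are omitted here):

```yaml
# configs/paths/default.yaml (other keys unchanged)
# Point data_dir at the hosted Deep Lake dataset instead of a local path:
data_dir: hub://crossvivit/SunLake
```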

yy0127-ai commented 9 months ago

Thank you very much!

jaggbow commented 9 months ago

Thank you @deep404 for helping @yy0127-ai with the issue. We're going to update the configs accordingly. Please note that the parameters used in the paper are the ones assigned in configs/experiment.

deep404 commented 9 months ago

My pleasure to help others. Great, I'm looking forward to it!

jaggbow commented 9 months ago

Hey,

I want to add that for any model you run, you have to pass experiment=model_you_want on the command line so that it knows which parameters to use; this way you won't have any problems. Can you try that and tell me if the problem persists?
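Concretely, the launch command with such an override would look like the following (the experiment name here is illustrative; use one of the files under configs/experiment/):

```shell
# Select the experiment config matching the model you want to train.
python main.py experiment=cross_vivit
```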

liujian123223 commented 3 weeks ago

> My pleasure to help others. Great, I'm looking forward to it!

> Hello friend, I would like to ask how to download the dataset and run the program. I encountered this error: `deeplake.util.exceptions.DatasetHandlerError: A Deep Lake dataset does not exist at the given path (/network/scratch/g/ghait.boukachab/o/EUMETSAT/tsf_eumetsat_bis-64_64). Check the path provided or in case you want to create a new dataset, use deeplake.empty().` Perhaps this is a very simple question, but it has troubled me for a long time. Looking forward to your answer, thank you very much.

Hello, do you know how to run CrossViVit on Linux? I only have one computer and haven't used Slurm before, so I don't know how to run it. I would greatly appreciate any assistance.

liujian123223 commented 3 weeks ago

> It is not necessary to download the dataset from deeplake. Just go to configs/paths/default.yaml and change the data_dir param. Put there the link to the deeplake dataset: `data_dir: hub://crossvivit/SunLake`

Hi, I am loading the SunLake dataset with deeplake, but the process is taking a very long time (more than two hours), so I get stuck here without any output. Did you encounter this problem?

Is it possible to download the SunLake dataset and use it locally? Looking forward to your reply!

```python
import deeplake

ds = deeplake.load('hub://crossvivit/SunLake')
```
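If streaming from the hub is too slow, one option may be to materialize a local copy once and then train from disk. This is a sketch assuming deeplake's `deepcopy` API; the local destination path is hypothetical:

```python
def download_sunlake(dst="./data/SunLake"):
    """Copy the remote SunLake dataset to local disk (a one-off cost),
    then open the local copy. Requires network access and deeplake."""
    import deeplake  # imported lazily so the sketch reads without deeplake installed

    deeplake.deepcopy("hub://crossvivit/SunLake", dst)
    return deeplake.load(dst, read_only=True)

# Usage (not run here):
# ds = download_sunlake()
# afterwards, point data_dir in configs/paths/default.yaml at "./data/SunLake"
```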

liujian123223 commented 3 weeks ago

> Thank you very much!

"Excuse me, have you successfully imported the SunLake dataset from Deeplake? I'm having a lot of trouble because it's very slow when I use ds = deeplake.load('hub://crossvivit/SunLake'). Can I discuss this issue with you?"