Open kimia-cvengineer opened 11 months ago
Thank you so much @doc-doc. That's much better and very helpful.
Regarding the data, did you use the [Raw Videos from Charades (scaled to 480p) mp4] with the AG dump tool to extract the frames, or did you extract features directly from [RGB frames at 24fps (76 GB)]? I am asking because the two have different fps (the AG dump tool extracts samples, given the annotation file, based on each video's original fps), so I may not be able to reproduce the frames you extracted the features from.
I appreciate your assistance.
Hi, we use ffmpeg and decode each video (or the QA-related segment for STAR) at 3 fps.
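For reference, decoding at a fixed 3 fps can be done with ffmpeg's `fps` filter, which resamples regardless of the source video's native frame rate. Below is a minimal sketch (not the authors' actual script); the helper name `build_extract_cmd`, the output pattern, and the JPEG quality setting are my own assumptions:

```python
import subprocess  # only needed if you uncomment the run() call below

def build_extract_cmd(video_path, out_dir, fps=3):
    # Build an ffmpeg command that decodes `video_path` into numbered JPEG
    # frames sampled at `fps` frames per second. The "-vf fps=N" filter
    # resamples uniformly, independent of the video's original fps.
    return [
        "ffmpeg", "-i", video_path,
        "-vf", f"fps={fps}",
        "-q:v", "2",               # high-quality JPEG output
        f"{out_dir}/%06d.jpg",
    ]

cmd = build_extract_cmd("video.mp4", "frames")
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually extract frames
```

For a STAR QA segment you could additionally pass `-ss <start>` and `-to <end>` before the output pattern to restrict decoding to the annotated clip.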
@doc-doc Thank you so much for your explanation. Did you use the original scale or the downscaled one (480p)?
It should be the original scale.
Hello, I am also interested in running some experiments on the STAR dataset. Could you share the features for STAR again? The link above has expired. Thank you!
Thank you so much :)
Hello,
I want to conduct some experiments on the STAR dataset and noticed that there are parts of the code that try to load its data. I was wondering if I could have access to the files needed to load the STAR dataset and extract its features, e.g.:
```python
if self.dset == 'star':
    self.vid_clips = load_file(osp.dirname(csv_path) + f'/clips_{self.mode}.json')
```
It would be great if you could also share how you sampled the data into clips based on the id (referring to the JSON file). Did you follow the same clip-wise sampling that exists in `preprocess_features` for every qid?

```python
def get_video_feat_star(self, video_name, qid, width=320, height=240):
    ...
```
I sincerely appreciate your help.