Closed — YiwuZhong closed this issue 2 months ago.
Hi, thanks for your suggestion! I will try to find another way to upload it, which might take some time. Could you please try to download it using gdown, a command-line tool for downloading files from Google Drive?
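For reference, gdown accepts either a bare file ID or a `uc?id=` URL. Below is a minimal, hedged sketch of turning a Google Drive sharing link into a gdown command; the link used is a hypothetical placeholder, not an actual dataset link.

```python
# Sketch: extract the file ID from a Drive sharing link and build a gdown call.
# The example link is a made-up placeholder.
import re

def gdown_command(share_link):
    """Pull the file ID out of a Drive sharing link and build a gdown command."""
    m = re.search(r"/d/([\w-]+)|[?&]id=([\w-]+)", share_link)
    if not m:
        raise ValueError("no file id found in link")
    file_id = m.group(1) or m.group(2)
    return f"gdown https://drive.google.com/uc?id={file_id}"

print(gdown_command("https://drive.google.com/file/d/abc123XYZ/view?usp=sharing"))
# -> gdown https://drive.google.com/uc?id=abc123XYZ
```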
Google Drive can be a good option. Could you provide a Google Drive link?
The links I provided in the README are Google Drive links. You may check them.
Do you mean the links here? They seem to be OneDrive links, which are hard to download via command line.
Sorry, I remembered incorrectly. Initially, I wanted to use Google Drive, but a Google Drive account only provides 15GB of free storage space. However, HKUST has purchased more space for each student's OneDrive, so I ultimately chose OneDrive. Could you try using the rclone command-line tool to download the datasets from OneDrive?
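After configuring a OneDrive remote with `rclone config`, an `rclone copy` invocation can pull the shared folder down. A minimal sketch, assuming a remote named `onedrive` and a folder path `OV-datasets` (both placeholders, not the repo's actual names):

```python
# Sketch: build an `rclone copy` command for a OneDrive remote.
# "onedrive" and "OV-datasets" are assumed placeholder names; the real remote
# is whatever you set up via `rclone config`.
import shlex

def rclone_copy_cmd(remote, remote_path, local_dir):
    """Build an rclone copy command with a progress bar and parallel transfers."""
    return [
        "rclone", "copy",
        f"{remote}:{remote_path}", local_dir,
        "--progress",        # show transfer progress for large files
        "--transfers", "4",  # download several files in parallel
    ]

cmd = rclone_copy_cmd("onedrive", "OV-datasets", "./data")
print(shlex.join(cmd))
```

The command could then be run with `subprocess.run(cmd, check=True)`; rclone resumes partially transferred directories, which helps with ~300GB of data.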
Thanks for your reply. I'll have a try.
Besides, when you cropped regions from multi-view images, why did you compute the inverse of the Rtilt matrix? The visualized boxes seem to be slightly off. The vertices of these boxes are computed via the functions above.
Regarding the transformation: since the R matrix in ScanNet is a homogeneous transformation matrix (including both rotation and translation), its inverse transforms the point cloud from one coordinate system back to the other. I'm not sure why your visualization has a slight shift; you may need to recheck other details, such as the data, the matrix, and the visualization code.
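The round-trip property behind this can be checked numerically: if a 4x4 homogeneous matrix maps points from frame A to frame B, its inverse maps them back. A small sketch with a made-up transform (not actual ScanNet data):

```python
# Sketch: applying a homogeneous transform and its inverse round-trips points.
# T below is an arbitrary example (90-degree rotation about z plus translation),
# not a real Rtilt matrix from ScanNet.
import numpy as np

def transform(points, T):
    """Apply a 4x4 homogeneous transform to an (N, 3) point cloud."""
    homo = np.hstack([points, np.ones((len(points), 1))])  # lift to (N, 4)
    return (homo @ T.T)[:, :3]

c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
T = np.array([[c, -s, 0, 1.0],
              [s,  c, 0, 2.0],
              [0,  0, 1, 0.5],
              [0,  0, 0, 1.0]])

pts = np.random.rand(5, 3)
back = transform(transform(pts, T), np.linalg.inv(T))
print(np.allclose(back, pts))  # True: inv(T) undoes T
```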
Hi, is downloading the dataset using rclone+OneDrive going smoothly? If not, I plan to upload the datasets to Hugging Face or Baidu Netdisk. Would either of these be good for you?
Hugging Face would be better for the community. Thanks!
Also, I tried to visualize the detected boxes on images via your defined function and data loader. I used the point cloud data and multi-view data from the official ScanNet repo and always disabled augmentation, but the results seem to be off in the rotation angles. Should 'gt_box_angles' always be 0, or did I refer to the wrong functions?
Thanks for your valuable suggestion! I will upload the data to Hugging Face, which may take a few days; I'll let you know once it's finished. Please stay tuned. Regarding the data in ScanNet: if you extract the GT boxes following https://github.com/facebookresearch/votenet/blob/main/scannet/load_scannet_data.py#L117, the GT boxes will be non-oriented, i.e., the gt_box_angles are 0 because they are generated in an axis-aligned way. In our setting, we instead generate oriented GT boxes by calculating angles following https://github.com/lyhdet/OV-3DET/blob/main/Data_Maker/Test_GT_Maker/make_scannet_20cls_multi_thread.py#L205
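To illustrate the distinction, here is a minimal sketch of the axis-aligned extraction: the box is just the min/max extent of the instance points, so its heading angle is always 0. The oriented boxes additionally estimate a per-instance angle (as in the OV-3DET script linked above); that step is not reproduced here.

```python
# Sketch: an axis-aligned GT box from instance points, VoteNet-style.
# The angle is fixed at 0 by construction; oriented boxes would estimate it.
import numpy as np

def axis_aligned_box(points):
    """Return (cx, cy, cz, dx, dy, dz, angle) for an (N, 3) instance."""
    mins, maxs = points.min(axis=0), points.max(axis=0)
    center = (mins + maxs) / 2.0
    size = maxs - mins
    return np.concatenate([center, size, [0.0]])  # angle is always 0

box = axis_aligned_box(np.array([[0.0, 0.0, 0.0], [2.0, 4.0, 1.0]]))
print(box)  # [1.  2.  0.5 2.  4.  1.  0. ]
```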
Hi, may I ask if you managed to download the datasets successfully? Due to network restrictions in Mainland China, I encountered issues while uploading the large-scale datasets to Hugging Face. I will keep trying.
Hi, I still encountered issues while uploading the large-scale datasets due to network restrictions in Mainland China. You may download them from the OneDrive links I provided. If you have no further questions, I'll close the issue.
Hi Yiwu @YiwuZhong, finally, I found a simple way to download OneDrive files via command line. You can download the OV datasets by running `bash data_download.sh`, which is now documented in the README. Please check it out if you still need it :)
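One common trick a script like this can rely on (a hedged guess, since `data_download.sh` itself isn't shown here): appending `download=1` to a OneDrive/SharePoint sharing link turns it into a direct-download URL that `wget` or `curl` can fetch. The link below is a placeholder, not an actual dataset link.

```python
# Sketch: convert a OneDrive/SharePoint sharing link into a direct-download
# URL by adding download=1. The example link is a made-up placeholder.
from urllib.parse import parse_qs, urlencode, urlparse, urlunparse

def direct_download_url(share_link):
    """Append download=1 to a sharing link's query string."""
    parts = urlparse(share_link)
    query = parse_qs(parts.query)
    query["download"] = ["1"]
    return urlunparse(parts._replace(query=urlencode(query, doseq=True)))

url = direct_download_url("https://example.sharepoint.com/:u:/g/personal/x/doc?e=abc")
print(url)  # the same link with download=1 added to the query string
```

The resulting URL could then be passed to `wget -c` to get resumable command-line downloads.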
Thanks for your contribution to this nice work! I was setting up the repo but failed to download the large data files (~300GB) from SharePoint. Would you mind uploading them somewhere else so that the files can be downloaded via command line? Thanks!