Closed: PeggyPeppa closed this issue 1 year ago.
It seems that the shape of the annotations does not match the expected shape. Maybe you can print out frame.get_annotations_traffic_elements() and (split, segment_id, timestamp) when the error occurs. The expected shape of element['points'] should be [2, 2]. This error may be caused by incompletely downloaded data or some I/O error.
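A minimal debug sketch along those lines (not from the repo; it assumes frame, split, segment_id, and timestamp are the variables in scope where the error is raised):

```python
# Hypothetical debug helper: dump the offending traffic-element annotation and
# the frame identifiers whenever a box does not have the expected [2, 2] shape.
for element in frame.get_annotations_traffic_elements():
    points = element['points']
    if points.shape != (2, 2):
        print('unexpected points shape:', points.shape)
        print('element:', element)
        print('frame:', (split, segment_id, timestamp))
```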
Hi~ When training this baseline, I ran into the following problem. Do you have any suggestions or thoughts on it?
Have you solved this? I have the same problem as you.
Maybe you can provide us some info (https://github.com/OpenDriveLab/OpenLane-V2/issues/6#issuecomment-1451806700) so that we can have a look at the released data, as we currently don't have the resources to reproduce this problem.
frame.get_annotations_traffic_elements():
{'id': 'tl_00', 'category': 1, 'attribute': 0, 'points': array([[289.95984, 980.3213], [300.4016, 1002.00806]], dtype=float32)}
{'id': 'tl_01', 'category': 1, 'attribute': 0, 'points': array([[315.06024, 983.7349], [329.11646, 1001.80725]], dtype=float32)}
{'id': 'tl_02', 'category': 1, 'attribute': 0, 'points': array([[380.3213, 914.6586], [400.2008, 935.9438]], dtype=float32)}
{'id': 'tl_03', 'category': 1, 'attribute': 0, 'points': array([[410.24097, 911.4458], [420.68274, 938.9558]], dtype=float32)}
{'id': 'tl_04', 'category': 1, 'attribute': 0, 'points': array([[482.53012, 909.63855], [504.21686, 937.9518]], dtype=float32)}
{'id': 'tl_05', 'category': 1, 'attribute': 0, 'points': array([[287.1486, 937.3494], [303.01205, 962.0482]], dtype=float32)}
{'id': 'rs_20', 'category': 0, 'attribute': 5, 'points': array([[507.39102, 893.107], [531.9066, 921.40076]], dtype=float32)}
{'id': 'rs_21', 'category': 0, 'attribute': 4, 'points': array([[606.1846, 890.8385], [629.57196, 920.62256]], dtype=float32)}
Is the size of the train folder the same as yours? We got 225913544 when running "du -s" in the train folder.
Hi, I just downloaded the data. Here is my md5sum result:

Then I preprocessed the data with data/OpenLane-V2/preprocess.py and trained the baseline, and met the same problem.
Is the md5 of OpenLane-V2_subset_A_info.tar d7b3553e8f06b7d4febb9e1ac133ec94?
Is the md5 of OpenLane-V2_subset_A_info.tar d7b3553e8f06b7d4febb9e1ac133ec94? Yes.
Same md5 here and the same problem: d7b3553e8f06b7d4febb9e1ac133ec94
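For what it's worth, a minimal sketch for checking the archive's md5 in Python instead of md5sum (assuming the tar is in the current directory; adjust the path to wherever you downloaded it):

```python
# Compute the md5 of the downloaded archive and compare it with the published checksum.
import hashlib

def md5sum(path, chunk_size=1 << 20):
    h = hashlib.md5()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            h.update(chunk)
    return h.hexdigest()

print(md5sum('OpenLane-V2_subset_A_info.tar'))
# expected: d7b3553e8f06b7d4febb9e1ac133ec94
```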
Is the size of the train folder the same as yours? We got 225913544 when running "du -s" in the train folder.

Hi, I haven't fixed this problem. I checked my md5, and it is the same as the author provided. But after unzipping, I only got 121915044 ... I wonder if something went wrong when I was unzipping the data.
Please hold on, we are looking into this problem.
Hi, a format error in the data caused the problem, and we have fixed it in the latest data.
Thanks. I also want to check the completeness of my data after unzipping. I used the command find -name "*.json" | wc -l to count the files in my data. The results are as follows: train: 54576 json, 157339 jpg; test: 4816 json, 33712 jpg; val: 4806 json, 33642 jpg. Are these numbers correct?
Hi: the numbers for the test and val folders are correct, but the train folder should have 22477 json and 157339 jpg files.
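If it helps others checking their copy, here is a minimal sketch (assuming a data root of data/OpenLane-V2 with train/val/test subfolders; adjust the path to your layout) that counts json and jpg files per split and compares them with the numbers above:

```python
# Count json/jpg files per split and compare with the counts reported in this thread.
from pathlib import Path

EXPECTED = {
    'train': {'json': 22477, 'jpg': 157339},
    'val':   {'json': 4806,  'jpg': 33642},
    'test':  {'json': 4816,  'jpg': 33712},
}

root = Path('data/OpenLane-V2')  # assumed data root, adjust as needed
for split, expected in EXPECTED.items():
    for ext, count in expected.items():
        found = sum(1 for _ in (root / split).rglob(f'*.{ext}'))
        status = 'OK' if found == count else 'MISMATCH'
        print(f'{split}/{ext}: found {found}, expected {count} -> {status}')
```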