Were you able to fix it?
Yes, I was able to fix it. There was a mistake with `rmin`, `rmax`, `cmin`, and `cmax`, because I took them from annotations that use a different coordinate system. I also had some problems with my masks.
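For anyone who runs into the same issue, a minimal sketch of the kind of conversion that was missing in my case. It assumes the annotation stores a COCO-style `[x, y, w, h]` box in image (x, y) coordinates while the dataset indexes by rows and columns; the function name and annotation layout here are illustrative, not DenseFusion code:

```python
import numpy as np

def bbox_xywh_to_rc(bbox, img_h, img_w):
    """Convert an [x, y, w, h] annotation (x = column, y = row) into the
    (rmin, rmax, cmin, cmax) row/column bounds the dataset code expects.
    Swapping x and y here produces boxes that miss the object entirely."""
    x, y, w, h = bbox
    rmin = max(0, int(np.floor(y)))          # rows come from y
    rmax = min(img_h, int(np.ceil(y + h)))
    cmin = max(0, int(np.floor(x)))          # columns come from x
    cmax = min(img_w, int(np.ceil(x + w)))
    return rmin, rmax, cmin, cmax
```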
Hello,
I adapted the LineMOD dataset class (`linemod_dataset` from `dataset.py`) to use a custom dataset. I modified the paths to the RGB images, depth images, and masks, and I also modified the information about the bounding box corners. When I try to run training on the custom dataset, I get a very long error message:
```
+ set -e
+ export PYTHONUNBUFFERED=True
+ PYTHONUNBUFFERED=True
+ export CUDA_VISIBLE_DEVICES=0
+ CUDA_VISIBLE_DEVICES=0
+ python3 ./tools/train.py --dataset custom --dataset_root ./datasets/custom/custom
/home/katharina/Schreibtisch/DenseFusion-Pytorch-1.0/lib/transformations.py:1912: UserWarning: failed to import module _transformations
  warnings.warn('failed to import module %s' % name)
Object 1 buffer loaded
Object 1 buffer loaded
>>>>>>>>----------Dataset loaded!---------<<<<<<<<
length of the training set: 11
length of the testing set: 1
number of sample points on mesh: 500
symmetry object list: [1]
/home/katharina/Schreibtisch/DenseFusion-Pytorch-1.0/venv/lib/python3.6/site-packages/torch/nn/_reduction.py:49: UserWarning: size_average and reduce args will be deprecated, please use reduction='mean' instead.
  warnings.warn(warning.format(ret))
2020-08-11 20:25:28,246 : Train time 00h 00m 00s, Training started
r: 38 c: 23
r: 26 c: 70
r: 46 c: 18
r: 56 c: 80
r: 91 c: 77
r: 33 c: 31
r: 82 c: 112
r: 51 c: 95
r: 33 c: 19
r: 65 c: 101
data: [tensor([[[-0.0034, 0.0178, -0.0085], [-0.0061, 0.0136, -0.0175], [-0.0060, 0.0136, -0.0175], ..., [-0.0010, 0.0197, -0.0115], [-0.0004, 0.0197, -0.0115], [ 0.0005, 0.0197, -0.0115]]]),
tensor([[[ 15, 30, 34, 41, 54, 58, 67, 70, 73, 74, 83, 123, 126, 127, 135, 140, 145, 161, 162, 168, 170, 181, 205, 207, 243, 260, 275, 299, 306, 310, 312, 314, 321, 322, 330, 331, 333, 350, 358, 360, 365, 370, 401, 410, 423, 436, 439, 459, 463, 466, 488, 489, 490, 495, 497, 548, 552, 557, 560, 577, 584, 586, 591, 620, 626, 636, 638, 641, 674, 679, 680, 688, 713, 723, 733, 734, 756, 772, 775, 779, 816, 818, 830, 835, 841, 863, 871, 872, 879, 883, 895, 907, 909, 935, 940, 941, 951, 953, 969, 981, 987, 991, 1002, 1006, 1008, 1022, 1032, 1040, 1058, 1103, 1109, 1124, 1125, 1141, 1144, 1145, 1169, 1172, 1178, 1187, 1194, 1202, 1216, 1219, 1240, 1241, 1256, 1277, 1297, 1300, 1320, 1334, 1339, 1340, 1353, 1359, 1365, 1367, 1370, 1381, 1408, 1414, 1450, 1453, 1461, 1467, 1469, 1473, 1501, 1521, 1529, 1535, 1542, 1546, 1551, 1580, 1590, 1604, 1607, 1609, 1645, 1647, 1676, 1679, 1683, 1688, 1693, 1696, 1712, 1718, 1723, 1727, 1734, 1744, 1802, 1821, 1825, 1832, 1838, 1839, 1853, 1870, 1885, 1894, 1895, 1901, 1911, 1923, 1926, 1930, 1943, 1950, 1954, 1990, 1999, 2008, 2019, 2021, 2032, 2041, 2042, 2044, 2060, 2066, 2070, 2077, 2092, 2131, 2175, 2186, 2188, 2189, 2192, 2193, 2198, 2205, 2208, 2209, 2220, 2243, 2247, 2250, 2251, 2257, 2260, 2277, 2282, 2287, 2289, 2301, 2313, 2335, 2342, 2352, 2361, 2369, 2382, 2406, 2412, 2422, 2424, 2425, 2429, 2437, 2444, 2460, 2482, 2488, 2512, 2522, 2530, 2556, 2566, 2568, 2585, 2599, 2624, 2650, 2652, 2657, 2661, 2666, 2669, 2673, 2681, 2694, 2707, 2709, 2717, 2722, 2724, 2737, 2749, 2763, 2774, 2778, 2785, 2813, 2821, 2830, 2832, 2833, 2834, 2840, 2845, 2858, 2864, 2879, 2885, 2891, 2896, 2898, 2905, 2910, 2926, 2937, 2939, 2945, 2953, 2963, 2965, 2979, 2980, 2989, 3003, 3005, 3010, 3034, 3042, 3064, 3083, 3085, 3091, 3097, 3108, 3114, 3128, 3139, 3178, 3202, 3211, 3222, 3232, 3245, 3250, 3280, 3285, 3298, 3299, 3314, 3315, 3327, 3342, 3346, 3366, 3377, 3390, 3393, 3396, 3414, 3416, 3418, 3422, 3428, 3435, 3448, 3461, 3472, 3473, 3475, 3480, 3485, 3516, 3522, 3542, 3547, 3564, 3568, 3572, 3578, 3589, 3591, 3623, 3633, 3654, 3661, 3662, 3670, 3678, 3696, 3701, 3702, 3728, 3744, 3747, 3748, 3751, 3757, 3767, 3799, 3811, 3826, 3827, 3839, 3847, 3866, 3874, 3875, 3879, 3883, 3886, 3888, 3891, 3917, 3927, 3933, 3941, 3943, 3945, 3956, 3957, 3960, 3962, 3963, 3973, 3987, 3997, 3999, 4027, 4028, 4040, 4051, 4062, 4065, 4069, 4073, 4074, 4075, 4076, 4083, 4094, 4112, 4115, 4123, 4131, 4133, 4134, 4143, 4144, 4146, 4148, 4150, 4165, 4173, 4181, 4200, 4204, 4216, 4224, 4232, 4251, 4263, 4265, 4269, 4273, 4274, 4287, 4288, 4290, 4295, 4302, 4323, 4324, 4325, 4336, 4338, 4348, 4349, 4368, 4373, 4389, 4400, 4424, 4437, 4446, 4460, 4483, 4488, 4489, 4508, 4535, 4564, 4572, 4581, 4588, 4626, 4632, 4649, 4662, 4670, 4675, 4678, 4687, 4714, 4727, 4729, 4748, 4758, 4759, 4767, 4771, 4786, 4789, 4806, 4809, 4812, 4816, 4818, 4827, 4841]]]),
tensor([[[[1111.4192, 1111.4192, 1111.4192, ..., 770.8079, 779.5415, 792.6419], [1111.4192, 1111.4192, 1111.4192, ..., 766.4410, 770.8079, 788.2751], [1111.4192, 1111.4192, 1111.4192, ..., 757.7074, 770.8079, 788.2751], ..., [1037.1833, 984.7817, 1010.9825, ..., 679.1048, 714.0393, 683.4716], [1006.6157, 954.2140, 1006.6157, ..., 687.8384, 683.4716, 740.2402], [ 962.9476, 962.9476, 997.8821, ..., 648.5371, 670.3712, 657.2708]],
[[1136.3572, 1136.3572, 1136.3572, ..., 904.2143, 904.2143, 917.6071], [1136.3572, 1136.3572, 1136.3572, ..., 895.2857, 904.2143, 913.1429], [1136.3572, 1136.3572, 1136.3572, ..., 895.2857, 904.2143, 908.6786], ..., [1060.4642, 1006.8929, 1033.6786, ..., 689.9286, 730.1072, 721.1786], [1029.2142, 984.5714, 1038.1428, ..., 716.7143, 707.7857, 756.8929], [ 984.5714, 984.5714, 1020.2857, ..., 663.1429, 685.4643, 685.4643]],
[[1100.4177, 1131.5289, 1131.5289, ..., 851.5289, 851.5289, 869.3066], [1087.0845, 1104.8622, 1131.5289, ..., 842.6400, 851.5289, 864.8622], [1100.4177, 1100.4177, 1131.5289, ..., 842.6400, 851.5289, 860.4178], ..., [1055.9734, 1011.5289, 1029.3066, ..., 691.5289, 731.5289, 775.9733], [1024.8622, 980.4178, 1033.7511, ..., 713.7511, 704.8622, 758.1956], [ 980.4178, 980.4178, 1015.9733, ..., 660.4178, 682.6400, 682.6400]]]]),
tensor([[[0.1352, 0.2796, 0.3264], [0.1368, 0.2782, 0.3298], [0.1375, 0.2770, 0.3303], ..., [0.1981, 0.2454, 0.3972], [0.1983, 0.2459, 0.3959], [0.1988, 0.2488, 0.3973]]]),
tensor([[[-2.6151e-02, -9.5855e-02, 0.0000e+00], [-2.6128e-02, -9.1871e-02, 0.0000e+00], [-2.5783e-02, -9.0763e-02, 9.6608e-04], ..., [ 6.8498e-04, 1.1093e-03, 3.2252e-03], [ 1.3913e-03, -7.9399e-05, 3.1484e-03], [ 1.4707e-03, 2.9526e-04, -0.0000e+00]]]),
tensor([[0]])]
/home/katharina/Schreibtisch/DenseFusion-Pytorch-1.0/venv/lib/python3.6/site-packages/torch/nn/functional.py:2351: UserWarning: nn.functional.upsample is deprecated. Use nn.functional.interpolate instead.
  warnings.warn("nn.functional.upsample is deprecated. Use nn.functional.interpolate instead.")
/home/katharina/Schreibtisch/DenseFusion-Pytorch-1.0/venv/lib/python3.6/site-packages/torch/nn/functional.py:2423: UserWarning: Default upsampling behavior when mode=bilinear is changed to align_corners=False since 0.4.0. Please specify align_corners=True if the old behavior is desired. See the documentation of nn.Upsample for details.
  "See the documentation of nn.Upsample for details.".format(mode))
/home/katharina/Schreibtisch/DenseFusion-Pytorch-1.0/venv/lib/python3.6/site-packages/torch/nn/modules/upsampling.py:129: UserWarning: nn.Upsample is deprecated. Use nn.functional.interpolate instead.
  warnings.warn("nn.{} is deprecated. Use nn.functional.interpolate instead.".format(self.name))
/home/katharina/Schreibtisch/DenseFusion-Pytorch-1.0/venv/lib/python3.6/site-packages/torch/nn/modules/container.py:92: UserWarning: Implicit dimension choice for log_softmax has been deprecated. Change the call to include dim=X as an argument.
  input = module(input)
r: 43 c: 57
data: [tensor([[[-0.0356, 0.0040, 0.0706], [-0.0354, 0.0040, 0.0706], [-0.0351, 0.0040, 0.0706], ..., [-0.0319, 0.0134, 0.0536], [-0.0310, 0.0134, 0.0536], [-0.0294, 0.0134, 0.0536]]]),
tensor([[[ 49, 50, 51, 52, 53, 54, 58, 59, 60, 64, 67, 68, 120, 122, 123, 125, 127, 131, 133, 134, 135, 136, 138, 139, 189, 192, 193, 194, 197, 198, 200, 203, 204, 205, 207, 208, 259, 263, 264, 265, 266, 268, 271, 274, 276, 277, 278, 330, 331, 332, 334, 335, 338, 339, 341, 342, 343, 348, 349, 399, 400, 401, 403, 405, 407, 409, 412, 413, 414, 416, 418, 470, 471, 473, 474, 476, 477, 478, 479, 480, 481, 482, 484, 485, 486, 488, 489, 539, 540, 543, 545, 546, 547, 548, 549, 550, 552, 553, 557, 559, 610, 611, 612, 613, 614, 616, 617, 619, 620, 622, 623, 624, 625, 628, 629, 680, 682, 694, 695, 696, 765, 768, 809, 836, 837, 839, 876, 877, 879, 907, 908, 941, 944, 946, 949, 950, 978, 979, 1003, 1005, 1006, 1010, 1011, 1013, 1014, 1015, 1017, 1021, 1068, 1070, 1071, 1072, 1075, 1076, 1077, 1078, 1080, 1081, 1082, 1083, 1084, 1086, 1091, 1132, 1133, 1134, 1136, 1137, 1142, 1147, 1148, 1149, 1150, 1154, 1156, 1157, 1158, 1161, 1162, 1181, 1182, 1184, 1185, 1186, 1188, 1198, 1199, 1201, 1202, 1203, 1205, 1207, 1208, 1209, 1210, 1212, 1216, 1218, 1219, 1221, 1224, 1228, 1231, 1241, 1246, 1248, 1254, 1257, 1259, 1264, 1265, 1269, 1271, 1272, 1276, 1278, 1279, 1280, 1281, 1283, 1284, 1285, 1288, 1290, 1292, 1293, 1297, 1302, 1303, 1304, 1305, 1307, 1308, 1311, 1316, 1317, 1319, 1321, 1322, 1327, 1330, 1331, 1333, 1334, 1337, 1341, 1342, 1346, 1347, 1348, 1349, 1350, 1351, 1352, 1354, 1355, 1357, 1358, 1360, 1361, 1362, 1363, 1364, 1365, 1366, 1367, 1369, 1370, 1371, 1372, 1375, 1377, 1378, 1379, 1380, 1381, 1382, 1387, 1393, 1395, 1396, 1397, 1398, 1401, 1405, 1407, 1409, 1410, 1415, 1416, 1417, 1423, 1424, 1425, 1427, 1429, 1430, 1433, 1435, 1437, 1439, 1450, 1454, 1455, 1456, 1457, 1458, 1463, 1465, 1466, 1470, 1471, 1475, 1476, 1477, 1478, 1479, 1480, 1482, 1483, 1485, 1486, 1487, 1488, 1490, 1491, 1492, 1494, 1495, 1497, 1501, 1502, 1503, 1506, 1510, 1511, 1512, 1516, 1517, 1519, 1520, 1521, 1523, 1527, 1528, 1531, 1533, 1535, 1536, 1538, 1539, 1540, 1541, 1542, 1543, 1545, 1546, 1547, 1548, 1550, 1552, 1553, 1554, 1555, 1556, 1557, 1558, 1559, 1560, 1563, 1566, 1567, 1568, 1569, 1571, 1573, 1577, 1579, 1580, 1589, 1592, 1593, 1594, 1603, 1607, 1608, 1609, 1610, 1612, 1613, 1614, 1615, 1617, 1619, 1621, 1622, 1623, 1624, 1626, 1627, 1628, 1631, 1633, 1635, 1636, 1637, 1638, 1641, 1644, 1648, 1649, 1650, 1651, 1652, 1655, 1656, 1658, 1659, 1661, 1662, 1664, 1665, 1667, 1669, 1670, 1671, 1680, 1681, 1684, 1685, 1687, 1688, 1689, 1692, 1693, 1694, 1696, 1698, 1699, 1702, 1704, 1708, 1709, 1710, 1711, 1713, 1717, 1718, 1719, 1722, 1723, 1724, 1725, 1726, 1727, 1728, 1732, 1733, 1734, 1735, 1736, 1737, 1739, 1740, 1741, 1742, 1747, 1748, 1749, 1750, 1753, 1754, 1756, 1757, 1758, 1760, 1763, 1764, 1767, 1769, 1772, 1774, 1776, 1777, 1780, 1781, 1782, 1783, 1784, 1785, 1788, 1793, 1799, 1801, 1802, 1804, 1805, 1806, 1810, 1818]]]),
tensor([[[[128.8865, 128.8865, 133.2533, ..., 700.9388, 700.9388, 709.6725], [133.2533, 133.2533, 128.8865, ..., 700.9388, 696.5720, 709.6725], [133.2533, 128.8865, 128.8865, ..., 692.2053, 714.0393, 731.5065], ..., [399.6288, 377.7947, 382.1616, ..., 312.2926, 377.7947, 369.0611], [355.9607, 369.0611, 382.1616, ..., 565.5677, 556.8340, 683.4716], [377.7947, 386.5284, 382.1616, ..., 740.2402, 722.7729, 714.0393]],
[[319.3929, 319.3929, 323.8571, ..., 712.2500, 703.3214, 721.1786], [323.8571, 323.8571, 319.3929, ..., 712.2500, 707.7857, 721.1786], [323.8571, 319.3929, 319.3929, ..., 703.3214, 725.6429, 734.5714], ..., [386.3571, 364.0357, 368.5000, ..., 319.3929, 381.8929, 368.5000], [337.2500, 350.6429, 368.5000, ..., 564.9285, 564.9285, 685.4643], [364.0357, 372.9643, 364.0357, ..., 747.9643, 725.6429, 725.6429]],
[[638.1956, 638.1956, 638.1956, ..., 691.5289, 687.0844, 700.4178], [638.1956, 638.1956, 638.1956, ..., 691.5289, 687.0844, 695.9733], [638.1956, 638.1956, 638.1956, ..., 678.1956, 709.3066, 722.6400], ..., [362.6400, 335.9734, 340.4178, ..., 304.8622, 362.6400, 358.1956], [313.7511, 322.6400, 340.4178, ..., 549.3067, 549.3067, 669.3066], [340.4178, 344.8622, 331.5289, ..., 731.5289, 713.7511, 704.8622]]]]),
tensor([[[-0.2142, -0.1577, 0.5831], [-0.2116, -0.1547, 0.5835], [-0.2164, -0.1583, 0.5850], ..., [-0.1383, -0.0989, 0.6110], [-0.1402, -0.0959, 0.6150], [-0.1413, -0.0979, 0.6154]]]),
tensor([[[-0.0260, -0.0957, -0.0009], [-0.0259, -0.0917, -0.0010], [-0.0254, -0.0974, 0.0015], ..., [ 0.0013, 0.0004, -0.0027], [ 0.0015, 0.0020, 0.0024], [ 0.0023, -0.0002, 0.0026]]]),
tensor([[0]])]
2020-08-11 20:25:28,634 : Train time 00h 00m 00s Epoch 1 Batch 1 Frame 2 Avg_dis:0.46966761350631714
data: [tensor([[0]]), tensor([[0]]), tensor([[0]]), tensor([[0]]), tensor([[0]]), tensor([[0]])]
Traceback (most recent call last):
  File "./tools/train.py", line 258, in <module>
```

To understand my mistake, I printed `r` and `c` (the height and width of the bounding box) and the contents of `data` in `train.py`. `r` and `c` should always be positive, and they are, so I assume there is no mistake with the bounding boxes. But do you know why `data` seems to be empty after the first training step? Or do you have any advice on where I could search for the cause of this error?
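For debugging, I could also check that every mask actually contains labelled pixels inside its bounding box, since the all-zero `data` entry in the log looks like the dataset returning dummy values when it finds no valid points. A rough sketch of such a check (the paths, mask layout, and the label value 255 are placeholders from my setup, not DenseFusion code):

```python
import numpy as np
from PIL import Image

def count_valid_pixels(mask_path, rmin, rmax, cmin, cmax, label=255):
    """Count labelled pixels inside the bounding-box crop of one mask.
    If this is 0 for a frame, there is nothing for the dataloader to sample."""
    mask = np.array(Image.open(mask_path))
    crop = mask[rmin:rmax, cmin:cmax]
    return int((crop == label).sum())

def find_empty_frames(frames):
    """frames: list of (mask_path, rmin, rmax, cmin, cmax) tuples.
    Returns the mask paths whose crops contain no labelled pixels."""
    return [f[0] for f in frames if count_valid_pixels(*f) == 0]
```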