ekosman / AnomalyDetectionCVPR2018-Pytorch

Pytorch version of - https://github.com/WaqasSultani/AnomalyDetectionCVPR2018

Problem while trying to extract normal and abnormal segment scores for all videos #10

Closed cervantes-loves-ai closed 4 years ago

cervantes-loves-ai commented 4 years ago

In `anomaly_detector_model.py` (lines 39-40):

```python
normal_segments_scores = y_pred[normal_vids_indices]  # (batch/2, 32, 1)
anomal_segments_scores = y_pred[anomal_vids_indices]  # (batch/2, 32, 1)
```
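As a rough sketch of what those two lines do (plain Python stand-ins for the tensors; the index lists below are made up for illustration), the batch is split into a normal half and an anomalous half, so each selection covers only `batch_size // 2` videos:

```python
batch_size = 8
num_segments = 32

# Fake per-segment scores: one list of 32 scores per video in the batch.
y_pred = [[0.5] * num_segments for _ in range(batch_size)]

# Hypothetical index lists: assume the loader interleaves normal/anomalous videos.
normal_vids_indices = list(range(0, batch_size, 2))
anomal_vids_indices = list(range(1, batch_size, 2))

# Each selection covers only half the batch: batch_size // 2 videos.
normal_segments_scores = [y_pred[i] for i in normal_vids_indices]
anomal_segments_scores = [y_pred[i] for i in anomal_vids_indices]

print(len(normal_segments_scores), len(anomal_segments_scores))  # 4 4
```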

I tried to extract scores for all the videos, but I only get 30 videos in one epoch!! Did I miss something?

cervantes-loves-ai commented 4 years ago

If I use batch size 8, I get:

```
python TrainingAnomalyDetector_public.py --features_path --annotation_path --annotation_path_test --batch-size=8 --end-epoch=1
2019-12-17 11:57:33: VideoIter:: iterator initialized (phase: '', num: 1520)
2019-12-17 11:57:36: Iter 0: start with learning rate: 1.00000e-02 (next lr step: 6250)
2019-12-17 11:57:36: Start epoch 0:
tensor([...], device='cuda:0', grad_fn=<...>) torch.Size([4, 32])
2019-12-17 11:57:38: Epoch [0] Batch [0] Speed 4.5 (+90) sample/sec losss-ce = 1.04408
tensor([...], device='cuda:0', grad_fn=<...>) torch.Size([4, 32])
tensor([...], device='cuda:0', grad_fn=<...>) torch.Size([4, 32])
tensor([...], device='cuda:0', grad_fn=<...>) torch.Size([4, 32])
tensor([...], device='cuda:0', grad_fn=<...>) torch.Size([4, 32])
tensor([...], device='cuda:0', grad_fn=<...>) torch.Size([4, 32])
2019-12-17 11:57:38: Epoch [0] Batch [5] Speed 196.5 (+ 1) sample/sec losss-ce = 1.04408
tensor([...], device='cuda:0', grad_fn=<...>) torch.Size([4, 32])
tensor([...], device='cuda:0', grad_fn=<...>) torch.Size([2, 32])
2019-12-17 11:57:39: Epoch [0] time cost: 2.34 sec (0.00 h)
2019-12-17 11:57:39: Checkpoint (model & optimizer) saved to: ./exps/model_ep-0001.pth
2019-12-17 11:57:39: Optimization done!
```

(tensor values elided for readability; only the shapes matter here)

But when I increase the batch size to 60, I get:

```
--batch-size=60 --end-epoch=1
2019-12-17 11:59:50: VideoIter:: iterator initialized (phase: '', num: 1520)
2019-12-17 11:59:53: Iter 0: start with learning rate: 1.00000e-02 (next lr step: 833)
2019-12-17 11:59:53: Start epoch 0:
tensor([...], device='cuda:0', grad_fn=<...>) torch.Size([30, 32])
2019-12-17 11:59:56: Epoch [0] Batch [0] Speed 20.9 (+293) sample/sec losss-ce = 1.03999
2019-12-17 11:59:56: Epoch [0] time cost: 3.17 sec (0.00 h)
2019-12-17 11:59:56: Checkpoint (model & optimizer) saved to: ./exps/model_ep-0001.pth
2019-12-17 11:59:56: Optimization done!
```

(tensor values elided for readability)

I can't get the scores for more than 30 videos. @ekosman
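The row counts in the shapes above follow from the half-batch split hinted at by the `(batch/2, 32, 1)` comments in `anomaly_detector_model.py`: each printed tensor covers only the normal (or anomalous) half of one batch, not the whole epoch. A minimal arithmetic sketch:

```python
# The printed tensors hold scores for half of one batch, because the model
# output is split into a normal half and an anomalous half. So the row count
# per tensor is batch_size // 2, independent of how many videos are in the epoch.
def scores_shape_per_step(batch_size, num_segments=32):
    return (batch_size // 2, num_segments)

print(scores_shape_per_step(8))   # -> (4, 32), matching the batch-size-8 log
print(scores_shape_per_step(60))  # -> (30, 32), matching the batch-size-60 log
```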

ekosman commented 4 years ago

Don't change the batch size; it's fixed according to the original paper. I'll remove this parameter soon.

cervantes-loves-ai commented 4 years ago

Thank you, I will wait for it. Regarding the `normal_segments_scores` tensor: how can I get the segment scores of all 800 videos?
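One way to get segment scores for every video (a hedged sketch with hypothetical model/loader names, not this repository's exact API) is to run the trained model over the whole dataset in eval mode and concatenate the per-batch outputs, rather than reading them from a single training batch:

```python
import torch

@torch.no_grad()
def score_all_videos(model, data_loader, device="cpu"):
    """Collect per-segment scores for every video yielded by data_loader."""
    model.eval()
    all_scores = []
    for features, _labels in data_loader:    # iterate every batch, so every video is scored
        y_pred = model(features.to(device))  # assumed output shape: (batch, 32, 1)
        all_scores.append(y_pred.squeeze(-1).cpu())
    return torch.cat(all_scores)             # (num_videos, 32)
```

Called once per dataset pass, this returns one row of 32 segment scores per video, regardless of the training batch size.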