deep-diver / Soccer-Ball-Detection-YOLOv2

YOLOv2 trained against custom dataset

AssertionError: expect 202335260 bytes, found 203934260 #3

Closed · husnejahan closed this issue 4 years ago

husnejahan commented 5 years ago

Parsing cfg/yolo_custom.cfg
Loading bin/yolo.weights ...

AssertionError                            Traceback (most recent call last)
in ()
----> 1 tfnet = TFNet(options)

~/anaconda3/lib/python3.6/site-packages/darkflow/net/build.py in __init__(self, FLAGS, darknet)
     56 
     57                 if darknet is None:
---> 58                         darknet = Darknet(FLAGS)
     59                         self.ntrain = len(darknet.layers)
     60 

~/anaconda3/lib/python3.6/site-packages/darkflow/dark/darknet.py in __init__(self, FLAGS)
     25                 self.meta, self.layers = des_parsed
     26 
---> 27         self.load_weights()
     28 
     29     def get_weight_src(self, FLAGS):

~/anaconda3/lib/python3.6/site-packages/darkflow/dark/darknet.py in load_weights(self)
     80 
     81         args = [self.src_bin, self.src_layers]
---> 82         wgts_loader = loader.create_loader(*args)
     83         for layer in self.layers: layer.load(wgts_loader)
     84 

~/anaconda3/lib/python3.6/site-packages/darkflow/utils/loader.py in create_loader(path, cfg)
    103         load_type = checkpoint_loader
    104 
--> 105     return load_type(path, cfg)
    106 
    107 class weights_walker(object):

~/anaconda3/lib/python3.6/site-packages/darkflow/utils/loader.py in __init__(self, *args)
     17         self.src_key = list()
     18         self.vals = list()
---> 19         self.load(*args)
     20 
     21     def __call__(self, key):

~/anaconda3/lib/python3.6/site-packages/darkflow/utils/loader.py in load(self, path, src_layers)
     75             assert walker.offset == walker.size, \
     76             'expect {} bytes, found {}'.format(
---> 77                 walker.offset, walker.size)
     78             print('Successfully identified {} bytes'.format(
     79                 walker.offset))

AssertionError: expect 202335260 bytes, found 203934260
husnejahan commented 5 years ago

I modified the line self.offset = 16 in the ./darkflow/utils/loader.py file and replaced it with self.offset = 20, but that did not solve the issue.

giri06 commented 5 years ago

cfg/yolo_custom.cfg parsing ./annotations/
Parsing for ['ball', 'goal post']
[====================>]100% scene21261.xml
Statistics:
ball: 177
goal post: 54
Dataset size: 191
Dataset of 191 instance(s)
Training statistics:
Learning rate : 1e-05
Batch size : 8
Epoch number : 100
Backup every : 2000
step 1 - loss 208.58480834960938 - moving ave loss 208.58480834960938
step 2 - loss 206.68606567382812 - moving ave loss 208.39493408203126
step 3 - loss 205.71168518066406 - moving ave loss 208.12660919189454
...
step 23 - loss 186.94000244140625 - moving ave loss 194.24588824217724
Finish 1 epoch(es)
...
step 230 - loss 51.370304107666016 - moving ave loss 55.225095250222125
Finish 10 epoch(es)
...
step 249 - loss 45.824951171875 - moving ave loss 49.240391097452665
step 250 - loss 45.82374572753906 - moving ave loss 48.898726560461306

FileNotFoundError                         Traceback (most recent call last)
in
----> 1 tfnet.train()

~\Anaconda3\envs\project\lib\site-packages\darkflow\net\flow.py in train(self)
     70         ckpt = (i+1) % (self.FLAGS.save // self.FLAGS.batch)
     71         args = [step_now, profile]
---> 72         if not ckpt: _save_ckpt(self, *args)
     73 
     74         if ckpt: _save_ckpt(self, *args)

~\Anaconda3\envs\project\lib\site-packages\darkflow\net\flow.py in _save_ckpt(self, step, loss_profile)
     21     profile = file.format(model, step, '.profile')
     22     profile = os.path.join(self.FLAGS.backup, profile)
---> 23     with open(profile, 'wb') as profile_ckpt:
     24         pickle.dump(loss_profile, profile_ckpt)
     25 

FileNotFoundError: [Errno 2] No such file or directory: './ckpt/yolo_custom-250.profile'
deep-diver commented 5 years ago

Looks like the weight file is broken. Can you download it and try one more time?

turmezzz commented 5 years ago

Hi, I have the same problem. I just downloaded the weights from the link you gave (I am following your instructions: https://towardsdatascience.com/yolov2-to-detect-your-own-objects-soccer-ball-using-darkflow-a4f98d5ce5bf), but nothing changed. What else could be wrong?

arnandprabaswara commented 5 years ago

Looks like the weight file is broken. Can you download it and try one more time?

I have tried downloading it again and retraining, but I still get the error.

jeditelo commented 5 years ago

I modified the line self.offset = 16 in the ./darkflow/utils/loader.py file and replaced it with self.offset = 20, but that did not solve the issue.

I have the same problem. Did you find a solution?

pdhruv93 commented 5 years ago

Yes, I also have the same problem.
@deep-diver I am using yolov2-tiny-voc_10000.weights and yolov2-tiny-voc.cfg in combination. I downloaded both of them from the official darkflow site.

The default yolov2-tiny-voc.cfg file has filters=35, but my custom dataset has classes=1, so filters should be 30. Changing it to 30 straight away gives the error: AssertionError: expect 63082060 bytes, found 63102560
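For reference, the relationship for a YOLOv2 region layer is filters = num * (classes + 5), where num is the anchor count in the cfg (5 in the stock files). A quick check (a sketch, not code from this repo):

# YOLOv2: the conv layer right before [region] needs num * (classes + 5) filters
# (x, y, w, h, objectness plus one score per class).
num_anchors = 5
for classes in (1, 2):
    print(classes, num_anchors * (classes + 5))   # 1 -> 30, 2 -> 35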

Here is my repo link: https://github.com/pdhruv93/RoadCrackDetector

B-tum commented 5 years ago

Good day! I had the same problem.

I managed to solve it by adjusting the values in the file ./darkflow/utils/loader.py as necessary.

Ign0reLee commented 5 years ago

Hi, I also encountered the same error while studying this code.

After searching a little, I was finally able to solve it.

First, the problem is that the number of parameters in the weight file is different from the number of parameters implied by the cfg file.

See this link for information about why the parameter count in the weight file changed:

Link: https://github.com/thtrieu/darkflow/issues/107

This repo's cfg file was probably created for an older version of the weight file.

Therefore, we need to download the previous version of the weight file.

Link: https://pjreddie.com/darknet/yolo/

Get the "yolov2.weights" file from there and change the load entry in the main ipynb file accordingly.

That solves it.
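As a rough illustration of why the mismatch shows up as a byte count (a sketch, not darkflow code; the 16-byte header below is the older darknet layout of 4 int32 values, newer releases use 20 bytes):

# Relate a darknet .weights file size to its parameter count.
def expected_weight_bytes(num_params, header_bytes=16):
    return header_bytes + 4 * num_params   # each parameter is a float32

# Numbers from the error in this thread: the cfg accounts for fewer bytes than
# the file actually holds, i.e. the weights carry more parameters (a larger
# final conv layer) than this cfg defines.
expected, found = 202335260, 203934260
print((found - expected) // 4)   # 399750 surplus float32 parameters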


chenminhua commented 4 years ago

Hi, where did you get the previous version of the weight file?

I tried "https://pjreddie.com/media/files/yolov2.weights" but it is still not working.

Ign0reLee commented 4 years ago

Hi, I think I used the same weight file as you.

Please check whether you get the same error message.

If you get the same error message, you should try the cfg file provided in the link.

If it still raises the same error after trying that, please show me your options code and the error message.

chenminhua commented 4 years ago

Hi, thanks for the help. I used the same cfg file as in this repo, and nothing changed. My network is:

options = {"model": "cfg/yolo_custom.cfg", 
           "load": "bin/yolov2.weights",
           "batch": 8,
           "epoch": 100,
           "gpu": 1.0,
           "train": True,
           "annotation": "./annotations/",
           "dataset": "./images/"}
tfnet = TFNet(options)

Here is the error msg.

Parsing cfg/yolo_custom.cfg
Loading bin/yolov2.weights ...
/home/chenmh/workspace/darkflow/darkflow/dark/darknet.py:54: UserWarning: ./cfg/yolov2.cfg not found, use cfg/yolo_custom.cfg instead
  cfg_path, FLAGS.model))
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-3-200f77f1939f> in <module>
----> 1 tfnet = TFNet(options)

~/workspace/darkflow/darkflow/net/build.py in __init__(self, FLAGS, darknet)
     56 
     57                 if darknet is None:
---> 58                         darknet = Darknet(FLAGS)
     59                         self.ntrain = len(darknet.layers)
     60 

~/workspace/darkflow/darkflow/dark/darknet.py in __init__(self, FLAGS)
     25                 self.meta, self.layers = des_parsed
     26 
---> 27         self.load_weights()
     28 
     29     def get_weight_src(self, FLAGS):

~/workspace/darkflow/darkflow/dark/darknet.py in load_weights(self)
     80 
     81         args = [self.src_bin, self.src_layers]
---> 82         wgts_loader = loader.create_loader(*args)
     83         for layer in self.layers: layer.load(wgts_loader)
     84 

~/workspace/darkflow/darkflow/utils/loader.py in create_loader(path, cfg)
    103         load_type = checkpoint_loader
    104 
--> 105     return load_type(path, cfg)
    106 
    107 class weights_walker(object):

~/workspace/darkflow/darkflow/utils/loader.py in __init__(self, *args)
     17         self.src_key = list()
     18         self.vals = list()
---> 19         self.load(*args)
     20 
     21     def __call__(self, key):

~/workspace/darkflow/darkflow/utils/loader.py in load(self, path, src_layers)
     75             assert walker.offset == walker.size, \
     76             'expect {} bytes, found {}'.format(
---> 77                 walker.offset, walker.size)
     78             print('Successfully identified {} bytes'.format(
     79                 walker.offset))

AssertionError: expect 202335260 bytes, found 203934260

Then I tried the cfg file https://github.com/pjreddie/darknet/blob/master/cfg/yolov2.cfg and changed the region part. This time I can build the network, but training is still not working.

Would you like to share your cfg file with me?

Ign0reLee commented 4 years ago

yolo_custom.zip

I used this cfg file.

If it does not work with this cfg file, I recommend customizing the existing YOLOv2 cfg file yourself.

You do not have to change much.

I probably only changed the number of classes and the labels.

Please note that this cfg is set up for 2 classes, so you can adapt it to your own classes by referring to the original link.
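As a rough sketch of the kind of edit involved (file names are placeholders; the filters = num * (classes + 5) rule is the standard YOLOv2 relationship mentioned above, and 425 is the final conv filter count in the stock COCO yolov2.cfg):

import re

classes = 2                              # your number of classes
num_anchors = 5                          # `num` in the [region] section
filters = num_anchors * (classes + 5)    # 35 for 2 classes

cfg = open("cfg/yolov2.cfg").read()
cfg = re.sub(r"classes=\d+", "classes={}".format(classes), cfg)   # [region] class count
cfg = cfg.replace("filters=425", "filters={}".format(filters))    # last conv before [region]
open("cfg/yolo_custom.cfg", "w").write(cfg)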

chenminhua commented 4 years ago

It works! Appreciate your help ^_^

Vineet2408 commented 4 years ago

Good day! I had the same problem.

I managed to solve it by adjusting the values in the file ./darkflow/utils/loader.py as necessary.

Can you tell me what you adjusted?

Zrufy commented 4 years ago

https://stackoverflow.com/questions/55224586/assertionerror-expect-202335260-bytes-found-203934260-soccer-ball-detection-us/58269646#58269646

Tanmay-Kulkarni101 commented 4 years ago

In darkflow/utils/loader.py

class weights_walker(object):
    """incremental reader of float32 binary files"""
    def __init__(self, path):
        self.eof = False # end of file
        self.path = path # path to the .weights file
        if path is None: 
            self.eof = True
            return
        else: 
            self.size = os.path.getsize(path) # total file size in bytes
            major, minor, revision, seen = np.memmap(path,
                shape = (), mode = 'r', offset = 0,
                dtype = '({})i4,'.format(4))
            self.transpose = major > 1000 or minor > 1000
            # originally `self.offset = 16`; add the surplus bytes reported by the AssertionError
            self.offset = 16 + 203934260 - 202335260

Make the adjustment according to the error that you have faced.
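In general, the patched value is just the original 16 plus the surplus the assertion reports, so you can compute it from your own error message (a sketch using this thread's numbers; substitute your own):

# "AssertionError: expect EXPECTED bytes, found FOUND"
EXPECTED = 202335260                 # value after "expect" in your error
FOUND    = 203934260                 # value after "found" in your error
print(16 + (FOUND - EXPECTED))       # 1599016 -> use in place of `self.offset = 16`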

Zrufy commented 4 years ago

If you found the answer useful, please help me out on Stack Overflow, so that those who have the same problem will immediately find the solution.

Ademord commented 4 years ago

Thanks everyone for your help solving this! The solution for me was to:

  1. Get the right cfg file
  2. Get the right weights
  3. Modify loader.py: self.offset = 16 + 203934260 - 202335260
  4. Add the checkpoint folder (a Python equivalent is sketched below):
     mkdir ckpt
     touch ckpt/checkpoint
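A minimal Python equivalent of step 4, assuming training is launched from the project root and darkflow's default backup folder ./ckpt (the same path as in the FileNotFoundError earlier in this thread):

import os
from pathlib import Path

os.makedirs("ckpt", exist_ok=True)   # darkflow writes its .profile/checkpoint files here
Path("ckpt/checkpoint").touch()      # empty checkpoint index file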

However... Were you able to get to the ~0.5 moving average loss the author claims? I will restart my training, but I am done with the 100 epochs and the average loss is still at 100 :(

DishaAnDS commented 3 years ago

In darkflow/utils/loader.py

class weights_walker(object):
    """incremental reader of float32 binary files"""
    def __init__(self, path):
        self.eof = False # end of file
        self.path = path # path to the .weights file
        if path is None: 
            self.eof = True
            return
        else: 
            self.size = os.path.getsize(path) # total file size in bytes
            major, minor, revision, seen = np.memmap(path,
                shape = (), mode = 'r', offset = 0,
                dtype = '({})i4,'.format(4))
            self.transpose = major > 1000 or minor > 1000
            # originally `self.offset = 16`; add the surplus bytes reported by the AssertionError
            self.offset = 16 + 203934260 - 202335260

Make the adjustment according to the error that you have faced.

Thank you, this really solves my issue.

rohandhi commented 3 years ago

Open ./darkflow/darkflow/utils/loader.py and at line 75 remove:

 assert walker.offset == walker.size, \
        'expect {} bytes, found {}'.format(
            walker.offset, walker.size)

It worked for me. Afterwards, that block will look like:

if walker.path is not None:
    print('Successfully identified {} bytes'.format(
        walker.offset))