IDEALLab / bezier-gan

Bézier Generative Adversarial Networks
MIT License

training issue! #12

Open zakaria-hassoun opened 7 months ago

zakaria-hassoun commented 7 months ago

Hi everybody, I trained the Bézier GAN using the training command given in the repository, !python train_gan.py train 2 2 --model_id=0, but I got the following error:

94: [D] real 0.049025 fake 0.055628 q -0.489287 [G] fake 2.868984 reg 0.162203 q -0.756375
95: [D] real 0.024284 fake 0.066706 q -0.767501 [G] fake 3.008167 reg 0.151189 q -0.750525
96: [D] real 0.025183 fake 0.063513 q -0.674761 [G] fake 2.918955 reg 0.140957 q -0.816136
[... iterations 97 through 437 omitted: the discriminator and generator losses keep oscillating in the same ranges, with occasional spikes ...]
438: [D] real 0.454353 fake 0.230549 q -1.386031 [G] fake 1.758412 reg 0.207535 q -1.479000
439: [D] real 0.150279 fake 0.795486 q -1.286400 [G] fake 2.320328 reg 0.218961 q -0.671317
440: [D] real 0.549875 fake 0.152246 q -1.553409 [G] fake 2.514274 reg 0.220151 q -1.498377

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:36: The name tf.AUTO_REUSE is deprecated. Please use tf.compat.v1.AUTO_REUSE instead.

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:188: The name tf.placeholder is deprecated. Please use tf.compat.v1.placeholder instead.

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:131: The name tf.variable_scope is deprecated. Please use tf.compat.v1.variable_scope instead.

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:133: conv2d (from tensorflow.python.layers.convolutional) is deprecated and will be removed in a future version. Instructions for updating: Use tf.keras.layers.Conv2D instead.

WARNING:tensorflow:From C:\Users\USER\anaconda3\envs\zakaria\lib\site-packages\tensorflow_core\python\layers\convolutional.py:424: Layer.apply (from tensorflow.python.keras.engine.base_layer) is deprecated and will be removed in a future version. Instructions for updating: Please use layer.__call__ method instead.

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:134: batch_normalization (from tensorflow.python.layers.normalization) is deprecated and will be removed in a future version. Instructions for updating: Use keras.layers.BatchNormalization instead. In particular, tf.control_dependencies(tf.GraphKeys.UPDATE_OPS) should not be used (consult the tf.keras.layers.batch_normalization documentation).

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:136: dropout (from tensorflow.python.layers.core) is deprecated and will be removed in a future version. Instructions for updating: Use keras.layers.dropout instead.

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:163: flatten (from tensorflow.python.layers.core) is deprecated and will be removed in a future version. Instructions for updating: Use keras.layers.flatten instead.

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:164: dense (from tensorflow.python.layers.core) is deprecated and will be removed in a future version. Instructions for updating: Use keras.layers.Dense instead.

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:59: conv2d_transpose (from tensorflow.python.layers.convolutional) is deprecated and will be removed in a future version. Instructions for updating: Use tf.keras.layers.Conv2DTranspose instead.

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:111: The name tf.log is deprecated. Please use tf.math.log instead.

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:112: The name tf.lgamma is deprecated. Please use tf.math.lgamma instead.

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:120: div (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Deprecated in favor of operator or tf.math.divide.

WARNING:tensorflow:From C:\Users\USER\anaconda3\envs\zakaria\lib\site-packages\tensorflow_core\python\ops\nn_impl.py:183: where (from tensorflow.python.ops.array_ops) is deprecated and will be removed in a future version. Instructions for updating: Use tf.where in 2.0, which has the same broadcast rule as np.where

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:227: The name tf.train.AdamOptimizer is deprecated. Please use tf.compat.v1.train.AdamOptimizer instead.

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:231: The name tf.get_collection is deprecated. Please use tf.compat.v1.get_collection instead.

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:231: The name tf.GraphKeys is deprecated. Please use tf.compat.v1.GraphKeys instead.

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:252: The name tf.global_variables_initializer is deprecated. Please use tf.compat.v1.global_variables_initializer instead.

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:255: The name tf.summary.scalar is deprecated. Please use tf.compat.v1.summary.scalar instead.

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:261: The name tf.summary.merge_all is deprecated. Please use tf.compat.v1.summary.merge_all instead.

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:264: The name tf.train.Saver is deprecated. Please use tf.compat.v1.train.Saver instead.

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:267: The name tf.Session is deprecated. Please use tf.compat.v1.Session instead.

2024-01-03 22:11:32.802831: I tensorflow/core/platform/cpu_feature_guard.cc:145] This TensorFlow binary is optimized with Intel(R) MKL-DNN to use the following CPU instructions in performance critical operations: AVX AVX2. To enable them in non-MKL-DNN operations, rebuild TensorFlow with the appropriate compiler flags.

2024-01-03 22:11:32.809138: I tensorflow/core/common_runtime/process_util.cc:115] Creating new thread pool with default inter op setting: 12. Tune using inter_op_parallelism_threads for best performance.

WARNING:tensorflow:From C:\Users\USER\OneDrive - Ministere de l'education nationale, de l'enseignement supérieur, de la formation professionnelle et de la recherche scientifique\Bureau\Nouveau dossier\beziergan\gan.py:272: The name tf.summary.FileWriter is deprecated. Please use tf.compat.v1.summary.FileWriter instead.

OMP: Info #212: KMP_AFFINITY: decoding x2APIC ids.
OMP: Info #210: KMP_AFFINITY: Affinity capable, using global cpuid leaf 11 info
OMP: Info #154: KMP_AFFINITY: Initial OS proc set respected: 0-11
OMP: Info #156: KMP_AFFINITY: 12 available OS procs
OMP: Info #157: KMP_AFFINITY: Uniform topology
OMP: Info #179: KMP_AFFINITY: 1 packages x 6 cores/pkg x 2 threads/core (6 total cores)
[... OS-proc-to-thread map and 72 "OMP: Info #250: KMP_AFFINITY: pid 18472 tid ... thread N bound to OS proc set M" binding lines omitted ...]

2024-01-04 00:03:17.673033: W tensorflow/core/framework/op_kernel.cc:1651] OP_REQUIRES failed at save_restore_v2_ops.cc:109 : Not found: Failed to create a NewWriteableFile: ./trained_gan/2_2/0/model.data-00000-of-00001.tempstate7771815659253205507 : Le chemin d’accès spécifié est introuvable. [The specified path could not be found.] ; No such process.

How can I solve this, please? And what is the next step to obtain the results?

wchen459 commented 6 months ago

The program cannot create a writeable file. Please check your permission to create files.
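A minimal way to probe this from Python, as a sketch (the ./trained_gan directory name is taken from the error message above):

```python
import os
import tempfile

# Probe: can this process create and write a file where the
# checkpoints are saved? (directory name taken from the error message)
target = './trained_gan'
os.makedirs(target, exist_ok=True)
with tempfile.TemporaryFile(dir=target) as f:
    f.write(b'ok')  # reaching here means create + write both work
print('Write access to', target, 'looks fine')
```

If this raises PermissionError, it is a permissions problem; if it raises FileNotFoundError, the path itself is the issue.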

zakaria-hassoun commented 6 months ago

Even after granting all permissions, the problem persists. Do you have any further suggestions, please?

wchen459 commented 6 months ago

The error message says that the path was not found. Could you manually create that directory and see if the error still persists?
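For reference, a one-line sketch of that suggestion; the path ./trained_gan/2_2/0 is the one from the NewWriteableFile error above:

```python
import os

# Create the nested checkpoint directory before launching training;
# exist_ok=True makes this a no-op if it already exists.
os.makedirs('./trained_gan/2_2/0', exist_ok=True)
```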

zakaria-hassoun commented 5 months ago

Thank you for your precious help! But now I've been encountering a MemoryError while working with the optimization implementation in the code you provided. The error occurs with the following traceback:

Exception in thread Thread-11:
Traceback (most recent call last):
  File "C:\Users\USER\anaconda3\envs\zakaria\lib\threading.py", line 926, in _bootstrap_inner
    self.run()
  File "C:\Users\USER\anaconda3\envs\zakaria\lib\threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Users\USER\anaconda3\envs\zakaria\lib\site-packages\IPython\utils\_process_win32.py", line 100, in stdout_read
    for line in read_no_interrupt(p.stdout).splitlines():
  File "C:\Users\USER\anaconda3\envs\zakaria\lib\site-packages\IPython\utils\_process_common.py", line 37, in read_no_interrupt
    return p.read()
MemoryError

I would greatly appreciate any advice or suggestions you may have for resolving this issue.

wchen459 commented 5 months ago

This might be a hardware issue that simply indicates insufficient memory. Check whether it is running on a CPU or a GPU, and then check whether the memory (or GPU memory, if running on a GPU) is sufficient.
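Worth noting: the traceback above shows the MemoryError being raised inside IPython's _process_win32.py while it buffers the entire stdout of the !python subprocess (p.read()), so the notebook's output capture itself may be what exhausts RAM on a very long training log. A possible workaround sketch is to stream the trainer's output to a file instead of capturing it in the notebook (the train_log.txt name is hypothetical):

```python
import subprocess

# Launch the trainer as a plain subprocess and stream its (very long)
# log to disk instead of letting the notebook buffer it all in memory.
with open('train_log.txt', 'w') as log:
    subprocess.run(
        ['python', 'train_gan.py', 'train', '2', '2', '--model_id=0'],
        stdout=log, stderr=subprocess.STDOUT, check=True)
```

Running the same command directly from a terminal (outside Jupyter) avoids the capture path entirely.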