MattRosenLab / AUTOMAP

GNU General Public License v3.0

How many epochs should each network be trained for? #4

Closed · superresolution closed this issue 2 years ago

superresolution commented 2 years ago

Hi,

I am training AUTOMAP with the sample data now. I found that after 200 epochs of training, the results were still not as good as the original FFT-reconstructed data. So I am wondering: what is the best number of epochs to train for on the sample data? It would be great if you could provide some detailed suggestions on how to train the network.

Here is one of the test results after training: the left is test_real (predicted), the right is test_img (predicted). Do you think I should train for more epochs?

![image](https://user-images.githubusercontent.com/6476316/144690262-a7961164-2f37-4a26-8b85-1581569af5f4.png)

Here are the training and validation loss values over 200 epochs:


Epoch 0, Loss: 0.001818065531551838, ValLoss: 0.0014511736808344722 Epoch 1, Loss: 0.0007942133815959096, ValLoss: 0.0006231632432900369 Epoch 2, Loss: 0.0007066625985316932, ValLoss: 0.00056696921819821 Epoch 3, Loss: 0.0006540745962411165, ValLoss: 0.0005262654740363359 Epoch 4, Loss: 0.0006044905167073011, ValLoss: 0.000490388018079102 Epoch 5, Loss: 0.0005670794053003192, ValLoss: 0.0004573477490339428 Epoch 6, Loss: 0.0005273337010294199, ValLoss: 0.0004268423072062433 Epoch 7, Loss: 0.0004948762943968177, ValLoss: 0.0003992934653069824 Epoch 8, Loss: 0.00046704066335223615, ValLoss: 0.0003744650457520038 Epoch 9, Loss: 0.00044045771937817335, ValLoss: 0.00035213076625950634 Epoch 10, Loss: 0.0004158079391345382, ValLoss: 0.00033372483449056745 Epoch 11, Loss: 0.0003962920745834708, ValLoss: 0.000316171208396554 Epoch 12, Loss: 0.0003755392099265009, ValLoss: 0.0003021014272235334 Epoch 13, Loss: 0.00035998993553221226, ValLoss: 0.0002880832180380821 Epoch 14, Loss: 0.0003454314137343317, ValLoss: 0.000277158833341673 Epoch 15, Loss: 0.0003345168079249561, ValLoss: 0.0002675647265277803 Epoch 16, Loss: 0.0003207316913176328, ValLoss: 0.00025875313440337777 Epoch 17, Loss: 0.0003113432030659169, ValLoss: 0.0002497326349839568 Epoch 18, Loss: 0.0003029639774467796, ValLoss: 0.00024375253997277468 Epoch 19, Loss: 0.0002923588326666504, ValLoss: 0.00023745720682200044 Epoch 20, Loss: 0.000285791204078123, ValLoss: 0.00022963630908634514 Epoch 21, Loss: 0.0002775583998300135, ValLoss: 0.00022426243231166154 Epoch 22, Loss: 0.0002706784871406853, ValLoss: 0.0002196639688918367 Epoch 23, Loss: 0.0002640376624185592, ValLoss: 0.000214996631257236 Epoch 24, Loss: 0.00025948972324840724, ValLoss: 0.00021032812946941704 Epoch 25, Loss: 0.0002549767668824643, ValLoss: 0.0002059525577351451 Epoch 26, Loss: 0.00024848629254847765, ValLoss: 0.00020235440752003342 Epoch 27, Loss: 0.00024289476277772337, ValLoss: 0.0001982448302442208 Epoch 28, Loss: 0.00023839290952309966, 
ValLoss: 0.00019418032024987042 Epoch 29, Loss: 0.00023414791212417185, ValLoss: 0.00019088423869106919 Epoch 30, Loss: 0.00023099774261936545, ValLoss: 0.00018756708595901728 Epoch 31, Loss: 0.00022674580395687371, ValLoss: 0.00018515123520046473 Epoch 32, Loss: 0.00022328173508867621, ValLoss: 0.0001821798359742388 Epoch 33, Loss: 0.00021917946287430823, ValLoss: 0.00017934449715539813 Epoch 34, Loss: 0.00021650336566381156, ValLoss: 0.00017626609769649804 Epoch 35, Loss: 0.00021100122830830514, ValLoss: 0.000174134605913423 Epoch 36, Loss: 0.00020855992625001818, ValLoss: 0.00017197859415318817 Epoch 37, Loss: 0.00020676346321124583, ValLoss: 0.0001691854267846793 Epoch 38, Loss: 0.0002030799660133198, ValLoss: 0.00016700538981240243 Epoch 39, Loss: 0.0002005492424359545, ValLoss: 0.00016416564176324755 Epoch 40, Loss: 0.00019814611005131155, ValLoss: 0.00016256155504379421 Epoch 41, Loss: 0.00019474698638077825, ValLoss: 0.00015991319378372282 Epoch 42, Loss: 0.0001935036270879209, ValLoss: 0.0001587242732057348 Epoch 43, Loss: 0.00019041869381908327, ValLoss: 0.00015689057181589305 Epoch 44, Loss: 0.00018732836178969592, ValLoss: 0.00015467061894014478 Epoch 45, Loss: 0.00018558897136244923, ValLoss: 0.00015315142809413373 Epoch 46, Loss: 0.00018393354548607022, ValLoss: 0.0001512866874691099 Epoch 47, Loss: 0.0001814405550248921, ValLoss: 0.0001497800840297714 Epoch 48, Loss: 0.0001796913129510358, ValLoss: 0.00014762063801754266 Epoch 49, Loss: 0.00017729074170347303, ValLoss: 0.0001464946399210021 Epoch 50, Loss: 0.0001761972380336374, ValLoss: 0.00014423044922295958 Epoch 51, Loss: 0.0001742267340887338, ValLoss: 0.000143263881909661 Epoch 52, Loss: 0.00017161200230475515, ValLoss: 0.00014147408364806324 Epoch 53, Loss: 0.0001699174608802423, ValLoss: 0.00014015343913342804 Epoch 54, Loss: 0.00016769973444752395, ValLoss: 0.00013844929344486445 Epoch 55, Loss: 0.00016653981583658606, ValLoss: 0.00013798376312479377 Epoch 56, Loss: 0.00016434340795967728, 
ValLoss: 0.00013622385449707508 Epoch 57, Loss: 0.00016258899995591491, ValLoss: 0.00013519843923859298 Epoch 58, Loss: 0.0001605571451364085, ValLoss: 0.00013368930376600474 Epoch 59, Loss: 0.00016010172839742154, ValLoss: 0.00013238107203505933 Epoch 60, Loss: 0.00015834473015274853, ValLoss: 0.00013111584121361375 Epoch 61, Loss: 0.0001566501014167443, ValLoss: 0.00012990583491045982 Epoch 62, Loss: 0.00015505394549109042, ValLoss: 0.00012902167509309947 Epoch 63, Loss: 0.00015318863734137267, ValLoss: 0.00012798671377822757 Epoch 64, Loss: 0.00015171203995123506, ValLoss: 0.00012643956870306283 Epoch 65, Loss: 0.00015100585005711764, ValLoss: 0.00012506914208643138 Epoch 66, Loss: 0.0001486591499997303, ValLoss: 0.00012379357940517366 Epoch 67, Loss: 0.00014735291188117117, ValLoss: 0.00012299284571781754 Epoch 68, Loss: 0.00014615840336773545, ValLoss: 0.00012132155825383961 Epoch 69, Loss: 0.0001450248237233609, ValLoss: 0.00012085985508747399 Epoch 70, Loss: 0.00014381662185769528, ValLoss: 0.00011974234075751156 Epoch 71, Loss: 0.00014258526789490134, ValLoss: 0.00011902034748345613 Epoch 72, Loss: 0.00014162249863147736, ValLoss: 0.00011759660264942795 Epoch 73, Loss: 0.00013948307605460286, ValLoss: 0.00011663532495731488 Epoch 74, Loss: 0.00013829657109454274, ValLoss: 0.00011576826364034787 Epoch 75, Loss: 0.00013684049190487713, ValLoss: 0.00011490806355141103 Epoch 76, Loss: 0.00013576405763160437, ValLoss: 0.0001138898660428822 Epoch 77, Loss: 0.0001339329464826733, ValLoss: 0.00011295831791358069 Epoch 78, Loss: 0.00013348847278393805, ValLoss: 0.00011154610547237098 Epoch 79, Loss: 0.00013139711518306285, ValLoss: 0.00011117180110886693 Epoch 80, Loss: 0.00013132749882061034, ValLoss: 0.00011007770808646455 Epoch 81, Loss: 0.00012964176130481064, ValLoss: 0.00010864507203223184 Epoch 82, Loss: 0.00012866318866144866, ValLoss: 0.00010831371764652431 Epoch 83, Loss: 0.00012718130892608315, ValLoss: 0.00010733326780609787 Epoch 84, Loss: 
0.00012586201773956418, ValLoss: 0.00010666351590771228 Epoch 85, Loss: 0.00012525951024144888, ValLoss: 0.00010600133100524545 Epoch 86, Loss: 0.0001242460130015388, ValLoss: 0.00010528654092922807 Epoch 87, Loss: 0.00012272957246750593, ValLoss: 0.00010414209100417793 Epoch 88, Loss: 0.00012215366587042809, ValLoss: 0.0001039269263856113 Epoch 89, Loss: 0.00012134561984566972, ValLoss: 0.00010236727393930778 Epoch 90, Loss: 0.00011975372763117775, ValLoss: 0.00010162792750634253 Epoch 91, Loss: 0.00011942184210056439, ValLoss: 0.0001013100118143484 Epoch 92, Loss: 0.00011762550275307149, ValLoss: 0.00010062673391075805 Epoch 93, Loss: 0.00011756252933992073, ValLoss: 9.963889169739559e-05 Epoch 94, Loss: 0.00011582951992750168, ValLoss: 9.896473784465343e-05 Epoch 95, Loss: 0.00011509107571328059, ValLoss: 9.83579593594186e-05 Epoch 96, Loss: 0.00011435808846727014, ValLoss: 9.793078788788989e-05 Epoch 97, Loss: 0.00011381338845239952, ValLoss: 9.726460120873526e-05 Epoch 98, Loss: 0.00011263025953667238, ValLoss: 9.617573232389987e-05 Epoch 99, Loss: 0.0001113981197704561, ValLoss: 9.567739471094683e-05 Epoch 100, Loss: 0.00011066416482208297, ValLoss: 9.509914525551721e-05 Epoch 101, Loss: 0.00010995173215633258, ValLoss: 9.465649054618552e-05 Epoch 102, Loss: 0.0001091085359803401, ValLoss: 9.397033863933757e-05 Epoch 103, Loss: 0.00010830321843968704, ValLoss: 9.286426939070225e-05 Epoch 104, Loss: 0.00010754285176517442, ValLoss: 9.255079203285277e-05 Epoch 105, Loss: 0.00010733550152508542, ValLoss: 9.192111610900611e-05 Epoch 106, Loss: 0.0001062702140188776, ValLoss: 9.127301018452272e-05 Epoch 107, Loss: 0.00010540661605773494, ValLoss: 9.087169746635482e-05 Epoch 108, Loss: 0.00010416420263936743, ValLoss: 9.03348991414532e-05 Epoch 109, Loss: 0.00010299142741132528, ValLoss: 8.944542787503451e-05 Epoch 110, Loss: 0.00010280736023560166, ValLoss: 8.941981650423259e-05 Epoch 111, Loss: 0.00010213429777650163, ValLoss: 8.826781413517892e-05 Epoch 112, 
Loss: 0.00010154563642572612, ValLoss: 8.816432091407478e-05 Epoch 113, Loss: 0.00010069684503832832, ValLoss: 8.777298353379592e-05 Epoch 114, Loss: 0.00010013308929046616, ValLoss: 8.679935854161158e-05 Epoch 115, Loss: 9.942353790393099e-05, ValLoss: 8.68182978592813e-05 Epoch 116, Loss: 9.872536611510441e-05, ValLoss: 8.593926031608135e-05 Epoch 117, Loss: 9.795451478566974e-05, ValLoss: 8.538769907318056e-05 Epoch 118, Loss: 9.76957380771637e-05, ValLoss: 8.500527474097908e-05 Epoch 119, Loss: 9.70507608144544e-05, ValLoss: 8.465236896881834e-05 Epoch 120, Loss: 9.612533904146403e-05, ValLoss: 8.410255395574495e-05 Epoch 121, Loss: 9.557974408380687e-05, ValLoss: 8.357264596270397e-05 Epoch 122, Loss: 9.495245467405766e-05, ValLoss: 8.322524809045717e-05 Epoch 123, Loss: 9.439724090043455e-05, ValLoss: 8.270366379292682e-05 Epoch 124, Loss: 9.387391764903441e-05, ValLoss: 8.23332738946192e-05 Epoch 125, Loss: 9.332220361102372e-05, ValLoss: 8.179923315765336e-05 Epoch 126, Loss: 9.278312791138887e-05, ValLoss: 8.129751950036734e-05 Epoch 127, Loss: 9.181214409181848e-05, ValLoss: 8.081929991021752e-05 Epoch 128, Loss: 9.155758016277105e-05, ValLoss: 8.007945871213451e-05 Epoch 129, Loss: 9.08247020561248e-05, ValLoss: 7.998643559403718e-05 Epoch 130, Loss: 9.03634718270041e-05, ValLoss: 7.972929597599432e-05 Epoch 131, Loss: 8.992947550723329e-05, ValLoss: 7.92703140177764e-05 Epoch 132, Loss: 8.909121243050322e-05, ValLoss: 7.853080751374364e-05 Epoch 133, Loss: 8.917766535887495e-05, ValLoss: 7.861365156713873e-05 Epoch 134, Loss: 8.821805386105552e-05, ValLoss: 7.827149966033176e-05 Epoch 135, Loss: 8.73429817147553e-05, ValLoss: 7.773260585963726e-05 Epoch 136, Loss: 8.73180542839691e-05, ValLoss: 7.732029189355671e-05 Epoch 137, Loss: 8.692162373336032e-05, ValLoss: 7.696587999816984e-05 Epoch 138, Loss: 8.637362770969048e-05, ValLoss: 7.648804603377357e-05 Epoch 139, Loss: 8.576566324336454e-05, ValLoss: 7.610169996041805e-05 Epoch 140, Loss: 
8.548558253096417e-05, ValLoss: 7.588588050566614e-05 Epoch 141, Loss: 8.469771273666993e-05, ValLoss: 7.525213732151315e-05 Epoch 142, Loss: 8.417369826929644e-05, ValLoss: 7.543394167441875e-05 Epoch 143, Loss: 8.403304673265666e-05, ValLoss: 7.491637370549142e-05 Epoch 144, Loss: 8.341378270415589e-05, ValLoss: 7.445009396178648e-05 Epoch 145, Loss: 8.31255892990157e-05, ValLoss: 7.401545735774562e-05 Epoch 146, Loss: 8.269595127785578e-05, ValLoss: 7.397351873805746e-05 Epoch 147, Loss: 8.201109449146315e-05, ValLoss: 7.34662389731966e-05 Epoch 148, Loss: 8.16588508314453e-05, ValLoss: 7.319092401303351e-05 Epoch 149, Loss: 8.156441617757082e-05, ValLoss: 7.286760228453204e-05 Epoch 150, Loss: 8.096719830064103e-05, ValLoss: 7.231336348922923e-05 Epoch 151, Loss: 8.071609772741795e-05, ValLoss: 7.185150752775371e-05 Epoch 152, Loss: 8.018741209525615e-05, ValLoss: 7.180197280831635e-05 Epoch 153, Loss: 7.991428719833493e-05, ValLoss: 7.168372394517064e-05 Epoch 154, Loss: 7.942452793940902e-05, ValLoss: 7.128709694370627e-05 Epoch 155, Loss: 7.886569073889405e-05, ValLoss: 7.097036723280326e-05 Epoch 156, Loss: 7.858366006985307e-05, ValLoss: 7.07080471329391e-05 Epoch 157, Loss: 7.818536687409505e-05, ValLoss: 7.035362796159461e-05 Epoch 158, Loss: 7.777188147883862e-05, ValLoss: 6.985614891164005e-05 Epoch 159, Loss: 7.751920202281326e-05, ValLoss: 6.963971827644855e-05 Epoch 160, Loss: 7.709070632699877e-05, ValLoss: 6.958712765481323e-05 Epoch 161, Loss: 7.660505070816725e-05, ValLoss: 6.935084093129262e-05 Epoch 162, Loss: 7.614598871441558e-05, ValLoss: 6.884124013595283e-05 Epoch 163, Loss: 7.607352017657831e-05, ValLoss: 6.846713222330436e-05 Epoch 164, Loss: 7.544943946413696e-05, ValLoss: 6.838155968580395e-05 Epoch 165, Loss: 7.519159407820553e-05, ValLoss: 6.787291204091161e-05 Epoch 166, Loss: 7.489386916859075e-05, ValLoss: 6.79543154546991e-05 Epoch 167, Loss: 7.460633059963584e-05, ValLoss: 6.749135354766622e-05 Epoch 168, Loss: 
7.422864291584119e-05, ValLoss: 6.722264515701681e-05 Epoch 169, Loss: 7.39458337193355e-05, ValLoss: 6.71482557663694e-05 Epoch 170, Loss: 7.351837848545983e-05, ValLoss: 6.697660137433559e-05 Epoch 171, Loss: 7.321776502067223e-05, ValLoss: 6.651913281530142e-05 Epoch 172, Loss: 7.30101382941939e-05, ValLoss: 6.634639430558309e-05 Epoch 173, Loss: 7.273530354723334e-05, ValLoss: 6.630265852436423e-05 Epoch 174, Loss: 7.205294969025999e-05, ValLoss: 6.573236896656454e-05 Epoch 175, Loss: 7.18759692972526e-05, ValLoss: 6.560175825143233e-05 Epoch 176, Loss: 7.14152047294192e-05, ValLoss: 6.522313196910545e-05 Epoch 177, Loss: 7.114845357136801e-05, ValLoss: 6.512289110105485e-05 Epoch 178, Loss: 7.089696737239137e-05, ValLoss: 6.476708949776366e-05 Epoch 179, Loss: 7.061049836920574e-05, ValLoss: 6.44790634396486e-05 Epoch 180, Loss: 7.042722427286208e-05, ValLoss: 6.423613376682624e-05 Epoch 181, Loss: 7.016076415311545e-05, ValLoss: 6.410154310287908e-05 Epoch 182, Loss: 6.973155541345477e-05, ValLoss: 6.386579480022192e-05 Epoch 183, Loss: 6.951358227524906e-05, ValLoss: 6.366533489199355e-05 Epoch 184, Loss: 6.915105041116476e-05, ValLoss: 6.35671749478206e-05 Epoch 185, Loss: 6.871091318316758e-05, ValLoss: 6.327741721179336e-05 Epoch 186, Loss: 6.885244511067867e-05, ValLoss: 6.290139572229236e-05 Epoch 187, Loss: 6.86003768350929e-05, ValLoss: 6.2580693338532e-05 Epoch 188, Loss: 6.824215961387381e-05, ValLoss: 6.266911805141717e-05 Epoch 189, Loss: 6.786196900065988e-05, ValLoss: 6.235529872355983e-05 Epoch 190, Loss: 6.745967402821407e-05, ValLoss: 6.189324631122872e-05 Epoch 191, Loss: 6.704821134917438e-05, ValLoss: 6.205453246366233e-05 Epoch 192, Loss: 6.684808613499627e-05, ValLoss: 6.154912989586592e-05 Epoch 193, Loss: 6.671046139672399e-05, ValLoss: 6.137994205346331e-05 Epoch 194, Loss: 6.641817162744701e-05, ValLoss: 6.115304859122261e-05 Epoch 195, Loss: 6.60835939925164e-05, ValLoss: 6.0879538068547845e-05 Epoch 196, Loss: 6.60102377878502e-05, 
ValLoss: 6.0658880101982504e-05 Epoch 197, Loss: 6.582066271221265e-05, ValLoss: 6.064143235562369e-05 Epoch 198, Loss: 6.551038677571341e-05, ValLoss: 6.036843478796072e-05 Epoch 199, Loss: 6.522988405777141e-05, ValLoss: 6.016541374265216e-05
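As a rough way to judge when further epochs stop helping, a simple plateau check on the validation loss can be sketched as follows (this is an illustrative heuristic, not part of the AUTOMAP code; the threshold and patience values are assumptions):

```python
def epochs_until_plateau(val_losses, min_rel_improve=0.01, patience=3):
    """Return the epoch at which validation loss has failed to improve
    by at least min_rel_improve (relative) for `patience` consecutive
    epochs, or the last epoch if it never plateaus."""
    best = val_losses[0]
    stale = 0
    for epoch, loss in enumerate(val_losses[1:], start=1):
        if loss < best * (1 - min_rel_improve):
            best = loss   # meaningful improvement: reset the counter
            stale = 0
        else:
            stale += 1
            if stale >= patience:
                return epoch
    return len(val_losses) - 1

# Validation losses from epochs 195-199 of the log above:
tail = [6.0880e-05, 6.0659e-05, 6.0641e-05, 6.0368e-05, 6.0165e-05]
print(epochs_until_plateau(tail))  # -> 3: improvement is under 1%/epoch by the end
```

By this measure the run above is well into diminishing returns by epoch 200, even though the loss is still creeping downward.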

MSROSENLAB commented 2 years ago

Hello Hao,

My student Danyal will be in touch shortly to help you.

With very best regards, Matt


superresolution commented 2 years ago

Great! Thanks a lot for your quick response!

danyalb commented 2 years ago

Hi Hao,

I normally train for 100 epochs. Your MSE is 9e-5 after 100 epochs, which is fine. You could also decrease the learning rate to 0.000001 (1e-6), which might help.

Can you confirm a few things? Are you using the latest AUTOMAP code, uploaded on November 9th, 2021? Did you train two networks, one each for the real and imaginary components? After reshaping, you should then do something like `imagesc(abs(complex(real_img, imag_img))); colormap gray`. Can you compare that predicted magnitude image to the image formed by combining the ground-truth real and imaginary components? Do they look similar? Keep in mind that this data is 64 by 64, so neither image is the best resolution to begin with; however, the truth and predicted images should look similar.
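For reference, the same magnitude combination can be sketched in Python/NumPy (the array names and values below are placeholders standing in for the outputs of the two networks):

```python
import numpy as np

# Placeholder predictions from the two networks (real and imaginary
# channels), each already reshaped to the 64x64 image grid.
real_img = np.full((64, 64), 3.0)  # stand-in for the real-channel output
imag_img = np.full((64, 64), 4.0)  # stand-in for the imaginary-channel output

# Combine into a complex image and take the magnitude, analogous to
# MATLAB's abs(complex(real_img, imag_img)).
magnitude = np.abs(real_img + 1j * imag_img)

print(magnitude.shape)   # -> (64, 64)
print(magnitude[0, 0])   # -> 5.0 for the 3-4-5 placeholder values
```

The same combination applied to the ground-truth real and imaginary images gives the reference magnitude image to compare against.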

Best, Danyal


superresolution commented 2 years ago

Hi Danyal,

Maybe I am using the old version. Yes, I trained the real and imaginary networks separately.

This figure is reconstructed from the ground-truth real and imaginary data:

[image attachment: ground-truth reconstruction]

This figure is reconstructed from the predicted real and imaginary data:

[image attachment: predicted reconstruction]

It seems some details are missing. I will try the new version to see if it does much better. Thanks for your kind response.

Best,

Hao

Post-doctoral Fellow, Laboratory of Dr. Ivan de Araujo, Nash Family Department of Neuroscience, Icahn School of Medicine at Mount Sinai

