arianaa30 opened this issue 1 year ago
Can we train a variation of the model that is lightweight and can run on a phone?
Does the S variant not work for you? It is lightweight enough to run smoothly on my iPhone.
> Can we train a variation of the model that is lightweight and can run on a phone?

> Does the S variant not work for you? It is lightweight enough to run smoothly on my iPhone.
Which S variant? Can you please elaborate? And is it possible to hook the filters on iPhone like in mpv?
> Can we train a variation of the model that is lightweight and can run on a phone?

> Does the S variant not work for you? It is lightweight enough to run smoothly on my iPhone.
Can I also run it easily on Android? Also, does the latest release have the S model size? It seems the older version has this:

> 5 network sizes (S/M/L/VL/UL).
> Which S variant? Can you please elaborate? And is it possible to hook the filters on iPhone like in mpv?
I'm referring to Anime4K_Restore_CNN_S.glsl. This is the smallest and most lightweight restore shader, and it should be able to run on any modern smartphone. It can be run individually or chained with other shaders. It is possible to chain and hook the shaders on iPhone like in mpv with Anime4KMetal. I was able to run Anime4K: Mode A (Fast) on an iPhone 13 Pro Max smoothly, with minimal frame drops.
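For reference, chaining shaders in mpv itself looks like this (the `~~/shaders/` path and the particular shader pair are just examples; pick whichever .glsl files suit your device):

```
# mpv.conf — chain the small restore shader with a 2x upscaler
# (on Windows, separate shader paths with ';' instead of ':')
glsl-shaders="~~/shaders/Anime4K_Restore_CNN_S.glsl:~~/shaders/Anime4K_Upscale_CNN_x2_S.glsl"
```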
> Can I also run it easily on Android?
I have not tested on Android, but according to Bloc (the Owner), it is possible https://github.com/bloc97/Anime4K/issues/99#issuecomment-897873000.
@Tama47 Getting back to this: we have an S variant for 2x upscaling, but do we have S and M sizes for the 4x upscalers as well?
> do we have S and M sizes for the 4x upscalers as well?
I don't think so. Why do you need S and M sizes for 4x upscaling?
Just like with 2x: if it doesn't run on a phone, I go for a more lightweight version.

Btw, do you know how I can train it on my dataset? There is a training script in the TensorFlow directory, but it is not complete. It needs a .npy array file, and it is not clear how to generate it.
> Just like with 2x: if it doesn't run on a phone, I go for a more lightweight version.
Honestly, even on an iPhone 15 Pro Max screen, I can still barely tell the difference when upscaling to 4K. On the rare occasions that I do watch on my phone rather than on my 4K TV, I usually just watch in 1080p directly from the Crunchyroll app. The difference is just too small. Does the 2x upscale not work for you? All you really need on a phone is maybe a small restore shader.
> Btw, do you know how I can train it on my dataset? There is a training script in the TensorFlow directory, but it is not complete. It needs a .npy array file, and it is not clear how to generate it.
Yeah sorry, I do not know either, only Bloc would be able to answer that.
I see the training is performed on 256x256 color images from the SYNLA dataset. How can I replace it with my own dataset? What is the purpose of the images being 256x256?
Can we train a variation of the model that is lightweight and can run on a phone? The train-model notebook doesn't have any explanations. Can you please add some notes to it? What data do I need exactly? (I want to run it on, say, old gameplay videos.) And what is `/hdd/sdb/SYNLA_Plus_4096.npy`?
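For anyone else stuck here, one plausible way to assemble such an array from your own images is sketched below. The function name and the array layout are assumptions (the training script's exact expected format is undocumented); only the 256x256 RGB patch size comes from this thread.

```python
import numpy as np

def build_training_array(images, patch=256):
    """Stack equal-sized RGB patches into one array.

    `images` is an iterable of HxWx3 uint8 arrays; each is center-cropped
    to `patch` x `patch`. Whether the Anime4K training script expects this
    exact (N, H, W, 3) uint8 layout is an assumption, not confirmed.
    """
    patches = []
    for img in images:
        h, w = img.shape[:2]
        if h < patch or w < patch:
            continue  # skip images smaller than the patch size
        top = (h - patch) // 2
        left = (w - patch) // 2
        patches.append(img[top:top + patch, left:left + patch])
    return np.stack(patches).astype(np.uint8)

# Example: two synthetic 300x300 "images" -> a (2, 256, 256, 3) array,
# saved the same way you would produce a file like SYNLA_Plus_4096.npy
data = build_training_array([np.zeros((300, 300, 3), np.uint8)] * 2)
np.save("my_dataset.npy", data)
```

In practice you would load real frames (e.g. with Pillow or OpenCV) instead of the synthetic arrays above, then point the training script at the saved .npy file.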