papasanimohansrinivas opened this issue 3 years ago
Hi!
The models should follow the same block structure as the other models, so it shouldn't be necessary to implement new blocks.
I suggest checking whether A3, A4 and A5 require a positional encoding, and looking at the architecture details of the original implementation directly in the source code. The paper may not be up to date and may lack details; you can find the link to the original source code in the README.
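As a rough way to do that check, something like the sketch below could work. It assumes you have downloaded one of the official TensorFlow checkpoints (the README links to them); the local path is only a placeholder.

```python
# Minimal sketch, assuming a locally downloaded official TF checkpoint;
# the path below is a hypothetical placeholder.
import tensorflow as tf

ckpt_path = "movinet_a3_stream/ckpt-1"  # hypothetical path to the downloaded checkpoint
reader = tf.train.load_checkpoint(ckpt_path)

# List every variable name and shape to study the architecture,
# and flag anything that looks like a positional-encoding table.
for name, shape in sorted(reader.get_variable_to_shape_map().items()):
    print(name, shape)
    if "pos" in name.lower():
        print("  ^ possible positional-encoding variable")
```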
I also suggest you look at weight_load.ipynb and check how I managed to load the weights. You need to look at the names of the layers in order to convert the weights to PyTorch. For the models already supported, the weight names are loaded into the variable loaded_list in weight_load.ipynb.
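To give an idea of the overall flow, here is a minimal sketch of that name-matching approach. The checkpoint path, the entries of name_map and the model variable are all placeholders you would have to fill in after inspecting the actual layer names; it is not the exact procedure used in weight_load.ipynb.

```python
# Minimal sketch of converting TF checkpoint weights to a PyTorch state_dict
# by matching names; every concrete name below is a hypothetical example.
import tensorflow as tf
import torch

reader = tf.train.load_checkpoint("movinet_a3_stream/ckpt-1")  # hypothetical path

# Hand-written correspondence, one entry per parameter, in the spirit of
# loaded_list in weight_load.ipynb; the entries shown are examples only.
name_map = {
    "base/block0_layer0/conv/kernel": "blocks.0.0.conv.weight",
    # ... fill in the rest after printing the checkpoint variable names
}

state_dict = model.state_dict()  # model: your PyTorch MoViNet A3/A4/A5 instance (assumed to exist)
new_state = {}
for tf_name, torch_name in name_map.items():
    t = torch.from_numpy(reader.get_tensor(tf_name))
    # Conv3D kernels are typically DHWIO in TF and (out, in, D, H, W) in PyTorch.
    if t.ndim == 5:
        t = t.permute(4, 3, 0, 1, 2).contiguous()
    assert t.shape == state_dict[torch_name].shape, (tf_name, torch_name)
    new_state[torch_name] = t

# strict=False only while the mapping is still incomplete.
model.load_state_dict(new_state, strict=False)
```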
That's what comes to mind; let me know if you need anything more specific.
Hi @Atze00, many thanks for implementing the PyTorch version of MoViNets; it has benefited my project greatly.
To take my project to the next level, I need to switch to the streaming versions, since the bigger the input dimensions, the better.
So I decided to implement the A3, A4, A5 streaming versions of MoViNets myself if need be, and maybe contribute them back to the community.
Could you kindly point me to where to start, and give some tips for doing it?
Thanks!
Hi @papasanimohansrinivas,
Thank you for your great idea. Did you finish reproducing the streaming A3, A4, A5 models? It would be great if you could share your work.
Thanks!
@papasanimohansrinivas, any progress on reproducing the streaming A3, A4, A5 models? Eagerly looking forward to your work.