torralba-lab / im2recipe-Pytorch

im2recipe Pytorch implementation

Visualization #13

Open rahilwazir opened 5 years ago

rahilwazir commented 5 years ago

In the lua version, you have a section to see the visualization of the trained model, how this will be achieved with Pytorch once the model is trained?

nhynes commented 5 years ago

I'd send you some python code for doing the vis, but I no longer have access to the system :(

On the bright side, it's not too hard (and in fact much easier!) to load the embeddings with numpy and display them in the TensorBoard projector or an IPython notebook. Please let me know if you have any questions about doing either of these!
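
For anyone landing here later, a minimal sketch of that approach, assuming the test script has dumped the image embeddings and their ids to pickle files (the paths and file names below are assumptions, not the repo's guaranteed output layout):

```python
# Minimal sketch: load saved embeddings with numpy and push them to the
# TensorBoard projector. File names below are assumptions -- adjust to
# wherever your test run actually writes its outputs.
import pickle

import numpy as np
import torch
from torch.utils.tensorboard import SummaryWriter

with open('results/img_embeds.pkl', 'rb') as f:   # assumed output path
    img_embeds = np.asarray(pickle.load(f))
with open('results/img_ids.pkl', 'rb') as f:      # assumed output path
    img_ids = list(pickle.load(f))

writer = SummaryWriter(log_dir='runs/im2recipe_vis')
writer.add_embedding(torch.from_numpy(img_embeds),
                     metadata=img_ids, tag='image_embeddings')
writer.close()
# Then run `tensorboard --logdir runs` and open the Projector tab.
```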

rahilwazir commented 5 years ago

@nhynes Thank you so much for the reply! Actually, I need to visualize this in a web app; honestly, I'm not sure which option is better for that case. Can you please advise?

nhynes commented 5 years ago

Ah, a web app is going to require a bit of trickery to display the results on the page, but the piece that generates the results is much more easily done with a Python backend.

Do you need the results to be real-time or do you just want to see what you get for your own viewing? In the latter case, a Jupyter notebook would be ideal since you can plot the images inline using matplotlib.
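
For the notebook route, a minimal sketch of pulling up the nearest recipes for a query image and plotting inline; the embedding files and the query image path are assumptions, not fixed names from the repo:

```python
# Notebook sketch: rank recipe embeddings against one image embedding and
# show the query image inline. All paths here are placeholders.
import pickle

import matplotlib.pyplot as plt
import numpy as np
from PIL import Image

with open('results/img_embeds.pkl', 'rb') as f:   # assumed output path
    img_embeds = np.asarray(pickle.load(f))
with open('results/rec_embeds.pkl', 'rb') as f:   # assumed output path
    rec_embeds = np.asarray(pickle.load(f))

# Cosine similarity between one query image embedding and all recipe embeddings.
q = img_embeds[0]
sims = rec_embeds @ q / (np.linalg.norm(rec_embeds, axis=1) * np.linalg.norm(q) + 1e-8)
top5 = np.argsort(-sims)[:5]
print('Top-5 recipe indices:', top5, 'scores:', sims[top5])

# Inline display of the query image (placeholder path).
plt.imshow(Image.open('data/test_images/query.jpg'))
plt.axis('off')
plt.show()
```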

If you want to be able to serve im2recipe results to users, I had a good experience creating a lightweight Twisted webserver which sent the image links to a JavaScript frontend for viewing. Alas, I no longer have the code that does this or I would gladly give it to you!
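
A rough sketch of that kind of lightweight endpoint with twisted.web; the retrieve_links() helper is a placeholder for whatever produces the image links for the frontend, not code from this repo:

```python
# Serve retrieval results as JSON for a JavaScript frontend to render.
import json

from twisted.internet import reactor
from twisted.web.resource import Resource
from twisted.web.server import Site


def retrieve_links():
    # Placeholder: plug in your own retrieval results (image URLs) here.
    return []


class ResultsResource(Resource):
    isLeaf = True

    def render_GET(self, request):
        request.setHeader(b'content-type', b'application/json')
        return json.dumps({'images': retrieve_links()}).encode('utf-8')


reactor.listenTCP(8080, Site(ResultsResource()))
reactor.run()
```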

rahilwazir commented 5 years ago

@nhynes

I'm OK with running a backend server and frontend UI.

> Do you need the results to be real-time or do you just want to see what you get for your own viewing? In the latter case, a Jupyter notebook would be ideal since you can plot the images inline using matplotlib.

I want the former. The flow would be: the user uploads a recipe image and the server (Python backend) returns the result of running it through the model.

> If you want to be able to serve im2recipe results to users

Yes, exactly this. What I'm looking for is a simple command-line way to request results from the im2recipe model, which I can then serve to my users.

nhynes commented 5 years ago

Gotcha. Yeah, the best way I've found to do that is to convert the model to CPU and then create a worker pool of model executors (to service concurrent requests; the latency of a single request is pretty high). Then you just pop a webserver on top of that (e.g., Tornado) and have it send each request to the pool, wait for a response, and send the API response back to the client.
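
A minimal sketch of that pattern, using a stand-in ResNet-50 and standard ImageNet-style preprocessing in place of the real im2recipe checkpoint loading (which is not shown here):

```python
# CPU model + a thread pool of "model executors" + a Tornado server on top.
import io
from concurrent.futures import ThreadPoolExecutor

import torch
import torchvision.models as models
import torchvision.transforms as transforms
from PIL import Image
import tornado.ioloop
import tornado.web

# Placeholder model: swap in the trained im2recipe network loaded from its
# checkpoint and moved to CPU.
model = models.resnet50(pretrained=True).eval()
pool = ThreadPoolExecutor(max_workers=4)  # worker pool servicing concurrent requests

_tf = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])


def _do_inference(image_bytes):
    """Run one request through the model on a pool thread."""
    img = Image.open(io.BytesIO(image_bytes)).convert('RGB')
    x = _tf(img).unsqueeze(0)
    with torch.no_grad():
        return model(x)


class PredictHandler(tornado.web.RequestHandler):
    async def post(self):
        image_bytes = self.request.files['image'][0]['body']
        # Hand the heavy lifting to the pool so the IO loop stays responsive.
        out = await tornado.ioloop.IOLoop.current().run_in_executor(
            pool, _do_inference, image_bytes)
        self.write({'output': out.squeeze(0).tolist()})


if __name__ == '__main__':
    app = tornado.web.Application([(r'/predict', PredictHandler)])
    app.listen(8888)
    tornado.ioloop.IOLoop.current().start()
```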

rahilwazir commented 5 years ago

@nhynes Great. Here are a couple more questions for you:

1. Why do I need to convert the model to CPU?
2. Where can I find the model executors?

Thank you for the help so far!

nhynes commented 5 years ago

> Why do I need to convert the model to CPU?

If you have a GPU sitting around waiting for requests, then more power to you! For deployment, I found CPU boxes more reliable, but that was mostly because they worked better with Kerberos logins 📦

> Where can I find the model executors?

They're just threads in a thread pool that you call into when you receive a request on the main webserver thread. Basically, do round-robin scheduling: submit _do_inference(req) to the thread pool, where _do_inference more or less just does return model(unpack(req)). Really sorry I don't have code available any more...

demdecuong commented 5 years ago

I'm doing a course project at my university (University of Science, HCMUS) in which we have to follow and analyze a scientific paper on a topic we like, and I chose your paper since I am a food lover.

I am running your pretrained model (model_e500_v-8.950.pth.tar) fine now. However, I would like to see the visualization of this trained model and to view the recipe dataset. I have tried many ways to open the .mdb file on Ubuntu, but none of them worked. Could you send me some Python code for this?

Moreover, if I want to make a prediction on a new image, what size should the image be? Please give me some advice.
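
For reference, a minimal sketch for peeking into the dataset, assuming the .mdb files are LMDB databases written by the repo's preprocessing and that each value is a pickled sample; the path and the latin1 decoding are assumptions to check against the repo's own data loader:

```python
# Open the LMDB environment read-only and print the first few entries.
import pickle

import lmdb

env = lmdb.open('data/train_lmdb', readonly=True, lock=False,
                readahead=False, meminit=False)   # assumed directory name
with env.begin(write=False) as txn:
    cursor = txn.cursor()
    for i, (key, value) in enumerate(cursor):
        sample = pickle.loads(value, encoding='latin1')  # assumed encoding
        keys = list(sample.keys()) if isinstance(sample, dict) else sample
        print(key, type(sample), keys)
        if i >= 4:   # just peek at the first few entries
            break
env.close()
```

As for image size: the image branch is a standard CNN, so the usual ImageNet-style preprocessing (resize the short side to 256, then a 224x224 center crop) is what the loader's transforms typically apply; confirm against the transforms in the repo's data loader.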