RossPitman closed this issue 6 years ago
Not sure; it would entail server-side processing.
@kenwheeler is this correct? or can I change something in my use of slickjs to support this problem?
You can also check whether the problem is on the R side, since the images need to be processed before slick can use them. If that's the case, there are probably more things that can be done.
I don't think it's an issue with R (or at least with the code used to get the observeEvent). Within the observeEvent, paste0 passes a vector of image names/directories for slick to render. Hopefully there's a way around this.
I meant this line in the main script: the lines you pass get read and parsed.
I could try to allow for parallelization instead of a basic lapply; I never expected people would use it for a large vector of images.
Ahh, I see what you mean now. Yeah, slickR is probably the best way to view images from within a shiny app. We use it to view hundreds (or thousands) of camera trap images for scientific research.
This might be helpful: an option in slick is lazy loading.
// To use lazy loading, set a data-lazy attribute
// on your img tags and leave off the src
<img data-lazy="img/lazyfonz1.png"/>
$('.lazy').slick({
lazyLoad: 'ondemand',
slidesToShow: 3,
slidesToScroll: 1
});
Or, if the images are on a separate server, you could send the URLs instead of local images; that may speed things up.
@bborgesr @daattali any ideas, on where efficiency can be increased?
@RossPitman you should try to use https://rstudio.github.io/profvis/ (or even just some simple timers that calculate difftime) to identify whether the slow part is on the rendering side or on the loading-into-the-browser side.
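For a quick check without full profiling, you could bracket the slickR call with timers. This is only a rough sketch; `image_paths` is a placeholder for your own vector of image files:

```r
# Rough sketch: time just the slickR call to see whether the slowdown
# is on the R side (building the widget) or in the browser.
library(slickR)

image_paths <- list.files("images", full.names = TRUE)  # placeholder path

t0 <- Sys.time()
widget <- slickR(obj = image_paths)
message("slickR build took ",
        round(difftime(Sys.time(), t0, units = "secs"), 2), " secs")

# Or profile the whole expression interactively:
# profvis::profvis(slickR(obj = image_paths))
```

If the build time is small but the page still takes minutes to appear, the bottleneck is likely the browser loading the embedded images rather than R.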
Regardless, one technique that will definitely work is splitting the images and only rendering subsets at a time. This will be the simplest way to reduce what is processed by slickR at a time, and will also lower the burden of loading thousands of images into the browser at once.
Split the dataframe into chunks of, say, 20-100 images, so now you have a list of dataframes. You can then prev/next to the next batch of images, and slickR will only load that subset.
You can also go wild and do things like put the subsets in different tabs or other ways of navigating.
Let me know if you have other questions about this approach and I can likely provide some snippets
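A hypothetical sketch of the chunking approach (all names here are placeholders; it assumes a local `images/` folder and slickR's `slickROutput`/`renderSlickR` pair):

```r
# Sketch: split the image vector into batches of ~50 and only hand the
# current batch to slickR, with prev/next buttons to page through batches.
library(shiny)
library(slickR)

image_paths <- list.files("images", full.names = TRUE)  # placeholder path
batches <- split(image_paths, ceiling(seq_along(image_paths) / 50))

ui <- fluidPage(
  actionButton("prev_batch", "Previous"),
  actionButton("next_batch", "Next"),
  slickROutput("carousel")
)

server <- function(input, output, session) {
  idx <- reactiveVal(1)  # index of the current batch
  observeEvent(input$next_batch, idx(min(idx() + 1, length(batches))))
  observeEvent(input$prev_batch, idx(max(idx() - 1, 1)))

  output$carousel <- renderSlickR({
    slickR(obj = batches[[idx()]])  # only the current batch is rendered
  })
}

shinyApp(ui, server)
```

With this structure slickR only ever processes one batch, so both the R-side work and the browser payload stay bounded regardless of the total image count.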
@dpastoor Thanks for your suggestions. I hadn't yet come across profvis, so I'll definitely try to make use of it. Currently, I let the app user specify which cameras, and therefore which images, they want to view (see the attached screenshot showing the "Filter Images" tabBox at the top; hope it uploads correctly). This limits the set to far fewer images than would otherwise be rendered, but it can still be in excess of 2,000-5,000. I was hoping not to split the images further, since I'm not sure how that would impact the user experience. The idea of prev/next buttons sounds useful, but I'm not sure how to make that clean and un-clunky. I'd be very interested to see some snippets of the ideas you might have!
@yonicd Thanks for the lazy loading approach. This may be a great solution. Though I'm not sure how I would implement this in slickR? Is this even a possibility in slickR?
Many thanks!
good point, i'll add it as an option when i get a chance
if the images are time stamped you can create a slider to filter to a given time interval (instead of prev/next)
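That slider idea could look roughly like this. This is a hypothetical sketch: it assumes a data frame `image_df` with a `path` column and a POSIXct `taken_at` timestamp column, neither of which exists in the example above:

```r
# Sketch: filter images to a time window with a slider instead of
# prev/next buttons; only the filtered subset is passed to slickR.
library(shiny)
library(slickR)

ui <- fluidPage(
  sliderInput("window", "Time window",
              min = min(image_df$taken_at),
              max = max(image_df$taken_at),
              value = range(image_df$taken_at),
              timeFormat = "%Y-%m-%d"),
  slickROutput("carousel")
)

server <- function(input, output, session) {
  output$carousel <- renderSlickR({
    keep <- image_df$taken_at >= input$window[1] &
            image_df$taken_at <= input$window[2]
    slickR(obj = image_df$path[keep])
  })
}

shinyApp(ui, server)
```

The same caveat as with batching applies: a wide time window would still hand slickR a large vector, so in practice you might cap the number of images per window.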
@yonicd I'm not sure of the specifics of how it works with R, but typically in this situation you'd look to do some kind of virtualization/windowing.
@kenwheeler is there an example of slick + virtualization (not in a React environment)? Thanks again.
@RossPitman Some of this comes down to how you're loading information into slickR. Are you just providing it an external URL (e.g., images hosted elsewhere) for the user's browser to fetch, or are you actually serving the images from the Shiny app through slickR?
If you are just providing external URLs for slick to fetch on the client side, then you want lazy fetching to reduce the client burden on initial page load, so the browser only needs to fetch the relevant images as you move along the carousel. I do NOT think this is the case, as it shouldn't cause the slowdowns you are experiencing, but I can't be sure until you actually profile.

On the flip side, if the slowdown is on the R side, where the images are embedded and generated during render before being sent to the browser, then there is no way around chunking the data. You are literally asking to process and send an enormous amount of data down the wire. To use a data-loading example, it'd be like saying "I'm not sure why read.csv is so slow when I try to load thousands of CSVs by doing lapply(csv_vector, read.csv)" just so the user can look at one table at a time. In reality you'd say "just load the data that the user wants to view and show it to them then, or maybe the next few as well, but definitely not all at once".
@yonicd @dpastoor Apologies for my late response; I've been travelling since we last discussed this topic. I've created a reproducible example (images and code can be downloaded here: https://www.dropbox.com/sh/u19eaw9u9rkjpc1/AACga8qE1Vk5YefNykRaOxvMa?dl=0; please let me know if there's an issue with the download). The images total ~200 MB (for ~500 images); sorry for the large size, but I wanted to provide a good example. I ran this through profiling and can see that large chunks of time are used up during the slickR call (assuming I understand the profile output). Many thanks for any advice!
FYI: once you've loaded the data file, browsed to the images folder provided, and clicked "View Images", I'd suggest waiting a minute or two for the images to render.
@yonicd Just checking to see if there's an update to this issue?
I don't think this is something that slickR will be able to handle. The problem is almost certainly that passing in all the images causes everything to be generated/loaded at once.
My suggestion, as I mentioned above, would be to add a second level of control so slickR only renders sets of images at a time, with additional forward/backward arrows to load the next batch.
OK thanks very much @dpastoor I'll work on integrating batching of the images.
I'm not sure if this is a limitation of Shiny or of slick itself, but when rendering a large number of images (e.g., >500) using slickR, the process can take many minutes to execute. In my example code, I use a filter button which, when fired, passes a data frame of image names (the images are stored on a local SSD). This works fantastically when only a few images are rendered, but becomes almost unusable when data sets are large. Is there any way to optimise my approach? See below: