Brown-University-Library / OLD-ARCHIVED_iip-production


optimize map loading on main search page #56

Open · emylonas opened this issue 4 years ago

emylonas commented 4 years ago

The page is non-functional until all of the facet data and the map have loaded. This is especially burdensome in places with slow network connections. It would be great to figure out a way to optimize the loading.

emylonas commented 3 years ago

This is probably happening because the map coordinates are loaded from Pleiades each time the page loads. If so, one solution is to look up the coordinates when inscriptions are indexed; this would require a change to the Solr XSL and a change to the search page code.
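For illustration, a minimal sketch of that index-time lookup, assuming Pleiades' per-place JSON export (which includes a representative point); the caching and output shape here are hypothetical:

```js
// Hypothetical index-time helper: resolve a Pleiades place URL to
// coordinates once, during indexing, instead of on every page load.
// Assumes the Pleiades JSON export, whose "reprPoint" field is
// [longitude, latitude]; the cache and return shape are illustrative.
const cache = new Map();

async function pleiadesCoords(pleiadesUrl) {
  if (cache.has(pleiadesUrl)) return cache.get(pleiadesUrl);
  const resp = await fetch(pleiadesUrl.replace(/\/?$/, '/json'));
  const place = await resp.json();
  const [long, lat] = place.reprPoint || [];
  const coords = lat != null ? { lat, long } : null; // written into the Solr doc
  cache.set(pleiadesUrl, coords);
  return coords;
}
```

Each distinct Pleiades URL would then be fetched only once during indexing, and the search page could read the stored coordinates straight from Solr.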

atbradley commented 3 years ago

I've made a few minor changes to the mapsearch.js script running on dlib. It should be a little faster, especially on slower connections. https://dlibwwwcit.services.brown.edu/iip/mapsearch/

atbradley commented 3 years ago

The query that's used to build the map returns only Pleiades URLs and a count for each. It doesn't look at item-level details, so it wouldn't pick up any Pleiades data we index in Solr. If we want to limit the number of requests, we could download the Pleiades data and build a simple API that takes a list of Pleiades IDs and returns geodata for all of them in a single request.
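If we went that route, the API could be very small. A sketch, assuming an Express app and a local JSON dump keyed by Pleiades ID (both made up for illustration):

```js
// Hypothetical batch endpoint over a downloaded Pleiades dump: the map
// script makes one request for all of its IDs instead of one per place.
// The dump format and route are assumptions, not an existing service.
const express = require('express');
const geodata = require('./pleiades-geodata.json'); // { "<id>": { lat, long }, ... }

const app = express();

app.get('/geodata', (req, res) => {
  const ids = String(req.query.ids || '').split(',');
  const found = {};
  for (const id of ids) {
    if (geodata[id]) found[id] = geodata[id];
  }
  res.json(found);
});

app.listen(3000);
```

The map script would then call something like `/geodata?ids=123,456` and get everything back in one response.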

I think my pull request above has the highest improvement:effort ratio we can manage here. For anything else, I'd probably have to refactor or rewrite large chunks of the map script. I could remove a lot of the code smell, but I don't know how much the end user would notice.

atbradley commented 3 years ago

Actually, it looks like there are multiple steps in building the map. The part that creates the map markers uses the location name and region from Solr for the first search result at each location. That doesn't need to happen during initial page setup and should probably be deferred until the user clicks a circle on the map.
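For example (a sketch only: it assumes a Leaflet map and invents the Solr field names, since I haven't traced the real ones):

```js
// Sketch of deferring the per-location Solr lookup until the user clicks
// a marker. Assumes a Leaflet map; the Solr query and field names
// ("placename", "region") are placeholders, not the script's actual ones.
function addLocationMarker(map, loc) {
  const marker = L.circleMarker([loc.lat, loc.long], { radius: 8 }).addTo(map);
  marker.on('click', async () => {
    // Only now fetch the first result's name and region for this location.
    const resp = await fetch(
      `/solr/iip/select?q=pleiades_url:"${loc.url}"&rows=1&wt=json`
    );
    const doc = (await resp.json()).response.docs[0] || {};
    marker
      .bindPopup(`${doc.placename || ''}, ${doc.region || ''} (${loc.count} inscriptions)`)
      .openPopup();
  });
}
```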

I've spent this morning looking at the script, and I think I was right yesterday: the code is spaghetti-ish enough that it would probably be easier to rewrite than to improve further.