🆕 Functionality - Playlist Service looks for cached playlists before generating new ones

Highlight - It looks for both the requested playlist and the inverse playlist.

When a playlist is requested:

1) Look for a playlist that fits the exact parameters.
2) Look for a playlist that fits the inverse parameters.

   Example - Requested params:
   - User 1 - id: 1
   - User 2 - id: 2
   - Preference - USER1-ONLY

   Inverse:
   - User 1 - id: 2
   - User 2 - id: 1
   - Preference - USER2-ONLY

3) Generate the playlist.
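The lookup order above can be sketched as follows. This is illustrative only: the `PreferenceType` values, the key format, and the `invert`/`getPlaylist`/`generatePlaylist` helpers are assumptions, not code from this PR — only the `createPlaylistKey` signature appears here.

```typescript
// Sketch of the cache lookup flow. The key scheme is an assumption;
// only the createPlaylistKey signature is shown in this PR.
type PreferenceType = "USER1-ONLY" | "USER2-ONLY" | "BOTH";

function createPlaylistKey(users: [number, number], preferenceType: PreferenceType): string {
  return `${users[0]}:${users[1]}:${preferenceType}`;
}

// The inverse request swaps the user ids and mirrors the preference.
function invert(users: [number, number], preferenceType: PreferenceType): [[number, number], PreferenceType] {
  const inversePreference: PreferenceType =
    preferenceType === "USER1-ONLY" ? "USER2-ONLY" :
    preferenceType === "USER2-ONLY" ? "USER1-ONLY" : "BOTH";
  return [[users[1], users[0]], inversePreference];
}

function getPlaylist(cache: Map<string, string[]>, users: [number, number], preferenceType: PreferenceType): string[] {
  // 1) Look for a playlist that fits the exact parameters.
  const exact = cache.get(createPlaylistKey(users, preferenceType));
  if (exact) return exact;

  // 2) Look for a playlist that fits the inverse parameters.
  const [invUsers, invPref] = invert(users, preferenceType);
  const inverse = cache.get(createPlaylistKey(invUsers, invPref));
  if (inverse) return inverse;

  // 3) Generate the playlist.
  return generatePlaylist(users, preferenceType);
}

function generatePlaylist(users: [number, number], preferenceType: PreferenceType): string[] {
  return []; // stand-in for the expensive generation step
}
```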
Impact - The site appears 10-20x faster.

| Request time    | Without pre-cache | With pre-cache |
| --------------- | ----------------- | -------------- |
| Median          | 1.2s              | 150ms          |
| 90th percentile | 2.0s              | 200ms          |
| 99th percentile | 4.5s              | 250ms          |
Caveat - No speed-up is provided if the pre-load has not completed.

Approach for Future Optimization - Cache the pending preload calls and return them.

The preload call already kicks off 3 playlist-generation calls. We can store those pending calls in a cache. If those playlists are requested while the calls are pending, the new request can attach a callback and just wait for the in-flight call to finish, rather than kick off a new request.
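A minimal sketch of that optimization, assuming the generation calls return promises. The names here (`getOrGenerate`, `pending`) are hypothetical, not code from this PR:

```typescript
// Sketch of the proposed optimization: cache the in-flight promise so a
// request made while preloading is still running awaits the same call
// instead of kicking off a new generation request.
const pending = new Map<string, Promise<string[]>>();

async function getOrGenerate(key: string, generate: () => Promise<string[]>): Promise<string[]> {
  const inFlight = pending.get(key);
  if (inFlight) return inFlight; // piggyback on the pending call

  // Kick off the call and clean up the entry once it settles.
  const p = generate().finally(() => pending.delete(key));
  pending.set(key, p);
  return p;
}
```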
Benchmarks are approximate - These numbers are based on informal benchmarking.
- Median - what I saw most often
- 90th percentile - numbers I saw a few times during testing
Eagerly Generate Playlist and Store in a Cache

Playlists are taking a long time to load. For more, see Issue #25.

Features

🆕 Class - `PlaylistCache` - Stores entire playlists, keyed by:

```typescript
function createPlaylistKey(users: [number, number], preferenceType: PreferenceType): string
```
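A minimal sketch of what `PlaylistCache` might look like, assuming it is a thin wrapper around a keyed map. Only the `createPlaylistKey` signature appears in this PR; the key format and the class internals below are assumptions:

```typescript
// Hypothetical sketch of PlaylistCache; the real class is not shown in
// this description, and the key scheme here is an assumption.
type PreferenceType = "USER1-ONLY" | "USER2-ONLY" | "BOTH";

function createPlaylistKey(users: [number, number], preferenceType: PreferenceType): string {
  return `${users[0]}:${users[1]}:${preferenceType}`;
}

class PlaylistCache {
  private store = new Map<string, string[]>();

  get(users: [number, number], preference: PreferenceType): string[] | undefined {
    return this.store.get(createPlaylistKey(users, preference));
  }

  set(users: [number, number], preference: PreferenceType, playlist: string[]): void {
    this.store.set(createPlaylistKey(users, preference), playlist);
  }
}
```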
Dependency - `lru-cache` - This library looks well supported.
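For illustration only, the eviction behavior that `lru-cache` provides can be sketched with a plain `Map`, which iterates in insertion order. This toy `TinyLRU` class is not part of the PR and is far less capable than the real library:

```typescript
// Toy LRU cache illustrating the idea behind the lru-cache dependency:
// a size-bounded map that evicts the least recently used entry.
class TinyLRU<K, V> {
  private map = new Map<K, V>();

  constructor(private max: number) {}

  get(key: K): V | undefined {
    const value = this.map.get(key);
    if (value !== undefined) {
      // Re-insert to mark this entry as most recently used.
      this.map.delete(key);
      this.map.set(key, value);
    }
    return value;
  }

  set(key: K, value: V): void {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.max) {
      // Evict the least recently used entry (first key in insertion order).
      this.map.delete(this.map.keys().next().value as K);
    }
  }
}
```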
🆕 Functionality - Playlist Service looks for cached playlists before generating new ones, including the inverse playlist (detailed above).