What feature would you like to see?

The project will soon be hosted somewhere that will never shut it down, so we need a way for the scraped data (the returned articles object) to stay fresh.
This was previously discussed in #27, so please check there for reference.
Possible implementation
setTimeout() seems to be the simplest way to achieve something like this; check the docs to see how the function works.
I would suggest creating a `const freshArticles = {}` and using `setTimeout()` to trigger the scraping function to populate this new object, then, at the end of the process, replacing the `articles` object with the contents of `freshArticles`. That way we don't have any blank spots if someone makes a call to the API while the "refresh" is underway.
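A minimal sketch of that double-buffer idea (`scrapeArticles()`, its return shape, and the interval are assumptions standing in for the project's real scraping function):

```javascript
let articles = {};

// Placeholder for the real scraper; the actual function name and the
// shape of the returned object are assumptions for this sketch.
async function scrapeArticles() {
  return { example: ['fresh article'] };
}

const REFRESH_INTERVAL_MS = 60 * 60 * 1000; // e.g. once an hour

// Build the new data in a separate object, then swap it in with a single
// assignment, so an API call made mid-scrape still sees complete data.
async function refreshArticles() {
  try {
    const freshArticles = await scrapeArticles();
    articles = freshArticles;
  } catch (err) {
    // On failure, keep serving the stale (but complete) data.
    console.error('refresh failed:', err);
  }
}

// At server startup: run once, then re-arm with setTimeout after each run.
// Chaining setTimeout (rather than using setInterval) avoids overlapping
// scrapes if one pass takes longer than the interval.
function scheduleRefresh() {
  refreshArticles().then(() =>
    setTimeout(scheduleRefresh, REFRESH_INTERVAL_MS)
  );
}
```

Calling `scheduleRefresh()` once at startup would keep the loop going for the lifetime of the process.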
Additional information
Even if this issue gets assigned to someone, the discussion will remain open to accept suggestions of ways to implement this.