PathwayCommons / hyper-recent

Hyper-recent article feed
MIT License

Improving redirects #39

Closed · Suhyma closed this pull request 1 year ago

Suhyma commented 1 year ago

Implementing a function to check for redirects with the paper link pulled from our data. This pull request is just to put the code up here; it isn't super refined at the moment.

Refs #40
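
A minimal sketch of what such a redirect check could look like (not the code from this PR; the function name and reliance on Node 18+'s global fetch are assumptions):

```js
// Follow any redirects for a paper link and return the final URL.
// Node's global fetch (v18+) follows redirects by default and exposes
// the final location as response.url.
export async function resolveFinalUrl (paperUrl) {
  const response = await fetch(paperUrl, { redirect: 'follow' });
  return response.url; // URL after all redirects have been followed
}
```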

jvwong commented 1 year ago

You definitely have the right idea! My advice is now to fill out issue #40 to help us better define the feature/problem before doing any more coding.

cloudflare-pages[bot] commented 1 year ago

Deploying with Cloudflare Pages

Latest commit: e7cb0c8
Status: ✅  Deploy successful!
Preview URL: https://81c3bc8e.hyper-recent.pages.dev
Branch Preview URL: https://24-improve-redirects.hyper-recent.pages.dev


codecov[bot] commented 1 year ago

Codecov Report

Merging #39 (27612d7) into main (87f6597) will not change coverage. The diff coverage is n/a.

:exclamation: Current head 27612d7 differs from the pull request's most recent head e7cb0c8. Consider uploading reports for commit e7cb0c8 to get more accurate results.

@@          Coverage Diff          @@
##            main     #39   +/-   ##
=====================================
  Coverage   0.00%   0.00%           
=====================================
  Files          2       2           
  Lines        106     106           
=====================================
  Misses       106     106           


Suhyma commented 1 year ago

As I'm looking into caching papers, a few questions came up on how we should implement it:

  • Do we need a separate cache for each topic, or just one cache store for all the available papers?
  • Should the entire cache be deleted and recreated every night when getting data? Or do we just add cache entries for that day's new papers and delete outdated ones (assuming we display the latest papers from the last month)?

jvwong commented 1 year ago

> As I'm looking into caching papers, a few questions came up on how we should implement it:
>
>   • Do we need a separate cache for each topic, or just one cache store for all the available papers?
>   • Should the entire cache be deleted and recreated every night when getting data? Or do we just add cache entries for that day's new papers and delete outdated ones (assuming we display the latest papers from the last month)?

Maybe we used the term "cache", but I was thinking of something much simpler: just add an article URL to the data.json.

maxkfranz commented 1 year ago

By storing the address in the JSON, you're effectively caching it for the day (assuming a daily update interval).
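
As a rough illustration of that idea (the { doi, url } fields and the doi.org lookup are assumptions, not the project's actual data.json schema), the nightly rebuild could resolve each paper's link and store it alongside the record:

```js
import { writeFile } from 'node:fs/promises';

// Sketch: resolve each paper's DOI to its final article URL and write it into
// data.json, so the daily rebuild doubles as the "cache" for resolved links.
async function addArticleUrls (papers) {
  for (const paper of papers) {
    const response = await fetch(`https://doi.org/${paper.doi}`, { redirect: 'follow' });
    paper.url = response.url; // final URL after redirects
  }
  await writeFile('data.json', JSON.stringify({ papers }, null, 2));
}
```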

maxkfranz commented 1 year ago

Let’s go over the issue tomorrow during the meeting.

On Mar 21, 2023, at 14:55, Suhyma wrote:

@Suhyma commented on this pull request.

In src/data-search.js (https://github.com/PathwayCommons/hyper-recent/pull/39#discussion_r1143866287):

@@ -45,6 +46,15 @@ export async function getData () {
    };
    const collection = await Promise.all(config.map(doSearches));

  + // Find and save final link for each paper's DOI
  + const { papers } = collection;
  + for (const paper in papers) {

I attempted using the JS destructuring method here, but since papers is an array within the collection object it doesn't seem to work the same way as it does with keywords in line 41. Is there a better way to carry out the same function? I've been stuck on this for some time since destructuring is also new to me.

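To illustrate the destructuring point (a sketch only; the exact shape of `collection` here is assumed): object destructuring needs an object on the right-hand side, but Promise.all resolves to an array, so the paper lists have to be pulled out of each element and flattened instead.

```js
// Inside an async function such as getData ().
// Promise.all resolves to an array, so there is no `papers` property to destructure
// from `collection` directly. If each element is an object with a `papers` array,
// the lists can be flattened into one:
const collection = await Promise.all(config.map(doSearches)); // assumed shape: [{ papers: [...] }, ...]
const papers = collection.flatMap(result => result.papers);

for (const paper of papers) {
  // find and save the final link for each paper's DOI here
}
```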

maxkfranz commented 1 year ago

FYI: There are functions like uniqBy in lodash that make deduping much easier:
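
For example, a minimal sketch of deduping papers by DOI with uniqBy (the record shape and field name are assumptions):

```js
import _ from 'lodash';

// uniqBy keeps the first element for each unique key, so later duplicates are dropped.
const papers = [
  { doi: '10.1000/example-a', title: 'Preprint A' },
  { doi: '10.1000/example-a', title: 'Preprint A (duplicate)' },
  { doi: '10.1000/example-b', title: 'Preprint B' }
];

const deduped = _.uniqBy(papers, 'doi'); // only 'example-a' and 'example-b' remain
```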