clarete / hackernews.el

Hacker News client for Emacs
GNU General Public License v3.0

Support asynchronous retrieval #40

Open basil-conto opened 6 years ago

basil-conto commented 6 years ago

It should be possible to continue using Emacs whilst news items are fetched (and rendered), either optionally or by default. This would be especially nice for fetching large numbers of articles in one go.

I think something like the man.el user option Man-notify-method would be the best way to let users choose how they are notified of, and presented with, finalised hackernews buffers.
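
For illustration only, a minimal sketch of what such an option could look like, modelled loosely on Man-notify-method; the name hackernews-notify-method and the particular choices are my own assumptions, not anything defined by hackernews.el:

```elisp
;; Sketch only: a hypothetical user option modelled on Man-notify-method.
;; The option name and the exact set of choices are assumptions.
(defcustom hackernews-notify-method 'friendly
  "How to notify the user when an asynchronously fetched feed is ready.
`aggressive' -- switch to the hackernews buffer in the selected window.
`friendly'   -- display the buffer in another window without selecting it.
`polite'     -- announce completion in the echo area.
`quiet'      -- do nothing; the buffer can be visited manually later."
  :type '(choice (const aggressive)
                 (const friendly)
                 (const polite)
                 (const quiet))
  :group 'hackernews)
```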

If #39 is implemented, a uniform asynchronous interface to both url.el and subprocess invocations could in theory be straightforward, given the correspondence of url.el callbacks to process sentinels.
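
To make that correspondence concrete, here is a rough sketch (the hackernews--render entry point is hypothetical): url.el invokes its callback in the buffer holding the response, while a subprocess invokes a sentinel that can recover its buffer, so both paths can funnel into the same function.

```elisp
(require 'url)

;; url.el path: the callback runs in the response buffer.
(url-retrieve "https://hacker-news.firebaseio.com/v0/topstories.json"
              (lambda (_status)
                (hackernews--render (current-buffer))))

;; Subprocess path: the sentinel recovers the output buffer from the process.
(make-process
 :name "hackernews"
 :buffer (generate-new-buffer " *hackernews*")
 :command '("curl" "--silent"
            "https://hacker-news.firebaseio.com/v0/topstories.json")
 :sentinel (lambda (proc event)
             (when (string= event "finished\n")
               (hackernews--render (process-buffer proc)))))
```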

clarete commented 6 years ago

This is a very nice feature for sure. I don't mind the current behavior, but it could definitely be a great addition for some people. Also, if we ever wanted to implement displaying comments, this could be very useful for pulling comment threads, which might get way larger than the news feed.

basil-conto commented 6 years ago

For posterity, I'll be working on this feature in the branches blc/async and blc/async-playground. As its name suggests, the latter may, at any given time, be in complete and utter disarray.

basil-conto commented 6 years ago

@clarete @naclander Sorry about the delay on the asynchronous front further to https://github.com/clarete/hackernews.el/issues/44#issuecomment-367530666. Here's a status update for the blc/async branch:

Both synchronous and asynchronous retrieval work OOTB on all supported Emacs versions via either url.el or any external executable, so feel free to take a peek and give it a whirl. :)

What's left to do:

  1. Decide whether single vs multiple URL retrieval when using external tools (i.e. wget url1 url2 vs wget url1; wget url2) should be toggled by a defvar or a defcustom, or be implemented as separate backends and thus configured via the new user option hackernews-backend (any suggestions on this before I put up a PR for review are welcome; see the sketch after this list).
  2. Further abstract the code across backends (subject to the decision for (1)).
  3. Handle multiple existing retrievals of the same feed using the same approach as the Man-getpage-in-background function and Man-notify-method user option from the man.el library.
  4. Add/improve/update documentation.
  5. (Low priority - might open a separate issue for this.) Try to devise a way by which backends can be separated into different files, or at least by which features aren't required until needed, without angering the byte-compiler. For example, json shouldn't be required on Emacs 27 built with native JSON support, and mm-url shouldn't be required until an external grabber is invoked, if at all.
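
As a sketch of point 1, here is one possible shape for the proposed option, with each retrieval variation expressed as a separate backend; the symbols below are placeholders of my own, not what the branch actually defines:

```elisp
;; Sketch only: hypothetical backend choices for hackernews-backend.
(defcustom hackernews-backend 'url
  "Backend used to retrieve Hacker News items.
`url'           -- built-in url.el, one request per item.
`wget-single'   -- one wget invocation per URL.
`wget-multiple' -- a single wget invocation given all URLs at once."
  :type '(choice (const url)
                 (const wget-single)
                 (const wget-multiple))
  :group 'hackernews)
```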

So, not too much left to iron out. If tomorrow goes well I might have something almost ready. :)

basil-conto commented 6 years ago

Once again, I'm sorry for promising and not delivering anything other than some WIP branches. The preceding months proved too busy to tidy up the implementation and feature branches for this, and I'm currently in an important and uncertain transition period in my academic career, so no big promises until September.

Some notes on my older notes:

Decide whether single vs multiple URL retrieval when using external tools (i.e. wget url1 url2 vs wget url1; wget url2) should be toggled by a defvar, a defcustom or be implemented as separate backends, and thus be configured via the new user option hackernews-backend

The latest implementation I was working on tried to generalise this as much as possible by implementing each variation as a separate backend. What is still TBD, though, is how this should interplay with the new universal-async-argument being proposed for Emacs 27; see the emacs-devel threads beginning with the following messages for more details on this:

Further abstract the code across backends (subject to the decision for (1)).

Following the decision to generalise variations as different backends, the only abstraction still needed is to provide a mostly uniform interface to both process invocation and url.el retrieval, e.g. by passing url.el functions the same kind of plist as that assigned to subprocesses, and having both sentinels and callbacks expect similar argument lists.
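
Roughly what I have in mind, as an illustrative sketch only (every name below is hypothetical): the request state lives in one plist, which url.el receives as an extra callback argument and which subprocesses carry via process-put, so both paths end in the same handler.

```elisp
(require 'url)

(defun hackernews--handle (request buffer)
  "Render the response in BUFFER as described by the plist REQUEST."
  (message "Fetched %s (%d bytes)"
           (plist-get request :feed) (buffer-size buffer)))

;; url.el path: REQUEST arrives as an additional callback argument.
(defun hackernews--fetch-url (request)
  (url-retrieve (plist-get request :url)
                (lambda (_status req)
                  (hackernews--handle req (current-buffer)))
                (list request)))

;; Subprocess path: REQUEST is stored on the process object.
(defun hackernews--fetch-process (request)
  (let ((proc (make-process
               :name "hackernews"
               :buffer (generate-new-buffer " *hackernews*")
               :command (list "curl" "--silent" (plist-get request :url))
               :sentinel #'hackernews--sentinel)))
    (process-put proc :hackernews-request request)))

(defun hackernews--sentinel (proc event)
  (when (string= event "finished\n")
    (hackernews--handle (process-get proc :hackernews-request)
                        (process-buffer proc))))

;; Example call:
;; (hackernews--fetch-url
;;  '(:feed top :url "https://hacker-news.firebaseio.com/v0/topstories.json"))
```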

Handle multiple existing retrievals of the same feed using the same approach as the Man-getpage-in-background function and Man-notify-method user option from the man.el library.

Still to-do.

Add/improve/update documentation.

I was doing this as I went.

(Low priority - might open a separate issue for this.) Try to devise a way by which backends can be separated into different files, or at least by which features aren't required until needed, without angering the byte-compiler. For example, json shouldn't be required on Emacs 27 built with native JSON support, and mm-url shouldn't be required until an external grabber is invoked, if at all.

Whenever I tried to figure out a way to do this, I came to the conclusion that it's too low-priority and too complicated to pursue. One problem, for example, is how to dynamically load a backend after a user changes hackernews-backend. I still think this is doable and nice to have, though, so I don't intend to give up on the idea.
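
For the simpler half of the problem, deferring requires until they are actually needed might look something like the following sketch; json.el, mm-url and the native json-parse-buffer are real, but wiring them into hackernews.el this way is only an assumption:

```elisp
;; Sketch only: lazy requires for optional dependencies.
(declare-function mm-url-insert-file-contents "mm-url" (url))

(defun hackernews--parse-json ()
  "Parse the JSON at point, preferring native support when available."
  (if (fboundp 'json-parse-buffer)      ; Emacs 27+ built with libjansson
      (json-parse-buffer :object-type 'alist)
    (require 'json)                     ; fall back to json.el lazily
    (json-read)))

(defun hackernews--external-insert (url)
  "Insert the contents of URL using an external grabber via mm-url."
  (require 'mm-url)                     ; only loaded for this backend
  (mm-url-insert-file-contents url))
```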

So, not too much left to iron out. If tomorrow goes well I might have something almost ready. :)

Famous last words. Plot twist: I didn't have something ready tomorrow. ;)

Another problem I was contemplating while working on this, which I haven't addressed yet, is how to handle quits (C-g) during both sync and async retrieval, i.e. how to ensure a quit doesn't corrupt sync or async state beyond repair, and how to abort async retrieval as soon as possible after a quit.
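
For the asynchronous side, one hypothetical approach is to keep track of in-flight processes so that a quit (or an explicit command) can abort them promptly; none of the names below exist in hackernews.el:

```elisp
;; Sketch only: bookkeeping so pending retrievals can be aborted cleanly.
(defvar hackernews--pending-processes nil
  "Live processes for retrievals still in flight.")

(defun hackernews--register (proc)
  "Track PROC as a pending retrieval and return it."
  (push proc hackernews--pending-processes)
  proc)

(defun hackernews-cancel ()
  "Abort every pending retrieval, e.g. after the user quits."
  (interactive)
  (dolist (proc hackernews--pending-processes)
    (when (process-live-p proc)
      (delete-process proc)))   ; sentinels will see a non-"finished" event
  (setq hackernews--pending-processes nil))
```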

basil-conto commented 6 years ago

Yet another idea I was contemplating was how best, and in how many ways, retrieval progress could be reported. For example, it might be nice to show the progress report in the hackernews buffer's mode line for async retrievals, so as to keep the echo area free for other purposes.
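
A minimal sketch of the mode-line idea, assuming a hypothetical helper that a sentinel or callback would call as items arrive:

```elisp
;; Sketch only: report progress in the hackernews buffer's mode line
;; rather than the echo area.  Names and format are assumptions.
(defun hackernews--update-progress (buffer done total)
  "Show DONE/TOTAL retrieval progress in BUFFER's mode line."
  (when (buffer-live-p buffer)
    (with-current-buffer buffer
      (setq-local mode-line-process
                  (and (< done total)
                       (format ":fetching %d/%d" done total)))
      (force-mode-line-update))))

;; e.g. (hackernews--update-progress (get-buffer "*hackernews*") 3 20)
```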