clarete / hackernews.el

Hacker News client for Emacs
GNU General Public License v3.0

Support alternatives to url.el for retrieval #39

Closed basil-conto closed 6 years ago

basil-conto commented 6 years ago

This issue is inspired by Chris Wellons' article on using cURL in Elfeed, which see.

In my tests on Emacs 27, url.el is as fast as, if not faster than, calling curl or wget in a subprocess, which is a pleasant surprise. :) Perhaps the overhead of spawning subprocesses and shuttling their I/O is to blame.

Either way, some users might well prefer to use external tools like these for fetching news. Would this be a worthwhile feature?
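
For illustration, a minimal external-tool backend could look something like the sketch below. None of it is from hackernews.el: the my-hn- names are invented, and make-process needs Emacs 25+ (older Emacsen would use start-process instead).

```elisp
;; -*- lexical-binding: t; -*-

(defun my-hn-curl-retrieve (url callback)
  "Fetch URL with an asynchronous curl subprocess.
Call CALLBACK with the response body as a string."
  (let ((buffer (generate-new-buffer " *my-hn-curl*")))
    (make-process
     :name "my-hn-curl"
     :buffer buffer
     :command (list "curl" "--silent" "--location" url)
     :sentinel (lambda (process _event)
                 ;; A real backend would also check
                 ;; `process-exit-status' before trusting the output.
                 (when (memq (process-status process) '(exit signal))
                   (unwind-protect
                       (funcall callback
                                (with-current-buffer buffer
                                  (buffer-string)))
                     (kill-buffer buffer)))))))
```

A JSON-parsing callback could then be plugged in wherever the url.el code path sits today.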

clarete commented 6 years ago

I probably wouldn't use this feature myself if url.el is fast, but I'm sure people would enjoy the freedom of choosing the backend tool. So if you feel like implementing it, it'd certainly be cool! 👍

basil-conto commented 6 years ago

I think the main benefit of using external tools is that they make implementing asynchronous retrieval way easier on older Emacsen, particularly 23, which lacks url-queue.el. The only other way to support asynchronous retrieval on Emacs 23, short of backporting url-queue.el, is creating callback chains, which, apart from being a pain to maintain, can also potentially exceed max-lisp-eval-depth.
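
To make that concrete, such a chain would look roughly like the sketch below; my-hn-fetch-all is an invented name, not anything in hackernews.el.

```elisp
;; -*- lexical-binding: t; -*-
(require 'url)

(defun my-hn-fetch-all (urls done)
  "Retrieve URLS one after another, then call DONE with no arguments.
Each retrieval is started from inside the callback of the previous
one, so the control flow is spread across nested callbacks."
  (if (null urls)
      (funcall done)
    (url-retrieve (car urls)
                  (lambda (_status)
                    ;; A real client would parse the response in the
                    ;; current buffer before discarding it.
                    (kill-buffer (current-buffer))
                    (my-hn-fetch-all (cdr urls) done)))))
```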

Then again, we could just provide the asynchronous option on later Emacsen...
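
For reference, url-queue.el (bundled since Emacs 24) mirrors the url-retrieve calling convention, so on later Emacsen the asynchronous call is roughly:

```elisp
(require 'url-queue)

;; Same signature as `url-retrieve', but the queue limits how many
;; requests run in parallel.  The endpoint here is just the public
;; Hacker News API, used as an example.
(url-queue-retrieve
 "https://hacker-news.firebaseio.com/v0/topstories.json"
 (lambda (_status)
   ;; The response buffer is current inside the callback.
   (message "Retrieved %d bytes" (buffer-size))
   (kill-buffer (current-buffer))))
```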

basil-conto commented 6 years ago

> The only other way to support asynchronous retrieval on Emacs 23, short of backporting url-queue.el, is creating callback chains, which, apart from being a pain to maintain, can also potentially exceed max-lisp-eval-depth.

Not true; it should usually, if not always, be possible to replace a callback chain with an explicit stack to avoid recursion. Either way, given the lack of objections to #46, we can safely ignore Emacs 23 constraints.
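
As a sketch of that reorganisation (again with invented names): keep the pending URLs in an ordinary list and let a single, flat driver function pop from it, instead of threading the state through nested closures.

```elisp
(require 'url)

(defvar my-hn-pending-urls nil
  "URLs still waiting to be retrieved: the \"explicit stack\".")

(defun my-hn-fetch-next ()
  "Start retrieving the next pending URL, if any remain."
  (when my-hn-pending-urls
    (url-retrieve (pop my-hn-pending-urls)
                  (lambda (_status)
                    ;; Parse the response in the current buffer here.
                    (kill-buffer (current-buffer))
                    (my-hn-fetch-next)))))

(defun my-hn-fetch-all (urls)
  "Queue up URLS and kick off the first retrieval."
  (setq my-hn-pending-urls urls)
  (my-hn-fetch-next))
```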

I'm closing this issue in favour of #40, which discusses both url.el alternatives and a/synchronous retrieval methods in more detail.