Open bbappserver opened 4 years ago
The what now from the what now to the what now for what purpose? And why? Some sort of customisable "open in external program"?
@Zweibach It is all very technical programming words, but in essence yes, although it's more like "process these hashes by way of an external program".
I don't want to use argv as the interface because it has limits, whereas you can just keep pumping lines into a subprocess's stdin. So I thought I'd go ahead and define a good interface instead of just leaving it vague.
Something along the lines of the following. I don't know where in the threading and coroutine ratsnest of the controller it will sit, but I think the simplest way to do it would be to drop it into a manager subordinate to the controller, running in a different thread. https://docs.python.org/3/library/asyncio-subprocess.html#asyncio.create_subprocess_exec
import asyncio

EXIT_SUCCESS = 0

async def run_on_selection(program, args, selection):
    # create_subprocess_exec is a coroutine, so it must be awaited.
    p = await asyncio.create_subprocess_exec(
        program, *args,
        stdin=asyncio.subprocess.PIPE,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE)
    # One hex-encoded hash per line, pushed to the subprocess over its stdin.
    newlined_hexes = "\n".join(x.hex() for x in selection)
    stdout, stderr = await p.communicate(input=newlined_hexes.encode())
    log_informational(stdout)  # placeholder logging hooks
    log_errors(stderr)
    if p.returncode != EXIT_SUCCESS:
        log_failure()
What'd be the benefit of this? To me it seems like a push mechanism, or a "selection" mechanism to notify a script (not a program; I don't know of a program that'd accept hashes on stdin, it'd be through command-line arguments, and very much customizable: one execution per hash, commas between hashes, spaces, every new hash a new arg flag) and then... what?
If the program does not know that it's getting hashes from hydrus, what does it need to do with them? If it does know it's getting hashes from hydrus, where would it get the actual files from, if they even exist locally? You could possibly just select hashes from a remote repo and they'd not be stored locally anywhere.
This feels like it should be API functionality. The API already has mechanisms for getting information about and interacting with the current GUI session.
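For instance, here is a minimal sketch of how an external script can already pull a file by its hash over the Client API, assuming a local client with the API enabled; the default port, endpoint and header name are to the best of my knowledge, so check the Client API docs, and the access key is a placeholder:

import requests

API_URL = "http://127.0.0.1:45869"    # default Client API port
ACCESS_KEY = "...your access key..."  # placeholder

def fetch_file(hex_hash):
    # Ask the local client for the file content belonging to this hash.
    r = requests.get(
        API_URL + "/get_files/file",
        params={"hash": hex_hash},
        headers={"Hydrus-Client-API-Access-Key": ACCESS_KEY})
    r.raise_for_status()
    return r.content  # raw file bytes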
I agree that this could be added as some sort of notification service you can subscribe to when #247 gets implemented.
Giving it the system:api label in any case, because even if it's not related to the current HTTP client API, this feature would still be a form of API for external programs.
For me it is a nice-to-have feature.
Currently my workflow is:
command $(echo $(copyq clipboard))
If this were implemented, I wouldn't have to open a terminal to run that command; I could just run it from the context menu.
It is not part of the API because the initiator of the procedure is not an already-running external process. While results can and most likely will be returned to hydrus by the API, that is more so for userscripts executing specific fixed features.
Examples:
Other details
I don't believe using any line terminator other than newline is useful, though it is up to dev's discretion to pick one. Since the list is just passed from one program to another and newline display is irrelevant, a configurable terminator needlessly complicates the interface to the subprogram. I don't strictly speaking care what the line separator is as long as it is well defined. Another option is to use the OS line separator, which makes things like std::getline(input, str, input.widen('\n')) and things built on top of it work well. If you want to eliminate programming errors altogether you might just use ',' as the separator; it means every subprocess needs to be explicit about splitting, but it should prevent most portability-related programming errors.
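To sketch what "being explicit" means for the receiving side if a comma were the separator (this is just an illustration of the portability argument, not a decided format):

import sys

# Read the whole payload and split on the agreed-upon separator.
# With an explicit ',' there is no LF-vs-CRLF ambiguity to get wrong.
payload = sys.stdin.read()
hashes = [h.strip() for h in payload.split(",") if h.strip()]

for h in hashes:
    print("got hash:", h)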
The subprocess is indifferent to the actual file; it only cares about what hydrus tells it.
If you want to run an arbitrary subprogram that expects different input than a newlined hexed hash list then simply use a wrapper to convert to the format needed by the rest of your pipeline, as you would for any other unix pipelining problem. I'm not really reinventing the wheel here, I'm just letting hydrus initiate a procedure.
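For example, a hypothetical wrapper along these lines; the downstream tool name and its --hashes flag are made up purely for illustration:

import subprocess
import sys

# Collect the newline-separated hex hashes hydrus piped in.
hashes = [line.strip() for line in sys.stdin if line.strip()]

# Re-emit them in whatever form the downstream tool expects,
# here one invocation with every hash as a separate argument.
subprocess.run(["some-other-tool", "--hashes", *hashes], check=True)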
==Subprocess url class==
- Invoke subprocess with url as argument
- Subprocess pushes results back into hydrus with the API
- Url considered processed when subprocess exits with EXIT_SUCCESS (0)

If you want a passthrough test, here is my shit script: https://github.com/bbappserver/hydrus-youtube-dl/blob/master/youtube-dl.py (it would need a little patching). It would also be nice to have a standard way to pass API access.
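A rough sketch of what such a url-class subprocess could look like, assuming it gets the url as argv[1], lets youtube-dl download into a temp directory, and then hands the results back through the Client API's add_file endpoint; the port, header name and payload shape here are from memory of the Client API docs and should be double-checked, and the access key is a placeholder:

import glob
import os
import subprocess
import sys
import tempfile

import requests

API_URL = "http://127.0.0.1:45869"    # default Client API port, adjust as needed
ACCESS_KEY = "...your access key..."  # placeholder
EXIT_SUCCESS = 0

url = sys.argv[1]
workdir = tempfile.mkdtemp(prefix="hydrus_ytdl_")

# youtube-dl does the heavy lifting, downloading into a predictable location.
subprocess.run(
    ["youtube-dl", "-o", os.path.join(workdir, "%(id)s.%(ext)s"), url],
    check=True)

# Hand each downloaded file back to hydrus over the Client API.
for path in glob.glob(os.path.join(workdir, "*")):
    r = requests.post(
        API_URL + "/add_files/add_file",
        json={"path": path},
        headers={"Hydrus-Client-API-Access-Key": ACCESS_KEY})
    r.raise_for_status()

sys.exit(EXIT_SUCCESS)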
Was pruning wishlist when I found this from back in June last year. Is this the same as this issue?
@Zweibach It's similar but different. This subsystem could certainly be reused as part of that grander idea, but this issue is for invoking utilities on a hash list. The discussion you mentioned was intended to pass urls through from the companion into a helper program, which throws the result back into hydrus.
For example hydrus does not know how to download a youtube video, but the program youtube-dl does. So the idea there was to map youtube urls to a subprogram, so that urls the companion sends to hydrus could launch the subprogram, which would do the heavy lifting and then import the result into hydrus.
This issue is the much simpler idea of having a subprocess perform some procedure on a group of hashes on behalf of the user, for example asking e621 for new tags and then pushing them back into hydrus with the tag management APIs.
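As a sketch of that e621 example, assuming hashes arrive newline-separated on stdin and tags go back through the Client API's add_tags endpoint; the endpoint and payload shape are from memory of the Client API docs, and lookup_tags_for_hash is a stand-in for whatever actually queries e621:

import sys

import requests

API_URL = "http://127.0.0.1:45869"
ACCESS_KEY = "...your access key..."  # placeholder

def lookup_tags_for_hash(hex_hash):
    # Stand-in: query e621 (e.g. by md5) and return a list of tag strings.
    return []

for line in sys.stdin:
    hex_hash = line.strip()
    if not hex_hash:
        continue
    tags = lookup_tags_for_hash(hex_hash)
    if not tags:
        continue
    # Push new tags back into hydrus; check the Client API docs for the exact
    # service-to-tags payload shape your client version expects.
    r = requests.post(
        API_URL + "/add_tags/add_tags",
        json={"hash": hex_hash,
              "service_names_to_tags": {"my tags": tags}},
        headers={"Hydrus-Client-API-Access-Key": ACCESS_KEY})
    r.raise_for_status()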
I haven't bumped this issue in a while, but I would like to take a moment to point out that what I'm asking for here is not exceptionally novel. In fact it's as old as dirt.
I am basically just asking to establish a UNIX pipeline and pass me a list.
Like in the command line:
cat file.txt | wc -l
Here the receiving program is wc; it counts the number of lines it got. cat just reads the contents of the file and pipes it to wc. In a similar manner hydrus would pipe a newlined list to the subprocess.
I can also see the merit in a single subprocess spawn per element of the selection, with the element sent as argv[1].
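A sketch of that alternative, with the program path being whatever the user configured for the menu item:

import subprocess

def run_per_element(program, selection):
    # One invocation per hash; the helper sees the hex digest as argv[1].
    for h in selection:
        subprocess.run([program, h.hex()], check=True)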
But in general, for just a quick script, line processing is easy:
from sys import stdin

# echo each hash you got, line by line
for line in stdin:
    print(line.rstrip("\n"))
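To try a receiver like that outside hydrus, you can just pipe it a couple of dummy lines from the shell, assuming you saved the snippet above as echo_hashes.py (a name I'm making up here):

printf 'deadbeef\ncafebabe\n' | python echo_hashes.py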
Pass hexed hashes from the selection to a subprocess as a newline-separated list over an stdin pipe, triggered by picking the item from the context menu.
Context menu items are declared as [ Menu item name, relative/path/to/executable ] in preferences/plugins.
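Purely as an illustration of that shape, with invented names and paths, the preferences/plugins list might then contain entries like:

[ "Echo hashes", "plugins/echo_hashes.py" ]
[ "Fetch e621 tags", "plugins/e621_tags.py" ]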