While testing, I hooked one of the harvest processes up to a channel and streamed its data out over HTTP as a JSON response, so I could look at real data somewhere other than the database or the tail of the log files.
I realized that a streaming harvest API is yet another great way to enhance output flexibility. So that's to be added, but more importantly I realized that I can (and likely should) be writing to the log file, saving to a database (one of the three currently supported -- or why not every one set in the config?), and streaming through the API all at once. It makes everything feel a bit cleaner, and it separates concerns too.
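A minimal sketch of that "all sinks at once" idea, assuming Go (the channel-based design suggests it) and a hypothetical `HarvestItem` type -- the names and sinks here are illustrative, not the project's actual API. One fan-out goroutine copies each harvested item onto a dedicated channel per consumer, so the logger, the database writer, and an API stream can each drain independently:

```go
package main

import (
	"encoding/json"
	"fmt"
	"sync"
)

// HarvestItem is a stand-in for whatever a harvester emits.
type HarvestItem struct {
	Series string `json:"series"`
	Value  string `json:"value"`
}

// fanOut copies every item from in onto each of the out channels,
// then closes the outputs once the input is exhausted.
func fanOut(in <-chan HarvestItem, outs ...chan<- HarvestItem) {
	for item := range in {
		for _, out := range outs {
			out <- item
		}
	}
	for _, out := range outs {
		close(out)
	}
}

func main() {
	in := make(chan HarvestItem)
	logCh := make(chan HarvestItem) // "log file" sink
	dbCh := make(chan HarvestItem)  // "database" sink

	var wg sync.WaitGroup
	wg.Add(2)

	// Log sink: print each item as a JSON line.
	go func() {
		defer wg.Done()
		for item := range logCh {
			b, _ := json.Marshal(item)
			fmt.Println(string(b))
		}
	}()

	// Database sink: count items as a stand-in for inserts.
	go func() {
		defer wg.Done()
		n := 0
		for range dbCh {
			n++
		}
		fmt.Println("saved", n, "items")
	}()

	go fanOut(in, logCh, dbCh)

	in <- HarvestItem{Series: "messages", Value: "hello"}
	in <- HarvestItem{Series: "mentions", Value: "@user"}
	close(in)

	wg.Wait()
}
```

Adding a third sink (say, the streaming API response) is then just one more channel passed to `fanOut`, which is what makes the separation of concerns feel clean.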
A channel for each series is likely needed (messages, mentions, share_links, etc.), but that may not be terribly handy for the API. The API may want a combined streaming response, perhaps even a filterable one. This will take some thought, but it's the direction I'd like to eventually head in.
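The combined, filterable stream could be sketched as a merge over the per-series channels -- again a hedged Go sketch with an assumed `HarvestItem` type, not the project's real code. Each input channel gets a goroutine that forwards matching items onto one shared output:

```go
package main

import (
	"fmt"
	"sync"
)

// HarvestItem is a stand-in for whatever a harvester emits.
type HarvestItem struct {
	Series string
	Value  string
}

// merge fans any number of per-series channels into one combined
// stream. A nil filter means "everything"; otherwise only items the
// filter accepts are forwarded, which lets an API caller subscribe
// to just the series they care about.
func merge(filter func(HarvestItem) bool, ins ...<-chan HarvestItem) <-chan HarvestItem {
	out := make(chan HarvestItem)
	var wg sync.WaitGroup
	wg.Add(len(ins))
	for _, in := range ins {
		go func(ch <-chan HarvestItem) {
			defer wg.Done()
			for item := range ch {
				if filter == nil || filter(item) {
					out <- item
				}
			}
		}(in)
	}
	// Close the combined stream once every input has drained.
	go func() {
		wg.Wait()
		close(out)
	}()
	return out
}

func main() {
	messages := make(chan HarvestItem, 1)
	mentions := make(chan HarvestItem, 1)
	messages <- HarvestItem{Series: "messages", Value: "hi"}
	mentions <- HarvestItem{Series: "mentions", Value: "@you"}
	close(messages)
	close(mentions)

	// An API caller asking for mentions only.
	onlyMentions := func(i HarvestItem) bool { return i.Series == "mentions" }
	for item := range merge(onlyMentions, messages, mentions) {
		fmt.Println(item.Series, item.Value) // prints: mentions @you
	}
}
```

One consequence of merging this way is that ordering across series is not guaranteed, which is probably fine for a live stream but worth deciding deliberately.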