OpenPHDGuiding / phd2

PHD2 Guiding
https://openphdguiding.org
BSD 3-Clause "New" or "Revised" License

[INFO/WISH] server API and headless setup #683

Open GuLinux opened 6 years ago

GuLinux commented 6 years ago

Hi,

I'm currently working on a lightweight web application for handling devices and generating sequences via INDI. The goal is to be able to start an astrophotography sequence using only your browser, which is particularly useful for users with embedded systems (Raspberry Pi, Odroid, etc.). Most of these users currently use a VNC server, which increases network latency and drains batteries faster.

My project is still in its very early stages (right now there's a fully functional INDI control panel, but no sequence support yet); you can have a look at the code here: https://github.com/GuLinux/StarQueW

Now for the PHD2-related question: personally I don't use an autoguider, as I'm more into wide/mid-field photography (<300mm), but I can see that more and more users will probably want autoguiding as a feature. And browsing the docs about your server API, it seems that PHD2 has almost everything needed to be a guiding backend application.

I think the most important missing parts are:

- APIs for setting up gear,
- retrieving images to display,
- and more fine tuning in general of the guiding process,
- a non-GUI version of the app.

Do you think this could be possible? Or am I missing something?

Kind regards, Marco

agalasso commented 6 years ago

Hi Marco,

We primarily envision PHD2 as something that a user sets up interactively, but then, once it is set up, it can be controlled and monitored either interactively or through the server API. You can see that the server API allows for control and monitoring of guiding by imaging apps, but essentially requires manual setup of an equipment profile.

I think it will be difficult to avoid the manual setup. The initial setup steps almost always require manual intervention (equipment connection, calibration, polar alignment). However, once the setup is done (over VNC?), automatic operation is expected and fully supported.
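For reference, a minimal sketch of what that automatic operation can look like over the server API (Python, standard library only, assuming PHD2's default JSON-RPC port 4400 and the documented get_app_state and guide methods; error handling and event filtering are reduced to the bare minimum):

```python
import json
import socket

def phd2_rpc(sock_file, method, params=None, msg_id=1):
    """Send one JSON-RPC request and read lines until the matching response arrives."""
    req = {"method": method, "id": msg_id}
    if params is not None:
        req["params"] = params
    sock_file.write(json.dumps(req) + "\r\n")
    sock_file.flush()
    while True:
        msg = json.loads(sock_file.readline())
        if msg.get("id") == msg_id:       # the RPC response we asked for
            return msg
        # anything else is an asynchronous event (GuideStep, SettleDone, ...)

with socket.create_connection(("localhost", 4400)) as sock:
    f = sock.makefile("rw", encoding="utf-8")
    print(phd2_rpc(f, "get_app_state"))   # e.g. {"result": "Stopped", "id": 1, ...}
    phd2_rpc(f, "guide",
             {"settle": {"pixels": 1.5, "time": 8, "timeout": 60},
              "recalibrate": False},
             msg_id=2)                    # progress arrives as Settling/SettleDone events
```

PHD2 streams status on the same connection as one JSON event per line, so a client can monitor guiding by continuing to read from the socket.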

> APIs for setting up gear,

Not likely, based on the comments above.

> retrieving images to display

We currently have an API for that: save_image.
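For illustration, a small sketch of using the existing save_image call, reusing the phd2_rpc() helper and open connection from the earlier sketch. save_image writes a temporary FITS file on the machine running PHD2 and returns its path, so reading it directly like this assumes the client runs on the same host:

```python
import os

resp = phd2_rpc(f, "save_image", msg_id=3)
fits_path = resp["result"]["filename"]   # temporary FITS file written by PHD2
with open(fits_path, "rb") as img:
    data = img.read()                    # e.g. hand the frame to the web frontend
os.remove(fits_path)                     # the caller is expected to delete the file
```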

> and more fine tuning in general of the guiding process

We currently have APIs for setting and getting guiding parameters, but if PHD2 is working properly, parameter tuning should not be necessary (and we discourage it). See PHD2 Best Practices.
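A sketch of those get/set calls, again reusing phd2_rpc(). The available parameter names depend on the selected guide algorithm, so they are queried first rather than hard-coded; the exact request shapes should be checked against the EventMonitoring documentation:

```python
# list the tunable parameters of the RA guide algorithm
names = phd2_rpc(f, "get_algo_param_names", {"axis": "ra"}, msg_id=4)["result"]
name = names[0]                          # pick one of the names PHD2 reports

# read the current value and write it back unchanged (a no-op round trip)
value = phd2_rpc(f, "get_algo_param", {"axis": "ra", "name": name}, msg_id=5)["result"]
phd2_rpc(f, "set_algo_param", {"axis": "ra", "name": name, "value": value}, msg_id=6)
```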

> a non-GUI version of the app

Again, not something we would be likely to pursue at this time. However, we would absolutely be interested in making sure the server API has everything you need to control and monitor PHD2 after the initial interactive equipment setup is done.

Andy

GuLinux commented 6 years ago

Hi, that makes sense, of course. And in fact, delegating the setup/configuration to PHD2 would ease my work a lot, as I'd have to build less UI and use fewer APIs for autoguiding.

But thinking about it, another solution, probably better and more useful, would be to create a daemon/frontend protocol, internal only to PHD. This way PHD2 would still be configured by the PHD2 UI, but using serialization over the network. I did something like this with PlanetaryImager (although I had the advantage of Qt5 offering some interesting serialization tools out of the box; I'm not sure if there's anything like that in wxWidgets).

This would not be a public API (which needs more validation, docs, etc.), but a private protocol, and it might be useful to anyone using PHD2 remotely.

What do you think?

Thanks, Marco

agalasso commented 6 years ago

> another solution, probably better and more useful, would be to create a daemon/frontend protocol, internal only to PHD. This way PHD2 would still be configured by the PHD2 UI, but using serialization over the network.

I'm not quite getting that. Could you elaborate? What would be serialized over the network?

(If you want to take the discussion off GitHub you can email me at andy.galasso@gmail.com; but it's ok to continue the discussion here too.)

GuLinux commented 6 years ago

Not sure; until we get too technical, I guess it's ok to continue here. It's all the same to me anyway :)

Well, serializing all the calls between the UI and the core classes, assuming there's already a fair separation between them (mine wasn't quite there, but it didn't take too much work to refactor it).

In my build I produce three binaries: the main PlanetaryImager executable with no network serialization, where, say, the "connect camera" call from the UI goes directly to the core class; and separate frontend and backend executables, where the "connect camera" call is sent to a serializer, over TCP, and then deserialized and passed to the core.

This of course depends a lot on both the technology used in PHD2 (I know nothing about wxWidgets) and the internal design of the project.
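To make the idea concrete, here is a minimal Python sketch of that frontend/daemon split. The class and method names are illustrative only, not actual PHD2 or PlanetaryImager code: the UI talks to a proxy with the same interface as the core, and the proxy either calls the core in-process (single binary) or serializes the call over TCP (separate frontend and backend binaries):

```python
import json
import socket

class Core:
    """Stand-in for the guiding/imaging core logic."""
    def connect_camera(self, name):
        print(f"connecting camera {name}")

class LocalProxy:
    """Single-binary case: the UI calls straight into the core."""
    def __init__(self, core):
        self.core = core
    def connect_camera(self, name):
        self.core.connect_camera(name)

class RemoteProxy:
    """Frontend binary: the same call is serialized and sent over TCP."""
    def __init__(self, host, port):
        self.sock = socket.create_connection((host, port))
    def connect_camera(self, name):
        msg = {"method": "connect_camera", "args": [name]}
        self.sock.sendall((json.dumps(msg) + "\n").encode())

def backend_loop(core, conn):
    """Daemon binary: deserialize incoming calls and dispatch them to the core."""
    for line in conn.makefile():
        msg = json.loads(line)
        getattr(core, msg["method"])(*msg["args"])
```

The UI code stays identical in both builds; only the proxy implementation it is handed changes.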

jpaana commented 6 years ago

This would be beneficial, for example, for setups running indiserver and PHD2 on a Raspberry Pi and KStars on Windows, where you want to run guiding locally but without running X just for PHD2. I have run such an arrangement myself, and after the initial setup I didn't really have to touch PHD2, but I still had to run a VNC session to start it. Having it start as a daemon automatically on boot would be ideal for this particular use case.

d33psky commented 6 years ago

I agree it would be good to have a headless PHD2 that can be controlled completely via the API. Serializing the UI <-> PHD2-core communication sounds like a good first step to me. It would be quite some work, of course, but it would open up very interesting possibilities.

GuLinux commented 5 years ago

Hi, sorry for bumping, but I have another proposal to implement this: how about splitting the codebase into some "backend" libraries and a frontend that links to them? This way not only PHD2, but other application frontends could link to them as well.

agalasso commented 5 years ago

Sounds ok in principle, but the devil is in the details. Lots of stuff in the code base mixes UI presentation with "backend" guiding, and it would be a project to separate them. If you want to start making some of those types of changes, you can send PRs. It would be best to see this as a series of small changes that can be individually reviewed.

gnthibault commented 4 years ago

Any news about this very important feature?

gnthibault commented 4 years ago

Is there any way to financially support the project in order to see this become real?

agalasso commented 4 years ago

@gnthibault this item is not really a high priority for us at this time because the current non-headless PHD2 is already controllable for automated imaging applications. AFAICT the main benefit of this item would be to run PHD2 on a system without an X display. But other than that, the remote control capability already exists in PHD2's current form.
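As a concrete illustration of that existing remote-control capability, here is a sketch of an automated session using documented server-API methods, reusing the phd2_rpc() helper and connection from the earlier sketch. The profile name is made up, and the exact parameter encodings should be checked against the EventMonitoring documentation:

```python
# pick a previously created equipment profile and connect the gear
profiles = phd2_rpc(f, "get_profiles", msg_id=10)["result"]
target = next(p for p in profiles if p["name"] == "Main scope")   # made-up name

phd2_rpc(f, "set_profile", [target["id"]], msg_id=11)
phd2_rpc(f, "set_connected", [True], msg_id=12)

# start guiding; PHD2 auto-selects a star and calibrates if necessary
phd2_rpc(f, "guide",
         {"settle": {"pixels": 1.5, "time": 8, "timeout": 60},
          "recalibrate": False},
         msg_id=13)
```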

As noted in an earlier comment, we will welcome PRs that work towards a headless mode for PHD2.

gnthibault commented 4 years ago

Thank you for your feedback @agalasso. I already use the JSON-RPC API, and it is really amazing! You are right, working with PHD2 without the GUI is already possible and works well, so it is more of a "luxury" feature. However, I don't have enough time to contribute to PHD2, but I am planning on using it a lot in an upcoming personal project, which is why I was asking about financially supporting it.

For instance, I'd love to see more commands available in the API, like the ability to set up the slit mode for spectroscopy directly from the API.

agalasso commented 4 years ago

> For instance, I'd love to see more commands available in the API, like the ability to set up the slit mode for spectroscopy directly from the API.

That's an easy one. Feel free to add a separate issue for that and let us know what you need (slit position, size, angle, display on/off, anything else?).
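Purely as a starting point for that separate issue, a hypothetical request shape packaging the fields listed above; no such method exists in PHD2 at the time of this discussion, and every name below is an assumption:

```python
# Hypothetical proposal only: "set_slit_overlay" is NOT an existing PHD2 RPC.
proposed = {
    "method": "set_slit_overlay",          # hypothetical method name
    "params": {
        "center_x": 512, "center_y": 384,  # slit position, pixels
        "width": 6, "height": 120,         # slit size, pixels
        "angle": 15.0,                     # rotation, degrees
        "enabled": True,                   # show/hide the overlay
    },
    "id": 42,
}
```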

gnthibault commented 1 year ago

Any update on this feature in the upcoming roadmap?

With regard to the spectroscopy feature being available through the API, I will proceed with further tests with a colleague in the upcoming month, to assess whether the current event API is sufficient or whether there are key components in the UI that are missing from the event API.

Long live this amazing project!