europacafe opened 2 years ago
Hi, this feature is already on my roadmap. However, I think your Raspberry Pi Zero 2 might not be enough. The problem is that I can't run Astrometry/ASTAP on Android (it's unsupported, and hopefully you'll agree that nobody wants 4GB of index files on their smartphone). Thus, plate solving must happen on the Raspberry Pi using the Astrometry INDI driver. This makes the process a bit complicated:
1. the app receives the `.fits` file from the camera
2. the image is converted to `.jpg`, downscaled, compressed and then sent back to the Raspberry via the Astrometry INDI driver

I honestly need some help with developing this feature. All this waiting and sending the picture back and forth is not easy to code, and it will also be pretty slow (there's nothing I can do about that, though). If you or someone you know wants to contribute, I'll be more than happy!
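For contributors, the round trip above could start with raw INDI XML messages over TCP (the INDI server listens on port 7624). Here is a minimal Python sketch of building the upload message; the property name `ASTROMETRY_DATA` is an assumption taken from the Astrometry driver's documentation, so check the driver source before relying on it:

```python
import base64
import xml.etree.ElementTree as ET

def new_blob_vector(device: str, prop: str, fmt: str, payload: bytes) -> str:
    """Build an INDI newBLOBVector message that uploads an image blob.

    The device/property names used by callers (e.g. "Astrometry",
    "ASTROMETRY_DATA") are assumptions, not verified against the driver.
    """
    root = ET.Element("newBLOBVector", device=device, name=prop)
    blob = ET.SubElement(root, "oneBLOB", name=prop, format=fmt,
                         size=str(len(payload)))
    # INDI transports BLOBs as base64 text inside the XML element
    blob.text = base64.b64encode(payload).decode("ascii")
    return ET.tostring(root, encoding="unicode")

# Usage sketch: msg = new_blob_vector("Astrometry", "ASTROMETRY_DATA",
#                                     ".jpg", jpeg_bytes)
# then write msg to the TCP socket connected to the INDI server.
```

The reply (solved RA/DEC) would come back as a number vector from the same driver, parsed from the same XML stream.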
Clear skies, Marco Cipriani
Thanks Marco for your detailed explanation.
I've just tried plate solving with the ASTAP and Astrometry.net Linux apps on an Android TV box (turned to Armbian) and under Ubuntu running inside the Userland app on my Pocophone.
ASTAP's data occupies 2GB of space, whilst Astrometry.net's full index files take 9GB.
On the Android TV box, plate solving with the ASTAP command line uses about 25% of the CPU. I didn't measure on the Poco phone, which is more powerful. Plate solving took from 1.x seconds to within 2 minutes. Usually the FITS file contains key information, i.e., the estimated RA/DEC where the scope is pointing, plus the image field of view (or the user can specify it), which helps speed up plate solving to within a few seconds.
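Those header hints can be read without any FITS library, since a FITS primary header is just a run of 80-character ASCII cards. A Python sketch (keyword names such as `OBJCTRA`/`OBJCTDEC` vary between capture programs, so the list below is an assumption):

```python
def fits_header_hints(header_bytes: bytes) -> dict:
    """Extract plate-solving hints from a raw FITS primary header.

    FITS headers are 80-character ASCII 'cards'; scan them for the
    RA/DEC and optics keywords commonly written by capture software
    (the exact keyword names are driver-dependent -- an assumption here).
    """
    hints = {}
    wanted = {"OBJCTRA", "OBJCTDEC", "RA", "DEC", "FOCALLEN", "PIXSIZE1"}
    for i in range(0, len(header_bytes), 80):
        card = header_bytes[i:i + 80].decode("ascii", "replace")
        key = card[:8].strip()
        if key == "END":          # END card terminates the header
            break
        if key in wanted and "=" in card:
            # value is between '=' and the optional '/' comment
            value = card.split("=", 1)[1].split("/", 1)[0].strip().strip("'").strip()
            hints[key] = value
    return hints
```

Feeding these hints to the solver as a starting position and scale is what cuts the solve time from minutes to seconds.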
I'm not qualified to assess the technical obstacles of Android app development, but I can say that current mobile phones are powerful enough to do the plate solving.
So, apart from converting the image to a JPEG file, the phone should also do the plate solving itself instead of sending the image back to the Pi. Telescope.Touch could then repeat the plate solving, delta computation and slew-command issuing to the mount until the target is centered, all within today's powerful phones.
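The "delta computation" step in that loop is simple arithmetic, apart from minding the RA wrap at 360°. A hedged Python sketch (a hypothetical helper, not anything from Telescope.Touch):

```python
def pointing_delta(solved_ra, solved_dec, target_ra, target_dec):
    """Offset in degrees to apply to the mount to center the target.

    RA wraps at 360 deg, so the correction is taken along the shorter
    arc; DEC is a plain difference. Hypothetical helper for the
    solve -> delta -> slew loop described above.
    """
    d_ra = (target_ra - solved_ra + 180.0) % 360.0 - 180.0
    d_dec = target_dec - solved_dec
    return d_ra, d_dec

# Usage sketch: loop until both offsets are below a threshold,
# capturing, solving and slewing on each iteration.
```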
Actually, the Pi's job would just be capturing the image and converting it to a smaller compressed jpg/png file, a few hundred KB in size, before sending it wirelessly to our phone. This would speed up the data transfer a lot. Just an idea; unfortunately, I can't do the coding myself. Perhaps I should start learning Android development using this idea as a personal inspiration project.
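The server-side downscale step could be as simple as 2×2 binning before JPEG encoding. A toy Python sketch (a real Pi-side implementation would use an image library instead of nested lists):

```python
def bin2x2(pixels):
    """Downscale a grayscale image (list of rows of ints) by 2x2 averaging.

    A toy stand-in for the Pi-side downscale step described above:
    each output pixel is the mean of a 2x2 block, quartering the
    image area before compression.
    """
    out = []
    for y in range(0, len(pixels) - 1, 2):
        row = []
        for x in range(0, len(pixels[y]) - 1, 2):
            block_sum = (pixels[y][x] + pixels[y][x + 1]
                         + pixels[y + 1][x] + pixels[y + 1][x + 1])
            row.append(block_sum // 4)
        out.append(row)
    return out
```

Binning also improves the signal-to-noise ratio, which tends to help rather than hurt plate solving.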
Hi, may I suggest having a look at a specific INDI driver called... "astrometry"? I don't know if it has been maintained since the StellarSolver enhancements, but you can give it a try. Once this driver is started, it allows any client to use Astrometry installed on the server side. (Btw: thanks for this nice piece of software!) G.
Sorry, it has already been mentioned... 😄
@europacafe:
ASTAP data occupies 2GB space; whilst Astrometry.net full index files 9GB.
That's exactly what I don't want. 9GB on a smartphone is a lot, unlike on PCs. I'm not just talking about available storage space, but also user experience. Google Play Store doesn't allow me to package all that stuff:
APK files have a maximum file size, based on the Android version that your APK supports:
- 100 MB – APKs that target Android 2.3 and higher (API level 9-10 and 14+)
This means that users would need to download the index files manually or copy them from a PC. I could surely do that, but I want this app to be idiot-proof. Mobile games like Asphalt 9 use something called "APK expansion files" to go beyond that limit, but those are limited to 2GB. That's not enough to package ASTAP, a Linux virtual machine and the index files. Moreover, expansion files are pretty hard to implement.
- Android TV box ... turned to Armbian, and
- under Ubuntu running inside Android Userland app on my Pocofone...
I don't want to transform people's phones into Armbian bricks, nor can I directly (somehow) embed Armbian into my app. In fact, a lot of clever engineering has gone into Android Userland to make it work without root. And even if you already have Userland installed with ASTAP configured inside, Telescope.Touch wouldn't be able to access it. Your Linux VM in Userland is isolated, and only the user can type commands and read the output. My app simply can't open an Ubuntu command line in Userland and type stuff, unfortunately. There are a lot of security "walls" in place on Android, and one of them is the fact that apps are isolated. To give you an idea, interactions between apps (and even between different "activities" of the same app) are done using something called `Intent`s, which are just "intentions" to do something. Android decides whether an app's intent is legitimate, depending on the Manifest files of the app that requests the action and of the app that should run it. This means that both apps must be designed with that function in mind. I would need to modify Android Userland to accommodate my use case, and I can't do that. Nor do I want my idiot-proof app to ask the user to install a Linux environment, configure ASTAP and finally download the index files. Telescope.Touch is made for astrophotographers and amateur astronomers, not developers or superusers.
On Android TV box, during plate solving using ASTAP command line, it uses about 25% of CPU resource.
Speed is, indeed, not a concern. These days, most smartphones are much faster than Raspberry Pis. I meant that it would be slow because the image has to be sent back and forth (INDI CCD → app → INDI Astrometry). At the moment, I see no other option.
Actually, the Pi's job would just be capturing the image and converting it to a smaller compressed jpg/png file, a few hundred KB in size, before sending it wirelessly to our phone.
The INDI CCD driver sends only `.fits` files if your camera is an ASI/QHY astro camera. It sends `.nef` or `.cr2` for DSLR cameras set to RAW, and high-resolution `.jpg` files for DSLRs set to JPG. There's no "downsample" option built into INDI: it just forwards what the camera sends, with no processing in between. This is the reason why Telescope.Touch doesn't support live view when a DSLR is set to `.nef` or `.cr2` (INDI sends the RAW file, and Android isn't able to read it). Just like with ASTAP and Astrometry, I can't run a virtual machine with `dcraw` (a command-line tool that converts camera RAW files) inside Telescope.Touch to read those files.
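The per-format behaviour described above amounts to a small dispatch on the blob's `format` attribute. An illustrative Python sketch (the policy names are hypothetical, not Telescope.Touch's actual code):

```python
def blob_action(fmt: str) -> str:
    """Decide how a client app could handle an incoming INDI CCD blob,
    based on its 'format' attribute (illustrative policy only)."""
    fmt = fmt.lower().lstrip(".")
    if fmt in ("jpg", "jpeg", "png"):
        return "display"      # Android decodes these natively
    if fmt in ("fits", "fit"):
        return "decode-fits"  # needs a FITS reader bundled in the app
    # camera RAW (.nef/.cr2) and anything unknown: no decoder on Android
    return "unsupported"
```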
I'm really sorry for the long reply, but I felt you should know all the software challenges that prevent me from implementing these features, and why I'm asking for help with this one. I'm just a 20-year-old electrical engineering student, not a senior Android engineer 😅😅
@gehelem:
May i suggest to have a look at a specific indi driver called... "astrometry" ?
Yes, I'm talking about that exact INDI driver. It would solve the "running ASTAP inside Android" problem, but it adds quite some complexity, and a lot of work is needed in that direction.
Btw : thanks for this nice piece of software
Thank you! Appreciated!
I appreciate that you took your precious time to give me a lengthy and informative explanation, Marco. I didn't mean for your app to work hand in hand with Ubuntu inside Userland; I just cited it to demonstrate the mobile phone's capability. I'm an old-school programmer and just threw out some ideas without knowing the inside of the Android ecosystem. Keep doing the best you can. Thanks.
To use Telescope.Touch with my electronic finder (finder + guider camera), my finder must be pre-aligned (manually) with the main scope it is riding on. Workflow:
I may place my Raspberry Pi Zero 2 on the main scope. It would run only the INDI server with the guider camera and mount drivers. Telescope.Touch would do the rest, including plate solving on the Android device.