fritzw opened 2 years ago
As per that thread in the LightBurn forum, I was directed to a 'developer mode' for the Laserbox software on Friday. Clicking around there highlighted some interesting things, one of which was that the software sends '$H' as a homing command. That is very grbl-like, when I thought the software on the machine side was a derivative of Marlin.
Interesting. Does it also send that command using the HTTP cnc/cmd interface? Can you try to send other grbl-specific commands (like '?' or '$I') in developer mode, and can you see responses to those commands (other than 'ok') somewhere in Wireshark?
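For anyone repeating these probes, here is a minimal sketch of sending a single command over the HTTP interface, in Python for brevity. The port, path, and 'cmd' query-parameter name are assumptions reconstructed from this discussion, not confirmed API details; verify them against a Wireshark capture first.

```python
import urllib.parse
import urllib.request

def build_cmd_url(host: str, cmd: str, port: int = 8080) -> str:
    """Build a URL for the (assumed) cnc/cmd HTTP interface.

    The path, port, and 'cmd' parameter name are guesses based on the
    discussion above, not documented API details.
    """
    query = urllib.parse.urlencode({"cmd": cmd})
    return f"http://{host}:{port}/cnc/cmd?{query}"

def send_cmd(host: str, cmd: str) -> str:
    """Send one command and return the raw response body."""
    with urllib.request.urlopen(build_cmd_url(host, cmd), timeout=5) as resp:
        return resp.read().decode()

# Example (requires a reachable machine):
#   print(send_cmd("192.168.1.50", "$I"))
```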
If the firmware is really based on grbl (instead of just imitating the commands), that would mean that xTool would have to publish the source code of their firmware (or at least of that part of the firmware) according to the GPL.
Okay, findings: sending '$H' returns

{"result":"failed"}

via HTTP. This is literally the only command I know where the result is not "ok" (including invalid G-code like "asdf"). According to the grbl documentation, this means that the jogging move was not valid (e.g. hitting soft limits, which should not have been the case here). So definitely more like Marlin than grbl. I wonder why they chose to do '$H' instead of just using G0.
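Assuming the responses are always small JSON bodies with a "result" field, as observed above, a proxy could classify them with a helper like this (names are hypothetical):

```python
import json

def command_succeeded(body: str) -> bool:
    """Interpret a cnc/cmd response body.

    Per the observations in this thread, almost every command (even
    invalid G-code) comes back with result "ok"; only a rejected '$H'
    has been seen to return {"result":"failed"}.
    """
    try:
        return json.loads(body).get("result") == "ok"
    except json.JSONDecodeError:
        return False
```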
I have picked my Java equivalent to this project back up for the Laserbox, and I am looking over the discussion here to see what I had forgotten in the meantime. I am having some pretty good success, and I think I will be able to get where I want to go: complete LightBurn streaming control of the machine without any xTool software. I believe the biggest piece to figure out is how to throttle things such that my proxy software only responds 'ok' to LightBurn after the previous movement has completed. As it stands, the HTTP interface seems to buffer commands and respond immediately.
Manual jogging via HTTP should be achievable, maybe even cutting simple calibration patterns. But I don't think that streaming complex jobs will be possible by sending single G-code lines via HTTP. For example, engraving dithered images requires sending hundreds of tiny G-code moves per second, and HTTP just isn't built for that kind of real-time application. From my tests, sending multiple G-codes via HTTP in quick succession will execute them in a pseudo-random order, not in the order they were sent. But don't let me discourage you. If you find a way to get an acknowledgment or to enforce the execution order, that would be awesome.

By the way: have you figured out a way to get the current position from the laser? That's the main hurdle I see for manual jogging, because LightBurn asks for the position when you start jogging. Alternatively, we could home the device when LightBurn asks for the position the first time, and then track the position locally in the proxy by parsing all G-code lines we send to the laser. However, I haven't found a "Home" G-code yet.
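The idea of tracking the position locally in the proxy by parsing every outgoing G-code line could be sketched like this. This is a minimal sketch handling only G0/G1 X/Y words and the G90/G91 mode switch; arcs (G2/G3), work offsets, and homing are out of scope:

```python
import re

class PositionTracker:
    """Track the head position by parsing outgoing G-code lines.

    Only G0/G1 X/Y words and the G90/G91 (absolute/relative) mode
    switch are handled; this is a sketch, not a full G-code parser.
    """

    def __init__(self):
        self.x = self.y = 0.0
        self.absolute = True  # grbl and Marlin both default to G90

    def feed(self, line: str) -> None:
        line = line.upper().split(";")[0]  # strip trailing comments
        if "G90" in line:
            self.absolute = True
        if "G91" in line:
            self.absolute = False
        if not re.search(r"\bG0?[01]\b", line):  # only G0/G1 moves
            return
        for axis in ("X", "Y"):
            m = re.search(rf"{axis}(-?\d+(?:\.\d+)?)", line)
            if m:
                val = float(m.group(1))
                if self.absolute:
                    setattr(self, axis.lower(), val)
                else:
                    setattr(self, axis.lower(),
                            getattr(self, axis.lower()) + val)
```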
"By the way: Have you figured out a way to get the current position from the Laser?" Not from the source of truth...the laser. However, the laser only deals in absolute coordinates and I have a layer in my software that stores current position such that it can convert relative coordinates to absolute when LightBurn demands relative mode.
Finding a way to get the real position or command-completion status will be paramount for streaming to work. But if I don't find that, my real motivation for streaming is work positioning, and I can probably get to the workflow I want by manually jogging, finding an absolute coordinate that I want, and moving stuff in the design to that. Using the camera in XCS or the old Laserbox software is always a little off; using an LED, with known offsets from the cutting beam, will be more reliable.
I got everything working the way I wanted for flat laser jobs last night, sans the throttling issue. After unpacking the Electron app that is XCS and inspecting the obfuscated JavaScript, I see one call that looks promising for asking the machine if it is idle / has completed the last G-code command. I need to do some testing to see if that pans out. If so, that gives me everything I need to stream.
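If that idle-status call pans out, the throttling could work roughly like this sketch, where send_cmd and poll_idle are hypothetical stand-ins for whatever calls the real interface actually offers:

```python
import time

def stream_throttled(lines, send_cmd, poll_idle, timeout=30.0):
    """Send G-code lines one at a time, waiting for idle between moves.

    send_cmd(line) submits one line over HTTP; poll_idle() returns True
    once the machine reports the previous move as completed. Both are
    placeholders for the real interface calls.
    """
    for line in lines:
        send_cmd(line)
        deadline = time.monotonic() + timeout
        while not poll_idle():
            if time.monotonic() > deadline:
                raise TimeoutError(f"machine never went idle after {line!r}")
            time.sleep(0.01)
        # Only at this point would the proxy reply 'ok' to LightBurn,
        # so LightBurn's own flow control stays in step with the machine.
```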
GRBL has some additional commands like '$I' or '?' and some other idiosyncrasies.
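If the firmware really does speak grbl, the '?' real-time query would return a status report like '<Idle|MPos:10.000,5.000,0.000|FS:0,0>', which would solve the position problem outright. A small parser shows what to look for in a Wireshark capture; whether the Laserbox firmware emits these at all is exactly the open question here:

```python
import re

def parse_status(report: str):
    """Parse a grbl 1.1-style status report like
    '<Idle|MPos:10.000,5.000,0.000|FS:0,0>'.

    Returns (state, (x, y, z)) or None if the report doesn't match.
    Handles both MPos (machine) and WPos (work) position fields.
    """
    m = re.match(r"<(\w+)\|[MW]Pos:([-\d.]+),([-\d.]+),([-\d.]+)", report)
    if not m:
        return None
    state = m.group(1)
    return state, tuple(float(v) for v in m.groups()[1:])

# parse_status("<Idle|MPos:10.000,5.000,0.000|FS:0,0>")
#   -> ('Idle', (10.0, 5.0, 0.0))
```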