bdring / Grbl_Esp32

A port of Grbl CNC Firmware for ESP32
GNU General Public License v3.0
1.69k stars 529 forks

Going beyond 6 Axis #349

Closed AndreasEpp closed 4 years ago

AndreasEpp commented 4 years ago

Hello,

Do you see any chance of pushing Grbl_Esp32 beyond 6 axes? I'm looking for 6 stepper axes and 2 servo axes, but am interested in how far you could push it axis-wise. Background: I'm building a motion control rig and will run out of axes^^

To be clear, I am not asking for you to implement this. I am just looking for a direction. If this is theoretically possible I will go for it, if not I'll look elsewhere.

Thank you for your time and for making Grbl_Esp32!

Regards, Andy

DirtyEngineer commented 4 years ago

Why not just use 2 grbl ESP32 setups?

AndreasEpp commented 4 years ago

That's one of the alternatives I thought of. I am heavily relying on the G93 inverse feed rate mode and am concerned that two ESP32s won't be fully synchronized if one is running servos and the other is running steppers.

MitchBradley commented 4 years ago

In its current form, Grbl_ESP32 is limited by RMT channels and pins. Each stepper needs 2 RMT channels, of which the ESP32 has 10, so 5 total steppers. There is also a pin limitation. If you exclude the pins that are needed for UART and SD cards, you are left with about 19 pins, of which 5 can only be inputs, so you get 14 output pins. Use a few of them for spindle control, coolant, and whatnot, and you are pretty much out of pins, considering that each stepper needs 2 pins. There are some possibilities for using I/O expanders, but it would require substantial work, both software and hardware, to make that happen. One approach would be to use the new ESP32-S2, which has more pins. It is supposed to come out this summer. Bottom line - no super easy way to extend Grbl_ESP32 to more axes in the very short term.
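As a back-of-the-envelope check, the budget described above works out like this (the count of miscellaneous outputs is an assumption, and the RMT figures get refined later in the thread):

```python
# Rough ESP32 pin/RMT budget, using the numbers from the comment above.
USABLE_PINS = 19        # pins left after UART, SD card, etc.
INPUT_ONLY_PINS = 5     # GPIO 34-39 can only be inputs
PINS_PER_STEPPER = 2    # step + direction
MISC_OUTPUTS = 4        # spindle PWM/enable, coolant, ... (assumption)
RMT_LIMIT = 5           # steppers allowed by the RMT-channel count above

output_pins = USABLE_PINS - INPUT_ONLY_PINS                   # -> 14
steppers_by_pins = (output_pins - MISC_OUTPUTS) // PINS_PER_STEPPER
print(min(steppers_by_pins, RMT_LIMIT))                       # -> 5
```

Either constraint lands at roughly 5 steppers, which is why the pin budget and the RMT budget are discussed together here.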

AndreasEpp commented 4 years ago

So 5 stepper axes is the limit anyway, hmmm... The ESP32-S2 might be worth waiting for ("43 programmable GPIOs" according to their website). I am currently utilizing 4 axes on an Arduino Mega, so I've got a bit of time.

My plan is to use the 6 stepper axes of the Arduino Mega and try to sync an ESP32 to it for servo control.

This is starting to smell like it needs a homebrew solution^^

bdring commented 4 years ago

Each stepper only needs 1 RMT channel, for the step pin. The direction pin is standard GPIO.

Axes like hobby servos do not need RMT.

Most of the axis calculations use 8-bit values, so that is a limiting factor. The axis-numbered settings like $100 would be the next limit, at 10 axes.

With this said, it is a lot of work to change from the current limit of 6.

AndreasEpp commented 4 years ago

Axes like hobby servos do not need RMT.

Okay, but with only 5 RMTs available, that limit would still rule out the ESP32 for the stepper control for me.

Most of the axis calculations use 8 bit values, so that is a limiting factor. The settings like $100 would be the next limit at 10.

That's 5 steppers and 5 servos, right?

If anyone is interested: Here is a first video of my motion control rig in action. All controlled with grbl and a bit of movie magic ;)

bdring commented 4 years ago

There are 8 RMT Channels.

You are currently limited to 6 lettered axes by Grbl.

AndreasEpp commented 4 years ago

Oh, I see. I thought, because of MitchBradley's answer, that there would be a limit of 5. In any case, it looks like going beyond 6 axes on a single ESP32 is going to be difficult.

On synchronizing two Grbl controllers: Is it possible to set "Feed Hold" mode via a GPIO pin, fill the buffers of both controllers, and then start both of them via the "Resume" GPIO pin? After that I just have to keep both buffers from starving and all should be good, right?
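For readers trying this later: the hold / fill-buffers / resume idea can be sketched in a few lines of Python. This uses Grbl's documented realtime bytes `!` (feed hold) and `~` (cycle start), which are the serial/telnet counterparts of the hold/start pins; port 23 and the exact flow are assumptions, and as bdring notes below, the release is only as simultaneous as software allows:

```python
import socket

# Grbl realtime commands: single bytes, executed immediately, no newline.
FEED_HOLD = b"!"     # pause motion
CYCLE_START = b"~"   # start / resume

def preload_and_release(hosts, lines, port=23):
    """Put every board on hold, preload the gcode into their planner
    buffers, then send cycle-start to all boards back to back.
    There is NO guaranteed hardware-level sync between the boards."""
    socks = [socket.create_connection((h, port), timeout=5) for h in hosts]
    for s in socks:
        s.sendall(FEED_HOLD)
    for line in lines:
        payload = line.strip().encode("ascii") + b"\n"
        for s in socks:
            s.sendall(payload)
    for s in socks:          # release both boards as close together as we can
        s.sendall(CYCLE_START)
    return socks
```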

bdring commented 4 years ago

You can control hold and start via GPIO, but there is no way to guarantee sync.

What are the servos used for? Maybe they don't need to be lettered axes.

AndreasEpp commented 4 years ago

The servos control the zoom and focus of the camera. They need to be animated along with the motors.

I generate g-code with the G93 command to move to a location within a specific amount of time (e.g. 1 second). Because the steppers have a different acceleration than the servos, separating them onto different controllers might be a bad idea...

I have seen M commands used to lift and lower a pen with a servo in a pen plotter. The problem with this is that the movement is instantaneous and not synced to the movement of the axes. But if I choose the G93 intervals small enough, this may not make a difference. Is it possible to add two of these servos on top of the 6 lettered stepper axes?
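For reference, the inverse-time bookkeeping mentioned here is simple: while G93 is active, the F word means "complete this move in 1/F minutes". A hedged sketch of the computation (the axis letters are just an example mapping, and G93 must already be in effect):

```python
def g93_move(targets: dict, seconds: float) -> str:
    """Build one inverse-time-mode move. In G93, F = 1 / (duration in
    minutes), i.e. 'complete this move F times per minute'."""
    f = 60.0 / seconds                      # 1 s -> F60, 33 ms -> ~F1818
    words = " ".join(f"{axis}{value:.3f}" for axis, value in targets.items())
    return f"G1 {words} F{f:.1f}"

print(g93_move({"X": 10.0, "A": 45.0}, 1.0))   # -> G1 X10.000 A45.000 F60.0
```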

bdring commented 4 years ago

If exact motion synchronization is not a big deal, I think a command like M67 could be added.

That could send a PWM value to the servo. While that gcode is defined as synchronized, it just means the planner makes sure all previous moves have completed before sending out the PWM value. The servo would then start moving at its fastest rate towards its target, and the next gcodes would also begin executing.

I could foresee upping the count to 8 sometime in the future, but it is not a priority at this time. I don't even see a lot of Grbl 6-axis machines right now.

shrelf commented 3 years ago

Hi, since the ESP32-S2 is out now, would that make adding more synchronized axes easier? Or is there a way to synchronize multiple Grbl_ESP32s? I am looking for a solution for building a motion control rig as well, and Grbl_ESP32 looks almost perfect. Thanks for your awesome work!

bdring commented 3 years ago

I believe the S2 is a single-core processor. That would require more firmware work than it is worth.

The S3 sounds very promising, but I have not seen it available in module form yet.

shrelf commented 3 years ago

Ok, thanks for your reply. Is there any way to sync movements on two ESP32 devices with the existing firmware features? I think for camera motion control it doesn't have to be sub-millimeter accurate, so maybe there is a workaround. For example, would it be possible to have an additional virtual axis on one controller which is sent to the other controller via wifi or another data connection? (probably not an existing feature) Or code all moves with the G93 command and send the G-code commands to the other controller? Sorry if my questions may seem trivial to you.

joedirium commented 3 years ago

Maybe a stupid idea, but you could write a small application which runs on your computer, connects to two or more boards via telnet, and sends coordinated and synchronized commands to the devices. A simple Raspberry Pi would be enough to do this job. Longer motions using axes distributed across the boards will certainly not be perfectly in sync, so this isn't suitable for reaching 1/1000 mm precision along the way. Splitting longer travels into smaller segments would result in unsteady movements - for sure not what you plan to do when moving a camera. Besides, I am wondering why 6 axes are not enough to move a camera.
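This host-side coordination could be sketched as a lockstep streamer: send each line to every board, wait for every board's "ok" acknowledgment, repeat. This is only a sketch under the assumptions named in the comments; a real streamer would keep several lines in flight (Grbl's character-counting protocol) so the planners never starve:

```python
import socket

def open_board(host, port=23):
    """Connect to a Grbl_ESP32 telnet endpoint (port 23 is an assumption;
    match your firmware's configured port) and wrap it as a text stream."""
    sock = socket.create_connection((host, port), timeout=5)
    return sock.makefile("rw", newline="\n")

def lockstep(boards, lines):
    """Feed each gcode line to every board, then wait for each board's
    'ok' before moving on. Boards are any objects with write/flush/readline,
    so this works over telnet, serial wrappers, or test fakes. This is
    coordination, not hard realtime sync, and it assumes no 'error:'
    responses; a robust streamer would handle those too."""
    for line in lines:
        for b in boards:
            b.write(line.strip() + "\n")
            b.flush()
        for b in boards:
            while "ok" not in b.readline().lower():
                pass        # skip status reports until the ack arrives
    return len(lines)
```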

shrelf commented 3 years ago

You only need 6 axes to move the camera (3 linear and 3 rotational: pan, tilt, roll), but if you want to control focus as well (which you do), you need an additional axis. Additional axes would also be useful for zoom and for things like a synchronized turntable for product shots.

I think controlling two boards from an external computer with G-code commands would mean that both firmwares calculate the moves independently, so getting some extent of sync would require coding everything in G93 inverse time mode. That could work for relatively simple moves, or one move at a time. (Please correct me if I'm wrong.) If you want the machine to move more like a 3D printer on custom paths, the movement is usually broken down into many small G-code commands (which, thanks to the planner, usually don't result in jerky movement). But I think coding a lot of small moves in inverse time mode is much more complex. (Again, please correct me if I'm wrong.) So having one firmware coordinate all axes would be ideal.

But maybe there is some other way to synchronize two boards?

However, you could still build a pretty versatile motion control rig with 6 axes, including focus control. The roll axis is not necessary for a lot of moves, so you could use only pan and tilt and use the 6th axis for focus.

joedirium commented 3 years ago

Just an opinion: Frankly speaking, I don't see big potential in implementing a board-sync feature in the firmware, as priority-wise there are surely more common requests. Implementing this on the g-code layer would not give any advantage over the suggested approach of doing it outside of Grbl. Implementing something which sends sync data via SPI or similar to a second board would not give any advantage over adding hardware to the board layout, and could simply complicate things.

Building new hardware supporting more than 6 axes, or extending the 6-pack board with an additional hardware breakout, would be easier. So a module for additional drivers/axes could be a good idea. (I, for example, am using 5 axes and might use the sixth for a tool changer some day. So additional ones to load material, move safety lids, etc. could make sense as well.)

As a workaround you could use 5+1 of the 7 required axes synchronously, depending on the shot scenario. Your machine would implement 7 motors; if you use external drivers you could switch the step/direction lines, for example with relays or transistors, and use e.g. the mist control in g-code to switch between these two motors. You should do this for the two motors which need the least synchronization. In this case there would be no hardware mod on the board and no firmware mod necessary, and you could start this today - but it has limitations.

AndreasEpp commented 3 years ago

I am using the G93 command to time all the motions (4 axes synchronous so far). Otherwise generating animations would be pretty hard. To synchronize multiple boards I am in the process of converting my machines to a wireless telnet connection. I am relying on a fast host to stream the animations to multiple boards (that's the plan, anyway). So far, streaming to a single Grbl_Esp32 telnet client works at 30 fps (30 G93 commands of 33 ms each). I tried using a Raspberry Pi 3 as a wifi access point, but had far better reliability with my wifi router. To test and fiddle around, I can recommend LaserGRBL as it supports streaming over telnet.

shrelf commented 3 years ago

I'm aware a camera motion control rig is a special use case, it's not what Grbl is designed for, and 6 axes are enough for most machines. And of course I understand that there are other priorities. So I'm not asking for anything to be implemented, just trying to find out what is possible with the existing features. Switching between motors depending on the shot scenario sounds like a good option, but I think I might just ditch the tilt axis for now to keep it relatively simple. Just to be sure: there is no way to trigger any communication to another device via G-code in Grbl, right? I'm asking because there are the M260 I2C send and M118 serial print commands in Marlin, and I couldn't find anything like them on linuxcnc.org.

Being able to send commands to another device could be useful, just to trigger that device to do something at a specific point. It doesn't have to be perfectly coordinated. In the camera control case, this could be a turntable that is told to rotate a product, or setting lights to different colors and intensities. If you look at something like this, they trigger all sorts of things. (I know this is very different hardware and they use software specifically made for this; I'm just looking to make a poor man's version of something like this.)

I would imagine that it could be useful for other machines as well. For example, if your tool changer requires more complex movements, it could have a controller of its own, which receives commands from the first controller whenever it needs it to do something.

@AndreasEpp Are you coding your moves manually or are you using any software to generate the Gcode?

AndreasEpp commented 3 years ago

I've written a host application in JavaScript on Node.js that generates the gcode. I've built a remote control to set keyframes and the timing between them. If you're interested, here is my youtube channel where I've uploaded videos about the remote control and the motion control rig. This was the first version, which has 3 axes and keyframe control. Currently I am working on a second version which will be controlled from Blender on my computer and send gcode over telnet. Here is my gitlab repo where you can find my code and design files for the remote and moco rig. Cheers

shrelf commented 3 years ago

I looked at your youtube channel, nice work! Keyframes totally make sense for a slider. I had the idea of using Blender to create gcode as well. I was thinking about animating a camera in Blender, baking the animation, and exporting the animation path and camera orientation as gcode via a script. The file could then be uploaded to the ESP's SD card to be stored and played back even without a computer connected. Ideally, the Blender camera would be attached to a rigged model of the motion control rig, to have a clear preview of how the machine is going to move. Your approach of sending gcode directly from Blender sounds awesome as well, and it would allow live control of the machine. I'm curious to see how it will turn out.

AndreasEpp commented 3 years ago

Thank you very much! That is very kind of you

I think it is possible to use the WebUI API to upload files (maybe there is a more direct way...) if you know the right URL. Although there might be problems if the UI relies on cookie magic.

"Ideally, the Blender camera would be attached to a rigged model of the motion control rig, to have a clear preview of how the machine is going to move." => exactly my idea! It's nice to meet like-minded people XD

howiemnet on youtube had the same idea but is far ahead in the implementation.

In a further step I want to use gphoto2 to control the camera and stream the viewfinder to my computer so I don't have to get up at all! But before that is possible I have to be able to home all axes, learn Python, and write a Blender plugin which can stream gcode. The rest is a piece of cake! (that's what I tell myself anyway...)

If you're interested (or anyone reading this) I've uploaded a video to youtube where I stream gcode over telnet to grbl_esp32.

Cheers

shrelf commented 3 years ago

Howiemnet's video shows exactly why additional axes would be useful: Rotating/manipulating things synchronized to the camera movement and focus control. And his shots are exactly the type I am (and probably you are) looking for.

Have you already tried to send gcode to two separate grbl controllers simultaneously as described above?

Regarding the live preview, I would probably just mount an HDMI monitor next to my computer screen.

So far I have a blender script that outputs position and rotation of an animated camera to a gcode file. I haven't tested it on a real machine yet, but I think it should work. I can share it with you if you want.

I tend to tell myself that it should be a piece of cake too, until I hit the next roadblock;). Which is usually in the details.

ithinkido commented 3 years ago

Have you already tried to send gcode to two separate grbl controllers simultaneously as described above?

https://github.com/svenhb/GRBL-Plotter

AndreasEpp commented 3 years ago

Have you already tried to send gcode to two separate grbl controllers simultaneously as described above?

Yes, but one is connected over serial and the other over telnet. I have to set up two telnet-capable machines to dig deeper there, but for now I am pretty happy with the result. Jogging with my remote control feels responsive with no hiccups so far, and streaming gcode works like a charm. Now I have to generalize my implementation of keyframes to accommodate different modes of transport (serial, telnet, bluetooth) and different devices (Grbl, lamps, etc).

I think I will alter my plan a little and keep the generation of gcode on the onboard Raspberry Pi, because I will need it for the camera stream later on anyway. I know this is kind of fancy for the purpose, but "fancy" is what brought me here... On the Blender side I'll interact with the same interfaces in my software as, for example, my remote or the web GUI does. That means fewer interfaces and keeping things consistent.

Regarding the live preview, I would probably just mount an HDMI monitor next to my computer screen.

To home my machine, and therefore sync it to the camera in Blender, I need continuous rotation for the axes on my machine (each has to find an endstop in one direction and, worst case, do at least one full rotation to find it). I've used slip ring connectors to transfer all signals through the rotational axes so no wire will get twisted. HDMI has up to 19 connectors and USB just 4, so I figured this might be the easier option for me. An alternative I am currently using is the built-in wifi of my camera. It comes with an app that lets you control the camera and stream the preview, but it is pretty slow and has high latency. A wireless HDMI transmitter might also be an option, but they are pretty expensive. I'm looking forward to seeing what you will come up with :)

So far I have a blender script that outputs position and rotation of an animated camera to a gcode file. I haven't tested it on a real machine yet, but I think it should work. I can share it with you if you want.

Yes please! I'm a noob to blender and python and would love to see how you made it work.

I tend to tell myself that it should be a piece of cake too, until I hit the next roadblock;). Which is usually in the details.

Glad I'm not alone XD

@ithinkido Thank you for the tip! this looks nice I'll snoop around in there and see if I can find something I can use

shrelf commented 3 years ago

Hi, sorry for the late reply. I have attached the script. I am a noob in python and blender too, so I actually asked a guy on fiverr for help. He wrote the basic functionality of reading location and rotation values for each frame. I added the formatting needed for gcode and the inverse time mode F commands.

To use the script, just load it into the scripting area of Blender. You will see an output path and file name in the script. Create that file before you run the script for the first time; otherwise it might fail if Blender can't create the file. If you use parenting and/or constraints to animate your camera, you need to bake the animation before running the script and rename the baked camera to "Camera", so the script uses the right one. Otherwise, movements caused by parents and constraints will not be considered and the resulting values will be wrong. To bake, click Object > Animation > Bake Action.

Then select Visual Keying, Clear Constraints, and Clear Parents.
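For anyone reconstructing such a script: once bpy has sampled the camera's location and rotation per frame, the gcode-formatting stage might look like the following sketch. The XYZ/ABC axis mapping and the per-frame timing are assumptions to be matched to the actual rig:

```python
import math

def pose_to_gcode(samples, fps=25.0):
    """Turn per-frame camera samples into G93 inverse-time moves.
    Each sample is (x, y, z, rx, ry, rz): location in machine units and
    rotation in radians (e.g. from Blender's matrix_world / rotation_euler;
    the XYZ/ABC mapping here is an assumption). One frame lasts 1/fps
    seconds, so the inverse-time feed is F = fps * 60 moves per minute."""
    feed = fps * 60.0
    lines = ["G93"]                              # enter inverse time mode
    for x, y, z, rx, ry, rz in samples:
        a, b, c = (math.degrees(r) for r in (rx, ry, rz))
        lines.append(
            f"G1 X{x:.3f} Y{y:.3f} Z{z:.3f} "
            f"A{a:.3f} B{b:.3f} C{c:.3f} F{feed:.1f}"
        )
    lines.append("G94")                          # back to units/min mode
    return lines
```

Writing `"\n".join(pose_to_gcode(samples))` to the configured output file would give a stream playable from the SD card or over telnet.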

About the live preview: SDI uses only two wires. If your camera doesn't have an SDI output, you could use an HDMI-to-SDI converter. I'm not sure if it would work through slip rings though, as it usually uses coaxial cable. And the converters also cost money. The benefit would be that you aren't stuck with a specific USB protocol in case you want to use a different camera in the future. I'm guessing you may have a Sony Alpha camera. I know the wireless preview through the app is pretty useless and it sucks. :(

Which slip rings are you using? I need continuous rotation as well. I'm struggling a bit to find parts that fit well together. I can design things in Fusion and 3D print them, but I try to use as many off-the-shelf components as possible.

Export Gcode 0.2.py.zip

AndreasEpp commented 3 years ago

Well, I am also sorry I replied so late^^

Your script looks very promising, but I haven't had the chance to try it yet. I think I can use the generator part and reroute it to MQTT instead of a file. Thanks!

SDI is a very good solution, I hadn't thought of that^^. I have a Canon EOS 80D and, since last week, an EOS R6. Both have HDMI and USB, but no SDI. The wireless app works better with the R6 (maybe more CPU horsepower?) but is far from optimal. Great, I am not the only one suffering from a sucky app, haha :)

The slip rings I use are 12-channel Senring rings, like this one. Adafruit also sells these (for the price of a kidney), but if you can wait, buy from a Chinese seller of your choice^^ They also make these with far more channels, though they are harder to find.

I can design things in Fusion and 3D print them, but I try to use as many off-the-shelf components as possible.

Same here^^ The latest addition to my arsenal are angle brackets for furniture to strengthen 3D prints XD

I am very interested in your moco rig build. Do you maybe have pictures or a video?

shrelf commented 3 years ago

I think I can use the generate part and reroute it to mqtt instead of a file.

That sounds promising as well!

I just saw these wireless HDMI transmitters announced, they are more affordable, so maybe that's a good alternative: https://www.cined.com/z-cam-ipman-s-affordable-hdmi-streaming-device-announced/

I ordered one of the slip rings you posted and started to design a rotary joint around it. With the slip ring, I found it even more difficult to find standard parts to fit around it, so now everything is 3D printed after all ;).

I am very interested in your moco rig build. Do you maybe have pictures or a video?

I haven't assembled it yet, so I don't have any pictures. But the idea is to build it out of OpenBuilds aluminum extrusion profiles, probably C-Beams, with a cartesian part and the rotary head attached to the end of the Y axis. The advantage is that you can easily build linear rails and easily mount stuff on them. The plan is to build it so that every axis is made of only one extrusion that supports itself. Here is a rough mockup (without the rotary head). The X axis will have to be mounted to the floor and probably made broader, with dual rails, to support the overhanging weight. I plan to add a counterweight on the Z axis, so the motor does not have to hold the weight of the Y axis.

Do you have pictures of your build?