gnea / grbl

An open source, embedded, high performance g-code-parser and CNC milling controller written in optimized C that will run on a straight Arduino
https://github.com/gnea/grbl/wiki

Architecture Question & Project Idea #809

Open BassMatiFreecad opened 4 years ago

BassMatiFreecad commented 4 years ago

Hello everybody!

After quite a while of searching for the "perfect" CNC controller board, I decided that Grbl on ARM is the way to go, but I didn't fall in love with any specific hardware. I have read the issue #67 discussion, and it sparked the following idea for a new project. The core question is:

Is it feasible to separate Grbl motion planning and step/direction/motion profile generation for each axis, so that they could run on different (separated) MCUs?

The core idea is to have multiple, cheap, mechanically stackable "Grbl axis drivers" connected to some sort of high-speed communication bus (e. g. SPI), driven by a Grbl n-axis motion controller, as shown in figure 1:

[Figure 1: Grbl n-axis motion controller (orange) driving stackable single-axis drivers (green) over a shared high-speed bus]

The motion controller (orange) could be a "non-real-time" ARM, perhaps with an FPU and enough RAM to get the job done, e.g. an i.MX RT1050 or i.MX RT1060 as used in the Teensy 4. Either stand-alone, or as a shield for an Arduino, RasPi, etc.

The single-axis motion profile controller (green) could be a dirt-cheap, small but fast ARM (or other MCU), responsible for generating jitter-free step/direction pulse timing, combined with a stepper driver IC of choice with integrated or external MOSFETs. Not unlike the existing stepper driver shields, but with an added MCU.

Instead of an MCU, an FPGA or CPLD could be used, but as you stated in another issue here, that's out of scope for Grbl. The system should be easy to build and program.
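To make the bus idea a bit more concrete, here is a minimal sketch of what a command packet from the motion planner to one axis driver could look like. Everything in it (names, fields, packed layout) is a made-up illustration, not part of Grbl; a real protocol would be designed around the planner's actual segment output.

```c
#include <stdint.h>

/* Hypothetical packet sent by the motion planner to one axis driver
 * over SPI. The field layout is illustrative only; a real protocol
 * would also need framing, versioning and error handling. */
typedef struct __attribute__((packed)) {
    uint8_t  axis_id;         /* which axis driver this packet addresses  */
    uint8_t  command;         /* e.g. AXIS_CMD_SEGMENT, AXIS_CMD_STOP     */
    uint32_t steps;           /* number of steps in this segment          */
    uint32_t cycles_per_tick; /* timer reload value, i.e. the step rate   */
    int8_t   direction;       /* +1 or -1                                 */
    uint8_t  crc;             /* simple integrity check over the payload  */
} axis_segment_packet_t;

enum {
    AXIS_CMD_SEGMENT = 1,     /* queue one constant-rate segment          */
    AXIS_CMD_STOP    = 2,     /* controlled stop, flush the local queue   */
    AXIS_CMD_HOME    = 3      /* run the local homing cycle               */
};
```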

A more complete system on the module level would then look like the attached figure 2:

[Figure 2: complete system with application processor (front end / G-code sender), motion planner, and per-axis drivers]

An application processor (Arduino, RasPi,…) provides the front end, user interface and connectivity (Ethernet, USB, WiFi), and acts as G-Code sender (or file provider). The application processor is outside of the scope of this project.

The motion planner's sole purpose is planning the motion of the axes, sent as high-level commands to the axis controllers. On the other hand, integrating the front end to some degree into the motion controller should not be impossible, provided that the real-time constraints of the entire system are safely met. But, as stated, this is out of scope.

Advantages
○ Low cost: PCBs with small dimensions are dead cheap compared with large all-in-one multilayer boards (ideally below 20 USD, the lower the better).
○ Modularity by separation of concerns.
○ Simple to debug: the output of the motion planner could be routed to a file or UART instead of the axis controllers. Axis motion profile generators can have a built-in self-test feature.
○ Combined debugging: using two SPIs, motion planner and profile generator software can be run and debugged on a single development system. Useful to hunt down communication problems.
○ Low EMI and high stability, because the high-frequency signals are confined to a small area on the axis drivers.
○ The number of axes is easily scalable (within the capabilities of the motion planner software). A user can start with two or three and add more axes later when required.
○ Driver capabilities (voltage, current, step frequency) can be matched to the axis needs and/or the user's budget.
○ Simple, clean wiring concept.
○ Clean, separated high-current path for the stepper drivers.
○ Everything can be plugged together from existing hardware (MCU dev boards, stepper driver shields).

Of course, there are also some...

Disadvantages
○ PCBs might have to be designed and fabricated to make it plug & play for the end user. Should be no big problem.

Risks
○ Lack of CNC experience
○ Lack of solid G-code knowledge
○ Lack of knowledge about the inner workings of Grbl
○ Lack of knowledge about integrating processors into the Arduino IDE.

So, my question is: Can I hope for some support here? A discussion would be fine!

Greetings BassMati

langwadt commented 4 years ago

Splitting a task across multiple MCUs gets you into the rabbit hole of synchronisation, so unless you absolutely can't do it any other way, don't do it. A few dollars will get you an ARM with an FPU and 10x+ the performance and memory of an AVR.

terjeio commented 4 years ago

@BassMatiFreecad Trinamic TMC429 may be of interest, but I have no idea how using such a chip will affect the depth of the rabbit hole. Or if it is usable at all...

BassMatiFreecad commented 4 years ago

Synchronization: Good point! I have to think about that. It could be addressed with good quartz oscillators and a system tick distributed to the ramp generators to keep the timers in sync. Or better yet: a system clock that drives all the timers.

TMC429: Generates 3 ramps from a single timer. This is not that different from current Grbl, which I would prefer in that case. No sync issues, but only 64 microsteps, and not scalable to more than 3 axes.

TMC5160 https://www.trinamic.com/products/integrated-circuits/details/tmc5160/: Single-axis stepper driver with a six-point ramp generator and external MOSFET drivers. Beautiful features, but it cannot run S-curve acceleration profiles. Still a candidate for prototyping and/or a feasibility study.
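To make the prototyping idea a bit more concrete, a rough sketch of loading the TMC5160's internal ramp generator over SPI. The register names (RAMPMODE, VSTART, A1, V1, AMAX, VMAX, DMAX, D1, VSTOP, XTARGET) come from the Trinamic datasheet, but the addresses given below and the spi_write_reg() helper are assumptions and must be verified against the datasheet and the actual SPI driver before use.

```c
#include <stdint.h>

/* Assumed SPI helper: sends one 5-byte TMC datagram (register address +
 * 32-bit data). Setting the write flag on the address byte is left to
 * this helper. */
extern void spi_write_reg(uint8_t addr, uint32_t value);

/* Register addresses as I read them from the TMC5160 datasheet --
 * verify before use. */
#define TMC5160_RAMPMODE 0x20
#define TMC5160_VSTART   0x23
#define TMC5160_A1       0x24
#define TMC5160_V1       0x25
#define TMC5160_AMAX     0x26
#define TMC5160_VMAX     0x27
#define TMC5160_DMAX     0x28
#define TMC5160_D1       0x2A
#define TMC5160_VSTOP    0x2B
#define TMC5160_XTARGET  0x2D

/* Load a six-point ramp and move to an absolute target position. */
void tmc5160_move_to(int32_t target, uint32_t vmax, uint32_t amax)
{
    spi_write_reg(TMC5160_RAMPMODE, 0);        /* positioning mode        */
    spi_write_reg(TMC5160_VSTART,   1);
    spi_write_reg(TMC5160_A1,       amax / 2); /* gentler first phase     */
    spi_write_reg(TMC5160_V1,       vmax / 4);
    spi_write_reg(TMC5160_AMAX,     amax);
    spi_write_reg(TMC5160_VMAX,     vmax);
    spi_write_reg(TMC5160_DMAX,     amax);
    spi_write_reg(TMC5160_D1,       amax / 2);
    spi_write_reg(TMC5160_VSTOP,    10);
    spi_write_reg(TMC5160_XTARGET,  (uint32_t)target); /* starts the move */
}
```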

BassMatiFreecad commented 4 years ago

Synchronisation again: Perhaps it is easier to generate the step/dir signals for all axes on a central MCU and distribute them... To decide this, I will have to look into the Grbl ARM source code first and understand how it works under the hood. On the other hand: if a sysclock frequency can be tuned up and down, controlled by, say, the spindle, we'd have the synchronisation feature for thread tapping and perhaps also lathe thread cutting...

Can someone provide a short overview of the structure of the code? That would really help a lot!

terjeio commented 4 years ago

@BassMatiFreecad This description is perhaps what you are looking for.

You do not want to tune the sysclock if by that you mean the main processor clock; IMO that is likely to be an additional rabbit hole to dig yourself out of. The stepper subsystem is currently driven by the clock supplied to the timer(s) involved. This can be a prescaled version of the system clock, or it may be supplied externally (I have never seen that done). If supplied internally, then you are out of luck, because that is not, AFAIK, possible to adjust in a fine-grained fashion.
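As an illustration of the "prescaled version of the system clock" case, this is roughly how a step interrupt is driven on the classic AVR: a hardware timer in CTC mode, clocked by the CPU clock through a fixed prescaler. A generic sketch, not code lifted from Grbl's stepper.c:

```c
#include <avr/io.h>
#include <avr/interrupt.h>

/* Drive a step interrupt from Timer1 in CTC mode. The timer counts
 * F_CPU / prescaler ticks; OCR1A sets the ISR period and thus the
 * step rate. */
void step_timer_init(uint16_t cycles_per_tick)
{
    TCCR1A  = 0;                          /* normal port operation       */
    TCCR1B  = (1 << WGM12) | (1 << CS10); /* CTC mode, prescaler = 1     */
    OCR1A   = cycles_per_tick;            /* compare value -> step rate  */
    TIMSK1 |= (1 << OCIE1A);              /* enable compare A interrupt  */
    sei();
}

ISR(TIMER1_COMPA_vect)
{
    /* step pulse generation would go here */
}
```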

Spindle sync can be achieved in different ways. I am currently tinkering with using a PID loop to adjust the timing of the stepper segments, this for lathe threading. Using a stepper motor for the spindle is another option.

BassMatiFreecad commented 4 years ago

@terjeio: With "sysclock" I mean a system-generated clock signal (where "system" is the motion planner CPU) that drives all the timers on the axis ramp generators. Of course they must have an external clock input for that purpose. I'm sure I've seen that (TI controllers?). Sometimes the timer is then called an "event counter", with associated compare/capture units.

So this area is all plain prairie, without rabbit holes.
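For illustration, a minimal sketch of an externally clocked timer on an MSP432 (Timer_A counting the TA0CLK pin). The register and macro names follow TI's msp432p401r.h header, but the pin muxing for the TACLK input is omitted and the whole thing should be checked against the device data sheet:

```c
#include "msp.h"   /* TI MSP432 CMSIS device header */

/* Count pulses arriving on the TA0CLK pin (e.g. a clock distributed by
 * the motion planner) instead of an internal clock source. */
void axis_timer_external_clock_init(uint16_t ticks_per_step)
{
    TIMER_A0->CTL = TIMER_A_CTL_SSEL__TACLK   /* external TACLK as clock  */
                  | TIMER_A_CTL_MC__UP        /* count up to CCR0         */
                  | TIMER_A_CTL_CLR;          /* clear the counter        */

    TIMER_A0->CCR[0]  = ticks_per_step - 1;   /* period in external ticks */
    TIMER_A0->CCTL[0] = TIMER_A_CCTLN_CCIE;   /* interrupt on CCR0 match  */

    NVIC_EnableIRQ(TA0_0_IRQn);
}

void TA0_0_IRQHandler(void)
{
    TIMER_A0->CCTL[0] &= ~TIMER_A_CCTLN_CCIFG; /* clear the interrupt flag */
    /* issue one step pulse here */
}
```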

Lathe threading with a spindle stepper: This Old Tony on YouTube did that. Highly recommended, it's always fun to watch him! Although he used Mach3...

terjeio commented 4 years ago

@BassMatiFreecad Well, the clock nomenclature varies - that's why I asked.

FYI, most processors I've been working with, if not all, have timers that can be externally clocked.

However, I am not so sure the prairie is without any rabbit holes - so I am looking forward to seeing your implementation or ideas for spindle sync, IMO best done without requiring a stepper spindle motor. I am using an MSP432 for my lathe and may be willing to try whatever you come up with; for that, it has to work with a 120 PPR encoder with an index pulse.

BassMatiFreecad commented 4 years ago

MSP432 sounds familiar, I should have one lying around somewhere.

For the spindle sync, a first idea:

[Figure 3: spindle-sync concept block diagram]

Orange: Functions in the Motion Planner MCU. Green: Functions in an Axis Control MCU.

There are two axis clock signals: #1 is fixed frequency, #2 is variable frequency. The #2 frequency is determined by the rotary encoder ticks going into the up/down scaler, which will be a tricky part of the concept if done in the digital domain. A keyword might be "clock synthesizer"...

Not sure if AxisClock1 is really needed...

The scaling factor determines the ratio between encoder ticks and linear axis ticks. It is chosen so that the outcome is the desired thread angle, regardless of spindle speed variations. I guess that's the way a mechanical lathe works, except that the frequency up/down scaler is a set of gear wheels combined in such a way that the spindle RPM is in a fixed relation to the X axis speed.
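To put numbers on that scaling factor, a small worked example. The mechanical figures (1.0 mm thread pitch, 5 mm leadscrew lead, 200 full steps and 16 microsteps per rev) are assumptions picked for illustration; only the 120 PPR encoder comes from the discussion above.

```c
#include <stdio.h>

/* Worked example: how many axis steps per encoder pulse are needed so
 * that the tool advances exactly one thread pitch per spindle turn.
 * All mechanical numbers below are illustrative assumptions. */
int main(void)
{
    const double thread_pitch_mm   = 1.0;           /* desired thread pitch      */
    const double leadscrew_lead_mm = 5.0;           /* axis travel per motor rev */
    const double steps_per_rev     = 200.0 * 16.0;  /* full steps x microsteps   */
    const double encoder_ppr       = 120.0;         /* pulses per spindle rev    */

    double steps_per_spindle_rev = thread_pitch_mm / leadscrew_lead_mm
                                 * steps_per_rev;            /* = 640             */
    double steps_per_enc_pulse   = steps_per_spindle_rev / encoder_ppr;

    printf("steps per spindle rev : %.1f\n", steps_per_spindle_rev);
    printf("steps per encoder tick: %.4f\n", steps_per_enc_pulse); /* ~5.33 */

    /* The fractional part is why a simple divider is not enough: the
     * scaler has to accumulate the remainder (DDS / Bresenham style)
     * so that no steps are lost over a full threading pass. */
    return 0;
}
```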

BassMatiFreecad commented 4 years ago

By the way: the link to the Super-Gerbil documentation really helped a lot. At the end of the G-code interpreter there is something like a "motion blocks queue". This would have to be moved to the axis controller(s).

I guess that this queue also eases the real-time requirements for the motion planner. Everything behind this queue is "hard real-time", but I think that's easily achievable even at higher step pulse rates.
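A minimal sketch of such a "motion blocks queue" as a ring buffer on the axis-controller side. This mirrors the idea only; it is not Grbl's actual planner or segment structures, and all names are made up for illustration.

```c
#include <stdbool.h>
#include <stdint.h>

#define BLOCK_BUFFER_SIZE 16   /* power of two keeps the index math cheap */

typedef struct {
    uint32_t steps;            /* steps in this block for this axis */
    uint32_t cycles_per_tick;  /* initial step period               */
    int8_t   direction;
} motion_block_t;

static motion_block_t buffer[BLOCK_BUFFER_SIZE];
static volatile uint8_t head; /* written by the communication handler */
static volatile uint8_t tail; /* consumed by the step interrupt       */

/* Called from the (soft real-time) communication side. */
bool queue_push(const motion_block_t *b)
{
    uint8_t next = (head + 1) & (BLOCK_BUFFER_SIZE - 1);
    if (next == tail) return false;   /* queue full, planner must wait */
    buffer[head] = *b;
    head = next;
    return true;
}

/* Called from the (hard real-time) stepper interrupt. */
const motion_block_t *queue_peek(void)
{
    return (tail == head) ? 0 : &buffer[tail];
}

void queue_pop(void)
{
    if (tail != head) tail = (tail + 1) & (BLOCK_BUFFER_SIZE - 1);
}
```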

shooter64738 commented 4 years ago

Synching the axes' motion together can be done fairly easily if you send the motion block data to all the axis controllers. Using Bresenham, only one axis is the 'master' and all other axes are slaves to that master. With each pulse output of the master axis, the slave axes can read that pulse and, after N counts read, the first slave would step, then the next, and so forth. The problem I was never able to overcome was that the master (which could be any axis) still had to go through all the same math to determine its acceleration rates, cornering, and segment appendices. So it was possible to do, but I really couldn't find a worthwhile advantage to it.
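For reference, a small sketch of the scheme described above: the slave accumulates on every master pulse and steps when its Bresenham counter rolls over. Illustrative only, not Grbl's stepper.c:

```c
#include <stdint.h>

/* One slave axis synchronised to a master axis via Bresenham's line
 * algorithm: the slave steps whenever its error accumulator crosses
 * zero, so slave_steps pulses are spread evenly over master_steps. */
typedef struct {
    uint32_t master_steps;  /* event count of the block (longest axis) */
    uint32_t slave_steps;   /* steps this axis must make in the block  */
    int32_t  counter;       /* Bresenham error accumulator             */
} bresenham_axis_t;

void block_begin(bresenham_axis_t *a, uint32_t master, uint32_t slave)
{
    a->master_steps = master;
    a->slave_steps  = slave;
    a->counter      = -(int32_t)(master >> 1); /* start at -master/2 */
}

/* Call once per master step pulse; returns 1 if the slave should step. */
int on_master_step(bresenham_axis_t *a)
{
    a->counter += (int32_t)a->slave_steps;
    if (a->counter >= 0) {
        a->counter -= (int32_t)a->master_steps;
        return 1;   /* issue a step pulse on this slave axis */
    }
    return 0;
}
```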