Open basfari opened 3 years ago
The clock isn't precisely 12 MHz; to get an accurate value, you'll have to run the motor with a known STEP frequency, read the TSTEP value, and then do the conversion (ch. 12.1). That still doesn't explain why your measurements are ordered this way, though: the first case should more likely be the faster one, and both should exhibit near-instantaneous velocity ramps.
Thank you very much for the answer. Please forgive me if my questions are stupid.
How to know the STEP frequency?
Do you mean by "the conversion (ch. 12.1)" the following formula: v[usteps/s] = (VMAX x f_CLK) / 2^24?
I did not understand what you meant in your last sentence: "It doesn't explain why your measurements are ordered this way. It should more likely be that first case would be the faster one. Though both should exhibit near instantaneous velocity ramps." Do you mean that the first set of parameters (the one with VSTART = 1) should actually be faster than the second set?
You can either use an MCU to send step pulses into the driver (so switch out of SPI_MODE), or run the motor at a known velocity using the ramp controller and count rotations over time; the longer you count, the better your accuracy. For example, if the motor makes 120 revolutions in two minutes, that's 120 x 200 fullsteps, i.e. 120 x 200 x 256 microsteps, which works out to 51200 usteps per second.
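The rotation-counting arithmetic above can be sketched as a small helper (plain Python, nothing driver-specific assumed; the 200 fullsteps/rev and 256 usteps/fullstep defaults match a typical 1.8-degree motor at full microstep resolution):

```python
def usteps_per_second(revolutions, seconds,
                      fullsteps_per_rev=200, usteps_per_fullstep=256):
    """Average microstep rate from a counted number of revolutions."""
    total_usteps = revolutions * fullsteps_per_rev * usteps_per_fullstep
    return total_usteps / seconds

# 120 revolutions counted over two minutes:
print(usteps_per_second(120, 120))  # 51200.0
```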
I suppose you could use that formula, but I was thinking of the last one: TSTEP = f_CLK / f_STEP.
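That relation can be turned around to estimate the real clock: drive the STEP pin at a known frequency, read back TSTEP, and solve for f_CLK. A minimal sketch (how you obtain the TSTEP reading depends on your SPI setup; the function below just does the arithmetic):

```python
def estimate_fclk(tstep_reading, step_frequency_hz):
    """Since TSTEP = f_CLK / f_STEP, the actual clock is
    f_CLK = TSTEP * f_STEP.

    tstep_reading:     value read from the TSTEP register
    step_frequency_hz: known rate of pulses fed to the STEP pin
    """
    return tstep_reading * step_frequency_hz

# e.g. TSTEP reads 1200 while stepping at exactly 10 kHz:
print(estimate_fclk(1200, 10_000))  # 12000000, i.e. the nominal 12 MHz
```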
In your first set, V1 is greater than VMAX, so AMAX and DMAX aren't used. Acceleration from 1 to 1000 at A1 should take about 0.04 t, deceleration about 0.02 t. Basically instant.
In your second set, V1 is 0, so A1 and D1 are not used. Acceleration from 950 to 1000 at 500 takes 0.10 t, and deceleration (from 1000 down to VSTOP = 970) 0.06 t.
With the nominal speeds being the same, I'd expect the second set to spend more time in the acceleration stages and to be the slower one. It's quite possible I missed something; this is just a quick, cursory look at the parameters used.
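The "0.04 t" style figures above are just the dv/a ratios in register units. To put rough real-world times on them, one can use the datasheet's acceleration scaling, under which a velocity change takes dt = (dv / a) * 2^17 / f_CLK seconds (an assumption here; verify against the TMC5160 datasheet's unit-conversion chapter). A sketch for both parameter sets:

```python
F_CLK = 12_000_000  # nominal internal clock, Hz

def ramp_time_s(v_from, v_to, accel):
    """Approximate time to ramp between two velocity register values,
    assuming dt = (dv / a) * 2**17 / f_CLK from the datasheet scaling."""
    return abs(v_to - v_from) / accel * 2**17 / F_CLK

# First parameter set: V1 > VMAX, so A1/D1 apply.
print(ramp_time_s(1, 1000, 25000))   # ~0.00044 s
print(ramp_time_s(1000, 10, 50000))  # ~0.00022 s

# Second set: V1 = 0, so AMAX/DMAX apply; deceleration stops at VSTOP = 970.
print(ramp_time_s(950, 1000, 500))   # ~0.00109 s
print(ramp_time_s(1000, 970, 500))   # ~0.00066 s
```

Either way the ramps are sub-millisecond, which is why both runs should look essentially like constant-velocity moves.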
Hi, first of all thank you very much for your work. This is more a question than an issue. It is about the velocity of the TMC5160, which I did not fully understand. In the datasheet, the unit of velocity is [microsteps / t], where t = 2^24 / f_CLK and f_CLK is 12 MHz by default, so t = 1.3981 s. Thus, when we set the velocity to, for example, 1000, the motor should move 1000 microsteps in "at least" 1.3981 seconds (at least, because of acceleration and deceleration: the first and last steps are slower than the maximum speed reached in the middle).
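A quick back-of-envelope check of those numbers, assuming the nominal 12 MHz clock:

```python
F_CLK = 12_000_000             # nominal internal clock, Hz
t = 2**24 / F_CLK              # internal time unit, seconds
print(t)                       # ~1.3981

VMAX = 1000                    # velocity register value, usteps per t
v_real = VMAX * F_CLK / 2**24  # usteps per second
print(v_real)                  # ~715.3 usteps/s
print(1000 / v_real)           # time for 1000 usteps at VMAX: ~1.3981 s
```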
For example, with the following parameters: VSTART = 1, A1 = 25000, V1 = 250000, AMAX = 5000, VMAX = 1000, DMAX = 5000, D1 = 50000, VSTOP = 10, I measure the time needed to move the stepper motor 1000 microsteps as 1.5 seconds (0.1 second more than 1.3981, due to the acceleration and deceleration).
BUT when I set the parameters to VSTART = 950, V1 = 0, AMAX = 500, VMAX = 1000, DMAX = 500, D1 = 100, VSTOP = 970, I measure the time to move the stepper motor 1000 microsteps as 1.3762 seconds, which is even less than the maximum-velocity limit of 1000 microsteps per 1.3981 seconds.
How can I understand this weird value?
Did anyone work out a conversion factor to express the velocity in microsteps per standard unit of 1 second?