darrabito opened 3 years ago
I'll look into it. Could you provide a little more context? Have you made any changes to the firmware? Does running the calibration routine work, or does it have the same behavior?
I found that the calibration routine was incorrectly detecting PHASE_ORDER, so I commented out
PHASE_ORDER = cal->phase_order;
in order_phases() and hardcoded that value. (EDIT: Motor pins were flipped)
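To make the workaround concrete, here is a minimal sketch of what I mean (the global and the commented-out assignment come from order_phases(); the surrounding structure and the hardcoded value are only illustrative):

```c
/* Sketch only, not the repo's actual code. */
static int PHASE_ORDER = 0;                  /* phase order used by the commutation code */

static void apply_phase_order(int detected_order)
{
    /* Original behavior: trust the calibration result,
     *   PHASE_ORDER = cal->phase_order;
     * Workaround: ignore the (mis-)detected value and hardcode the known-good one. */
    (void)detected_order;
    PHASE_ORDER = 1;                         /* hypothetical known-good value for this motor */
}
```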
Calibration now performs somewhat as expected (a bit noisy/grindy, but it moves slowly in one direction and then back). However, entering motor mode makes the motor 'grind' rapidly back and forth over a very small range, as I mentioned previously. This occurs with a low velocity command or simply on entering motor mode from serial.
The phase order and ppairs seemed to be mostly responsible for the incredible current draw. The effect is lessened now, with the controller drawing ~1-1.5A while grinding in motor mode and ~500-700mA during calibration. However, it will occasionally dump all available current when a velocity command is first issued, then start grinding at the lower current. On entering motor mode now in debug, after reset_foc() runs and the DRV is enabled, it starts out drawing only ~350mA and very slowly increases the draw. Commutation doubles the current draw and increases the rate at which it climbs. With certain KP and KD ranges I am able to get a noisy spin in one direction (while it occasionally jitters back and forth or jumps back a long way before going the correct direction). It still draws max current while spinning, but it "spins" nonetheless.
Perhaps in commutate(), you meant to do
controller->dtheta_mech = encoder->velocity/GR;
instead of
controller->dtheta_mech = encoder->velocity*GR;
This change allowed the motor to spin more freely without jumping backwards, though it still grinds and draws a large amount of current. Higher KD_MAX values let the motor spin more, while as KD_MAX approaches 0 it becomes more locked. Decreasing the max current limit to 10A in software makes the motor less noisy since it then operates at only about 600mA, but it does not turn at lower currents (it locks and jitters).
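To spell out the reasoning (a sketch only; GR = 9 is the AK80-9 reduction ratio, and the helper name is mine): the encoder sits on the rotor, and the output shaft turns GR times slower than the rotor, so the mechanical output velocity should be the encoder velocity divided by the gear ratio, not multiplied by it.

```c
#define GR 9.0f                              /* gearbox reduction ratio (9:1 on the AK80-9) */

/* Convert rotor-side (encoder) velocity to output-shaft velocity. */
static float rotor_to_output_velocity(float encoder_velocity)
{
    return encoder_velocity / GR;            /* output turns GR times slower than the rotor */
}
```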
I changed analog_sample() to set i_a, i_b, and i_c the way it is done in the old firmware version (I additionally switched the motor pins back to what they were in the old version). Calibration still works but is not perfect (the motor "pulsates" instead of moving at a constant velocity), although the encoder output looks good.
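Roughly, the shape of that change is the following (a sketch, not the actual firmware; the offsets, scale factor, and function name are assumptions): subtract the zero-current ADC offsets, scale the counts to amps, and reconstruct the third phase from the other two.

```c
#define I_SCALE 0.02f                        /* hypothetical amps per ADC count (shunt + amp gain) */

static void sample_phase_currents(int adc_b, int adc_c, int offset_b, int offset_c,
                                  float *i_a, float *i_b, float *i_c)
{
    *i_b = I_SCALE * (float)(adc_b - offset_b);
    *i_c = I_SCALE * (float)(adc_c - offset_c);
    *i_a = -*i_b - *i_c;                     /* the three phase currents sum to zero */
}
```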
The system is mostly stable now after I tuned a few more constants (the big changes are the ADC sample time from 3 to 15 cycles, the TIM1 period to 0x1194, and DT to 0.00005) and set overmodulation to 1.0. I would now like to account for voltage discharge as well. However, if I set the field-weakening current > 0A, the system exhibits large velocity oscillations at the target velocity and still doesn't reach the target at lower voltages (although top speed is a bit higher than without field weakening). The oscillations get larger and faster as the forward current is increased.
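For anyone following along, those changes would look roughly like this in a configuration header (the macro names are my guesses, the values are the ones described above, and the HAL sample-time constant depends on the exact STM32 part):

```c
#define ADC_SAMPLE_TIME   ADC_SAMPLETIME_15CYCLES   /* up from 3 cycles */
#define TIM1_ARR          0x1194                    /* PWM timer period (auto-reload value) */
#define DT                0.00005f                  /* control-loop period in seconds */
#define OVERMODULATION    1.0f                      /* allow full modulation depth */
```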
I'm experiencing similar issues with a T-motor AK80-9. The motor runs fine using this older binary, but I get lots of grinding noise and very little movement with the firmware here. I've tried to implement the changes mentioned, without luck.
@darrabito Did you make any other changes as well? I wasn't sure what you meant by switching the motor pins, so I haven't done anything there yet. However, I did try swapping the PHASE_ORDER, which didn't do anything good other than producing some magic smoke after a while :)
@bgkatz Do you know of any hardware changes that might account for firmware differences between this branch and the previous ones?
Me too. Did you guys solve the problem? @darrabito @bgkatz @jljakob
__HAL_TIM_SET_COMPARE(&TIM_PWM, TIM_CH_U, ((TIM_PWM.Instance->ARR))*dtc_u);
In set_dtc() or reset_foc(), if the DRV is enabled, the motor dumps max current from the power supply into the FETs on a known-working controller (it works with version 1.9 on mbed).
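For reference, here is that write for all three phases, plus the neutral state I would expect on reset (my assumption by analogy, not the repo's code: TIM_CH_V/TIM_CH_W mirror TIM_CH_U): all three phases at 50% duty, so no line-to-line voltage is applied even with the DRV enabled.

```c
static void set_dtc_sketch(float dtc_u, float dtc_v, float dtc_w)
{
    __HAL_TIM_SET_COMPARE(&TIM_PWM, TIM_CH_U, (uint32_t)((TIM_PWM.Instance->ARR) * dtc_u));
    __HAL_TIM_SET_COMPARE(&TIM_PWM, TIM_CH_V, (uint32_t)((TIM_PWM.Instance->ARR) * dtc_v));
    __HAL_TIM_SET_COMPARE(&TIM_PWM, TIM_CH_W, (uint32_t)((TIM_PWM.Instance->ARR) * dtc_w));
}

/* e.g. set_dtc_sketch(0.5f, 0.5f, 0.5f) on reset should leave the motor unpowered;
 * a max-current dump at that point suggests the duty scaling or the compare write itself. */
```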