LinuxCNC / linuxcnc

LinuxCNC controls CNC machines. It can drive milling machines, lathes, 3D printers, laser cutters, plasma cutters, robot arms, hexapods, and more.
http://linuxcnc.org/
GNU General Public License v2.0

Gmoccapy MSG: Must be in MDI mode..... #2453

Closed. zz912 closed this issue 1 year ago.

zz912 commented 1 year ago

I use a run-in-place (RIP) build of LinuxCNC, branch 2.9.

To simulate the problem:

1) Edit /home/user/linuxcnc/linuxcnc-2.9/configs/sim/gmoccapy/gmoccapy.ini and add:

[HALUI]
MDI_COMMAND = M61 Q5
MDI_COMMAND = M61 Q2

2) Run LinuxCNC. Gmoccapy shows the message "Must be in MDI mode".

3) The error is random, so it may not appear on the first try. Keep switching into MDI mode and toggling the HAL pins halui.mdi-command-00 and halui.mdi-command-01.
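For step 3, the pins can be toggled from a terminal with halcmd. This is just one possible way to do it (it assumes the two pins are not linked to a signal in your HAL files, since setp refuses to write a connected pin; halui triggers the MDI command on the rising edge):

```sh
# trigger MDI_COMMAND 0 (M61 Q5), then release the pin
halcmd setp halui.mdi-command-00 true
halcmd setp halui.mdi-command-00 false

# trigger MDI_COMMAND 1 (M61 Q2), then release the pin
halcmd setp halui.mdi-command-01 true
halcmd setp halui.mdi-command-01 false
```

Repeating these while switching modes in the GUI should eventually reproduce the message.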

MDI commands run fine, but sometimes this message pops up.

I have more sophisticated commands on the real machine. I was in MDI-mode the whole time when I tuned them and never had this problem. However, for normal use of my commands, I want to be in JOG mode and that's where the message appears.

zz912 commented 1 year ago

I added debug prints here: https://github.com/LinuxCNC/linuxcnc/blob/beacb3c0572eee72d0faf80cb736610059c37768/src/emc/usr_intf/halui.cc#L2135

    if (halui_sent_mdi) { // we have an ongoing MDI command
        if (emcStatus->status == 1) { //which seems to have finished
            halui_sent_mdi = 0;
            switch (halui_old_mode) {
            case EMC_TASK_MODE_MANUAL: 
                fprintf(stderr,"***********************************\n");
                fprintf(stderr,"HAF HAF MANUAL MODE will be set\n");
                fprintf(stderr,"***********************************\n");
                sendManual();
                fprintf(stderr,"***********************************\n");
                fprintf(stderr,"HAF HAF MANUAL MODE was set\n");
                fprintf(stderr,"***********************************\n");
                break;
            case EMC_TASK_MODE_MDI: break;
            case EMC_TASK_MODE_AUTO: sendAuto();break;
            default: sendManual();break;
            }
        }
    }

I found where "emcStatus->status" is set:

https://github.com/LinuxCNC/linuxcnc/blob/56883b969838b721c0b0ebe12b9c14abf91d640e/src/emc/task/emctaskmain.cc#L3551
https://github.com/LinuxCNC/linuxcnc/blob/56883b969838b721c0b0ebe12b9c14abf91d640e/src/emc/task/emctaskmain.cc#L3561
https://github.com/LinuxCNC/linuxcnc/blob/56883b969838b721c0b0ebe12b9c14abf91d640e/src/emc/task/emctaskmain.cc#L3564

RCS_DONE = 1, RCS_EXEC = 2, RCS_ERROR = 3

So I added debug prints here: https://github.com/LinuxCNC/linuxcnc/blob/56883b969838b721c0b0ebe12b9c14abf91d640e/src/emc/task/emctaskmain.cc#L3572

    emcStatusBuffer->write(emcStatus);

    printf("!taskPlanError: %i\n", !taskPlanError);
    printf("!taskExecuteError: %i\n", !taskExecuteError);
    printf("emcStatus->task.execState == EMC_TASK_EXEC_DONE: %i\n", emcStatus->task.execState == EMC_TASK_EXEC_DONE);
    printf("emcStatus->motion.status == RCS_DONE: %i\n", emcStatus->motion.status == RCS_DONE); 
    printf("emcStatus->io.status == RCS_DONE: %i\n", emcStatus->io.status == RCS_DONE);
    printf("mdi_execute_queue.len() == 0: %i\n", mdi_execute_queue.len() == 0);
    printf("interp_list.len() == 0: %i\n", interp_list.len() == 0);
    printf("emcTaskCommand == 0: %i\n", emcTaskCommand == 0);
    printf("emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: %i\n", emcStatus->task.interpState == EMC_TASK_INTERP_IDLE);       
    printf("HAF HAF emcStatus->status: %i\n\n\n", emcStatus->status);

Result without bug:

1) emcStatus->status: 1
2) set manual mode
3) emcStatus->status: 2 (only 2 times)
4) emcStatus->status: 1 (many times)
5) set manual mode finished

!taskPlanError: 1
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 1

[... the same block (all conditions 1, emcStatus->status: 1) repeats twice more ...]

***********************************
HAF HAF MANUAL MODE will be set
***********************************
!taskPlanError: 1
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 0
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 0
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 2

!taskPlanError: 1
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 0
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 2

!taskPlanError: 1
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 1

[... the same block (all conditions 1, emcStatus->status: 1) repeats many more times ...]

***********************************
HAF HAF MANUAL MODE was set
***********************************

Result with bug:

1) emcStatus->status: 1
2) set manual mode
3) emcStatus->status: 2 (only once)
4) bug: "Must be in MDI mode to issue MDI command"
5) emcStatus->status: 3 (many times)
6) set manual mode finished

!taskPlanError: 1
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 1

!taskPlanError: 1
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 1

***********************************
HAF HAF MANUAL MODE will be set
***********************************
!taskPlanError: 1
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 0
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 0
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 2

Must be in MDI mode to issue MDI command
!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 0
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

!taskPlanError: 0
!taskExecuteError: 1
emcStatus->task.execState == EMC_TASK_EXEC_DONE: 1
emcStatus->motion.status == RCS_DONE: 1
emcStatus->io.status == RCS_DONE: 1
mdi_execute_queue.len() == 0: 1
interp_list.len() == 0: 1
emcTaskCommand == 0: 1
emcStatus->task.interpState == EMC_TASK_INTERP_IDLE: 1
HAF HAF emcStatus->status: 3

***********************************
HAF HAF MANUAL MODE was set
***********************************
zz912 commented 1 year ago

I continued to search for the bug. I edited this part: https://github.com/LinuxCNC/linuxcnc/blob/beacb3c0572eee72d0faf80cb736610059c37768/src/emc/usr_intf/halui.cc#L450

static int emcCommandSend(RCS_CMD_MSG & cmd)
{
    // write command
    if (emcCommandBuffer->write(&cmd)) {
        rtapi_print("halui: %s: error writing to Task\n", __func__);
        return -1;
    }
    emcCommandSerialNumber = cmd.serial_number;

    // wait for receive
    double end;
    for (end = 0.0; end < receiveTimeout; end += EMC_COMMAND_DELAY) {

        updateStatus();
        fprintf(stderr,"HAF HAF 1;\n");

        int serial_diff = emcStatus->echo_serial_number - emcCommandSerialNumber;

        if (serial_diff >= 0) {
            return 0;
        }
        fprintf(stderr,"HAF HAF 2;\n");
        esleep(EMC_COMMAND_DELAY);
        fprintf(stderr,"HAF HAF 3;\n");
    }

    rtapi_print("halui: %s: no echo from Task after %.3f seconds\n", __func__, receiveTimeout);
    return -1;
}

Result:

HAF HAF 1;
HAF HAF 2;
Must be in MDI mode to issue MDI command
HAF HAF 3;
HAF HAF 1;

Possible interpretation of the problem:
1) a request to change the mode to MANUAL MODE is sent
2) esleep gives emctaskmain.cc the opportunity to run

zz912 commented 1 year ago

Hello everybody,

1) Please ignore my recent posts. My idea was that if the bug appears between two of my "HAF HAF" prints, that is where something is wrong. This method of finding bugs has worked for me in the past. Unfortunately, it is useless for finding race conditions. I had to start looking for the bug all over again.

2) To explain the bug, we'll start from the beginning. We will use a clean install of LCNC 2.8 and above.

3) edit /home/user/linuxcnc/linuxcnc-2.9/configs/sim/gmoccapy/gmoccapy.ini and add:

[HALUI]
MDI_COMMAND = M61 Q5
MDI_COMMAND = M61 Q2

4) It is very important that MDI_COMMAND contains the automatic tool command. Otherwise, the bug will not happen.

5) Now I will describe the race without bug:

We will change mode to MANUAL.

HALUI: change mode to MDI
GMOCCAPY: -

HALUI: start MDI_COMMAND
GMOCCAPY: -

HALUI: execute command M61 Q5
GMOCCAPY: run on_hal_status_tool_in_spindle_changed https://github.com/LinuxCNC/linuxcnc/blob/56883b969838b721c0b0ebe12b9c14abf91d640e/src/emc/usr_intf/gmoccapy/gmoccapy.py#L2643-L2647

HALUI: finish MDI_COMMAND
GMOCCAPY: -

HALUI: change mode to old_mode (MANUAL)
GMOCCAPY: -

HALUI: -
GMOCCAPY: run _update_toolinfo https://github.com/LinuxCNC/linuxcnc/blob/56883b969838b721c0b0ebe12b9c14abf91d640e/src/emc/usr_intf/gmoccapy/gmoccapy.py#L3497

HALUI: -
GMOCCAPY: run G43 + change MDI mode https://github.com/LinuxCNC/linuxcnc/blob/56883b969838b721c0b0ebe12b9c14abf91d640e/src/emc/usr_intf/gmoccapy/gmoccapy.py#L3540-L3544

6) Now I will describe the race with bug:

We will change mode to MANUAL.

HALUI: change mode to MDI
GMOCCAPY: -

HALUI: start MDI_COMMAND
GMOCCAPY: -

HALUI: execute command M61 Q5
GMOCCAPY: run on_hal_status_tool_in_spindle_changed

HALUI: -
GMOCCAPY: run _update_toolinfo

Now BOOM: emctaskmain gets the command from HALUI to finish the MDI_COMMAND and change the mode back to old_mode (MANUAL), and at the same time it gets the command from GMOCCAPY to run G43 in MDI mode. The result of this BOOM is the message: Must be in MDI mode to issue MDI command
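
The two interleavings can be sketched as a toy, deterministic model (plain Python; the class and method names are hypothetical and only mirror the sequence of events described above, not the real task controller):

```python
# Toy model of the race: a "task" that only accepts MDI input while in MDI mode.
class Task:
    def __init__(self):
        self.mode = "MANUAL"
        self.errors = []

    def set_mode(self, mode):
        self.mode = mode

    def mdi(self, cmd):
        # mirrors the check in emctaskmain.cc: reject MDI input outside MDI mode
        if self.mode != "MDI":
            self.errors.append("Must be in MDI mode to issue MDI command")
        # (command execution elided)

task = Task()

# Good interleaving: gmoccapy issues G43 before halui restores the old mode.
task.set_mode("MDI")        # halui enters MDI for its MDI_COMMAND
task.mdi("M61 Q5")          # halui command runs
task.mdi("G43")             # gmoccapy reacts while the task is still in MDI
task.set_mode("MANUAL")     # halui restores the old mode
assert task.errors == []

# Bad interleaving: halui restores MANUAL first, then gmoccapy's G43 arrives.
task.set_mode("MDI")
task.mdi("M61 Q5")
task.set_mode("MANUAL")     # halui restores old_mode too early
task.mdi("G43")             # gmoccapy's deferred G43 now hits MANUAL mode
print(task.errors)          # → ['Must be in MDI mode to issue MDI command']
```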

7) Theoretical solution: remove these lines: https://github.com/LinuxCNC/linuxcnc/blob/56883b969838b721c0b0ebe12b9c14abf91d640e/src/emc/usr_intf/gmoccapy/gmoccapy.py#L3540-L3544 As soon as I deleted these lines, the bug no longer appeared. This change is only suitable for testing my theory, not as a fix.

8) A suggested solution: "def on_hal_status_tool_in_spindle_changed(self, object, new_tool_no):" should only act when the main task is IDLE.

9) Next problem: these lines leave MDI mode. https://github.com/LinuxCNC/linuxcnc/blob/56883b969838b721c0b0ebe12b9c14abf91d640e/src/emc/usr_intf/gmoccapy/gmoccapy.py#L3540-L3544 The edit should look something like this:

       if "G43" in self.active_gcodes and self.stat.task_mode != linuxcnc.MODE_AUTO:
            old_mode = self.stat.task_mode
            self.command.mode(linuxcnc.MODE_MDI)
            self.command.wait_complete()
            self.command.mdi("G43")
            self.command.wait_complete()
            self.command.mode(old_mode)
            self.command.wait_complete()
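
The save/restore pattern above could also be wrapped in a context manager so the old mode is restored even if the MDI call raises. A minimal sketch with a stub command object (the real code would use `self.command` from the linuxcnc Python module; the stub and constants here are stand-ins):

```python
from contextlib import contextmanager

MODE_MANUAL, MODE_MDI = 1, 2  # stand-ins for the linuxcnc module constants

class StubCommand:
    """Minimal stand-in for linuxcnc.command that records calls."""
    def __init__(self):
        self.current_mode = MODE_MANUAL
        self.log = []
    def mode(self, m):
        self.current_mode = m
        self.log.append(("mode", m))
    def wait_complete(self):
        self.log.append(("wait",))
    def mdi(self, code):
        self.log.append(("mdi", code))

@contextmanager
def mdi_mode(command, old_mode):
    """Enter MDI mode, then restore old_mode afterwards, even on error."""
    command.mode(MODE_MDI)
    command.wait_complete()
    try:
        yield
    finally:
        command.mode(old_mode)
        command.wait_complete()

cmd = StubCommand()
with mdi_mode(cmd, MODE_MANUAL):
    cmd.mdi("G43")
    cmd.wait_complete()
print(cmd.current_mode == MODE_MANUAL)  # → True
```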

I hope I managed to explain the bug.

zz912 commented 1 year ago

I looked in the history and the lines executing the G43 command in _update_toolinfo have been in Gmoccapy since the beginning.

Commits on Feb 28, 2014 https://github.com/LinuxCNC/linuxcnc/commit/579797703e6da6f026ed03896cad78be191b3dec#diff-f312cf11426f38ce14f58116be820e007813335ec2fc89eab9bcc7a6913808a9 https://github.com/LinuxCNC/linuxcnc/blob/579797703e6da6f026ed03896cad78be191b3dec/src/emc/usr_intf/gmoccapy/gmoccapy.py#L1950-L1955

And this check is even older than Gmoccapy. https://github.com/LinuxCNC/linuxcnc/blob/eea3b99b90a05bac68bef549042c208905d35720/src/emc/task/emctaskmain.cc#L2016-L2019

It's strange that the bug didn't show up earlier.

I am thinking about how to make a fix for this bug. I would like to ask for an explanation: What are the lines executing the G43 command in _update_toolinfo for?

I can think of several solutions to this bug, but I don't want to create more bugs with my fix. Anyway, I don't think it's a good idea to execute any commands in the "toolinfo" function.

zz912 commented 1 year ago

I tried to modify Gmoccapy to remove the bug. I came across another weird thing: sometimes the Python IDLE status arrives before the HALUI IDLE status.

I modified Gmoccapy.py:

    def on_hal_status_interp_idle(self, widget):
        self.stat.poll()
        LOG.debug("IDLE")
        print("HAF HAF - is it realy idle?")
        print(hal.get_value("halui.program.is-idle"))
        print(self.stat.interp_state)
        print(linuxcnc.INTERP_IDLE)

Result without bug:

[Gmoccapy][DEBUG]  IDLE (gmoccapy:2576)
HAF HAF - is it realy idle?
True
1
1

Result with bug:

[Gmoccapy][DEBUG]  IDLE (gmoccapy:2576)
HAF HAF - is it realy idle?
False
1
1

I think there is a bug in the definition of stat.interp_state. Can anyone advise where to look for the stat.interp_state source for python?

gmoccapy commented 1 year ago

Why do you think the interpreter state is the problem? The output of the two messages differs only in the halui HAL pin; the interpreter state is the same in both outputs. So if you want to go deeper, I would suggest looking at the halui HAL pin side. IMHO this "behavior" is caused by a race conflict.

Norbert

zz912 commented 1 year ago

Thank you Norbert for your response.

Why do you think the interpreter state is the problem?

I know the interpreter status is related to this error. I don't know whether the interpreter state is the result or the cause of this error. I currently assume it is the cause, and I am trying to confirm or disprove that assumption.

I made another attempt. I added these lines to gmoccapy.py:

    def _periodic(self):
        # we put the poll command in a try, so if the linuxcnc pid is killed
        # from an external command, we also quit the GUI
        try:
            self.stat.poll()
        except:
            raise SystemExit("gmoccapy can not poll linuxcnc status any more")

        if hal.get_value("halui.program.is-idle") == False or self.stat.interp_state !=1:
            print("halui.program.is-idle: %s" % hal.get_value("halui.program.is-idle"))
            print("self.stat.interp_state: %i" % self.stat.interp_state)

Result 1: (Look at the last line)

halui.program.is-idle: False
self.stat.interp_state: 2
halui.program.is-idle: False
self.stat.interp_state: 2
............
............
halui.program.is-idle: False
self.stat.interp_state: 2
halui.program.is-idle: False
self.stat.interp_state: 2
halui.program.is-idle: False
self.stat.interp_state: 1

Result 2:

halui.program.is-idle: False
self.stat.interp_state: 2
halui.program.is-idle: False
self.stat.interp_state: 2
............
............
halui.program.is-idle: False
self.stat.interp_state: 2
halui.program.is-idle: False
self.stat.interp_state: 2
halui.program.is-idle: False
self.stat.interp_state: 2

These results were generated during the MDI_COMMAND run.

The results of this test can be explained by two theories.
Theory 1: self.stat.interp_state works fine, halui.program.is-idle is just slower, and I am on the wrong track.
Theory 2: halui.program.is-idle works correctly and self.stat.interp_state switches to IDLE earlier than it should.

My guess is that theory 2 is the correct one. I don't have proof for it, but a lot of other experiments suggest that it might be.
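
Theory 2 can at least be modelled: if the status channel and the HAL pin are published in different update cycles, a reader polling in between sees exactly the inconsistent pair from Result 1. A toy model (hypothetical class, not the real update path):

```python
# Toy model: interp_state is published one cycle before the halui pin.
INTERP_IDLE, INTERP_READING = 1, 2

class ToyStatus:
    def __init__(self):
        self.interp_state = INTERP_READING
        self.halui_is_idle = False
        self._cycle = 0

    def run_cycle(self):
        # interp_state flips to IDLE in cycle 1, the HAL pin only in cycle 2
        self._cycle += 1
        if self._cycle >= 1:
            self.interp_state = INTERP_IDLE
        if self._cycle >= 2:
            self.halui_is_idle = True

s = ToyStatus()
s.run_cycle()
# A reader polling now sees the inconsistent combination from "Result 1":
print(s.interp_state, s.halui_is_idle)  # → 1 False
s.run_cycle()
print(s.interp_state, s.halui_is_idle)  # → 1 True
```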

So if you want to go deeper, I would suggest looking at the halui HAL pin side.

I have already studied this:

Source code for MDI_COMMAND finish is here: https://github.com/LinuxCNC/linuxcnc/blob/7baecb67c09951fc34965c53f041e9ff26d87541/src/emc/task/emctaskmain.cc#L683-L699

Source code for halui IDLE is here: https://github.com/LinuxCNC/linuxcnc/blob/7baecb67c09951fc34965c53f041e9ff26d87541/src/emc/usr_intf/halui.cc#L2177

But now I would like to find the Source code for Python IDLE. Unfortunately I don't know where to look.

zz912 commented 1 year ago

I bought a new, more powerful computer (DELL 9010 SFF: Intel i5, 16 GB RAM, 240 GB SSD) to rule this problem out. I installed Bookworm on it. Unfortunately, the problem still persists.

It is strange that you are not able to simulate the problem. On the other hand, I believe you, because this bug hid from me for two Sundays. It is insidious.

I have been working on this issue since Apr 27. Is there anything that could motivate you developers to fix the bug?

Sigma1912 commented 1 year ago

I can reproduce this on a 2.10pre build from March. This is a simulation machine without an RT kernel. What I have noticed is that sometimes, while in the jogging screen, changing the mdi-command-xx pins in halshow will switch to the MDI screen but not go back to the jogging screen as it usually does.

zz912 commented 1 year ago

Hello everybody,

I spent this evening again looking for the source of this bug.

I am convinced that the source of this bug is faulty behavior of EMC_TASK_INTERP. I spent tonight trying to prove that EMC_TASK_INTERP works badly.

I would like to ask you to confirm or refute my theory.

Since I don't want to waste your precious time, I have prepared an attempt that has no dependence on previous posts. If you are willing to help me, just read only this post.

In this part of the Gmoccapy code https://github.com/LinuxCNC/linuxcnc/blob/67b8140c2788073cf9f95e4bef67271b1ea53e0c/src/emc/usr_intf/gmoccapy/gmoccapy.py#L2572-L2622 I would assume that the commands are only executed when LCNC is IDLE. QUESTION 1: Is my assumption correct?

To verify that LCNC is in IDLE, I added the following lines to Gmoccapy's source code:

    def on_hal_status_interp_idle(self, widget):
        print("HAF HAF - linuxcnc.INTERP_IDLE = " + str(linuxcnc.INTERP_IDLE))
        print("HAF HAF - linuxcnc.INTERP_READING = " + str(linuxcnc.INTERP_READING))
        print("HAF HAF - linuxcnc.INTERP_PAUSED = " + str(linuxcnc.INTERP_PAUSED))
        print("HAF HAF - linuxcnc.INTERP_WAITING = " + str(linuxcnc.INTERP_WAITING))
        print("HAF HAF - now is " + str(self.stat.interp_state))

I would expect self.stat.interp_state to be equal to linuxcnc.INTERP_IDLE. QUESTION 2: Is my expectation correct?

Now let's watch the video (screen recording: Peek 2023-06-14 21-37).

When we look at the video, we see that LCNC is mostly READING (2), and at the end of the video LCNC is IDLE (1). QUESTION 3: Can LCNC being in READING (2) here be considered a bug? QUESTION 4: Can the random alternation between IDLE (1) and READING (2) be considered a bug?

This video shows mostly state 2 and once 1. During longer testing, states 1 and 2 appear more randomly.

Please do not take this post of mine as offensive or sarcastic. I really just need help. I am very unhappy with this bug.

I asked the questions in such a way that they can be answered with a simple yes/no and do not delay you.
QUESTION 1: yes/no
QUESTION 2: yes/no
QUESTION 3: yes/no
QUESTION 4: yes/no

phillc54 commented 1 year ago

Do you need to poll the status channel to ensure that it is up to date before reading it? self.stat.poll()
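
The point behind this suggestion is that `linuxcnc.stat` is a snapshot: its attributes keep their old values until `poll()` copies fresh data from the status channel. A stub illustration (the `StubStat` class and the dict standing in for the status buffer are hypothetical, not the real implementation):

```python
INTERP_IDLE, INTERP_READING = 1, 2

class StubStat:
    """Mimics linuxcnc.stat: attributes are a cached snapshot until poll()."""
    def __init__(self, channel):
        self._channel = channel          # dict standing in for the status buffer
        self.interp_state = INTERP_IDLE  # stale cached value
    def poll(self):
        self.interp_state = self._channel["interp_state"]

channel = {"interp_state": INTERP_READING}
stat = StubStat(channel)

print(stat.interp_state)  # → 1 (stale: the snapshot has not been refreshed)
stat.poll()
print(stat.interp_state)  # → 2 (up to date after poll())
```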

zz912 commented 1 year ago

Do you need to poll the status channel to ensure that it is up to date before reading it? self.stat.poll()

Oh yeah, you're right. I'm an idiot :-(.

3 2
HAF HAF - self.stat.poll() was executed
HAF HAF - linuxcnc.INTERP_IDLE = 1
HAF HAF - linuxcnc.INTERP_READING = 2
HAF HAF - linuxcnc.INTERP_PAUSED = 3
HAF HAF - linuxcnc.INTERP_WAITING = 4
HAF HAF - now is 1
Must be in MDI mode to issue MDI command
3 2
Must be in MDI mode to issue MDI command
HAF HAF - self.stat.poll() was executed
HAF HAF - linuxcnc.INTERP_IDLE = 1
HAF HAF - linuxcnc.INTERP_READING = 2
HAF HAF - linuxcnc.INTERP_PAUSED = 3
HAF HAF - linuxcnc.INTERP_WAITING = 4
HAF HAF - now is 1
3 2
3 2
HAF HAF - self.stat.poll() was executed
HAF HAF - linuxcnc.INTERP_IDLE = 1
HAF HAF - linuxcnc.INTERP_READING = 2
HAF HAF - linuxcnc.INTERP_PAUSED = 3
HAF HAF - linuxcnc.INTERP_WAITING = 4
HAF HAF - now is 1
3 2
HAF HAF - self.stat.poll() was executed
HAF HAF - linuxcnc.INTERP_IDLE = 1
HAF HAF - linuxcnc.INTERP_READING = 2
HAF HAF - linuxcnc.INTERP_PAUSED = 3
HAF HAF - linuxcnc.INTERP_WAITING = 4
HAF HAF - now is 1
zz912 commented 1 year ago

Sigma1912

I can reproduce this on a 2.10pre build from March. This is a simulation machine without an RT kernel. What I have noticed is that sometimes, while in the jogging screen, changing the mdi-command-xx pins in halshow will switch to the MDI screen but not go back to the jogging screen as it usually does.

This bug is possibly the cause of other problems. If it is not resolved, we cannot move on. For example, when this error occurs, nonsensical tool corrections are displayed (screenshot: Wrong_correction).

That's why I'm sorry that there is no interest in fixing this bug, because it makes using LCNC with an ATC very dangerous.

Sigma1912 commented 1 year ago

AXIS gui does not seem to have this bug; would that not indicate that the problem is with gmoccapy rather than with python stat or halui?

zz912 commented 1 year ago

AXIS gui does not seem to have this bug,

This bug is really insidious; the fact that it did not show up in AXIS does not mean that it is not there. I'm paranoid. I've been looking for this bug for a long time.

Assuming this bug is not in AXIS, we can rule out halui.

I think AXIS doesn't use python stat. Is that so? That's why I didn't rule out python stat.

phillc54 commented 1 year ago

AXIS does use linuxcnc.stat L3540

zz912 commented 1 year ago

AXIS does use linuxcnc.stat L3540

Thank you. I did not know it.

Sigma1912 commented 1 year ago

I have not been able to reproduce this bug in AXIS despite changing mdi-command-xx about a hundred times in jog mode with zero issues, while in GMOCCAPY it pops up pretty much right away on my machine. Maybe comparing the relevant code of AXIS and Gmoccapy could indicate why one works while the other does not?

zz912 commented 1 year ago

Sigma1912 Can I ask you for a test? Could you remove the lines from point 7: https://github.com/LinuxCNC/linuxcnc/issues/2453#issuecomment-1565086089 ?

Sigma1912 commented 1 year ago

I'm running a RIP install, and I presumed that I could just alter the Python files and restart the config to test, yet it seems it is not using the updated code. Surely I don't need to recompile for Python code?

zz912 commented 1 year ago

When I make a change in the python file, I have to be in the terminal in the linuxcnc/src folder and I have to run make.

Sigma1912 commented 1 year ago

I see, I'll have to switch to another machine since this one has issues that give me errors while running make.

Sigma1912 commented 1 year ago

Just made a new RIP install on a different computer, but on that machine I cannot reproduce the issue at all, unfortunately.

zz912 commented 1 year ago

Try again after some time, after restarting the PC. Many times I celebrated that some modification of mine had fixed the bug, but it always reappeared.

zz912 commented 1 year ago

Hello everybody,

I feel like I've solved it again. However, I don't want to write here that I have a solution. Therefore, I will write that I have another theory of the cause of this bug.

It will be theory number 156:

Here the finish of an MDI_command is defined: https://github.com/LinuxCNC/linuxcnc/blob/56883b969838b721c0b0ebe12b9c14abf91d640e/src/emc/task/emctaskmain.cc#L683-L699

I believe that one condition is missing to determine the end of MDI_command. The condition halui_sent_mdi == 0 is missing there.

The halui_sent_mdi parameter is defined here: https://github.com/LinuxCNC/linuxcnc/blob/beacb3c0572eee72d0faf80cb736610059c37768/src/emc/usr_intf/halui.cc#L253

A situation may arise:
1) emctaskmain.cc sets emcStatus->task.interpState = EMC_TASK_INTERP_IDLE, but halui_sent_mdi in halui.cc is still 1
2) when emcStatus->task.interpState = EMC_TASK_INTERP_IDLE, Gmoccapy wants to set G43 and therefore sets MDI mode
3) halui_sent_mdi in halui.cc has the value 1, so the old_mode restore is started: https://github.com/LinuxCNC/linuxcnc/blob/beacb3c0572eee72d0faf80cb736610059c37768/src/emc/usr_intf/halui.cc#L2131-L2141
4) if old_mode was MANUAL, we are now in MANUAL mode, NOT MDI !!!
5) Gmoccapy executes the G43 command in MANUAL mode
6) https://github.com/LinuxCNC/linuxcnc/blob/eea3b99b90a05bac68bef549042c208905d35720/src/emc/task/emctaskmain.cc#L2016-L2019

I would like to ask for help in verifying this theory. I would like to add the condition halui_sent_mdi == 0 to finish MDI_command.

The problem is that the halui_sent_mdi parameter is in halui.cc and the finish MDI_command is in emctaskmain.cc. I don't know how to transfer parameters between two files. I guess it has to be done through the header file somehow?
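
The extra condition from this theory could be expressed as a predicate; a Python sketch (the function name and signature are hypothetical; the real check lives in C++ in emctaskmain.cc and would need the flag exported from halui somehow, which is exactly the open question above):

```python
def mdi_command_finished(interp_idle, exec_done, halui_sent_mdi):
    """True only when the interpreter is idle, execution is done,
    AND halui has finished its own MDI bookkeeping."""
    return interp_idle and exec_done and not halui_sent_mdi

# Without the halui flag, the finish would be reported one step too early;
# with it, the premature case is held back:
print(mdi_command_finished(True, True, halui_sent_mdi=True))   # → False
print(mdi_command_finished(True, True, halui_sent_mdi=False))  # → True
```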

zz912 commented 1 year ago

Theory number 157:

https://forum.linuxcnc.org/gmoccapy/49335-self-command-mdi-without-self-command-wait-complete?start=10#274170

gmoccapy commented 1 year ago

Sorry, I spent a whole day trying to reproduce the error, but was not able to on my laptop (i7 with Linux Mint and current kernel 6.2.0-43). I have also been using MDI commands for a 24-position rack tool change on a real machine for about 3 years and never had this problem.

It is very complicated to find a supposed misbehavior that occurs only randomly and only on very few computers. I will not spend more time on this at the moment. Rene-dev is currently rebuilding the complete tool handling in LinuxCNC and will throw out all the IOCONTROL stuff, so io will be gone before long.

By the way, are you using io or iov2? do not use iov2!

Closing this until getting a situation where reproducing is possible.

zz912 commented 1 year ago

Hello Norbert,

It's a shame you can't simulate it.

Rene-dev is currently rebuilding the complete tool handling in LinuxCNC

Does this change apply to version 2.9.0 which is coming out soon?

Will the Rene-dev remake cover this issue as well? https://github.com/LinuxCNC/linuxcnc/issues/2489

Please Re-open this Issue.

I would like to ask you to believe me that this problem is in these lines: https://github.com/LinuxCNC/linuxcnc/blob/56883b969838b721c0b0ebe12b9c14abf91d640e/src/emc/usr_intf/gmoccapy/gmoccapy.py#L3540-L3544 These lines cause race conditions. I realize that these lines have been in Gmoccapy since the beginning and no one has complained. Trust me, I am a bug magnet.

Once I removed these lines from gmoccapy.py, my problems with the M61 and M6 disappeared, not only this problem.

I have several Issues open here on github regarding the M6 and M61. It makes a bit of a mess because I didn't know all these Issues had this cause.

In some other issue, you wrote that it is LCNC's fault. I don't want to disagree with you, but I think there are two ways to look at it.

First way: your code is OK and the problem is deeper in LCNC. I think I cannot solve that myself; I was waiting for the LinuxCNC Meetup in Stuttgart.

Second way: we accept that LCNC is imperfect and introduce the rule that "def _update_toolinfo(self, tool):" must not run any Python interface command; it is only for updating the GUI. I was hoping that fixing RETAIN_G43 in the INI file might allow these lines to be deleted.

I think the conclusion of the Meetup in Stuttgart may not be a solution to this problem, but there should at least be agreement on whether it is a bug in LCNC or a bug in Gmoccapy (the Python interface).

Way 1 or way 2 can be correct. It is necessary to agree on one.

I'm sorry I couldn't make it to the Meetup in Stuttgart. My English is bad. I would like to know you, but I would not understand you.

gmoccapy commented 1 year ago

Rene's part will be in 2.10, not in 2.9! The G43 problem can be reproduced: every time you enter a tool with a Z offset of zero, the G43 is canceled, and that causes strange behavior. This will surely be fixed, but most likely not in 2.9!

I know it is a shame not to be able to solve every problem, but that is unfortunately a fact. Disabling those lines in the gmoccapy code might be the way for you to go. I am sure the lines do not cause the problem you described, as you reported the behavior also with a lower cycle time.

zz912 commented 1 year ago

Would it help if I somehow made my computer, where the error can be simulated, available to you? I don't know how to do that right now, but I can keep my computer on all the time.