plusk01 / airdamon_f3

Be the Matt Damon of the skies

Compiler optimization disabling VCP output #1

Open plusk01 opened 6 years ago

plusk01 commented 6 years ago

Not sure what is going on, but on the betafpv board, make clean && DEBUG= make flash breaks the VCP printf -- nothing prints! (However, as tested on the discovery board with VCP + UART, VCP reads still work. Also note that printf over UART still prints as expected under optimization.)

Note that if you add the following to the top of vcp.cpp, the computer will see random characters coming from the VCP:

#pragma GCC push_options
#pragma GCC optimize ("O0")
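
(For completeness: GCC's push_options/pop_options pragmas bracket a region, so with no matching pop the rest of the translation unit is compiled at -O0 as well. If only part of the file should be deoptimized, the pair would look roughly like this:)

#pragma GCC push_options
#pragma GCC optimize ("O0")
/* ... functions that must be compiled at -O0 ... */
#pragma GCC pop_options  /* restore the command-line optimization level */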

Also, perhaps change the optimization level from -O3 to -O2; see here.

plusk01 commented 6 years ago

I've narrowed it down (by applying __attribute__((optimize("O0"))) to the minimum number of functions) to the following two functions:

hw_config:281:

uint32_t __attribute__((optimize("O0"))) CDC_Send_DATA(uint8_t *ptrBuffer, uint32_t sendLength)
{
    /* Last transmission hasn't finished, abort */
    if (packetSent) {
        return 0;
    }

    // We can only put 64 bytes in the buffer
    if (sendLength > 64 / 2) {
        sendLength = 64 / 2;
    }

    // Try to load some bytes if we can
    if (sendLength) {
        UserToPMABufferCopy(ptrBuffer, ENDP1_TXADDR, sendLength);
        SetEPTxCount(ENDP1, sendLength);
        packetSent += sendLength;
        SetEPTxValid(ENDP1);
    }

    return sendLength;
}
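
One plausible failure mode (speculation on my part, not confirmed in this code base): if packetSent, or any flag written from the USB ISR and polled from the main loop, were reachable through a non-volatile declaration anywhere, -O3 is allowed to hoist the load out of the polling loop. A minimal sketch with hypothetical names:

#include <stdint.h>

/* Hypothetical sketch -- not code from this repo. `tx_busy` is written by
   the USB interrupt handler but declared WITHOUT volatile, so at -O3 the
   compiler may load it once and turn the wait loop into while(1). */
static uint8_t tx_busy = 1;  /* should be: static volatile uint8_t tx_busy; */

void usb_tx_complete_isr(void)
{
    tx_busy = 0;  /* cleared from interrupt context */
}

void wait_for_tx(void)
{
    while (tx_busy) {
        /* without volatile, this read can be hoisted out of the loop */
    }
}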

usb_prop.c:

void __attribute__((optimize("O0"))) Virtual_Com_Port_Reset(void)
{
    /* Set Virtual_Com_Port DEVICE as not configured */
    pInformation->Current_Configuration = 0;

    /* Current Feature initialization */
    pInformation->Current_Feature = Virtual_Com_Port_ConfigDescriptor[7];

    /* Set Virtual_Com_Port DEVICE with the default Interface*/
    pInformation->Current_Interface = 0;

    SetBTABLE(BTABLE_ADDRESS);

    /* Initialize Endpoint 0 */
    SetEPType(ENDP0, EP_CONTROL);
    SetEPTxStatus(ENDP0, EP_TX_STALL);
    SetEPRxAddr(ENDP0, ENDP0_RXADDR);
    SetEPTxAddr(ENDP0, ENDP0_TXADDR);
    Clear_Status_Out(ENDP0);
    SetEPRxCount(ENDP0, Device_Property.MaxPacketSize);
    SetEPRxValid(ENDP0);

    /* Initialize Endpoint 1 */
    SetEPType(ENDP1, EP_BULK);
    SetEPTxAddr(ENDP1, ENDP1_TXADDR);
    SetEPTxStatus(ENDP1, EP_TX_NAK);
    SetEPRxStatus(ENDP1, EP_RX_DIS);

    /* Initialize Endpoint 2 */
    SetEPType(ENDP2, EP_INTERRUPT);
    SetEPTxAddr(ENDP2, ENDP2_TXADDR);
    SetEPRxStatus(ENDP2, EP_RX_DIS);
    SetEPTxStatus(ENDP2, EP_TX_NAK);

    /* Initialize Endpoint 3 */
    SetEPType(ENDP3, EP_BULK);
    SetEPRxAddr(ENDP3, ENDP3_RXADDR);
    SetEPRxCount(ENDP3, VIRTUAL_COM_PORT_DATA_SIZE);
    SetEPRxStatus(ENDP3, EP_RX_VALID);
    SetEPTxStatus(ENDP3, EP_TX_DIS);

    /* Set this device to response on default address */
    SetDeviceAddress(0);

    bDeviceState = ATTACHED;
}

The variables in use seem to have the __IO specifier (which is #define'd to mean volatile), so I'm not sure what is going on...
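
For reference, the CMSIS core headers define these I/O qualifiers roughly as follows (exact definitions vary slightly by CMSIS version and C vs. C++):

#define __I   volatile const  /* read-only permission  */
#define __O   volatile        /* write-only permission */
#define __IO  volatile        /* read/write permission */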

plusk01 commented 6 years ago

It is interesting that if you let the compiler optimize one or both of the above functions, nothing prints, but a delay is noticeable (as in issue #2) if you are not attempting to consume the data, e.g., with miniterm.py. As soon as you connect, the delay in the blinking LED disappears, but you still cannot see anything being sent.