pkourany / SparkIntervalTimer

Interval Timer Library using Core hardware timers

Serial() issue with SparkIntervalTimer interrupts #1

Closed: aaraujo11 closed this issue 8 years ago

aaraujo11 commented 9 years ago

Hi @pkourany ,

I am experiencing some missing bytes when I'm reading data via Serial1, and sometimes even on Serial when I'm printing debug info. When I don't use the SparkIntervalTimer interrupt routines, everything works fine.
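
Roughly, my setup looks like this (a simplified sketch; the callback, period and names are placeholders rather than my exact code):

```cpp
#include "SparkIntervalTimer.h"

IntervalTimer sampleTimer;        // hardware timer provided by the library
volatile uint32_t ticks = 0;

void timerISR(void) {             // periodic interrupt handler, kept short
    ticks++;
}

void setup() {
    Serial.begin(115200);         // USB serial, used for debug prints
    Serial1.begin(115200);        // hardware UART where bytes go missing
    sampleTimer.begin(timerISR, 10000, uSec);   // 10,000 us = 10 ms period (100 Hz)
}

void loop() {
    while (Serial1.available()) { // this is where incoming bytes sometimes get lost
        char c = Serial1.read();
        Serial.print(c);          // echo to USB serial for debugging
    }
}
```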

Is there any solution for this, or is it simply something we have to live with when using interrupts?

Thanks!

pkourany commented 9 years ago

I have heard of this issue but have not had time to discuss it with the Spark engineers nor to try to debug it myself. I'll tag the Spark folks to see if they can help.

aaraujo11 commented 9 years ago

Ok, thanks. If you need further information, I'm glad to help. I'm up against a deadline, so if this can't be resolved soon I'll have to change my approach. I will try to debug the issue; meanwhile, if someone can give me some guidelines, they are welcome.

Just more info about my serial config:

USART_InitStructure.USART_BaudRate = 115200;
USART_InitStructure.USART_WordLength = USART_WordLength_8b;
USART_InitStructure.USART_StopBits = USART_StopBits_2;
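
For completeness, the surrounding init follows the usual Standard Peripheral Library pattern; everything below beyond the three lines I pasted (parity, flow control, mode, the USART instance, the RX interrupt enable) is illustrative rather than verbatim from my code:

```cpp
#include "stm32f10x.h"
#include "stm32f10x_usart.h"

USART_InitTypeDef USART_InitStructure;

USART_InitStructure.USART_BaudRate            = 115200;
USART_InitStructure.USART_WordLength          = USART_WordLength_8b;
USART_InitStructure.USART_StopBits            = USART_StopBits_2;
USART_InitStructure.USART_Parity              = USART_Parity_No;                // assumed
USART_InitStructure.USART_HardwareFlowControl = USART_HardwareFlowControl_None; // assumed
USART_InitStructure.USART_Mode                = USART_Mode_Rx | USART_Mode_Tx;

USART_Init(USART2, &USART_InitStructure);        // USART instance is illustrative
USART_ITConfig(USART2, USART_IT_RXNE, ENABLE);   // receive-not-empty interrupt, serviced by the USART IRQ handler
USART_Cmd(USART2, ENABLE);
```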

pkourany commented 9 years ago

Are you using the allocated timer or selecting a specific timer? I am not sure if TIMR2, 3 or 4 affect the timing on the USART. Also, depending on the frequency of timer interrupts, the priority defined in SparkIntervalTimer may be higher than the USART ones. So it's either a hardware issue or a software issue in the timer setup or IRQ servicing. My gut tells me it may be an interrupt priority issue.
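
For reference, the two ways of starting a timer with the library look roughly like this (period values are only examples, and the timer ID name is from memory, so check the header):

```cpp
#include "SparkIntervalTimer.h"

IntervalTimer autoTimer;
IntervalTimer fixedTimer;

void tickISR(void) {
    // keep ISR work minimal; long handlers can hold off other interrupts
}

void setup() {
    // auto-allocation: the library grabs the first free hardware timer
    autoTimer.begin(tickISR, 10000, uSec);          // 10 ms period

    // explicit selection: pin the callback to a specific hardware timer
    fixedTimer.begin(tickISR, 1000, uSec, TIMER3);  // 1 ms period on TIM3
}

void loop() {
}
```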

aaraujo11 commented 9 years ago

I have already tried the three timers independently (TIMR2, TIMR3 and TIMR4) as well as auto-allocation. In terms of interrupt frequency, I have tried both 100 Hz and 10 Hz. Yes, it does sound like an IRQ servicing / interrupt priority issue.

pkourany commented 9 years ago

I will have to look at the latest repo code to see what the Serial ports are set to. I can't remember what I did for the timer interrupts. I'll look tonight.

aaraujo11 commented 9 years ago

From a quick look, if I'm not wrong, the priorities are:

SparkIntervalTimer interrupt:
nvicStructure.NVIC_IRQChannelPreemptionPriority = 0;
nvicStructure.NVIC_IRQChannelSubPriority = 1;

USART interrupt:
NVIC_InitStructure.NVIC_IRQChannelPreemptionPriority = 7;
NVIC_InitStructure.NVIC_IRQChannelSubPriority = 0;
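
Since a lower preemption number means a more urgent interrupt, the timer ISR (0) can preempt and hold off the USART handler (7), which would explain the dropped bytes. Pieced together, the two NVIC entries look something like this (the IRQ channels and includes are illustrative; only the priority values come from the actual code):

```cpp
#include "stm32f10x.h"
#include "misc.h"   // NVIC_Init() and NVIC_InitTypeDef live here in the SPL

void configureIrqPriorities(void) {
    // Timer side (SparkIntervalTimer): preemption 0 outranks nearly everything
    NVIC_InitTypeDef nvicStructure;
    nvicStructure.NVIC_IRQChannel                   = TIM3_IRQn;   // whichever timer was allocated
    nvicStructure.NVIC_IRQChannelPreemptionPriority = 0;
    nvicStructure.NVIC_IRQChannelSubPriority        = 1;
    nvicStructure.NVIC_IRQChannelCmd                = ENABLE;
    NVIC_Init(&nvicStructure);

    // USART side (core firmware): preemption 7, so it can be held off by the timer
    NVIC_InitTypeDef NVIC_InitStructure;
    NVIC_InitStructure.NVIC_IRQChannel                   = USART2_IRQn;  // Serial1's USART, shown for illustration
    NVIC_InitStructure.NVIC_IRQChannelPreemptionPriority = 7;
    NVIC_InitStructure.NVIC_IRQChannelSubPriority        = 0;
    NVIC_InitStructure.NVIC_IRQChannelCmd                = ENABLE;
    NVIC_Init(&NVIC_InitStructure);
}
```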

aaraujo11 commented 9 years ago

Hi Paul,

Yesterday I tried some configurations and actually ended up with the following working config for the SparkIntervalTimer interrupt:

nvicStructure.NVIC_IRQChannelPreemptionPriority = 8;
nvicStructure.NVIC_IRQChannelSubPriority = 1;

But I really don't know whether using this config will affect other routines. What do you think? I will keep running some tests.

Thanks!

pkourany commented 9 years ago

@aaraujo11, it turns out the Spark is configured for 16 levels of interrupt priority (0 = highest) and NO sub-priorities. From what I can see, the firmware leaves out levels 8, 9 and 10 (possibly for future use), and user interrupts are set to 11. This is why 8 worked. You should change the sub-priority to 0, since other values may cause problems. I will revise the code to set the priority to 10, which is the lowest of the high-priority interrupts. Can you try that value and let me know how it goes? FYI, the Serial1 and Serial2 priorities are set to 7.
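
In other words, the library change will amount to something like this (a sketch only; the actual commit may differ in detail):

```cpp
#include "stm32f10x.h"
#include "misc.h"

// 16 priority levels with no sub-priorities corresponds to NVIC_PriorityGroup_4,
// i.e. all four priority bits are used for preemption (0 = highest urgency).

static void enableTimerIrq(IRQn_Type irq)   // e.g. TIM3_IRQn; helper name is illustrative
{
    NVIC_InitTypeDef nvicStructure;
    nvicStructure.NVIC_IRQChannel                   = irq;
    nvicStructure.NVIC_IRQChannelPreemptionPriority = 10;  // numerically above Serial1/2 (7), so the USARTs win;
                                                           // below user interrupts (11), so timers still beat those
    nvicStructure.NVIC_IRQChannelSubPriority        = 0;   // sub-priorities are unused, keep at 0
    nvicStructure.NVIC_IRQChannelCmd                = ENABLE;
    NVIC_Init(&nvicStructure);
}
```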

aaraujo11 commented 9 years ago

Hi Paul, sorry for taking so long to reply, and thanks for the guidance. I already made that change by setting the priority to level 10, and it appears to work perfectly. I haven't seen the issue again.

Thanks!