rei-vilo opened this issue 11 years ago
I took a quick look at this, and it's a frustrating issue. The I2C module sets the P flag when a stop condition is encountered, but does NOT raise an interrupt, so the stop condition isn't handled. I guess a possible workaround would be to set a timer and periodically test the P flag if we've been through the slave receiver (TW_SR_DATA) code. If the P flag is set, then the buffer is returned...
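A minimal sketch of that polling idea, with the register access mocked out (on real hardware this would read the PIC32 I2CxSTAT register; `i2c_stat`, the bit position, and `wait_for_stop` are stand-ins for illustration):

```c
#include <stdbool.h>
#include <stdint.h>

/* Stand-in for the PIC32 I2CxSTAT register; real code would use the
 * memory-mapped register (e.g. I2C1STAT). */
static volatile uint32_t i2c_stat;
#define I2CSTAT_P (1u << 4)  /* stop-condition (P) flag, bit position assumed */

/* Poll the P flag up to max_iters times; returns true if a stop
 * condition was seen before the budget ran out. */
static bool wait_for_stop(unsigned max_iters)
{
    for (unsigned i = 0; i < max_iters; i++) {
        if (i2c_stat & I2CSTAT_P)
            return true;
    }
    return false;
}
```

If `wait_for_stop()` returns true, the received buffer can be handed to the sketch's receive callback; if not, a periodic timer tick would re-test the flag later, as suggested above.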
I stopped using the PIC32 platform for this very reason.
The Tiva C Series ARM-based MCUs from Texas Instruments provide a great alternative. The evaluation board, the Tiva C Series LaunchPad, even includes a hardware debugger, all for USD 13.
Graham / Rei,
What chips were you trying to use with I2C that raised this issue? I want to try to recreate the issue.
EDIT: Never mind, the forum thread in Rei's first submission explains how to reproduce.
Jacob
I've tried it and faced the issue with the PIC32MX320F128 featured on the Digilent Uno32 board.
This bug/feature was confirmed on the chipKIT forum.
I am using the PIC32MX320F128H, as on a Uno32 board. Looking at the PIC32 Family Reference Manual - Section 24 - Inter-Integrated Circuit, it looks like the chip doesn't raise an interrupt on a stop condition.
I was able to get I2C to behave like an Atmel Arduino at the expense of burning a few hundred cycles in the interrupt. It could potentially be flaky if the master delays too long between the last bit and presenting the stop. Tomorrow I will present the code for review. I think it should be the de facto operation for I2C, for maximum Arduino compatibility, and documented well in case the user doesn't want to burn cycles in an interrupt.
Jacob
So here is the change in twi.c that I made to allow chipKIT I2C to mimic detection of the stop bit in an interrupt. It works by burning some CPU cycles in the interrupt after the data has been received, so that the stop bit (P flag) can be detected. The potential issue with this method is that a stop bit that comes late could be missed. I will post my sketches in the forum thread mentioned in Rei's initial comment.
```c
case TW_SR_DATA:
    if (twi_rxBufferIndex < TWI_BUFFER_LENGTH)
    {
        twi_rxBuffer[twi_rxBufferIndex] = ptwi->ixRcv.reg;
        twi_rxBufferIndex++;
    }
    // Release the clock line
    ptwi->ixCon.set = (1 << _I2CCON_SCLREL);
    ptwi->ixStat.clr = (1 << _I2CSTAT_I2COV) | (1 << _I2CSTAT_RBF);
    // Burn a few hundred cycles to wait for the stop condition
    // (about four cycles per iteration)
    int i;
    for (i = 0; i < 250; i++)
    {
        asm volatile("nop");
    }
    // If the stop flag is set, invoke the twi_onSlaveReceive callback
    if (ptwi->ixStat.reg & (1 << _I2CSTAT_P))
    {
        twi_onSlaveReceive(twi_rxBuffer, twi_rxBufferIndex);
        twi_rxBufferIndex = 0; // Reset the index to zero
    }
    break;
```
I was thinking the I2C interrupt could be made very low priority, to prevent the delay in the interrupt from starving other interrupts, or optionally very high priority, to make sure that detection of the stop bit is not missed.
Jacob
There are probably two more things that need to be done for completeness.
Jacob
I tried it successfully with the following configuration: `300` instead of `250` loops to wait for the stop, and `delay(20);` between two consecutive sendings.

Now, for newer PIC32 MCUs with stop detection, how should stop detection be implemented? Section 24. Inter-Integrated Circuit (I²C) documents bit 22 of I2CxCON:

PCIE: Stop Condition Interrupt Enable bit (I2C Slave mode only)
1 = Enable interrupt on detection of Stop condition
0 = Stop detection interrupts are disabled
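On those newer parts, the stop condition could then be handled in the interrupt instead of by spinning. A sketch of the shape, with the registers mocked as plain variables so it stands alone (the bit name `PCIE` and its position follow the reference manual excerpt above; the register names, buffer handling, and `i2c_slave_isr` are illustrative, not the actual chipKIT core code):

```c
#include <stdbool.h>
#include <stdint.h>

/* Mocks of the memory-mapped registers; real code would use
 * I2C1CON / I2C1STAT and the slave interrupt vector. */
static volatile uint32_t i2c_con;
static volatile uint32_t i2c_stat;
#define I2CCON_PCIE (1u << 22) /* Stop Condition Interrupt Enable (slave mode) */
#define I2CSTAT_P   (1u << 4)  /* stop condition detected */

static unsigned rx_len;
static bool     frame_done;

static void i2c_slave_init(void)
{
    i2c_con |= I2CCON_PCIE;  /* raise an interrupt on the stop condition */
}

/* Slave interrupt handler: on a stop condition, hand the whole
 * buffer to the sketch at once, Arduino-style, with no busy-wait. */
static void i2c_slave_isr(void)
{
    if (i2c_stat & I2CSTAT_P) {
        frame_done = true;       /* would call twi_onSlaveReceive(buf, rx_len) */
        rx_len = 0;
        i2c_stat &= ~I2CSTAT_P;  /* mock only; the hardware manages this flag */
    }
}
```

This removes both tuning knobs from the earlier workaround: no loop count on the slave and no artificial delay on the master.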
The only problem is the solution isn't very reliable.
Unfortunately, both the `for(i = 0; i < 250; i++)` loop count on the PIC32-powered I²C slave and the `delay(20);` on the I²C master need to be retuned each time I use another board as the I²C master.
Is this a software bug or a hardware bug with the PIC32 or chipKIT? I can work around the one-byte reads when the data comes from another microcontroller, since I can define a block protocol, but it is more difficult with data coming from sensors I cannot modify. It's certainly not a show stopper, but a lot of resources are wasted working around it.
What's the solution, or can there ever be a solution?
Another question: is this really a bug, or are multi-byte sends just something the Arduino has, while in reality dealing with single-byte sends is the norm?
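For reference, the block-protocol workaround mentioned above (reassembling blocks from single-byte reads) might look like a length-prefixed framing layer. This is a generic sketch, not chipKIT code; the `framer_t` type and `framer_feed` helper are hypothetical names:

```c
#include <stdbool.h>
#include <stdint.h>

/* Length-prefixed framing: each block is sent as [len][payload...].
 * The receiver consumes one byte at a time (all a broken slave mode
 * can deliver) and reassembles complete blocks. */
typedef struct {
    uint8_t buf[32];
    uint8_t expected;  /* payload length announced by the header byte */
    uint8_t count;     /* payload bytes received so far */
    bool    in_frame;
} framer_t;

/* Feed one received byte; returns true when a full block sits in f->buf. */
static bool framer_feed(framer_t *f, uint8_t byte)
{
    if (!f->in_frame) {
        if (byte == 0 || byte > sizeof f->buf)
            return false;  /* implausible header: stay out of frame */
        f->expected = byte;
        f->count = 0;
        f->in_frame = true;
        return false;
    }
    f->buf[f->count++] = byte;
    if (f->count == f->expected) {
        f->in_frame = false;
        return true;
    }
    return false;
}
```

This only works when the sender cooperates, which is exactly the limitation raised above: fixed-function sensors that emit raw 4-byte blocks cannot be wrapped this way.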
@screamingtiger
Please have a look at the different threads on the chipKIT and Digilent forums, and feel free to add your post!
@rei-vilo It's very apparent this issue is not going to be resolved. Sad, because the chipKIT is a great way to use the PIC32, and I2C is so common that it needs to work. One-byte reads suck, as many sensors send chunks of data in 4-byte blocks, and you cannot tell when a block stops or starts other than by attempting to stay in "sync".
I still don't understand whether this is a driver/software problem or a hardware problem; any idea? I can research it, but I'm not going to, as I have it working for what I need. But using I²C sensors will require an Arduino to read them and then relay the information to the chipKIT via a block protocol. For many projects the Arduino can just replace the chipKIT, since the chipKIT can't stand alone!
As I wrote before,
I stopped using the PIC32 platform for this very reason.
I've switched to the LaunchPad boards from Texas Instruments and I²C slave works very well, in standard mode (100 kHz), fast mode (400 kHz) and even fast mode+ (1 MHz)! The I²C slave identifies the speed and adapts itself automatically.
If you're dealing with a single I²C master, try the workaround proposed by @JacobChrist. The only limitation is that the settings need to be adapted each time the chipKIT board acting as I²C slave is connected to a new I²C master.
Oh I see, you gave up :) That workaround will not work for me, since it ties up the ISR for too long. I am using the softwarePWMservo library as well as reading PWM signals, and a delay like that is unacceptable when I have been doing what I can to optimize the rest of the code for speed.
Would you agree?
Well, maybe your turn to take another board!
Maybe! I was given my chipKIT Pi for free in a design challenge, so I need to use it this time. Now that there is a C compiler for the Parallax Propeller, I think I am going to go that route.
A Propeller paired with a Pi or BeagleBone is some serious power. A computer with a very strong real-time multi-core component opens up some serious opportunities.
From http://www.chipkit.org/forum/viewtopic.php?f=7&t=1076
Thank you for having a look and fixing this issue!