johnboiles closed this issue 10 years ago
Pin change interrupts are not the same as an input capture. The hardware event does not cause a rapidly changing timer to be captured to a register. If interrupts are disabled, there can be significant latency between the hardware event and when the interrupt routine actually executes.
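To make the difference concrete, here's a minimal AVR sketch (illustrative only, not code from any library; on an ATmega328 both interrupts can watch the same pin, since ICP1 is PB0 / digital pin 8):

```cpp
#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint16_t edge_time;

// Input capture: hardware latches TCNT1 into ICR1 at the instant of the
// edge. Even if this ISR runs late, ICR1 still holds the true edge time.
ISR(TIMER1_CAPT_vect) {
  edge_time = ICR1;
}

// Pin change: nothing is latched. The best available is to read TCNT1 when
// the ISR finally runs, so every cycle of latency becomes timestamp error.
ISR(PCINT0_vect) {
  edge_time = TCNT1;
}

void setup() {
  TCCR1A = 0;
  TCCR1B = (1 << CS10);    // Timer1 free-running at F_CPU
  TIMSK1 = (1 << ICIE1);   // input capture interrupt on ICP1 (PB0, pin 8)
  PCMSK0 = (1 << PCINT0);  // pin change interrupt on the same pin, for contrast
  PCICR  = (1 << PCIE0);
}

void loop() {}
```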
On AVR, I'm not willing to consider merging such code. It's simply not going to be reliable when used together with ordinary hardware serial, where the interrupt routine would delay AltSoftSerial's interrupt. With the input capture hardware, that interrupt latency can be tolerated without loss of data.
However, on Teensy 3.x, the pins have DMA trigger capability. DMA can also have variable latency, but it's on a much shorter time scale (a few cycles of a 48 or 72 or 96 MHz clock), depending on bus arbitration. The pin change DMA trigger could be used to build input capture that's good enough for all standard serial baud rates.
Thanks for the quick reply @PaulStoffregen! And great work on AltSoftSerial.
I may be misunderstanding the code, but in its current implementation AltSoftSerial can fail if interrupts are disabled for more than 1 bit time, right (assuming a 0->1 or 1->0 transition happens in that window)? I understand that with pin change interrupts you'd have to capture TCNT in the service routine, so it would fail if the service routine took more than 1 bit time to run, but isn't that the same failure threshold as with input capture? I saw you did the pin change interrupt support for NewSoftSerial, so you'd know better (I did something similar back in the day). I'd love to know what I'm misunderstanding here.
For background, I'm building something that needs a bunch of slow (4800-9600 baud) UARTs, mostly for receiving.
With input capture, there is no timing error as long as the latency stays under 1 bit time. If the latency is 90% of a bit time, the error is still zero.
With pin change, if the latency is 90% of a bit time, the algorithm samples 90% of a bit time late relative to the waveform. Not good.
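Concretely (my arithmetic, assuming 9600 baud):

```
1 bit time = 1/9600 s ≈ 104 µs
90 µs of latency with input capture -> ~0 µs timestamp error (ICR1 latched the edge)
90 µs of latency with pin change    -> ~90 µs timestamp error (~86% of a bit)
```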
I think I see. Is your objection that the error from the first bit gets added to the error from every additional bit? For example, if `rx_target` is initially (when `state == 0`) off by 50% of a bit time, then future interrupt timings (the `capture` variable when `0 < state < 9`) can only tolerate a maximum error of 50% of a bit time? Trying to connect what you're saying to your code; rough sketch of my mental model at the end of this message.
For slow baud rates, a maximum error of half a bit time would probably be acceptable.
I totally accept that you don't think this is appropriate for your library. But I'd love your opinion as to whether it'd be a workable option for 9600 & 4800 baud ports.
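For reference, here's the mental model behind my question, heavily condensed and not AltSoftSerial's actual code (variable names borrowed from the library; `shift_in_bit()` is a hypothetical stand-in):

```cpp
#include <stdint.h>

static uint8_t  state;          // 0 = waiting for a start bit
static uint16_t rx_target;      // timer value of the next bit-cell boundary
static uint16_t ticks_per_bit;  // timer ticks per bit at the chosen baud rate
static void shift_in_bit(void) {}  // hypothetical stand-in

// Condensed paraphrase, not the library's actual ISR. `capture` is the
// timestamp of the edge that triggered the interrupt.
static void on_edge(uint16_t capture) {
  if (state == 0) {
    // Start bit: every later bit-cell boundary is derived from this one
    // timestamp, so error here shifts the sampling of the whole byte.
    rx_target = capture + ticks_per_bit + ticks_per_bit / 2;
    state = 1;
  } else if (state < 9) {
    // Data bits: count how many bit cells have elapsed since rx_target
    // to decide which bits this edge delimits.
    while (state < 9 && (int16_t)(capture - rx_target) >= 0) {
      shift_in_bit();
      rx_target += ticks_per_bit;
      state++;
    }
  }
}
```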
The only way to know for sure would be to write a 3rd software emulated serial library. At least you can draw on code from AltSoftSerial for an event timing approach and SoftwareSerial for the pin change interrupt abstraction.
The tricky part about "workable" is it depends on the timing characteristics of other interrupt-based code in use. Ideally, you want your code to be impacted as little as possible by latency from other code, and to impose as little latency onto other code as possible. Ordinary SoftwareSerial is particularly bad on both of those measures. AltSoftSerial is much better. I imagine you'd end up somewhere in the middle if you create such a library.
If you do this on Teensy 3.1 using pin change DMA triggers that capture the Systick timer into a large circular buffer, you'll probably end up with something far better, maybe even superior to ordinary hardware serial? But Teensy 3.1 has 8 byte FIFOs on 2 of its 3 hardware serial ports, which give excellent performance, on top of a very fast processor and a 16-level nested priority interrupt controller.
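A rough sketch of that idea (untested, and only a sketch: it assumes the Teensy DMAChannel library, pin 9 = PTC3 on Teensy 3.1, and an arbitrary buffer size):

```cpp
#include <DMAChannel.h>

// Untested sketch: each edge on pin 9 (PTC3 on Teensy 3.1) triggers the DMA
// engine to copy the SysTick current-value register into a circular buffer.
// The "capture" happens in hardware, so interrupt latency never touches it.
#define NUM_SAMPLES 256
static volatile uint32_t timestamps[NUM_SAMPLES]
    __attribute__((aligned(NUM_SAMPLES * sizeof(uint32_t))));
DMAChannel dma;

void setup() {
  // GPIO mux, with IRQC = 3: "generate DMA request on either edge"
  CORE_PIN9_CONFIG = PORT_PCR_MUX(1) | PORT_PCR_IRQC(3);
  dma.source(SYST_CVR);                             // SysTick current value
  dma.destinationCircular(timestamps, sizeof(timestamps));
  dma.triggerAtHardwareEvent(DMAMUX_SOURCE_PORTC);
  dma.enable();
}

void loop() {
  // Software walks the buffer at its leisure, turning timestamp deltas into
  // bits. SysTick counts down and wraps every 2^24 ticks, so the delta math
  // has to account for that.
}
```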
If you do write this 3rd library, please let me know and I'll add a link to yours from my AltSoftSerial web page.
Cool, thanks for all your feedback. I'll certainly let you know if I decide to go this route.
Pin change interrupts can stand in for input capture interrupts, except that the interrupt fires any time any enabled pin on a port (e.g. PORTC, PORTB) changes, and nothing is latched in hardware. Using some clever masking, you can build the software equivalent of an input capture interrupt, except that it works for any pin.
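Something like this, for example (a hedged sketch for an ATmega328; the pin choice, mask, and variable names are just for illustration):

```cpp
#include <avr/io.h>
#include <avr/interrupt.h>

#define RX_MASK (1 << PB1)       // arbitrary choice: watch PB1 (Arduino pin 9)

volatile uint8_t  last_portb;
volatile uint16_t rx_edge_time;

// Fires when any enabled pin on PORTB changes; XOR against the previous
// port state and mask to see whether it was our RX pin.
ISR(PCINT0_vect) {
  uint16_t now = TCNT1;          // read the timer first to minimize added error
  uint8_t pins = PINB;
  uint8_t changed = pins ^ last_portb;
  last_portb = pins;
  if (changed & RX_MASK) {
    rx_edge_time = now;          // the "capture" - but late by the ISR latency
  }
}

void setup() {
  TCCR1A = 0;
  TCCR1B = (1 << CS10);          // Timer1 free-running at F_CPU
  last_portb = PINB;
  PCMSK0 = RX_MASK;              // only our pin raises the interrupt
  PCICR  = (1 << PCIE0);         // enable PORTB pin change interrupts
}

void loop() {}
```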
I'm not totally sure whether ARM-based boards (such as the Teensy 3) support pin change interrupts; more research needed. At the very least, this would allow fully configurable pins on AVR-based boards.
@PaulStoffregen would you be open to a change like this? If so, I'd be happy to take a stab at it and send a pull request.