voodootikigod / voodoospark

An RPC-based firmware for a Spark Core device (like being connected, but without the wire!)
MIT License

Servo and PWM don't work at the same time on the same timing peripheral #32

Open reconbot opened 10 years ago

reconbot commented 10 years ago

I've been working on solving an issue with J5, spark-io, and voodoospark, and have traced it down to the latest master of voodoospark having issues with servos and PWM pins at the same time.

https://github.com/rwaldron/spark-io/issues/39 has some of my troubleshooting.

Since the version of voodoospark that comes with the spark-cli is precompiled, I don't know what version it is or how to compare it to the latest master.

I also can't reliably reproduce which device, the PWM or the servo, won't work. It's one or the other.

rwaldron commented 10 years ago

Is this a bug in voodoospark or a hardware capability issue?

reconbot commented 10 years ago

This is with no extra hardware, just a pulse-counting Arduino. I also tried with two generations of Spark Cores. Both work with the spark-cli's copy of voodoospark and not with master.


voodootikigod commented 10 years ago

That helps a lot, I can go from there.

rwaldron commented 10 years ago

I'm confused: in the spark-io bug you said the precompiled voodoospark works fine but the latest master doesn't, yet that's not reflected here.

rwaldron commented 10 years ago

This is with no extra hardware

That's not what I meant. I'm thinking about available timers, PWM vs. Servo, etc.

reconbot commented 10 years ago

I meant to say: the precompiled version of voodoospark that comes with the spark-cli works, and the latest master of voodoospark (specifically at #163641e) is giving me issues.

rwaldron commented 10 years ago

I understand that part :P I only meant that you didn't include that info in this bug. Ok, moving on, I filed this: https://github.com/spark/spark-cli/issues/101

rwaldron commented 10 years ago

https://github.com/spark/spark-cli/issues/101#issuecomment-59364622

https://github.com/voodootikigod/voodoospark/blob/333f7c672ef478cab68933ac4c4675ddbb64a2ce/src/voodoospark.cpp

zsup commented 10 years ago

@voodootikigod sounds like you're on it, but if you need any input from the Spark team, lemme know

reconbot commented 10 years ago

It actually seems like this may be present in the version of voodoospark included with the spark-cli too. (What version is that?)

I'll have a reduced test case a little later tonight.

rwaldron commented 10 years ago

2.2.0

Resseguie commented 10 years ago

@reconbot did you try logging the values being written by spark-io just before they are sent as suggested by @rwaldron in the original bug report?

https://github.com/rwaldron/spark-io/issues/38#issuecomment-59201437

I'm curious to see how many commands (and how often) are being sent. Sending too many commands at once has always been the culprit for me when I get the red SOS flashing. I've gotten it while testing the LED lib (e.g. pulse with a short or no delay). I still suspect that might be the problem here.

If so, we might need to revisit the possibility of throttling the commands being sent, though I'm not sure how best to handle that. It's relatively straightforward for something like an LED (just set a minimum delay) but much more complicated for motors or multiple devices at once.
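
For illustration, a minimal sketch of that sort of minimum-delay throttle (hypothetical helper, not spark-io or Johnny-Five API; all names are made up):

// Hypothetical command throttle: queue outgoing commands and flush them
// no faster than one per minDelayMs. Just a sketch, not spark-io code.
function Throttle(send, minDelayMs) {
  this.send = send;           // function that actually writes a command
  this.minDelayMs = minDelayMs;
  this.queue = [];
  this.timer = null;
}

Throttle.prototype.push = function(command) {
  this.queue.push(command);
  if (!this.timer) {
    this.flush();
  }
};

Throttle.prototype.flush = function() {
  var command = this.queue.shift();
  if (command === undefined) {
    this.timer = null; // queue drained, stop ticking
    return;
  }
  this.send(command);
  this.timer = setTimeout(this.flush.bind(this), this.minDelayMs);
};

// usage: wrap whatever actually sends bytes to the Core
var throttled = new Throttle(function(cmd) { console.log("sending", cmd); }, 20);
throttled.push([0x02, 0x0a, 0xc8]); // analogWrite A0, 200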

reconbot commented 10 years ago

I did; I'll do it again to give you some output. I don't think it's an issue with a flood of data (unless 3 commands in a row can trigger it; I don't get a red flash of death in any case). What I expect is being sent to the Spark; it's reading it from the network and outputting the correct values as serial debug statements, but the pins are not doing what we would expect.

reconbot commented 10 years ago

Alright, I have a reduced test case.

// Board setup elided in the original report; this is standard spark-io boilerplate.
var Spark = require("spark-io");
var board = new Spark({
  token: process.env.SPARK_TOKEN,
  deviceId: process.env.SPARK_DEVICE_ID
});

board.on('ready', function(){
  var pwmPin = "A0";
  var servoPin = "A1";

  this.pinMode(servoPin, this.MODES.SERVO);
  this.pinMode(pwmPin, this.MODES.PWM);
  this.servoWrite(servoPin, 90);

  setTimeout(function(){
    console.log('pwm on');
    this.analogWrite(pwmPin, 200);
  }.bind(this), 5000);
});

On an arduino I'm monitoring pin A1.

Channel 1:1465
Channel 1:1465
Channel 1:1464
// PWM Pin on
Channel 1:61
Channel 1:61
Channel 1:61

And I have a serial logger on the Spark Core

Bytes Available: 3
Action received: 0
PIN received: 11
MODE received: 4
Bytes Available: 6
Action received: 0
PIN received: 10
MODE received: 1
Bytes Available: 3
Action received: 65
PIN: 11
WRITING TO SERVO: 90
// pwm on
Bytes Available: 3
Action received: 2
PIN received: 10
VALUE received: C8

Lastly, I uncommented the console logs (and annotated them) in spark-io's pinMode and *write functions.

pinMode <Buffer 00 0b 04>
pinMode <Buffer 00 0a 01>
write <Buffer 41 0b 5a>
pwm on
write <Buffer 02 0a c8>
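
For reference, decoding those buffers against the serial log above (the mapping is inferred by lining up the two logs, not from protocol docs):

// pinMode <Buffer 00 0b 04> -> action 0x00 (pin mode), pin 0x0b (11 = A1), mode 0x04 (SERVO)
// pinMode <Buffer 00 0a 01> -> action 0x00 (pin mode), pin 0x0a (10 = A0), mode 0x01 (OUTPUT)
// write   <Buffer 41 0b 5a> -> action 0x41 (65 = servo write), pin A1, value 0x5a (90)
// write   <Buffer 02 0a c8> -> action 0x02 (analog write), pin A0, value 0xc8 (200)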
reconbot commented 10 years ago

I forgot the best part of a bug report:

I expected the servo pin's output not to change when setting a pulse width on a different pin; instead, the servo pin dropped to almost a 0-degree angle.

rwaldron commented 10 years ago

Doesn't make sense:

PIN received: 10
MODE received: 1

Mode should be 3

reconbot commented 10 years ago

P.S. Those tests were with v2.2.0. If it helps, I can repeat with master later tonight.

rwaldron commented 10 years ago

Ugh. https://github.com/rwaldron/spark-io/blob/master/lib/spark.js#L273-L276

rwaldron commented 10 years ago

Actually, I don't know what that even means at the moment.

rwaldron commented 10 years ago

Re:

Doesn't make sense:

PIN received: 10
MODE received: 1

Mode should be 3

I'm wrong, it's correct to set the pin to regular output mode.

reconbot commented 10 years ago

Well... this may not be our problem.

This C program suffers from the same issue.

Servo s;

void setup() {
  Serial.begin(9600);
  s.attach(A1);        // servo on A1 (Timer 2)
  pinMode(A0, OUTPUT); // PWM output on A0 (also Timer 2)
}

void loop() {
  Serial.println("A1 Servo to 90");
  s.write(90);
  delay(5000);
  Serial.println("A0 PWM to 200");
  analogWrite(A0, 200); // reconfigures the shared timer out from under the servo
  delay(5000);
}

Resseguie commented 10 years ago

@zsup then that might be something you all will want to look into from your end?

zsup commented 10 years ago

My gut says that the reason for this issue is that A1 and A0 are both on the same timer peripheral, which means that one can't act as a PWM pin while the other is controlling a servo, because the timer patterns are different. @satishgn is that correct?

reconbot commented 10 years ago

@zsup any idea which pins might not be?

zsup commented 10 years ago

See this document:

https://github.com/spark/core/blob/master/Pin%20mapping/core-pin-mapping-v1.xlsx

A0 and A1 are on the same timer peripheral (Timer 2), but A4, A5, A6, and A7 are on Timer 3, and D0 and D1 are on Timer 4
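
For example, keeping the two outputs on different timers avoids the clash (a minimal sketch against spark-io based on the mapping above; the pin choices are illustrative, and the setup assumes the usual SPARK_TOKEN/SPARK_DEVICE_ID environment variables):

// Variant of the reduced test case above, with the PWM output moved from
// A0 (Timer 2, shared with the servo on A1) to A4 (Timer 3).
var Spark = require("spark-io");
var board = new Spark({
  token: process.env.SPARK_TOKEN,
  deviceId: process.env.SPARK_DEVICE_ID
});

board.on('ready', function(){
  var pwmPin = "A4";    // Timer 3
  var servoPin = "A1";  // Timer 2

  this.pinMode(servoPin, this.MODES.SERVO);
  this.pinMode(pwmPin, this.MODES.PWM);
  this.servoWrite(servoPin, 90);

  setTimeout(function(){
    this.analogWrite(pwmPin, 200); // should no longer disturb the servo
  }.bind(this), 5000);
});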

reconbot commented 10 years ago

A4 and A0 have no interactions!

reconbot commented 10 years ago

We should probably take this over to spark-io, but a warning or error when using incompatible pin modes on pins that interact would be great. I'm not sure what I'm reading in the Excel sheet, but I'd love to figure out exactly what doesn't work.

rwaldron commented 10 years ago

Spark-IO can't have special knowledge like that. Does Firmata warn of similar things on an UNO?

reconbot commented 10 years ago

Does this happen on the UNO?

satishgn commented 10 years ago

@reconbot, as @zsup pointed out, it's happening because the A0 and A1 channels belong to the same TIM peripheral. Servo works at a fixed PWM frequency of 50Hz whereas analogWrite() works at a fixed PWM frequency of 500Hz. Currently we don't have a check in place to warn about possible clashes between Servo and analogWrite().
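
To make the frequency clash concrete (a back-of-the-envelope illustration; the 50Hz/500Hz figures come from the comment above, the rest is just arithmetic):

// One STM32 TIM peripheral has a single prescaler/period shared by all of
// its channels, so two pins on the same timer must share one PWM frequency.
var SERVO_HZ = 50;  // Servo library period: 20,000 us
var PWM_HZ = 500;   // analogWrite() period: 2,000 us

function periodMicros(hz) {
  return 1e6 / hz;
}

console.log(periodMicros(SERVO_HZ)); // 20000 -- what the servo pulse train needs
console.log(periodMicros(PWM_HZ));   // 2000  -- what analogWrite() configures
// Whichever call configures the timer last wins, consistent with the ~1465 us
// servo pulse collapsing as soon as analogWrite() fired in the test case above.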

reconbot commented 10 years ago

If we reported back to spark-io what hardware we had, then maybe it would be in a position to provide a warning. But I see why it's a bad idea now.

Can I assume that all the timing peripherals have this limitation?

rwaldron commented 10 years ago

@reconbot that's the type of thing that I would put in Johnny-Five, but how can we signal to Johnny-Five that there are hardware constraints? Can we make it a generalized mechanism that Firmata.js, Galileo-IO, etc. can also implement?

zsup commented 10 years ago

My gut reaction is that the correct way to do this would be to give each piece of hardware a JSON file that stores the hardware peripherals. Besides keeping the user from setting up a servo and analogWrite on the same timer peripheral, this could also do things like block the user from doing an analogWrite on a pin that doesn't have a timer peripheral at all. Perhaps the JSON would look something like this:

{
  "device": {
    "type": "spark-core",
    "version": "1.0",
    "pins": {
      "A0": ["Timer 2", "ADC"]
    }
  }
}

reconbot commented 10 years ago

@zsup would it be reasonable to assume that some timing peripherals don't have this issue? I'm wondering if a mechanism that can return errors or warnings would be the right way to approach this. That way each *-io project can assume responsibility for reporting the limitations regardless of what they are.

Possibly with a new event? Though I suppose error would do the job.


Spark.prototype.pinMode = function(pin, mode) {
  // detect a conflict with pin modes, then:
  this.emit('warning', string_or_array_or_object_of_warnings);
};
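
Fleshing that out (a hypothetical sketch combining zsup's JSON pin map with a warning event; the map and helper names are made up for illustration, none of this is existing spark-io code):

// Pin-to-timer map taken from zsup's mapping comment above.
var PIN_TIMERS = {
  A0: "Timer 2", A1: "Timer 2",
  A4: "Timer 3", A5: "Timer 3", A6: "Timer 3", A7: "Timer 3",
  D0: "Timer 4", D1: "Timer 4"
};

Spark.prototype.pinMode = function(pin, mode) {
  var timer = PIN_TIMERS[pin];
  var isTimerMode = (mode === this.MODES.SERVO || mode === this.MODES.PWM);
  this.timerModes = this.timerModes || {};

  // SERVO (50Hz) and PWM (500Hz) can't coexist on one timer peripheral.
  if (timer && isTimerMode &&
      this.timerModes[timer] !== undefined &&
      this.timerModes[timer] !== mode) {
    this.emit('warning', pin + " shares " + timer +
      " with a pin already in a conflicting mode");
  }
  if (timer && isTimerMode) {
    this.timerModes[timer] = mode;
  }
  // ... then fall through to the normal pinMode implementation
};
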
zsup commented 10 years ago

@reconbot every piece of hardware has different constraints, basically depending on how the peripherals are set up and exposed, so it's safe to assume there will be variability in the errors that need to be emitted. That said, I think it's safe to say that every piece of hardware has some constraints (peripherals available on some pins but not others), and it's maybe worthwhile to put the pattern of how to handle these constraints in the overall framework (Johnny-Five or Firmata or both).

rwaldron commented 10 years ago

Perhaps the JSON would look something like this:

That example is a reasonable starting place, but we'd still need to design some aspects, e.g. how "conflicts" are defined. "A0": ["Timer 2", "ADC"] doesn't address the immediate issue, but it works as a starting point.

rwaldron commented 10 years ago

Here's a thing... Galileo-IO needs to be able to warn you that Gen 2 boards have only one PWM period shared by all pins (yes... unreal); if you set up a PWM for analogWrite, then a Servo, I believe the last one wins.

Resseguie commented 8 years ago

@rwaldron @reconbot was there a consensus on the best way to handle this? Is voodoospark the correct place to track it? Or should this be in either Johnny-Five or the specific IO plugins? Did you ever do something similar for devices like Galileo?

@satishgn @zsup I assume there are similar limitations for Photons now instead of Spark Core?

zsup commented 8 years ago

Yes, there are similar limitations on the Photon, but they are different, because we use a different chip (although it's very similar: an STM32F205 instead of an STM32F103, so a souped-up chip in the same family). But again, every chip has limitations that are specific to how the peripherals are exposed by the chip and by the GPIO API you're using (Arduino/Wiring, mbed, etc.). @technobly or @m-mcgowan can comment on the specifics of the Photon (and upcoming Electron) vs. the Core.