particle-iot / spark-server

UNMAINTAINED - An API compatible open source server for interacting with devices speaking the spark-protocol
https://www.particle.io/
GNU Affero General Public License v3.0

core won't stay connected to local server #35

Closed schwiet closed 9 years ago

schwiet commented 9 years ago

For some reason, most of the time, my core won't stay connected to the server. It keeps cycling from cyan, to blinking green, to blinking red, and from the server I see:

Your server IP address is: 192.168.x.x
server started { host: 'localhost', port: 5683 }

Connection from: 192.168.x.y, connId: 1
on ready { coreID: 'xxxxxxxxxxxxxxxxxxxxx',
  ip: '192.168.x.y',
  product_id: 0,
  firmware_version: 0,
  cache_key: '_0' }
Core online!
Connection from: 192.168.x.y, connId: 2
on ready { coreID: 'xxxxxxxxxxxxxxxxxxxxx',
  ip: '192.168.x.y',
  product_id: 0,
  firmware_version: 0,
  cache_key: '_1' }
Core online!
Connection from: 192.168.x.y, connId: 3
on ready { coreID: 'xxxxxxxxxxxxxxxxxxxxx',
  ip: '192.168.x.y',
  product_id: 0,
  firmware_version: 0,
  cache_key: '_2' }
Core online!

etc....

Sometimes, I eventually see:

 1: Core disconnected: socket error Error: read ECONNRESET { coreID: 'xxxxxxxxxxxxxxxxxxxx',
   cache_key: '_3',
   duration: 240.642 }
 Session ended for _3

And then it will connect and stay connected. Is this indicative of something being configured incorrectly?

kennethlimcp commented 9 years ago

@schwiet, just wondering what user firmware you are running on the core. Default tinker firmware, or your own via local compile or cloud compile?

schwiet commented 9 years ago

Hi Kenneth,

Thanks for the reply! I'm just using the default firmware.

This leads me to another question though (this is my first Spark): how exactly do I write my own firmware? I have tried the Spark Dev Atom editor. It looks like I'm logged into my local cloud server, but the IDE doesn't actually find my connected core. You mention local compile; can you point me to any documentation on how to do that? I'm assuming I need a cross compiler, but I have been unable to find any mention of this in the documentation...

Thanks


kennethlimcp commented 9 years ago

This is not the best place to discuss this; the community forum would be more appropriate.

1.) Read docs.spark.io. There is the Web IDE for you to write your code and flash it to the core. Spark Dev does the same thing, except that the IDE is installed on your desktop.

2.) The local cloud code does not come with the build farm feature, i.e. you will not be able to compile code into a binary using the local cloud setup.

3.) Local compile setup can be done using GCC. You can find more information in the community as well.

4.) If you have Spark-cli installed, try flashing the latest tinker firmware and see if you still observe the same issue. ;)
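On point 3, the local compile setup of that era looked roughly like this (a sketch, assuming the arm-none-eabi GCC toolchain and make are installed; repo names per the spark GitHub org):

```shell
# Sketch: local compile with GCC (assumes arm-none-eabi-gcc and make on PATH).
# The three firmware repos are expected to sit side by side in one directory.
git clone https://github.com/spark/core-firmware.git
git clone https://github.com/spark/core-common-lib.git
git clone https://github.com/spark/core-communication-lib.git
cd core-firmware/build
make    # builds core-firmware.bin, which can then be flashed to the core
```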

schwiet commented 9 years ago

How do I flash the latest firmware? I can't find any documentation for this. All I can find on the forums is that cores are supposed to update automatically from the cloud, but my core is connected to my local cloud. Sorry if this is documented and I'm just not seeing it...

kennethlimcp commented 9 years ago

You will need to use Spark-cli and the command spark flash core_name xxxxxx.bin to perform the flash. The docs for this are here: https://github.com/spark/spark-cli#spark-flash

I would suggest you head over to community.spark.io and open up a thread for a better discussion and understanding of how all of these pieces come together ;).

Playing around with the default Spark cloud first is also a better option, as its full capabilities are available for you to tinker with before switching over to your own cloud, where you will then have a better understanding of what's available and what's not.

schwiet commented 9 years ago

You're asking me to update the firmware, but it's still unclear how. For spark flash core_name xxxxxx.bin, where do I get the latest firmware? The documentation doesn't say; it just tells you how to reset to "the original firmware that ships with the core". Thanks for the tip to read the community posts; I've been doing that for the last few days. I'm just trying to figure out how to do what you're asking me to try.

kennethlimcp commented 9 years ago

@schwiet, that's why I mentioned you should open up a thread in the community for us to help you with this.

Discussing this on GitHub is simply not comprehensive enough.

There are many things you need to know to even get to this stage:

1.) The default firmware is known as tinker (documented in docs.spark.io)

2.) To flash firmware via Spark-cli, the command is spark flash core_name xxx.bin

3.) There is also some known firmware packaged in Spark-cli, listed here: https://github.com/spark/spark-cli#flashing-a-known-app

4.) You want to flash the core with the latest default firmware (tinker) so the command will be:

spark flash core_name tinker
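Putting points 1.) to 4.) together, the Spark-cli session looks roughly like this (a sketch; core_name stands for whatever name the core was claimed under):

```shell
# Sketch, assuming Spark-cli is installed and logged in to the cloud in use.
spark list                    # confirm the core is visible and online
spark flash core_name tinker  # 'tinker' is a known app bundled with Spark-cli
```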

I don't want to continue this discussion about the procedures required, as my original intent was to help debug the original issue you opened. We can discuss this in the community, and once you are more familiar with the environment, it will take only a few minutes for us to figure out the original issue.

Hope this helps!

schwiet commented 9 years ago

Sorry for the confusion; I wasn't trying to ask what tinker is, what spark-cli is, etc. It just was not obvious to me from the documentation that spark flash core tinker would flash the 'latest'; the documentation makes it sound like it restores the factory image. I just wanted to know how to do what you were suggesting...

updating the firmware with spark flash core_name tinker

seems to have done the trick. Thanks, and sorry for the digression.

kennethlimcp commented 9 years ago

Awesome! So I presume this is resolved and you can probably close the issue?