particle-iot / spark-server

UNMAINTAINED - An API compatible open source server for interacting with devices speaking the spark-protocol
https://www.particle.io/
GNU Affero General Public License v3.0

0.6.0-rc.2 firmware no longer able to connect to spark-server #74

Closed · chuank closed this issue 7 years ago

chuank commented 7 years ago

I know I'm flogging a dead horse here – it's 2 years since spark-server was updated. ☹️

However... Has anyone tried connecting their 0.6.0-rc.2 photon to your local spark-server?

It's no longer working for me. Went through all the bits on key generation, provisioning on the spark-server, then claiming it there, but the Photon flashes cyan, then gives a short white LED status, and loops in flashing cyan.

Nothing in the spark-server log showed the Photon even being seen by spark-server.

Anyone?

dmiddlecamp commented 7 years ago

Thanks for the heads up, I'll check this out


ghost commented 7 years ago

David, just to try to help motivate this, I am in the middle of product development that is significantly hindered by the lack of local server support. Local servers let me download a lot of sensor readings quickly to diagnose the signal the Photon is receiving and analyzing. Currently I can't even get the Photons to connect to the local server. It's been very frustrating. Please help us out!

dmiddlecamp commented 7 years ago

Hi @IcedMocha ,

If you need an urgent workaround, something I like to do when I need to dump a ton of data for debugging is use UDP broadcasts.

example code: https://gist.github.com/dmiddlecamp/24db9746998373872b472b616135dcc4

how to monitor:

particle udp listen 3444

I will look into this as soon as I can though! But it might take me a few hours, and maybe up to a few days.

Thanks! David

ghost commented 7 years ago

Cool, I will look into that. Thanks.

chuank commented 7 years ago

@dmiddlecamp thanks for looking into this. Looking forward to your updates!

Right after posting the issue, I switched (reluctantly) to UDP and got my Photons dumping sensor data locally – I'm in a very similar situation to @IcedMocha.

However, my case here is this: many projects restrict us from utilising the client's local network, or only permit WPA2 Enterprise connections. We know that WPA2 Enterprise is being bottlenecked externally (https://community.particle.io/t/wpa2-enterprise-costs-to-develop/11530/38), so local offline deployments using spark-server are the only way atm to work with Photons.

You might be surprised how many times this becomes the eventual decision (i.e. offline deployment), as most clients' IT departments stonewall even the mere suggestion of connecting to their WLAN, but are also reluctant about the idea of a recurring cost for cell data usage (via a 4G router, or Electrons).

Using plain UDP in such deployments raises even more alarm bells. It runs well as a stopgap measure, but the notion of open ports, even on a localised sensor network, is uncomfortable.

If such deployments already have a spark-server instance running, it makes a lot more sense to leverage the built-in security of the Particle ecosystem.

dmiddlecamp commented 7 years ago

Hi @chuank,

Hmm, I just tested this, went through the whole thing, and my Photon running 0.6.0 connects as expected.

I used the photon system parts from here: https://github.com/spark/firmware/releases/tag/untagged-1d1596b5ed1eeca20466

I flashed this to my photon with:

 particle flash --usb ~/Downloads/system-part1-0.6.0-photon.bin 
 particle flash --usb ~/Downloads/system-part2-0.6.0-photon.bin 

I set up my server key with:

#convert to der
particle keys server default_key.pub.pem 

#put device into DFU mode, and tag key with my IP address
particle keys server default_key.pub.der 10.x.y.z

After my device started hitting the server, I accepted its key by removing the _handshake suffix (this is really weird usability-wise to me):

cd core_keys
mv XYZ_handshake.pub.pem XYZ.pub.pem

Can you re-flash the system parts and tinker from the firmware release link I posted above and try those?

For good measure, I also deleted and reinstalled my node_modules directory:

rm -rf node_modules
npm install

I tested this running node 0.10.36, which was the current node version when spark-server was initially released.

I'm not able to reproduce the issue you were seeing, so any extra info you can provide would help.

Thanks! David

chuank commented 7 years ago

Hi @dmiddlecamp I replied via the community forum: https://community.particle.io/t/local-cloud-a-k-a-spark-server-updates/26792/14. All good.

Turns out I had to re-generate new device keys and re-send them to the local cloud. That worked, although I wonder why. Does new system firmware require re-provisioning of device keys?
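For anyone hitting the same thing, the regeneration sequence with particle-cli is roughly the following (a sketch; the key filename and device ID are placeholders, and the CLI is assumed to already point at the local cloud profile):

#generate a fresh keypair for the device (placeholder filename)
particle keys new photon_key

#with the Photon in DFU mode, load the new private key onto it
particle keys load photon_key.der

#send the matching public key to whichever cloud the CLI is pointed at (placeholder device ID)
particle keys send YOUR_DEVICE_ID photon_key.pub.pem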

My server key never changed IP addresses, btw, and a quick diff showed no difference between what I set up months ago and now. As a reference, I'm using node 4.4.2 on the local cloud with no issues.

I also did not have to manually mv the _handshake key on my local cloud. I ran particle cloud claim devid and the local cloud handled this for me (I did have to restart spark-server afterwards). Thanks for highlighting the _handshake bit though – I'll definitely monitor it when switching future devices to the local cloud.
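For reference, the claim flow against the local cloud looked roughly like this (a sketch; the profile name, server address and device ID are placeholders):

#create a CLI profile that points at the local spark-server and switch to it
particle config local apiUrl "http://192.168.1.10:8080"
particle config local

#claim the device against the local cloud, then restart spark-server
particle cloud claim YOUR_DEVICE_ID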

dmiddlecamp commented 7 years ago

Hi @chuank ,

Awesome! I'm glad you got it working. Safe to close this issue?

Thanks, David