particle-iot / spark-server

UNMAINTAINED - An API compatible open source server for interacting with devices speaking the spark-protocol
https://www.particle.io/
GNU Affero General Public License v3.0

CryptoStream error when using Node 0.12 #48

Open endorama opened 9 years ago

endorama commented 9 years ago

Hello, I've found a misbehaviour when running the server using Node.js 0.12.

It throws a CryptoStream error which prevents my Core from properly completing the handshake with the server.

Here you can find the relevant forum post about the behaviour.

For future reference, I'm copying here the relevant part of the server log:

Your server IP address is: [...]
server started { host: 'localhost', port: 5683 }
Connection from: 192.168.1.6, connId: 1
CryptoStream transform error TypeError: Cannot read property 'length' of null
CryptoStream transform error TypeError: Cannot read property 'length' of null
on ready { coreID: '[...]',
  ip: '192.168.1.6',
  product_id: 65535,
  firmware_version: 65535,
  cache_key: '_0' }
Core online!
CryptoStream transform error Error: error:06065064:digital envelope routines:EVP_DecryptFinal_ex:bad decrypt
onSocketData called, but no data sent.
1: Core disconnected: socket close false { coreID: '[...]',
  cache_key: '_0',
  duration: 25.082 }
Session ended for _0

As a workaround you can use node 0.10.36 (download it from here) and everything should work properly.
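For context on where this comes from: CryptoStream (presumably the Transform stream in spark-protocol that pipes the Core's TCP traffic through an AES cipher) is what raises both errors. The following is a hypothetical minimal sketch of that kind of transform, not the actual spark-protocol source; the names, the AES-128-CBC mode, and the per-chunk cipher are all assumptions. It shows how a null return on the cipher path would produce exactly the "Cannot read property 'length' of null" TypeError, and how a swallowed chunk would later surface as a "bad decrypt" once the stream desynchronizes:

// Hypothetical sketch of a CryptoStream-style Transform (assumed shape,
// not the actual spark-protocol source), decrypting each chunk with the
// session key/IV negotiated during the handshake.
var crypto = require('crypto');
var util = require('util');
var Transform = require('stream').Transform;

function CryptoStream(options) {
  Transform.call(this);
  this.key = options.key;       // 16-byte session key (Buffer)
  this.iv = options.iv;         // 16-byte IV (Buffer)
  this.encrypt = !!options.encrypt;
}
util.inherits(CryptoStream, Transform);

CryptoStream.prototype._transform = function (chunk, encoding, callback) {
  try {
    var cipher = this.encrypt
      ? crypto.createCipheriv('aes-128-cbc', this.key, this.iv)
      : crypto.createDecipheriv('aes-128-cbc', this.key, this.iv);

    // If update() or final() ever returns null instead of a Buffer,
    // Buffer.concat reads `.length` of null -- the TypeError in the log.
    var result = Buffer.concat([cipher.update(chunk), cipher.final()]);
    this.push(result);
  } catch (err) {
    // Dropping the chunk here desynchronizes the cipher stream, so a
    // later message fails padding checks: "EVP_DecryptFinal_ex:bad decrypt".
    console.error('CryptoStream transform error ' + err);
  }
  callback();
};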

satendra4u commented 9 years ago

Hi Edoardo, on my local cloud (hosted on DigitalOcean) I updated Node.js to version 0.12 by mistake. Now, even though I went back to 0.10, it is still not connecting, but it isn't giving me any error either.

The core is not breathing cyan and is not connecting to the cloud. Do you think something else is broken, even after reverting Node.js?

Please let me know if you have any idea.

Thanks, Satyen

kennethlimcp commented 9 years ago

@satendra4u, I highly recommend that you set up again with a clean installation and verify whether the issue still exists.

ashtonj commented 8 years ago

@kennethlimcp, I ran through the spark-server setup again (Mac OS X Yosemite 10.10.4) and am getting this:

I followed the guide for installing spark-server (master branch):

$ git clone https://github.com/spark/spark-server.git
Cloning into 'spark-server'...
remote: Counting objects: 350, done.
remote: Total 350 (delta 0), reused 0 (delta 0), pack-reused 350
Receiving objects: 100% (350/350), 84.73 KiB | 0 bytes/s, done.
Resolving deltas: 100% (201/201), done.
Checking connectivity... done.

$ cd spark-server
spark-server $ npm install

ursa@0.8.5 install /Users/../../../spark-server/node_modules/ursa
node-gyp rebuild

  CXX(target) Release/obj.target/ursaNative/src/ursaNative.o
  CXX(target) Release/obj.target/ursaNative/src/asprintf.o
  SOLINK_MODULE(target) Release/ursaNative.node

xtend@4.0.0 node_modules/xtend

node-oauth2-server@1.5.3 node_modules/node-oauth2-server

when@3.7.3 node_modules/when

moment@2.10.6 node_modules/moment

hogan-express@0.5.2 node_modules/hogan-express
└── hogan.js@3.0.2 (mkdirp@0.3.0, nopt@1.0.10)

spark-protocol@0.1.5 node_modules/spark-protocol
├── buffer-crc32@0.2.5
├── h5.buffers@0.1.1
├── h5.coap@0.0.0
└── hogan.js@3.0.2 (mkdirp@0.3.0, nopt@1.0.10)

express@3.4.8 node_modules/express
├── methods@0.1.0
├── merge-descriptors@0.0.1
├── range-parser@0.0.4
├── debug@0.8.1
├── fresh@0.2.0
├── cookie-signature@1.0.1
├── buffer-crc32@0.2.1
├── cookie@0.1.0
├── mkdirp@0.3.5
├── commander@1.3.2 (keypress@0.1.0)
├── send@0.1.4 (mime@1.2.11)
└── connect@2.12.0 (uid2@0.0.3, pause@0.0.1, qs@0.6.6, bytes@0.2.1, raw-body@1.1.2, batch@0.5.0, negotiator@0.3.0, multiparty@2.2.0)

ursa@0.8.5 node_modules/ursa
├── bindings@1.2.1
└── nan@1.8.4

request@2.60.0 node_modules/request
├── aws-sign2@0.5.0
├── forever-agent@0.6.1
├── caseless@0.11.0
├── stringstream@0.0.4
├── oauth-sign@0.8.0
├── tunnel-agent@0.4.1
├── isstream@0.1.2
├── json-stringify-safe@5.0.1
├── extend@3.0.0
├── node-uuid@1.4.3
├── qs@4.0.0
├── tough-cookie@2.0.0
├── http-signature@0.11.0 (assert-plus@0.1.5, asn1@0.1.11, ctype@0.5.3)
├── combined-stream@1.0.5 (delayed-stream@1.0.0)
├── mime-types@2.1.3 (mime-db@1.15.0)
├── form-data@1.0.0-rc3 (async@1.4.0)
├── hawk@3.1.0 (cryptiles@2.0.4, sntp@1.0.9, boom@2.8.0, hoek@2.14.0)
├── bl@1.0.0 (readable-stream@2.0.2)
└── har-validator@1.8.0 (bluebird@2.9.34, commander@2.8.1, chalk@1.1.0, is-my-json-valid@2.12.1)

spark-server $ node -v
v0.12.7

spark-server $ nvm install 0.10.36
v0.10.36 is already installed.
Now using node v0.10.36 (npm v1.4.28)

spark-server $ node -v
v0.10.36

spark-server $ node main.js

No users exist, you should create some users!

connect.multipart() will be removed in connect 3.0
visit https://github.com/senchalabs/connect/wiki/Connect-3.0 for alternatives
connect.limit() will be removed in connect 3.0
dyld: lazy symbol binding failed: Symbol not found: _node_module_register
  Referenced from: /Users/../../../spark-server/node_modules/ursa/build/Release/ursaNative.node
  Expected in: dynamic lookup

dyld: Symbol not found: _node_module_register
  Referenced from: /Users/../../../spark-server/node_modules/ursa/build/Release/ursaNative.node
  Expected in: dynamic lookup

Trace/BPT trap: 5

spark-server $

If I switch to node 0.12.7, I get the same output as @endorama when I run node main.js.
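For reference, that "bad decrypt" line is the generic error Node's crypto module raises when a decipher's final() is called on ciphertext that doesn't decrypt to valid padding. A minimal standalone illustration (assumed for this thread, not taken from spark-server; the Buffer constructor usage matches the Node 0.10/0.12 era):

// Feeding a decipher one bogus ciphertext block and finalizing it
// reproduces the same EVP_DecryptFinal_ex "bad decrypt" error.
var crypto = require('crypto');

var key = new Buffer(16); key.fill(0);   // dummy key/IV, demo only
var iv = new Buffer(16); iv.fill(0);

var decipher = crypto.createDecipheriv('aes-128-cbc', key, iv);
decipher.update(new Buffer('0123456789abcdef'));  // one 16-byte junk block
try {
  decipher.final();  // almost always throws ...EVP_DecryptFinal_ex:bad decrypt
} catch (err) {
  console.error(String(err));
}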

dmiddlecamp commented 8 years ago

It sounds like a problem building the "ursa" module, which depends on OpenSSL being installed. Can you run:

openssl version

ashtonj commented 8 years ago

spark-server $ openssl version
OpenSSL 0.9.8zf 19 Mar 2015

dmiddlecamp commented 8 years ago

Hmm, this might be an issue with node modules that were partially installed, and then the version of node changed... Can you delete your "node_modules" folders, and try running npm install again?
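Roughly, assuming nvm is managing the Node versions as in the transcript above, the clean rebuild would look like:

spark-server $ rm -rf node_modules
spark-server $ nvm use 0.10.36
spark-server $ npm install
spark-server $ node main.js

This forces ursa's native ursaNative.node binding to be compiled against the same Node version that later loads it, which is what the dyld "_node_module_register" error was complaining about.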

ashtonj commented 8 years ago

That solved it! Thanks David!

dmiddlecamp commented 8 years ago

Awesome!