particle-iot / spark-server

UNMAINTAINED - An API-compatible open source server for interacting with devices speaking the spark-protocol
https://www.particle.io/
GNU Affero General Public License v3.0

Pushing Events through HTTP API to subscribed core not working #30

Open chron0 opened 9 years ago

chron0 commented 9 years ago

I want to have my core subscribe to certain events, say "alerts" for example, and then execute whatever I defined in the associated event handler. This should not be bound to a coreID: any core that has subscribed to the event name "alerts" should react to all events with that name. However, when I POST to /v1/devices/events using the name "alerts", the event never actually gets pushed to the core. After a lot of logger.log calls in different parts of the code, I think I have isolated the problem to node_modules/spark-protocol/clients/SparkCore.js:

    try {
        if (!global.publisher) {
            logger.error('No global publisher');
            return;
        }

        if (!global.publisher.publish(isPublic, obj.name, obj.userid, obj.data, obj.ttl, obj.published_at, this.getHexCoreID())) {
            // this core is over its limit, and that message was not sent.
            this.sendReply("EventSlowdown", msg.getId());
            logger.log('EventSlowdown triggered ' + this.getHexCoreID());
        }
        else {
            this.sendReply("EventAck", msg.getId());
            logger.log("onCoreSentEvent: sent to " + this.getHexCoreID());
        }
    }
    catch (ex) {
        // catch block not included in the excerpt above
        logger.error("onCoreSentEvent: " + ex);
    }

It seems to me that global.publisher.publish always returns false here, so the code takes the EventSlowdown branch as if the core were over its rate limit. I haven't really understood how the publisher works, I have some trouble interpreting the code, and I might just be doing something completely wrong. If anyone else has something like this working, any advice or example is welcome; otherwise it feels like a bug to me :)
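For what it's worth, a quick way to confirm which branch is taken would be to capture the return value before branching. This is hypothetical debug instrumentation for the function shown above, not part of the original code:

    // Hypothetical instrumentation: log what publish() returns before
    // branching, to confirm whether the EventSlowdown path is really taken.
    var sent = global.publisher.publish(isPublic, obj.name, obj.userid,
        obj.data, obj.ttl, obj.published_at, this.getHexCoreID());
    logger.log("publish('" + obj.name + "') returned " + sent +
        " for core " + this.getHexCoreID());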

To make it more clear: I don't want the core to subscribe/react to events from other cores. I just want to have cores subscribe to a designated channel "alerts" and decide what to do depending on the event data. The trigger should be a simple POST through the spark-server API (as defined in the spark-server docs), so that hubot scripts or whatever else can trigger these events; a sketch of what I mean follows.
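For illustration, here is roughly how such a trigger script could look in Node, assuming the firmware side has already called Spark.subscribe("alerts", handler). The host, port, payload, and access token are placeholders, and the field names follow the spark-server/Particle HTTP API as described in the docs:

    // Hypothetical trigger script: POSTs an "alerts" event to the
    // spark-server HTTP API, which should then push it to every core
    // subscribed to that event name.
    var request = require('request');

    request.post('http://localhost:8080/v1/devices/events', {
        form: {
            name: 'alerts',              // event name the cores subscribed to
            data: 'water-leak-basement', // arbitrary payload for the handler
            access_token: 'PLACEHOLDER_TOKEN'
        }
    }, function (err, res, body) {
        if (err) {
            return console.error('POST failed: ' + err);
        }
        console.log('server replied: ' + body);
    });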