Right now, as a workaround, we are using the TMI.js library to access Twitch chat, since it was already written for JavaScript. This is not the correct implementation. First, we can connect natively to the Twitch API using Streamer.bot's context; we don't need an external library to access this functionality. Separately, we can use the queueing functionality of C# together with Streamer.bot's "keep instance alive" feature to hold the last 20 chat messages in memory.
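A rough sketch of that queue-based approach, assuming the standard Streamer.bot inline C# scaffold (a `CPHInline` class with an `Execute()` entry point and an `args` dictionary carrying the chat message); the `"message"` argument name and the field names here are illustrative, not confirmed against this bot's actual action:

```csharp
using System.Collections.Generic;

public class CPHInline
{
    // With "keep instance alive" enabled, these fields persist between
    // action runs, so the queue accumulates across chat messages.
    private readonly Queue<string> recentMessages = new Queue<string>();
    private const int MaxMessages = 20;

    public bool Execute()
    {
        // Assumed: Streamer.bot passes the chat text as the "message" argument.
        if (!args.TryGetValue("message", out object msg) || msg == null)
            return false;

        recentMessages.Enqueue(msg.ToString());

        // Drop the oldest entries so only the last 20 messages are kept.
        while (recentMessages.Count > MaxMessages)
            recentMessages.Dequeue();

        return true;
    }
}
```

Because the state lives in an instance field rather than an external process, it is cleared whenever the instance is torn down, which matches the "memory is cleared on each exit" behavior described below.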
The dependency on Node.js has been removed in 1.0 RC1; the bot now records chat into memory, and that memory is cleared on each exit. This should be more reliable and keep up with really fast chat.