So are you saying that I should make it possible to export Neataptic's network to TensorFlow? Or are you saying I should implement some kind of training script that uses TensorFlow to compute the error?
I'll do some reading on how TensorFlow saves/exports networks. I'll look into the possibility of converting a TensorFlow Graph to Neataptic, or the other way around.
I've started writing something like a wrapper for Node.js to call Keras API functions, starting with LSTM. I'll post the link here later tonight.
Here's what I've got so far (sorry, I haven't gotten around to putting things on GitHub yet):
https://drive.google.com/drive/folders/0BwA2KVlVFTQtaEc0aFRrSy1WTTQ?usp=sharing
There's an error on line 16 because of the units variable: for one layer I'm creating an LSTM with units set to 50, and for the other I don't pass it in at all, so it arrives as undefined, which throws an error even though you don't actually need a units value for the first layer. I'm very much not a Python developer; my experience has been almost exclusively JavaScript from the very beginning. C, Python and the shell got maybe a couple of weeks all together, versus years of JavaScript. I know how I would write what I want in JavaScript, but I'm unsure of the translation. I'll research it, of course, but if you take a look at what I have and can see what I'm doing wrong, let me know, lol. There's not much else that needs to be done to get the basic wrapper up besides this error; once it's cleared, the other attributes should be available.
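In case it helps, one way to deal with that on the Python side is to strip out anything the Node.js side passes as undefined (which arrives as None) before handing the rest to Keras. A minimal sketch, assuming the layer config has already been parsed into a dict; the names here are illustrative, not the actual lstm_init.py code:

```python
from keras.layers import LSTM

def build_lstm(layer_config):
    """Build an LSTM layer, dropping any keys the Node.js side left undefined."""
    clean = {k: v for k, v in layer_config.items() if v is not None}
    units = clean.pop('units', 50)  # fall back to a default when not supplied
    return LSTM(units, **clean)

# e.g. a layer config that arrived with 'units' set to undefined/None
layer = build_lstm({'units': None, 'return_sequences': True, 'input_shape': (50, 1)})
```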
Upload all of those files to your main directory. For me it was /home/oughtimpliescan1/, so you will have to do a search and replace across lstm_init.py, lstm_train.py and tensorNode.js with whatever directory you choose to use. I've essentially made an artificial version of the shell's stdout and stdin, but with objects saved as JSON strings that are then parsed, and their attributes passed to Keras functions. There's a bit to it, but essentially you initialize the model with the function lstm_init(configurations, callback) in tensorNode.js, and the same goes for lstm_train. Everything operates off of a global object that the various boilerplate functions use when they need it; it's updated with the returns from Keras functions, and everything is saved to the io file automatically. Hopefully that makes sense? Locations for where models are saved are passed to the io object (data for multiple models can be stored in a separate JSON object saved to a file instead of the main io object file).
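To make that mechanism a bit more concrete, here is a rough sketch of what the Python side of the JSON-file handoff could look like. The file path, key names and layer structure are all hypothetical and not the actual contents of lstm_init.py:

```python
import json
from keras.models import Sequential
from keras.layers import LSTM, Dense

IO_FILE = '/home/your_user/io.json'  # replace with whatever directory you chose

# the Node.js side writes its configuration object to the io file as a JSON string
with open(IO_FILE) as f:
    config = json.load(f)

# build the model layer by layer from the configuration
model = Sequential()
for i, layer in enumerate(config['layers']):
    kwargs = {}
    if i == 0:
        kwargs['input_shape'] = tuple(layer['input_shape'])  # only the first layer needs a shape
    if layer['type'] == 'lstm':
        model.add(LSTM(layer.get('units', 50),
                       return_sequences=layer.get('return_sequences', False),
                       **kwargs))
    elif layer['type'] == 'dense':
        model.add(Dense(layer['units'], **kwargs))

model.compile(loss=config.get('loss', 'mse'),
              optimizer=config.get('optimizer', 'rmsprop'))

# report back to the Node.js side through the same io file
config['status'] = 'initialized'
with open(IO_FILE, 'w') as f:
    json.dump(config, f)
```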
After you've uploaded the files, chmod 755 all of them and then run 'sudo node tensorNode.js'; you will see what the error codes are. Oh, by the way, I'm doing all of this on Google Cloud Platform, so I'm just SSHing into a VM running Ubuntu with all of the prerequisites needed for Keras and TensorFlow, as well as Neataptic and headless Chrome (which should be useful for the pipeline). I've got a copy of the image that I think others can download and use for their own GCP Compute Engine instance, and if they set it to an f1-micro machine, they won't use any of their cloud credits during the configuration stage and can later upgrade for more computation and RAM.
After I get this wrapper working, the only real thing left to do is to experiment with the different parameters available for the LSTM, Dense and fit methods:
https://keras.io/layers/recurrent/#lstm
https://keras.io/models/sequential/#sequential-model-methods
https://keras.io/layers/core/#dense
https://keras.io/layers/core/#activation
https://keras.io/layers/core/#dropout
Oh, and the function lstm_example() that is called when you run the sudo node tensorNode.js command is essentially trying to replicate the very simple model that was used for predicting Google's stock price here:
https://github.com/llSourcell/How-to-Predict-Stock-Prices-Easily-Demo/blob/master/stockdemo.ipynb
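For reference, that model boils down to a small stack of Keras calls. Something along these lines (Keras 2.x syntax; layer sizes and training arguments are approximate and the data here is a random placeholder, not copied from the notebook):

```python
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense, Dropout, Activation

# sequences of 50 timesteps with a single feature (e.g. a normalized price)
timesteps, features = 50, 1
X_train = np.random.rand(256, timesteps, features)  # placeholder training data
y_train = np.random.rand(256, 1)

model = Sequential()
model.add(LSTM(50, return_sequences=True, input_shape=(timesteps, features)))
model.add(Dropout(0.2))
model.add(LSTM(100, return_sequences=False))
model.add(Dropout(0.2))
model.add(Dense(1))
model.add(Activation('linear'))

model.compile(loss='mse', optimizer='rmsprop')
model.fit(X_train, y_train, batch_size=64, epochs=1, validation_split=0.05)
```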
I'm very unfamiliar with TensorFlow and Keras. But from what I'm seeing, you basically created a Node.js script that calls a batch command to run a Python script?
I was actually thinking of making Neataptic and TensorFlow models exchangeable.
Exchangeable in the sense that the developer would have to know both TensorFlow and Neataptic, or just Neataptic and a function call within Neataptic/JavaScript that would then start a TensorFlow session using whatever parameters you have provided?
I actually thought you meant the ability to train/evolve a network in Neataptic and then import that same network into TensorFlow (or the other way around).
I think what you have made is basically a Node wrapper for TensorFlow. But before you start putting a lot of effort into this, you have to ask yourself: "Why wouldn't people just use TensorFlow without the Node.js wrapper?" If people really want fast training, they would just use Python anyway.
My understanding of how TensorFlow works is that you come up with a list of configurations and sequences in Python, all of which gets sent to the TensorFlow API at once and then processed by TensorFlow. Essentially it could have been any scripting language, since the scripting language isn't the actual processing code, but they chose to go with Python instead of JavaScript. That means that if you use the Keras API and some boilerplate functions, you can just pass configuration settings through the shell and do the rest of your project's logic in JavaScript.
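You can see the same idea in Keras itself: the model definition is just serializable configuration, independent of the backend that executes the math. A quick illustration (Keras 2.x; a generic example, not tied to the wrapper):

```python
from keras.models import Sequential, model_from_json
from keras.layers import Dense

model = Sequential([Dense(4, input_shape=(8,), activation='relu'),
                    Dense(1)])

arch = model.to_json()           # the architecture as a plain JSON string
rebuilt = model_from_json(arch)  # rebuild the same architecture from that string
```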
Well, yes. I want to be able to train/evolve a network on both Neataptic and TensorFlow, so as to have a local presence through the browser but also more industrial computing on GCP that becomes the starting point for whatever network is being applied locally.
The question I was asking myself before I began building this thing is: "Do I want to spend my time learning arbitrary things like redundant programming languages, or do I want to spend my time learning the actual science behind machine learning and just do everything in JavaScript (which happens to be the most popular programming language on the planet, and so would potentially be the most accessible to others)?" A loaded question, I know, but as I say, my mind thinks in JavaScript, and there's nothing another programming language can do that JavaScript cannot, apart from raw speed. That's why the TensorFlow API is helpful: machine learning is one of the few areas where speed performance actually matters and isn't just an obsessive-compulsive thing, since computation sessions can literally take hours or days instead of fractions of a second, especially once you start doing some of the more interesting things with AI.
That said, I want to reiterate what I said before: I love Neataptic, and I love the whole concept behind evolving a network's architecture and even some of the learning parameters... but as far as I'm aware there isn't a pre-built model within Keras or TensorFlow that applies the NEAT approach, and I've already been playing with Neataptic for a while now, so I think that would be the way to go?
So, just to hopefully clarify: I have not yet begun examining the way Keras stores the representation of the network with its model.save function; that is still on the list of things to do. The first thing for me to do is to successfully recreate a Keras model by sending configuration settings via a Node.js function call. I have made some progress since yesterday, but there's still some confusion with the Python syntax that I'm trying to work through right now. The files in that Drive folder should be updated with where I am right now.
I see what you mean. I think the best thing to do is to create a separate Node module for your project (or fork Neataptic). If you decide to create a separate Node module that has Neataptic as a dependency and implements TensorFlow learning, I'll definitely link it in the README.md file. If you choose to fork, I'll keep track of the development and I might create a PR.
I am completely unfamiliar with TensorFlow, but do Keras and TensorFlow allow such flexible network topologies as Neataptic has? I always thought Keras only supported layered networks, but I think I might be wrong.
But if anything, keep me updated. It would be nice to see some working examples in the future with a step-by-step tutorial on how to run them.
I'm sure there are bugs somewhere, but I've got the basics working now. Same folder; the files will have been updated. I broke down and resorted to using eval on strings instead of trying to figure out the idiosyncrasies of Python, lol.
I also wonder how flexible they are. I am not sure it can be done.
OK, so I've got at least a working example to show, and I can pretty easily document everything with something like Google Docs, with screenshots and step-by-step instructions, but I've never put in the time to learn how to use GitHub, lol; I've always just been focusing on my own projects and using Google Drive. I can make the document and, if you want, you can just copy-paste it into one of your GitHub pages (assuming you're still OK with the project I've got started so far, which eventually intends to scrape the Keras network data, such as architecture, weights, biases, activations, etc., and then recreate those networks in Neataptic for computing local/exclusive networks in the end user's browser, desktop or mobile progressive web app, and so on).
Wrappers already exist for both TensorFlow and Keras. To infer the network, just use one of the existing wrappers.
Also, it would not be possible to work on a project with collaborators through a Google Drive folder. With Git, collaboration is as simple as:
git clone https://github.com/wagenaartje/neataptic.git
# do edits
git add .
git commit -m "Changing N with M"
git push origin master
Thank you so much for linking me to those two projects; I was unaware of them prior to this. However, here are my thoughts after just a quick glance at both of them...
Only one of those is Keras; TensorFlow is unnecessarily complicated, which is why Keras exists (compare how easy it is to save a graph in Keras to how much it takes to save a graph in TensorFlow). The complexity would be relevant if the underlying substance of what you are doing actually justified it, but you're really not doing anything all that complex when you're trying to save and restore a neural network.
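To illustrate the Keras half of that comparison, saving and restoring is a single call each way (Keras 2.x with h5py installed; this is a generic example, not the wrapper code):

```python
from keras.models import Sequential, load_model
from keras.layers import Dense

model = Sequential([Dense(8, input_shape=(4,), activation='relu'), Dense(1)])
model.compile(loss='mse', optimizer='rmsprop')

model.save('my_model.h5')             # one call serializes architecture, weights and optimizer state
restored = load_model('my_model.h5')  # one call brings it back, ready to train or predict
```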
I like how I've laid out my default interface a lot more than what I initially saw on the keras-js GitHub (that said, they have GPU support for the browser, so that's huge; I wonder if they're going to support TPUs on Compute Engine, though I think Keras will support TPUs automatically since it's a Google project on top of TensorFlow). I've essentially packed everything into 11 functions, all of which are from this page, with very straightforward configuration objects and callbacks. Within those configuration objects is where you define layers as an array of sub-objects with their attributes, plus whatever additional data is needed, like optimizers and loss functions for the compiler. In essence, the entire process of compile, fit, evaluate and predict really is condensed to those four functions, at least for the LSTM and Dense layers; I know there's tons and tons of stuff in Keras, lol. I do need to trim a few things here and there, but I guess I'll continue with what I've been working on so far. It's really fascinating to see what other people are working on.
As for working on the Google Drive folder project with others: if you're using TensorFlow, even if it's Keras.js within Node, you should have access to a shell of some sort that is set up for Node.js, Keras, TensorFlow, etc. If you do have all of that, you can look in that Drive folder I linked earlier, navigate to the start file, open it in a text editor, copy the third chunk there (except for the last line), run it in a terminal, and you will see the result of the tensorNode.js file. For those who do not already have an environment set up for all of this, you can easily initialize a Compute Engine instance on Google Cloud Platform and then run the whole start file from the SSH portal. The f1-micro instance is free, but you will need a larger machine type just to install TensorFlow, so pick a larger machine type; after everything has been calculated, the last line there will shut off your machine, and it will probably cost less than five cents (Google gives you $300 of credit for your first year). If anybody needs more explicit steps for Compute Engine on GCP, let me know and I'll take screenshots.
After you've done all of that, simply do a sudo ./runScript in the directory where you first saved all of those files and you will get the latest version of the code for this project.
I love JavaScript and I love Neataptic. I think TensorFlow should definitely have been scripted in JavaScript instead of Python... but that's the principle of what I'm talking about: in theory, there should be some procedure whereby a Network in Neataptic stored as a JSON object can be transformed into a Graph in TensorFlow, where all of the same logic of applying activation functions and modifying weights and biases is computed, just computed within TensorFlow's C++ library, which is supposed to be around 5 times faster than JavaScript's V8 engine, if I'm correct. That, and Google's TPU announcement, makes me want this JSON converter more than anything, lol.
What would be needed is an interface between myNetwork.toJSON() / Network.fromJSON() in Neataptic and tf.train.export_meta_graph / tf.train.import_meta_graph in TensorFlow (as well as whatever boilerplate would be needed to run training off of inputs, i.e. a glorified function call that takes a couple of objects as input data).
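On the TensorFlow side, the export/import calls I mean look roughly like this (TensorFlow 1.x; the toy graph is just for illustration, since the real converter would have to build the graph from Neataptic's JSON instead):

```python
import tensorflow as tf

# toy graph standing in for a converted Neataptic network
x = tf.placeholder(tf.float32, shape=[None, 3], name='x')
w = tf.Variable(tf.random_normal([3, 1]), name='w')
y = tf.matmul(x, w, name='y')

saver = tf.train.Saver()
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver.save(sess, './model.ckpt')            # weights
    tf.train.export_meta_graph('./model.meta')  # graph structure

# later, or in another process: restore both the structure and the weights
tf.reset_default_graph()
with tf.Session() as sess:
    restorer = tf.train.import_meta_graph('./model.meta')
    restorer.restore(sess, './model.ckpt')
```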
Does anybody else think this is a good idea? I think some minimal Python code, and perhaps even some shell scripts run from Node.js child processes, could make this a relatively simple thing, no? Does anybody have experience with TensorFlow?