champierre opened 6 years ago
I'm not sure if this is the correct way to do this, but how I've been doing it is by storing tensor data in storageDB and then retraining the model onLoad. To store tensor data, simply use the .data() method on the tensor:
// Clone the existing tensor.
// .data() is asynchronous, so we clone the old tensor
// in case it's disposed of by the time we generate our next tensor.
let tensor = oldTensor.clone()
tensor.data().then((data) => {
  // Store the tensor data (saveTensorToStorageDB is the app's own helper)
  saveTensorToStorageDB(data)
  tensor.dispose()
})
Then, to load it back in try something like:
let tensorShape = [227, 227, 3]
loadTensorsFromStorageDB().then((trainedData) => {
  trainedData.tensors.forEach((tensors, modelClass) => {
    tensors.forEach((data) => {
      let tensor = deeplearn.tensor(data, tensorShape)
      this.classifier.addImage(tensor, modelClass)
      tensor.dispose()
    })
  })
})
You can either download the trained data as a JSON file, save it to localStorage (5MB limit), or save it to StorageDB (50MB limit). You can also save it to Firebase or some other remote store. I wrote about it in more detail (with examples) here: https://learnwithoz.github.io/day-3-model-persistence
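Whichever store you pick, the core of this approach is the same round trip: copy the tensor's typed-array data into a JSON-safe plain array, stringify it, then rebuild the typed array (and shape) on load. A minimal sketch of that round trip, with `savedData` standing in for the `Float32Array` that `tensor.data()` resolves with:

```javascript
// Round-trip sketch: tensor data -> JSON string -> tensor data.
// `savedData` stands in for the Float32Array resolved by tensor.data().
const savedData = Float32Array.from([0.1, 0.2, 0.3, 0.4, 0.5, 0.6]);
const shape = [2, 3];

// Serialize: typed arrays are not JSON-safe, so copy into a plain array first.
const payload = JSON.stringify({ data: Array.from(savedData), shape: shape });

// ...write `payload` to localStorage, IndexedDB, a file, or a remote store...

// Deserialize: rebuild the typed array, ready to pass to deeplearn.tensor(data, shape).
const parsed = JSON.parse(payload);
const restored = Float32Array.from(parsed.data);

console.log(restored.length, parsed.shape);
```

Storing the shape alongside the data (rather than hard-coding it at load time) keeps the saved file self-describing.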
Thank you very much for your advice. I wanted to see your examples, but was unable to access the page at https://learnwithoz.github.io/day-3-model-persistence.
@champierre hey looking for the same thing, did you ever figure out how to do it?
@atzin-em I implemented an export/import feature on a different project, but the mechanism should be the same. Hope these help you.
download(export) https://github.com/champierre/ml2scratch/blob/master/main.js#L341
upload(import) https://github.com/champierre/ml2scratch/blob/master/main.js#L354
@champierre Thank you very much, help was greatly appreciated
@champierre would you happen to know how to also upload photos to the model manually instead of through the webcam?
@atzin-em It would be interesting to try, but I have not yet.
Hi - I am wondering about this same issue. A lot of the example links above are broken now. Please advise.
Hi! A simple strategy is saving the model out as a JSON file with data from the classifier, then importing it back in to access the weights.
You can read more about this method here: https://js.tensorflow.org/tutorials/model-save-load.html
hi @mayaman - thanks for the tip, but I am very much a noob. In trying to adapt this boilerplate code specifically, I can't quite pinpoint what they are calling the "model" if I was to apply the code below:
const saveResult = await model.save('localstorage://my-model-1');
Any ideas?
Is there any way to save the data and import it? Where can i find the trained data from the teachable machine example?
All the links in this thread are dead :(
Thank you.
There is a new release coming later this year that will have this feature! You can read more about it here: https://teachablemachine.withgoogle.com/io19.
I've seen that, but is there any way to export this now? or how to do that?
There are a number of ways to save your trained model and also upload that trained model and continue from where you left off.
To use the below functions, you need to create 2 buttons for saving and uploading in the index.html file.
Then, in the main.js file under the start() function, do something like this:
document.getElementById('download_button').addEventListener('click', async () => downloadJSON(this.knn));
document.getElementById('loadJSON_button').addEventListener('change', async () => loadJSON(this.knn, event));
and define your functions.
Here is an example for saving (Downloading a JSON file)
const downloadJSON = async (knn) => {
  let dataset = knn.getClassifierDataset();
  var datasetObj = {};
  Object.keys(dataset).forEach((key) => {
    // dataSync() returns a typed array, which is not JSON-safe,
    // so copy it into a plain array first
    let data = dataset[key].dataSync();
    datasetObj[key] = Array.from(data);
  });
  let jsonModel = JSON.stringify(datasetObj);
  // Trigger a browser download via a temporary anchor element
  let downloader = document.createElement('a');
  downloader.download = "model.json";
  downloader.href = 'data:text/text;charset=utf-8,' + encodeURIComponent(jsonModel);
  document.body.appendChild(downloader);
  downloader.click();
  downloader.remove();
}
And here is an example to upload the JSON file in to the web application
const loadJSON = async (knn, event) => {
  let inputModel = event.target.files;
  console.log("Uploading");
  let fr = new FileReader();
  if (inputModel.length > 0) {
    fr.onload = async () => {
      var dataset = fr.result;
      var tensorObj = JSON.parse(dataset);
      Object.keys(tensorObj).forEach((key) => {
        // 1000 is the length of each stored logits vector
        tensorObj[key] = tf.tensor(tensorObj[key], [tensorObj[key].length / 1000, 1000]);
      });
      knn.setClassifierDataset(tensorObj);
      console.log("Classifier has been set up! Congrats!");
    };
    // readAsText does not return a promise; fr.onload fires when reading completes
    fr.readAsText(inputModel[0]);
  }
  console.log("Uploaded");
}
Is there a way that I can export/import the training data? After training a class with many images, I want to export (download) the training data so I can reuse it.