Closed kingsharaman closed 4 years ago
When will relu6 be usable in the WebGL backend?
@lewis617 relu6 will be available in the next tf-core release. It was added after 1.2.9.
When will the next version be released?
It is available now, please check.
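For anyone unfamiliar with the op under discussion: relu6 is just ReLU with its output clamped at 6, used throughout MobileNet. A minimal NumPy sketch of its semantics (this is just an illustration, not the tfjs kernel):

```python
import numpy as np

def relu6(x):
    """relu6(x) = min(max(x, 0), 6): ReLU with the output clamped at 6."""
    return np.minimum(np.maximum(x, 0.0), 6.0)
```

The clamp is what makes it attractive for quantized/mobile models like MobileNet, and also why it needs its own fused kernel in the WebGL backend rather than reusing plain ReLU.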
Is it also available in tensorflowjs_converter? I've been trying out a Keras -> tfjs_graph_model conversion to see if I get an inference speed boost, but even after upgrading to 1.2.10 I'm still getting this today: ValueError: Unknown activation function:relu6 (Update: still seeing this with 1.2.10.1 today.)
Hey @pyu10055 - is this expected? I would have thought this would be fixed with 1.2.10.
@annxingyuan @pyu10055 Any further thoughts on this? I had tried it out with 1.2.10.1 and it still was giving the same message. I'd try 1.2.11 but I don't see it on pypi yet.
For what it's worth, I did get this "working" by hacking it directly into my venv's Keras activations.py. I know I had to add this as a custom object to Keras in an older version for general Python usage, but without control of the custom object scope, I figured this would be the fastest way to test it (I also had a custom weighted loss function to plumb in while I was at it, since my model was not stock MobileNetV2, so it wasn't a big deal).
One thing I also ran into while working on this was errors like the following:

```
File "/home/developer/Desktop/tfjs-convert/venv/lib/python3.6/site-packages/tensorflow/python/eager/context.py", line 865, in remove_function
TypeError: 'NoneType' object is not callable
```
After a bit of searching, this seemed like #1582. I noticed that requirements.txt seems to be pulling in 1.14.0, so I tried upgrading tensorflow, which bumped it to 2.0.0. After re-hacking relu6 (and my custom weighted loss), conversion succeeded. I'll have to do further testing on the output to see if everything ended up all right.
Update: correctness seems fine and speed is almost 2x! Thanks @annxingyuan @pyu10055 for helping look at this and making improvements.
@wingman-jr-addon Are you also using tfjs-core 1.2.10 and tfjs-converter 1.2.10 on the client side? Or tfjs 1.2.10 (which depends on tfjs-core 1.2.10 and tfjs-converter 1.2.10)?
JS client libraries: https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@1.2.10/dist/tf.min.js
Python side:
```
(venv) developer@developer-VirtualBox:~/Desktop/tfjs-convert$ python --version
Python 3.6.8
(venv) developer@developer-VirtualBox:~/Desktop/tfjs-convert$ pip show tensorflow
Name: tensorflow
Version: 2.0.0
Summary: TensorFlow is an open source machine learning framework for everyone.
Home-page: https://www.tensorflow.org/
Author: Google Inc.
Author-email: packages@tensorflow.org
License: Apache 2.0
Location: /home/developer/Desktop/tfjs-convert/venv/lib/python3.6/site-packages
Requires: gast, grpcio, keras-applications, astor, termcolor, tensorflow-estimator, absl-py, keras-preprocessing, protobuf, google-pasta, wheel, wrapt, tensorboard, opt-einsum, numpy, six
Required-by: tensorflowjs
(venv) developer@developer-VirtualBox:~/Desktop/tfjs-convert$ pip show tensorflowjs
Name: tensorflowjs
Version: 1.2.10.1
Summary: Python Libraries and Tools for TensorFlow.js
Home-page: https://js.tensorflow.org/
Author: Google LLC
Author-email: opensource@google.com
License: Apache 2.0
Location: /home/developer/Desktop/tfjs-convert/venv/lib/python3.6/site-packages
Requires: h5py, tensorflow-hub, gast, tensorflow, PyInquirer, six, numpy
Required-by:
```
Only the local installation of tensorflow was modified, in the following way:

```
(venv) developer@developer-VirtualBox:~/Desktop/tfjs-convert$ vi venv/lib/python3.6/site-packages/tensorflow_core/python/keras/activations.py
```

```python
@keras_export('keras.activations.relu6')
def relu6(x):
    return K.relu(x, max_value=6)
```
Sorry for the formatting.
@wingman-jr-addon Try 1.2.11? We just did another release.
@annxingyuan Well, I've been watching for an update but even though GitHub says it's out there, PyPI and pip say it's still on 1.2.10.1. Perhaps the build server needs a poke?
@annxingyuan @pyu10055 Just checking, should PyPI still say 1.2.10.1 or should it be 1.2.11? I just want to make sure there is not an unexpected problem with build infrastructure; I'm not in a rush for the converter updates.
@wingman-jr-addon I'm a little confused - are you saying you converted a model with the tensorflowjs converter 1.2.10.1, but then are seeing "cannot use relu6 with webgl" in the browser? Which versions of tfjs are you using in the browser?
@annxingyuan The issue is on the Python side.
Originally, I did the following at release 1.2.10.
Then I waited a few days. When I saw 1.2.10.1 was released, I thought that maybe this issue was fixed since it appeared relu6 support was being added across the board. Then I did the following.
Then I decided to dig into the code and try to figure out why this was not working. After examining it and trying a thing or two, I realized it would probably be fastest for me to modify my venv's local Keras. So I did the following:
After this, we corresponded and you indicated that v1.2.11 is released. So then I did the following:
So, I had two questions.
Thank you for your patience if I am misunderstanding the situation or relu6 support.
@wingman-jr-addon Thank you for trying out the different options. The error message comes from the tf.keras loader: since relu6 is a MobileNet custom Python activation function, it needs to be provided to the loader, similar to the following:
```python
model = load_model('mobilenet.h5', custom_objects={
    'relu6': mobilenet.relu6,
    'DepthwiseConv2D': mobilenet.DepthwiseConv2D})
```
Our converter did not fix the error from the Keras loader; we only added support for fusing the TensorFlow relu6 op into conv2d.
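To make it concrete why passing custom_objects fixes this: the Keras loader resolves activation names through a registry of built-ins merged with whatever custom_objects supplies, and raises the "Unknown activation function" error on a miss. A toy pure-Python sketch of that lookup (an illustration of the pattern, not the actual Keras code):

```python
# Toy stand-ins for Keras's built-in activation registry.
BUILTIN_ACTIVATIONS = {
    'relu': lambda x: max(x, 0.0),
    'linear': lambda x: x,
}

def get_activation(name, custom_objects=None):
    """Mimic the loader's name resolution: built-ins plus caller-supplied customs."""
    registry = dict(BUILTIN_ACTIVATIONS)
    registry.update(custom_objects or {})
    if name not in registry:
        raise ValueError('Unknown activation function:' + name)
    return registry[name]

# Supplying relu6 via custom_objects makes the lookup succeed.
relu6 = lambda x: min(max(x, 0.0), 6.0)
fn = get_activation('relu6', custom_objects={'relu6': relu6})
```

Hacking relu6 into the installed activations.py effectively injects it into the built-in registry, which is why that workaround also succeeded.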
Thank you for catching the version discrepancy of the pip package; we actually did not release 1.2.11, since there have been no changes since 1.2.10.1.
There might be a better way to work around your problem:
This only works if you only need to do inference using TFJS.
Thanks @pyu10055 for the explanation! I'm familiar with the "load_model with custom_objects" method from the Python side, as I've had to deal with it there. I didn't see a corresponding custom-objects parameter for tensorflowjs_converter (nor am I quite sure how one would add that flexibility in the future), so I just tried my hack. I may have to give your method a shot in the future, as it sounds cleaner for sure; I haven't played as much with TF's underlying formats as with the Keras .h5 format, so thanks for the suggestion on a better-supported path.
1.2.9
Chrome 76.0.3809.132 (Official Build) (64-bit)
When I try to run model.predict() with the converted DeepLab model with a MobileNet backend, I get the following error message:

```
Activation relu6 has not been implemented for the WebGL backend.
```
This should have been fixed in #2016, but I think you should also modify mapActivationToShaderProgram in https://github.com/tensorflow/tfjs/blob/89f8275e6ae6f3d0315225365c86d3c346df0ccd/tfjs-core/src/backends/webgl/backend_webgl.ts
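The suggested fix amounts to extending a dispatch table: mapActivationToShaderProgram maps a fused activation name to a WebGL shader snippet and throws the "has not been implemented" error for names it doesn't know. A toy Python sketch of that dispatch pattern (the real implementation is TypeScript in backend_webgl.ts, and the shader strings here are hypothetical):

```python
# Hypothetical name -> GLSL-snippet table; the real entries live in backend_webgl.ts.
ACTIVATION_SHADERS = {
    'linear': 'return x;',
    'relu': 'return max(x, 0.0);',
    'relu6': 'return clamp(x, 0.0, 6.0);',  # the entry this issue asks to add
}

def map_activation_to_shader_program(activation):
    """Mimic the WebGL backend's activation dispatch: known names map to
    shader source; unknown names raise the error seen in this issue."""
    if activation not in ACTIVATION_SHADERS:
        raise NotImplementedError(
            'Activation %s has not been implemented for the WebGL backend.'
            % activation)
    return ACTIVATION_SHADERS[activation]
```

Until the relu6 entry exists in that table, the converter can emit a fused relu6 node but the WebGL backend has no shader to run it with, which is exactly the failure mode reported above.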