TriLoo opened this issue 7 years ago
You can download it manually from here: https://github.com/fchollet/deep-learning-models/releases/download/v0.1/vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5
and place it in
~/.keras/models
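For reference, the default cache path can be built portably like this (a sketch; the filename is the one from the release URL above, and this assumes the default Keras cache location):

```python
import os

# Keras caches downloaded weights under ~/.keras/models by default.
weights_name = "vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5"
cache_dir = os.path.join(os.path.expanduser("~"), ".keras", "models")
os.makedirs(cache_dir, exist_ok=True)  # create the folder if it is missing
weights_path = os.path.join(cache_dir, weights_name)
print(weights_path)
# Once the manually downloaded file sits at weights_path,
# VGG16(weights="imagenet") should find it instead of re-downloading.
```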
Hello everyone,
I am using a Jetson TX2 (flashed with JetPack 3.0) for the tutorial "ImageNet classification with Python and Keras". I have installed all the dependencies. When I tried running the script as described in the tutorial, it gave me the error below:
```
Downloading data from https://github.com/fchollet/deep-learning-models/releases/download/v0.1/vgg16_weights_tf_dim_ordering_tf_kernels.h5
Traceback (most recent call last):
  File "test_imagenet.py", line 40, in <module>
    model = VGG16(weights="imagenet")
  File "/home/nvidia/deep-learning-models/imagenet-example/vgg16.py", line 143, in VGG16
    cache_subdir='models')
  File "build/bdist.linux-aarch64/egg/keras/utils/data_utils.py", line 222, in get_file
Exception: URL fetch failure on https://github.com/fchollet/deep-learning-models/releases/download/v0.1/vgg16_weights_tf_dim_ordering_tf_kernels.h5: None -- [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)
```
I tried downloading the file and placed it in the ~/.keras/models folder, but I am still getting the same error.
Can someone help me with this?
Thank you.
Export the proxy environment variables: http_proxy, socks_proxy, https_proxy, ftp_proxy, etc.
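If you cannot export them in the shell, the same variables can be set from inside Python before Keras triggers the download. A sketch (the proxy address is a placeholder; replace it with your actual proxy):

```python
import os

# Placeholder address; replace with your company's proxy.
proxy = "http://proxy.example.com:8080"
for var in ("http_proxy", "https_proxy", "ftp_proxy"):
    os.environ[var] = proxy  # urllib, which Keras uses for downloads, honors these
print(os.environ["https_proxy"])
```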
I faced the same problem: the vgg16_weights_tf_dim_ordering_tf_kernels.h5 file was downloaded directly from the browser and placed into the ./keras/models folder, but when the application runs it still downloads the model. Why is this happening?
I am facing the same issue. Even if I download the VGG16 weights file and manually place it in the Keras models folder, the program still tries to download it and stops after about 33 MB with the error 'Connection closed by remote host'. Please suggest a solution.
I also hit the same problem. I downloaded the weight file manually, but it still downloads it again every time. How can I solve this? Thank you!
I faced the same problem and found a workaround. If you are using VGG16 from keras.applications or from https://github.com/fchollet/deep-learning-models, the download happens in any case. To stop this download, I took vgg16.py from the link just mentioned and provided the path where I stored my manually downloaded weights. This happens at line 170 in vgg16.py, where you can comment out the if/else condition and modify the load_weights call accordingly. Finally, add 'from vgg16 import VGG16' to your main Python file. Hope this made sense.
If you're on Mac OS, I found a solution here: https://github.com/ageron/handson-ml/issues/46
If you download the model from https://github.com/fchollet/deep-learning-models/releases/download/v0.1/vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5, make sure you set the include_top argument to False:
model = VGG16(weights = 'imagenet', include_top = False)
For Mac OS X
1) Update to Python 3.6.5 using the native app installer downloaded from the official Python language website https://www.python.org/downloads/
I've found that the installer takes care of updating the links and symlinks for the new Python a lot better than Homebrew.
2) Install a new certificate using "./Install Certificates.command" which is in the refreshed Python 3.6 directory
> cd "/Applications/Python 3.6/"
> sudo "./Install Certificates.command"
Add these lines before downloading the weights:

```python
import ssl
ssl._create_default_https_context = ssl._create_unverified_context
```
Hi guys, I found a really easy solution. If you cannot download it due to proxy settings in a company like mine, you can use PyPAC to auto-set the proxy for you using Internet Explorer's settings.
here you go
@unnir , thank you a lot, it works!!
@unnir - many thanx for your elegant solution !
Downloading https://github.com/fchollet/deep-learning-models/releases/download/v0.1/vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5 and passing the path to weights worked for me:

```python
self.resnet = ResNet50(weights='models/vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5', include_top=False, pooling='avg')
```
Copy, paste, and run; you see my power.
```python
from keras.applications import VGG16
from keras import backend as K
import tensorflow as tf
import httplib2

# Detect presence of a proxy and use env variables if they exist
pi = httplib2.proxy_info_from_environment()
if pi:
    import socks
    socks.setdefaultproxy(pi.proxy_type, pi.proxy_host, pi.proxy_port)
    socks.wrapmodule(httplib2)

# Now all calls through httplib2 should use the proxy settings
httplib2.Http()

mn = VGG16()
saver = tf.train.Saver()
sess = K.get_session()
saver.save(sess, "./TF_Model/vgg16")
```
> I faced the same problem: the vgg16_weights_tf_dim_ordering_tf_kernels.h5 file was downloaded directly from the browser and placed into the ./keras/models folder, but when the application runs it still downloads the model. Why is this happening?
I am having the same problem, please advise
What error message do u have?
Hi
It just seemed to freeze when I was downloading the vgg16_weights, but it worked today and I completed the assignment
My app can't download files from GitHub because of the company's firewall.
What worked for me is downloading the weights file manually and putting it into a "models" folder within the app.
Then, in place of

```python
model = ResNet50(weights="imagenet")
```

I linked it to the file:

```python
model = ResNet50(weights="models/resnet50_weights_tf_dim_ordering_tf_kernels.h5")
```

But how do I locally link the index file, which imagenet_utils.py also tries to download? This:

```python
CLASS_INDEX_PATH = 'https://s3.amazonaws.com/deep-learning-models/image-models/imagenet_class_index.json'
```
@evaldasjab, I guess you need to remove weights=
and then manually load your model.
@vstupakov, the weights are not an issue now; it's imagenet_class_index.json that is not downloadable.
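One way around the class-index download is to keep imagenet_class_index.json alongside the app and map class indices to labels yourself, instead of calling decode_predictions. A sketch, using a two-entry dummy index as a stand-in for the real 1000-class file:

```python
import json

# Dummy stand-in for imagenet_class_index.json, which maps
# "0".."999" to [wordnet_id, human_readable_label] pairs.
class_index = {"0": ["n01440764", "tench"], "1": ["n01443537", "goldfish"]}
with open("imagenet_class_index.json", "w") as f:
    json.dump(class_index, f)

def decode_top1(preds, index_path="imagenet_class_index.json"):
    """Return (wordnet_id, label) for the highest-scoring class."""
    with open(index_path) as f:
        index = json.load(f)
    top = max(range(len(preds)), key=lambda i: preds[i])
    return tuple(index[str(top)])

print(decode_top1([0.2, 0.8]))  # → ('n01443537', 'goldfish')
```

With the real file in place, the same function decodes the output vector of model.predict() without any network access.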
You must enable Internet Access in your Kernel Settings to be able to download data from github!
@SpocWeb, unfortunately my company has firewalled outgoing connections; my app can't request anything from GitHub directly.
@evaldasjab So you are running your code locally? Well then there is no choice but to download it, right?
@SpocWeb, the code runs in Openshift, in a Docker container. I don't know where to put the downloaded file so that the model could find it. The Docker container doesn't have "~/.keras/models" folder.
You can manually create the directory and set the KERAS_HOME env var:
https://github.com/keras-team/keras/blob/master/keras/utils/data_utils.py#L174
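A minimal sketch of that approach (the directory name is an example; KERAS_HOME must be set before keras is imported):

```python
import os
import tempfile

# Example cache dir; in a container this could be a path baked into the image.
keras_home = os.path.join(tempfile.gettempdir(), "keras_cache")
os.makedirs(os.path.join(keras_home, "models"), exist_ok=True)
os.environ["KERAS_HOME"] = keras_home
# Files placed under <KERAS_HOME>/models (weights, imagenet_class_index.json)
# are then picked up by Keras instead of being re-downloaded.
print(os.environ["KERAS_HOME"])
```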
PS: Sorry, but read the first time Openshift as Openshit XD
@Red-Eyed, wonderful, that worked! So, what I did eventually: I created a folder keras/models in the app's working directory, where I moved both files, imagenet_class_index.json and resnet50_weights_tf_dim_ordering_tf_kernels.h5. Then I added to the app's code:

```python
import os
os.environ['KERAS_HOME'] = os.path.join(os.getcwd(), 'keras')
```
You must make it model = VGG16(weights='vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5', include_top=False, pooling='avg'), since the notop weights require include_top=False.
> My app can't download files from GitHub because of the company's firewall. What worked for me is downloading the weights file manually and putting it into a "models" folder within the app. Then, in place of
> model = ResNet50(weights="imagenet")
> I linked it to the file: model = ResNet50(weights="models/resnet50_weights_tf_dim_ordering_tf_kernels.h5")
> But how do I locally link the index file, which imagenet_utils.py also tries to download? This:
> CLASS_INDEX_PATH = 'https://s3.amazonaws.com/deep-learning-models/image-models/imagenet_class_index.json'
Hi, I have the same problem. Could you advise how I should tackle it? Thank you.
Hi, read my last post.
Don't use the Python IDLE software; run the Python code in a terminal.
Hi fchollet, I'm using VGG16 under Keras to classify the Oxford 102 flower dataset, and I want to download the trained weight file vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5, but I cannot download it. Would you please email this file to me? Thank you very much. My email: songmh_1@hotmail.com
Try this before downloading VGG16:

```python
import ssl
ssl._create_default_https_context = ssl._create_unverified_context
```
```
from . import get_submodules_from_kwargs
ImportError: attempted relative import with no known parent package
```

Can you provide me a solution for this?