CovertLab / DeepCell


running convnets example error #9

Closed hftsai closed 7 years ago

hftsai commented 7 years ago

Hi Dave, sorry, I've run into trouble again. I switched to an old version of Keras (1.2) and I can run the running_template for the theriot data set with good results. However, when I try to modify it to run segmentation on the HeLa, 3T3, or MCF10A data sets, I get an error.

I wonder if it's possible to ask for your kind help?

import os
import datetime
import numpy as np
# select the Theano backend and GPU flags before Keras is imported
os.environ["KERAS_BACKEND"] = "theano"
os.environ["THEANO_FLAGS"] = "mode=FAST_RUN,device=gpu,floatX=float32"
import keras
keras.backend.set_image_dim_ordering('th')

print 'keras version is ' + keras.__version__

import h5py
import tifffile as tiff
from keras.backend.common import _UID_PREFIXES

from cnn_functions import nikon_getfiles, get_image, run_models_on_directory, get_image_sizes, segment_nuclei, segment_cytoplasm, dice_jaccard_indices
from model_zoo import sparse_bn_feature_net_61x61 as cyto_fn
from model_zoo import sparse_bn_feature_net_61x61 as nuclear_fn

# input images, output directories, and channel names for the HeLa validation data
direc_name = '/home/davince/DeepCell-master/validation_data/HeLa/'
data_location = os.path.join(direc_name, 'RawImages')
cyto_location = os.path.join(direc_name, 'Cytoplasm')
nuclear_location = os.path.join(direc_name, 'Nuclear')
mask_location = os.path.join(direc_name, 'Masks')
cyto_channel_names = ['phase']
nuclear_channel_names = ['farred']

# locations and file-name prefixes of the trained networks
trained_network_cyto_directory = "/home/davince/DeepCell-master/trained_networks/HeLa/"
trained_network_nuclear_directory = "/home/davince/DeepCell-master/trained_networks/Nuclear/"

cyto_prefix = "2016-07-12_HeLa_all_61x61_bn_feature_net_61x61_"
nuclear_prefix = "2016-07-12_nuclei_all_61x61_bn_feature_net_61x61_"

# half-width of the 61x61 receptive field
win_cyto = 30
win_nuclear = 30

image_size_x, image_size_y = get_image_sizes(data_location, nuclear_channel_names)

# collect the weight files of the five trained replicate networks
list_of_cyto_weights = []
for j in xrange(5):
    cyto_weights = os.path.join(trained_network_cyto_directory, cyto_prefix + str(j) + ".h5")
    list_of_cyto_weights += [cyto_weights]

list_of_nuclear_weights = []
for j in xrange(5):
    nuclear_weights = os.path.join(trained_network_nuclear_directory, nuclear_prefix + str(j) + ".h5")
    list_of_nuclear_weights += [nuclear_weights]

# run the cytoplasm and nuclear networks on every image in data_location
cytoplasm_predictions = run_models_on_directory(data_location, cyto_channel_names, cyto_location, model_fn = cyto_fn,
    list_of_weights = list_of_cyto_weights, image_size_x = image_size_x, image_size_y = image_size_y,
    win_x = win_cyto, win_y = win_cyto, split = False)

nuclear_predictions = run_models_on_directory(data_location, nuclear_channel_names, nuclear_location, model_fn = nuclear_fn,
    list_of_weights = list_of_nuclear_weights, image_size_x = image_size_x, image_size_y = image_size_y,
    win_x = win_nuclear, win_y = win_nuclear, split = False)

and the error looks like this:

---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-20-1bae99843176> in <module>()
     13 nuclear_predictions = run_models_on_directory(data_location, nuclear_channel_names, nuclear_location, model_fn = nuclear_fn, 
     14         list_of_weights = list_of_nuclear_weights, image_size_x = image_size_x, image_size_y = image_size_y,
---> 15     win_x = win_nuclear, win_y = win_nuclear, split = False)

/home/davince/DeepCell-master/keras_version/cnn_functions.pyc in run_models_on_directory(data_location, channel_names, output_location, model_fn, list_of_weights, n_features, image_size_x, image_size_y, win_x, win_y, std, split, process, save)
   1482 
   1483         batch_input_shape = (1,len(channel_names),image_size_x+win_x, image_size_y+win_y)
-> 1484         model = model_fn(batch_input_shape = batch_input_shape, n_features = n_features, weights_path = list_of_weights[0])
   1485         n_features = model.layers[-1].output_shape[1]
   1486 

/home/davince/DeepCell-master/keras_version/model_zoo.pyc in sparse_bn_feature_net_61x61(batch_input_shape, n_features, reg, init, weights_path)
    567         model.add(Activation(tensorprod_softmax))
    568 
--> 569         model = set_weights(model, weights_path)
    570 
    571         return model

/home/davince/DeepCell-master/keras_version/cnn_functions.pyc in set_weights(model, weights_path)
     90 
     91         for layer in model.layers:
---> 92                 if layer.name in f['model_weights'].keys():
     93                         if 'bn' in layer.name:
     94                                 g = f['model_weights'][layer.name]

h5py/_objects.pyx in h5py._objects.with_phil.wrapper (/tmp/pip-4rPeHA-build/h5py/_objects.c:2684)()

h5py/_objects.pyx in h5py._objects.with_phil.wrapper (/tmp/pip-4rPeHA-build/h5py/_objects.c:2642)()

/usr/local/lib/python2.7/dist-packages/h5py/_hl/group.pyc in __getitem__(self, name)
    164                 raise ValueError("Invalid HDF5 object reference")
    165         else:
--> 166             oid = h5o.open(self.id, self._e(name), lapl=self._lapl)
    167 
    168         otype = h5i.get_type(oid)

h5py/_objects.pyx in h5py._objects.with_phil.wrapper (/tmp/pip-4rPeHA-build/h5py/_objects.c:2684)()

h5py/_objects.pyx in h5py._objects.with_phil.wrapper (/tmp/pip-4rPeHA-build/h5py/_objects.c:2642)()

h5py/h5o.pyx in h5py.h5o.open (/tmp/pip-4rPeHA-build/h5py/h5o.c:3570)()

KeyError: "Unable to open object (Object 'model_weights' doesn't exist)"

But the weight files do seem to be found and listed in list_of_weights. Is this the same problem you mentioned, where Keras changed the way the weights are saved? (I have now reverted back to 1.2.) The h5py version is 2.6.0.
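
For what it's worth, the layout of a weight file can be checked directly with h5py by listing its top-level keys. A minimal sketch, where the path is just one of the entries built into list_of_cyto_weights above:

import h5py

# example path only -- substitute any entry from list_of_cyto_weights
weights_file = "/home/davince/DeepCell-master/trained_networks/HeLa/2016-07-12_HeLa_all_61x61_bn_feature_net_61x61_0.h5"

with h5py.File(weights_file, "r") as f:
    # files written with model.save() nest the layer groups under 'model_weights';
    # files written with model.save_weights() keep the layer names at the top level
    print f.keys()

If the layer names show up at the top level instead of under a 'model_weights' group, that would explain the KeyError, since set_weights() in cnn_functions.py looks for f['model_weights'].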

Also, I know from your paper that the nuclear channel helps segmentation. Is this limited to semantic segmentation in the co-culture examples, or is it general for all experiments? Could the segmentation still be good if there is no nuclear channel?

Thank you.

hftsai commented 7 years ago

Hi guys, I tried a bit more. If I run it in the Docker container, there seems to be no error. So I wonder: did I do anything wrong when trying to run it directly in keras_version with a Jupyter notebook? (The notebook is trusted.)
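
In case the notebook kernel is just picking up a different environment than the Docker container, one quick check (a minimal sketch, nothing DeepCell-specific) is to print the relevant versions in both places and compare:

import os
os.environ["KERAS_BACKEND"] = "theano"  # same backend setting as in the script above

import sys
import h5py
import theano
import keras

# run this in both the Docker container and the notebook kernel
print sys.executable
print 'keras  ' + keras.__version__
print 'h5py   ' + h5py.__version__
print 'theano ' + theano.__version__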

hftsai commented 7 years ago

Just to close this: the issue no longer occurs on my setup after the last update.