Open xiaomingdaren123 opened 6 years ago
Hi,
You probably do not get any response because the face detection method does not detect the face. Keep in mind that DAN performs only facial landmark localization, not face detection. In the example code, face detection is performed using OpenCV's face detection functionality. If you want to get landmarks for this image, you may specify the face rectangle yourself or use a better face detector.
As for the training results: in order to get the best results you should train the first stage separately and then the second stage. Also: do not train the first stage until it overfits; stop training when the validation error stops improving (say, when you have not gotten a new best result in the last 20 minutes).
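The stopping rule Marek describes is patience-based early stopping. A minimal sketch, where `train_one_epoch` and `validation_error` are hypothetical stand-ins for the real training loop:

```python
def train_with_early_stopping(train_one_epoch, validation_error,
                              patience=20, max_epochs=1000):
    """Stop when the validation error has not improved for `patience`
    consecutive epochs; return the best validation error seen."""
    best_err = float("inf")
    epochs_since_best = 0
    for epoch in range(max_epochs):
        train_one_epoch()
        err = validation_error()
        if err < best_err:
            best_err = err          # new best: reset the patience counter
            epochs_since_best = 0   # (this is also where you would checkpoint)
        else:
            epochs_since_best += 1
            if epochs_since_best >= patience:
                break               # no improvement for `patience` epochs
    return best_err
```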
Thanks,
Marek
Hello, friend. 1. The meanFaceShape.npz file: is it the average of all landmarks in the dataset? 2. initialization == 'rect', 'similarity', 'box': what is the difference between them?
Hi,
Thanks,
Marek
Thank you for your earnest answer to every question, thank you very much. If I want to get meanFaceShape.npz with my own dataset, then according to your reply I should first align all faces of the training set and then calculate the average of the aligned face landmarks? Could you provide the code that generates meanFaceShape.npz? This meanFaceShape.npz seems to have only been used during the data augmentation phase. If I don't do data augmentation, can I skip this file? Thank you, friend.
The mean shape is also used as the initial shape for face alignment, so you will still need it. I do not have access to the code at the moment (it was on an old university computer). It should however be quite simple to implement. Look at the paper I linked to in my previous post.
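The alignment-and-averaging procedure described above is essentially generalized Procrustes analysis. A minimal NumPy sketch under that assumption (the function names and the npz key are illustrative, not the repo's actual code):

```python
import numpy as np

def align_similarity(shape, ref):
    """Least-squares similarity transform (scale + rotation + translation)
    aligning one (n_landmarks, 2) shape to a reference shape."""
    s = shape - shape.mean(axis=0)        # center the shape
    r = ref - ref.mean(axis=0)            # center the reference
    denom = (s ** 2).sum()
    a = (s * r).sum() / denom             # scaled cos(theta)
    b = (s[:, 0] * r[:, 1] - s[:, 1] * r[:, 0]).sum() / denom  # scaled sin(theta)
    rot = np.array([[a, -b], [b, a]])
    return s @ rot.T + ref.mean(axis=0)

def mean_shape(shapes, n_iters=10):
    """Generalized Procrustes mean of a list of (n_landmarks, 2) arrays:
    align every shape to the current reference, average, renormalize, repeat."""
    ref = shapes[0] - shapes[0].mean(axis=0)
    ref /= np.linalg.norm(ref)
    for _ in range(n_iters):
        aligned = np.mean([align_similarity(s, ref) for s in shapes], axis=0)
        aligned -= aligned.mean(axis=0)
        ref = aligned / np.linalg.norm(aligned)
    return ref

# e.g. np.savez("meanFaceShape.npz", meanShape=mean_shape(training_shapes))
```

The mean shape that comes out is normalized; in practice you would rescale it to the reference face box your training pipeline uses.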
Hi, MarekKowalski. I used the code you provided to train the model. When I run DAN_V2.py there were some problems; I tried to solve them but unfortunately did not make it. I hope you could give me some suggestions. The problem is as follows:

Traceback (most recent call last):
  File "./DAN_V2/dan_model.py", line 112, in call
    inputs_imgs = tf.reshape(inputs_imgs, [-1, self.img_size, self.img_size, 1])
  File "./anaconda3/lib/python3.6/site-packages/tensorflow/python/ops/gen_array_ops.py", line 3938, in reshape
    "Reshape", tensor=tensor, shape=shape, name=name)
  File "./anaconda3/lib/python3.6/site-packages/tensorflow/python/framework/op_def_library.py", line 513, in _apply_op_helper
    raise err
  File "./anaconda3/lib/python3.6/site-packages/tensorflow/python/framework/op_def_library.py", line 510, in _apply_op_helper
    preferred_dtype=default_dtype)
  File "./anaconda3/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 926, in _constant_tensor_to_tensor
    ret = conversion_func(value, dtype=dtype, name=name, as_ref=as_ref)
  File "./anaconda3/lib/python3.6/site-packages/tensorflow/python/framework/constant_op.py", line 229, in _constant_tensor_conversion_function
    return constant(v, dtype=dtype, name=name)
  File "./anaconda3/lib/python3.6/site-packages/tensorflow/python/framework/constant_op.py", line 208, in constant
    value, shape=shape, verify_shape=verify_shape))
  File "./anaconda3/lib/python3.6/site-packages/tensorflow/python/framework/tensor_util.py", line 472, in make_tensor_proto
    "supported type." % (type(values), values))
TypeError: Failed to convert object of type <class "tensorflow.python.data.ops.dataset_ops.PrefetchDataset"> to Tensor. Contents: <PrefetchDataset shapes: (
Hi,
I think this code is not from my repo, please start a thread in the repo it comes from.
Thanks,
Marek
Hi, MarekKowalski. I used the code you provided to train the model.
1. When I tested the image, it didn't respond. Using your model (DAN.npz) I meet the same problem, but there is no problem with other images. I don't know what is going on.
2. Loading your network (DAN.npz), the validation error is 0.0177 and the train error is 0.035395. When I train this network, the validation error is 0.04789 and the train error is 0.0433754. If I continue training the network, can I achieve the same result as you? How many epochs did you train (approximately)? Can these two stages train at the same time?
I hope you can help me, thank you very much!