MikeOfZen / Yet-Another-Openpose-Implementation

This project reimplements the OpenPose paper (Cao et al., 2018) from scratch, using TensorFlow 2.1 with optional TPU-powered training.
Mozilla Public License 2.0

Shape mismatch between expected input and received array #1

Closed blldw closed 4 years ago

blldw commented 4 years ago

```
In [8]: %run applications/cam.py
Press ESC to exit

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
~/PHD/new/references/human motion tracking/Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields/Yet-Another-Openpose-Implementation/applications/cam.py in <module>
     46 if __name__ == "__main__":
     47     app = CamApp()
---> 48     app.run()

~/PHD/new/references/human motion tracking/Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields/Yet-Another-Openpose-Implementation/applications/cam.py in run(self)
     33         img_rgb = cv2.cvtColor(cam_img_bgr, cv2.COLOR_BGR2RGB)
     34
---> 35         processed_img_rgb = self.process_frame(img_rgb)
     36
     37         processed_img_bgr = cv2.cvtColor(processed_img_rgb, cv2.COLOR_RGB2BGR)

~/PHD/new/references/human motion tracking/Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields/Yet-Another-Openpose-Implementation/applications/cam.py in process_frame(self, img)
     19     def process_frame(self, img):
     20
---> 21         skeletons = self.model_wrapper.process_image(img)
     22
     23         skeleton_drawer = vis.SkeletonDrawer(img, draw_config)

~/PHD/new/references/human motion tracking/Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields/Yet-Another-Openpose-Implementation/applications/model_wrapper.py in process_image(self, img)
     17         input_img /= 255
     18         input_img = input_img[tf.newaxis, ...]
---> 19         pafs, kpts = self.model.predict(input_img)
     20         pafs = pafs[0]
     21         kpts = kpts[0]

~/anaconda3/env/py3.7-tf2.1/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/training.py in predict(self, x, batch_size, verbose, steps, callbacks, max_queue_size, workers, use_multiprocessing)
   1011         max_queue_size=max_queue_size,
   1012         workers=workers,
-> 1013         use_multiprocessing=use_multiprocessing)
   1014
   1015   def reset_metrics(self):

~/anaconda3/env/py3.7-tf2.1/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/training_v2.py in predict(self, model, x, batch_size, verbose, steps, callbacks, max_queue_size, workers, use_multiprocessing, **kwargs)
    496         model, ModeKeys.PREDICT, x=x, batch_size=batch_size, verbose=verbose,
    497         steps=steps, callbacks=callbacks, max_queue_size=max_queue_size,
--> 498         workers=workers, use_multiprocessing=use_multiprocessing, **kwargs)
    499
    500

~/anaconda3/env/py3.7-tf2.1/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/training_v2.py in _model_iteration(self, model, mode, x, y, batch_size, verbose, sample_weight, steps, callbacks, max_queue_size, workers, use_multiprocessing, **kwargs)
    424         max_queue_size=max_queue_size,
    425         workers=workers,
--> 426         use_multiprocessing=use_multiprocessing)
    427     total_samples = _get_total_number_of_samples(adapter)
    428     use_sample = total_samples is not None

~/anaconda3/env/py3.7-tf2.1/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/training_v2.py in _process_inputs(model, mode, x, y, batch_size, epochs, sample_weights, class_weights, shuffle, steps, distribution_strategy, max_queue_size, workers, use_multiprocessing)
    644       standardize_function = None
    645       x, y, sample_weights = standardize(
--> 646           x, y, sample_weight=sample_weights)
    647     elif adapter_cls is data_adapter.ListsOfScalarsDataAdapter:
    648       standardize_function = standardize

~/anaconda3/env/py3.7-tf2.1/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/training.py in _standardize_user_data(self, x, y, sample_weight, class_weight, batch_size, check_steps, steps_name, steps, validation_split, shuffle, extract_tensors_from_dataset)
   2381         is_dataset=is_dataset,
   2382         class_weight=class_weight,
-> 2383         batch_size=batch_size)
   2384
   2385   def _standardize_tensors(self, x, y, sample_weight, run_eagerly, dict_inputs,

~/anaconda3/env/py3.7-tf2.1/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/training.py in _standardize_tensors(self, x, y, sample_weight, run_eagerly, dict_inputs, is_dataset, class_weight, batch_size)
   2408         feed_input_shapes,
   2409         check_batch_axis=False,  # Don't enforce the batch size.
-> 2410         exception_prefix='input')
   2411
   2412     # Get typespecs for the input data and sanitize it if necessary.

~/anaconda3/env/py3.7-tf2.1/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/training_utils.py in standardize_input_data(data, names, shapes, check_batch_axis, exception_prefix)
    580             ': expected ' + names[i] + ' to have shape ' +
    581             str(shape) + ' but got array with shape ' +
--> 582             str(data_shape))
    583   return data
    584

ValueError: Error when checking input: expected input_1 to have shape (368, 368, 3) but got array with shape (360, 360, 3)
```
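This kind of mismatch can be caught with a clearer message before `predict` is ever called, by validating each frame against the model's expected input shape (with a real Keras model this is available as `model.input_shape[1:]`). A minimal sketch with a hypothetical helper, not part of the repository, assuming the fixed 368×368×3 input from this issue:

```python
import numpy as np

MODEL_INPUT_SHAPE = (368, 368, 3)  # shape the network in this issue expects

def check_frame_shape(frame, expected=MODEL_INPUT_SHAPE):
    """Raise a descriptive error if a camera frame does not match the model input.

    Hypothetical helper for illustration; with a loaded Keras model the
    expected shape could instead be read from model.input_shape[1:].
    """
    if frame.shape != expected:
        raise ValueError(
            f"expected frame of shape {expected}, got {frame.shape}")
    return frame

# A 360x360 camera frame, as in the traceback above, fails the check early:
frame = np.zeros((360, 360, 3), dtype=np.float32)
try:
    check_frame_shape(frame)
except ValueError as e:
    print(e)  # expected frame of shape (368, 368, 3), got (360, 360, 3)
```

Failing fast at the application boundary gives a one-line error instead of a deep Keras traceback.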

MikeOfZen commented 4 years ago

I checked, and there was indeed a mismatch in the shapes; it's fixed in the last commit. Somehow it still ran for me despite the mismatch, which makes me suspect that the latest TensorFlow may automatically rescale incorrectly shaped input tensors (though I'm not certain). I'm running it locally with TensorFlow 2.0.0.

Please try it again with the last commit.
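A robust way to make the camera loop independent of the configured camera resolution is to resize each frame to the model's input size before prediction. In the actual application `cv2.resize` or `tf.image.resize` would be the natural choice; the sketch below is a self-contained NumPy-only nearest-neighbour resize for illustration:

```python
import numpy as np

def resize_nearest(img, out_h, out_w):
    """Nearest-neighbour resize of an HxWxC image.

    Illustration only; cv2.resize or tf.image.resize would normally be
    used instead (and offer better interpolation).
    """
    in_h, in_w = img.shape[:2]
    # Map each output row/column index back to the nearest source index.
    rows = np.arange(out_h) * in_h // out_h
    cols = np.arange(out_w) * in_w // out_w
    return img[rows[:, None], cols[None, :]]

# A 360x360 camera frame (as in this issue) resized to the 368x368 model input:
frame = np.zeros((360, 360, 3), dtype=np.float32)
resized = resize_nearest(frame, 368, 368)
print(resized.shape)  # (368, 368, 3)
```

With a resize step like this in `process_frame`, any camera resolution would be accepted, at the cost of a slight aspect-ratio distortion when the source frame is not square.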

blldw commented 4 years ago

Commit 10271fd3d73ac2c34b6562e1f8ea175521044a3f now works for me with Python 3.7 and TensorFlow 2.1.0.

MikeOfZen commented 4 years ago

Great, happy to hear that!