nicknochnack / KerasRL-OpenAI-Atari-SpaceInvadersv0

A notebook walking through how to use Keras RL to solve Atari environments.

Facing issue in build_model function #1

Closed John-p-v1999 closed 3 years ago

John-p-v1999 commented 3 years ago

```
Input 0 of layer conv2d_7 is incompatible with the layer: expected ndim=4, found ndim=5. Full shape received: [None, 3, 210, 160, 3]
```

This is the error I get when the model is instantiated. I tried both `keras.layers.Convolution2D` and `keras.layers.Conv2D`; both give the same error. Please help me out. I am attaching a link to the code I wrote: https://github.com/John-p-v1999/KerasRL-OpenAI-Atari-SpaceInvadersv0/blob/main/rl2.ipynb Thank you for your time.
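For context, `Conv2D` normally expects a rank-4 batched input of `(batch, height, width, channels)`; declaring `input_shape=(3, h, w, c)` (a window of 3 stacked frames, as keras-rl uses) makes the batched tensor rank 5, which matches the `[None, 3, 210, 160, 3]` shape in the error. A minimal pure-Python sketch of the rank arithmetic (the `batched_rank` helper is hypothetical, for illustration only, not Keras internals):

```python
# Illustration only: why the batched input is rank 5, not rank 4.
# Conv2D expects rank-4 batched input: (batch, height, width, channels).
# With input_shape=(3, 210, 160, 3) -- a window of 3 stacked frames --
# Keras prepends the batch axis, giving [None, 3, 210, 160, 3]: rank 5,
# hence "expected ndim=4, found ndim=5" on some TensorFlow versions.

def batched_rank(per_sample_shape):
    """Rank of the tensor a layer sees: the per-sample shape plus the batch axis."""
    return len(per_sample_shape) + 1

space_invaders_frame = (210, 160, 3)      # (height, width, channels)
windowed = (3,) + space_invaders_frame    # keras-rl window_length=3 prepended

print(batched_rank(space_invaders_frame))  # -> 4, what Conv2D expects
print(batched_rank(windowed))              # -> 5, what the error reports
```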

nicknochnack commented 3 years ago

Heya John,

I just tested this out with a minimal viable example on my machine and wasn't able to reproduce the error. Running the following code in a new notebook executed successfully. What version of Tensorflow/Python are you running?

```python
import tensorflow as tf
from tensorflow import keras
import gym
import random

'''print(tf.version)
assert tf.config.list_physical_devices('GPU')
print('hello')'''

env = gym.make('SpaceInvaders-v0')
h, w, c = env.observation_space.shape
actions = env.action_space.n

def build_model(h, w, c, actions):
    model = keras.models.Sequential()
    model.add(keras.layers.Conv2D(32, (7, 7), strides=(3, 3), activation='relu', input_shape=(3, h, w, c)))
    model.add(keras.layers.Conv2D(64, (5, 5), activation='relu'))
    model.add(keras.layers.Conv2D(64, (3, 3), activation='relu'))
    model.add(keras.layers.Flatten())
    model.add(keras.layers.Dense(256, activation='relu'))
    model.add(keras.layers.Dense(128, activation='relu'))
    model.add(keras.layers.Dense(actions, activation='linear'))
    return model

model = build_model(h, w, c, actions)
```

John-p-v1999 commented 3 years ago

In my Jupyter kernel the TensorFlow version was 2.2.0 and the Python version was 3.7.9. I updated to TensorFlow 2.3.1 and it worked. Thank you so much.
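For anyone hitting the same mismatch, a small sketch for checking whether an installed version is at least 2.3.1, the version that resolved this thread. The hard-coded `installed` string is a placeholder for `tf.__version__` so the sketch runs without TensorFlow:

```python
# Sketch: compare an installed TensorFlow version against 2.3.1, the version
# reported to fix this issue. 'installed' is a placeholder; in a real session
# substitute tf.__version__.

def version_tuple(v):
    """Turn '2.3.1' into (2, 3, 1) so versions compare numerically, not lexically."""
    return tuple(int(part) for part in v.split('.'))

installed = '2.2.0'   # placeholder for tf.__version__
required = '2.3.1'

print(version_tuple(installed) >= version_tuple(required))  # -> False: upgrade needed
```

Tuple comparison avoids the classic string-comparison pitfall where `'2.10.0' < '2.3.1'` lexically.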