tambetm / simple_dqn

Simple deep Q-learning agent.
MIT License

when running /play.sh on cpu #51

Closed mcbrs1a closed 6 years ago

mcbrs1a commented 6 years ago

I receive this error:

```
$ ./play.sh snapshots/pong_200.pkl --backend cpu
A.L.E: Arcade Learning Environment (version 0.5.1)
[Powered by Stella]
Use -help for help screen.
Warning: couldn't load settings file: ./ale.cfg
Game console created:
  ROM file:        roms/pong.bin
  Cart Name:       Video Olympics (1978) (Atari)
  Cart MD5:        60e0ea3cbe0913d39803477945e9e5ec
  Display Format:  AUTO-DETECT ==> NTSC
  ROM Size:        2048
  Bankswitch Type: AUTO-DETECT ==> 2K

Screen Display Active. [Manual Control Mode] 'm' [Slowdown] 'a' [Speedup] 's' [VolumeDown] '[' [VolumeUp] ']'.

WARNING: Possibly unsupported ROM: mismatched MD5
Cartridge_MD5:  60e0ea3cbe0913d39803477945e9e5ec
Cartridge_name: Video Olympics (1978) (Atari)

Running ROM file...
Random seed is 0
2017-11-24 21:10:14,480 Using minimal action set with size 6
2017-11-24 21:10:14,481 Using ALE Environment
Traceback (most recent call last):
  File "src/main.py", line 110, in <module>
    mem = ReplayMemory(args.replay_size, args)
  File "/home/putz/neon/simple_dqn/src/replay_memory.py", line 12, in __init__
    self.screens = np.empty((self.size, args.screen_height, args.screen_width), dtype = np.uint8)
MemoryError
```

Suggestions appreciated

TRIED

Reducing replay_size in play.sh, but it still fails with the same MemoryError.

tambetm commented 6 years ago

See #17 and reduce replay size even more.
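The MemoryError comes from the up-front allocation in `replay_memory.py`: the screen buffer is one contiguous uint8 array of shape `(replay_size, screen_height, screen_width)`, so its footprint grows linearly with `replay_size`. A rough sketch of the arithmetic (the 84x84 screen size is the DQN default; `replay_memory_bytes` is a hypothetical helper, not part of simple_dqn):

```python
import numpy as np

def replay_memory_bytes(replay_size, screen_height=84, screen_width=84):
    """Approximate bytes needed for the screens array alone
    (np.empty((replay_size, screen_height, screen_width), dtype=np.uint8))."""
    return replay_size * screen_height * screen_width * np.dtype(np.uint8).itemsize

# With the common DQN default of a 1,000,000-step replay memory,
# the screens array alone needs ~7 GB of RAM:
print(replay_memory_bytes(1_000_000) / 1e9)  # ~7.06 GB

# Cutting replay_size by 10x cuts the allocation by 10x:
print(replay_memory_bytes(100_000) / 1e9)    # ~0.71 GB
```

So if lowering `replay_size` once was not enough, keep halving it until the allocation fits in available RAM; for playing back a snapshot (as opposed to training), a small replay memory is sufficient.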