Open DrFrankatron opened 8 months ago
This is my code; sorry for the badly formatted previous one:
from mario_gpt import MarioLM

mario_lm = MarioLM()
prompts = ["many pipes, many enemies, some blocks, high elevation"]
generated_level = mario_lm.sample(
    prompts=prompts,
    num_steps=100,
    temperature=2.0,
    use_tqdm=True,
)
generated_level.play()
generated_level.run_astar()
Hey! Are you sure you're using a GPU? It can be pretty slow without one. Basically, as the number of steps increases, the amount of information in the context increases.
You can check with torch.cuda.is_available()
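For example, a minimal sketch of checking for a GPU and moving the model onto it, following the to(device) usage shown in the MarioGPT README (the device handling here is just illustrative):

import torch
from mario_gpt import MarioLM

# check whether PyTorch can actually see a CUDA GPU
print(torch.cuda.is_available())

# use the GPU when available, otherwise fall back to CPU
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

mario_lm = MarioLM()
mario_lm = mario_lm.to(device)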
This is the script:
from mario_gpt import MarioLM

mario_lm = MarioLM()
prompts = ["many pipes, many enemies, some blocks, high elevation"]
generated_level = mario_lm.sample(
    prompts=prompts,
    num_steps=100,
    temperature=2.0,
    use_tqdm=True,
)

# play in interactive
generated_level.play()

# run Astar agent
generated_level.run_astar()
And I need more num_steps, but generation lags once it reaches more than 100 steps... Is it possible to get better performance with this Python library?
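For reference, besides moving the model to a GPU, the MarioGPT README also shows continuing generation from an earlier sample via the seed argument, which lets a longer level be built up in smaller chunks. A sketch along those lines (the step counts are only illustrative, and whether chunking actually reduces the lag is an assumption worth testing):

import torch
from mario_gpt import MarioLM

mario_lm = MarioLM()
if torch.cuda.is_available():
    # sampling is much faster on a GPU
    mario_lm = mario_lm.to(torch.device("cuda"))

prompts = ["many pipes, many enemies, some blocks, high elevation"]

# first chunk of the level
generated_level = mario_lm.sample(
    prompts=prompts,
    num_steps=100,
    temperature=2.0,
    use_tqdm=True,
)

# continue generation from the previous sample to extend the level
generated_level_continued = mario_lm.sample(
    seed=generated_level,
    prompts=prompts,
    num_steps=100,
    temperature=2.0,
    use_tqdm=True,
)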