pastra98 / NEAT_for_Godot

An implementation of Kenneth O. Stanley's NEAT Algorithm for the Godot game engine, written in gdscript.
MIT License
36 stars · 9 forks

Works like magic #5

Closed Gumbo64 closed 2 years ago

Gumbo64 commented 3 years ago

Amazing package with very good documentation and examples. It took me less than a day to implement, and I'm gonna leave it running overnight and see how it does.

Big thanks for making this

pastra98 commented 3 years ago

Hey man, thanks for the nice words! Really made my day :) Please do let me know how it went. Performance of the algorithm can be a bit disappointing, and letting it run for a long time usually produces bad results, as eventually even the best agents get killed off if they don't improve. I see you have your bhop-AI repository, are you trying to teach an AI to bunny hop, CS-style?
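The "agents get killed off if they don't improve" behaviour is the species-stagnation rule common to most NEAT implementations. A rough sketch of the idea (in Python for brevity; the class, names, and the 15-generation limit are illustrative, not this repo's actual GDScript API):

```python
# Sketch of NEAT species stagnation: a species whose best fitness hasn't
# improved for a number of generations is removed from the population,
# even if it still contains the current best agent.
STAGNATION_LIMIT = 15  # generations without improvement (assumed value)

class Species:
    def __init__(self):
        self.best_fitness = float("-inf")
        self.generations_since_improvement = 0

    def update(self, current_best):
        # called once per generation with the species' best fitness
        if current_best > self.best_fitness:
            self.best_fitness = current_best
            self.generations_since_improvement = 0
        else:
            self.generations_since_improvement += 1

    @property
    def stagnant(self):
        return self.generations_since_improvement >= STAGNATION_LIMIT

s = Species()
s.update(1.0)                          # improvement: counter resets
for _ in range(STAGNATION_LIMIT):
    s.update(1.0)                      # no improvement for 15 generations
print(s.stagnant)                      # True: species would be culled
```

So a long overnight run can genuinely get worse over time if fitness plateaus, which is why checkpointing the best genome matters.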

Gumbo64 commented 3 years ago

Yeah, I'm trying to make a bhop AI like in CS, and it actually does pretty well. There was just one part where it jumped down a long drop and the raycasts couldn't see that far, so I'm going to keep working on it. I'll probably try harder algorithms later to do surfing and stuff, but this is pretty good.

pastra98 commented 3 years ago

Awesome! I'll be sure to check out the code and see what inputs you're feeding to the neural nets. Let me know if you have any cool videos to share or something. And don't hesitate to ping me if there are any issues with NEAT 🙂

Gumbo64 commented 2 years ago

It's like 2am where I'm at, but I totally did make a video and a bit of a write-up after you suggested it. It was actually pretty fun summarising all the stuff I did and showing it to people. I posted it on Reddit and got some views, which was fun, so I've started putting READMEs on my new projects. Hope you're having fun making things too! https://youtu.be/x8CDa-khAYY

pastra98 commented 2 years ago

Wow, that's really cool! I'll have to dig into your code to understand what's going on here haha. I guess NEAT didn't really work out in your case, how did you arrive at MCTS?

Racing the AI looks like fun, I'd imagine you could also train it for something like Tribes: Ascend or cs surf.

Gumbo64 commented 2 years ago

I think it was this video which showed me that if you have unrestricted access to your environment and it's deterministic, then it's usually much easier to use non-deep RL algorithms. They need much less (if any) training data and time, often with better performance.

So I started looking up each of the algorithms, and MCTS seemed to be the simplest. I had already been thinking about the simplest possible inputs/outputs back when I was using NEAT. I found that the player camera can be locked to the direction of movement, so the AI just has to choose an angle to travel towards. Then I realised I can get away with just 2 outputs, either W or A, so the agent always moves at max speed.

Normally MCTS is used in stuff like board games because they're fast to simulate and to search deeply (like chess AI), but since we only have 2 outputs here, a tree approach is reasonable too. MCTS is like brute forcing in that you simulate possible outcomes, but instead of just simulating, say, 8 moves ahead every time, MCTS decides which moves are more "promising" and goes deeper into those instead of wasting time on bad moves.
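The two-action MCTS described above can be sketched roughly like this (in Python for brevity; the toy steering environment, node structure, and constants are all illustrative assumptions, not the actual bhop-AI code). It shows the four standard phases, with UCB1 deciding which branches are "promising":

```python
import math
import random

# Toy deterministic environment: each tick the agent steers left or right
# and is rewarded for forward progress. This stands in for re-simulating
# the game state, which MCTS needs unrestricted access to.
ACTIONS = (-1, +1)  # e.g. strafe A / strafe W

def step(state, action):
    pos, heading = state
    heading += action * 0.3
    pos += math.cos(heading)  # progress depends on how straight we face
    return (pos, heading)

def rollout_value(state, depth=10):
    # random playout from `state`; final progress estimates its value
    for _ in range(depth):
        state = step(state, random.choice(ACTIONS))
    return state[0]

class Node:
    def __init__(self, state, parent=None):
        self.state = state
        self.parent = parent
        self.children = {}   # action -> Node
        self.visits = 0
        self.value = 0.0     # sum of rollout results

    def ucb1(self, c=1.4):
        # balances exploiting good branches vs exploring rarely-tried ones
        if self.visits == 0:
            return float("inf")
        return (self.value / self.visits
                + c * math.sqrt(math.log(self.parent.visits) / self.visits))

def mcts(root_state, iterations=500):
    root = Node(root_state)
    for _ in range(iterations):
        node = root
        # 1. selection: follow the most promising child until a leaf
        while len(node.children) == len(ACTIONS):
            node = max(node.children.values(), key=Node.ucb1)
        # 2. expansion: try one untried action from this node
        untried = [a for a in ACTIONS if a not in node.children]
        if untried:
            a = random.choice(untried)
            node.children[a] = Node(step(node.state, a), parent=node)
            node = node.children[a]
        # 3. simulation: random playout to estimate the new position
        value = rollout_value(node.state)
        # 4. backpropagation: credit the result up to the root
        while node is not None:
            node.visits += 1
            node.value += value
            node = node.parent
    # the action whose subtree got the most visits is the move to play
    return max(root.children.items(), key=lambda kv: kv[1].visits)[0]

best_action = mcts((0.0, 0.0))
print("best first action:", best_action)
```

With only two actions per tick, the tree stays narrow, which is exactly why the approach is tractable here despite the long horizons a bhop run involves.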

pastra98 commented 2 years ago

Ah, thanks for the detailed explanation. Applying some constraints makes problems like these a lot more approachable.

Best of luck with your further projects!

Gumbo64 commented 2 years ago

Thanks, you have fun too!