evolvio4all / evolvio4JavaScript


More senses #17

Closed · Air1N closed 5 years ago

Air1N commented 5 years ago

I think we need more senses in order for the creatures to be able to interact with the world more easily. Sight would be a useful start.

Air1N commented 5 years ago

@winnie334 maybe you'd like to work on this?

winnie334 commented 5 years ago

Thanks for thinking of me! While it definitely sounds interesting, I don't know if it's the best idea. Recall from the "real" evolvio that sight was basically used to scan below the creature itself, and that it brought a lot of trouble to carykh. Plus, personally I don't really see any improvements being made after a while of training, so I'm not sure if this would fix that.

What I was kinda planning to do is to try to give some more UI information, such as displaying the population count, years passed, food on a tile, and so on (also maybe like the original evolvio). I'm kinda busy though, so I can't make any promises! Hopefully I'll find some time :)

Air1N commented 5 years ago

I've been thinking the UI needed a rework, so it'd be awesome if you did that. However, the creatures won't be able to develop into predators without the ability to see. In the original evolv.io it was a single neural network with an uncommon "memory" system, so that posed a huge problem. I believe the LSTM that this version uses will enable creatures to remember that they're attacking something and chase it. (look into ... I forgot, they already did a thing like this)

However, I want to use a different eye system than cary did. Instead of lines (1D) I think the creatures should see an area (2D) in front of (or around) them. I don't know the exact specifics we could use, but basically it would allow the creature to use the tile's color (or, if a creature is on the tile, the creature's color) converted to a number as an input.
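To make that concrete, here's a rough sketch of what those 2D vision inputs could look like. The helpers `world.tileAt`, `world.creatureAt`, and `colorToScalar` are hypothetical names for illustration, not anything in the codebase yet:

```js
// Map an {r, g, b} color (0-255 each) onto a single number in [-1, 1].
function colorToScalar(color) {
  const brightness = (color.r + color.g + color.b) / (3 * 255); // 0..1
  return brightness * 2 - 1;
}

// Sample a width x depth patch of tiles in front of the creature and
// return one input per tile: the creature's color if a creature stands
// on the tile, otherwise the tile's own color.
function visionInputs(world, creature, width, depth) {
  const inputs = [];
  const half = Math.floor(width / 2);
  for (let d = 1; d <= depth; d++) {
    for (let w = -half; w <= half; w++) {
      // Project the (forward d, sideways w) offset into world
      // coordinates using the creature's heading angle.
      const x = creature.x + Math.cos(creature.angle) * d - Math.sin(creature.angle) * w;
      const y = creature.y + Math.sin(creature.angle) * d + Math.cos(creature.angle) * w;
      const other = world.creatureAt(x, y);
      const color = other ? other.color : world.tileAt(x, y).color;
      inputs.push(colorToScalar(color));
    }
  }
  return inputs; // fed into the brain as extra sense inputs
}
```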

Air1N commented 5 years ago

Oh hey I found it https://www.youtube.com/watch?v=PoGwxgYHMe8

winnie334 commented 5 years ago

Ah, now I see what you mean. Multiple "raycasts" to check what's in front of them. That honestly sounds like a pretty cool idea! I'm not at all familiar with the brain though, and LSTMs are not really my cup of tea (I know NEATs a bit better). Presenting the field of view with lines and stuff should be right up my alley however :)

Air1N commented 5 years ago

Alright, so the brain is just a series of NEAT networks that are connected. They each modify a "state" and that's basically the memory.

So, first the forget network decides what to keep and what to forget from the state: its outputs get multiplied by the state, so an output of 0 means forget, 1 means keep, and everything in-between is a partial keep.

Then the decide and modify networks work together to decide what to store in the state (i.e. what to add to the current state). The decide network can output -1 to 1; this is what the brain wants to add to the state. The modify network can output 0 to 1, which means it can very easily decide not to add anything even if the decide network wants to (by outputting 0). Their outputs are multiplied together, and then added to the state (after the forget network has decided what to forget and keep).

THEN (the tanh of) the state gets multiplied by the output of the main network (which also outputs 0 to 1). Basically, the other three networks control the memory and the main network actually controls the actions.

The state is both modified by these networks and fed back into them as input. The previous outputs are fed back in as inputs as well.
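Putting that all together, a minimal sketch of one brain step might look like this. The four `.activate()` calls stand in for the connected NEAT networks; that method name and the field names are assumptions for illustration, not the project's actual API:

```js
// One tick of the LSTM-like brain described above.
function stepBrain(brain, senses) {
  // Senses, the current state, and the previous actions are all
  // visible to every one of the four networks.
  const inputs = senses.concat(brain.state, brain.prevActions);

  const forget = brain.forgetNet.activate(inputs); // each 0..1
  const decide = brain.decideNet.activate(inputs); // each -1..1
  const modify = brain.modifyNet.activate(inputs); // each 0..1
  const main   = brain.mainNet.activate(inputs);   // each 0..1

  for (let i = 0; i < brain.state.length; i++) {
    // 1. Forget: scale the old state (0 = forget, 1 = keep).
    brain.state[i] *= forget[i];
    // 2. Store: decide says what to add, modify gates how much of it.
    brain.state[i] += decide[i] * modify[i];
  }

  // 3. Act: tanh of the state, multiplied by the main network's
  // output, drives the actual actions.
  brain.prevActions = brain.state.map((s, i) => Math.tanh(s) * main[i]);
  return brain.prevActions;
}
```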

winnie334 commented 5 years ago

Oh, super cool they are related! If I understand correctly, LSTM uses NEATs, just like a NEAT uses feed-forward neural networks.

Thanks for the explanation of all the parts, it is very interesting to learn how it is designed. These networks sound really powerful if they work as intended. I'm guessing that each individual NEAT network is trained/modified based on how long the creature survived? Also, when I inspect a creature, there are 4 values under every action. Are those values the outputs of the forget, decide, modify and main networks?

Anyhow, sadly even after running the simulation for hours I didn't get really interesting results. The population stays at 50 and creatures keep doing seemingly random stuff (only eating, running forward in a straight line, etc.). No idea if it's the same for you; perhaps I'm just unlucky. Still, it'd be nice to see the creatures learn to eat after a few years.

Thanks again for the really nice explanation, and sorry for all my questions. I'm glad I'm learning some things with this project though, which is great! :)

EDIT: never mind that last part, I just downloaded your brain update and it makes a huge difference. Even after a few years I could see evolution happening, in some cases extremely fast (screenshot attached).

Super cool to see, thanks! :D

Air1N commented 5 years ago

You can press "d" to turn off the direction lines; right now there's a bug where the species becomes "undefined" that I'm working on. The brain can actually do really interesting things, because it's designed to remember stuff, so I'm thinking it will be able to remember if it's attacking something. Yes, the 4 circles are the brains: forget, decide, modify, and main, respectively.

Air1N commented 5 years ago

So, do you want to do the sight? I'm asking because if not I will :P

winnie334 commented 5 years ago

Well, honestly I think you'd do a much better job at it ^^. Sight is super important and I don't want to mess it up, haha. (Plus my weekend is super busy, so it'd take a while before I could deliver anything meaningful.)

Air1N commented 5 years ago

Alright, I'll see if I can whip something up right now

Air1N commented 5 years ago

Alright, so the sight is not as I originally intended, but that's alright. We should work on scent and hearing; maybe we could achieve some interesting results. What I envision for scent is the ability to place a scent on a tile that other creatures can "smell", and for hearing, the ability to make a sound that the other creatures can hear.

Basically, they should be able to place a smell with, say, a value of 0.8, and then the creatures near it would get a 0.8 value as an input, of course averaging with the surrounding scents. Or perhaps even have 2 outputs: one being the strength (how far away it can be detected from: 0 to 1) and another being the type (this could mean anything; the creatures would use the value as an input: -1 to 1).

Hearing is basically the same situation, except it would only last as long as the creature is outputting above 0, whereas a smell would last a certain amount of time (maybe even deteriorating over time based on strength?).
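A rough sketch of how the scent part could work, under the two-output variant (strength and type). All of the names, the decay rule, and the averaging here are assumptions for illustration, nothing is implemented yet:

```js
const MAX_SCENT_LIFETIME = 100; // ticks; tuning value, purely an assumption

// A creature drops a scent on its current tile.
function placeScent(tile, strength, type) {
  tile.scent = { strength, type, age: 0 };
}

// Each simulation tick the scent ages out; stronger scents last longer.
function decayScents(world, dt) {
  for (const tile of world.tiles) {
    if (!tile.scent) continue;
    tile.scent.age += dt;
    if (tile.scent.age > tile.scent.strength * MAX_SCENT_LIFETIME) {
      tile.scent = null;
    }
  }
}

// What a creature smells: the average of nearby scent types, weighted
// by strength, so neighbouring scents blend into one input value.
function smellInput(world, creature, radius) {
  let weighted = 0, total = 0;
  for (const tile of world.tilesNear(creature.x, creature.y, radius)) {
    if (!tile.scent) continue;
    weighted += tile.scent.type * tile.scent.strength;
    total += tile.scent.strength;
  }
  return total > 0 ? weighted / total : 0; // -1..1, fed to the brain as an input
}
```

Hearing could reuse the same shape, but with the sound cleared as soon as the emitting creature stops outputting above 0 instead of decaying over time.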

Air1N commented 5 years ago

This is in the TODO; I think it's pretty important, maybe one of the most important things. (I also think multiple-parent procreation is important!)