Currently, the world has an interface for fetching all objects within a radius of a given position,
but this might not be quite what we need.
The animal brain's input should be all nearby objects, each with its distance and the angle between the animal's forward direction and the direction to the object.
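A minimal sketch of that per-object computation, assuming positions are (x, y) tuples and the heading is an angle in radians (none of these names exist in the codebase yet; they are illustrative):

```python
import math

def distance_and_relative_angle(pos, heading, obj_pos):
    """Distance to obj_pos, plus the signed angle (wrapped into
    [-pi, pi)) between the animal's forward direction and the
    direction from pos to obj_pos. Negative = object to one side,
    positive = the other."""
    dx = obj_pos[0] - pos[0]
    dy = obj_pos[1] - pos[1]
    dist = math.hypot(dx, dy)
    angle_to_obj = math.atan2(dy, dx)
    # Wrap the difference into [-pi, pi) so left/right are symmetric.
    rel = (angle_to_obj - heading + math.pi) % (2 * math.pi) - math.pi
    return dist, rel
```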
We may need to think ahead to how this will be fed into the neural network. We have a bunch of numbers and associations, a 2×n matrix, and it's not obvious how that maps onto a network layer. We don't want the animal to have to learn the structure of the input, i.e. that this number goes with that one because it is the corresponding distance.
One option would be something akin to a ray diagram: each input node stands for a certain relative angle, and the input value is the distance to the closest object at that angle.
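The ray idea could be sketched like this: bucket nearby objects into a fixed number of angular sectors around the animal's heading, and report per sector the distance to the nearest object (or the max sensing range if the sector is empty). Function name, parameters, and the list-of-tuples object representation are all assumptions, not existing code:

```python
import math

def ray_inputs(pos, heading, objects, n_rays=8, max_range=10.0):
    """Fixed-length brain input: one slot per angular sector.
    Each slot holds the distance to the nearest object whose
    relative bearing falls in that sector, or max_range if the
    sector is empty. The input shape stays the same no matter
    how many objects are nearby, so the network never has to
    learn which distance belongs to which object."""
    inputs = [max_range] * n_rays
    sector = 2 * math.pi / n_rays
    for ox, oy in objects:
        dx, dy = ox - pos[0], oy - pos[1]
        dist = math.hypot(dx, dy)
        if dist > max_range:
            continue  # outside sensing range, ignore
        # Relative bearing in [0, 2*pi), measured from the heading.
        rel = (math.atan2(dy, dx) - heading) % (2 * math.pi)
        i = int(rel // sector) % n_rays
        inputs[i] = min(inputs[i], dist)
    return inputs
```

With 4 rays and the animal at the origin facing east, an object at (2, 0) lands in sector 0 and one at (0, 3) in sector 1, giving `[2.0, 3.0, 10.0, 10.0]`. A nice property of this encoding is that each input node has a fixed meaning (distance at a fixed relative angle), which is exactly the "don't make the animal learn the input structure" goal above.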
hmmm