Closed schrum2 closed 2 years ago
First make a list of the possible block types, starting with the redstone block and the quartz block. Then change query_cppn_for_shape to work with that list. From there, we can alter the set to increase the number of blocks in the list.
Specifically, CPPNs should be initialized with one output neuron for each element in the list of block types, plus one additional output determining whether any block is present at all. query_cppn_for_shape
will then find which output neuron after the first has the highest activation and pick the corresponding block from the list. This operation is known as argmax: https://towardsdatascience.com/there-is-no-argmax-function-for-python-list-cd0659b05e49 (use option 2 or 3, and define
a standalone function to handle this operation).
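A minimal sketch of such a standalone argmax helper (the neuron layout here, with the first output gating block presence, is an assumption based on the description above):

```python
def argmax(values):
    """Return the index of the largest element in a list."""
    return max(range(len(values)), key=lambda i: values[i])

# Example: suppose the CPPN produced these activations, where
# activations[0] gates presence and the rest score block types.
activations = [0.2, -0.5, 0.9, 0.1]
print(argmax(activations[1:]))  # index 1 -> second block type wins
```

Since plain Python lists have no built-in argmax, wrapping it once like this keeps query_cppn_for_shape readable.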
Instead of copying what was done in train.py, as we initially planned, we went a different route. First, an argmax function was added so that the maximum value in the list of outputs can be selected. That index corresponds to a block type in its own list, which is the block that ends up getting placed. For this all to work, the number of outputs had to scale with the number of block types specified in the list. It's also worth noting that air "blocks" are kept separate from this list to avoid issues with indexing.
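The selection step described above can be sketched roughly like this; the BLOCK_TYPES names, the pick_block helper, and the threshold are hypothetical illustrations, not the actual implementation:

```python
# Hypothetical block list; the real project scales this up.
BLOCK_TYPES = ["REDSTONE_BLOCK", "QUARTZ_BLOCK"]

def argmax(values):
    """Index of the largest element in a list."""
    return max(range(len(values)), key=lambda i: values[i])

def pick_block(outputs, presence_threshold=0.0):
    """outputs[0] gates whether any block appears at all;
    outputs[1:] score each entry of BLOCK_TYPES.
    Air is handled separately and is never part of BLOCK_TYPES,
    which keeps the indexing of outputs[1:] aligned with the list."""
    if outputs[0] <= presence_threshold:
        return None  # air: place no block
    return BLOCK_TYPES[argmax(outputs[1:])]

print(pick_block([0.5, 0.1, 0.8]))   # QUARTZ_BLOCK
print(pick_block([-0.2, 0.1, 0.8]))  # None (air)
```

Because the CPPN has one output per list entry plus the presence output, adding a block type to BLOCK_TYPES only requires one more output neuron.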
Screenshot of structures with 6 different block types (and Mel's fences!)
The blocks generated by evolve_CPPN.py are a starting point, but clearly we need more. The code for train.py
already generates a wide variety of block types. The approach used there isn't necessarily the greatest, but it does more than what our code currently does, so recreating that approach in our code seems like a reasonable starting point. Modify the code (and probably the config file too) to generate a much wider range of possible block types.