I'm not fully sure how to do this, but I think there is potential here. The challenge is that BD is currently built on top of MuLambda, but Picbreeder uses selective breeding instead. You would need to create a way to provide access to some of the BD methods (can they be made static? Can they be moved to a BehavioralDiversityUtil class?).
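To illustrate the refactoring idea, here is a minimal sketch of what a static utility class could look like. Everything here is hypothetical: `BehavioralDiversityUtil` and `behaviorDistance` are placeholder names, and the sum-of-squared-errors metric is just a stand-in until the shared image-match metric is wired in.

```java
// Hypothetical utility class: pulling the behavior-comparison logic out of the
// MuLambda-based BD code into static methods would let Picbreeder's selective
// breeding call the same routines without instantiating a MuLambda EA.
public class BehavioralDiversityUtil {

    // Static so any selection scheme can call it. The real metric should be
    // the one shared with the image match task; sum of squared differences
    // between behavior characterizations is a placeholder.
    public static double behaviorDistance(double[] a, double[] b) {
        double sum = 0;
        for (int i = 0; i < a.length; i++) {
            double d = a[i] - b[i];
            sum += d * d;
        }
        return sum;
    }

    public static void main(String[] args) {
        double[] imgA = {1, 2, 3};
        double[] imgB = {1, 0, 3};
        System.out.println(behaviorDistance(imgA, imgB));
    }
}
```

Making the methods static (rather than instance methods of the MuLambda class) is the key point: it removes the dependency on a particular evolutionary loop.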
As far as what behavioral metric to use, you should use the same one from the image match task (avoid duplicating code by sharing the method somehow).
Ultimately, I think the algorithm would work like this (but open to suggestions): After the user selects some images, generate twice as many images as are needed to fill the next generation. Still save the images the user selected, but all other slots should be filled in the following manner: use behavioral diversity to select the most diverse collection of images out of those generated.
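The slot-filling step above could be sketched as a greedy max-min selection: start from the user-kept images and repeatedly add the generated candidate whose minimum behavior distance to everything already kept is largest. All names here (`DiverseSelection`, `fillSlots`, the Euclidean distance) are placeholders, not existing MM-NEAT code, and behaviors are plain `double[]` vectors for simplicity.

```java
import java.util.ArrayList;
import java.util.List;

public class DiverseSelection {

    // Stand-in behavioral metric (Euclidean distance between behavior
    // characterizations); the real version should reuse the image match metric.
    static double distance(double[] a, double[] b) {
        double sum = 0;
        for (int i = 0; i < a.length; i++) {
            double d = a[i] - b[i];
            sum += d * d;
        }
        return Math.sqrt(sum);
    }

    // Fill `slots` open positions: each round, take the candidate whose
    // minimum distance to everything already selected is largest.
    static List<double[]> fillSlots(List<double[]> kept,
                                    List<double[]> candidates, int slots) {
        List<double[]> selected = new ArrayList<>(kept);
        List<double[]> pool = new ArrayList<>(candidates);
        for (int s = 0; s < slots && !pool.isEmpty(); s++) {
            double[] best = null;
            double bestScore = -1;
            for (double[] c : pool) {
                double minDist = Double.MAX_VALUE;
                for (double[] k : selected) {
                    minDist = Math.min(minDist, distance(c, k));
                }
                if (minDist > bestScore) {
                    bestScore = minDist;
                    best = c;
                }
            }
            selected.add(best);
            pool.remove(best);
        }
        return selected;
    }

    public static void main(String[] args) {
        // User kept one image; twice as many candidates were generated
        // as there are open slots (2 slots, 4 candidates).
        List<double[]> kept = new ArrayList<>();
        kept.add(new double[]{0, 0});
        List<double[]> candidates = new ArrayList<>();
        candidates.add(new double[]{0.1, 0.1}); // near the kept image
        candidates.add(new double[]{5, 5});     // far away
        candidates.add(new double[]{5.1, 5.1}); // near the previous candidate
        candidates.add(new double[]{-4, 3});    // a different direction
        List<double[]> result = fillSlots(kept, candidates, 2);
        // Print the behaviors chosen for the two open slots.
        for (int i = 1; i < result.size(); i++) {
            System.out.println(result.get(i)[0] + "," + result.get(i)[1]);
        }
    }
}
```

Note the greedy approach avoids nearby pairs: after picking one of the two far-away-but-similar candidates, the other scores poorly, so the next pick comes from a different region of behavior space.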
Naturally, this feature should be optional ... a command line parameter can switch between it and regular Picbreeder.