Inspired by David Ha's *A Visual Guide to Evolution Strategies*, this implements the CMA-ES optimization algorithm. The original 2001 paper is what I used to hack out the core algorithm.
```python
import numpy as np

def rastrigin(X, A=10):
    # Standard Rastrigin: A*d + sum(x_i**2 - A*cos(2*pi*x_i)) over the last axis;
    # global minimum of 0 at the origin
    return A * X.shape[-1] + np.sum(X**2 - A * np.cos(2 * np.pi * X), -1)
```
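As a quick sanity check, the standard Rastrigin form (with the A·d offset) attains its global minimum of 0 at the origin; a self-contained sketch evaluating a small batch of 2-D points, one per row:

```python
import numpy as np

def rastrigin(X, A=10):
    # Standard Rastrigin with the A*d offset; global minimum 0 at the origin
    return A * X.shape[-1] + np.sum(X**2 - A * np.cos(2 * np.pi * X), -1)

# Evaluate a batch of 3 points in 2-D (shape (3, 2)); rows are points
points = np.array([[0.0, 0.0], [1.0, 1.0], [0.5, -0.5]])
print(rastrigin(points))  # first entry is 0.0, the global minimum
```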
```python
# Sample an initial population and evaluate fitness
X = np.random.normal(0, 1.24, (d, n))
fitness = rastrigin(X.T)

# Keep the k best (lowest-fitness) individuals
arg_topk = np.argsort(fitness)[:k]
topk = X[:, arg_topk]

# Covariance of the elite, centered on the full population mean
centered = topk - X.mean(1, keepdims=True)
C = (centered @ centered.T) / (k - 1)
```
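The selection-and-covariance step can be exercised on a tiny toy population; the shapes here (d=2, n=6, k=3) and the toy fitness are made up purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, k = 2, 6, 3

X = rng.normal(0, 1.0, (d, n))      # population: one column per individual
fitness = np.sum(X**2, axis=0)      # toy fitness: squared distance from origin
arg_topk = np.argsort(fitness)[:k]  # indices of the k best (lowest) scores
topk = X[:, arg_topk]               # elite subset, shape (d, k)

# Elite covariance, centered on the full population mean as in the text
centered = topk - X.mean(1, keepdims=True)
C = (centered @ centered.T) / (k - 1)
print(C.shape)  # (2, 2), symmetric
```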
* Sample the new population for the next iteration from this covariance matrix, centered on the mean of the top k
```python
import numpy.linalg as la

# Eigendecomposition of C lets us draw samples with covariance C
w, E = la.eigh(C)
N = np.random.normal(size=(d, n))
X = topk.mean(1, keepdims=True) + (E @ np.diag(np.sqrt(w)) @ N)
```
```python
n = 100  # Population size
d = 2    # Dimensions
k = 25   # Size of elite population

# Initial random sample
X = np.random.normal(0, 1.24, (d, n))
```
*Note: the official implementation of CMA-ES should be used for any actual use cases.*
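Putting the pieces together, a minimal end-to-end sketch of this simplified loop on Rastrigin (fixed hyperparameters as above; the seed and iteration count are arbitrary, and the step-size and evolution-path machinery of full CMA-ES are deliberately omitted):

```python
import numpy as np
import numpy.linalg as la

def rastrigin(X, A=10):
    # Standard Rastrigin with the A*d offset; global minimum 0 at the origin
    return A * X.shape[-1] + np.sum(X**2 - A * np.cos(2 * np.pi * X), -1)

n, d, k = 100, 2, 25
rng = np.random.default_rng(42)

# Initial random sample: one column per individual
X = rng.normal(0, 1.24, (d, n))

for _ in range(100):
    fitness = rastrigin(X.T)
    arg_topk = np.argsort(fitness)[:k]
    topk = X[:, arg_topk]

    # Elite covariance around the full-population mean
    centered = topk - X.mean(1, keepdims=True)
    C = (centered @ centered.T) / (k - 1)

    # Sample the next generation around the elite mean with covariance C
    w, E = la.eigh(C)
    w = np.maximum(w, 1e-12)  # guard against tiny negative eigenvalues
    N = rng.normal(size=(d, n))
    X = topk.mean(1, keepdims=True) + (E @ np.diag(np.sqrt(w)) @ N)

print(rastrigin(X.T).min())  # typically small: near a local or global minimum
```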