Description
I want to solve the following problem: place n points in a 2D ball of radius r so that the sum of the squared distances to each point's nearest neighbor is maximized. Below is my attempt, based on the constraint handling notebook:
import cma
import numpy as np

# problem instance: place n points in 2D into a ball of radius r
n = 3
r = 3

# sum of smallest squared distance to nearest neighbor
def objective(x):
    total = 0
    for i in range(n):
        best = 1e100
        for j in range(n):
            if j == i:
                continue
            dist = np.linalg.norm(x[2*i:2*i+2] - x[2*j:2*j+2])
            best = min(best, dist**2)
        total += best
    return -total

# all points must be inside of the (closed) ball
def constraints(x):
    c = np.zeros(n)
    for i in range(n):
        c[i] = np.linalg.norm(x[2*i:2*i+2]) - r
    return c

x0 = np.random.randn(2*n)
print("initial:", constraints(x0), objective(x0))
x, es = cma.fmin_con2(objective, x0, 1, constraints)
print("optimized:", constraints(x), objective(x))
xx = r * np.array([1, 0, -0.5, -np.sqrt(0.75), -0.5, np.sqrt(0.75)])
print("analytic:", constraints(xx), objective(xx))
Note that the problem is hard in the sense that the case distinction implicit in the minimum computation is active at the optimum. However, it is unclear to me why the problem is solved when the constraint is formulated with the norm, while squaring the norm results in a breakdown of AL-CMA-ES. It is also odd that the stability of the algorithm depends on the value of the radius.
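For concreteness, this is the squared variant I am referring to (the name constraints_squared is only for illustration; everything else is unchanged from the script above):

# squared-norm version of the same constraint; this is the variant
# where AL-CMA-ES breaks down for me
def constraints_squared(x):
    c = np.zeros(n)
    for i in range(n):
        c[i] = np.sum(x[2*i:2*i+2]**2) - r**2
    return c

# same call as above, only the constraint function is swapped
x, es = cma.fmin_con2(objective, x0, 1, constraints_squared)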
This should probably not be tagged as a bug, but rather as a problem related to stability and parameter tuning. It would be nice to make the algorithm succeed on this simple problem.
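One tuning experiment that might be worth exploring (I have not verified that it actually helps) is to normalize the constraint so that its magnitude does not scale with the radius, for example:

# hypothetical mitigation: divide by r so the constraint violation
# is of order 1 regardless of the radius value (untested sketch)
def constraints_normalized(x):
    c = np.zeros(n)
    for i in range(n):
        c[i] = np.linalg.norm(x[2*i:2*i+2]) / r - 1
    return c

x, es = cma.fmin_con2(objective, x0, 1, constraints_normalized)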