- What for?
  - A Hypervolume Newton Method for solving continuous multiobjective optimization problems (MOPs),
  - enabled by the analytical computation of the Hypervolume Hessian matrix.
- Why?
  - The Newton method exhibits local quadratic convergence under mild conditions on the objective functions.
  - You may also use the Hypervolume Newton method to refine the final outcome of other direct optimizers.
- When to use it? The objective function should be at least twice continuously differentiable.
Specifically, you will find the following major functionalities:

- hvd.HypervolumeDerivatives: the analytical computation of the HV Hessian, specifically Alg. 2 described in [DEW22].
- hvd.HVN: the Hypervolume Newton Method for (constrained) multi-objective optimization problems in [WED+22].
A pypi package will be available soon. For now, please install the latest version from the master branch:
git clone https://github.com/wangronin/HypervolumeDerivatives.git
cd HypervolumeDerivatives && python setup.py install --user
Hypervolume (HV) Indicator of a point set
Consider first the simplest setting: compute the HV Hessian w.r.t. the objective points directly, i.e., the objective function is an identity mapping:
import numpy as np
from hvd import HypervolumeDerivatives
ref = np.array([9, 10, 12])
hvh = HypervolumeDerivatives(3, 3, ref, minimization=True)
out = hvh.compute_hessian(X=np.array([[5, 3, 7], [2, 1, 10]]))
# out["HVdY2"] = np.array(
# [
# [0, 3, 7, 0, 0, -7],
# [3, 0, 4, 0, 0, -4],
# [7, 4, 0, 0, 0, 0],
# [0, -0, 0, 0, 2, 9],
# [-0, 0, 0, 2, 0, 7],
# [-7, -4, 0, 9, 7, 0],
# ]
# )
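The Hessian above can be cross-checked numerically without the package. For exactly two mutually non-dominated points under minimization, the dominated volume follows from inclusion-exclusion, and second-order central differences recover the same matrix (`hv_two_points` and `numerical_hessian` are illustrative helpers written for this sketch, not part of `hvd`):

```python
import numpy as np

ref = np.array([9.0, 10.0, 12.0])

def hv_two_points(y):
    # HV of two mutually non-dominated points a, b (minimization) via
    # inclusion-exclusion: vol(box(a)) + vol(box(b)) - vol(overlap)
    a, b = y[:3], y[3:]
    return (np.prod(ref - a) + np.prod(ref - b)
            - np.prod(ref - np.maximum(a, b)))

def numerical_hessian(f, y, eps=1e-4):
    # second-order central differences
    n = len(y)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = eps
            ej = np.zeros(n); ej[j] = eps
            H[i, j] = (f(y + ei + ej) - f(y + ei - ej)
                       - f(y - ei + ej) + f(y - ei - ej)) / (4 * eps ** 2)
    return H

y = np.array([5.0, 3.0, 7.0, 2.0, 1.0, 10.0])  # the two points, stacked
H_num = numerical_hessian(hv_two_points, y)
```

Up to finite-difference noise, `H_num` matches the `HVdY2` matrix shown above.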
Compute the HV Hessian w.r.t. the decision points.
First, we define an objective function mapping the decision space to the three-dimensional objective space:
import numpy as np
c1 = np.array([1.5, 0, np.sqrt(3) / 3])
c2 = np.array([1.5, 0.5, -1 * np.sqrt(3) / 6])
c3 = np.array([1.5, -0.5, -1 * np.sqrt(3) / 6])
ref = np.array([24, 24, 24])
def MOP1(x):
x = np.array(x)
return np.array(
[
np.sum((x - c1) ** 2),
np.sum((x - c2) ** 2),
np.sum((x - c3) ** 2),
]
)
def MOP1_Jacobian(x):
x = np.array(x)
return np.array(
[
2 * (x - c1),
2 * (x - c2),
2 * (x - c3),
]
)
def MOP1_Hessian(x):
x = np.array(x)
return np.array([2 * np.eye(3), 2 * np.eye(3), 2 * np.eye(3)])
Then, we compute the HV Hessian w.r.t. the decision points:
hvh = HypervolumeDerivatives(
n_decision_var=3, n_objective=3, ref=ref, func=MOP1, jac=MOP1_Jacobian, hessian=MOP1_Hessian
)
w = np.random.rand(20, 3)
w /= np.sum(w, axis=1).reshape(-1, 1)
X = w @ np.vstack([c1, c2, c3])
out = hvh.compute(X)
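Under the hood, the Hessian w.r.t. a decision point is obtained by composing the objective-space HV derivatives with the derivatives of the objective map via the chain rule. A sketch for a single point $x$ (see [DEW22] for the full set-wise expression):

```latex
\nabla^2_x \, HV(F(x)) \;=\;
J_F(x)^\top \, \nabla^2_Y HV \, J_F(x)
\;+\; \sum_{k=1}^{m} \frac{\partial HV}{\partial y_k}\, \nabla^2 F_k(x),
```

where $J_F$ is the Jacobian of $F$ and $\nabla^2 F_k$ is the Hessian of the $k$-th objective; this is why `HypervolumeDerivatives` asks for `func`, `jac`, and `hessian`.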
The Hypervolume Newton Method is straightforward given the analytical computation of the HV Hessian.
Note that the code base also implements an HV Newton method for equality-constrained MOPs.
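Schematically, one iteration applies a Newton step to the stacked point set $\mathbf{X} = (x^{(1)}, \dots, x^{(\mu)})$; this is only a sketch, as the actual implementation in [WED+22] additionally handles constraints and box bounds:

```latex
\mathbf{X}_{t+1} \;=\; \mathbf{X}_t \;-\;
\big(\nabla^2 HV(\mathbf{X}_t)\big)^{-1}\,\nabla HV(\mathbf{X}_t)
```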
We reuse the MOP1 problem defined above:
import numpy as np
from hvd.newton import HVN
max_iters = 10
mu = 20
ref = np.array([20, 20, 20])
w = np.random.rand(mu, 3)
w /= np.sum(w, axis=1).reshape(-1, 1)
x0 = w @ np.vstack([c1, c2, c3])
opt = HVN(
dim=3,
n_objective=3,
ref=ref,
func=MOP1,
jac=MOP1_Jacobian,
hessian=MOP1_Hessian,
mu=len(x0),
x0=x0,
lower_bounds=-2,
upper_bounds=2,
minimization=True,
max_iters=max_iters,
verbose=True,
)
X, Y, stop = opt.run()
The hypervolume indicator (HV) of a point set is the m-dimensional Lebesgue measure of the region that is jointly dominated by the set of objective function vectors and bounded by a reference point.
We show an example of the 3D hypervolume indicator and the geometrical meaning of its partial derivatives below.
In this chart, three objective functions are minimized and three objective points are depicted.
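For intuition, the indicator is easy to compute by hand in two dimensions: sweep the points in order of the first objective and sum the rectangular slices up to the reference point. A minimal sketch (`hv_2d` is an illustrative helper, not part of `hvd`):

```python
def hv_2d(points, ref):
    """2-D hypervolume (minimization): sort by the first objective and
    accumulate the rectangle each non-dominated point adds."""
    hv, y_best = 0.0, ref[1]
    for y1, y2 in sorted(map(tuple, points)):
        if y2 < y_best:  # point is non-dominated by those already swept
            hv += (ref[0] - y1) * (y_best - y2)
            y_best = y2
    return hv

print(hv_2d([[1, 3], [2, 2], [3, 1]], ref=[4, 4]))  # staircase of area 6
```

Dominated points are skipped automatically, since they never improve on the best second objective seen so far.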
Also, the folder mathematica/ contains several cases of the hypervolume indicator Hessian computed symbolically using Mathematica.
- [WED+22] Wang, H.; Emmerich, M. T. M.; Deutz, A.; Hernández, V. A. S.; Schütze, O. The Hypervolume Newton Method for Constrained Multi-objective Optimization Problems. Preprints 2022, 2022110103.
- [DEW22] Deutz, A.; Emmerich, M. T. M.; Wang, H. The Hypervolume Indicator Hessian Matrix: Analytical Expression, Computational Time Complexity, and Sparsity. arXiv, 2022.