Solution methods and parameters¶
Standard reformulation¶
Consider the chance constraint

\[
P\left(\sum_{i=1}^n (c_i^Tx + d_i)\, z_i \leq b\right) \geq 1-\epsilon,
\]

where \(z \sim \mathcal{N}(\mu,\Sigma)\) is a vector in \(\mathbb{R}^n\) of jointly normal random variables with mean \(\mu\) and covariance matrix \(\Sigma\). JuMPChance currently supports only a diagonal covariance matrix \(\Sigma\), i.e., all variables are independent, but we present the more general case here. For simplicity, we can introduce a new set of variables \(y_i = c_i^Tx + d_i\) and reduce the constraint to:

\[
P\left(y^Tz \leq b\right) \geq 1-\epsilon.
\]
Recall that \(y^Tz\) is normally distributed with mean \(y^T\mu\) and variance \(y^T\Sigma y\). Then

\[
P\left(y^Tz \leq b\right) = \Phi\left(\frac{b - y^T\mu}{\sqrt{y^T\Sigma y}}\right),
\]
where \(\Phi\) is the standard normal cumulative distribution function.
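This distributional fact is easy to verify by simulation; the following sketch uses illustrative numbers chosen only for the check, not data from the package.

```python
# Monte Carlo sanity check (illustrative numbers) that y^T z is normal
# with mean y^T mu and variance y^T Sigma y when z ~ N(mu, Sigma).
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -0.5])
Sigma = np.diag([0.25, 1.0])   # diagonal covariance, as JuMPChance supports
y = np.array([2.0, 3.0])

samples = rng.multivariate_normal(mu, Sigma, size=200_000) @ y
print(samples.mean())  # close to y^T mu = 0.5
print(samples.var())   # close to y^T Sigma y = 0.25*4 + 1.0*9 = 10.0
```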
Therefore the chance constraint is satisfied if and only if

\[
\Phi\left(\frac{b - y^T\mu}{\sqrt{y^T\Sigma y}}\right) \geq 1-\epsilon,
\]
or, since \(\Phi^{-1}\) is monotonically increasing,

\[
\frac{b - y^T\mu}{\sqrt{y^T\Sigma y}} \geq \Phi^{-1}(1-\epsilon),
\]
which is

\[
y^T\mu + \Phi^{-1}(1-\epsilon)\sqrt{y^T\Sigma y} \leq b.
\]
For \(\epsilon < \frac{1}{2}\), \(\Phi^{-1}(1-\epsilon) > 0\), so the above constraint is convex and equivalent to

\[
y^T\mu + \Phi^{-1}(1-\epsilon)\left\|\Sigma^{\frac{1}{2}}y\right\|_2 \leq b,
\]
which is a second-order conic constraint, where \(\Sigma^{\frac{1}{2}}\) is the square root of \(\Sigma\).
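As a numerical check of this equivalence (with made-up data, not an example from the package): if \(b\) is chosen so that the conic constraint holds with equality, the original chance constraint holds with probability exactly \(1-\epsilon\).

```python
# Check that the second-order conic reformulation is tight: choosing b so
# that y^T mu + Phi^{-1}(1-eps) * ||Sigma^{1/2} y|| = b makes the chance
# constraint P(y^T z <= b) hold with probability exactly 1 - eps.
import numpy as np
from scipy.stats import norm

eps = 0.05
y = np.array([1.0, -2.0, 0.5])
mu = np.array([0.3, 0.1, -0.2])
variances = np.array([0.4, 0.9, 0.25])  # diagonal Sigma, as JuMPChance supports
Sigma = np.diag(variances)

# Left-hand side of the second-order conic constraint; set b to make it active.
soc_lhs = y @ mu + norm.ppf(1 - eps) * np.linalg.norm(np.sqrt(variances) * y)
b = soc_lhs

# y^T z ~ N(y^T mu, y^T Sigma y), so P(y^T z <= b) = Phi((b - y^T mu)/sqrt(y^T Sigma y)).
prob = norm.cdf((b - y @ mu) / np.sqrt(y @ Sigma @ y))
print(round(prob, 6))  # 0.95, i.e. exactly 1 - eps
```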
Methods for distributionally robust constraints¶
Following the notation in the quick start guide, a distributionally robust chance constraint can be formulated as

\[
y^T\mu + \Phi^{-1}(1-\epsilon)\left\|\Sigma^{\frac{1}{2}}y\right\|_2 \leq b \quad \forall\, \mu \in U_\mu,\ \Sigma \in U_\Sigma,
\]

where \(U_\mu\) and \(U_\Sigma\) are the given uncertainty sets for the mean and covariance. This is a convex constraint because it is the intersection of a large (possibly infinite) set of convex constraints. These are challenging to reformulate into an explicit conic form. Instead, we approximate the constraint by a sequence of linear tangents: given a point \(y\), we detect whether the constraint is violated for any choice of \(\mu\) or \(\Sigma\), and if so we add a separating hyperplane, which is simple to compute.
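The separation step described above can be sketched as follows for simple interval uncertainty sets. All names (`mu_lo`, `var_hi`, `separating_hyperplane`, etc.) are illustrative and are not JuMPChance's internal API; this is a sketch of the general idea, not the package's implementation.

```python
# Sketch of a separation oracle for a distributionally robust constraint
#   y^T mu + Phi^{-1}(1-eps) * sqrt(sum_i sigma_i^2 y_i^2) <= b
# over hypothetical interval uncertainty sets mu_i in [mu_lo_i, mu_hi_i]
# and sigma_i^2 in [var_lo_i, var_hi_i].
import numpy as np
from scipy.stats import norm

def separating_hyperplane(y, b, eps, mu_lo, mu_hi, var_lo, var_hi, tol=1e-6):
    # Worst-case mean maximizes y^T mu componentwise.
    mu_star = np.where(y >= 0, mu_hi, mu_lo)
    # Worst-case variance: the coefficient y_i^2 is nonnegative, so the
    # upper bound var_hi is always worst.
    std_term = np.sqrt(var_hi @ (y ** 2))
    lhs = y @ mu_star + norm.ppf(1 - eps) * std_term
    if lhs <= b + tol:
        return None  # y is feasible for every distribution in the set
    # Gradient of the (convex) left-hand side at y; the tangent
    #   lhs + grad^T (y' - y) <= b
    # is a valid linear cut separating y from the feasible region.
    grad = mu_star + norm.ppf(1 - eps) * (var_hi * y) / max(std_term, 1e-12)
    return lhs, grad
```

A cutting-plane loop would call such an oracle after each relaxation solve and add the returned tangent until no violated constraint remains.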
solve parameters¶
The `solve` method has the following optional keyword parameters when invoked on a JuMPChance model:

- `method::Symbol`, either `:Reformulate` to use the second-order conic formulation or `:Cuts` to approximate the constraints by a sequence of linear outer approximations. Defaults to `:Reformulate`.
- `linearize_objective::Bool`, either `true` or `false`, indicating whether to provide a convex quadratic objective directly to the solver or to use linear outer approximations. Defaults to `false`.
- `probability_tolerance::Float64`, chance constraints are considered satisfied if they actually hold with probability \(1-\epsilon\) minus the given tolerance. Defaults to `0.001`.
- `debug::Bool`, enables debugging output for the outer approximation algorithm. Defaults to `false`.
- `iteration_limit::Int`, limits the number of iterations performed by the outer approximation algorithm. (In each iteration, a single linearization is added for each violated constraint.) Defaults to `60`.
- `objective_linearization_tolerance::Float64`, absolute term-wise tolerance used when linearizing quadratic objectives. Defaults to `1e-6`.
- `reformulate_quadobj_to_conic::Bool`, if `true`, automatically reformulates a quadratic objective into second-order conic form. This is necessary for some solvers, such as Mosek or ECOS, which don't support mixing quadratic and conic constraints. Defaults to `false` unless the solver is ECOS.
- `lazy_constraints::Bool`, if `true` and `method == :Cuts`, uses lazy constraints instead of re-solving the mixed-integer relaxation in a loop. This option is experimental and not yet implemented for distributionally robust problems. Defaults to `false`.