Transform linear inequality constraints into bound constraints for optimization solvers

Suppose there are linear inequality constraints A*y >= b, where y are the parameters optimized by fmincon, lsqnonlin, or friends.
Since linear inequality constraints generally cannot be enforced at intermediate iterations, I want to transform them into bound constraints, which can be enforced at all iterations.
General idea:
If A is dense and has m rows (i.e., there are m constraints), then we can make a "change of variables"
z_i = a_i*y, where a_i are the rows of A; for instance, z_1 = a_1*y.
This leads to simple bound constraints z_i >= b_i. Does that make sense?
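As a minimal sketch of the idea (it assumes A is square and invertible, so that z = A*y really is a change of variables; objFun, z0 and b are placeholder names):
% Reparametrize: optimize over z = A*y instead of y, so the inequalities become bounds.
Ainv = inv(A);                                     % assumes A is square and well conditioned
objZ = @(z) objFun(Ainv*z);                        % objective expressed in the new variables
lb   = b;                                          % A*y >= b becomes z >= b
ub   = inf(size(b));
z    = fmincon(objZ, z0, [], [], [], [], lb, ub);  % only bound constraints remain
y    = Ainv*z;                                     % map the solution back to the original variables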
Specific problem:
My parameters y represent interpolation values corresponding to interpolation points x, and the resulting interpolation function is supposed to be convex on the interpolation domain. For the specific case of linear interpolation, @Matt J showed that convexity is realized by increasing second-order finite differences
(y(i+1) - y(i))/(x(i+1) - x(i)) - (y(i) - y(i-1))/(x(i) - x(i-1)) >= 0,   i = 2, ..., n-1
(which represents a set of linear inequality constraints A*y >= 0).
This can be transformed into simple bounds on new parameters z by the change of variables
z(1) = y(1),  z(2) = y(2),  z(i) = (y(i) - y(i-1))/(x(i) - x(i-1)) - (y(i-1) - y(i-2))/(x(i-1) - x(i-2)) for i = 3, ..., n,
with the bound constraints z(i) >= 0 for i = 3, ..., n (z(1) and z(2) left free).
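In code, this linear-interpolation case would look roughly as follows (my own sketch: the loop builds the divided-difference rows of A, and T is the proposed change-of-variables matrix):
% Convexity of the linear interpolant reads A*y >= 0, with A the matrix of
% second-order divided differences; z = T*y stacks y(1), y(2) and those differences.
n = numel(x);
A = zeros(n-2, n);
for i = 2:n-1
    h1 = x(i) - x(i-1);
    h2 = x(i+1) - x(i);
    A(i-1, i-1:i+1) = [1/h1, -(1/h1 + 1/h2), 1/h2];
end
T = [eye(2), zeros(2, n-2); A];   % z = T*y, bounds: z(3:end) >= 0, z(1:2) free
% T is invertible, so the original values are recovered as y = T\z.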
What I am really working with are smoother B-spline basis functions of higher degree, built with the spapi function. @Bruno Luong showed that the linear inequality matrix A is given by
% k: spline order, x: points where the values y live
B = spapi(k, x, eye(length(x)));   % B-form basis (identity values -> one basis function per column)
Bdd = fnder(B, 2);                 % second derivative of the basis
A = Bdd.coefs';                    % each row gives one linear constraint on y
and
A*y >= 0
is a sufficient condition for convexity.
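At the moment I enforce this as a generic linear inequality, roughly like this (a sketch; objFun and y0 are placeholders, and fmincon expects A*y <= b, hence the sign flip):
k = 4;                                                % e.g. cubic B-splines
B = spapi(k, x, eye(length(x)));
Bdd = fnder(B, 2);
A = Bdd.coefs';                                       % convexity: A*y >= 0
y = fmincon(objFun, y0, -A, zeros(size(A,1), 1));     % passed as -A*y <= 0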
I am wondering whether a change of variables (like the one for linear interpolation) can be applied here as well, to transform the inequalities into bounds.
Any opinions are greatly appreciated!
1 Comment
John D'Errico on 3 Oct 2024
For example, it is possible to convert a set of linear inequality constraints into a bound constrained problem by the introduction of what are called slack variables.
That is, if you have an inequality
A*x >= b
then you CAN write it as an equality
A*x == b + y
where y is a new (slack) variable that is constrained to be non-negative. If you have multiple constraints, then you would introduce one slack variable for each inequality constraint.
Such slack variables are a common artifice in mathematics, used for example to solve the minimum sum of absolute values problem (versus the traditional linear least squares) but also used to formulate optimization problems using Lagrange multipliers. You can also view the question of whether a point lies inside a convex polytope, in terms of slack variables, one for each facet of the polytope. I can think of at least one other example.
However, you are asking to solve the problem in a different way, by use of a transformation. For example, given a vector x that must form a monotonically increasing sequence, we can transform the problem so that the new variables satisfy y = diff(x) >= 0. And this is not always so easy.
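A quick sketch of that monotone transformation (objFun, p0 and n are placeholder names):
% Parametrize a monotonically increasing x by its first entry and the
% non-negative steps d = diff(x); the solver then sees only bound constraints.
xOf = @(p) p(1) + [0; cumsum(p(2:end))];           % p = [x(1); d]
lb  = [-inf; zeros(n-1, 1)];                       % d >= 0, x(1) free
ub  = inf(n, 1);
p   = fmincon(@(p) objFun(xOf(p)), p0, [], [], [], [], lb, ub);
x   = xOf(p);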


Accepted Answer

Bruno Luong on 3 Oct 2024
Edited: Bruno Luong on 3 Oct 2024
No, you cannot. It is well known that any set of generic linear constraints
Aeq * x = beq
A * x <= b
lo <= x <= up
is equivalent to the "standard form"
D * y = e
y >= 0
after some linear transformation of x into y.
In general, simple box constraints are NOT equivalent to the generic constraints (in the two forms expressed above).
This is not a proof, but geometrically box constraints describe a polytope with parallel faces, whereas the other two describe generic polytopes.
You cannot simply do a change of variables, for instance, when the number of inequality constraints is larger than the number of decision parameters.
In any case, solvers such as linprog, quadprog, and lsqlin do such transformations behind the scenes for us, and the user should not worry about doing that beforehand.
Here is a video explaining the standard form and how some of these constraints are converted using slack variables: https://www.youtube.com/watch?v=db979cMaQ0M
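For illustration, the conversion with split variables and slack variables can be sketched as follows (assuming the bounds lo <= x <= up have already been appended to the rows of A*x <= b):
% Standard form D*y = e, y >= 0, with y = [xp; xn; s]:
% x = xp - xn (xp, xn >= 0), and the slack s >= 0 turns A*x <= b into an equality.
[m, n] = size(A);
p = size(Aeq, 1);
D = [Aeq, -Aeq, zeros(p, m);
     A,   -A,   eye(m)];
e = [beq; b];
% A solution y of D*y = e, y >= 0 gives back the original variables as x = y(1:n) - y(n+1:2*n).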
14 Comments
Bruno Luong on 5 Oct 2024

I can't see how Weyl-Minkowski easily combines with other ideas.
The brute force of enumerating all combinations (vertices and extreme rays) rarely works when implemented on a computer. Its value is on the theoretical side.


More Answers (1)

Matt J on 3 Oct 2024
Edited: Matt J on 3 Oct 2024
Yes, the set A*y >= 0 is a cone and so can equivalently be expressed as y = c(1)*r(1) + ... + c(p)*r(p), where the r(j) are the so-called extreme rays of the cone and the c(j) are non-negative coefficients. So, in theory, you could re-express the problem with the c(j) as the unknowns, which require only the non-negativity constraints c(j) >= 0. However, finding the extreme rays r(j) given the matrix A can be computationally demanding, depending on the dimension of the problem.
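A tiny 2-D toy illustration of the reparametrization (the extreme rays are worked out by hand here):
% Cone {y : A*y >= 0} with A = [1 0; 1 1]; its extreme rays are r1 = [0;1] and
% r2 = [1;-1] (each makes one constraint active while satisfying the other).
A = [1 0; 1 1];
R = [0 1; 1 -1];        % columns are the extreme rays
c = rand(2, 1);         % any non-negative coefficients
y = R*c;                % y = c(1)*r1 + c(2)*r2
all(A*y >= 0)           % always true: y lies in the cone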
9 Comments
Matt J on 4 Oct 2024
Edited: Matt J on 4 Oct 2024
For a small example, you can look here:
There is also 3rd party code for higher dimensional cases. The following is one offering, though I've never used it myself:
Bruno Luong on 4 Oct 2024
In the theorem on page 23 of this document, the extreme directions (ray directions) can be seen as vertices of a polyhedron of almost the same form as the original one.
To find the vertices of such a polyhedron you would have to examine the combinations of 50 vectors chosen from a large set of 96 = 2*48 vectors. This brute force approach would be impossible to carry out.
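To put a number on it (a quick MATLAB check):
% Number of ways to choose 50 of the 96 candidate vectors: on the order of 1e27,
% far too many to enumerate (nchoosek also warns that the result is not exact at this size).
nchoosek(96, 50)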

