Minimization problem with constraint

Hello, the problem is as follows: minimize R subject to (x - a_i)^2 + (y - b_i)^2 ≤ R^2 for all i.
I am looking to find x, y, and R, knowing that:
  • a_i and b_i are known values (two 100×1 vectors).
  • x and y lie between the min and max of a_i and b_i.
Is it possible to find a solution with the optimization tools of MATLAB? If not, any suggestions for a solution?

Accepted Answer

Alan Weiss
Alan Weiss on 13 Sep 2018


I haven't tried this, but it sounds straightforward.
Decision variables: x(1) = x, x(2) = y, x(3) = R.
100 nonlinear inequality constraints: (x(1) - a(i))^2 + (x(2) - b(i))^2 - x(3)^2 <= 0.
Objective function: R = x(3).
Bounds: with ll = min(min([a,b])) and mm = max(max([a,b])), set lb = [ll,ll,0] and ub = [mm,mm,Inf]. (The lower bound of 0 on x(3) keeps R nonnegative.)
Call fmincon from a reasonable start point, such as [(ll+mm)/2,(ll+mm)/2,abs(mm)]. A sketch of this recipe follows below.
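A minimal sketch of the setup above, assuming a and b are the known 100×1 coordinate vectors (the random values here are placeholders only):

    % Sketch of the fmincon setup described above.
    % a and b are assumed to be the known 100x1 coordinate vectors;
    % the random values below are placeholders -- substitute your data.
    a = randn(100,1);
    b = randn(100,1);
    ll = min([a; b]);                  % common lower bound for x and y
    mm = max([a; b]);                  % common upper bound for x and y
    fun = @(v) v(3);                   % objective: minimize R = v(3)
    % All 100 inequality constraints at once (c <= 0), no equalities.
    nonlcon = @(v) deal((v(1) - a).^2 + (v(2) - b).^2 - v(3)^2, []);
    v0 = [(ll+mm)/2, (ll+mm)/2, abs(mm)];   % start point suggested above
    lb = [ll, ll, 0];                  % R >= 0 via the third lower bound
    ub = [mm, mm, Inf];
    v = fmincon(fun, v0, [], [], [], [], lb, ub, nonlcon);
    x = v(1); y = v(2); R = v(3);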
Alan Weiss
MATLAB mathematical toolbox documentation

More Answers (2)

Bruno Luong
Bruno Luong on 13 Sep 2018
Edited: Bruno Luong on 13 Sep 2018
Here is a solution:
R = -Inf
x = anything between min(a_i, b_i) and max(a_i, b_i)
y = anything between min(a_i, b_i) and max(a_i, b_i)
(Since the constraint only involves R^2, nothing in the problem as literally stated prevents R from going to -Inf; you need the extra condition R ≥ 0 for the problem to be well posed.)
Matt J
Matt J on 13 Sep 2018
Edited: Matt J on 13 Sep 2018


You could probably use minboundcircle in this FEX (File Exchange) distribution.
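If that submission is installed, usage would be roughly as follows (a hypothetical sketch; the exact signature and outputs should be verified against the FEX page):

    % Hypothetical usage of minboundcircle from the File Exchange
    % submission -- check its documentation for the actual interface.
    % a and b are the 100x1 coordinate vectors from the question.
    [center, radius] = minboundcircle(a, b);
    x = center(1); y = center(2); R = radius;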
