[SOLVED] CS chain Constrained NL Optimization (NLP)

$25

File Name: CS_chain_Constrained_NL_Optimization_(NLP).zip
File Size: 395.64 KB


Constrained NL Optimization (NLP)
Goals:
Optimality conditions for linear constraints
Lagrange multipliers and the Lagrangian
Optimality conditions for nonlinear constraints
Duality

General Statement
A minimization problem with constraints is often written as:

min f(x)
subject to g_i(x) = 0, i ∈ E,
           g_j(x) ≥ 0, j ∈ I,

where f and the g_i are C².

Essential Difference Between LP and NLP
For solvable LP problems, an optimal solution occurs at an extreme point of the feasible set.
For NLP problems, the optimal solution may not be at an extreme point or even on the boundary of the feasible set.

Examples

Example 1: max P = KL,
subject to 4K + L ≤ 8, K, L ≥ 0.

Example 2: max f(x) = sin x,
subject to 0 ≤ x ≤ π.

Case 1: Linear Equality Constraints

Consider the optimization problem:
min f(x)
subject to Ax = b, A ∈ ℝ^{m×n},
with rank(A) = m < n.
Idea: transform the constrained problem into an unconstrained optimization problem.

A Motivating Example
Solve the optimization problem with a linear equality constraint:
min f(x) = x1² − 2x1 + x2² − x3² − 4x3
subject to x1 + x2 − 2x3 = 2.

A Motivating Example (cont'd)
The optimization problem can be made unconstrained as follows:
min φ(x2, x3) = 2x2² + 3x3² − 4x2x3 − 2x2,
using x1 = 2 − x2 + 2x3.

Answer

A strict local minimizer of the unconstrained problem is:
x2 = 1.5, x3 = 1.
Thus, for the original problem,
x* = (2.5, 1.5, 1)ᵀ,
f(x*) = −1.5.
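As a quick numerical check, here is a minimal sketch in Python/NumPy, assuming the objective and constraint as reconstructed above; it eliminates x1 through the constraint and solves the linear system ∇φ = 0 for the reduced quadratic.

```python
import numpy as np

# Minimal check of the answer, assuming the reconstructed data:
# f(x) = x1^2 - 2x1 + x2^2 - x3^2 - 4x3, constraint x1 + x2 - 2x3 = 2.
def f(x):
    x1, x2, x3 = x
    return x1**2 - 2*x1 + x2**2 - x3**2 - 4*x3

# Eliminate x1 = 2 - x2 + 2*x3; the reduced objective is the quadratic
# phi(x2, x3) = 2*x2^2 + 3*x3^2 - 4*x2*x3 - 2*x2, so grad phi = 0 is linear.
H = np.array([[4.0, -4.0],
              [-4.0,  6.0]])   # Hessian of phi
c = np.array([2.0, 0.0])       # right-hand side of grad phi = 0
x2, x3 = np.linalg.solve(H, c) # -> (1.5, 1.0)
x1 = 2 - x2 + 2*x3             # -> 2.5

x_star = np.array([x1, x2, x3])
print(x_star, f(x_star))       # [2.5 1.5 1. ] -1.5
```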

An Exercise
Solve the following optimization problem:
max f(x) = x1 x2 x3
subject to x1/a1 + x2/a2 + x3/a3 = 1,
with a_i > 0, x_i ≥ 0, i = 1, 2, 3.

Observation
Indeed, any optimization problem with linear equality constraints Ax=b
can be transformed into an unconstrained problem!

Some Terms from Matrix Theory
To generalize, let's recall the following notions.

The null space of a matrix A ∈ ℝ^{m×n}, with m ≤ n, is:
N(A) = { p ∈ ℝⁿ : Ap = 0 }.

The range space of Aᵀ is defined as:
R(Aᵀ) = { q ∈ ℝⁿ : q = Aᵀλ, for some λ ∈ ℝᵐ }.

Moreover, N(A) ⊥ R(Aᵀ) and N(A) ⊕ R(Aᵀ) = ℝⁿ.
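These facts are easy to illustrate numerically. Below is a small sketch using SciPy's null_space; the particular matrix A is made up for the illustration and is not from the slides.

```python
import numpy as np
from scipy.linalg import null_space

# Illustration: every null-space vector is orthogonal to every vector A^T lam in R(A^T).
A = np.array([[1.0, 1.0, -2.0]])          # example 1x3 matrix (m = 1, n = 3)
Z = null_space(A)                         # orthonormal basis of N(A), shape (3, 2)
lam = np.array([5.0])                     # any multiplier vector in R^m
q = A.T @ lam                             # a vector in R(A^T)

print(np.allclose(A @ Z, 0))              # True: columns of Z lie in N(A)
print(np.allclose(Z.T @ q, 0))            # True: N(A) is orthogonal to R(A^T)
print(Z.shape[1] + A.shape[0] == A.shape[1])  # True: dim N(A) + dim R(A^T) = n
```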

Nullspace Matrix Z: AZ=0
A matrix Z ∈ ℝ^{n×r} is a null-space matrix of A if any vector in N(A) can be expressed as a linear combination of the columns of Z.

If r = n − m, then Z is called a basis matrix for the null space of A (in that case the columns of Z are linearly independent).

Representation of the Null Space
Since the null space can be represented as
N(A) = { p : p = Zv for some v ∈ ℝʳ },
it follows that
N(A) = R(Z).

Comment
If A ∈ ℝ^{m×n} is a matrix of full (row) rank m, then without loss of generality we can assume
A = [ B  N ],  with B ∈ ℝ^{m×m} nonsingular and N ∈ ℝ^{m×(n−m)}.
As can be checked directly, the matrix
Z = [ −B⁻¹N
       I_{n−m} ]  ∈ ℝ^{n×(n−m)}
is a basis matrix for N(A), since AZ = −BB⁻¹N + N = 0.
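The construction Z = [ −B⁻¹N ; I ] is easy to carry out numerically. A minimal sketch, assuming A has full row rank and its first m columns form a nonsingular B; the example matrix is made up for illustration.

```python
import numpy as np

# Sketch of the variable-elimination construction Z = [-B^{-1}N; I].
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 3.0]])      # m = 2, n = 4, full row rank
m, n = A.shape
B, N = A[:, :m], A[:, m:]                 # B is m x m, N is m x (n - m)

Z = np.vstack([-np.linalg.solve(B, N),    # -B^{-1} N  (top block)
               np.eye(n - m)])            # I_{n-m}    (bottom block)

print(np.allclose(A @ Z, 0))              # True: Z is a null-space matrix for A
print(np.linalg.matrix_rank(Z) == n - m)  # True: its columns are independent
```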

As a consequence, we have:

If x̄ is any point satisfying Ax̄ = b, then every point satisfying Ax = b can be written as
x = x̄ + Zv, for some vector v,
because x − x̄ ∈ N(A). Here x̄ is a particular (non-homogeneous) solution, and Zv ranges over the general homogeneous solutions.

Optimization with Linear Equality Constraints
min f(x) subject to Ax = b   ⟺   min_{v ∈ ℝʳ} φ(v) = f(x̄ + Zv).

The function φ is called the reduced function.

Comment
If Z is a basis matrix for the null space of A, then φ is a function of n − m variables, i.e., r = n − m. In other words, the transformed unconstrained problem involves fewer variables.

Motivating Example revisited

min f(x) = x1² − 2x1 + x2² − x3² − 4x3
subject to x1 + x2 − 2x3 = 2.

Take A = [ B  N ] = [ 1  1  −2 ], so that −B⁻¹N = [ −1  2 ],
Z = [ −1  2
       1  0
       0  1 ],
and x̄ = (2, 0, 0)ᵀ.

Motivating Example revisited
x = x̄ + Zv = (2, 0, 0)ᵀ + Z(v1, v2)ᵀ = (2 − v1 + 2v2, v1, v2)ᵀ,

min φ(v) = 2v1² + 3v2² − 4v1v2 − 2v1,

the same reduced problem as before.

Optimality Conditions
min f(x) subject to Ax = b   ⟺   min_{v ∈ ℝʳ} φ(v) = f(x̄ + Zv).

Using the chain rule,
∇φ(v) = Zᵀ∇f(x̄ + Zv) = Zᵀ∇f(x),
∇²φ(v) = Zᵀ∇²f(x̄ + Zv)Z = Zᵀ∇²f(x)Z.

Necessary Conditions
If x* is a local minimizer of f over { x : Ax = b } and Z is a null-space matrix for A, then
Zᵀ∇f(x*) = 0   (x* is a stationary point), and
Zᵀ∇²f(x*)Z is positive semidefinite.
That is, the reduced gradient is zero and the reduced Hessian matrix is positive semidefinite.

Comments

A point at which the reduced gradient is zero is called a stationary point.

The second-order condition implies:
pᵀ∇²f(x*)p ≥ 0, for all p ∈ N(A).
However, this condition does not require that ∇²f(x*) itself be positive semidefinite!

Sufficient Conditions

If x* satisfies:
Ax* = b,
Zᵀ∇f(x*) = 0, and
Zᵀ∇²f(x*)Z is positive definite,
where Z ∈ ℝ^{n×(n−m)} is a basis matrix for N(A), then x* is a strict local minimizer of f over { x : Ax = b }.

Motivating Example revisited
min f(x) = x1² − 2x1 + x2² − x3² − 4x3
subject to x1 + x2 − 2x3 = 2.

Take A = [ B  N ] = [ 1  1  −2 ] and
Z = [ −1  2
       1  0
       0  1 ].

Since ∇f(x) = (2x1 − 2, 2x2, −2x3 − 4)ᵀ,
Zᵀ∇f(x*) = (0, 0)ᵀ at the feasible point
x* = (2.5, 1.5, 1)ᵀ.

In addition,
Zᵀ∇²f(x*)Z = Zᵀ diag(2, 2, −2) Z = [ 4  −4 ; −4  6 ],  which is positive definite.

However, ∇²f(x*) itself is not positive definite!
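The reduced-gradient and reduced-Hessian computation above can be replayed numerically. A minimal sketch, assuming the objective, Z, and x* as reconstructed for this example:

```python
import numpy as np

# Verification sketch: f(x) = x1^2 - 2x1 + x2^2 - x3^2 - 4x3, constraint x1 + x2 - 2x3 = 2.
def grad_f(x):
    return np.array([2*x[0] - 2, 2*x[1], -2*x[2] - 4])

hess_f = np.diag([2.0, 2.0, -2.0])        # constant Hessian of f

Z = np.array([[-1.0, 2.0],                # null-space basis built from A = [B N]
              [ 1.0, 0.0],
              [ 0.0, 1.0]])
x_star = np.array([2.5, 1.5, 1.0])

print(Z.T @ grad_f(x_star))               # [0. 0.]  -> reduced gradient vanishes
red_hess = Z.T @ hess_f @ Z
print(red_hess)                           # [[ 4. -4.] [-4.  6.]]
print(np.all(np.linalg.eigvalsh(red_hess) > 0))  # True: reduced Hessian is PD
print(np.all(np.linalg.eigvalsh(hess_f) > 0))    # False: full Hessian is not PD
```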

Lagrange Multipliers
Fact: At a local minimizer x*,
∇f(x*) = Aᵀλ*,
where λ* ∈ ℝᵐ is called the vector of Lagrange multipliers.

Proof
Let Z be any n × r null-space matrix for A. Then there exist v* ∈ ℝʳ and λ* ∈ ℝᵐ such that
∇f(x*) = Zv* + Aᵀλ*,
since ℝⁿ = R(Z) ⊕ R(Aᵀ). Multiplying by Zᵀ and using ZᵀAᵀ = (AZ)ᵀ = 0,
0 = Zᵀ∇f(x*) = ZᵀZv*,
so v* = 0 and hence ∇f(x*) = Aᵀλ*.

A Geometric Demonstration
[Figure: isovalue contours of f and the constraint line aᵀx = b; at the minimizer x*, the gradient ∇f(x*) is parallel to the constraint normal a.]

Motivating Example revisited
min f(x) = x1² − 2x1 + x2² − x3² − 4x3
subject to x1 + x2 − 2x3 = 2.

Using the first-order necessary condition ∇f(x*) = Aᵀλ*:
2x1 − 2 = λ,
2x2 = λ,
−2x3 − 4 = −2λ,
together with x1 + x2 − 2x3 = 2 (the equality constraint),
⟹ λ* = 3, x* = (2.5, 1.5, 1)ᵀ.
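Because the objective is quadratic and the constraint is linear, these first-order conditions form a linear system in (x, λ). A minimal sketch, assuming the reconstructed problem data above:

```python
import numpy as np

# First-order conditions grad f(x) = A^T lambda, Ax = b as one linear system.
M = np.array([[2.0, 0.0,  0.0, -1.0],   # 2x1 - lambda    = 2
              [0.0, 2.0,  0.0, -1.0],   # 2x2 - lambda    = 0
              [0.0, 0.0, -2.0,  2.0],   # -2x3 + 2*lambda = 4
              [1.0, 1.0, -2.0,  0.0]])  # x1 + x2 - 2x3   = 2
rhs = np.array([2.0, 0.0, 4.0, 2.0])

sol = np.linalg.solve(M, rhs)
print(sol)    # [2.5 1.5 1.  3. ]  ->  x* = (2.5, 1.5, 1), lambda* = 3
```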


Lagrangian Function
For the problem: min f(x), subject to Ax = b, A ∈ ℝ^{m×n} with rank(A) = m, we can define the Lagrangian function:
L(x, λ) = f(x) − Σ_{i=1}^{m} λ_i (a_iᵀx − b_i) = f(x) − λᵀ(Ax − b),
where a_iᵀ denotes the i-th row of A.

First-Order Optimality Conditions

∇L(x*, λ*) = 0,
because
∇_x L(x, λ) = ∇f(x) − Aᵀλ,
∇_λ L(x, λ) = b − Ax.

A Perturbed Problem
Assume that f(x*) = min f(x), subject to Ax = b.
Then consider the perturbed problem:
f(x*(γ)) = min f(x), subject to Ax = b + γ.
Indeed, for a small perturbation γ,
f(x*(γ)) ≈ f(x*) + ∇f(x*)ᵀ(x*(γ) − x*)
         = f(x*) + λ*ᵀA(x*(γ) − x*)
         = f(x*) + λ*ᵀγ,
so the Lagrange multipliers measure the sensitivity of the optimal value to perturbations of the right-hand side b.
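This sensitivity interpretation can be checked on the motivating example. A minimal sketch, assuming the problem data and the multiplier λ* = 3 obtained earlier; here γ is a scalar perturbation of the single right-hand side b = 2:

```python
import numpy as np

# Perturb the right-hand side to b + gamma and compare the new optimal value
# with the linear estimate f(x*) + lambda* * gamma.
def f(x):
    return x[0]**2 - 2*x[0] + x[1]**2 - x[2]**2 - 4*x[2]

M = np.array([[2.0, 0.0,  0.0, -1.0],
              [0.0, 2.0,  0.0, -1.0],
              [0.0, 0.0, -2.0,  2.0],
              [1.0, 1.0, -2.0,  0.0]])   # first-order system, as before

f_star, lam_star = -1.5, 3.0
for gamma in (0.1, 0.01, 0.001):
    sol = np.linalg.solve(M, np.array([2.0, 0.0, 4.0, 2.0 + gamma]))
    print(gamma, f(sol[:3]), f_star + lam_star * gamma)
    # the two values agree up to O(gamma^2)
```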

Exercise
Consider
min f(x) = x1² + x1²x3² + 2x1x2 + x2⁴ + 8x2
subject to: 2x1 + 5x2 + x3 = 3.
(a) Which of the following points are stationary:
(0, 0, 2)ᵀ;  (0, 0, 3)ᵀ;  (1, 0, 1)ᵀ?
(b) Is each such point a local minimizer, a local maximizer, or a saddle point?
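For part (a), stationarity can be tested numerically: a feasible point is stationary exactly when ∇f(x) lies in R(Aᵀ). A minimal sketch, assuming the objective and constraint as reconstructed above:

```python
import numpy as np

# Check feasibility and whether grad f(x) = A^T lambda has a solution.
A = np.array([[2.0, 5.0, 1.0]])
b = 3.0

def grad_f(x):
    x1, x2, x3 = x
    return np.array([2*x1 + 2*x1*x3**2 + 2*x2,
                     2*x1 + 4*x2**3 + 8,
                     2*x1**2*x3])

for p in [(0, 0, 2), (0, 0, 3), (1, 0, 1)]:
    x = np.array(p, dtype=float)
    feasible = np.isclose(A @ x, b).all()
    lam, *_ = np.linalg.lstsq(A.T, grad_f(x), rcond=None)
    stationary = feasible and np.allclose(A.T @ lam, grad_f(x))
    print(p, "feasible:", feasible, "stationary:", stationary)
# only (1, 0, 1) is both feasible and stationary (with lambda = 2)
```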

Case 2: Linear Inequality Constraints
Consider the optimization problem:
min f (x)
subject to Ax ≥ b, A ∈ ℝ^{m×n}.

An Example
min f(x) = ½x1² + ½x2²
subject to x1 + 2x2 ≥ 2,
           x1 − x2 ≥ −1,
           x1 ≤ 3.

[Figure: the feasible region in the (x1, x2) plane and the minimizer x* on its boundary.]

The optimal solution x* reduces to a solution of the equality-constrained problem with the active constraints.

More specifically,

the minimizer x* of the original problem is the same as that of the simplified problem:
min ½x1² + ½x2²
subject to x1 + 2x2 = 2,
namely x* = (2/5, 4/5)ᵀ, with λ* = 2/5 > 0.

Note: for this example, x1 + 2x2 ≥ 2 is an active constraint at x*, while the other two constraints are inactive.
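A minimal sketch of this reduction, assuming the constraints as reconstructed above: solve the simplified equality-constrained problem through its first-order conditions, then confirm that the remaining constraints are inactive at the result.

```python
import numpy as np

# Simplified problem: min 0.5*||x||^2 subject to x1 + 2*x2 = 2.
# First-order conditions give x = lambda * a with a^T x = 2.
a = np.array([1.0, 2.0])
lam = 2.0 / (a @ a)            # from a^T (lambda * a) = 2
x = lam * a

print(x, lam)                  # [0.4 0.8] 0.4  ->  x* = (2/5, 4/5), lambda* = 2/5 > 0
print(x[0] - x[1] >= -1)       # True with slack: this constraint is inactive
print(x[0] <= 3)               # True with slack: this constraint is inactive
```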

Observations

min f(x) subject to Ax ≥ b, A ∈ ℝ^{m×n}
behaves, locally around x*, like
min f(x) subject to Âx = b̂ (the active constraints only).

For the inactive constraints, we associate zero Lagrange multipliers, so that
λ*_i (a_iᵀx* − b_i) = 0,  i = 1, …, m.

Remark
It may happen that none of the constraints of an NL program is active at the solution, i.e., the constrained NL program may behave exactly like an unconstrained one. This is a sharp difference from linear programming! For example:

min f(x) = (x1 − 1)² + (x2 − 2)²
subject to 4 − x1 − x2 ≥ 0,
           2 − x1 + x2 ≥ 0,
           x1 ≥ 0, x2 ≥ 0,

whose unconstrained minimizer (1, 2)ᵀ satisfies every constraint strictly, so no constraint is active at the solution.

Necessary Conditions
At a local minimizer x*, there exists λ* such that
∇f(x*) = Aᵀλ*   (or, equivalently, Zᵀ∇f(x*) = 0),
λ* ≥ 0,
λ*ᵀ(Ax* − b) = 0, and
Zᵀ∇²f(x*)Z is positive semidefinite,
where Z is a null-space matrix for the matrix Â of active constraints at x*.

Comment 1
∇f(x*) = Aᵀλ* = Âᵀ(λ*)_active,
because λ*_i = 0 for any inactive i-th constraint.

Comment 2 (On the sign of Lagrange multipliers for the inequality constraints)
By contradiction, suppose there is a step s that decreases f at x*:
0 > f(x* + s) − f(x*) ≈ ∇f(x*)ᵀs.
On the other hand, to stay feasible with respect to the active constraints at x*, i.e., Âx* − b̂ = 0, we want
0 ≤ Â(x* + s) − b̂ = (Âx* − b̂) + Âs = Âs.
Equivalently, we want to guarantee the non-existence of a step s with
∇f(x*)ᵀs < 0 and Âs ≥ 0,
which holds precisely when ∇f(x*) = Âᵀλ* with λ* ≥ 0.

[Figure: the descent directions of f (those s with ∇f(x*)ᵀs < 0) and the feasible directions of the active constraint (those s with Âs ≥ 0). Clearly, the shaded area where they overlap is empty only when ∇f(x*) and the constraint normal point in the same direction.]

An Example
Check the necessary conditions at x* = (1, 0)ᵀ for the constrained minimization problem:
min f(x) = (x1 − 1.5)² + (x2 − 0.5)⁴
subject to
1 − x1 − x2 ≥ 0,
1 − x1 + x2 ≥ 0,
1 + x1 − x2 ≥ 0,
1 + x1 + x2 ≥ 0.
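A minimal sketch of the check, assuming the objective and constraints as reconstructed above: identify the active constraints at x*, solve ∇f(x*) = Âᵀλ for the multipliers of the active constraints, and verify that they are nonnegative.

```python
import numpy as np

# Rewrite each constraint as a_i^T x >= b_i and check the first-order conditions at x*.
A = np.array([[-1.0, -1.0],    # 1 - x1 - x2 >= 0   ->   -x1 - x2 >= -1
              [-1.0,  1.0],    # 1 - x1 + x2 >= 0
              [ 1.0, -1.0],    # 1 + x1 - x2 >= 0
              [ 1.0,  1.0]])   # 1 + x1 + x2 >= 0
b = np.array([-1.0, -1.0, -1.0, -1.0])

x_star = np.array([1.0, 0.0])
grad_f = np.array([2*(x_star[0] - 1.5), 4*(x_star[1] - 0.5)**3])   # = (-1, -0.5)

active = np.isclose(A @ x_star, b)                   # the first two constraints are active
lam_active = np.linalg.solve(A[active].T, grad_f)    # solve A_active^T lambda = grad f(x*)

print(lam_active)                                    # [0.75 0.25]
print(np.all(lam_active >= 0))                       # True: necessary conditions hold at x*
```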
