Consider the LP relaxation; after adding slack variables, the optimal simplex tableau is:
cj → c1 = 2, c2 = 1, c3 = 0, c4 = 0, c5 = 0

| CB | XB | b    | x1 | x2 | s1   | s2 | s3   |
|----|----|------|----|----|------|----|------|
| 1  | x2 | 9/4  | 0  | 1  | 3/2  | 0  | −1/4 |
| 0  | s2 | 1/2  | 0  | 0  | −2   | 1  | 1/2  |
| 2  | x1 | 11/4 | 1  | 0  | −1/2 | 0  | 1/4  |
|    | σj |      | 0  | 0  | −1/2 | 0  | −1/4 |
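As a quick sanity check, the σj row can be recomputed directly from the tableau: each body column equals B⁻¹aj, so σj = cj − CB⊤(B⁻¹aj). A minimal Python sketch with exact fractions, using only the numbers shown above:

```python
from fractions import Fraction as F

# Data copied from the tableau above: objective coefficients, basis costs CB,
# and the tableau body (rows x2, s2, x1; columns x1, x2, s1, s2, s3).
c  = [F(2), F(1), F(0), F(0), F(0)]
CB = [F(1), F(0), F(2)]
T  = [
    [F(0), F(1), F(3, 2),  F(0), F(-1, 4)],   # x2 row
    [F(0), F(0), F(-2),    F(1), F(1, 2)],    # s2 row
    [F(1), F(0), F(-1, 2), F(0), F(1, 4)],    # x1 row
]

# sigma_j = c_j - CB . (B^-1 a_j); column j of T is exactly B^-1 a_j
sigma = [c[j] - sum(CB[i] * T[i][j] for i in range(3)) for j in range(5)]
print(sigma)   # -> 0, 0, -1/2, 0, -1/4, matching the sigma_j row above
```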
The optimal solution of the relaxation is x(0) = (11/4, 9/4). We choose the third row (the x1 row) to construct a cut: x1 − (1/2)s1 + (1/4)s3 = 11/4 ⇒ x1 − s1 − 2 = 3/4 − (1/2)s1 − (1/4)s3 ⇒ 3/4 − (1/2)s1 − (1/4)s3 ≤ 0 ⇒ −2s1 − s3 + s4 = −3.
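This is the Gomory fractional cut generated from the x1 row: taking fractional parts of the coefficients and of the right-hand side gives the valid inequality (1/2)s1 + (1/4)s3 ≥ 3/4. A minimal sketch that recovers it from the row data shown above:

```python
from fractions import Fraction as F
from math import floor

def gomory_cut(coeffs, rhs):
    """Fractional parts of a source row give the cut sum_j frac(a_j)*x_j >= frac(b)."""
    frac = lambda v: v - floor(v)
    return {var: frac(a) for var, a in coeffs.items()}, frac(rhs)

# x1 row of the optimal tableau: x1 - (1/2)s1 + (1/4)s3 = 11/4
lhs, rhs = gomory_cut({"s1": F(-1, 2), "s3": F(1, 4)}, F(11, 4))
print(lhs, ">=", rhs)   # (1/2)s1 + (1/4)s3 >= 3/4
# Multiplying by -4 and adding the slack s4 gives -2*s1 - s3 + s4 = -3, as above.
```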
Applying the dual simplex method, we obtain:
(1)

cj → c1 = 2, c2 = 1, c3 = 0, c4 = 0, c5 = 0, c6 = 0

| CB | XB | b    | x1 | x2 | s1   | s2 | s3   | s4 |
|----|----|------|----|----|------|----|------|----|
| 1  | x2 | 9/4  | 0  | 1  | 3/2  | 0  | −1/4 | 0  |
| 0  | s2 | 1/2  | 0  | 0  | −2   | 1  | 1/2  | 0  |
| 2  | x1 | 11/4 | 1  | 0  | −1/2 | 0  | 1/4  | 0  |
| 0  | s4 | −3   | 0  | 0  | [−2] | 0  | −1   | 1  |
|    | σj |      | 0  | 0  | −1/2 | 0  | −1/4 | 0  |

(2)

| CB | XB | b   | x1 | x2 | s1 | s2 | s3  | s4   |
|----|----|-----|----|----|----|----|-----|------|
| 1  | x2 | 0   | 0  | 1  | 0  | 0  | −1  | 3/4  |
| 0  | s2 | 7/2 | 0  | 0  | 0  | 1  | 3/2 | −1   |
| 2  | x1 | 7/2 | 1  | 0  | 0  | 0  | 1/2 | −1/4 |
| 0  | s1 | 3/2 | 0  | 0  | 1  | 0  | 1/2 | −1/2 |
|    | σj |     | 0  | 0  | 0  | 0  | 0   | −1/4 |
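The step from tableau (1) to tableau (2) is a single dual simplex pivot: the row with negative b (the s4 cut row) leaves, and the entering column is chosen by the ratio test min{ |σj / arj| : arj < 0 }, which selects the bracketed element −2. A minimal sketch of that pivot, for illustration only, using just the numbers in tableau (1):

```python
from fractions import Fraction as F

# Tableau (1): each row is [b | x1, x2, s1, s2, s3, s4]; row order: x2, s2, x1, s4 (cut row).
rows = [
    [F(9, 4),  F(0), F(1), F(3, 2),  F(0), F(-1, 4), F(0)],   # x2
    [F(1, 2),  F(0), F(0), F(-2),    F(1), F(1, 2),  F(0)],   # s2
    [F(11, 4), F(1), F(0), F(-1, 2), F(0), F(1, 4),  F(0)],   # x1
    [F(-3),    F(0), F(0), F(-2),    F(0), F(-1),    F(1)],   # s4 (b < 0, leaves)
]
sigma = [F(0), F(0), F(-1, 2), F(0), F(-1, 4), F(0)]          # sigma_j for x1..s4

r = 3                                            # leaving row: the one with negative b
# entering column: among a_rj < 0, minimise |sigma_j / a_rj| (first column on ties)
k = min((j for j in range(1, 7) if rows[r][j] < 0),
        key=lambda j: abs(sigma[j - 1] / rows[r][j]))
piv = rows[r][k]                                 # the bracketed pivot element -2 (column s1)
rows[r] = [v / piv for v in rows[r]]             # normalise the pivot row
for i in range(len(rows)):                       # eliminate column k from the other rows
    if i != r and rows[i][k] != 0:
        rows[i] = [a - rows[i][k] * b for a, b in zip(rows[i], rows[r])]
print([row[0] for row in rows])                  # new b column: 0, 7/2, 7/2, 3/2, as in (2)
```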
The optimal solution is x(1) = (7/2, 0). We again choose the third row (the x1 row) to construct a cut: x1 + (1/2)s3 − (1/4)s4 = 7/2 ⇒ x1 − s4 − 3 = 1/2 − (1/2)s3 − (3/4)s4 ⇒ 1/2 − (1/2)s3 − (3/4)s4 ≤ 0 ⇒ −2s3 − 3s4 + s5 = −2.
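The same fractional-part computation reproduces this second cut from the x1 row of tableau (2); a short self-contained check:

```python
from fractions import Fraction as F
from math import floor

frac = lambda v: v - floor(v)
# x1 row of tableau (2): x1 + (1/2)s3 - (1/4)s4 = 7/2
row, b = {"s3": F(1, 2), "s4": F(-1, 4)}, F(7, 2)
print({var: frac(a) for var, a in row.items()}, ">=", frac(b))
# (1/2)s3 + (3/4)s4 >= 1/2; scaling by -4 and adding s5 gives -2*s3 - 3*s4 + s5 = -2.
```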
Applying the dual simplex method again, we obtain:
(1)

cj → c1 = 2, c2 = 1, c3 = 0, c4 = 0, c5 = 0, c6 = 0, c7 = 0

| CB | XB | b   | x1 | x2 | s1 | s2 | s3   | s4   | s5 |
|----|----|-----|----|----|----|----|------|------|----|
| 1  | x2 | 0   | 0  | 1  | 0  | 0  | −1   | 3/4  | 0  |
| 0  | s2 | 7/2 | 0  | 0  | 0  | 1  | 3/2  | −1   | 0  |
| 2  | x1 | 7/2 | 1  | 0  | 0  | 0  | 1/2  | −1/4 | 0  |
| 0  | s1 | 3/2 | 0  | 0  | 1  | 0  | 1/2  | −1/2 | 0  |
| 0  | s5 | −2  | 0  | 0  | 0  | 0  | [−2] | −3   | 1  |
|    | σj |     | 0  | 0  | 0  | 0  | 0    | −1/4 | 0  |
(2)

| CB | XB | b | x1 | x2 | s1 | s2 | s3 | s4    | s5   |
|----|----|---|----|----|----|----|----|-------|------|
| 1  | x2 | 1 | 0  | 1  | 0  | 0  | 0  | 9/4   | −1/2 |
| 0  | s2 | 2 | 0  | 0  | 0  | 1  | 0  | −13/4 | 3/4  |
| 2  | x1 | 3 | 1  | 0  | 0  | 0  | 0  | −1    | 1/4  |
| 0  | s1 | 1 | 0  | 0  | 1  | 0  | 0  | −5/4  | 1/4  |
| 0  | s3 | 1 | 0  | 0  | 0  | 0  | 1  | 3/2   | −1/2 |
|    | σj |   | 0  | 0  | 0  | 0  | 0  | −1/4  | 0    |
The optimal integer solution is x(2) = (3, 1), with optimal objective value 2·3 + 1 = 7.
Problem 2
State and prove the first-order necessary condition, the second-order necessary condition, and the second-order sufficient condition for extreme points of multivariate functions.
First-order necessary condition: Assume that R is an open set of Euclidean space Eⁿ, f(x) is continuously differentiable on R, and x∗ ∈ R is a local minimizer of f(x). Then ∂f(x∗)/∂x1 = ∂f(x∗)/∂x2 = ⋯ = ∂f(x∗)/∂xn = 0, i.e., ∇f(x∗) = 0.

Proof: Suppose for contradiction that ∇f(x∗) ≠ 0. Define the vector p = −∇f(x∗) and note that p⊤∇f(x∗) = −∥∇f(x∗)∥² < 0. Because ∇f is continuous near x∗, there is a scalar T > 0 such that p⊤∇f(x∗ + tp) < 0 for all t ∈ [0, T].
For any t ∈ (0, T], Taylor's theorem gives f(x∗ + tp) = f(x∗) + t p⊤∇f(x∗ + τp) for some τ ∈ (0, t).
Therefore f(x∗ + tp) < f(x∗) for all t ∈ (0, T]. We have found a direction leading away from x∗ along which f decreases, so x∗ is not a local minimizer, a contradiction.

Second-order necessary condition: If x∗ is a local minimizer of f and ∇²f is continuous in an open neighborhood of x∗, then ∇f(x∗) = 0 and ∇²f(x∗) is positive semidefinite.

Proof: By the first-order necessary condition, ∇f(x∗) = 0. For contradiction, assume that ∇²f(x∗) is not positive semidefinite. Then we can choose a vector p such that p⊤∇²f(x∗)p < 0, and because ∇²f is continuous near x∗, there is a scalar T > 0 such that p⊤∇²f(x∗ + tp)p < 0 for all t ∈ [0, T].
By a Taylor series expansion around x∗, for all t ∈ (0, T] and some τ ∈ (0, t), f(x∗ + tp) = f(x∗) + t p⊤∇f(x∗) + (1/2)t² p⊤∇²f(x∗ + τp)p < f(x∗).
As in the first-order necessary condition, we have found a direction from x∗ along which f decreases, so again x∗ is not a local minimizer.

Second-order sufficient condition: Assume that R is an open set of Euclidean space Eⁿ and f(x) is twice continuously differentiable in R. If ∇f(x∗) = 0 and ∇²f(x∗) is positive definite, then x∗ ∈ R is a strict local minimizer of f(x).

Proof: Because the Hessian is continuous and positive definite at x∗, we can choose a radius r > 0 so that ∇²f(x) remains positive definite for all x in the open ball D = {z ∣ ∥z − x∗∥ < r}. Taking any nonzero vector p with ∥p∥ < r, we have x∗ + p ∈ D, and so f(x∗ + p) = f(x∗) + p⊤∇f(x∗) + (1/2)p⊤∇²f(z)p = f(x∗) + (1/2)p⊤∇²f(z)p, where z = x∗ + tp for some t ∈ (0, 1). Since z ∈ D, we have p⊤∇²f(z)p > 0, and therefore f(x∗ + p) > f(x∗), which gives the result.
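To make the three conditions concrete, they can be checked numerically on a small example; the function below, f(x) = (x1 − 1)² + 2x2² with minimizer x∗ = (1, 0), is an assumed illustration, not part of the problem:

```python
import numpy as np

# Assumed example: f(x) = (x1 - 1)^2 + 2*x2^2, with x* = (1, 0).
def grad(x):
    return np.array([2.0 * (x[0] - 1.0), 4.0 * x[1]])

def hess(x):
    return np.array([[2.0, 0.0], [0.0, 4.0]])

x_star = np.array([1.0, 0.0])
print(grad(x_star))                      # first-order condition: gradient is (0, 0)
print(np.linalg.eigvalsh(hess(x_star)))  # eigenvalues 2 and 4 are positive, so the
                                         # Hessian is positive definite and the
                                         # second-order sufficient condition certifies
                                         # a strict local minimizer at x*.
```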
Problem 3
Prove: Let f(x) be a differentiable convex function defined on the convex set Rc. If there exists a point x∗∈Rc such that, for all x∈Rc, the following condition holds:
∇f(x∗)⊤(x−x∗)≥0
then x∗ is the global minimum point of f(x) on Rc.
Using the first-order characterization of convex functions, for all x, y ∈ Rc we have f(y) ≥ f(x) + ∇f(x)⊤(y − x). Applying this with the base point x∗, for any x ∈ Rc we have f(x) ≥ f(x∗) + ∇f(x∗)⊤(x − x∗). Combining this with the hypothesis ∇f(x∗)⊤(x − x∗) ≥ 0 gives f(x) ≥ f(x∗) + ∇f(x∗)⊤(x − x∗) ≥ f(x∗), which means x∗ is a global minimum of f on Rc.
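As a numerical illustration of this result (an assumed example, not from the original problem), take the convex function f(x) = (x1 − 1)² + (x2 + 1)² on the convex set Rc = {x : x ≥ 0}; its minimizer over Rc is x∗ = (1, 0), and both the condition ∇f(x∗)⊤(x − x∗) ≥ 0 and global optimality can be spot-checked on random feasible points:

```python
import numpy as np

# Assumed example: f(x) = (x1 - 1)^2 + (x2 + 1)^2,  Rc = {x >= 0},  x* = (1, 0).
f      = lambda x: (x[..., 0] - 1.0) ** 2 + (x[..., 1] + 1.0) ** 2
grad_f = lambda x: np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] + 1.0)])
x_star = np.array([1.0, 0.0])

rng = np.random.default_rng(0)
xs  = rng.uniform(0.0, 5.0, size=(1000, 2))            # random points of Rc
print((grad_f(x_star) @ (xs - x_star).T >= 0).all())   # True: the hypothesis holds at x*
print((f(xs) >= f(x_star)).all())                      # True: x* is globally minimal on Rc
```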
Problem 4
Consider the optimization problem:
min f(x), x ∈ Rⁿ, subject to gj(x) ≥ 0, j = 1, …, l
This problem is called convex programming if f(x) is a convex function and all gj(x) are concave functions. Question: Prove that convex programming has the following properties:
1. The feasible solution set is convex.
Consider the feasible set F = {x ∈ Rⁿ ∣ gj(x) ≥ 0, j = 1, …, l}. Take any two points x1, x2 ∈ F, i.e., gj(x1) ≥ 0 and gj(x2) ≥ 0 for all j. For any θ ∈ [0, 1], consider the convex combination xθ = θx1 + (1 − θ)x2. Since each gj is concave, gj(xθ) = gj(θx1 + (1 − θ)x2) ≥ θgj(x1) + (1 − θ)gj(x2) ≥ 0 for all j. Therefore xθ ∈ F, so F is convex.
2. The set of optimal solutions is convex.
Consider the optimal solution set X∗ = {x ∈ F ∣ f(x) = f∗}, where f∗ = min over x ∈ F of f(x). Take any two optimal solutions x1, x2 ∈ X∗; then f(x1) = f(x2) = f∗. For any θ ∈ [0, 1], consider xθ = θx1 + (1 − θ)x2. Since f is convex, f(xθ) ≤ θf(x1) + (1 − θ)f(x2) = θf∗ + (1 − θ)f∗ = f∗. By Property 1, xθ ∈ F, so also f(xθ) ≥ f∗ and hence f(xθ) = f∗, i.e., xθ ∈ X∗. Therefore the set of optimal solutions X∗ is convex.
3. Any local optimum is also a global optimum.
Let x∗ ∈ F be a local minimum, i.e., there exists ϵ > 0 such that f(x∗) ≤ f(x) for all x ∈ F with ∥x − x∗∥ < ϵ. Take any x ∈ F. Since F is convex, for every θ ∈ (0, 1] the point x∗ + θ(x − x∗) lies in F, and for θ small enough it also lies in the ϵ-neighborhood of x∗, so f(x∗ + θ(x − x∗)) ≥ f(x∗). Dividing by θ and letting θ → 0⁺ gives the first-order condition ∇f(x∗)⊤(x − x∗) ≥ 0 for all x ∈ F. From the theorem proved earlier (the first-order condition implies a global minimum for a convex function on a convex set), x∗ is a global minimum. Therefore every local optimum is globally optimal.
4. If the objective function is strictly convex and a global optimum exists, then it is unique.
Suppose f is strictly convex and there exist two global optima x1 ≠ x2 with f(x1) = f(x2) = f∗. Consider xθ = θx1 + (1 − θ)x2 for θ ∈ (0, 1). By Property 1, xθ is feasible, and strict convexity of f gives f(xθ) < θf(x1) + (1 − θ)f(x2) = f∗. But f∗ is the global minimum value, so f(xθ) ≥ f∗, which is a contradiction. Hence the global minimum is unique.
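Property 1, for instance, can be spot-checked numerically on an assumed small convex program (not taken from the original problem): with the concave constraint g(x) = 1 − ∥x∥², the feasible set is the unit disk, and convex combinations of sampled feasible points must stay feasible:

```python
import numpy as np

# Assumed instance: concave constraint g(x) = 1 - ||x||^2 >= 0 (the unit disk).
g = lambda x: 1.0 - np.sum(x * x, axis=-1)

rng  = np.random.default_rng(0)
pts  = rng.uniform(-1.0, 1.0, size=(4000, 2))
feas = pts[g(pts) >= 0]                          # sampled feasible points

# Property 1: theta*x1 + (1 - theta)*x2 remains feasible for feasible x1, x2.
theta  = rng.uniform(0.0, 1.0, size=(len(feas) // 2, 1))
x1, x2 = feas[0::2][: len(theta)], feas[1::2][: len(theta)]
combos = theta * x1 + (1.0 - theta) * x2
print((g(combos) >= 0).all())                    # True on every sampled pair
```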