Optimization Problem Analysis

The document presents a problem set focused on optimization, requiring the determination of local minimizers under various constraints and conditions. It includes specific cases involving functions, gradients, Hessians, and necessary conditions for minimization. Additionally, it addresses the minimization of the sum of squared differences for a set of real numbers.


Problem Set 1

1. Consider the problem

minimize f(x)
subject to x ∈ Ω,

where f ∈ C². For each of the following specifications of Ω, x*, and f, determine whether the given point x* is: (i) definitely a local minimizer; (ii) definitely not a local minimizer; or (iii) possibly a local minimizer. Fully justify your answer.

A. f : ℝ² → ℝ, Ω = {x = [x1,x2]T : x1 ≥ 1}, x* = [1,2]T, and gradient 𝛻f(x*) = [1,1]T.
B. f : ℝ² → ℝ, Ω = {x = [x1,x2]T : x1 ≥ 1, x2 ≥ 2}, x* = [1,2]T, and gradient 𝛻f(x*) = [1,0]T.
C. f : ℝ² → ℝ, Ω = {x = [x1,x2]T : x1 ≥ 0, x2 ≥ 0}, x* = [1,2]T, gradient 𝛻f(x*) = [0,0]T, and Hessian F(x*) = 𝐼 (the identity matrix).
D. f : ℝ² → ℝ, Ω = {x = [x1,x2]T : x1 ≥ 1, x2 ≥ 2}, x* = [1,2]T, gradient 𝛻f(x*) = [1,0]T, and Hessian F(x*) = diag(1, −1).
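A helper like the following can serve as a numeric sanity check for cases such as those above. It tests the first-order necessary condition d^T 𝛻f(x*) ≥ 0 over sampled feasible directions; a single violating direction rules the point out as a local minimizer. The gradient [2, −1] and the active constraint x1 ≥ 1 below are a hypothetical example, not one of cases A–D (numpy assumed available):

```python
import numpy as np

def fonc_counterexample(grad, dirs):
    """Return the first sampled feasible direction d with d . grad < 0.

    The first-order necessary condition at a constrained candidate x*
    requires d^T grad f(x*) >= 0 for every feasible direction d, so any
    direction returned here certifies that x* is not a local minimizer.
    """
    for d in dirs:
        if np.dot(d, grad) < -1e-12:
            return d
    return None

# Hypothetical data: gradient [2, -1] at a point where only x1 >= 1 is
# active, so feasible directions satisfy d1 >= 0.
dirs = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([0.0, -1.0])]
bad = fonc_counterexample(np.array([2.0, -1.0]), dirs)
print(bad)  # prints a violating feasible direction, or None
```

Note that this only samples a few directions; a full justification still requires checking the condition over all feasible directions analytically.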
2. Show that if x* is a global minimizer of f over Ω and x* ∈ Ω’ ⊂ Ω, then x* is a global minimizer of f over Ω’.

3. Suppose that x* is a local minimizer of f over Ω, and Ω’ ⊂ Ω. Show that if x* is an interior point of
Ω, then x* is a local minimizer of f over Ω’. Show that the same conclusion cannot be made if x* is
not an interior point of Ω.

4. Consider the problem

minimize f(x)
subject to x ∈ Ω,

where f : ℝ² → ℝ is given by f(x) = 5x2 with x = [x1,x2]T, and Ω = {x = [x1,x2]T : x1² + x2 ≥ 1}. Answer each of the following questions, showing complete justification.
a. Does the point x* = [0,1]T satisfy the first-order necessary condition?
b. Does the point x* = [0,1]T satisfy the second-order necessary condition?
c. Is the point x* = [0,1]T a local minimizer?
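Before proving anything for question (c), one can probe it empirically by sampling feasible points in a small box around x* and comparing objective values. This is a sketch only: it suggests an answer but is not a justification, which must come from the optimality conditions (numpy assumed available):

```python
import numpy as np

rng = np.random.default_rng(0)
x_star = np.array([0.0, 1.0])

# Sample points within distance ~1e-2 of x* = [0, 1] componentwise,
# then keep only those satisfying the constraint x1^2 + x2 >= 1.
samples = x_star + 1e-2 * rng.uniform(-1.0, 1.0, size=(10000, 2))
feasible = samples[samples[:, 0] ** 2 + samples[:, 1] >= 1.0]

f_vals = 5.0 * feasible[:, 1]  # f(x) = 5 x2 on the feasible samples
print(f_vals.min())            # compare against f(x*) = 5
```

If the sampled minimum dips below f(x*), that points to feasible descent arbitrarily close to x*; whether it does is exactly what parts (a)–(c) ask you to settle rigorously.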

5. Consider the problem

minimize f(x)
subject to x ∈ Ω,

where x = [x1,x2]T, f : ℝ² → ℝ is given by f(x) = 4x1² − x2², and Ω = {x : x1² + 2x1 − x2 ≥ 0, x1 ≥ 0, x2 ≥ 0}. Answer each of the following questions, showing complete justification.


a. Does the point x* = 0 = [0,0]T satisfy the first-order necessary condition?
b. Does the point x* = 0 satisfy the second-order necessary condition?
c. Is the point x* = 0 a local minimizer?
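For parts (a) and (b), the gradient and Hessian of f can be double-checked symbolically. A minimal sketch using sympy (assumed available; the hand computation is what the solution should show):

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
f = 4 * x1**2 - x2**2

grad = [sp.diff(f, v) for v in (x1, x2)]   # gradient of f
hess = sp.hessian(f, (x1, x2))             # Hessian F(x)

# Evaluate both at the candidate point x* = [0, 0].
at0 = {x1: 0, x2: 0}
print([g.subs(at0) for g in grad])  # 𝛻f(x*)
print(hess.subs(at0))               # F(x*)
```

The remaining work, identifying the feasible directions at 0 (all three constraints are active there) and testing the conditions along them, is the substance of the exercise.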

6. Suppose that we are given n real numbers, x1, x2, …, xn. Find the number 𝑥̅ ∈ ℝ such that the sum of the squared differences between 𝑥̅ and the given numbers is minimized (assuming the minimizer 𝑥̅ exists).
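A numerical check for problem 6 on a small, purely illustrative data set (the exercise asks for the general closed-form answer; scipy assumed available):

```python
from scipy.optimize import minimize_scalar

# Hypothetical sample data; any finite list of reals works here.
data = [2.0, 4.0, 7.0, 11.0]

def g(xbar):
    # Objective: sum of squared differences between xbar and the data.
    return sum((xbar - xi) ** 2 for xi in data)

res = minimize_scalar(g)
print(res.x)  # numerical minimizer for this particular data set
```

Comparing the printed value against your derived formula for 𝑥̅ is a quick way to catch algebra mistakes.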
