Chapter 4: Partial Differentiation
Section 4.8: Unconstrained Optimization
Essentials

• Critical points for a function of several variables are points where the function or at least one of its first-partial derivatives is undefined, or where the first-partial derivatives vanish simultaneously.
• Points where the first-partial derivatives of f(x, y) vanish simultaneously are found as solutions of the equations f_x = 0, f_y = 0.
• Table 4.8.1 contains a statement of the Second-Derivative test for a critical point P that is a solution of f_x = 0, f_y = 0. The test is phrased in terms of the discriminant D = f_xx f_yy - f_xy^2, evaluated at P.

D > 0 and f_xx > 0 at P  ⇒  P is a local minimum
D > 0 and f_xx < 0 at P  ⇒  P is a local maximum
D < 0 at P  ⇒  P is a saddle point
D = 0 at P  ⇒  test fails and no conclusion can be drawn

Table 4.8.1 Second-Derivative test for f(x, y)
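As a concrete illustration of the two-variable Second-Derivative test, the sketch below applies it to the sample function f(x, y) = x^3 - 3x + y^2. This function and its stationary points are assumptions for illustration, not an example from this guide; the second partials are computed by hand and the test's cases are encoded directly.

```python
# Two-variable Second-Derivative test (Table 4.8.1), applied to the
# assumed sample function f(x, y) = x^3 - 3x + y^2.

def classify(fxx, fyy, fxy):
    """Classify a stationary point from its second partials."""
    D = fxx * fyy - fxy ** 2          # discriminant D = f_xx f_yy - f_xy^2
    if D > 0 and fxx > 0:
        return "local minimum"
    if D > 0 and fxx < 0:
        return "local maximum"
    if D < 0:
        return "saddle point"
    return "test fails"               # D == 0: no conclusion

# Stationary points of f: f_x = 3x^2 - 3 = 0 and f_y = 2y = 0
# give (1, 0) and (-1, 0).
for (x, y) in [(1, 0), (-1, 0)]:
    fxx, fyy, fxy = 6 * x, 2, 0       # second partials of the sample f
    print((x, y), classify(fxx, fyy, fxy))
```

Running the loop reports a local minimum at (1, 0) and a saddle point at (-1, 0), in agreement with the table.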
• A point P is a saddle point if every neighborhood of P contains points at which f > f(P) and points at which f < f(P). In other words, the tangent plane at P intersects the surface at P. Such a point is a stationary point, but not an extreme point.
• The Second-Derivative test stated in Table 4.8.1 is a special case of a more general test that extends to functions of more than two variables. This generalization, stated in Table 4.8.2, is based on the Routh-Hurwitz criterion for quadratic forms. (Some authors, including this one, see the test as based on Sylvester's Law of Inertia.)
H is the Hessian for f, evaluated at P
{d_1, …, d_n} is the sequence of leading principal minors of H, where d_k is the kth principal minor of H

Signs of d_1, …, d_n strictly alternate, beginning with d_1 < 0  ⇒  P is a local maximum
Signs of d_1, …, d_n are all positive  ⇒  P is a local minimum
Signs of d_1, …, d_n neither alternate nor are all positive  ⇒  P is stationary, but not extreme
At least one d_k = 0  ⇒  test fails and no conclusion can be drawn

Table 4.8.2 Generalized Second-Derivative test
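The recipe in Table 4.8.2 is mechanical enough to sketch in code: form the leading principal minors d_1, …, d_n of the Hessian and inspect their signs. The following pure-Python sketch assumes the Hessian has already been evaluated at the stationary point; the 3×3 matrix at the end is an illustrative assumption, not a Hessian from the text.

```python
# Generalized Second-Derivative test (Table 4.8.2): classify a
# stationary point from the leading principal minors of its Hessian.

def det(M):
    """Determinant by Laplace expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] *
               det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(n))

def classify(H):
    n = len(H)
    # d_k = determinant of the top-left k x k submatrix of H
    minors = [det([row[:k] for row in H[:k]]) for k in range(1, n + 1)]
    if any(d == 0 for d in minors):
        return "test fails"
    if all(d > 0 for d in minors):
        return "local minimum"            # H positive definite
    if all((-1) ** (k + 1) * d < 0 for k, d in enumerate(minors, 1)):
        return "local maximum"            # signs alternate -, +, -, ...
    return "stationary, not extreme"      # H indefinite

H = [[2, 0, 0], [0, 3, 0], [0, 0, 1]]     # assumed positive-definite example
print(classify(H))                        # local minimum
```

The recursive determinant is fine for the small Hessians that arise here; for larger problems one would use a linear-algebra library instead.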
• The principal minor d_k is the determinant of the k × k submatrix of H whose main diagonal coincides with the main diagonal of H, which starts with the (1,1)-element of H, and whose last row and column are the kth row and column of H.
• Table 4.8.3 provides some insight into the generalized Second-Derivative test stated in Table 4.8.2.
• In any form of the Second-Derivative test, the tested function is expanded in a Taylor series about the point P. Since the first-derivative terms vanish at a stationary point, the function near P is approximately f(P) + (1/2) X^T H X, where X = R - P, R is the position vector to the general point, and X^T is a row vector, the transpose of X.
• The values of f near P then depend on whether the quadratic form X^T H X that comprises the second-derivative terms is always positive, always negative, or sometimes positive and sometimes negative for R near P. If the symmetric matrix H is positive definite, the quadratic form is always positive near P. If H is negative definite, the form is always negative near P. If the form takes both positive and negative values near P, it is indefinite. Thus, the Routh-Hurwitz and Sylvester criteria essentially determine whether the quadratic form associated with the Hessian is positive definite, negative definite, or indefinite.
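This trichotomy can be seen directly by sampling the quadratic form Q(X) = X^T H X in a few directions. Both 2×2 matrices below are illustrative assumptions, not matrices from the text.

```python
# Sample Q(X) = X^T H X in several directions to exhibit definite
# vs. indefinite behavior.  Both matrices are assumed illustrations.

def Q(H, X):
    """Evaluate the quadratic form X^T H X for a symmetric H."""
    n = len(X)
    return sum(X[i] * H[i][j] * X[j] for i in range(n) for j in range(n))

pos_def = [[2, 0], [0, 3]]     # Q(X) > 0 for every X != 0: local minimum
indef   = [[1, 0], [0, -1]]    # Q changes sign: saddle behavior

dirs = [(1, 0), (0, 1), (1, 1), (1, -1)]
print([Q(pos_def, X) for X in dirs])   # [2, 3, 5, 5] -- all positive
print([Q(indef, X) for X in dirs])     # [1, -1, 0, 0] -- sign change
```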
• If the quadratic form is positive definite, then it adds to the value of f(P), so f(P) is a local minimum.
• If the quadratic form is negative definite, then it subtracts from the value of f(P), so f(P) is a local maximum.
• If the quadratic form is indefinite, then it both adds to and subtracts from the value of f(P), so P is a stationary, but not an extreme, point.
Table 4.8.3 Remarks on the theoretical framework for the Second-Derivative test
Examples
Example 4.8.1

Find and classify the critical (i.e., stationary) points for the given function.

Example 4.8.2

Find and classify the critical (i.e., stationary) points for the given function.

Example 4.8.3

Find and classify the critical (i.e., stationary) points for the given function.

Example 4.8.4

Find and classify the critical (i.e., stationary) points for the given function.

Example 4.8.5

Find and classify the critical (i.e., stationary) points for the given function.

Example 4.8.6

Find and classify the critical (i.e., stationary) points for the given function.
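The specific functions of Examples 4.8.1–4.8.6 did not survive this transcription, but the workflow they call for can be sketched numerically: locate a stationary point by Newton's method on the gradient, then classify it with Table 4.8.1. The stand-in function f(x, y) = x^4 + y^4 - 4xy below is an assumption for illustration.

```python
# Find-and-classify workflow for the assumed f(x, y) = x^4 + y^4 - 4xy:
# Newton's method drives the gradient to zero, Table 4.8.1 classifies.

def grad(x, y):
    """Gradient of f(x, y) = x^4 + y^4 - 4xy."""
    return (4 * x ** 3 - 4 * y, 4 * y ** 3 - 4 * x)

def hess(x, y):
    """Second partials (f_xx, f_yy, f_xy) of the same f."""
    return (12 * x ** 2, 12 * y ** 2, -4)

def newton(x, y, steps=50):
    """Newton iteration on grad f = 0 from the start point (x, y)."""
    for _ in range(steps):
        gx, gy = grad(x, y)
        fxx, fyy, fxy = hess(x, y)
        det = fxx * fyy - fxy ** 2
        # Newton step: solve H * delta = -grad by Cramer's rule
        x += (-gx * fyy + gy * fxy) / det
        y += (-gy * fxx + gx * fxy) / det
    return x, y

x, y = newton(0.8, 0.9)                  # converges to the point (1, 1)
fxx, fyy, fxy = hess(x, y)
D = fxx * fyy - fxy ** 2
print((round(x, 6), round(y, 6)),
      "local minimum" if D > 0 and fxx > 0 else "other")
```

By hand, the stationary points of this f are (0, 0) and (±1, ±1); the run above converges to (1, 1), where D = 128 > 0 and f_xx = 12 > 0, confirming a local minimum.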
Example 4.8.7

One line passes through a given point with a given direction; a second line passes through another point with another direction. Show that the lines are skew, and find the minimum distance between them. Hint: Parametrize each line with a different parameter and minimize the square of the distance between an arbitrary point on each line.
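The hint can be carried out directly. The original example's points and directions are not recoverable from this transcription, so the data P1, u, P2, v below are assumed placeholders; the method is the point.

```python
import math

# Assumed data (the original example's points and directions are lost):
P1, u = (0, 0, 0), (1, 2, 0)      # line 1: P1 + t*u
P2, v = (1, 0, 3), (0, 1, 1)      # line 2: P2 + s*v
w = tuple(b - a for a, b in zip(P1, P2))   # w = P2 - P1

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Minimize |P1 + t*u - (P2 + s*v)|^2: setting the t- and s-partials
# to zero gives the 2x2 linear system
#   (u.u) t - (u.v) s = u.w
#   (u.v) t - (v.v) s = v.w
A, B, C = dot(u, u), dot(u, v), dot(v, v)
det = -A * C + B * B
t = (-C * dot(u, w) + B * dot(v, w)) / det
s = (-B * dot(u, w) + A * dot(v, w)) / det

# Distance between the closest points; a nonzero distance together
# with non-parallel directions confirms the lines are skew.
d = tuple(t * ui - s * vi - wi for ui, vi, wi in zip(u, v, w))
dist = math.sqrt(dot(d, d))
print(round(dist, 4))             # 5/sqrt(6) ≈ 2.0412
```

The result agrees with the cross-product formula |w · (u × v)| / |u × v| = 5/√6, a useful independent check.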
Example 4.8.8

Find the minimum distance between a given point and a given plane.
Example 4.8.9

If both U and V are bound to the origin, project U onto V. Hint: Find the minimum distance from the tip of U to the line along V.
Example 4.8.10

Choose a and b so that the line y = a x + b minimizes the sum of squares of the deviations from the given data points to the line. Such a line is called the "least-squares" line.
Example 4.8.11

Obtain formulas for a and b so that the line y = a x + b minimizes the sum of squares of the deviations from the points (x_k, y_k), k = 1, …, n, to the line.
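A sketch of the least-squares line of Examples 4.8.10–4.8.11: setting both partials of S(a, b) = Σ (a x_k + b − y_k)² to zero yields the two "normal equations," which this snippet solves directly. The data points are assumed for illustration.

```python
# Least-squares line y = a*x + b via the normal equations obtained by
# setting dS/da = dS/db = 0.  Data points are assumed illustrations.

pts = [(0, 1.1), (1, 2.9), (2, 5.2), (3, 6.8)]

n = len(pts)
Sx  = sum(x for x, _ in pts)
Sy  = sum(y for _, y in pts)
Sxx = sum(x * x for x, _ in pts)
Sxy = sum(x * y for x, y in pts)

# Normal equations:  a*Sxx + b*Sx = Sxy,   a*Sx + b*n = Sy
det = Sxx * n - Sx * Sx
a = (Sxy * n - Sx * Sy) / det
b = (Sxx * Sy - Sx * Sxy) / det
print(round(a, 4), round(b, 4))    # slope ≈ 1.94, intercept ≈ 1.09
```

The closed-form expressions for a and b here are exactly the formulas Example 4.8.11 asks for, specialized to numeric data.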
Example 4.8.12

By minimizing the sum of squares of deviations between y = a x^2 + b x + c and the given data points, obtain the best least-squares quadratic fit to the data.
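For Example 4.8.12, the same idea gives a 3×3 system of normal equations, one per unknown. This sketch solves it with a tiny Gaussian elimination; the data points are assumed and lie exactly on y = x² − 2x + 3, so the fit should recover a = 1, b = −2, c = 3.

```python
# Least-squares quadratic y = a*x^2 + b*x + c via its 3x3 normal
# equations.  Assumed data, chosen to lie on y = x^2 - 2x + 3.

pts = [(0, 3), (1, 2), (2, 3), (3, 6)]

def solve(M, r):
    """Gaussian elimination without pivoting (fine for this system)."""
    n = len(M)
    A = [row[:] + [ri] for row, ri in zip(M, r)]
    for i in range(n):
        A[i] = [v / A[i][i] for v in A[i]]
        for j in range(n):
            if j != i:
                A[j] = [vj - A[j][i] * vi for vj, vi in zip(A[j], A[i])]
    return [row[-1] for row in A]

def S(p):
    return sum(x ** p for x, _ in pts)

def T(p):
    return sum((x ** p) * y for x, y in pts)

# Setting the partials of sum (a x^2 + b x + c - y)^2 to zero gives:
M = [[S(4), S(3), S(2)],
     [S(3), S(2), S(1)],
     [S(2), S(1), len(pts)]]
a, b, c = solve(M, [T(2), T(1), T(0)])
print(round(a, 6), round(b, 6), round(c, 6))   # 1.0 -2.0 3.0
```

Elimination without pivoting is safe here because the normal-equation matrix is symmetric positive definite; for larger or ill-conditioned fits a library solver would be preferable.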
© Maplesoft, a division of Waterloo Maple Inc., 2024. All rights reserved. This product is protected by copyright and distributed under licenses restricting its use, copying, distribution, and decompilation.
For more information on Maplesoft products and services, visit www.maplesoft.com