We recall that a critical point of a function of several variables is a point at which the gradient of the function is either the zero vector 0 or is undefined. Although every point at which a function takes a local extreme value is a critical point, the converse is not true, just as in the single variable case. In this notebook, we will be interested in identifying critical points of a function and classifying them. As in the single variable case, since the first partial derivatives vanish at every critical point, the classification depends on the values of the second partial derivatives. Let us illustrate what is involved by passing to a specific example, namely the function defined by the input cell below, which we have been considering throughout the past several notebooks.
f=((x^2-1)+(y^2-4)+(x^2-1)*(y^2-4))/(x^2+y^2+1)^2
To find the critical points of f, we must set both partial derivatives of f to 0 and solve for x and y. We begin by computing the first partial derivatives of f.
fx=diff(f,x)
fy=diff(f,y)
Now we set the partial derivatives equal to 0 and solve for x and y.
[xcr,ycr]=solve(fx,fy)
Incidentally, it is worth pointing out a trick here. By default, the outputs xcr and ycr of the solve command are column vectors. As we have just seen, if you type
[xcr,ycr]=solve(fx,fy)
then the vectors appear one after another. But a better alternative is:
[xcr,ycr]=solve(fx,fy); [xcr,ycr]
Now xcr and ycr are stacked together in a matrix, so we can see each x-value together with the corresponding y-value. MATLAB has found seven critical points, of which the last four are not real. We now compute the second derivatives of f.
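For readers who want to check such computations outside MATLAB, SymPy's solve behaves similarly: given a system and a list of variables, it returns the solutions as a list of (x, y) tuples, so each x-value is already paired with its y-value. A minimal sketch on a simple stand-in system (not the f above):

```python
import sympy as sp

x, y = sp.symbols('x y')
# A simple stand-in system playing the role of fx = 0, fy = 0.
fx = x**3 - x          # zero at x = -1, 0, 1
fy = 2*y               # zero at y = 0
crit = sp.solve([fx, fy], [x, y])
print(crit)            # three (x, y) pairs, each x with its matching y
```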
fxx=diff(fx,x)
fxy=diff(fx,y)
fyy=diff(fy,y)
Of course there is no need to compute f_{yx}, since for reasonable functions it always coincides with f_{xy}. Next we form the Hessian determinant (called the discriminant in your text) of the second partial derivatives, f_{xx}f_{yy}-f_{xy}^{2}, which is essential for the classification, and evaluate it at the critical points. To perform the classification efficiently, we create inline vectorized versions of the Hessian determinant and of the second partial derivative with respect to x. This enables us to evaluate them simultaneously at all the critical points.
hessdetf=fxx*fyy-fxy^2
hessfun=inline(vectorize(hessdetf))
fxxfun=inline(vectorize(fxx))
Now we can produce a table of the critical points. Since xcr and ycr are column vectors, hessfun(xcr,ycr) and fxxfun(xcr,ycr) will also be column vectors, and we can stack all these column vectors together to form a matrix, in other words, a table. Each row of the table gives the x- and y-coordinates of the critical point, followed by the discriminant and the value of f_{xx}.
[xcr,ycr,hessfun(xcr,ycr),fxxfun(xcr,ycr)]
We are only interested in the first three rows of the table, since the remaining four rows have an i in the formula for x, and so correspond to non-real critical points. Notice that all three of the real critical points are on the x-axis, and the first one is at the origin. Since the Hessian determinant is negative at the origin, we conclude that the critical point at the origin is a saddle point. The other two are local extrema and, since f_{xx} is positive, they are local minima.
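As an independent check of this classification, the same computation can be repeated in Python with SymPy (assumed available). The three real critical points work out to be the origin and (±1/√3, 0); the sketch below confirms that the gradient vanishes there and that the discriminant sorts them exactly as described above (the degenerate case, discriminant equal to 0, is omitted):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = ((x**2 - 1) + (y**2 - 4) + (x**2 - 1)*(y**2 - 4)) / (x**2 + y**2 + 1)**2

fx, fy = sp.diff(f, x), sp.diff(f, y)
fxx, fxy, fyy = sp.diff(fx, x), sp.diff(fx, y), sp.diff(fy, y)
hessdet = sp.simplify(fxx*fyy - fxy**2)   # the discriminant

# The three real critical points: the origin and (+-1/sqrt(3), 0).
pts = [(0, 0), (1/sp.sqrt(3), 0), (-1/sp.sqrt(3), 0)]
for px, py in pts:
    # Confirm the gradient really vanishes at each point.
    assert fx.subs({x: px, y: py}).simplify() == 0
    assert fy.subs({x: px, y: py}).simplify() == 0
    d = hessdet.subs({x: px, y: py}).simplify()
    s = fxx.subs({x: px, y: py}).simplify()
    kind = 'saddle' if d < 0 else ('local min' if s > 0 else 'local max')
    print((px, py), d, kind)
```

The origin gives a negative discriminant (a saddle), while the other two points give a positive discriminant with positive f_{xx} (local minima), in agreement with MATLAB's table.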
Let us now reconstruct the contour and gradient plot we obtained for f in the last notebook.
genplot(f,-3:.1:3,-3:.1:3,'contour',20)
hold on
genplot(f,-3:.2:3,-3:.2:3,'gradplot',.5)
hold off
From the plot it is evident that there are two symmetrically placed local minima on the x-axis with a saddle point between them. We recognize the minima from the fact that they are surrounded by closed contours with the gradient arrows pointing outward. The pinching in of contours near the origin indicates a saddle point. The contour through the origin crosses itself, forming a transition between contours enclosing the two minima separately and contours enclosing them together.
Find and classify all critical points of the function g, which you studied in the previous notebook, and compare your results with the contour and gradient plot of g that you have already made.
We now take another look at the classification of critical points, with a view to extending it to functions of three variables. Let us define the function f2 and compute its first and second derivatives using MATLAB's symbolic jacobian operator.
gradf2=jacobian(f2,[x,y])
hessmatf2=jacobian(gradf2,[x,y])
The second output is a symmetric matrix, known as the Hessian matrix, whose entries are the second partial derivatives of f. In order to find the critical points, we must again set the first partial derivatives equal to 0.
[xcr2,ycr2]=solve(gradf2(1),gradf2(2)); [xcr2,ycr2]
There are two critical points. Let us now evaluate the Hessian matrix at the critical points.
H1=subs(hessmatf2,[x,y], [xcr2(1),ycr2(1)])
H2=subs(hessmatf2,[x,y], [xcr2(2),ycr2(2)])
At this point, however, instead of computing the Hessian determinant, which is just the determinant of the Hessian matrix, we compute the eigenvalues of the matrix. The determinant of a matrix is the product of its eigenvalues, but the eigenvalues carry much more information than the determinant alone.
eig(H1)
eig(H2)
We can obtain numerical versions with the double command:
double(eig(H1)), double(eig(H2))
The significance of the eigenvalues of the Hessian matrix is that if both are positive at a critical point, the function has a local minimum there; if both are negative the function has a local maximum; if they have opposite signs, the function has a saddle point; and if at least one of them is 0, the critical point is degenerate. The Hessian determinant, being the product of the two eigenvalues, takes a negative value if and only if they have opposite signs. In the present case, we see that the critical point at the origin is a local maximum of f2, and the second critical point is a saddle point.
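The eigenvalue test is easy to reproduce in SymPy. Since f2's formula is not reproduced here, the sketch below uses a hypothetical stand-in function that happens to show the same pattern (a local maximum at the origin and a saddle at a second critical point); it is an illustration of the method, not the notebook's f2:

```python
import sympy as sp

x, y = sp.symbols('x y')
# Hypothetical stand-in for f2: local max at the origin, saddle at (2/3, 0).
g = -(x**2 + y**2) + x**3

grad = sp.Matrix([sp.diff(g, x), sp.diff(g, y)])
hess = grad.jacobian([x, y])          # the Hessian matrix

crit = sp.solve(list(grad), [x, y])   # the two critical points
for px, py in crit:
    ev = list(hess.subs({x: px, y: py}).eigenvals())
    if all(e > 0 for e in ev):
        kind = 'local min'
    elif all(e < 0 for e in ev):
        kind = 'local max'
    elif any(e == 0 for e in ev):
        kind = 'degenerate'
    else:
        kind = 'saddle'
    print((px, py), kind)
```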
The point of this reformulation is that the Hessian matrix and its eigenvalues are just as easily computed and interpreted in the three-dimensional case. In this case, there will be three eigenvalues, all of which are positive at a local minimum and negative at a local maximum. A critical point of a function of three variables is degenerate if at least one of the eigenvalues of the Hessian matrix is 0, and is a saddle point in the remaining case: at least one eigenvalue is positive, at least one is negative, and none is 0.
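The three-variable case can be sketched the same way. The function below is an illustrative example of our own choosing (not one from the notebook): its Hessian at the origin has eigenvalues of both signs, none zero, so the origin is a saddle point.

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
# Illustrative three-variable example: a saddle at the origin.
w = x**2 + y**2 - z**2 + x*y

grad = sp.Matrix([sp.diff(w, v) for v in (x, y, z)])
hess = grad.jacobian([x, y, z])

print(sp.solve(list(grad), [x, y, z]))   # only critical point: the origin
print(hess.eigenvals())                  # eigenvalues 1, 3, -2: a saddle
```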
Find and classify the critical points of the function _{}. You may have trouble substituting the symbolic values of the coordinates of the critical points into the Hessian matrix; if so, convert them to numerical values, using double.
For some functions (especially if they involve transcendental functions such as exp, log, sin, cos, etc.), the formulas for the partial derivatives may be too complicated to use solve to find the critical points. In this case, a graphical or numerical method may be necessary. Here is an example:
kx=diff(k,x)
ky=diff(k,y)
In this case solving analytically for the solutions of kx = 0, ky = 0 is going to be hopeless. Also, we can't solve numerically without knowing how many solutions there are and roughly where they are located. The way out is therefore to draw plots of the solutions to kx = 0 and to ky = 0. Then we see where these curves intersect and can use the graphical information to get starting values for the use of a numerical solver. In our particular example, we can try:
[x1,y1]=meshgrid(-8:.1:8,-8:.1:8);
kxfun=inline(vectorize(kx)); kyfun=inline(vectorize(ky));
contour(x1,y1,kxfun(x1,y1),[0,0],'r'), hold on
contour(x1,y1,kyfun(x1,y1),[0,0],'b'), hold off
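If graphics are unavailable, the same reconnaissance can be done purely numerically by scanning a grid for points where both partial derivatives are nearly zero. A sketch in Python/NumPy, using stand-in partials since the notebook's k is not reproduced here:

```python
import numpy as np

# Stand-in partials playing the role of kx and ky (not the notebook's k):
# these are the partial derivatives of sin(x) + sin(y) + sin(x + y),
# whose only critical point in [0,2]x[0,2] is (pi/3, pi/3).
kx = lambda x, y: np.cos(x) + np.cos(x + y)
ky = lambda x, y: np.cos(y) + np.cos(x + y)

xs = np.arange(0, 2, 0.01)
X, Y = np.meshgrid(xs, xs)
R = kx(X, Y)**2 + ky(X, Y)**2          # zero exactly at critical points
i, j = np.unravel_index(np.argmin(R), R.shape)
print(X[i, j], Y[i, j])                # close to (1.0472, 1.0472)
```

The grid point minimizing kx^2 + ky^2 then serves as a starting value for a numerical solver.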
Here the curve k_{x}=0 is shown in red and the curve k_{y}=0 in blue. Thus, for example, we have critical points near (6.2, 2.4) and near (6.2, 3.2). We can locate them more precisely using the M-file newton2d.m, which applies a two-dimensional version of Newton's method:
[xsol,ysol]=newton2d(kx,ky,6.2, 2.4); [xsol,ysol]
[xsol,ysol]=newton2d(kx,ky,6.2, 3.2); [xsol,ysol]
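For readers without the M-file, a two-dimensional Newton iteration can be sketched as follows. This is a hypothetical reimplementation of the idea, not the notebook's newton2d.m (whose source we do not have), applied to an illustrative system rather than the kx, ky above:

```python
import numpy as np
import sympy as sp

def newton2d(f1, f2, x0, y0, tol=1e-12, maxit=50):
    """Two-dimensional Newton's method for the system f1 = 0, f2 = 0.
    A sketch only -- NOT the notebook's newton2d.m."""
    x, y = sp.symbols('x y')
    F = sp.Matrix([f1, f2])
    J = F.jacobian([x, y])
    Ffun = sp.lambdify((x, y), F, 'numpy')
    Jfun = sp.lambdify((x, y), J, 'numpy')
    v = np.array([float(x0), float(y0)])
    for _ in range(maxit):
        Fv = np.asarray(Ffun(*v), dtype=float).ravel()
        if np.linalg.norm(Fv) < tol:
            break
        # One Newton step: solve J(v) dv = F(v), then v <- v - dv.
        v = v - np.linalg.solve(np.asarray(Jfun(*v), dtype=float), Fv)
    return v

x, y = sp.symbols('x y')
# Illustrative system (not the notebook's kx, ky): the partial
# derivatives of sin(x) + sin(y) + sin(x + y), starting near (1, 1).
xs, ys = newton2d(sp.cos(x) + sp.cos(x + y), sp.cos(y) + sp.cos(x + y), 1.0, 1.0)
print(xs, ys)   # converges to (pi/3, pi/3), about (1.0472, 1.0472)
```

Newton's method converges very quickly once the graphical (or grid-based) reconnaissance has supplied a starting point close to the root.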
1. Find and classify all critical points of the function _{}. Relate your results to a simultaneous contour and gradient plot of the function.
2. Find and classify all critical points of the function _{}. MATLAB will report many critical points, but only a few of them are real.
3. Find and classify all critical points of the function h(x, y) = y^{2} exp(x^{2}) - x - 3y. You will need the graphical/numerical method to find the critical points.
4. Find and classify all critical points of the function h(x, y) = y^{2} exp(x) - x - 3y. Use both the analytical and the graphical/numerical method to find the critical points, and compare the results.