
Condition of optimality for convex functions

Hello,

I am confused about the optimality condition for finding a global minimum of convex functions. In lecture 2 on optimization, it is written that if the Hessian is SPD and the function is convex, we have a global optimum.
However, in lecture 3 on least squares, it is written that whenever the function is convex, the Hessian is SPD at every point.
Thus, I do not understand the statement in lecture 2. Can you explain it to me, please?


For a point to be a local minimum of a function that happens to be twice differentiable, the said point needs to be stationary, i.e., its gradient is zero; and to ensure that it is a minimum and not a maximum, the function needs to be "locally convex", i.e., its Hessian is PSD. When the function is convex, a local minimum is in fact a global minimum. However, this does not say that a convex function always has a minimum; it only says, I repeat, that when it has a local minimum, that minimum is global.
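To see both points concretely, here is a quick sketch (my own toy example, not from the lectures) using SymPy: x^4 is convex and its stationary point x = 0 is the global minimum even though the second derivative vanishes there (so the Hessian is only PSD, not SPD), while e^x is convex yet has no stationary point and no minimum at all.

```python
import sympy as sp

x = sp.symbols('x', real=True)

# f(x) = x**4 is convex; its only stationary point is the global minimum,
# yet f''(0) = 0: the Hessian there is PSD but not positive definite.
f = x**4
stationary = sp.solve(sp.diff(f, x), x)
print(stationary, sp.diff(f, x, 2).subs(x, stationary[0]))  # [0] 0

# g(x) = exp(x) is convex (even strictly) but has no stationary point,
# hence no minimum: convexity alone does not guarantee a minimum exists.
g = sp.exp(x)
print(sp.solve(sp.diff(g, x), x))  # [] -- no solution
```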

Also, if you know the function is convex, you only need to look for stationary points (those that do not lie on the boundary of the domain, to be more precise).
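As a minimal illustration (made-up data, assuming the least-squares setting of lecture 3): for f(x) = ||Ax - b||^2, stationarity gives the normal equations, and the Hessian 2 A^T A is PSD for any A, so any stationary point is a global minimum.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))  # hypothetical data
b = rng.standard_normal(20)

# f(x) = ||Ax - b||^2 has gradient 2 A^T (Ax - b), so stationarity
# is exactly the normal equations A^T A x = A^T b.
x_star = np.linalg.solve(A.T @ A, A.T @ b)

# The Hessian 2 A^T A is always PSD (and PD when A has full column
# rank), so this stationary point is a global minimum.
print(x_star, np.linalg.eigvalsh(A.T @ A).min() >= 0)
```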

"When the function is convex, a local minimum is in fact a global minimum.".
This does work only for STRICTLY convex functions right?

Top comment

No, it is true for all convex functions. When the function is strictly convex, the global minimum (when it exists) is unique.
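A toy illustration of the uniqueness part (my own example): f(x, y) = x^2 is convex but not strictly convex, and every point on the line x = 0 is a global minimizer, so the minimum exists but is not unique.

```python
import numpy as np

# f(x, y) = x**2 is convex but not strictly convex: its Hessian
# [[2, 0], [0, 0]] is PSD with a zero eigenvalue in the y-direction.
f = lambda x, y: x**2

# Every point (0, y) attains the minimum value 0: a whole line of minimizers.
ys = np.linspace(-5.0, 5.0, 11)
print({f(0.0, y) for y in ys})  # {0.0}
```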
