
Exam 2018 question

I am looking at problem ten and wonder why the point \(x = 1\) does not have a subgradient.

Hint: by the definition of subgradient in lecture02a_optimization_annotated.pdf, the subgradient inequality must hold \(\forall u\), not just for \(u\) within some radius of \(x\).

In other words, a subgradient is a global property, not a local one.
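For reference, a minimal sketch of the definition, assuming the lecture slides use the standard convention:

\[
g \text{ is a subgradient of } f \text{ at } x
\iff
f(u) \ge f(x) + g^\top (u - x) \quad \forall u \in \operatorname{dom} f .
\]

The inequality has to hold for every \(u\) in the domain, which is why checking only a neighbourhood of \(x\) is not enough.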

Dear Tianzong,
is it correct that at -1, 0, and 1 there exist no subgradients because we cannot find any line through these points that lies below the function everywhere it is defined?
Thank you!

I think that for every point in [1,2[ there is no subgradient, because you can find a line through that point which crosses the curve at three points.

At -1 and 0 you can observe a "concave shape".
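To illustrate the "concave shape" point with an assumed toy example (not the exam's function): \(f(x) = -|x|\) has a concave kink at \(x = 0\) and no subgradient there, since

\[
-|u| \ge g\,u \quad \forall u
\quad\Longrightarrow\quad
g \le -1 \ (\text{from } u > 0)
\quad\text{and}\quad
g \ge 1 \ (\text{from } u < 0),
\]

which no single \(g\) can satisfy.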

By the way, does anyone know what we call the equivalent of a subgradient for a maximization problem (a concave shape)? For example, if we are trying to maximize -|x|.
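A sketch of the mirrored definition for a concave \(f\) (this object is commonly called a supergradient, though the lecture notes may use a different name):

\[
f(u) \le f(x) + g^\top (u - x) \quad \forall u .
\]

For example, for \(f(x) = -|x|\) at \(x = 0\), any \(g \in [-1, 1]\) satisfies \(-|u| \le g\,u\) for all \(u\).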
