Showing that a function is jointly convex
Hello,
Suppose we want to show that a function \(f(x, y)\) is jointly convex in \(x\) and \(y\). I don't think it would be practical to prove this directly from the definition, i.e. to show that \(\forall x, x^\prime, y, y^\prime\) and \(\forall \theta \in [0, 1]\) we have \(f(\theta x + (1 - \theta) x^\prime, \theta y + (1 - \theta) y^\prime) \leq \theta f(x, y) + (1-\theta) f(x^\prime, y^\prime)\). Is there a "similar" way to do so, or is computing the Hessian with respect to \(x\) and \(y\) the only way?
EDIT: I cannot remove my question, but I just noticed the answer in the lecture notes on optimization, which uses the gradient of the function.
Hi,
Yes, you can use the gradient of the function, or show that the Hessian is positive semidefinite (PSD).
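For reference, writing \(z = (x, y)\), the first-order (gradient) condition for joint convexity reads
$$ f(z^\prime) \geq f(z) + \nabla f(z)^\top (z^\prime - z) \quad \forall z, z^\prime, $$
where the gradient is taken with respect to \(x\) and \(y\) jointly.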
I just wanted to correct your first formula; the definition is
$$ f( \theta x +(1-\theta) x', \theta y +(1-\theta) y') \leq \theta f(x,y) + (1-\theta) f(x',y') $$
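As a small numerical sketch of the Hessian test (using an example function of my own choosing, \(f(x, y) = x^2 + xy + y^2\), not one from the course), you can check that the joint Hessian is PSD by verifying that all its eigenvalues are nonnegative:

```python
import numpy as np

# Example function (chosen for illustration): f(x, y) = x^2 + x*y + y^2.
# Its Hessian with respect to (x, y) jointly is constant:
#   H = [[2, 1],
#        [1, 2]]
H = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# f is jointly convex iff H is positive semidefinite,
# i.e. all eigenvalues are >= 0.
eigvals = np.linalg.eigvalsh(H)
print(eigvals)                 # [1. 3.]
print(bool(np.all(eigvals >= 0)))  # True
```

Note that checking convexity in \(x\) and \(y\) separately would only inspect the diagonal blocks of \(H\); joint convexity needs the full matrix to be PSD.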
Best,
Nicolas