## Conclusion of 3.1

How do you get from:

$$w = (H + \mu I)^{-1} H w^{*} = (Q\Lambda Q^{\top} + \mu I)^{-1} Q\Lambda Q^{\top} w^{*}$$

to

$$w = Q(\Lambda + \mu I)^{-1} \Lambda Q^{\top} w^{*}$$

?


\(H\) is a symmetric matrix, so its SVD coincides (up to a sign) with its eigenvalue decomposition with orthonormal eigenvectors:

$$H = Q\Lambda Q^{\top}$$

and due to the orthonormality of the eigenvectors: \(Q^{\top}Q = QQ^{\top} = I\)

Using these properties, together with the matrix-inversion identity \((AB)^{-1}=B^{-1}A^{-1}\), we can write:

$$\begin{aligned} w&=(H+\mu I)^{-1} H w^{*}\\ &=\left[Q\Lambda Q^{\top}+\mu Q Q^{\top}\right]^{-1} Q \Lambda Q^{\top} w^{*}\\ &=\left[Q(\Lambda+\mu I) Q^{\top}\right]^{-1} Q \Lambda Q^{\top} w^{*}\\ &=\left[(\Lambda+\mu I)Q^{\top}\right]^{-1}Q^{-1} Q \Lambda Q^{\top} w^{*}\\ &=\left(Q^{\top}\right)^{-1}(\Lambda +\mu I)^{-1} \Lambda Q^{\top} w^{*}\\ &=Q(\Lambda+\mu I)^{-1} \Lambda Q^{\top} w^{*} \end{aligned}$$
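As a quick numerical sanity check, here is a NumPy sketch comparing the two sides of the identity; `H`, `mu`, and `w_star` are arbitrary illustrative values, not from the task:

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary symmetric H and optimum w* (illustrative values only).
A = rng.standard_normal((4, 4))
H = (A + A.T) / 2
w_star = rng.standard_normal(4)
mu = 0.1

# Left-hand side: w = (H + mu*I)^{-1} H w*
lhs = np.linalg.solve(H + mu * np.eye(4), H @ w_star)

# Right-hand side: w = Q (Lambda + mu*I)^{-1} Lambda Q^T w*
lam, Q = np.linalg.eigh(H)  # eigendecomposition H = Q diag(lam) Q^T
rhs = Q @ (lam / (lam + mu) * (Q.T @ w_star))

assert np.allclose(lhs, rhs)
```

Note that the diagonal form also makes the shrinkage interpretation explicit: along each eigendirection, \(w\) is \(w^{*}\) scaled by \(\lambda_i/(\lambda_i+\mu)\).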


Hello,

I have a small follow-up question, just to be sure: when you say "up to a sign", do you mean that we can obtain a different decomposition by multiplying the eigenvalues and eigenvectors by -1?


No, it means that the corresponding columns of the SVD factor matrices and of the eigenvector matrix may differ from each other only by a sign, and that sign depends on the sign of the eigenvalue. The first answer to this question may help:

https://math.stackexchange.com/questions/22825/how-to-compute-the-svd-of-a-symmetric-matrix
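A small NumPy sketch of this relationship, using an arbitrary symmetric matrix with one negative eigenvalue (the matrix is an illustrative choice, not from the task):

```python
import numpy as np

# A symmetric matrix with one negative eigenvalue (illustrative choice).
H = np.array([[2.0, 1.0],
              [1.0, -1.0]])

lam, Q = np.linalg.eigh(H)   # eigendecomposition: H = Q diag(lam) Q^T
U, s, Vt = np.linalg.svd(H)  # SVD: H = U diag(s) V^T

# The singular values are the absolute values of the eigenvalues.
assert np.allclose(np.sort(s), np.sort(np.abs(lam)))

# Each column of V equals the matching column of U up to a sign,
# so U^T V is diagonal with entries +1 or -1 (the -1 appears for
# the negative eigenvalue).
D = U.T @ Vt.T
assert np.allclose(np.abs(D), np.eye(2))
```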


Hello! I had a question regarding the local expansion around the optimal parameter that is given in the task specification. I first thought it was a second-order Taylor expansion, but the formula does not match. Could you explain how you came up with the expansion:

$$\mathcal{L}(w) = \mathcal{L}(w^{*}) + \frac{1}{2} (w - w^{*})^{\top} H (w - w^{*})$$
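For what it's worth, this does look like a second-order Taylor expansion around \(w^{*}\); the formula matches once the first-order term is dropped, since the gradient vanishes at a minimizer:

$$\mathcal{L}(w) \approx \mathcal{L}(w^{*}) + \nabla\mathcal{L}(w^{*})^{\top}(w - w^{*}) + \frac{1}{2}(w - w^{*})^{\top} H (w - w^{*}), \qquad \nabla\mathcal{L}(w^{*}) = 0$$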
