
Notation question

Hello,

There is an element of notation in Prof. Flammarion's lecture that I don't understand. When he writes the expected value over \(S_{train}\), it means the expected value over (the distribution of) all possible training sets; but when he writes the expected value over \(x \sim \mathcal{D}\) and \(S_{train}\), is \(S_{train}\) fixed this time? This question is about the bound on the expected loss when \(\eta\) is Lipschitz.

Hello,

Thank you very much for the question and my apologies if it was not clear.

I think you are referring, e.g., to an expectation such as:

$$\mathbb{E}_{S_{train},\, x\sim \mathcal{D}} \big[\, \| x - nbh_{S_{train}}(x) \| \,\big]$$

Here we are taking the expectation of a random variable that depends on two different sources of randomness: (a) the point \(x\) is random and is distributed according to \(\mathcal{D}\), and (b) the function \(nbh_{S_{train}}\) is also random and depends on the random training set \(S_{train}\). So you are really taking the expectation with respect to both.

If it helps, you can see this "double" expectation through the tower rule. It is exactly equivalent to either of the following.
You first fix \(S_{train}\) and take the expectation with respect to \(x\), and then with respect to \(S_{train}\):

$$\mathbb{E}_{S_{train}} \big[\, \mathbb{E}_{x\sim \mathcal{D}} \big[\, \| x - nbh_{S_{train}}(x) \| \;\big|\; S_{train} \big] \big]$$

Or you first fix \(x\) and take the expectation with respect to \(S_{train}\), and then with respect to \(x\):

$$\mathbb{E}_{x\sim \mathcal{D}} \big[\, \mathbb{E}_{S_{train}} \big[\, \| x - nbh_{S_{train}}(x) \| \;\big|\; x \big] \big]$$
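If a numerical check helps, here is a minimal Monte Carlo sketch (my own illustration, not from the lecture): I take \(nbh_{S_{train}}\) to be the nearest neighbour in a small 1-D Gaussian training set, and estimate both orders of conditioning. The distribution, dimensions, and sample sizes are arbitrary assumptions for the demo; the two estimates should agree up to Monte Carlo error.

```python
import numpy as np

rng = np.random.default_rng(0)
n_train = 5      # size of each training set S_train (assumed for the demo)
n_outer = 1000   # Monte Carlo samples for the outer expectation
n_inner = 200    # Monte Carlo samples for the inner expectation

def nn_dist(S_train, x):
    # ||x - nbh_{S_train}(x)|| with nbh = nearest neighbour (1-D for simplicity)
    return np.min(np.abs(S_train - x))

# Order 1: fix S_train, average over x ~ D (inner), then over S_train (outer).
est1 = np.mean([
    np.mean([nn_dist(S, x) for x in rng.normal(size=n_inner)])
    for S in rng.normal(size=(n_outer, n_train))
])

# Order 2: fix x, average over S_train (inner), then over x ~ D (outer).
est2 = np.mean([
    np.mean([nn_dist(S, x) for S in rng.normal(size=(n_inner, n_train))])
    for x in rng.normal(size=n_outer)
])

print(est1, est2)  # both estimate the same "double" expectation
```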

Best,
Nicolas

Hello,

Thanks a lot for your reply; I think I understand now :)

