
Exercise 5.12: Gaussian Contraction Inequality


Let \(X_{\phi(\theta)} = \langle \phi(\theta), w \rangle\), where \(w \sim \gauss(0, I_d)\) is a shared standard normal random vector, so that \(\mathcal{G}(\phi(\mathbb{T})) = \E [\sup_{\theta} X_{\phi(\theta)}]\). Since \(w\) has zero mean, \(\E [X_{\phi(\theta)}] = 0\) for every \(\theta\), and the associated squared metric \(\E [(X_{\phi(\theta)} - X_{\phi(\theta')})^2] = \| \phi(\theta) - \phi(\theta') \|_2^2\) is upper bounded by \(\| \theta - \theta' \|_2^2\) because \(\phi\) is 1-Lipschitz. Hence, for any finite \(G \subseteq \mathbb{T}\), the Sudakov-Fernique inequality (Theorem 5.27) gives \begin{align} \E \bigl[\sup_{\theta \in G} \langle \phi(\theta) , w \rangle \bigr] \leq \E \bigl[\sup_{\theta \in G} \langle \theta , w \rangle \bigr] \, . \end{align} To extend this to all of \(\mathbb{T}\), take an increasing sequence of finite sets \(G_1 \subseteq G_2 \subseteq \dotsb\) whose union is dense in \(\mathbb{T}\); since \(\theta \mapsto \langle \theta, w \rangle\) and \(\theta \mapsto \langle \phi(\theta), w \rangle\) are continuous, the suprema over \(G_n\) increase to the suprema over \(\mathbb{T}\), and applying the monotone convergence theorem, first on the right-hand side, then on the left, yields the result.
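
As a quick numerical sanity check (not part of the exercise), the sketch below estimates the Gaussian complexity of a finite point set \(G\) and of its image \(\phi(G)\) by Monte Carlo and confirms the contraction inequality. The set \(G\), the sample sizes, and the choice of \(\phi\) as coordinatewise \(\tanh\) are all illustrative assumptions; coordinatewise \(\tanh\) is 1-Lipschitz in \(\ell_2\) because each coordinate map is 1-Lipschitz.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, n_mc = 5, 200, 20_000

# A finite set G of n points in R^d, standing in for a finite subset of T
# (an arbitrary choice for illustration).
G = rng.normal(size=(n, d))

# A 1-Lipschitz map phi: coordinatewise tanh, which satisfies
# ||tanh(a) - tanh(b)||_2 <= ||a - b||_2.
phi_G = np.tanh(G)

def gaussian_complexity(points, n_mc, rng):
    """Monte Carlo estimate of E[sup over the set of <x, w>], w ~ N(0, I_d)."""
    w = rng.normal(size=(n_mc, points.shape[1]))
    return (w @ points.T).max(axis=1).mean()  # sup over the set, averaged over w

lhs = gaussian_complexity(phi_G, n_mc, rng)
rhs = gaussian_complexity(G, n_mc, rng)
print(f"G(phi(G)) ~ {lhs:.3f} <= G(G) ~ {rhs:.3f}")  # contraction inequality holds
```

Note that both estimates share nothing except the point sets: each expectation is approximated with its own draws of \(w\), which is enough since the inequality compares the two expectations rather than a single coupled realization.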

Published on 1 March 2021.