
Exercise 5.7: Upper Bound for $\ell_0$-"Balls"


(a)

This follows directly from splitting the supremum, first over the choice of support $S$ and then over the Euclidean unit ball on that support, together with the tightness of the Cauchy–Schwarz inequality (the inner supremum is attained at $\theta = w_S / \norm{w_S}_2$): \begin{equation} \sup_{\theta \in \Tb^d(s)} \lra{\theta, w} = \sup_{S:\abs{S}=s} \sup_{\theta \in \Bb_2^s(1)} \lra{\theta, w_S} = \sup_{S:\abs{S}=s} \norm{w_S}_2. \end{equation}
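
As a quick numerical sanity check (not part of the original solution), the sketch below brute-forces $\max_{S:\abs{S}=s} \norm{w_S}_2$ over all supports of size $s$ and compares it with the closed form obtained by collecting the $s$ largest $\abs{w_i}$; the dimensions, random seed, and use of NumPy are illustrative assumptions.

```python
import itertools

import numpy as np

rng = np.random.default_rng(0)
d, s = 10, 3
w = rng.standard_normal(d)

# For a fixed support S, the best unit vector supported on S is
# theta = w_S / ||w_S||_2, so the inner supremum equals ||w_S||_2.
brute = max(
    np.linalg.norm(w[list(S)]) for S in itertools.combinations(range(d), s)
)

# The maximizing support collects the s largest |w_i|.
closed_form = np.linalg.norm(np.sort(np.abs(w))[-s:])

print(brute, closed_form)  # the two values agree up to floating-point error
```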

(b)

Note that, by the reverse triangle inequality, $w_S \mapsto \norm{w_S}_2$ is $1$-Lipschitz with respect to the Euclidean norm, so, by concentration of Lipschitz functions of a standard Gaussian vector, $\norm{w_S}_2 - \E[\norm{w_S}_2] \in \SG(1)$. Also, by Jensen's inequality, $\E[\norm{w_S}_2] \le \sqrt{\E[\norm{w_S}_2^2]} = \sqrt{s}$. Then \begin{equation} \P(\norm{w_S}_2 \ge \sqrt{s} + \delta) \le \P(\norm{w_S}_2 \ge \E[\norm{w_S}_2] + \delta) \le e^{-\tfrac12 \delta^2}. \end{equation}
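
A small Monte Carlo check of this tail bound (again an illustration, not part of the solution): for a fixed support $S$ of size $s$, $\norm{w_S}_2$ is the norm of an $s$-dimensional standard Gaussian vector, so the empirical tail probability should sit below $e^{-\delta^2/2}$. The values of $s$, $\delta$, and the number of trials are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
s, delta = 5, 1.5
n_trials = 200_000

# For a fixed support S with |S| = s, ||w_S||_2 is the Euclidean norm of an
# s-dimensional standard Gaussian vector.
norms = np.linalg.norm(rng.standard_normal((n_trials, s)), axis=1)

empirical_tail = np.mean(norms >= np.sqrt(s) + delta)
bound = np.exp(-0.5 * delta**2)
print(empirical_tail, bound)  # empirical tail should not exceed the bound
```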

(c)

By (b), \begin{equation} \max_{S : \abs{S} = s} \parens{\norm{w_S}_2 - \E[\norm{w_S}_2]} \end{equation} is the maximum of $\binom{d}{s}$ random variables that have zero mean and are sub-Gaussian with parameter $1$; note that $\E[\norm{w_S}_2]$ is the same for every $S$ with $\abs{S} = s$. Therefore, by the standard bound on the expected maximum of sub-Gaussian random variables and the estimate $\E[\norm{w_S}_2] \le \sqrt{s}$ from (b), \begin{equation} \E[\max_{S : \abs{S} = s} \norm{w_S}_2] \le 2 \sqrt{\log \binom{d}{s}} + \sqrt{s} \lesssim \sqrt{s \log \frac{e d}{s}}, \end{equation} where the last step uses the estimate \begin{equation} \binom{d}{s} \le \frac{d^s}{s!} \le \parens{\frac{d e}{s}}^s, \end{equation} which follows from $s! \ge (s/e)^s$.
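
Finally, a rough numerical illustration of the expectation bound (an assumption-laden sketch, not part of the solution): since the maximizing support always consists of the $s$ largest $\abs{w_i}$, the maximum over all $\binom{d}{s}$ supports can be computed without enumeration, and its empirical mean can be compared with the explicit bound $2\sqrt{\log\binom{d}{s}} + \sqrt{s}$. The choices of $d$, $s$, and trial count are arbitrary, and the rate $\sqrt{s \log(ed/s)}$ is printed without the constant hidden in $\lesssim$.

```python
from math import comb, log, sqrt

import numpy as np

rng = np.random.default_rng(0)
d, s = 50, 5
n_trials = 2_000

# max_{|S| = s} ||w_S||_2 is attained by the s largest |w_i|, so no subset
# enumeration is needed.
w = rng.standard_normal((n_trials, d))
max_norms = np.linalg.norm(np.sort(np.abs(w), axis=1)[:, -s:], axis=1)

empirical_mean = max_norms.mean()
explicit_bound = 2 * sqrt(log(comb(d, s))) + sqrt(s)
rate = sqrt(s * log(np.e * d / s))  # the rate, without its hidden constant
print(empirical_mean, explicit_bound, rate)
```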

Published on 8 April 2021.