
Exercise 5.14: Maximum Singular Value of Gaussian Random Matrices

Tags: chapter 5, Lipschitz Gaussian

(a)

This is a straightforward programming exercise; we leave it to the reader.
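For concreteness, here is a minimal simulation sketch (our own addition, not the book's code): assuming \(W \in \mathbb{R}^{n \times d}\) has i.i.d. \(N(0,1)\) entries, it estimates \(\E[\sigma_{\max}(W)]\) by Monte Carlo and compares it with the bound \(\sqrt{n} + \sqrt{d}\) derived below. The function name `mean_sigma_max` is ours.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_sigma_max(n, d, trials=200):
    """Monte Carlo estimate of E[sigma_max(W)] for W with i.i.d. N(0, 1) entries."""
    return np.mean([
        np.linalg.norm(rng.standard_normal((n, d)), ord=2)  # ord=2: largest singular value
        for _ in range(trials)
    ])

for n, d in [(50, 50), (200, 100), (500, 500)]:
    print(f"n={n}, d={d}: E[sigma_max] ~ {mean_sigma_max(n, d):.2f}, "
          f"sqrt(n) + sqrt(d) = {np.sqrt(n) + np.sqrt(d):.2f}")
```

The estimates sit consistently below \(\sqrt{n} + \sqrt{d}\), matching the bound proved in parts (c) and (d).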

(b)

Taking the SVD \(W = U S V^\top\) and substituting \(\tilde{u} = U^\top u\), \(\tilde{v} = V^\top v\), we get \(u^\top W v = \tilde{u}^\top S \tilde{v} = \sum_i \sigma_i \tilde{u}_i \tilde{v}_i \leq \sigma_{\max}(W)\) by the Cauchy-Schwarz inequality, with equality when \(u\) and \(v\) are the top left and right singular vectors; the result follows.
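As a quick numerical sanity check (again our addition), the top singular vectors returned by numpy's SVD attain this supremum:

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.standard_normal((6, 4))
U, S, Vt = np.linalg.svd(W)

# Variational characterization: the top singular vectors attain the supremum,
# so u_1^T W v_1 = sigma_max(W).
print(U[:, 0] @ W @ Vt[0])  # equals S[0]
print(S[0])
```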

(c)

If the Sudakov-Fernique inequality applies, i.e., \(\E[\sup_{u, v} Z_{u,v}] \leq \E[\sup_{u, v} Y_{u,v}] \, ,\) the desired result follows from Jensen's inequality: \begin{align} \E[\sup_{u, v} Y_{u,v}] = \E [\| g \|_2 + \| h \|_2] \leq \sqrt{\E \| g \|_2^2} + \sqrt{\E \| h \|_2^2} = \sqrt{n} + \sqrt{d} \, . \end{align}
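The Jensen step can be checked empirically (a small sketch of ours, not part of the solution): the empirical mean of \(\| g \|_2\) for \(g \sim N(0, I_n)\) sits just below the bound \(\sqrt{n}\).

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 100, 10_000
g = rng.standard_normal((trials, n))     # rows: independent draws of g ~ N(0, I_n)
print(np.linalg.norm(g, axis=1).mean())  # E[||g||_2], approx 9.97 for n = 100
print(np.sqrt(n))                        # Jensen bound sqrt(n) = 10.0
```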

Reducing to fixed finite subsets \(\lbrace u_1, \ldots , u_N \rbrace \subset S^{n-1}\) and \(\lbrace v_1, \ldots , v_N \rbrace \subset S^{d-1}\), we need to show \begin{align} \E [(Z_{u_i, v_j} - Z_{u_k, v_l})^2] \leq \E [(Y_{u_i, v_j} - Y_{u_k, v_l})^2] \end{align} for any \(i, j, k, l \in [N]\). Using the mutual independence of the standard normal random variables constituting \(Z_{u, v}\) and \(Y_{u, v}\), we obtain \begin{gather} \E [(Z_{u_i, v_j} - Z_{u_k, v_l})^2] = \| u_i v_j^\top - u_k v_l^\top \|_F^2 \, , \newline \E [(Y_{u_i, v_j} - Y_{u_k, v_l})^2] = \| u_i - u_k \|_2^2 + \| v_j - v_l \|_2^2 \, . \end{gather} Applying the inequality derived in Exercise 5.11 then shows that the assumptions of the Sudakov-Fernique inequality hold. Hence \begin{align} \E \bigl[\sup_{i, j \in [N]} Z_{u_i, v_j}\bigr] \leq \E \bigl[\sup_{i, j \in [N]} Y_{u_i, v_j}\bigr] \, , \end{align} and the desired result follows by applying the monotone convergence theorem, first on the r.h.s., then on the l.h.s.
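The increment inequality needed here, \(\| u v^\top - u' v'^\top \|_F^2 \leq \| u - u' \|_2^2 + \| v - v' \|_2^2\) for unit vectors, can also be checked numerically; the following sketch (ours, with a hypothetical helper `unit`) samples random unit vectors and asserts the inequality.

```python
import numpy as np

rng = np.random.default_rng(3)

def unit(k):
    """A uniformly random unit vector in R^k."""
    x = rng.standard_normal(k)
    return x / np.linalg.norm(x)

n, d = 7, 5
for _ in range(10_000):
    u, u_ = unit(n), unit(n)
    v, v_ = unit(d), unit(d)
    lhs = np.linalg.norm(np.outer(u, v) - np.outer(u_, v_), "fro") ** 2
    rhs = np.linalg.norm(u - u_) ** 2 + np.linalg.norm(v - v_) ** 2
    assert lhs <= rhs + 1e-12  # increment comparison for Sudakov-Fernique
print("increment inequality held for all sampled unit vectors")
```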

(d)

This follows from Exercise 5.11(c), using that \begin{align} \sup_{u, v} \V (Z_{u, v}) = \sup_{u, v} \E [(u^\top W v)^2] = \sup_{u, v} \| u \|_2^2 \| v \|_2^2 = 1 \, ; \end{align} applying the one-sided version of Lipschitz Gaussian concentration (Theorem 2.26) then avoids the factor of two in front of the exponential tail bound.
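To illustrate the resulting concentration (a sketch under the assumption that combining part (c) with the one-sided bound gives a tail of the form \(e^{-\delta^2/2}\) for \(\sigma_{\max}(W)\) beyond \(\sqrt{n} + \sqrt{d}\)):

```python
import numpy as np

rng = np.random.default_rng(4)
n, d, trials = 100, 50, 2_000
sigma = np.array([np.linalg.norm(rng.standard_normal((n, d)), ord=2)
                  for _ in range(trials)])

anchor = np.sqrt(n) + np.sqrt(d)            # upper bound on E[sigma_max] from part (c)
for delta in [0.5, 1.0, 2.0]:
    emp = (sigma >= anchor + delta).mean()  # empirical tail probability
    print(f"delta = {delta}: empirical {emp:.4f} <= bound {np.exp(-delta**2 / 2):.4f}")
```

The empirical tails are far below the sub-Gaussian bound, as expected, since the bound is anchored at \(\sqrt{n} + \sqrt{d}\) rather than at the (smaller) true mean.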

Published on 1 March 2021.