
Exercise 15.15: Sharper Bound for Variable Selection in Sparse PCA


Within the class of problems with $\theta^*$ satisfying $\min_{j \in S}\abs{\theta^*_j} \ge \theta\ss{min}$, where $S$ is the set of indices of the non-zero elements of $\theta^*$, we exhibit a problem for which any support recovery procedure fails with probability bounded away from zero whenever \begin{equation} n < c_0 \frac{1 + \nu}{\nu^2}\frac{\log(d - s + 1)}{\theta\ss{min}^2} \end{equation} for some constant $c_0 > 0$.

In particular, fix $S \sub [d]$ of size $s - 1$ and, for every $\e \in \set{-1, 1}^d$, consider the problem of deciding among the vectors \begin{equation} [\theta^j(\e)]_\ell = \begin{cases} \sqrt{(1 - \theta\ss{min}^2) / (s - 1)} & \text{if $\ell \in S$}, \newline \theta\ss{min} \e_\ell = \theta\ss{min} \e_j & \text{if $\ell = j$}, \newline 0 & \text{otherwise} \end{cases} \end{equation} for $j \in S^\c$. Let $Z_j(\e) \sim \P_{\theta^j(\e)} = \Normal(0, \nu\, \theta^j(\e) \otimes \theta^j(\e) + I_d)$, and let $Z(\e)$ be a sample from the mixture $\tfrac1{\abs{S^\c}}\sum_{j \in S^\c} \P_{\theta^j(\e)}$. Draw $J \in S^\c$ uniformly, let $Y(\e)$ consist of $n$ i.i.d. samples from $\P_{\theta^J(\e)}$, and let $\psi(Y(\e))$ be an estimator of the index $J$.

Drawing $\e$ uniformly from $\set{-1, 1}^d$, we find that, by Fano's inequality and tensorisation of the mutual information, \begin{equation} \P(\psi(Y(\e)) \neq J) = \E_\e[\P(\psi(Y(\e)) \neq J\cond \e)] \ge 1 - \frac{n\, \E_\e[I(Z(\e), J)] + \log(2)}{\log\, \abs{S^\c}}, \end{equation} where, comparing each Gaussian component of the mixture with the Gaussian whose covariance matches that of $Z(\e)$, \begin{equation} 2I(Z(\e), J) \le \log\,\abs{\cov(Z(\e))} - \frac1{\abs{S^\c}}\sum_{j \in S^\c} \log\,\abs{\cov(Z_j(\e))}. \end{equation}

Since each $\theta^j(\e)$ is a unit vector, \begin{equation} \log\,\abs{\cov(Z_j(\e))} = \log(\nu + 1). \end{equation} Moreover, by Jensen's inequality and the concavity of $\log \det$, \begin{equation} \E_\e[\log\,\abs{\cov(Z(\e))}] \le \log\,\abs{\E_\e[\cov(Z(\e))]} \end{equation} where \begin{equation} \E_\e[\cov(Z(\e))] = \frac{\nu}{\abs{S^\c}}\sum_{j \in S^\c}\E_\e[\theta^j(\e) \otimes \theta^j(\e)] + I_d. \end{equation} Since $\E_\e[\e_j] = 0$, the cross terms between $S$ and $S^\c$ vanish, and $\E_\e[\cov(Z(\e))]$ is block diagonal with blocks \begin{align} \E_\e[\cov(Z(\e))]_S &= \nu \frac{1 - \theta\ss{min}^2}{s - 1} 11^\T + I_{s - 1}, \newline \E_\e[\cov(Z(\e))]_{S^\c} &= \nu\frac{\theta\ss{min}^2}{d - s + 1} I_{d - s + 1} + I_{d - s + 1}. \end{align} Therefore, computing the determinant of the rank-one block with the matrix determinant lemma, \begin{equation} \E_\e[\log\,\abs{\cov(Z(\e))}] \le \log(1 + \nu(1 - \theta\ss{min}^2)) + (d - s + 1) \log\parens{ 1 + \tfrac{1}{d - s + 1} \nu \theta\ss{min}^2 }. \end{equation} Simplifying \begin{equation} \frac{1 + \nu(1 - \theta\ss{min}^2)}{\nu + 1} = \parens{1 - \frac{\nu}{\nu + 1}} + \frac{\nu}{\nu + 1}(1 - \theta\ss{min}^2) = 1 - \frac{\nu}{\nu + 1}\theta\ss{min}^2 \end{equation} and using $\log(1 + x) \le x$ for all $x > -1$, we find that \begin{equation} 2\E_\e[I(Z(\e), J)] \le - \frac{\nu}{\nu + 1}\theta\ss{min}^2 + \nu \theta\ss{min}^2 = \frac{\nu^2}{\nu + 1} \theta\ss{min}^2. \end{equation}

Therefore, if \begin{equation} n \frac{\nu^2}{\nu + 1} \theta\ss{min}^2 < c_0 \log(d - s + 1) \end{equation} for a universal constant $c_0 > 0$, then \begin{equation} \P(\psi(Y(\e)) \neq J) \ge 1 - \frac{c_0}{2} - \frac{\log(2)}{\log(d - s + 1)}, \end{equation} which is bounded away from zero once $c_0$ is small enough and $d - s + 1 \ge 3$. In particular, there exists a fixed $\e$ for which recovering $J$, and hence the support, fails with probability bounded away from zero.
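The computations above are easy to sanity-check numerically. Below is a minimal Python sketch (not part of the exercise or its solution; the parameter values and the helper `theta_j` are arbitrary illustrative choices). It builds the vectors $\theta^j(\e)$, averages exactly over $j$ and $\e_j$, and verifies the unit-norm claim, the block diagonal structure of $\E_\e[\cov(Z(\e))]$, its log-determinant, and the final bound on $2\E_\e[I(Z(\e), J)]$.

```python
import numpy as np

# Arbitrary illustrative parameters: any nu > 0, 0 < theta_min < 1, s < d.
nu, theta_min, d, s = 2.0, 0.3, 12, 4
S = np.arange(s - 1)       # support block S, size s - 1
Sc = np.arange(s - 1, d)   # complement S^c, size d - s + 1

def theta_j(j, eps_j):
    """Hypothetical helper: theta^j(eps), shared weight on S, theta_min * eps_j at j."""
    th = np.zeros(d)
    th[S] = np.sqrt((1 - theta_min**2) / (s - 1))
    th[j] = theta_min * eps_j
    return th

# E_eps[cov(Z(eps))] = nu * avg over (j, eps_j) of theta^j theta^j^T + I_d,
# averaged exactly over the 2 * |S^c| choices of (j, eps_j).
avg = np.zeros((d, d))
for j in Sc:
    for eps_j in (-1.0, 1.0):
        th = theta_j(j, eps_j)
        assert np.isclose(th @ th, 1.0)  # each theta^j(eps) is a unit vector
        avg += np.outer(th, th) / (2 * len(Sc))
E_cov = nu * avg + np.eye(d)

# The cross terms vanish: E_cov is block diagonal with the two claimed blocks.
block_S = nu * (1 - theta_min**2) / (s - 1) * np.ones((s - 1, s - 1)) + np.eye(s - 1)
block_Sc = (1 + nu * theta_min**2 / (d - s + 1)) * np.eye(d - s + 1)
assert np.allclose(E_cov[np.ix_(S, S)], block_S)
assert np.allclose(E_cov[np.ix_(Sc, Sc)], block_Sc)
assert np.allclose(E_cov[np.ix_(S, Sc)], 0.0)

# log det E_cov matches the closed form, and the final bound holds:
# log det E_cov - log(nu + 1) <= nu^2 / (nu + 1) * theta_min^2.
logdet = np.linalg.slogdet(E_cov)[1]
closed_form = np.log(1 + nu * (1 - theta_min**2)) \
    + (d - s + 1) * np.log(1 + nu * theta_min**2 / (d - s + 1))
assert np.isclose(logdet, closed_form)
assert logdet - np.log(nu + 1) <= nu**2 / (nu + 1) * theta_min**2
print("all checks passed")
```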

Notes

- As in Example 15.20 of the book, we used randomisation (the role of $\e$) to simplify the computation. In this particular case, however, the computation can also be done without randomising.

Published on 9 September 2021.