Filling in the derivation steps: rewriting Matrix Computations, Section 5.1.2

The original text skips a few small steps. This note fills in those gaps so the derivation is quicker to follow next time. Kept here as a memo.

1. The completed derivation

[Two figures: rendered pages of the LaTeX source in Section 2.]

2. Code

LaTeX code:

\documentclass{article}
\title{Matrix Computations 5.1.2: a time-saving revision}
\date{}
\begin{document}
\maketitle
Let $v \in \mathbf{R}^m$, $v \ne 0$, and define
\begin{equation}
    \mathbf{P} = \mathbf{I}-\beta vv^T, \quad \beta = \frac{2}{v^Tv}
\end{equation}

Then $\mathbf{P}$ is a \emph{Householder reflection}.
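
A quick check from the definition: $\mathbf{P}$ is symmetric and orthogonal, since $\beta^2 (v^Tv) = 2\beta$ gives
$$\mathbf{P}^T = \mathbf{I}-\beta vv^T = \mathbf{P}, \qquad \mathbf{P}^2 = \mathbf{I}-2\beta vv^T + \beta^2 (v^Tv)\, vv^T = \mathbf{I}$$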

$y = \mathbf{P}x$ is the reflection of $x$ in the hyperplane $\mathbf{span}\{v\}^\bot$.
\begin{equation} \label{eq:eps2}
\mathbf{P}x =
\left(
    \mathbf{I} - \frac{2vv^T}{v^Tv}
\right)x
=
x - \frac{2v^Tx}{v^Tv}v
\end{equation}
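(The subtracted term is twice the orthogonal projection $\frac{v^Tx}{v^Tv}v$ of $x$ onto $v$: the component of $x$ along $v$ flips sign while the component in $\mathbf{span}\{v\}^\bot$ is untouched, which is exactly the reflection.)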
To reflect $x$ onto $e_1 = \mathbf{I}_m(:,1)$, we require
\begin{equation} \label{eq:eps3}
    \mathbf{P}x = w e_1, \quad w \in \mathbf{R}
\end{equation}
Since $\mathbf{P}x$ in (\ref{eq:eps2}) is a linear combination of $x$ and $v$, this forces $v \in \mathbf{span}\{x, e_1 \}$.
Let

\begin{equation} \label{eq:eps4}
v = x + \alpha e_1
\end{equation}

Here $\alpha \in \mathbf{R}$ is a free parameter; once it is fixed, $v$ is determined by $x$.

Left-multiplying (\ref{eq:eps4}) by $x^T$:
$$x^Tv = x^Tx + \alpha x^Te_1$$
and, since $x^Te_1 = x_1$,
\begin{equation}
v^Tx = x^Tx + \alpha x_1
\end{equation}


Similarly, computing $v^Tv$ from (\ref{eq:eps4}):

$$v^Tv = (x+\alpha e_1)^T(x+ \alpha e_1) = (x^T + \alpha {e_1}^T)(x+\alpha e_1)$$
and, since $e_1^Te_1 = 1$,
\begin{equation}
v^Tv = x^Tx + 2\alpha x_1 + \alpha^2
\end{equation}


From (\ref{eq:eps2}), writing $v = x + (v - x)$:
$$\mathbf{P}x = x - \frac{2v^Tx}{v^Tv}v$$
\begin{equation}\label{eq:eps7}
\mathbf{P}x = x - \frac{2v^Tx}{v^Tv}\bigl(x + (v - x)\bigr)
= x - \frac{2v^Tx}{v^Tv}x - \frac{2v^Tx}{v^Tv}(v - x)
\end{equation}

and from (\ref{eq:eps4})
$$v = x + \alpha e_1$$
thus
\begin{equation} \label{eq:eps8}
    (v - x) = \alpha e_1
\end{equation}
Substituting (\ref{eq:eps8}) into (\ref{eq:eps7}):
\begin{equation} \label{eq:eps9}
\mathbf{P}x = \left(1-\frac{2v^Tx}{v^Tv}\right)x - 2\alpha \frac{v^Tx}{v^Tv}e_1
    = \left(\frac{\alpha^2 - \|x\|_2^2}{x^Tx + 2\alpha x_1 + \alpha^2} \right)x - 2\alpha \frac{v^Tx}{v^Tv}e_1
\end{equation}
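
The coefficient of $x$ uses the two dot products computed above, together with $x^Tx = \|x\|_2^2$:
$$1 - \frac{2v^Tx}{v^Tv} = \frac{v^Tv - 2v^Tx}{v^Tv} = \frac{(x^Tx + 2\alpha x_1 + \alpha^2) - 2(x^Tx + \alpha x_1)}{x^Tx + 2\alpha x_1 + \alpha^2} = \frac{\alpha^2 - x^Tx}{x^Tx + 2\alpha x_1 + \alpha^2}$$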

Comparing (\ref{eq:eps9}) with (\ref{eq:eps3}), we want
\begin{equation}
    \mathbf{P}x = 0 \cdot x + w e_1, \quad w \in \mathbf{R}
\end{equation}

thus the coefficient of $x$ must vanish:
$$\frac{\alpha^2 - \|x\|_2^2}{x^Tx + 2\alpha x_1 + \alpha^2} = 0$$

that means
$$\alpha^2 - \|x\|_2^2 = 0$$
\begin{equation}
\alpha = \pm \|x\|_2
\end{equation}

Substituting this $\alpha$ into (\ref{eq:eps4}), we have

\begin{equation}
v = x \pm \|x\|_2 e_1
\end{equation}
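
With either sign, $\alpha^2 = x^Tx$, so the denominator collapses:
$$v^Tv = x^Tx + 2\alpha x_1 + \alpha^2 = 2(x^Tx + \alpha x_1) = 2v^Tx$$
Hence $\frac{2v^Tx}{v^Tv} = 1$ and $\mathbf{P}x = x - v = -\alpha e_1$,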

so that

$$\mathbf{P}x = \left(\mathbf{I} - 2\frac{vv^T}{v^Tv}\right)x = \mp\|x\|_2 e_1, \quad \mbox{where}\ v = x \pm \|x\|_2 e_1$$

When $v = x \pm \|x\|_2 e_1$, $\mathbf{P}$ reflects $x$ onto $\mathbf{span}\{e_1\}$.


\end{document}
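
3. Numerical check

A minimal NumPy sketch (my own addition, not from the book; names are mine) of the result above: form v = x + sign(x_1)*||x||_2*e_1, build P = I - 2vv^T/(v^Tv), and confirm that Px is a multiple of e_1 with the same 2-norm as x. Matching the sign of alpha to x_1 avoids cancellation when forming v; in exact arithmetic either sign works.

Python code:

import numpy as np

def householder_reflect(x):
    # P = I - 2 v v^T / (v^T v), with v = x + sign(x_1) ||x||_2 e_1
    x = np.asarray(x, dtype=float)
    alpha = np.copysign(np.linalg.norm(x), x[0])  # alpha = +/-||x||_2, sign matching x_1
    v = x.copy()
    v[0] += alpha                                 # v = x + alpha e_1
    P = np.eye(len(x)) - 2.0 * np.outer(v, v) / (v @ v)
    return v, P @ x

x = np.array([3.0, 1.0, 5.0, 1.0])                # ||x||_2 = 6
v, Px = householder_reflect(x)
print(Px)                                         # ~[-6, 0, 0, 0], i.e. -alpha * e_1
assert np.allclose(Px[1:], 0.0)                   # components 2..m are annihilated
assert np.isclose(abs(Px[0]), np.linalg.norm(x))  # |w| = ||x||_2, as derived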
