Householder transformation
In linear algebra, a Householder transformation (also known as Householder reflection or elementary reflector) is a linear transformation that describes a reflection about a plane or hyperplane containing the origin. Householder transformations are widely used in numerical linear algebra, for example to perform QR decompositions and in the first step of the QR algorithm. The Householder transformation was introduced in 1958 by Alston Scott Householder.[1]
Its analogue over general inner product spaces is the Householder operator.
Definition and properties
The reflection hyperplane can be defined by a unit vector v (a vector with length 1) which is orthogonal to the hyperplane. The reflection of a point x about this hyperplane is:
\( x - 2\langle v,x\rangle v = x - 2 v (v^H x), \)
where v is given as a column unit vector with Hermitian transpose \( v^H \). This is a linear transformation given by the Householder matrix:
\( P = I - 2 vv^H\,, \) where I is the identity matrix.
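As a concrete illustration, here is a minimal NumPy sketch (the variable names are ours, not from the article) that builds the Householder matrix for a given unit vector and applies it to a point:

```python
import numpy as np

# Unit normal of the reflection hyperplane (any nonzero vector, normalized).
v = np.array([1.0, 2.0, 2.0])
v = v / np.linalg.norm(v)

# Householder matrix P = I - 2 v v^H (real vector, so the transpose suffices).
P = np.eye(3) - 2.0 * np.outer(v, v)

# Reflect a point x about the hyperplane orthogonal to v; both forms agree.
x = np.array([3.0, -1.0, 4.0])
print(P @ x)
print(x - 2.0 * (v @ x) * v)
```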
The Householder matrix has the following properties:
it is Hermitian: \( P = P^H, \)
it is unitary: \( P^{-1}=P^H, \)
hence it is involutory: \( P^2=I. \)
A Householder matrix has eigenvalues \( \pm 1 \). To see this, notice that if u is orthogonal to the vector v which was used to create the reflector, then Pu = u, i.e., 1 is an eigenvalue of multiplicity n-1, since there are n-1 linearly independent vectors orthogonal to v. Also, notice Pv = -v, and so -1 is an eigenvalue with multiplicity 1.
The determinant of a Householder reflector is -1, since the determinant of a matrix is the product of its eigenvalues.
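These properties are easy to verify numerically; a short sketch, again assuming NumPy and reusing the reflector from the previous snippet:

```python
import numpy as np

v = np.array([1.0, 2.0, 2.0])
v = v / np.linalg.norm(v)
P = np.eye(3) - 2.0 * np.outer(v, v)

print(np.allclose(P, P.T))               # Hermitian (symmetric in the real case)
print(np.allclose(P @ P.T, np.eye(3)))   # unitary (orthogonal in the real case)
print(np.allclose(P @ P, np.eye(3)))     # involutory: P^2 = I
print(np.sort(np.linalg.eigvalsh(P)))    # eigenvalues: one -1, the rest +1
print(np.linalg.det(P))                  # determinant is -1
```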
Applications
In geometric optics, specular reflection can be expressed in terms of the Householder matrix.
Householder reflections can be used to calculate QR decompositions by first reflecting one column of a matrix onto a multiple of a standard basis vector, calculating the transformation matrix, multiplying it with the original matrix, and then recursing down the (i, i) minors of that product.
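A minimal sketch of this procedure, assuming NumPy and using our own helper name (library implementations avoid forming the reflectors explicitly, but the explicit form mirrors the description above):

```python
import numpy as np

def householder_qr(A):
    """QR decomposition of a real m x n matrix via Householder reflections (sketch)."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    Q = np.eye(m)
    R = A.copy()
    for k in range(min(m - 1, n)):
        x = R[k:, k]
        # Sign chosen to avoid cancellation when forming x - alpha * e1.
        alpha = -np.sign(x[0]) * np.linalg.norm(x) if x[0] != 0 else -np.linalg.norm(x)
        v = x.copy()
        v[0] -= alpha
        norm_v = np.linalg.norm(v)
        if norm_v == 0:                      # column already has the desired form
            continue
        v = v / norm_v
        H = np.eye(m)
        H[k:, k:] -= 2.0 * np.outer(v, v)    # embed the (m-k)-dimensional reflector
        R = H @ R                            # zero out column k below the diagonal
        Q = Q @ H                            # accumulate the orthogonal factor
    return Q, R

A = np.array([[12., -51., 4.], [6., 167., -68.], [-4., 24., -41.]])
Q, R = householder_qr(A)
print(np.allclose(Q @ R, A), np.allclose(Q.T @ Q, np.eye(3)))
```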
They are also widely used for tridiagonalization of symmetric matrices and for transforming non-symmetric matrices to a Hessenberg form.
Tridiagonalization
Main article: Tridiagonal matrix
This procedure is taken from the book Numerical Analysis by Burden and Faires, 8th Edition. To form the Householder matrix in the first step, we need to determine \( \alpha \) and r, which are:
\( \alpha = -\mbox{sgn}(a_{21})\sqrt{\sum_{j=2}^{n}a_{j1}^2}; \)
\( r = \sqrt{\frac{1}{2}(\alpha^{2}-a_{21}\alpha)}; \)
From \( \alpha \) and r, construct the vector \( v^{(1)} \):
\( v^{(1)} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix}, \)
where \( v_1 = 0, \quad v_2 = \frac{a_{21}-\alpha}{2r}, \) and
\( v_k = \frac{a_{k1}}{2r} \) for each \( k = 3, 4, \ldots, n. \)
Then compute:
\( P^{(1)} = I - 2v^{(1)}(v^{(1)})^t, \)
\( A^{(1)} = P^{(1)}AP^{(1)}. \)
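As a NumPy sketch of this first step (our own variable names; for concreteness it uses the 4-by-4 matrix from the Examples section below, and it assumes \( a_{21} \neq 0 \) so that r does not vanish):

```python
import numpy as np

A = np.array([[ 4., 1., -2.,  2.],
              [ 1., 2.,  0.,  1.],
              [-2., 0.,  3., -2.],
              [ 2., 1., -2., -1.]])
n = A.shape[0]

alpha = -np.sign(A[1, 0]) * np.sqrt(np.sum(A[1:, 0] ** 2))
r = np.sqrt(0.5 * (alpha ** 2 - A[1, 0] * alpha))

v1 = np.zeros(n)                      # v^(1): first component stays zero
v1[1] = (A[1, 0] - alpha) / (2.0 * r)
v1[2:] = A[2:, 0] / (2.0 * r)

P1 = np.eye(n) - 2.0 * np.outer(v1, v1)
A1 = P1 @ A @ P1
print(np.round(A1, 6))                # first row and column now have tridiagonal pattern
```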
Having found \( P^{(1)} \) and computed \( A^{(1)} \), the process is repeated for k = 2, 3, ..., n - 2 as follows:
\( \alpha = -\mbox{sgn}\left(a^{(k)}_{k+1,k}\right)\sqrt{\sum_{j=k+1}^{n}\left(a^{(k)}_{jk}\right)^2}; \)
\( r = \sqrt{\frac{1}{2}\left(\alpha^{2}-a^{(k)}_{k+1,k}\alpha\right)}; \)
\( v^{(k)}_1 = v^{(k)}_2 = \cdots = v^{(k)}_k = 0; \)
\( v^{(k)}_{k+1} = \frac{a^{(k)}_{k+1,k}-\alpha}{2r}; \)
\( v^{(k)}_j = \frac{a^{(k)}_{jk}}{2r} \) for \( j = k+2, k+3, \ldots, n; \)
\( P^{(k)} = I - 2v^{(k)}(v^{(k)})^t; \)
\( A^{(k+1)} = P^{(k)}A^{(k)}P^{(k)}. \)
Continuing in this manner, a symmetric tridiagonal matrix is formed.
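Putting the steps together, the following is a compact NumPy sketch of the whole reduction (our own function name; it assumes the relevant subdiagonal entries are nonzero so r never vanishes, and a production routine would handle that case and avoid forming P explicitly):

```python
import numpy as np

def householder_tridiagonalize(A):
    """Reduce a real symmetric matrix to tridiagonal form by Householder similarity transforms (sketch)."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    for k in range(n - 2):                      # steps k = 1, ..., n-2 (0-based here)
        alpha = -np.sign(A[k + 1, k]) * np.sqrt(np.sum(A[k + 1:, k] ** 2))
        r = np.sqrt(0.5 * (alpha ** 2 - A[k + 1, k] * alpha))
        v = np.zeros(n)
        v[k + 1] = (A[k + 1, k] - alpha) / (2.0 * r)
        v[k + 2:] = A[k + 2:, k] / (2.0 * r)
        P = np.eye(n) - 2.0 * np.outer(v, v)
        A = P @ A @ P                           # similarity transform preserves eigenvalues
    return A

A = np.array([[ 4., 1., -2.,  2.],
              [ 1., 2.,  0.,  1.],
              [-2., 0.,  3., -2.],
              [ 2., 1., -2., -1.]])
print(np.round(householder_tridiagonalize(A), 6))
```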
Examples
This example is taken from the book Numerical Analysis by Richard L. Burden and J. Douglas Faires. In this example, the given matrix is transformed to the similar tridiagonal matrix A2 using the Householder method.
\( \mathbf{A} = \begin{bmatrix} 4&1&-2&2 \\ 1 & 2 &0&1 \\ -2 & 0 &3& -2 \\ 2 & 1 & -2&-1 \end{bmatrix}, \)
Following the steps of the Householder method above, we have:
The first Householder matrix:
\( Q_1 = \begin{bmatrix} 1&0&0&0 \\ 0 &-1/3&2/3&-2/3 \\ 0 & 2/3 &2/3& 1/3 \\ 0 & -2/3 &1/3& 2/3 \end{bmatrix}, \)
\( A_1 = Q_1 A Q_1 = \begin{bmatrix} 4&-3&0&0 \\ -3 & 10/3 &1&4/3 \\ 0 & 1 &5/3& -4/3 \\ 0 & 4/3 & -4/3&-1 \end{bmatrix}. \)
A1 is then used to form \( Q_2 = \begin{bmatrix} 1&0&0&0 \\ 0&1 &0&0 \\ 0 & 0 &-3/5&-4/5 \\ 0 & 0 & -4/5&3/5 \end{bmatrix}, \)
\( A_2 = Q_2 A_1 Q_2 = \begin{bmatrix} 4&-3&0&0 \\ -3 &10/3 &-5/3&0 \\ 0 & -5/3 &-33/25& 68/75 \\ 0 &0 & 68/75&149/75 \end{bmatrix}. \)
As we can see, the final result is a tridiagonal symmetric matrix which is similar to the original one. The process finished after 2 steps.
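The worked example can also be checked numerically, e.g. with a few lines of NumPy (a sketch; the fractions above are entered as floating-point approximations):

```python
import numpy as np

A  = np.array([[ 4., 1., -2.,  2.], [ 1., 2., 0.,  1.],
               [-2., 0.,  3., -2.], [ 2., 1., -2., -1.]])
Q1 = np.array([[1., 0., 0., 0.], [0., -1/3, 2/3, -2/3],
               [0., 2/3, 2/3, 1/3], [0., -2/3, 1/3, 2/3]])
Q2 = np.array([[1., 0., 0., 0.], [0., 1., 0., 0.],
               [0., 0., -3/5, -4/5], [0., 0., -4/5, 3/5]])

A2 = Q2 @ (Q1 @ A @ Q1) @ Q2
print(np.round(A2, 6))                     # tridiagonal, matches the matrix above
print(np.allclose(np.linalg.eigvalsh(A2),  # similarity preserves the spectrum
                  np.linalg.eigvalsh(A)))
```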
Computational and theoretical relationship to other unitary transformations
See also: Rotation (mathematics)
The Householder transformation is a reflection about a hyperplane with unit normal vector v, as stated earlier. An N-by-N unitary transformation U satisfies \( UU^H = I \). Taking the determinant (N-th power of the geometric mean) and trace (proportional to the arithmetic mean) of a unitary matrix reveals that its eigenvalues \( \lambda_i \) have unit modulus. This can be seen directly and swiftly:
\( \frac{\mbox{Trace}(UU^H)}{N}=\frac{\sum_{j=1}^N|\lambda_j|^2}{N}=1, \qquad \mbox{det}(UU^H)=\prod_{j=1}^N |\lambda_j|^2=1. \)
Since the arithmetic and geometric means are equal if and only if the variables are constant (see inequality of arithmetic and geometric means), we establish the claim of unit modulus.
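The claim is easy to illustrate numerically; a small sketch, assuming NumPy and taking a unitary matrix from the QR factorization of a random complex matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
U, _ = np.linalg.qr(M)                       # U is unitary: U U^H = I

print(np.allclose(U @ U.conj().T, np.eye(5)))
print(np.abs(np.linalg.eigvals(U)))          # every eigenvalue has modulus 1
```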
For the case of real-valued unitary matrices we obtain orthogonal matrices, \( UU^T=I \). It follows rather readily (see orthogonal matrix) that any orthogonal matrix can be decomposed into a product of 2-by-2 rotations, called Givens rotations, and Householder reflections. This is appealing intuitively, since multiplication of a vector by an orthogonal matrix preserves the length of that vector, and rotations and reflections exhaust the set of (real-valued) geometric operations that render a vector's length invariant.
Finally, we note that a single Householder transform, unlike a solitary Givens transform, can act on all columns of a matrix, and as such exhibits the lowest computational cost for QR decomposition and tridiagonalization. The penalty for this "computational optimality" is, of course, that Householder operations cannot be as deeply or efficiently parallelized. As such, Householder is preferred for dense matrices on sequential machines, whilst Givens is preferred for sparse matrices and/or parallel machines.
References
^ Householder, A. S. (1958). "Unitary Triangularization of a Nonsymmetric Matrix". Journal of the ACM 5 (4): 339–342. doi:10.1145/320941.320947. MR0111128.
LaBudde, C.D. (1963). "The reduction of an arbitrary real square matrix to tridiagonal form using similarity transformations". Mathematics of Computation (American Mathematical Society) 17 (84): 433–437. doi:10.2307/2004005. JSTOR 2004005. MR0156455.
Morrison, D.D. (1960). "Remarks on the Unitary Triangularization of a Nonsymmetric Matrix". Journal of the ACM 7 (2): 185–186. doi:10.1145/321021.321030. MR0114291.
Cipra, Barry (2000). "The Best of the 20th Century: Editors Name Top 10 Algorithms". SIAM News 33 (4): 1. (Herein the Householder transformation is cited as one of the top 10 algorithms of the 20th century.)
Press, WH; Teukolsky, SA; Vetterling, WT; Flannery, BP (2007). "Section 11.3.2. Householder Method". Numerical Recipes: The Art of Scientific Computing (3rd ed.). New York: Cambridge University Press. ISBN 978-0-521-88068-8