Monday, May 10, 2010

Inverting a Matrix - SVD (Singular Value Decomposition)

Every once in a while you find yourself needing to solve a set of equations, or invert a matrix, or worse yet, invert a non-square matrix (e.g. the pseudo-inverse for manipulator inverse kinematics path control (see the Minerva IK control image on the right, work I did at TUM) or Kalman filtering). This is when a handy mathematical tool, 'singular value decomposition', comes into play.

In essence the SVD decomposes a matrix M into three other matrices with special properties, making them easier to manipulate. The SVD equation is M = U S V^T, where:
  • U is a matrix whose columns are the eigenvectors of the matrix M M^T (AKA the left singular vectors)
  • S is a diagonal matrix whose diagonal elements are the singular values of M.
  • V is a matrix whose columns are the eigenvectors of the matrix M^T M (AKA the right singular vectors)
Discussing eigenvalues and eigenvectors I'll leave for another time, but they are quite straightforward to calculate: take the matrix, subtract λ * I, take the determinant, equate it to zero, and solve; that gets you the eigenvalues!
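As a quick sketch of the above (using numpy; the matrix values here are arbitrary, chosen only for illustration), we can compute the SVD, reconstruct M from the three factors, and check that the singular values squared really are the eigenvalues of M M^T:

```python
import numpy as np

# A small example matrix (arbitrary values, just for illustration).
M = np.array([[4.0, 0.0],
              [3.0, -5.0]])

# numpy returns U, the singular values s, and V already transposed.
U, s, Vt = np.linalg.svd(M)

# Reconstruct M = U * S * V^T to confirm the decomposition.
S = np.diag(s)
assert np.allclose(M, U @ S @ Vt)

# The eigenvalues of M * M^T are the squares of the singular values.
eigvals = np.linalg.eigvalsh(M @ M.T)  # returned in ascending order
assert np.allclose(np.sort(eigvals), np.sort(s**2))
```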

Now, how do you use the SVD to invert a matrix? Using a few matrix (inversion) identities (see the Matrix Cookbook, basic identity 1), and noting that U and V are orthogonal, so their inverses are just their transposes:

M^-1 = (U S V^T)^-1 = V S^-1 U^T
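That identity can be checked numerically in a few lines (numpy again; the example matrix is arbitrary but invertible):

```python
import numpy as np

M = np.array([[4.0, 7.0],
              [2.0, 6.0]])

U, s, Vt = np.linalg.svd(M)

# M^-1 = V * S^-1 * U^T ; S^-1 just inverts the diagonal entries.
M_inv = Vt.T @ np.diag(1.0 / s) @ U.T

# Agrees with the direct inverse.
assert np.allclose(M_inv, np.linalg.inv(M))
```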

So now we just need a way to invert a diagonal matrix, which couldn't be easier. To invert a square diagonal matrix you simply invert each diagonal element.
What if the matrix isn't square?
Well for that, we can just use the pseudo-inverse of S: we invert each nonzero diagonal element (leaving any zeros alone), and then transpose it so the shape works out. Done!
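A minimal sketch of the non-square case (numpy; the 3x2 example matrix is arbitrary and assumed to have full column rank), compared against numpy's built-in pseudo-inverse:

```python
import numpy as np

# A non-square (3x2) example matrix.
M = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Full SVD: U is 3x3, s holds 2 singular values, Vt is 2x2.
U, s, Vt = np.linalg.svd(M)

# Pseudo-invert S: invert each nonzero singular value, then
# "transpose" the shape from 3x2 to 2x3.
S_pinv = np.zeros(M.T.shape)
S_pinv[:len(s), :len(s)] = np.diag(1.0 / s)

# M^+ = V * S^+ * U^T
M_pinv = Vt.T @ S_pinv @ U.T
assert np.allclose(M_pinv, np.linalg.pinv(M))
```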

A good tutorial on the SVD by Dr. E. Garcia can be found here.


John Rawlins said...


"Advanced Eigenvalue Vector Decomposition"

We have made a significant improvement to the Singular Value
Decomposition methodology. This is actually an understatement.

We have discovered that the current Singular Value Decomposition mathematical techniques, and the resulting FORTRAN and C code, are quite slow and inaccurate. What we have accomplished is to speed up computer execution by more than a factor of 1000 and to improve numerical accuracy by about 12 bits out of the 48 bits of the standard double mantissa. That is to say, there is no more than 1 bit of difference between the exact answer and my new solution method; previous or current methods can barely achieve 24-bit accuracy (out of 48). This improvement will make it possible to recognize red, green, blue, black, grey, and white infrared images in real time as the first application.

The range of applications for this new technology can go well
beyond this.

Of course, we expect skeptics about these claims, but we have demo
programs that will read your data and produce results that they can compare, and we can even execute this new code on their computers if they desire.

How AEVD Improves SVD Performance

AEVD Technology, LLC offers a fully developed, advanced form of the Singular Value Decomposition (SVD) algorithm, which offers a generational advance in speed and accuracy. Our Advanced Eigenvalue Vector Decomposition (or AEVD) was built upon a recognition of the shortcomings in how computers manipulate numbers, data, and calculations, and reflects a painstaking analysis of the incremental steps in SVD processing to provide a faster process with fewer steps and improved accuracy.

The SVD mathematical proposition is linearly independent, in an algebraic sense, of the similarity theorem, and as such provides a variety of available solution paths. One such path is to first reduce the input array to a real bidiagonal matrix with a sequence of intermediate left and right unitary transformations. The reduced matrix is usually chosen to have a real diagonal and one real superdiagonal; all of the remaining matrix elements are numerically considered as zero. It is possible to choose other reduced forms of the input matrix, but the use of a real bidiagonal array provides the most numerically accurate and computationally rapid solution.

Additional numerical stability and computer accuracy are obtained by AEVD by choosing unitary transformations that place the larger bidiagonal elements in the upper left and the smaller elements in the lower right positions. This matters because the final determination of the left and right transformations and the SVD weights is always an iterative process for matrix sizes greater than four. Even for matrix sizes of four and less, iterative methods are usually simpler and require computational steps comparable to closed-form algebraic solutions.

Also, when a real bidiagonal array format is chosen as the final iterate, the left and right transformations employed during iteration are orthogonal matrices. Other SVD iterative solution methods available employ orthogonal variations such as Jacobi rotation, Givens, and Householder reduction algorithms. Consistent among the more efficient and accurate SVD algorithms is the choice of the real

John Rawlins


Adrian said...