Generalised Inverses
Modal Analysis and Modal Testing
S. Ziaei Rad


Matrix properties

• Matrix Rank

rank [A] = number of columns of [A] which are linearly independent.

• Matrix Norm

||[A]|| is a non-negative number.

Matrix Norm

• Frobenius Norm

||[A]||F = ( Σi Σj aij² )^(1/2)

• Spectral Norm

||[A]||2 = sqrt( maximum eigenvalue of [A]H[A] )

• Also

||[A]||1 = max_j Σi |aij|,  j = 1, 2, …, m   (maximum absolute column sum)

||[A]||∞ = max_i Σj |aij|,  i = 1, 2, …, N   (maximum absolute row sum)
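These quantities are easy to check numerically. The short NumPy sketch below (the matrix values are arbitrary and purely illustrative) evaluates the rank and each of the norms defined above:

```python
import numpy as np

# Arbitrary 3x2 example matrix (values chosen only for illustration)
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

print(np.linalg.matrix_rank(A))     # rank: number of linearly independent columns -> 2
print(np.linalg.norm(A, 'fro'))     # Frobenius norm: sqrt of the sum of squared entries
print(np.linalg.norm(A, 2))         # spectral (2-) norm: largest singular value
print(np.linalg.norm(A, 1))         # 1-norm: maximum absolute column sum
print(np.linalg.norm(A, np.inf))    # infinity-norm: maximum absolute row sum
```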

Generalised Inverses

The generalised inverse of [B] (N×m), where m < N, is defined as [B]+ (m×N), where:

[B]+ = ([B]T [B])-1 [B]T

This is the left-inverse of [B] and exists if [B] is of full rank, m.

Although: [B]+ (m×N) [B] (N×m) = [ I ] (m×m)

Note that: [B] (N×m) [B]+ (m×N) ≠ [ I ] (N×N)

For a matrix [C] (m×N) where m < N, we have [C]+ (N×m)

where:

[C]+ = [C]T ([C][C]T)-1

This is the right-inverse of [C] and exists only if [C] is of full (row) rank, m.
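Both formulas are easy to try out numerically. The sketch below uses arbitrary small full-rank matrices and cross-checks the explicit formulas against NumPy's built-in pseudo-inverse:

```python
import numpy as np

B = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])                    # N x m with m < N, full column rank

left_inv = np.linalg.inv(B.T @ B) @ B.T       # [B]+ = ([B]T [B])^-1 [B]T
print(np.allclose(left_inv @ B, np.eye(2)))   # [B]+[B] = I (m x m)  -> True
print(np.allclose(B @ left_inv, np.eye(3)))   # [B][B]+ != I (N x N) -> False

C = B.T                                       # m x N with full row rank
right_inv = C.T @ np.linalg.inv(C @ C.T)      # [C]+ = [C]T ([C][C]T)^-1
print(np.allclose(C @ right_inv, np.eye(2)))  # [C][C]+ = I -> True

# Cross-check against the built-in pseudo-inverse
print(np.allclose(left_inv, np.linalg.pinv(B)))   # True
```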

Square Matrix

For a square matrix [A] (N×N), the left inverse and the right inverse are identical, and both are given by [A]-1

in this case:

[A]-1[A] = [A][A]-1 = [ I ]

This applies only when [A] is of full rank, N.

Singular Value Decomposition

For a general N×m (m < N) matrix [D], whose rank r is less than m, none of the previous expressions permit determination of an inverse. Here it is necessary to use the Singular Value Decomposition (SVD):

[D] (N×m) = [U] (N×N) [Σ] (N×m) [V]T (m×m)

where [U] and [V] are orthonormal matrices for which [U]T = [U]-1, etc., and [Σ] is a diagonal matrix whose r (r ≤ m) non-zero diagonal elements (σ1, σ2, σ3, …, σr) are the singular values of [D].

Then:

[D]+ (m×N) = [V] (m×m) [Σ]+ (m×N) [U]T (N×N)

where

[Σ]+ (m×N) = diag( σ1^-1, σ2^-1, σ3^-1, …, σr^-1, 0, 0, …, 0 )
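A minimal NumPy sketch of this construction, applied to an arbitrary rank-deficient example (rank 2 with 3 columns), where the left-inverse formula would fail:

```python
import numpy as np

# Arbitrary 4x3 matrix whose third column is the sum of the first two -> rank 2
D = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 1.0, 3.0],
              [1.0, 2.0, 3.0]])

U, s, Vt = np.linalg.svd(D)                   # [D] = [U][Sigma][V]T, s holds the singular values
tol = max(D.shape) * np.finfo(float).eps * s[0]
r = int(np.sum(s > tol))                      # numerical rank (here r = 2)

s_plus = np.zeros((D.shape[1], D.shape[0]))   # [Sigma]+ : m x N
s_plus[:r, :r] = np.diag(1.0 / s[:r])         # reciprocals of the non-zero singular values only

D_plus = Vt.T @ s_plus @ U.T                  # [D]+ = [V][Sigma]+[U]T
print(r)                                      # 2
print(np.allclose(D @ D_plus @ D, D))         # Moore-Penrose property [D][D]+[D] = [D] -> True
```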

Introduction to the Singular Value Decomposition (SVD) Technique

Main Applications

• Calculation of the rank of a matrix
• Calculation of condition numbers
• Calculation of generalised inverses

The Rank of a Matrix

An N×N matrix with all rows (or columns) linearly independent has rank N. If only r rows or columns are linearly independent, then the rank is r.

An N×m matrix where N ≤ m is of "full rank" if its rank equals N.

The classical procedure for calculating the rank of a matrix is by Gauss elimination. An N×N matrix of rank r (< N) will have (N - r) zero rows after a Gauss elimination.

If the rows are not exactly linearly dependent but are very close to it, there will be very small values (but not zeros) after a Gauss elimination. In these cases it is difficult to establish the rank of the matrix, and even more so if the elements are complex. The SVD makes this task much easier by working in terms of individual, real quantities (the singular values of the matrix).

The SVD of an N×m real matrix [A] is given by:

[A] (N×m) = [U] (N×N) [Σ] (N×m) [V]T (m×m)

where [U] and [V] are orthogonal matrices satisfying:

[U]T[U] = [U][U]T = [V]T[V] = [V] [V]T = [ I ]

and [U]T = [U]-1 and [V]T = [V]-1

Also, it can be noted that [U], [V] and [Σ] are all real.

The matrix [Σ] is the matrix of singular values of [A] (these are, in fact, the square roots of the eigenvalues of [A]T[A]), having the form:

If [A] is complex, [Σ] is still real, but [U] and [V] are complex and unitary matrices.

[Σ] (N×m) = diag( σ1, σ2, …, σm ), i.e. an N×m matrix with the singular values on the leading diagonal and zeros elsewhere.
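These properties can be verified numerically. The sketch below uses an arbitrary complex test matrix and checks that [U] and [V] come back unitary and that the singular values are real and non-negative:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3)) + 1j * rng.standard_normal((5, 3))  # arbitrary complex N x m

U, s, Vh = np.linalg.svd(A)                      # full SVD: U is 5x5, Vh is 3x3
print(np.allclose(U.conj().T @ U, np.eye(5)))    # [U]H[U] = I -> True (unitary)
print(np.allclose(Vh @ Vh.conj().T, np.eye(3)))  # [V]H[V] = I -> True (unitary)
print(np.isrealobj(s), np.all(s >= 0))           # singular values are real and non-negative
```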

Matrix Rank

The rank of matrix [A] is equal to the number of non-zero singular values.

An N×m matrix (m < N) of rank r (< m) will have r non-zero singular values and (m - r) zero or negligible values.

Comparison of the singular values permits establishment of the matrix rank. Usually this requires the specification of a threshold value below which singular values are deemed to be "zero".

[Σ] = diag( σ1, σ2, …, σr, "0", …, "0" ), where the trailing entries fall below the threshold and are treated as zero.

In modal analysis applications, the value of matrix rank can be associated with the number of genuine modes existing in a certain frequency range.
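A minimal sketch of this thresholding procedure, using arbitrary synthetic data of true rank 2 contaminated with a small amount of noise:

```python
import numpy as np

rng = np.random.default_rng(1)
# Arbitrary "measured" 6x4 matrix that is genuinely rank 2, plus a little noise
exact = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 4))
measured = exact + 1e-6 * rng.standard_normal((6, 4))

s = np.linalg.svd(measured, compute_uv=False)
threshold = 1e-4 * s[0]            # singular values below this are deemed "zero"
rank = int(np.sum(s > threshold))
print(s)                           # two significant values, two negligible ones
print(rank)                        # 2
```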

Condition Number

The condition number of a matrix can be expressed as:

cond [A] = σmax / σmin

where σmin is the smallest non-zero singular value. The condition number can be used as an indicator of potential computational difficulties (a high condition number reflects ill-conditioning of the matrix).
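For example (an arbitrary nearly-singular matrix; numpy.linalg.cond returns the same ratio directly):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])       # nearly singular, hence ill-conditioned

s = np.linalg.svd(A, compute_uv=False)
print(s[0] / s[-1])                 # sigma_max / sigma_min (about 4e4 here)
print(np.linalg.cond(A))            # same value from the library routine
```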

Generalised Inverse

One of the major applications of the SVD is the calculation of the generalised or pseudo-inverse of a matrix, a frequent requirement in many aspects of structural modelling. It is often necessary to "invert" a rectangular matrix when solving an over-determined set of equations, and the matrix involved may well be ill-conditioned, especially when it is populated with measured data containing noise or other imperfections.

To solve:

[A] (N×m) {x} (m×1) = {b} (N×1), where N > m

we can write:

{x} (m×1) = [A]+ (m×N) {b} (N×1)

where:

[A]+ = ([A]T [A])-1 [A]T

If [A] is not of full rank, the best way to determine its generalised inverse is via the SVD, as follows:

[A]+ = ([V]T)-1 [Σ]+ [U]-1 = [V] [Σ]+ [U]T

where [Σ]+ is an m×N "diagonal" matrix formed from the reciprocals of the non-zero singular values of [A], with the entries corresponding to zero singular values set to zero:

00"0".....

.......

00...00

00...00

1

12

11

r
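A short sketch of the over-determined solve, using an arbitrary tall system with a little noise on the right-hand side; np.linalg.pinv applies exactly this kind of SVD-based construction, discarding singular values below a relative cutoff:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((8, 3))                    # N x m, N > m
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 1e-3 * rng.standard_normal(8)     # "measured" right-hand side with noise

x = np.linalg.pinv(A) @ b                          # {x} = [A]+ {b}, least-squares solution
print(x)                                           # close to x_true
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))  # same as lstsq -> True
```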

Numerical Example

Application: to determine the harmonic force vector {f} applied to a structure, where the measured harmonic responses are {y} and the relevant FRF matrix is [H].

Given: {y} (5×1) = [H] (5×3) {f} (3×1)

where:

{y} = { 5  5  5  5  5 }T

[H] =
[ 1   6  11 ]
[ 2   7  12 ]
[ 3   8  13 ]
[ 4   9  14 ]
[ 5  10  15 ]

Find {f}.

Use the classical pseudo-inverse calculation:

{f} = { -0.47  0.31  0.31 }T

and recalculation of {y} using this force vector leads to:

{y} = { 4.84  5.00  5.16  5.31  5.47 }T

which is clearly incorrect (since it differs from the initial set).

Applying the SVD to [H] gives:

[Σ] =
[ 35.13   0       0             ]
[  0      2.465   0             ]
[  0      0       2.84 x 10^-15 ]

together with the corresponding orthonormal matrices [U] (5×5) and [V] (3×3).

From these results it is clear that the rank of [H] is 2 (and not 3). Thus, setting the third singular value to zero and then calculating [H]+, we find:

{f} = { -0.50  0.00  0.50 }T

and when this is used to recompute the response vector {y}, the original values are found exactly.
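This example is straightforward to reproduce with NumPy. The sketch below rebuilds [H] and {y} as given above, confirms that only two singular values are significant, and recovers a force vector that reproduces {y} exactly (np.linalg.pinv discards the negligible singular value via its default rcond cutoff):

```python
import numpy as np

H = np.array([[1.0,  6.0, 11.0],
              [2.0,  7.0, 12.0],
              [3.0,  8.0, 13.0],
              [4.0,  9.0, 14.0],
              [5.0, 10.0, 15.0]])
y = np.full(5, 5.0)

s = np.linalg.svd(H, compute_uv=False)
print(s)                        # approx [35.13, 2.465, ~1e-15] -> rank 2

f = np.linalg.pinv(H) @ y       # SVD-based pseudo-inverse with the negligible value dropped
print(f)                        # approx [-0.5, 0.0, 0.5]
print(np.allclose(H @ f, y))    # original responses recovered exactly -> True
```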

Other applications of SVD

Smoothing

It is possible to smooth a matrix containing measured (i.e. noisy) data by computing its SVD and then, after zeroing the negligible singular values, recomputing the matrix.
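A compact sketch of that idea, using arbitrary synthetic data: a known rank-2 matrix is contaminated with noise, the negligible singular values are zeroed, and the matrix is recomputed:

```python
import numpy as np

rng = np.random.default_rng(3)
# A known rank-2 "true" matrix, plus simulated measurement noise
clean = np.outer(np.arange(1.0, 7.0), [1.0, 0.0, 1.0, 0.0]) \
      + np.outer(np.ones(6), [0.0, 1.0, 0.0, 1.0])
noisy = clean + 0.01 * rng.standard_normal((6, 4))

U, s, Vt = np.linalg.svd(noisy, full_matrices=False)
print(s)                                 # two significant values, two at the noise level
s[s < 0.05 * s[0]] = 0.0                 # zero the negligible singular values
smoothed = U @ np.diag(s) @ Vt           # recompute the matrix from the truncated SVD

print(np.linalg.matrix_rank(smoothed))   # 2: the noise-only components have been removed
```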

Determinants

The determinant of a matrix can be found using the SVD; this is useful as an aid to solving for the values of z for which det [A(z)] vanishes. We can write:

det [A(z)] = det [U] det [Σ] det [V]T

But, since [U] and [V] are orthogonal:

det [U] = det [V] = ±1, so |det [A(z)]| is simply the product of the singular values of [A(z)].

By analysing the variation with z of the smallest singular value, σr, it is possible to identify those value(s) of z that make σr a minimum.

This procedure can be used in multi-point excitation applications to determine natural frequencies and hence modal force appropriations.
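As an illustration of the idea, the sketch below uses a hypothetical 2-DOF system and takes [A(z)] to be the dynamic stiffness matrix [K] - z²[M], whose determinant vanishes at the natural frequencies; the points where the smallest singular value dips to a local minimum recover those frequencies:

```python
import numpy as np

# Hypothetical 2-DOF mass and stiffness matrices (values chosen only for illustration)
M = np.diag([1.0, 1.0])
K = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])           # exact natural frequencies: 1.0 and sqrt(3) rad/s

z = np.linspace(0.5, 2.0, 601)
smallest_sv = [np.linalg.svd(K - w**2 * M, compute_uv=False)[-1] for w in z]

# Local minima of the smallest singular value mark the values of z where det[A(z)] vanishes
idx = [i for i in range(1, len(z) - 1)
       if smallest_sv[i] < smallest_sv[i - 1] and smallest_sv[i] < smallest_sv[i + 1]]
print(z[idx])                           # approximately [1.0, 1.73]
```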