Linear Algebra, Slide 22: Linear Transformation. Fall 2020, Sharif University of Technology. Hamid Reza Rabiee
Transcript
  • Linear Algebra

    Slide 22: Linear Transformation

    Fall 2020

    Sharif University of Technology

    Hamid Reza Rabiee

  • Linear Transformations and their

    Matrix Representation


  • Definition of Linear Transformation

    • A transformation T assigns an output T(v) to each input vector v ∈ V.

    • A transformation is linear if it satisfies these two conditions:

      • T(v + w) = T(v) + T(w)
      • ∀c: T(cv) = c T(v)


  • Definition of Linear Transformation (contd.)

    • If T is a linear transformation, then:

      T(cv + dw) = c T(v) + d T(w)

      Proof.
      (By condition 1): T(cv + dw) = T(cv) + T(dw)
      (By condition 2): T(cv) + T(dw) = c T(v) + d T(w)

    • One can easily see that T(0) = 0: taking c = 0 in condition 2 gives T(0) = T(0·v) = 0·T(v) = 0. If T(0) ≠ 0, both conditions would be violated.
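    The two conditions can be checked numerically. A minimal sketch, assuming T is multiplication by an arbitrary example matrix A (not from the slides):

```python
import numpy as np

# Hypothetical example matrix; any matrix defines a linear map T(v) = A v.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
T = lambda v: A @ v

v = np.array([1.0, -2.0])
w = np.array([4.0, 0.5])
c = 7.0

cond1 = np.allclose(T(v + w), T(v) + T(w))    # additivity
cond2 = np.allclose(T(c * v), c * T(v))       # homogeneity
maps_zero = np.allclose(T(np.zeros(2)), 0.0)  # T(0) = 0 follows
```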


  • Example

    • The shift operation is not a linear transformation:

      T(v) = v + w0,  w0 ≠ 0

      T(v + u) = v + u + w0 ≠ (v + w0) + (u + w0) = T(v) + T(u)
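    The same numerical check exposes the failure; a sketch with an arbitrary nonzero shift w0 (an assumed example):

```python
import numpy as np

# Shift by a fixed nonzero vector w0 (example values, not from the slides).
w0 = np.array([1.0, 1.0])
T = lambda v: v + w0

v = np.array([2.0, 0.0])
u = np.array([0.0, 3.0])

additive = np.allclose(T(v + u), T(v) + T(u))  # False: w0 gets added twice
fixes_zero = np.allclose(T(np.zeros(2)), 0.0)  # False: T(0) = w0 != 0
```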


  • Example


    • The dot product with a fixed vector a is a linear transformation:

      T(v) = a · v
      T(cv) = a · (cv) = c (a · v) = c T(v)
      T(v + u) = a · (v + u) = a · v + a · u = T(v) + T(u)

  • Example


    • Rotation is a linear transformation: the sum of two vectors, when rotated, equals the sum of their rotations.

    • In this linear transformation the whole plane turns together.

  • Example (contd.)


    • Every point on the input line goes onto the output line.

    • Equally spaced points go to equally spaced points.

    • A triangle in the input goes to a triangle in the output.

  • Linear Transformation and Basis


    Let

    • T be a linear transformation
    • u = c1·v1 + c2·v2 + ⋯ + cn·vn

    Then by linearity we have:

      T(u) = c1·T(v1) + c2·T(v2) + ⋯ + cn·T(vn)

    So, if we know T(vi) for all vectors v1, v2, ⋯, vn in a basis, then we know T(u) for every vector u in the space.
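    As a numerical illustration (the matrix and basis below are assumed examples, not from the slides), knowing T on a basis determines T everywhere:

```python
import numpy as np

# Example linear map T(v) = A v and a (non-standard) basis of R^2.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
v1 = np.array([1.0, 0.0])
v2 = np.array([1.0, 1.0])
c1, c2 = 2.0, -1.0

u = c1 * v1 + c2 * v2
direct   = A @ u                          # T(u) computed directly
by_basis = c1 * (A @ v1) + c2 * (A @ v2)  # c1 T(v1) + c2 T(v2)
```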

  • Example


    • Consider the space of all polynomials of degree at most 2.

    • Let T be the derivative transformation, T(u) = du/dx, and suppose we want to find the derivative of u = 6 − 4x + 10x².

    • We intuitively start with the derivatives of 1, x, x², because those are the basis vectors of the space. Their derivatives are 0, 1, 2x, respectively.

      du/dx = 6 (der. of 1) − 4 (der. of x) + 10 (der. of x²) = −4 + 20x
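    The slide's computation can be carried out on coefficient lists; a sketch representing a polynomial a0 + a1·x + a2·x² as [a0, a1, a2]:

```python
# Derivatives of the basis 1, x, x^2, written as coefficient lists.
basis_derivs = [
    [0, 0, 0],   # d/dx 1   = 0
    [1, 0, 0],   # d/dx x   = 1
    [0, 2, 0],   # d/dx x^2 = 2x
]
coeffs = [6, -4, 10]   # u = 6 - 4x + 10x^2

# Linearity: du/dx = 6*(der. of 1) - 4*(der. of x) + 10*(der. of x^2)
du = [sum(c * d[i] for c, d in zip(coeffs, basis_derivs)) for i in range(3)]
# du == [-4, 20, 0], i.e. du/dx = -4 + 20x
```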

  • Example (contd.)


    • Nullspace of T(u) = du/dx:

      • We must solve T(u) = 0. The derivative of a polynomial is zero exactly when the polynomial is constant, so the multiples of u = 1 make up the nullspace. The nullspace is therefore one-dimensional.

    • Column space of T(u) = du/dx:

      • The input space contains all quadratics a + bx + cx², so the output space contains all linear polynomials b + 2cx. Note that the column space is two-dimensional.

    • dim(nullspace) + dim(column space) = 1 + 2 = 3 = dim(input space)

    • Note: transformations have a language of their own. For a transformation T, the nullspace is called the Kernel of T and the column space is called the Range of T.

  • Example


    We can express the derivative transformation by matrix multiplication:

      u = a + bx + cx²

            [0 1 0] [a]   [ b]
      A u = [0 0 2] [b] = [2c]
                    [c]

      du/dx = b + 2c·x
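    The same multiplication in code, using the 2×3 matrix from the slide on the earlier example u = 6 − 4x + 10x²:

```python
import numpy as np

# Derivative of a + b x + c x^2, acting on the coefficient vector (a, b, c).
A = np.array([[0, 1, 0],
              [0, 0, 2]])
u = np.array([6, -4, 10])   # u = 6 - 4x + 10x^2

du = A @ u                  # coefficients (b, 2c) of b + 2c x
# du == [-4, 20], i.e. du/dx = -4 + 20x
```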

  • Important Example


    • Suppose A is a matrix. Then T(v) = Av is a linear transformation, because A(cv) = cAv and A(v + u) = Av + Au.

    • Let A be invertible. Multiplication by A⁻¹ is also a linear transformation.

    • The inverse transformation of T is written T⁻¹; it brings every vector T(v) back to v:

      T⁻¹(T(v)) = v        A⁻¹(Av) = v
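    A quick numerical check of the last identity, assuming an arbitrary invertible example matrix:

```python
import numpy as np

# Invertible example matrix (det = 1); its inverse undoes T(v) = A v.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = np.linalg.inv(A)

v = np.array([3.0, -5.0])
recovered = A_inv @ (A @ v)   # A^{-1}(A v) = v
```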

    • Are all linear transformations from V = Rⁿ to W = Rᵐ produced by matrices?

  • The Matrix of a Linear Transformation


    • We want to represent a linear transformation T(v) as Av, where A is a matrix.

    • For ordinary column vectors, the input v is in V = Rⁿ and the output T(v) is in W = Rᵐ. The matrix A for this transformation will be m by n. Our choice of bases in V and W determines A.

  • The Matrix of a Linear Transformation (contd.)


    • The standard basis vectors for Rⁿ and Rᵐ are the columns of I. That choice leads to a matrix A with T(v) = Av. But these spaces also have other bases, so the same transformation T is also represented by other matrices.

    • All vector spaces V and W have bases, and each choice of bases leads to a matrix for T. When the input basis differs from the output basis, the matrix for T(v) = v will not be the identity I.

  • The Matrix of a Linear Transformation (contd.)


    • Suppose we know T(v) for the input basis vectors v1 to vn. Columns 1 to n of the matrix will contain those outputs T(v1) to T(vn).

    • Every combination of those n vectors can be represented by a coefficient vector c multiplied by the matrix A.

    • Ac is the correct combination c1·T(v1) + ⋯ + cn·T(vn) = T(v).

    • Reason: every v is a unique combination c1·v1 + ⋯ + cn·vn of the basis vectors vj.

    • Since T is a linear transformation, T(v) must be the same combination c1·T(v1) + ⋯ + cn·T(vn) of the outputs T(vj) in the columns.

  • Example


    • Let the input space be V = R², the output space be W = R², and T(v) = v. In the standard basis the matrix for this transformation is clearly I; here, however, we use other bases:

      Input basis:  [v1 v2] = [3 6]
                              [3 8]

      Output basis: [w1 w2] = [3 0]
                              [1 2]

      v1 = 1·w1 + 1·w2
      v2 = 2·w1 + 3·w2

    • The input basis is written in terms of the output basis, because T is applied to each input basis vector and the result is expressed in terms of the output basis.

  • Example (contd.)


      Input basis:  [v1 v2] = [3 6]
                              [3 8]

      Output basis: [w1 w2] = [3 0]
                              [1 2]

      v1 = 1·w1 + 1·w2
      v2 = 2·w1 + 3·w2

    • If B is the change-of-basis matrix, we have WB = V ⇒ B = W⁻¹V:

      [w1 w2] B = [v1 v2]

      [3 0] [1 2]   [3 6]
      [1 2] [1 3] = [3 8]

  • “Change of Basis” Matrix


    • The change-of-basis matrix is B = W⁻¹V:

      u = c1·v1 + ⋯ + cn·vn
      u = d1·w1 + ⋯ + dn·wn

      [v1 ⋯ vn] (c1, …, cn)ᵀ = [w1 ⋯ wn] (d1, …, dn)ᵀ

      Vc = Wd ⇒ d = W⁻¹Vc, so the change-of-basis matrix is B = W⁻¹V.
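    The formula can be verified on the example bases from the previous slide:

```python
import numpy as np

# Bases from the slide's example, basis vectors as columns.
V = np.array([[3.0, 6.0],
              [3.0, 8.0]])   # input basis v1, v2
W = np.array([[3.0, 0.0],
              [1.0, 2.0]])   # output basis w1, w2

B = np.linalg.inv(W) @ V     # change-of-basis matrix B = W^{-1} V
# Column j of B holds the coordinates of v_j in the w-basis.
```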

  • Constructing Transformation Matrix


    • T is a linear transformation from the n-dimensional space V to the m-dimensional space W. We choose a basis v1, …, vn for V and a basis w1, …, wm for W. The matrix A for the transformation T will be m × n. To find the first column of this matrix, we apply T to v1 and write the output T(v1) in the W basis:

      T(v1) is a combination a11·w1 + ⋯ + am1·wm of the output basis for W; a11, ⋯, am1 are the elements of the first column of A.

    • With this method, we can find all the columns of A by applying T to v1, …, vn and putting the coefficients of each answer, written in terms of w1, …, wm, in the columns of A.

      Key rule: the j-th column of A is found by applying T to the j-th basis vector vj.
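    The procedure can be sketched in code. The map and bases below are assumed examples; each column of A solves W·a_j = T(v_j):

```python
import numpy as np

# Hypothetical linear map T(v) = M v on R^2 and example bases (as columns).
M = np.array([[1.0, 2.0],
              [0.0, 1.0]])
V = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # input basis v1, v2
W = np.array([[2.0, 0.0],
              [0.0, 1.0]])   # output basis w1, w2

# Column j of A = coordinates of T(v_j) in the output basis: solve W a_j = M v_j.
A = np.column_stack([np.linalg.solve(W, M @ V[:, j]) for j in range(2)])
# Closed form for comparison: A = W^{-1} M V
```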

  • Chained Transformations and Matrix Multiplication


    • Consider two linear transformations T and S with their matrices A and B. Now we compare the composition TS with the multiplication AB:

    • When applying the transformation T to the output from S, we get TS by this rule: (TS)(u) is defined to be T(S(u)). The output S(u) becomes the input to T.

    • When applying the matrix A to the output from B, we multiply AB by this rule: (AB)(x) is defined to be A(Bx). The output Bx becomes the input to A.

    • Unsurprisingly, matrix multiplication gives the correct matrix AB to represent TS.

  • Chained Transformations and Matrix Multiplication (contd.)


    • The transformation S is from a space U to V. Its matrix B uses a basis u1, …, up for U and a basis v1, …, vn for V; this matrix is n × p. The transformation T is from V to W. Its matrix A must use the same basis v1, …, vn for V and a basis w1, …, wm for W; this matrix is m × n. This way, the matrix AB correctly represents TS:

      TS: U → V → W        AB: (m × n)(n × p) = (m × p)

  • Example


    • Transformation T rotates vectors in R² by θ, and transformation S rotates vectors in R² by φ, so TS rotates by θ + φ:

      A = [cos θ  −sin θ]        B = [cos φ  −sin φ]
          [sin θ   cos θ]            [sin φ   cos φ]

      AB = [cos θ cos φ − sin θ sin φ   −(sin θ cos φ + cos θ sin φ)]
           [sin θ cos φ + cos θ sin φ     cos θ cos φ − sin θ sin φ ]

         = [cos(θ + φ)  −sin(θ + φ)]
           [sin(θ + φ)   cos(θ + φ)]

      which is rotation by θ + φ.
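    The identity checks numerically for any pair of angles; a sketch with example values:

```python
import numpy as np

def rotation(angle):
    """2x2 matrix rotating the plane counterclockwise by `angle` radians."""
    return np.array([[np.cos(angle), -np.sin(angle)],
                     [np.sin(angle),  np.cos(angle)]])

theta, phi = 0.7, 0.4                # example angles
A, B = rotation(theta), rotation(phi)

composed = A @ B                     # matrix of the composition TS
```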

  • Eigenvector Basis and Singular Value Basis


    • We want to find bases that diagonalize the matrix. With the standard basis (the columns of I), our transformation T usually produces a non-diagonal matrix A, so we must choose different bases. Two great choices are eigenvectors and singular vectors.

    • If T transforms Rⁿ to Rⁿ in the standard basis, its matrix A is square, but probably not diagonal. If there are n independent eigenvectors, choose those as both the input and the output basis. In this basis, the matrix for T is the diagonal eigenvalue matrix Λ.

  • Example (projection)


    • Transformation T projects every v = (x, y) in R² onto the line y = −x. Using the standard basis, v1 = (1, 0) ⇒ T(v1) = (1/2, −1/2) and v2 = (0, 1) ⇒ T(v2) = (−1/2, 1/2), so

      A = [ 1/2  −1/2]
          [−1/2   1/2]

      The eigenvectors of this matrix are w1 = (1, −1), w2 = (1, 1), with eigenvalues λ1 = 1, λ2 = 0. If we choose w1, w2 as basis vectors, then the matrix for the transformation T will be:

      Λ = [1 0]
          [0 0]
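    The diagonalization can be confirmed numerically with the slide's matrix and eigenvectors:

```python
import numpy as np

# Projection onto the line y = -x, from the slide.
A = np.array([[ 0.5, -0.5],
              [-0.5,  0.5]])
W = np.array([[ 1.0, 1.0],
              [-1.0, 1.0]])       # eigenvectors w1 = (1,-1), w2 = (1,1) as columns

Lambda = np.linalg.inv(W) @ A @ W  # matrix of T in the eigenvector basis
```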

  • Singular Value Basis

    • The SVD gives us U⁻¹AV = Σ. The right singular vectors v1, v2, …, vn will be the input basis and the left singular vectors u1, …, um will be the output basis. With this method we will have:

      B_out⁻¹ A B_in = U⁻¹AV = Σ

      where B_out and B_in are the matrices with the output and input basis vectors as their columns, respectively.
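    A numerical check with NumPy's SVD, on an arbitrary example matrix (an assumption):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))          # arbitrary 3x2 example matrix

U, s, Vt = np.linalg.svd(A, full_matrices=True)
B_in, B_out = Vt.T, U                    # input: right singular vectors; output: left

Sigma = np.linalg.inv(B_out) @ A @ B_in  # diagonal matrix of singular values
```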


  • Summary

    • A transformation T is linear if T(v + w) = T(v) + T(w) and T(cv) = c T(v).

    • T(c1·v1 + ⋯ + cn·vn) = c1·T(v1) + ⋯ + cn·T(vn).

    • If we know T(v1), ⋯, T(vn) for a basis v1, v2, …, vn, linearity determines T(v) for all v.

    • A linear transformation T from an input basis v1, …, vn to an output basis w1, …, wm can be represented by an m × n matrix.

    • If matrices A and B represent transformations T and S, and the output basis of S is the input basis for T, the matrix AB represents T(S(v)).

    • The best bases for diagonalizing the transformation matrix are eigenvectors and singular vectors.


  • Sources / for more details

    • Introduction to Linear Algebra, Strang, Sections 8.1 and 8.2

    • MIT Linear Algebra Course, Strang

    • “Matrix Representations of Linear Transformations and Changes of Coordinates” handouts, Colorado University


  • Special Thanks to the following TAs for this presentation:

    • Amirmahdi Namjoo


