Applied linear operators and spectral methods/Lecture 3


Review

In the last lecture we talked about norms in inner product spaces. The induced norm was defined as

๐ฑ=๐ฑ,๐ฑ=๐ฑ2.

We also talked about orthonormal bases and biorthonormal bases. The biorthonormal bases may be thought of as dual bases in the sense that covariant and contravariant vector bases are dual.

The last thing we talked about was the idea of a linear operator. Recall that

$$\boldsymbol{A}\varphi_j = \sum_i A_{ij}\,\varphi_i$$

where the summation is on the first index.

In this lecture we will learn about adjoint operators, Jacobi tridiagonalization, and a bit about the spectral theory of matrices.

Adjoint operator

Assume that we have a vector space with an orthonormal basis. Then

$$\langle \boldsymbol{A}\varphi_j, \varphi_i \rangle = \sum_k A_{kj}\,\langle \varphi_k, \varphi_i \rangle = \sum_k A_{kj}\,\delta_{ki} = A_{ij}$$
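As a sanity check, here is a short NumPy sketch (not from the lecture; the orthonormal basis is taken as the columns of a random unitary matrix, and the helper `inner` is illustrative) that recovers the matrix of an operator from these inner products:

```python
import numpy as np

def inner(u, v):
    # <u, v> = sum_k u_k conj(v_k): linear in the first argument,
    # conjugate-linear in the second, matching the convention above.
    return np.vdot(v, u)

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# An orthonormal basis: the columns of a unitary matrix from a QR factorization.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
phi = [Q[:, j] for j in range(n)]

# A_ij = <A phi_j, phi_i> is the matrix of the operator in this basis ...
A_mat = np.array([[inner(A @ phi[j], phi[i]) for j in range(n)] for i in range(n)])

# ... which is exactly Q^H A Q.
assert np.allclose(A_mat, Q.conj().T @ A @ Q)
```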

One specific matrix connected with $\boldsymbol{A}$ is the Hermitian conjugate matrix. This matrix is defined as

$$A^*_{ij} = \overline{A_{ji}}$$

The linear operator $\boldsymbol{A}^*$ connected with the Hermitian conjugate matrix is called the adjoint operator and is defined as

๐‘จ*=๐‘จT

Therefore,

$$\langle \boldsymbol{A}^*\varphi_j, \varphi_i \rangle = A^*_{ij} = \overline{A_{ji}}$$

and

φi,๐‘จ*φj=Aji=๐‘จφi,φj

More generally, if

$$\mathbf{f} = \sum_i \alpha_i\,\varphi_i \quad\text{and}\quad \mathbf{g} = \sum_j \beta_j\,\varphi_j$$

then

๐Ÿ,๐‘จ*๐ =i,jαiβjφi,๐‘จ*φj=i,jαiβj๐‘จφi,φj=๐‘จ๐Ÿ,๐ 

Since the above relation does not involve the basis we see that the adjoint operator is also basis independent.
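As a quick numerical illustration (a sketch, not part of the lecture), the identity $\langle \mathbf{f}, \boldsymbol{A}^*\mathbf{g} \rangle = \langle \boldsymbol{A}\mathbf{f}, \mathbf{g} \rangle$ can be checked with the adjoint taken as the conjugate transpose:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
f = rng.standard_normal(n) + 1j * rng.standard_normal(n)
g = rng.standard_normal(n) + 1j * rng.standard_normal(n)

def inner(u, v):
    return np.vdot(v, u)  # <u, v>, conjugate-linear in the second slot

A_star = A.conj().T  # adjoint = conjugate transpose in an orthonormal basis

# <f, A* g> equals <A f, g> for any f and g
assert np.isclose(inner(f, A_star @ g), inner(A @ f, g))
```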

Self-adjoint/Hermitian matrices

If ๐‘จ*=๐‘จ we say that ๐‘จ is self-adjoint, i.e., Aij=Aji in any orthonomal basis, and the matrix Aij is said to be Hermitian.

Anti-Hermitian matrices

A matrix $\boldsymbol{B}$ is anti-Hermitian if

$$B_{ij} = -\overline{B_{ji}}$$

There is a close connection between Hermitian and anti-Hermitian matrices. Consider a matrix $\boldsymbol{A} = i\boldsymbol{B}$. Then

$$A_{ij} = i\,B_{ij} = -i\,\overline{B_{ji}} = \overline{i\,B_{ji}} = \overline{A_{ji}} ,$$

so $\boldsymbol{A}$ is Hermitian.
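A short numerical check of this connection (an illustrative sketch; the anti-Hermitian matrix is built as $\boldsymbol{M} - \boldsymbol{M}^H$):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

B = M - M.conj().T              # anti-Hermitian by construction: B = -B^H
assert np.allclose(B, -B.conj().T)

A = 1j * B                      # multiplication by i yields a Hermitian matrix
assert np.allclose(A, A.conj().T)
```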

Jacobi Tridiagonalization

Let ๐‘จ be self-adjoint and suppose that we want to solve

(๐‘ฐλ๐‘จ)๐ฒ=๐›

where $\lambda$ is a constant. We expect that

๐ฒ=(๐‘ฐλ๐‘จ)1๐›

If λ๐š is "sufficiently" small, then

๐ฒ=(๐‘ฐ+λ๐‘จ+λ2๐‘จ2+λ3๐‘จ3+)๐›

This suggests that the solution should be in the subspace spanned by $\mathbf{b}, \boldsymbol{A}\mathbf{b}, \boldsymbol{A}^2\mathbf{b}, \dots$
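As an illustration (a sketch assuming $|\lambda|\,\lVert\boldsymbol{A}\rVert < 1$ so that the series converges), the truncated series agrees with a direct solve:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 6
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                      # self-adjoint (real symmetric) example
lam = 0.5 / np.linalg.norm(A, 2)       # so |lam| * ||A||_2 = 0.5 < 1
b = rng.standard_normal(n)

# Truncated Neumann series y = (I + lam A + lam^2 A^2 + ...) b
y = np.zeros(n)
term = b.copy()
for _ in range(60):
    y += term
    term = lam * (A @ term)            # next term lam^k A^k b

y_direct = np.linalg.solve(np.eye(n) - lam * A, b)
assert np.allclose(y, y_direct)
```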

Let us apply the Gram-Schmidt orthogonalization procedure where

๐ฑ1=๐›,๐ฑ2=๐‘จ๐›,๐ฑ3=๐‘จ2๐›,

Then we have

φn=๐‘จn๐›j=1n1๐‘จn๐›,φjφj2φj

This is clearly a linear combination of $\mathbf{x}_1, \mathbf{x}_2, \dots, \mathbf{x}_n$. Therefore, $\boldsymbol{A}\varphi_n$ is a linear combination of $\boldsymbol{A}\mathbf{x}_1, \boldsymbol{A}\mathbf{x}_2, \dots, \boldsymbol{A}\mathbf{x}_n$, i.e., of $\mathbf{x}_2, \mathbf{x}_3, \dots, \mathbf{x}_{n+1}$. This is the same as saying that $\boldsymbol{A}\varphi_n$ is a linear combination of $\varphi_1, \varphi_2, \dots, \varphi_{n+1}$.

Therefore,

$$\langle \boldsymbol{A}\varphi_n, \varphi_k \rangle = 0 \quad \text{if} \quad k > n+1$$

Now,

Akn=๐‘จφn,φk

But the self-adjointness of $\boldsymbol{A}$ implies that

$$A_{nk} = \overline{A_{kn}}$$

So $A_{kn} = 0$ if $k > n+1$ or $n > k+1$. This is equivalent to expressing the operator $\boldsymbol{A}$ as a tridiagonal matrix $\mathbf{A}$ which has the form

$$\mathbf{A} = \begin{bmatrix} \times & \times & 0 & \cdots & 0 \\ \times & \times & \times & \ddots & \vdots \\ 0 & \times & \times & \ddots & 0 \\ \vdots & \ddots & \ddots & \ddots & \times \\ 0 & \cdots & 0 & \times & \times \end{bmatrix}$$

In general, the matrix can be represented in block tridiagonal form.
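The construction above is essentially the Lanczos process. Here is an illustrative, non-optimized sketch: orthonormalize the Krylov vectors $\mathbf{b}, \boldsymbol{A}\mathbf{b}, \boldsymbol{A}^2\mathbf{b}, \dots$ by Gram-Schmidt and form the matrix of $\boldsymbol{A}$ in the resulting basis; it comes out tridiagonal.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 6
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                      # self-adjoint (real symmetric)
b = rng.standard_normal(n)

# Gram-Schmidt on the Krylov vectors x_k = A^{k-1} b
phi = []
x = b.copy()
for _ in range(n):
    v = x.copy()
    for q in phi:
        v -= np.dot(v, q) * q          # remove components along earlier phi_j
    phi.append(v / np.linalg.norm(v))  # normalize
    x = A @ x                          # next Krylov vector

Q = np.column_stack(phi)
T = Q.T @ A @ Q                        # matrix of A in the Krylov basis

# entries with |row - column| > 1 vanish (up to roundoff): T is tridiagonal
i, j = np.indices((n, n))
assert np.max(np.abs(T[np.abs(i - j) > 1])) < 1e-8
```

In practice the Lanczos recursion computes only the three nonzero diagonals directly; the full Gram-Schmidt version above simply makes it visible that all other entries vanish.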

Another consequence of the Gram-Schmidt orthogonalization is the following.

Lemma:

Every finite dimensional inner-product space has an orthonormal basis.

Proof:

The proof is trivial. Just use Gram-Schmidt on any basis for that space and normalize.
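A minimal sketch of the construction in the proof, orthonormalizing the columns of a random (hence, with probability one, invertible) matrix:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 5
V = rng.standard_normal((n, n))        # columns form an arbitrary basis

E = []
for j in range(n):
    v = V[:, j].copy()
    for e in E:
        v -= np.dot(v, e) * e          # subtract projections on earlier vectors
    E.append(v / np.linalg.norm(v))    # normalize

Q = np.column_stack(E)
assert np.allclose(Q.T @ Q, np.eye(n)) # the result is an orthonormal basis
```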

A corollary of this is the following theorem.

Theorem:

Every finite dimensional inner product space is complete.

Recall that a space is complete if the limit of any Cauchy sequence of elements from that space lies within the space.

Proof:

Let {๐ฎk} be a Cauchy sequence of elements in the subspace ๐’ฎn with k=1,,. Also let {๐ž1,๐ž2,,๐žn} be an orthonormal basis for the subspace ๐’ฎn.

Then

$$\mathbf{u}_k = \sum_{j=1}^n \alpha_{kj}\,\mathbf{e}_j$$

where

αki=๐ฎk,๐ži

By the Schwarz inequality

$$|\alpha_{ki} - \alpha_{pi}| = |\langle \mathbf{u}_k, \mathbf{e}_i \rangle - \langle \mathbf{u}_p, \mathbf{e}_i \rangle| = |\langle \mathbf{u}_k - \mathbf{u}_p, \mathbf{e}_i \rangle| \le \lVert \mathbf{u}_k - \mathbf{u}_p \rVert \to 0 \quad \text{as } k, p \to \infty$$

Therefore,

$$\alpha_{ki} - \alpha_{pi} \to 0 .$$

But the $\alpha$'s are just numbers. So, for fixed $i$, $\{\alpha_{ki}\}$ is a Cauchy sequence in $\mathbb{R}$ (or $\mathbb{C}$) and so converges to a number $\alpha_i$ as $k \to \infty$, i.e.,

limk๐ฎk=i=1nαi๐ži

which is in the subspace $\mathcal{S}_n$.

Spectral theory for matrices

Suppose ๐‘จ๐ฑ=๐› is expressed in coordinates relative to some basis φ1,φ2,,φn, i.e.,

$$\boldsymbol{A}\varphi_j = \sum_i A_{ij}\,\varphi_i; \qquad \mathbf{x} = \sum_i x_i\,\varphi_i; \qquad \mathbf{b} = \sum_i b_i\,\varphi_i$$

Then

๐‘จ๐ฑ=๐‘จjxjφj=jxj(Aφj)=i,jxjAijφi

So ๐‘จ๐ฑ=๐› implies that

$$\sum_j A_{ij}\,x_j = b_i$$

Now let us try to see the effect of a change of basis to a new basis $\varphi'_1, \varphi'_2, \dots, \varphi'_n$ with

$$\varphi'_i = \sum_{j=1}^n C_{ji}\,\varphi_j$$

For the new basis to be linearly independent, $\mathbf{C}$ should be invertible so that

$$\varphi_j = \sum_{m=1}^n C^{-1}_{mj}\,\varphi'_m$$

Now,

$$\mathbf{x} = \sum_j x_j\,\varphi_j = \sum_{i,j} x_j\,C^{-1}_{ij}\,\varphi'_i = \sum_i x'_i\,\varphi'_i$$

Hence

$$x'_i = \sum_j C^{-1}_{ij}\,x_j$$

Similarly,

$$b'_i = \sum_j C^{-1}_{ij}\,b_j$$

Therefore

$$\boldsymbol{A}\varphi'_i = \sum_{j=1}^n C_{ji}\,\boldsymbol{A}\varphi_j = \sum_{j,k} C_{ji} A_{kj}\,\varphi_k = \sum_{j,k,m} C_{ji} A_{kj} C^{-1}_{mk}\,\varphi'_m = \sum_m A'_{mi}\,\varphi'_m$$

So we have

$$A'_{mi} = \sum_{j,k} C^{-1}_{mk}\,A_{kj}\,C_{ji}$$

In matrix form,

๐ฑ'=๐‚1๐ฑ;๐›'=๐‚1๐›;๐€'=๐‚1๐€๐‚

where the objects here are not operators or vectors but rather the matrices and vectors representing them. They are therefore basis dependent.

In other words, the matrix equation $\mathbf{A}\mathbf{x} = \mathbf{b}$ becomes $\mathbf{A}'\mathbf{x}' = \mathbf{b}'$ in the new basis; the form of the equation is unchanged.
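A quick numerical check of these transformation rules (an illustrative sketch; $\mathbf{C}$ is a random invertible matrix):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 5
A = rng.standard_normal((n, n))
x = rng.standard_normal(n)
b = A @ x                              # so that A x = b by construction

C = rng.standard_normal((n, n))        # invertible with probability one
C_inv = np.linalg.inv(C)

A_new = C_inv @ A @ C                  # A' = C^{-1} A C
x_new = C_inv @ x                      # x' = C^{-1} x
b_new = C_inv @ b                      # b' = C^{-1} b

# the equation has the same form in the new basis: A' x' = b'
assert np.allclose(A_new @ x_new, b_new)
```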


Similarity transformation

The transformation

๐€'=๐‚1๐€๐‚

is called a similarity transformation. Two matrices are equivalent (or similar) if there is a similarity transformation between them.

Diagonalizing a matrix

Suppose we want to find a similarity transformation which makes $\boldsymbol{A}$ diagonal, i.e.,

$$\mathbf{A}' = \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{bmatrix} = \boldsymbol{\Lambda}$$

Then,

๐€๐‚=๐‚๐€'=๐‚Λ

Let us write ๐‚ (which is a n×n matrix) in terms of its columns

$$\mathbf{C} = \begin{bmatrix} \mathbf{x}_1 & \mathbf{x}_2 & \cdots & \mathbf{x}_n \end{bmatrix}$$

Then,

$$\mathbf{A}\begin{bmatrix} \mathbf{x}_1 & \mathbf{x}_2 & \cdots & \mathbf{x}_n \end{bmatrix} = \begin{bmatrix} \mathbf{x}_1 & \mathbf{x}_2 & \cdots & \mathbf{x}_n \end{bmatrix}\boldsymbol{\Lambda} = \begin{bmatrix} \lambda_1\mathbf{x}_1 & \lambda_2\mathbf{x}_2 & \cdots & \lambda_n\mathbf{x}_n \end{bmatrix}$$

i.e.,

$$\mathbf{A}\,\mathbf{x}_i = \lambda_i\,\mathbf{x}_i$$

The pair (λ,๐ฑ) is said to be an eigenvalue pair if ๐€๐ฑ=λ๐ฑ where ๐ฑ is an eigenvector and λ is an eigenvalue.

Since (๐€λ๐ˆ)=๐ŸŽ this means that λ is an eigenvalue if and only if

det(๐€λ๐ˆ)=๐ŸŽ

The quantity on the left-hand side is called the characteristic polynomial and has $n$ roots (counting multiplicities).

In $\mathbb{C}$ the characteristic polynomial always has at least one root. For that root $\mathbf{A} - \lambda\mathbf{I}$ is singular, i.e., there always exists at least one eigenvector.
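A sketch tying these statements together numerically, using NumPy's eigendecomposition (illustrative only; a symmetric matrix is used so that a full set of eigenvectors exists):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 4
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                      # symmetric, hence diagonalizable

lams, C = np.linalg.eig(A)             # columns of C are the eigenvectors x_i
Lam = np.diag(lams)

assert np.allclose(A @ C, C @ Lam)     # A x_i = lam_i x_i, column by column
assert np.allclose(np.linalg.inv(C) @ A @ C, Lam)  # C^{-1} A C is diagonal

# each eigenvalue is a root of the characteristic polynomial det(A - lam I)
for lam in lams:
    assert abs(np.linalg.det(A - lam * np.eye(n))) < 1e-8
```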

We will delve a bit more into the spectral theory of matrices in the next lecture.