Applied linear operators and spectral methods/Lecture 2


Norms in inner product spaces

A vector $\mathbf{x}$ in $\mathbb{R}^n$ (or $\mathbb{C}^n$) with components $x_k$ has $L_p$ norms, which are defined as

$\lVert\mathbf{x}\rVert_p = \Bigl(\sum_k |x_k|^p\Bigr)^{1/p}, \qquad p = 1, 2, \ldots$

When $p=1$, we get the $L_1$ norm

$\lVert\mathbf{x}\rVert_1 = \sum_k |x_k|$

When $p=2$, we get the $L_2$ norm

$\lVert\mathbf{x}\rVert_2 = \Bigl(\sum_k |x_k|^2\Bigr)^{1/2} = \sqrt{\langle\mathbf{x},\mathbf{x}\rangle}$

In the limit as $p \to \infty$ we get the $L_\infty$ norm, or the sup norm,

$\lVert\mathbf{x}\rVert_\infty = \max_k |x_k|$

The adjacent figure shows a geometric interpretation of the three norms.

File:Normplot.png
Geometric interpretation of various norms
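As a quick numerical illustration (a sketch using numpy, which is not part of the lecture), the three norms can be computed directly from the components of a vector:

```python
import numpy as np

x = np.array([3.0, -4.0, 1.0])

# L1 norm: sum of absolute values
norm1 = np.sum(np.abs(x))          # 8.0

# L2 (Euclidean) norm: square root of the inner product <x, x>
norm2 = np.sqrt(np.dot(x, x))      # sqrt(26)

# Sup (L-infinity) norm: largest absolute component
norm_inf = np.max(np.abs(x))       # 4.0

# numpy provides the same norms directly
assert np.isclose(norm1, np.linalg.norm(x, 1))
assert np.isclose(norm2, np.linalg.norm(x, 2))
assert np.isclose(norm_inf, np.linalg.norm(x, np.inf))
```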

If a vector space has an inner product, then the norm

$\lVert\mathbf{x}\rVert = \sqrt{\langle\mathbf{x},\mathbf{x}\rangle} = \lVert\mathbf{x}\rVert_2$

is called the induced norm. Clearly, the induced norm is nonnegative and zero only if $\mathbf{x} = \mathbf{0}$. It is also absolutely homogeneous under scalar multiplication: $\lVert\alpha\mathbf{x}\rVert = |\alpha|\,\lVert\mathbf{x}\rVert$. You can think of the induced norm as a measure of length for the vector space.

Some useful results that follow from the definition of the norm are discussed below.

Schwarz inequality

In an inner product space

|๐ฑ,๐ฒ|๐ฑ๐ฒ

Proof

This statement is true if $\mathbf{y} = \mathbf{0}$.

If ๐ฒ๐ŸŽ we have

0<๐ฑα๐ฒ2=(๐ฑα๐ฒ),(๐ฑα๐ฒ)=๐ฑ,๐ฑ๐ฑ,α๐ฒα๐ฒ,๐ฑ+|α2|๐ฒ,๐ฒ

Now, with the convention that the inner product is linear in its first argument,

$\langle\mathbf{x},\alpha\mathbf{y}\rangle + \langle\alpha\mathbf{y},\mathbf{x}\rangle = \bar{\alpha}\,\langle\mathbf{x},\mathbf{y}\rangle + \alpha\,\overline{\langle\mathbf{x},\mathbf{y}\rangle} = 2\,\text{Re}\bigl(\bar{\alpha}\,\langle\mathbf{x},\mathbf{y}\rangle\bigr)$

Therefore,

$\lVert\mathbf{x}\rVert^2 - 2\,\text{Re}\bigl(\bar{\alpha}\,\langle\mathbf{x},\mathbf{y}\rangle\bigr) + |\alpha|^2\,\lVert\mathbf{y}\rVert^2 \ge 0$

Let us choose $\alpha$ so that it minimizes the left-hand side above. The minimizing value is

$\alpha = \dfrac{\langle\mathbf{x},\mathbf{y}\rangle}{\lVert\mathbf{y}\rVert^2}$

which gives us

$\lVert\mathbf{x}\rVert^2 - \dfrac{2\,|\langle\mathbf{x},\mathbf{y}\rangle|^2}{\lVert\mathbf{y}\rVert^2} + \dfrac{|\langle\mathbf{x},\mathbf{y}\rangle|^2}{\lVert\mathbf{y}\rVert^2} \ge 0$

Therefore,

$\lVert\mathbf{x}\rVert^2\,\lVert\mathbf{y}\rVert^2 \ge |\langle\mathbf{x},\mathbf{y}\rangle|^2$

and taking square roots gives the Schwarz inequality.

Triangle inequality

The triangle inequality states that

๐ฑ+๐ฒ๐ฑ+๐ฒ

Proof

๐ฑ+๐ฒ2=๐ฑ2+2Re๐ฑ,๐ฒ+๐ฒ2

From the Schwarz inequality, $\text{Re}\,\langle\mathbf{x},\mathbf{y}\rangle \le |\langle\mathbf{x},\mathbf{y}\rangle| \le \lVert\mathbf{x}\rVert\,\lVert\mathbf{y}\rVert$, so

$\lVert\mathbf{x}+\mathbf{y}\rVert^2 \le \lVert\mathbf{x}\rVert^2 + 2\,\lVert\mathbf{x}\rVert\,\lVert\mathbf{y}\rVert + \lVert\mathbf{y}\rVert^2 = \bigl(\lVert\mathbf{x}\rVert + \lVert\mathbf{y}\rVert\bigr)^2$

Hence

๐ฑ+๐ฒ๐ฑ+๐ฒ

Angle between two vectors

In โ„2 or โ„3 we have

cosθ=๐ฑ,๐ฒ๐ฑ๐ฒ

So it makes sense to define $\cos\theta$ in this way for any real inner product space; the Schwarz inequality guarantees that the ratio lies in $[-1, 1]$.

We then have

๐ฑ+๐ฒ2=๐ฑ2+2๐ฑ๐ฒcosθ+๐ฒ2

Orthogonality

In particular, if $\cos\theta = 0$ we have an analog of the Pythagorean theorem:

$\lVert\mathbf{x}+\mathbf{y}\rVert^2 = \lVert\mathbf{x}\rVert^2 + \lVert\mathbf{y}\rVert^2$

In that case the vectors are said to be orthogonal.

If ๐ฑ,๐ฒ=0 then the vectors are said to be orthogonal even in a complex vector space.

Orthogonal vectors have a lot of nice properties.

Linear independence of orthogonal vectors

  • A set of nonzero orthogonal vectors is linearly independent.

Suppose the vectors $\varphi_i$ satisfy a linear relation

$\alpha_1\varphi_1 + \alpha_2\varphi_2 + \cdots + \alpha_n\varphi_n = 0$

If the $\varphi_i$ are orthogonal, then taking an inner product with $\varphi_j$ gives

$\alpha_j\,\langle\varphi_j,\varphi_j\rangle = 0 \quad\Longrightarrow\quad \alpha_j = 0 \quad \forall\, j$

since

$\langle\varphi_i,\varphi_j\rangle = 0 \quad\text{if}\quad i \ne j$

and $\langle\varphi_j,\varphi_j\rangle \ne 0$ for nonzero vectors. Therefore only the trivial combination vanishes, and the vectors are linearly independent.

Expressing a vector in terms of an orthogonal basis

If we have a basis $\{\varphi_1, \varphi_2, \ldots, \varphi_n\}$ and wish to express a vector $\mathbf{f}$ in terms of it, we write

$\mathbf{f} = \sum_{j=1}^n \beta_j\,\varphi_j$

The problem is to find the coefficients $\beta_j$.

If we take the inner product with $\varphi_i$, we get

$\langle\mathbf{f},\varphi_i\rangle = \sum_{j=1}^n \beta_j\,\langle\varphi_i,\varphi_j\rangle$

In matrix form,

η=๐‘ฉβ

where Bij=φi,φj and ηi=๐Ÿ,φi.

Generally, getting the $\beta_j$'s involves inverting the $n \times n$ matrix $\boldsymbol{B}$. If the basis is orthonormal, however, $\boldsymbol{B}$ is the identity matrix $\boldsymbol{I}$, because $\langle\varphi_i,\varphi_j\rangle = \delta_{ij}$, where $\delta_{ij}$ is the Kronecker delta.

Provided that the $\varphi_i$'s are orthogonal, $\boldsymbol{B}$ is diagonal and we have

$\beta_j = \dfrac{\langle\mathbf{f},\varphi_j\rangle}{\lVert\varphi_j\rVert^2}$

and the quantity

$\mathbf{p} = \dfrac{\langle\mathbf{f},\varphi_j\rangle}{\lVert\varphi_j\rVert^2}\,\varphi_j$

is called the projection of $\mathbf{f}$ onto $\varphi_j$.

Therefore the sum

$\mathbf{f} = \sum_j \beta_j\,\varphi_j$

says that $\mathbf{f}$ is just the sum of its projections onto the orthogonal basis vectors.

Projection operation

Let us check whether $\mathbf{p}$ is actually a projection. Let

$\mathbf{a} = \mathbf{f} - \mathbf{p} = \mathbf{f} - \dfrac{\langle\mathbf{f},\varphi\rangle}{\lVert\varphi\rVert^2}\,\varphi$

Then,

$\langle\mathbf{a},\varphi\rangle = \langle\mathbf{f},\varphi\rangle - \dfrac{\langle\mathbf{f},\varphi\rangle}{\lVert\varphi\rVert^2}\,\langle\varphi,\varphi\rangle = 0$

Therefore $\mathbf{a}$ and $\varphi$ are indeed orthogonal.

Note that we can normalize the $\varphi_i$ by defining

$\tilde{\varphi}_i = \dfrac{\varphi_i}{\lVert\varphi_i\rVert}$

Then the basis $\{\tilde{\varphi}_1, \tilde{\varphi}_2, \ldots, \tilde{\varphi}_n\}$ is called an orthonormal basis.

It follows from the equation for $\beta_j$ that

$\tilde{\beta}_j = \langle\mathbf{f}, \tilde{\varphi}_j\rangle$

and

$\mathbf{f} = \sum_{j=1}^n \tilde{\beta}_j\,\tilde{\varphi}_j$

You can think of the vectors $\tilde{\varphi}_i$ as orthogonal unit vectors in an $n$-dimensional space.

Biorthogonal basis

However, using an orthogonal basis is not the only way to do things. An alternative that is useful (for instance when using wavelets) is the biorthonormal basis.

The problem in this case is the following: given any basis $\{\varphi_1, \varphi_2, \ldots, \varphi_n\}$, we want to find another set of vectors $\{\psi_1, \psi_2, \ldots, \psi_n\}$ such that

$\langle\varphi_i, \psi_j\rangle = \delta_{ij}$

In that case, if

$\mathbf{f} = \sum_{j=1}^n \beta_j\,\varphi_j$

it follows that

$\langle\mathbf{f}, \psi_k\rangle = \sum_{j=1}^n \beta_j\,\langle\varphi_j, \psi_k\rangle = \beta_k$

So the coefficients $\beta_k$ can easily be recovered. You can see a schematic of the two sets of vectors in the adjacent figure.

File:BiorthonormalBasis.png
Biorthonormal basis
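For a concrete illustration, the dual vectors can be obtained by matrix inversion: if the $\varphi_i$ are the columns of a matrix, the $\psi_j$ are the columns of the inverse transpose. The basis below is invented for the example, and numpy is assumed:

```python
import numpy as np

# A non-orthogonal basis of R^2, stored as the columns of Phi
Phi = np.array([[1.0, 1.0],
                [0.0, 1.0]])

# The biorthonormal (dual) vectors are the columns of (Phi^{-1})^T
Psi = np.linalg.inv(Phi).T

# Check <phi_i, psi_j> = delta_ij
assert np.allclose(Psi.T @ Phi, np.eye(2))

# Coefficients of f in the phi basis drop out of inner products with psi_k
f = np.array([3.0, 2.0])
beta = Psi.T @ f
assert np.allclose(Phi @ beta, f)
```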

Gram-Schmidt orthogonalization

One technique for getting an orthogonal basis is to use the process of Gram-Schmidt orthogonalization.

The goal is to produce an orthogonal set of vectors $\{\varphi_1, \varphi_2, \ldots, \varphi_n\}$ given a linearly independent set $\{\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_n\}$.

We start off by setting $\varphi_1 = \mathbf{x}_1$. Then $\varphi_2$ is given by subtracting from $\mathbf{x}_2$ its projection onto $\varphi_1$, i.e.,

$\varphi_2 = \mathbf{x}_2 - \dfrac{\langle\mathbf{x}_2, \varphi_1\rangle}{\lVert\varphi_1\rVert^2}\,\varphi_1$

Thus $\varphi_2$ is clearly orthogonal to $\varphi_1$. For $\varphi_3$ we use

$\varphi_3 = \mathbf{x}_3 - \dfrac{\langle\mathbf{x}_3, \varphi_1\rangle}{\lVert\varphi_1\rVert^2}\,\varphi_1 - \dfrac{\langle\mathbf{x}_3, \varphi_2\rangle}{\lVert\varphi_2\rVert^2}\,\varphi_2$

More generally,

$\varphi_n = \mathbf{x}_n - \sum_{j=1}^{n-1} \dfrac{\langle\mathbf{x}_n, \varphi_j\rangle}{\lVert\varphi_j\rVert^2}\,\varphi_j$

If you want an orthonormal set, you can obtain one by normalizing the orthogonal set of vectors.
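The recursion above translates almost directly into code. This is a sketch of classical Gram-Schmidt using numpy (an assumption of this note), not a numerically robust implementation:

```python
import numpy as np

def gram_schmidt(X):
    """Classical Gram-Schmidt: orthogonalize the rows of X in order."""
    basis = []
    for x in X:
        phi = x.copy()
        # subtract the projection of x onto each previous phi_j
        for q in basis:
            phi -= (np.dot(x, q) / np.dot(q, q)) * q
        basis.append(phi)
    return np.array(basis)

X = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
Phi = gram_schmidt(X)

# all pairs of distinct vectors are orthogonal:
# the Gram matrix Phi Phi^T is diagonal
G = Phi @ Phi.T
assert np.allclose(G - np.diag(np.diag(G)), 0.0)
```

In practice one normally prefers the modified Gram-Schmidt variant or a QR factorization (e.g. `np.linalg.qr`) for better numerical stability.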

We can check by induction that the vectors $\varphi_j$ are indeed orthogonal. Assume that the vectors $\varphi_j,\; j \le n-1$, are mutually orthogonal, and pick $k < n$. Then

$\langle\varphi_n, \varphi_k\rangle = \langle\mathbf{x}_n, \varphi_k\rangle - \sum_{j=1}^{n-1} \dfrac{\langle\mathbf{x}_n, \varphi_j\rangle}{\lVert\varphi_j\rVert^2}\,\langle\varphi_j, \varphi_k\rangle$

Now $\langle\varphi_j, \varphi_k\rangle = 0$ unless $j = k$, so only the $j = k$ term survives in the sum; it equals $\langle\mathbf{x}_n, \varphi_k\rangle$ and cancels the first term. Hence $\langle\varphi_n, \varphi_k\rangle = 0$, and the vectors are orthogonal.

Note that you have to be careful when numerically computing an orthogonal basis using the Gram-Schmidt technique, because rounding errors accumulate in the terms under the sum.

Linear operators

The object ๐‘จ is a linear operator from ๐’ฎ onto ๐’ฎ if

๐‘จ๐ฑ๐‘จ(๐ฑ)๐’ฎ๐ฑ๐’ฎ

A linear operator satisfies the properties

  1. $\boldsymbol{A}(\alpha\mathbf{x}) = \alpha\,\boldsymbol{A}(\mathbf{x})$.
  2. $\boldsymbol{A}(\mathbf{x}+\mathbf{y}) = \boldsymbol{A}(\mathbf{x}) + \boldsymbol{A}(\mathbf{y})$.

Note that ๐‘จ is independent of basis. However, the action of ๐‘จ on a basis {φ1,φ2,,φn} determines ๐‘จ completely since

๐‘จ๐Ÿ=๐‘จ(jβjφj)=jβj๐‘จ(φj)

Since ๐‘จφj๐’ฎ we can write

๐‘จφj=iAijφi

where Aij is the n×n matrix representing the operator ๐‘จ in the basis {φ1,φ2,,φn}.

Note the location of the indices here, which is not what we get in ordinary matrix multiplication. For example, in $\mathbb{R}^2$ we have

$\boldsymbol{A}\mathbf{e}_2 = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix} \begin{bmatrix} 0 \\ 1 \end{bmatrix} = \begin{bmatrix} A_{12} \\ A_{22} \end{bmatrix} = A_{12}\,\mathbf{e}_1 + A_{22}\,\mathbf{e}_2 = \sum_i A_{i2}\,\mathbf{e}_i$

We will get into more details in the next lecture.