Applied linear operators and spectral methods/Lecture 1

From testwiki
Revision as of 19:04, 24 February 2025 by 12.156.20.2 (talk) (Corrected grammar mistake, added short proof of uniqueness of vectors as linear combinations of basis elements)

Linear operators can be thought of as infinite-dimensional matrices. Hence we can use well-known results from matrix theory when dealing with linear operators. However, we have to be careful. A finite-dimensional matrix has an inverse if none of its eigenvalues is zero. For an infinite-dimensional matrix, even though all the eigenvalues may be nonzero, we might have a sequence of eigenvalues that tends to zero, in which case no bounded inverse exists. There are several other subtleties that we will discuss in the course of this series of lectures.
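For instance, consider the diagonal operator with eigenvalues 1, 1/2, 1/3, …. A quick numerical sketch (in Python with NumPy, using finite truncations as a stand-in for the infinite matrix; the sizes chosen are arbitrary) shows that every truncation is invertible, yet the norm of the inverse grows without bound:

```python
import numpy as np

# Finite n x n truncations of the diagonal operator A = diag(1, 1/2, 1/3, ...).
# No eigenvalue is zero, so every truncation is invertible, but the eigenvalues
# tend to zero, so the norm of the inverse grows without bound.
for n in (5, 50, 500):
    A = np.diag(1.0 / np.arange(1, n + 1))
    inv_norm = np.linalg.norm(np.linalg.inv(A), 2)  # largest singular value of A^-1
    print(n, inv_norm)  # inv_norm equals n here, so it diverges as n grows
```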

Let us start off with the basics, i.e., linear vector spaces.

Linear Vector Spaces (𝒮)

Let ๐’ฎ be a linear vector space.

Addition and scalar multiplication

Let us first define addition and scalar multiplication in this space. Addition acts entirely within 𝒮, while scalar multiplication may be by a real number (α ∈ ℝ) or by a complex number (α ∈ ℂ). These operations must have the following closure properties:

  1. If ๐ฑ,๐ฒ๐’ฎ then ๐ฑ+๐ฒ๐’ฎ.
  2. If αโ„ (or โ„‚) and ๐ฑ๐’ฎ then α๐ฑ๐’ฎ.

And the following laws must hold for addition:

  1. 𝐱 + 𝐲 = 𝐲 + 𝐱 Commutative law.
  2. 𝐱 + (𝐲 + 𝐳) = (𝐱 + 𝐲) + 𝐳 Associative law.
  3. ∃ 𝟎 ∈ 𝒮 such that 𝟎 + 𝐱 = 𝐱 ∀ 𝐱 ∈ 𝒮 Additive identity.
  4. ∀ 𝐱 ∈ 𝒮 ∃ (−𝐱) ∈ 𝒮 such that 𝐱 + (−𝐱) = 𝟎 Additive inverse.

For scalar multiplication we have the properties:

  1. α(β𝐱) = (αβ)𝐱.
  2. (α + β)𝐱 = α𝐱 + β𝐱.
  3. α(𝐱 + 𝐲) = α𝐱 + α𝐲.
  4. 1𝐱 = 𝐱.
  5. 0𝐱 = 𝟎.

Example 1: n-tuples

The n-tuples (x₁, x₂, …, xₙ) with

(x₁, x₂, …, xₙ) + (y₁, y₂, …, yₙ) = (x₁ + y₁, x₂ + y₂, …, xₙ + yₙ)
α(x₁, x₂, …, xₙ) = (αx₁, αx₂, …, αxₙ)

form a linear vector space.
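A quick check of these operations in Python (NumPy arrays standing in for n-tuples; the particular values are arbitrary):

```python
import numpy as np

# n-tuples with componentwise addition and scalar multiplication (n = 3 here;
# the values are arbitrary).
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])
alpha = 2.0

print(x + y)      # componentwise sum: [5. 7. 9.]
print(alpha * x)  # componentwise scaling: [2. 4. 6.]

# Two of the vector-space laws, checked numerically:
assert np.allclose(x + y, y + x)                            # commutativity
assert np.allclose(alpha * (x + y), alpha * x + alpha * y)  # distributivity
```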

Example 2: Matrices

Another example of a linear vector space is the set of 2×2 matrices (or, more generally, n×m matrices) with addition as usual and entrywise scalar multiplication:

\alpha \begin{bmatrix} x_{11} & x_{12} \\ x_{21} & x_{22} \end{bmatrix} = \begin{bmatrix} \alpha x_{11} & \alpha x_{12} \\ \alpha x_{21} & \alpha x_{22} \end{bmatrix}

Example 3: Polynomials

The space of polynomials of degree at most n forms a linear vector space.

p_n(x) = \sum_{j=0}^{n} \alpha_j x^j

Example 4: Continuous functions

The space of continuous functions, say in [0,1], also forms a linear vector space with addition and scalar multiplication defined as usual.

Linear Dependence

A set of vectors 𝐱₁, 𝐱₂, …, 𝐱ₙ ∈ 𝒮 is said to be linearly dependent if there exist scalars α₁, α₂, …, αₙ, not all zero, such that

α₁𝐱₁ + α₂𝐱₂ + ⋯ + αₙ𝐱ₙ = 𝟎

If no such set of constants α₁, α₂, …, αₙ exists, then the vectors are said to be linearly independent.

Example

Consider the matrices

M_1 = \begin{bmatrix} 1 & 0 \\ 0 & 2 \end{bmatrix}, \quad M_2 = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \quad M_3 = \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}

These are linearly dependent since 𝑴₁ − 𝑴₂ − 2𝑴₃ = 𝟎.
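We can verify this dependence numerically; flattening each matrix to a vector turns the question into an ordinary rank computation (a standard trick, sketched here in Python with NumPy):

```python
import numpy as np

# The matrices from the example above.
M1 = np.array([[1.0, 0.0], [0.0, 2.0]])
M2 = np.array([[1.0, 0.0], [0.0, 0.0]])
M3 = np.array([[0.0, 0.0], [0.0, 1.0]])

# The dependence relation M1 - M2 - 2*M3 = 0:
assert np.allclose(M1 - M2 - 2 * M3, 0.0)

# Equivalently, flatten each matrix to a vector; rank < 3 means the set is
# linearly dependent.
stacked = np.vstack([M1.ravel(), M2.ravel(), M3.ravel()])
print(np.linalg.matrix_rank(stacked))  # 2
```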

Span

The span of a set of vectors 𝑻 = {𝐱₁, 𝐱₂, …, 𝐱ₙ} is the set of all vectors that are linear combinations of the 𝐱ᵢ. Thus

span(𝑻) = {α₁𝐱₁ + α₂𝐱₂ + ⋯ + αₙ𝐱ₙ}

as the scalars α₁, α₂, …, αₙ vary.

Spanning set

If span(𝑻) = 𝒮 then 𝑻 is said to be a spanning set.

Basis

If ๐‘ป is a spanning set and its elements are linearly independent then we call it a basis for ๐’ฎ. A vector in ๐’ฎ has a unique representation as a linear combination of the basis elements.

This is because adding two vectors is the same as adding the coefficients of the basis elements, and if any vector had two distinct representations, the basis would no longer be linearly independent, as vโ†’=a1eโ†’1+a2eโ†’2+=b1eโ†’1+b2eโ†’2+implies thatvโ†’vโ†’=(a1b1)eโ†’1+(a2b2)eโ†’2+=0โ†’which is only permissible (by linear independence) if the aibi are all zero.
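Concretely, expanding a vector in a basis of ℝ² amounts to solving a linear system whose columns are the basis vectors; since that matrix is invertible, the coefficients are unique. A short sketch (the basis vectors here are arbitrary illustrative choices):

```python
import numpy as np

# A (non-standard) basis of R^2, chosen only for illustration.
e1 = np.array([1.0, 1.0])
e2 = np.array([1.0, -1.0])
B = np.column_stack([e1, e2])  # basis vectors as columns

# Expanding v in this basis means solving B a = v; B is invertible, so the
# coefficient vector a is unique.
v = np.array([3.0, 1.0])
a = np.linalg.solve(B, v)
print(a)  # [2. 1.], i.e. v = 2*e1 + 1*e2
assert np.allclose(a[0] * e1 + a[1] * e2, v)
```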

Dimension

The dimension of a space 𝒮 is the number of elements in a basis. This number is independent of the actual elements that form the basis and is a property of 𝒮 itself.

Example 1: Vectors in ℝ²

Any two non-collinear vectors in ℝ² form a basis for ℝ², because any other vector in ℝ² can be expressed as a linear combination of the two.

Example 2: Matrices

A basis for the linear space of 2×2 matrices is

\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \quad \begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix}, \quad \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}, \quad \begin{bmatrix} 1 & 3 \\ 1 & 1 \end{bmatrix}

Note that there is a lot of nonuniqueness in the choice of bases. One important skill that you should develop is to choose the right basis to solve a particular problem.
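As with linear dependence, we can check that four given 2×2 matrices form a basis by flattening them and computing a rank (a numerical sketch in Python with NumPy):

```python
import numpy as np

# The four matrices from the example, flattened to length-4 vectors.  They
# form a basis iff the 4 x 4 matrix of flattened entries has full rank.
basis = [
    np.array([[1.0, 0.0], [0.0, 0.0]]),
    np.array([[1.0, 1.0], [0.0, 0.0]]),
    np.array([[1.0, 1.0], [0.0, 1.0]]),
    np.array([[1.0, 3.0], [1.0, 1.0]]),
]
stacked = np.vstack([M.ravel() for M in basis])
print(np.linalg.matrix_rank(stacked))  # 4: full rank, so this is a basis
```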

Example 3: Polynomials

The set {1, x, x², …, xⁿ} is a basis for polynomials of degree at most n.

Example 4: The natural basis

A natural basis is the set {𝐞₁, 𝐞₂, …, 𝐞ₙ}, where the j-th entry of 𝐞ₖ is

\delta_{jk} = \begin{cases} 1 & \text{for } j = k \\ 0 & \text{for } j \neq k \end{cases}

The quantity δⱼₖ is also called the Kronecker delta.
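In ℝⁿ the natural basis vectors are simply the columns of the identity matrix, which makes the Kronecker delta definition concrete (a short NumPy sketch; n = 4 is arbitrary):

```python
import numpy as np

# The natural basis vectors e_k are the columns of the identity matrix; the
# j-th entry of e_k is exactly the Kronecker delta.
n = 4
I = np.eye(n)
e2 = I[:, 1]  # e_2 (0-based column index 1)
print(e2)     # [0. 1. 0. 0.]

# In the natural basis, a vector's coefficients are just its entries:
x = np.array([3.0, -1.0, 2.0, 5.0])
assert np.allclose(sum(x[k] * I[:, k] for k in range(n)), x)
```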

Inner Product Spaces

To give more structure to the idea of a vector space we need concepts such as magnitude and angle. The inner product provides that structure.

The inner product generalizes the concept of an angle and is defined as a function

⟨·, ·⟩ : 𝒮 × 𝒮 → ℝ (or ℂ for a complex vector space)

with the properties

  1. ๐ฑ,๐ฒ=๐ฒ,๐ฑ overbar indicates complex conjugation.
  2. α๐ฑ,๐ฒ=α๐ฑ,๐ฒ Linear with respect to scalar multiplication.
  3. ๐ฑ+๐ฒ,๐ณ=๐ฑ,๐ณ+๐ฒ,๐ณ Linearity with respect to addition.
  4. ๐ฑ,๐ฑ>๐ŸŽ if ๐ฑ0 and ๐ฑ,๐ฑ=๐ŸŽ if and only if ๐ฑ=๐ŸŽ.

A vector space with an inner product is called an inner product space.

Example 1: Conjugate linearity in the second slot

\langle \mathbf{x}, \beta \mathbf{y} \rangle = \overline{\langle \beta \mathbf{y}, \mathbf{x} \rangle} = \overline{\beta} \, \overline{\langle \mathbf{y}, \mathbf{x} \rangle} = \overline{\beta} \, \langle \mathbf{x}, \mathbf{y} \rangle

Example 2: Discrete vectors

In ℝⁿ with 𝐱 = {x₁, x₂, …, xₙ} and 𝐲 = {y₁, y₂, …, yₙ} the Euclidean inner product is given by

\langle \mathbf{x}, \mathbf{y} \rangle = \sum_{k=1}^{n} x_k y_k

With 𝐱, 𝐲 ∈ ℂⁿ the standard inner product is

\langle \mathbf{x}, \mathbf{y} \rangle = \sum_{k=1}^{n} x_k \overline{y_k}

Example 3: Continuous functions

For two complex-valued continuous functions f(x) and g(x) on [0,1] we could approximately represent them by their function values at equally spaced points.

Approximate f(x) and g(x) by

F = \{f(x_1), f(x_2), \ldots, f(x_n)\}, \quad G = \{g(x_1), g(x_2), \ldots, g(x_n)\}, \quad \text{with } x_k = k/n

With that approximation, a natural inner product is

\langle F, G \rangle = \frac{1}{n} \sum_{k=1}^{n} f(x_k) \overline{g(x_k)}

Taking the limit as n → ∞ (show this) gives

\langle f, g \rangle = \int_0^1 f(x) \overline{g(x)} \, dx
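The limit above ("show this") can at least be checked numerically. A sketch with the arbitrary real choices f(x) = x and g(x) = x², for which the exact integral is ∫₀¹ x³ dx = 1/4:

```python
import numpy as np

# Check numerically that (1/n) * sum_k f(x_k) conj(g(x_k)), with x_k = k/n,
# tends to the integral of f(x) conj(g(x)) over [0, 1].  Illustrative choice:
# f(x) = x, g(x) = x**2 (real), so the exact limit is 1/4.
f = lambda x: x
g = lambda x: x**2

for n in (10, 100, 10000):
    xk = np.arange(1, n + 1) / n
    approx = np.sum(f(xk) * np.conj(g(xk))) / n
    print(n, approx)  # approaches 0.25
```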

If we took non-equally spaced yet smoothly distributed points we would get

\langle f, g \rangle = \int_0^1 f(x) \overline{g(x)} \, w(x) \, dx

where w(x) > 0 is a smooth weighting function (show this).

There are many other inner products possible. For functions that are not only continuous but also differentiable, a useful inner product is

\langle f, g \rangle = \int_0^1 \left[ f(x) \overline{g(x)} + f'(x) \overline{g'(x)} \right] dx
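A quick numerical sketch of this derivative-weighted inner product, using the arbitrary real pair f(x) = x and g(x) = x² (so conjugation is trivial); the exact value is ∫₀¹ (x³ + 2x) dx = 5/4:

```python
import numpy as np

# Trapezoid-rule approximation of <f, g> = int_0^1 [ f g + f' g' ] dx for the
# illustrative real pair f(x) = x, g(x) = x**2.
# Exact value: int_0^1 (x**3 + 2x) dx = 1/4 + 1 = 1.25.
x = np.linspace(0.0, 1.0, 100001)
f, fp = x, np.ones_like(x)  # f and its derivative f' = 1
g, gp = x**2, 2 * x         # g and its derivative g' = 2x

integrand = f * g + fp * gp
h = x[1] - x[0]
val = h * (0.5 * integrand[0] + np.sum(integrand[1:-1]) + 0.5 * integrand[-1])
print(val)  # close to 1.25
```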

We will continue further explorations into linear vector spaces in the next lecture.