Thursday, September 29, 2016

Functional Analysis - Hilbert spaces

Starting from the definitions of metric and normed spaces (here), we can now define inner product spaces, which will lead us to Hilbert spaces.

We all know the scalar product of two vectors x, y ∈ ℝⁿ:
x · y = x₁y₁ + x₂y₂ + ... + xₙyₙ
This can be considered as an inner product for a vector space. As with the norm in a normed space, an inner product space (also named pre-Hilbert space) is a pair consisting of a set of vectors and the definition of an inner product on it. The inner product on a space X is a mapping of X ✕ X into a scalar field K. We write the inner product as ⟨x, y⟩.
An inner product must follow a set of axioms (for all vectors x, y, z and scalars α):
  1. ⟨x + y, z⟩ = ⟨x, z⟩ + ⟨y, z⟩
  2. ⟨αx, y⟩ = α⟨x, y⟩
  3. ⟨x, y⟩ is the complex conjugate of ⟨y, x⟩
  4. ⟨x, x⟩ ≥ 0, and ⟨x, x⟩ = 0 ⇔ x = 0
The third axiom refers to the complex conjugate: if both x and y are real, the inner product is commutative. As for the fourth axiom, ⟨x, x⟩ equals zero only if x is the zero vector.

Remembering the definition of a complete space (from here), we can now define a Hilbert space as a complete inner product space. Hilbert spaces are Banach spaces whose norm is derived from an inner product, ||x|| = ⟨x, x⟩^(1/2), so they have an extra feature in comparison with arbitrary Banach spaces, which makes them even more special.

Let's now see some properties of an inner product (a quick numerical check follows the list):
  • Parallelogram equality: starting from the norm and using the axioms of the inner product, we easily reach the parallelogram equality: ||x + y||² + ||x - y||² = 2(||x||² + ||y||²)
  • Orthogonality: two elements x, y of a space X are said to be orthogonal if ⟨x, y⟩ = 0, also written as x ⊥ y
  • Polarization identity: this lets us recover the inner product from the norm: Re⟨x, y⟩ = 1/4(||x + y||² - ||x - y||²) and Im⟨x, y⟩ = 1/4(||x + iy||² - ||x - iy||²). These formulas correspond to a complex inner product; for a real inner product only the first equation remains.
  • Schwarz inequality: in analogy with the triangle inequality for the norm, the inner product satisfies the Schwarz inequality: |⟨x, y⟩| ≤ ||x|| ||y||
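As a quick sanity check (not a proof), here is a small NumPy sketch that verifies the parallelogram equality, the polarization identity and the Schwarz inequality for random complex vectors. The inner product is taken linear in the first argument and conjugate-linear in the second, matching the convention used in this post.

    import numpy as np

    rng = np.random.default_rng(0)

    # Inner product <x, y> = sum_j x_j * conj(y_j): linear in the first
    # argument, conjugate-linear in the second.
    def inner(x, y):
        return np.vdot(y, x)          # np.vdot conjugates its first argument

    def norm(x):
        return np.sqrt(inner(x, x).real)

    x = rng.normal(size=5) + 1j * rng.normal(size=5)
    y = rng.normal(size=5) + 1j * rng.normal(size=5)

    # Parallelogram equality
    print(np.isclose(norm(x + y)**2 + norm(x - y)**2,
                     2 * (norm(x)**2 + norm(y)**2)))             # True

    # Polarization identity: recover <x, y> from the norm alone
    re = 0.25 * (norm(x + y)**2 - norm(x - y)**2)
    im = 0.25 * (norm(x + 1j*y)**2 - norm(x - 1j*y)**2)
    print(np.isclose(re + 1j*im, inner(x, y)))                   # True

    # Schwarz inequality
    print(abs(inner(x, y)) <= norm(x) * norm(y))                 # True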
The projection, or orthogonal projection, is another concept commonly used in Hilbert spaces, but first we have to explain some other related concepts. 
Let X be a space with an element x and a subset M ⊂ X. The distance from x to the subset M is δ = inf ||x - y||, where y ranges over M. This sounds quite familiar, as it generalizes the distance from a point to a set in a Euclidean space. But let me add a detail to the previous definition to guarantee the uniqueness of y: the subset M must be complete and convex for the minimizing y to exist and be unique. This result can easily be demonstrated with the parallelogram equality.
Given that minimizing y ∈ M, and taking M to be a complete subspace, the vector z = x - y is orthogonal to M. Rewriting the sum as x = y + z, we can now see that the vector y is the orthogonal projection of x on M.
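A minimal sketch of this decomposition in a finite-dimensional setting, assuming M is the column space of an illustrative matrix A and using a least-squares solve to find the minimizing y:

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.normal(size=(5, 2))            # the columns of A span the subspace M
    x = rng.normal(size=5)

    # y in M minimizing ||x - y||: solve the least-squares problem A c ~ x
    c, *_ = np.linalg.lstsq(A, x, rcond=None)
    y = A @ c                              # orthogonal projection of x on M
    z = x - y                              # the component orthogonal to M

    print(np.allclose(A.T @ z, 0))         # z is orthogonal to everything in M
    print(np.allclose(x, y + z))           # x decomposes as x = y + z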

Monday, September 26, 2016

Functional Analysis - Metric and Normed spaces

A metric space is a set for which the distance between all members of the set is defined. What does that mean? Let's take a set of values, say the natural numbers, and let's take as distance between elements d(x, y) = |x - y|; well, this is a metric space. Another metric space would be the same distance on another set of elements, or another distance on the same set of elements.
What does that mean, exactly? Most people think of a Euclidean space when talking about distances, but in a non-Euclidean space the distance between two points is computed differently. For example, on a sphere the (minimal) distance between two points is not obtained from the difference of their rectangular coordinates, and the same occurs in a hyperbolic plane.
The important point in metric spaces is the definition of the metric, which must follow some rules (checked numerically in the sketch after the list):
  1. d(x, y) ≥ 0: distances are always non-negative.
  2. d(x, y) = 0 ⇔ x = y: if a distance is zero, it means that both elements are equal.
  3. d(x, y) = d(y, x): the distance is the same no matter from which point we start.
  4. d(x, z) ≤ d(x, y) + d(y, z): this is the triangle inequality.
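As a small numerical sketch (not a proof), these rules can be checked for the great-circle distance on the unit sphere, one of the non-Euclidean examples mentioned above:

    import numpy as np

    rng = np.random.default_rng(0)

    def random_unit_vectors(n):
        v = rng.normal(size=(n, 3))
        return v / np.linalg.norm(v, axis=1, keepdims=True)

    def d(u, v):
        # great-circle (geodesic) distance between two points of the unit sphere
        return np.arccos(np.clip(np.dot(u, v), -1.0, 1.0))

    pts = random_unit_vectors(200)
    for _ in range(1000):
        x, y, z = pts[rng.integers(0, 200, size=3)]
        assert d(x, y) >= 0                                  # rule 1
        assert np.isclose(d(x, x), 0)                        # rule 2 (one direction)
        assert np.isclose(d(x, y), d(y, x))                  # rule 3
        assert d(x, z) <= d(x, y) + d(y, z) + 1e-9           # rule 4 (float tolerance)
    print("the four rules hold on all sampled points")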
Let's now see the normed spaces. A norm applies to a vector space instead of a plain set of elements as in a metric space. So a normed space consists of a vector space plus the definition of a norm. What's a norm? It is the definition of the length or size of a vector. The norm also follows a set of rules similar to those of the metric:
  1. ||x|| ≥ 0: the norm is always non-negative.
  2. ||x|| = 0 ⇔ x = 0: if the norm is zero, then the vector is the zero vector.
  3. ||ax|| = |a| ||x||: let a be a scalar; the norm of a vector multiplied by a scalar equals the norm of the vector multiplied by the modulus of that scalar.
  4. ||x + y|| ≤ ||x|| + ||y||: this is the triangle inequality.
A normed space also defines a metric as d(x, y) = ||x - y||, which is called the metric induced by the norm.
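A tiny sketch of the induced metric using NumPy's Euclidean norm (any norm would do):

    import numpy as np

    rng = np.random.default_rng(2)

    def d(x, y):
        # metric induced by the norm: d(x, y) = ||x - y||
        return np.linalg.norm(x - y)

    x, y, z = rng.normal(size=(3, 4))
    print(np.isclose(d(x, y), d(y, x)))      # symmetry
    print(d(x, z) <= d(x, y) + d(y, z))      # triangle inequality
    print(np.isclose(d(x, x), 0))            # d(x, x) = 0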

Before continuing, let's try to see some examples of both spaces.
  • Real line ℝ: a common metric is d(x, y) = |x - y|; if we regard ℝ as a one-dimensional vector space, ||x|| = |x| also gives a norm
  • Euclidean plane ℝ²: considering each element as a vector of two dimensions, we can define a metric as d(x, y) = ((x₁ - y₁)² + (x₂ - y₂)²)^(1/2). Here we can also define a norm, as the elements of the set are vectors. The norm in this case will be ||x|| = (x₁² + x₂²)^(1/2)
  • Sequence space l∞: consider a set of elements where each element is a bounded sequence of values, x = (x₁, x₂, x₃, ...) or x = (xⱼ), where |xⱼ| ≤ cₓ (cₓ being a real number that depends on x). A common metric for a space as described here is d(x, y) = sup|xⱼ - yⱼ|, where sup means the supremum (least upper bound), and a norm is ||x|| = sup|xⱼ|. This example shows how surprisingly general the concepts of metric and norm are (a small sketch follows).
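As an illustration of the sequence space (using finite truncations, since a computer cannot hold infinitely many terms), the sup metric and sup norm might look like this for two bounded sequences chosen here just as examples:

    import numpy as np

    # finite truncations of two bounded sequences, x_j = 1/j and y_j = (-1)^j / j
    j = np.arange(1, 1001)
    x = 1.0 / j
    y = (-1.0) ** j / j

    print(np.max(np.abs(x)))          # ||x|| = sup_j |x_j| -> 1.0
    print(np.max(np.abs(x - y)))      # d(x, y) = sup_j |x_j - y_j| -> 2.0 (at j = 1)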

Let's now see some concepts which play an important role in metric spaces.

  • Boundary: consider a bounded set of values; the set of all the values which lie on the border of the set is its boundary. For a ball B(x₀, r) with center at x₀ and radius r, the boundary is the set of points x with d(x, x₀) = r.
  • Open set: an open set is one whose boundary does not belong to the set.
  • Closed set: a closed set is one whose boundary also belongs to the set.
  • Convergence: a sequence is said to converge if its values, as we progress along the sequence, approach a definite value. Given a sequence (xₙ), we say it converges to x if lim d(xₙ, x) = 0 as n → ∞
  • Cauchy Sequence: a sequence (xₙ) is a Cauchy sequence if for every ε > 0 there is an N such that d(xₙ, xₘ) < ε for all n, m > N. What does that mean? From some index N on, all the elements of the sequence are within a distance ε of each other, and this works for every ε greater than zero, no matter how small; the terms do not keep drifting apart as the indexes grow.
  • Completeness: a metric space is said to be complete if every Cauchy sequence of the space converges (see the illustration after this list).
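A small illustration, not a proof: the decimal truncations of √2 form a Cauchy sequence of rational numbers. Seen inside ℝ it converges to √2, but √2 is not rational, which is the usual way to see that the rationals with d(x, y) = |x - y| are not a complete space, while ℝ is.

    from math import isqrt
    from fractions import Fraction

    # x_n = sqrt(2) truncated to n decimal digits: a rational number for every n
    def x(n):
        return Fraction(isqrt(2 * 10 ** (2 * n)), 10 ** n)

    # Cauchy property: d(x_n, x_m) = |x_n - x_m| becomes arbitrarily small
    print(float(abs(x(10) - x(20))))    # roughly 1e-10
    print(float(abs(x(20) - x(40))))    # roughly 1e-21

    # In R the sequence converges to sqrt(2), which is irrational, so the limit
    # escapes the rationals even though every term x_n is rational.
    print(float(x(30)))                 # 1.4142135623730951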
Now that we have all those concepts, we can define a Banach space. A Banach space is a complete normed space. The concept of a complete space can also be applied to a normed space, as from a norm we can obtain a metric (the metric induced by the norm). Banach spaces are widely used in functional analysis, as the properties they involve are of particular interest.