Every finite-dimensional inner product space has an orthonormal basis, which can be obtained from an arbitrary basis using the Gram-Schmidt process. An orthonormal basis for an inner product space V is a basis whose vectors are orthonormal, meaning they are unit vectors that are orthogonal to each other. The presence of an orthonormal basis reduces the study of a finite-dimensional inner product space to the study of Euclidean space under the dot product.
An orthonormal basis for an inner product space V with finite dimension is a basis for V whose vectors are orthonormal, that is, they are all unit vectors and orthogonal to each other.[1][2][3] For example, the standard basis for a Euclidean space ℝⁿ is an orthonormal basis, where the relevant inner product is the dot product of vectors. The image of the standard basis under a rotation or reflection (or any orthogonal transformation) is also orthonormal, and every orthonormal basis for ℝⁿ arises in this fashion.

For a general inner product space V, an orthonormal basis can be used to define normalized orthogonal coordinates on V. Under these coordinates, the inner product becomes a dot product of vectors. Thus the presence of an orthonormal basis reduces the study of a finite-dimensional inner product space to the study of ℝⁿ under the dot product. Every finite-dimensional inner product space has an orthonormal basis, which may be obtained from an arbitrary basis using the Gram–Schmidt process.
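The Gram–Schmidt process just mentioned can be sketched numerically. The following is a minimal NumPy illustration (the function name gram_schmidt and the sample vectors are chosen for this example, not taken from the article):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors
    (Gram-Schmidt with normalization at each step)."""
    basis = []
    for v in vectors:
        w = v.astype(float).copy()
        # Subtract the projection of w onto each basis vector found so far
        for b in basis:
            w -= np.dot(w, b) * b
        # Normalize the remaining orthogonal component
        basis.append(w / np.linalg.norm(w))
    return basis

# Orthonormalize an arbitrary basis of R^3
vecs = [np.array([1.0, 1.0, 0.0]),
        np.array([1.0, 0.0, 1.0]),
        np.array([0.0, 1.0, 1.0])]
onb = gram_schmidt(vecs)

# The result consists of unit vectors that are pairwise orthogonal
for i, u in enumerate(onb):
    for j, v in enumerate(onb):
        assert abs(np.dot(u, v) - (1.0 if i == j else 0.0)) < 1e-12
```

Because each new vector has its components along the previously accepted vectors removed before normalization, the output spans the same subspace as the input at every step.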
In functional analysis, the concept of an orthonormal basis can be generalized to arbitrary (infinite-dimensional) inner product spaces.[4] Given a pre-Hilbert space H, an orthonormal basis for H is an orthonormal set of vectors with the property that every vector in H can be written as an infinite linear combination of the vectors in the basis. In this case, the orthonormal basis is sometimes called a Hilbert basis for H. Note that an orthonormal basis in this sense is not generally a Hamel basis, since infinite linear combinations are required.[5] Specifically, the linear span of the basis must be dense in H, but it may not be the entire space.
If we go on to Hilbert spaces, a non-orthonormal set of vectors having the same linear span as an orthonormal basis may not be a basis at all. For instance, any square-integrable function on the interval [−1, 1] can be expressed (almost everywhere) as an infinite sum of Legendre polynomials (an orthonormal basis), but not necessarily as an infinite sum of the monomials xⁿ.

A different generalisation is to pseudo-inner product spaces, finite-dimensional vector spaces equipped with a non-degenerate symmetric bilinear form known as the metric tensor. In such a basis, the metric takes the form diag(+1, …, +1, −1, …, −1) with p positive ones and q negative ones.
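The Legendre example above can be checked numerically. The sketch below, using NumPy's Legendre utilities and Gauss–Legendre quadrature, verifies that the normalized Legendre polynomials √((2n+1)/2)·Pₙ are orthonormal on [−1, 1] (the helper name P is illustrative):

```python
import numpy as np
from numpy.polynomial.legendre import legval, leggauss

# Gauss-Legendre quadrature with 10 nodes integrates polynomials
# of degree <= 19 exactly on [-1, 1]
nodes, weights = leggauss(10)

def P(n, x):
    # Normalized Legendre polynomial: sqrt((2n+1)/2) * P_n(x),
    # since the integral of P_n^2 over [-1, 1] equals 2/(2n+1)
    c = np.zeros(n + 1)
    c[n] = 1.0
    return np.sqrt((2 * n + 1) / 2) * legval(x, c)

# Pairwise inner products are 1 on the diagonal and 0 off it
for m in range(5):
    for n in range(5):
        val = np.sum(weights * P(m, nodes) * P(n, nodes))
        assert abs(val - (1.0 if m == n else 0.0)) < 1e-12
```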
Examples
For ℝ³, the set of vectors

e₁ = (1, 0, 0), e₂ = (0, 1, 0), e₃ = (0, 0, 1)

is called the standard basis and forms an orthonormal basis of ℝ³ with respect to the standard dot product. Note that both the standard basis and the standard dot product rely on viewing ℝ³ as the Cartesian product ℝ × ℝ × ℝ.

Proof: A straightforward computation shows that the inner products of these vectors equal zero, ⟨e₁, e₂⟩ = ⟨e₁, e₃⟩ = ⟨e₂, e₃⟩ = 0, and that each of their magnitudes equals one, ‖e₁‖ = ‖e₂‖ = ‖e₃‖ = 1. This means that {e₁, e₂, e₃} is an orthonormal set. Every vector (x, y, z) ∈ ℝ³ can be expressed as a sum of the basis vectors scaled, (x, y, z) = x e₁ + y e₂ + z e₃, so {e₁, e₂, e₃} spans ℝ³ and hence must be a basis. It may also be shown that the standard basis rotated about an axis through the origin, or reflected in a plane through the origin, also forms an orthonormal basis of ℝ³.
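The claims in this proof are easy to reproduce numerically. A small NumPy check, purely illustrative, verifying orthonormality of the standard basis and of a rotated copy of it:

```python
import numpy as np

# Standard basis of R^3: the rows of the identity matrix
e = np.eye(3)

# Pairwise inner products are 0; each magnitude is 1
for i in range(3):
    for j in range(3):
        assert np.dot(e[i], e[j]) == (1.0 if i == j else 0.0)

# A rotation about the z-axis through the origin
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

# The rotated basis vectors are again orthonormal
rotated = [R @ e[i] for i in range(3)]
for i in range(3):
    for j in range(3):
        assert np.isclose(np.dot(rotated[i], rotated[j]),
                          1.0 if i == j else 0.0)
```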
For ℝⁿ, the standard basis and inner product are similarly defined. Any other orthonormal basis is related to the standard basis by an orthogonal transformation in the group O(n).

For pseudo-Euclidean space ℝ^(p,q), an orthogonal basis {e_μ} with metric η instead satisfies η(e_μ, e_μ) = +1 if μ ≤ p, η(e_μ, e_μ) = −1 if μ > p, and η(e_μ, e_ν) = 0 if μ ≠ ν. Any two orthonormal bases are related by a pseudo-orthogonal transformation. In the case of Minkowski space, these are Lorentz transformations.

The set {f_n : n ∈ ℤ} with f_n(x) = exp(2πinx), where exp denotes the exponential function, forms an orthonormal basis of the space of functions on [0, 1] with finite Lebesgue integrals, with respect to the 2-norm. This is fundamental to the study of Fourier series.

The set {e_b : b ∈ B} with e_b(c) = 1 if b = c and e_b(c) = 0 otherwise forms an orthonormal basis of ℓ²(B).

Eigenfunctions of a Sturm–Liouville eigenproblem also form an orthonormal basis.

The column vectors of an orthogonal matrix form an orthonormal set.
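The orthonormality of the Fourier basis f_n(x) = exp(2πinx) can be checked by approximating the L²([0, 1]) inner product with a Riemann sum. A minimal NumPy sketch (grid size and names are illustrative):

```python
import numpy as np

N = 4096
x = (np.arange(N) + 0.5) / N   # midpoints of a uniform grid on [0, 1]
dx = 1.0 / N

def f(n):
    # Fourier basis function f_n(x) = exp(2*pi*i*n*x), sampled on the grid
    return np.exp(2j * np.pi * n * x)

def inner(g, h):
    # L^2([0,1]) inner product <g, h> = integral of g * conj(h),
    # approximated by a midpoint Riemann sum
    return np.sum(g * np.conj(h)) * dx

# <f_m, f_n> is 1 when m = n and 0 otherwise
for m in range(-2, 3):
    for n in range(-2, 3):
        val = inner(f(m), f(n))
        assert abs(val - (1.0 if m == n else 0.0)) < 1e-9
```

For these particular integrands the midpoint sum over a full period is essentially exact, since the sampled complex exponentials sum to zero whenever m ≠ n.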
Basic formula
If B is an orthogonal basis of a space H, then every element x ∈ H may be written as

x = Σ_{b∈B} (⟨x, b⟩ / ‖b‖²) b.

When B is orthonormal, this simplifies to

x = Σ_{b∈B} ⟨x, b⟩ b,

and the square of the norm of x can be given by

‖x‖² = Σ_{b∈B} |⟨x, b⟩|².

Even if B is uncountable, only countably many terms in this sum will be non-zero, and the expression is therefore well-defined. This sum is also called the Fourier expansion of x, and the formula is usually known as Parseval's identity.
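In finite dimensions both the Fourier expansion and Parseval's identity can be verified directly. A NumPy sketch using a random orthonormal basis obtained via QR factorization (the names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# An orthonormal basis of R^4: the columns of an orthogonal matrix Q
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)))
basis = [Q[:, k] for k in range(4)]

x = rng.normal(size=4)

# Fourier expansion: x = sum over b of <x, b> b
coeffs = [np.dot(x, b) for b in basis]
reconstruction = sum(c * b for c, b in zip(coeffs, basis))
assert np.allclose(reconstruction, x)

# Parseval's identity: ||x||^2 = sum over b of |<x, b>|^2
assert np.isclose(np.dot(x, x), sum(c**2 for c in coeffs))
```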
If B is an orthonormal basis of H, then H is isomorphic to ℓ²(B) in the following sense: there exists a bijective linear map Φ : H → ℓ²(B) such that

⟨Φ(x), Φ(y)⟩ = ⟨x, y⟩ for all x, y ∈ H.
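In finite dimensions, where ℓ²(B) is just ℝⁿ, this map Φ is the coordinate map x ↦ (⟨x, b⟩)_{b∈B}. A small numerical sketch (the helper phi is illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
# Columns of Q form an orthonormal basis B of R^3
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))

def phi(x):
    # Coordinate map Phi: x -> (<x, b>)_{b in B}, i.e. Q^T x
    return Q.T @ x

x, y = rng.normal(size=3), rng.normal(size=3)

# Phi is linear and bijective (Q is invertible) and preserves inner products
assert np.isclose(np.dot(phi(x), phi(y)), np.dot(x, y))
```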
Incomplete orthogonal sets
Given a Hilbert space H and a set S of mutually orthogonal vectors in H, we can take the smallest closed linear subspace V of H containing S. Then S will be an orthogonal basis of V, which may of course be smaller than H itself, S being an incomplete orthogonal set, or be H, when it is a complete orthogonal set.
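The orthogonal-basis expansion from the previous section then gives the orthogonal projection of any vector onto the subspace spanned by an incomplete set. A small NumPy sketch with a hypothetical two-vector orthogonal set in ℝ³:

```python
import numpy as np

# Two mutually orthogonal (not normalized) vectors: an incomplete set in R^3
s1 = np.array([1.0,  1.0, 0.0])
s2 = np.array([1.0, -1.0, 0.0])
assert np.dot(s1, s2) == 0.0

def project(x, S):
    # Orthogonal projection of x onto the span V of S, via the
    # orthogonal-basis expansion: sum over s of (<x, s> / ||s||^2) s
    return sum(np.dot(x, s) / np.dot(s, s) * s for s in S)

x = np.array([2.0, 3.0, 5.0])
p = project(x, [s1, s2])

# p lies in V = the xy-plane, and the residual x - p is orthogonal to V
assert np.allclose(p, [2.0, 3.0, 0.0])
assert np.dot(x - p, s1) == 0.0 and np.dot(x - p, s2) == 0.0
```

Here V is strictly smaller than ℝ³, so {s1, s2} is an incomplete orthogonal set: it is an orthogonal basis of V but not of the whole space.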