Inner Products
Subtopic: Vector Spaces
An inner product generalizes the dot product to abstract vector spaces, providing notions of length, angle, and orthogonality. Any inner product induces a norm and allows geometric reasoning—even in spaces of functions or matrices. Inner product spaces are the setting for Fourier analysis, quantum mechanics, and least squares.
Introduction
The dot product on ℝⁿ gives us geometry: length via ‖v‖ = √(v · v) and angle via cos θ = (u · v)/(‖u‖‖v‖). But many useful vector spaces, such as polynomials, continuous functions, and matrices, do not come with a built-in dot product, so we need a generalization.
An inner product is that generalization. Once you define an inner product, you automatically get a notion of length (the induced norm) and angle (via the generalized cosine formula).
Definition
An inner product on a real vector space V is a function ⟨·, ·⟩ : V × V → ℝ satisfying three axioms:
Symmetry: ⟨u, v⟩ = ⟨v, u⟩
Linearity in the first argument: ⟨au + bw, v⟩ = a⟨u, v⟩ + b⟨w, v⟩
Positive definiteness: ⟨v, v⟩ ≥ 0, with equality if and only if v = 0
For complex vector spaces, symmetry becomes conjugate symmetry: ⟨u, v⟩ = ⟨v, u⟩*, where * denotes complex conjugation.
A vector space with an inner product is called an inner product space.
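The three axioms can be checked numerically. Below is a minimal sketch in pure Python that verifies them for the standard dot product on random vectors; the vectors and scalars are arbitrary test values, not anything canonical.

```python
import math
import random

def inner(u, v):
    """Standard inner product (dot product) on R^n."""
    return sum(ui * vi for ui, vi in zip(u, v))

random.seed(0)  # arbitrary seed for reproducible test vectors
u = [random.uniform(-1, 1) for _ in range(3)]
v = [random.uniform(-1, 1) for _ in range(3)]
w = [random.uniform(-1, 1) for _ in range(3)]
a, b = 2.0, -3.0  # arbitrary scalars

# Symmetry: <u, v> = <v, u>
assert math.isclose(inner(u, v), inner(v, u))

# Linearity in the first argument: <a*u + b*w, v> = a<u, v> + b<w, v>
lhs = inner([a * ui + b * wi for ui, wi in zip(u, w)], v)
rhs = a * inner(u, v) + b * inner(w, v)
assert math.isclose(lhs, rhs)

# Positive definiteness: <v, v> >= 0, with equality only at the zero vector
assert inner(v, v) >= 0
assert inner([0.0, 0.0, 0.0], [0.0, 0.0, 0.0]) == 0.0
print("all three axioms hold for the dot product")
```

A numerical check on a few vectors is of course not a proof, but it is a useful sanity test when you define a new candidate inner product.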
Examples
Standard Inner Product on ℝⁿ
⟨u, v⟩ = u₁v₁ + u₂v₂ + ⋯ + uₙvₙ
This is just the dot product.
Weighted Inner Product
⟨u, v⟩ = w₁u₁v₁ + ⋯ + wₙuₙvₙ, with fixed weights wᵢ > 0. The weights must be positive so that positive definiteness holds.
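A weighted inner product is a one-liner in Python. The weights below are illustrative values chosen for the example, nothing more.

```python
def weighted_inner(u, v, w):
    """Weighted inner product: sum of w_i * u_i * v_i, with weights w_i > 0."""
    return sum(wi * ui * vi for wi, ui, vi in zip(w, u, v))

weights = [1.0, 2.0, 3.0]  # illustrative positive weights
u = [1.0, 1.0, 1.0]
v = [1.0, -1.0, 1.0]
print(weighted_inner(u, v, weights))  # 1*1 - 2*1 + 3*1 = 2.0
```

Choosing all weights equal to 1 recovers the standard dot product.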
Inner Product on Functions
For continuous functions on an interval [a, b]:
⟨f, g⟩ = ∫ₐᵇ f(x)g(x) dx
This inner product underlies Fourier series—sine and cosine functions are orthogonal under this product.
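The orthogonality behind Fourier series can be checked numerically. This sketch approximates the function inner product with a midpoint sum and confirms that sin and cos are orthogonal on [0, 2π]; the grid size n is an arbitrary accuracy choice.

```python
import math

def inner_fn(f, g, a, b, n=100_000):
    """Approximate <f, g> = integral_a^b f(x) g(x) dx by a midpoint sum."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) * g(a + (i + 0.5) * h) for i in range(n)) * h

a, b = 0.0, 2 * math.pi
print(inner_fn(math.sin, math.cos, a, b))  # ≈ 0: sin and cos are orthogonal
print(inner_fn(math.sin, math.sin, a, b))  # ≈ pi: <sin, sin> = pi on [0, 2pi]
```

That ⟨sin, sin⟩ = π while ⟨sin, cos⟩ = 0 is exactly what makes Fourier coefficients computable by integration against each basis function.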
Frobenius Inner Product on Matrices
⟨A, B⟩ = tr(AᵀB) = Σᵢⱼ AᵢⱼBᵢⱼ
The sum of element-wise products, treating the matrix as one long vector.
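Both forms of the Frobenius inner product, the element-wise sum and tr(AᵀB), are easy to compute directly; this sketch checks that they agree on small example matrices.

```python
def frobenius(A, B):
    """Frobenius inner product: sum of element-wise products of A and B."""
    return sum(a * b for row_a, row_b in zip(A, B) for a, b in zip(row_a, row_b))

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(frobenius(A, B))  # 1*5 + 2*6 + 3*7 + 4*8 = 70

# Same value via trace(A^T B) for these 2x2 matrices
AtB = [[sum(A[k][i] * B[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]
print(AtB[0][0] + AtB[1][1])  # trace = 70, matching the element-wise sum
```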
Induced Norm
Every inner product induces a norm:
‖v‖ = √⟨v, v⟩
This generalizes length. For the standard inner product on ℝⁿ, it is the familiar Euclidean norm √(v₁² + ⋯ + vₙ²).
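The induced norm for the standard inner product is the Euclidean length, as a quick sketch confirms on the classic 3-4-5 triangle:

```python
import math

def norm(v):
    """Induced norm ||v|| = sqrt(<v, v>), using the standard inner product."""
    return math.sqrt(sum(x * x for x in v))

print(norm([3.0, 4.0]))  # 5.0, the familiar Euclidean length
```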
Angle and Orthogonality
The angle θ between nonzero vectors u and v is defined by
cos θ = ⟨u, v⟩ / (‖u‖ ‖v‖)
Vectors are orthogonal (perpendicular) if ⟨u, v⟩ = 0.
A set of vectors is orthogonal if every pair is orthogonal. It is orthonormal if, in addition, each vector has norm 1.
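The angle formula translates directly into code. A minimal sketch, assuming nonzero inputs:

```python
import math

def inner(u, v):
    return sum(a * b for a, b in zip(u, v))

def angle(u, v):
    """Angle between nonzero u, v: arccos(<u, v> / (||u|| ||v||))."""
    return math.acos(inner(u, v) / math.sqrt(inner(u, u) * inner(v, v)))

print(angle([1, 0], [0, 1]))  # ≈ 1.5708 (pi/2): orthogonal vectors
print(angle([1, 1], [1, 0]))  # ≈ 0.7854 (pi/4)
```

In floating point, the ratio can drift a hair outside [−1, 1] for nearly parallel vectors, so production code typically clamps it before calling acos.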
Cauchy-Schwarz Inequality
For any inner product:
|⟨u, v⟩| ≤ ‖u‖ ‖v‖
Equality holds iff u and v are linearly dependent. This is perhaps the most important inequality in analysis; it guarantees the angle formula makes sense (the ratio ⟨u, v⟩/(‖u‖‖v‖) lies between −1 and 1).
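Cauchy-Schwarz can be stress-tested on random vectors, and the equality case exhibited with a linearly dependent pair. A sketch (the seed and dimensions are arbitrary):

```python
import math
import random

def inner(u, v):
    return sum(a * b for a, b in zip(u, v))

random.seed(1)  # arbitrary seed for reproducibility
for _ in range(1000):
    u = [random.gauss(0, 1) for _ in range(4)]
    v = [random.gauss(0, 1) for _ in range(4)]
    # |<u, v>| <= ||u|| ||v||  (small tolerance for floating-point roundoff)
    assert abs(inner(u, v)) <= math.sqrt(inner(u, u) * inner(v, v)) + 1e-12

# Equality when u and v are linearly dependent: here v = 2u
u = [1.0, 2.0, 3.0]
v = [2.0, 4.0, 6.0]
print(math.isclose(abs(inner(u, v)), math.sqrt(inner(u, u) * inner(v, v))))  # True
```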
Worked Example
In the function space C[0, 1] with inner product ⟨f, g⟩ = ∫₀¹ f(x)g(x) dx, are f(x) = 1 and g(x) = x − 1/2 orthogonal?
Compute:
⟨f, g⟩ = ∫₀¹ 1 · (x − 1/2) dx = [x²/2 − x/2]₀¹ = 1/2 − 1/2 = 0
Yes, f and g are orthogonal: their inner product is zero.
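The same integral can be checked numerically with a midpoint sum, which is a handy way to verify hand calculations like this one (the grid size n is an arbitrary accuracy choice):

```python
def inner_fn(f, g, n=100_000):
    """Midpoint-sum approximation of <f, g> = integral_0^1 f(x) g(x) dx."""
    h = 1.0 / n
    return sum(f((i + 0.5) * h) * g((i + 0.5) * h) for i in range(n)) * h

f = lambda x: 1.0        # f(x) = 1
g = lambda x: x - 0.5    # g(x) = x - 1/2
print(abs(inner_fn(f, g)) < 1e-9)  # True: the integral is (numerically) zero
```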
Orthogonal Projection
The projection of u onto a nonzero vector v is
proj_v(u) = (⟨u, v⟩ / ⟨v, v⟩) v
This is the component of u in the direction of v; the residual u − proj_v(u) is orthogonal to v.
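The projection formula and the orthogonality of the residual can be seen in a small sketch:

```python
def inner(u, v):
    return sum(a * b for a, b in zip(u, v))

def project(u, v):
    """proj_v(u) = (<u, v> / <v, v>) v, for nonzero v."""
    c = inner(u, v) / inner(v, v)
    return [c * vi for vi in v]

u, v = [3.0, 4.0], [1.0, 0.0]
p = project(u, v)
r = [ui - pi for ui, pi in zip(u, p)]  # residual u - proj_v(u)
print(p)            # [3.0, 0.0]: the component of u along v
print(inner(r, v))  # 0.0: the residual is orthogonal to v
```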
Applications
In Fourier analysis, functions are decomposed into orthogonal sine/cosine components using the function inner product.
In quantum mechanics, states are vectors in a Hilbert space (complete inner product space). The inner product gives probability amplitudes.
In least squares, the best approximation minimizes the norm of the residual—finding the point in a subspace closest to the target.
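The least-squares idea can be sketched with inner products alone: project a target b onto the span of two vectors by solving the 2×2 normal equations whose coefficients are inner products (the Gram matrix). The vectors a1, a2, b below are hypothetical example data.

```python
def inner(u, v):
    return sum(a * b for a, b in zip(u, v))

# Hypothetical example: project b onto span{a1, a2} in R^3.
a1, a2 = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]
b = [2.0, 3.0, 4.0]

# Normal equations G c = rhs, where G[i][j] = <a_i, a_j>.
G = [[inner(a1, a1), inner(a1, a2)],
     [inner(a2, a1), inner(a2, a2)]]
rhs = [inner(a1, b), inner(a2, b)]

# Solve the 2x2 system by Cramer's rule.
det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
c1 = (rhs[0] * G[1][1] - rhs[1] * G[0][1]) / det
c2 = (rhs[1] * G[0][0] - rhs[0] * G[1][0]) / det

proj = [c1 * x + c2 * y for x, y in zip(a1, a2)]
residual = [bi - pi for bi, pi in zip(b, proj)]
# The residual is orthogonal to every vector in the subspace:
print(inner(residual, a1), inner(residual, a2))  # 0.0 0.0
```

The orthogonality of the residual to the subspace is precisely what characterizes the closest point, which is why least squares is a projection problem.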
In machine learning, kernel methods implicitly work in high-dimensional inner product spaces.
Summary
An inner product generalizes the dot product to abstract spaces, satisfying symmetry, linearity, and positive definiteness. It induces a norm and defines orthogonality. The Cauchy-Schwarz inequality ensures angles are well-defined. Common inner products include the standard dot product, weighted products, function integrals, and matrix trace products. Inner product spaces enable geometric reasoning in settings far beyond ℝⁿ.