Partial Derivatives
Subtopic: Multivariable
Partial derivatives measure how a function of several variables changes when you vary one variable while holding others fixed. They're the building blocks of multivariable calculus—combining to form gradients, Jacobians, and Hessians. Understanding partial derivatives is essential for optimization, physics, and machine learning.
Introduction
A function like f(x, y) has two (or more) inputs, so it has a separate rate of change in each input direction. A partial derivative isolates one of them: how does f respond when only one variable moves and the rest stay fixed?
It's like asking how temperature varies with latitude vs longitude—each direction gives a different rate of change.
Definition
The partial derivative of f with respect to x, written ∂f/∂x, is defined by the limit

∂f/∂x = lim_{h→0} [f(x + h, y) − f(x, y)] / h
Treat y as a constant and differentiate with respect to x using ordinary derivative rules.
Similarly for ∂f/∂y: treat x as a constant and differentiate with respect to y. The limit definition replaces the step in x with a step in y:

∂f/∂y = lim_{h→0} [f(x, y + h) − f(x, y)] / h
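The limit definition translates directly into a numerical approximation: pick a small step h and form the difference quotient. A minimal sketch (the helper names, the example function, and the step size are illustrative choices):

```python
def partial_x(f, x, y, h=1e-6):
    """Approximate ∂f/∂x at (x, y) with a forward difference quotient."""
    return (f(x + h, y) - f(x, y)) / h

def partial_y(f, x, y, h=1e-6):
    """Approximate ∂f/∂y at (x, y) with a forward difference quotient."""
    return (f(x, y + h) - f(x, y)) / h

# Example: f(x, y) = x*y**2, so ∂f/∂x = y**2 and ∂f/∂y = 2xy.
f = lambda x, y: x * y**2
print(partial_x(f, 2.0, 3.0))  # ≈ 9.0
print(partial_y(f, 2.0, 3.0))  # ≈ 12.0
```

Shrinking h drives the quotient toward the true partial derivative, exactly as the limit says.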
Worked Example
Find the partial derivatives of f(x, y) = x²y + y³.

∂f/∂x (treat y as constant)
The y³ term is constant with respect to x, so it contributes nothing, and x²y differentiates to give

∂f/∂x = 2xy

∂f/∂y (treat x as constant)
Now x² is just a constant coefficient on y, and y³ differentiates as usual:

∂f/∂y = x² + 3y²
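Hand-computed partials are easy to sanity-check against a difference quotient. A small sketch, using f(x, y) = x²y + y³ as a concrete example (the helper name and test point are illustrative):

```python
def fd_partial(f, x, y, h=1e-6, wrt="x"):
    """Central-difference estimate of a partial derivative (more accurate than one-sided)."""
    if wrt == "x":
        return (f(x + h, y) - f(x - h, y)) / (2 * h)
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

f = lambda x, y: x**2 * y + y**3
x0, y0 = 1.5, -0.5

# Hand-computed: ∂f/∂x = 2xy and ∂f/∂y = x² + 3y².
print(abs(fd_partial(f, x0, y0, wrt="x") - 2 * x0 * y0) < 1e-6)          # True
print(abs(fd_partial(f, x0, y0, wrt="y") - (x0**2 + 3 * y0**2)) < 1e-6)  # True
```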
Geometric Interpretation
For z = f(x, y), the graph is a surface over the xy-plane. Slicing it with the plane y = y₀ gives a curve whose slope at a point is ∂f/∂x; slicing with x = x₀ gives a curve whose slope is ∂f/∂y.
Each partial derivative measures steepness along one coordinate axis.
Higher-Order Partials
You can take partial derivatives of partial derivatives: ∂²f/∂x² and ∂²f/∂y² differentiate twice in the same variable.
Mixed partials differentiate in one variable and then the other: ∂²f/∂y∂x (first x, then y) and ∂²f/∂x∂y (first y, then x).
Clairaut's Theorem
If the mixed partials are continuous, they're equal:

∂²f/∂x∂y = ∂²f/∂y∂x
Order doesn't matter (for "nice" functions).
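Clairaut's Theorem can be illustrated numerically: estimate the mixed partial with a difference stencil and compare it with the analytic answer computed in either order. The function, test point, and step size below are illustrative choices:

```python
import math

def mixed_partial(f, x, y, h=1e-4):
    """Central-difference estimate of ∂²f/∂y∂x via a symmetric four-point stencil."""
    return (f(x + h, y + h) - f(x - h, y + h)
            - f(x + h, y - h) + f(x - h, y - h)) / (4 * h * h)

f = lambda x, y: math.sin(x * y) + x**3 * y**2

# Differentiating in either order gives the same analytic mixed partial:
# ∂²f/∂y∂x = ∂²f/∂x∂y = cos(xy) − xy·sin(xy) + 6x²y
x0, y0 = 0.7, 1.2
exact = math.cos(x0 * y0) - x0 * y0 * math.sin(x0 * y0) + 6 * x0**2 * y0

print(abs(mixed_partial(f, x0, y0) - exact) < 1e-5)  # True
```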
The Gradient
The gradient collects all partial derivatives into a vector:

∇f = (∂f/∂x, ∂f/∂y), with one component per variable.
It points in the direction of steepest increase; its magnitude is the steepest slope.
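A small numerical illustration of that claim: step a short distance along the gradient and f goes up; step against it and f goes down. The quadratic f, test point, and step size are illustrative choices:

```python
import math

def grad(f, x, y, h=1e-6):
    """Numerical gradient: the vector of central-difference partials."""
    return ((f(x + h, y) - f(x - h, y)) / (2 * h),
            (f(x, y + h) - f(x, y - h)) / (2 * h))

f = lambda x, y: x**2 + 3 * y**2
gx, gy = grad(f, 1.0, 1.0)      # analytic gradient at (1, 1) is (2, 6)
norm = math.hypot(gx, gy)

# A small unit step along ∇f increases f; a step along −∇f decreases it.
step = 1e-3
up   = f(1.0 + step * gx / norm, 1.0 + step * gy / norm)
down = f(1.0 - step * gx / norm, 1.0 - step * gy / norm)
print(up > f(1.0, 1.0) > down)  # True
```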
Chain Rule for Partials
If z = f(x, y) where x = x(t) and y = y(t), then

dz/dt = (∂f/∂x)(dx/dt) + (∂f/∂y)(dy/dt)

Changes propagate through all paths from t to z: t affects z through x and through y, and the contributions add.
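The chain rule can be checked numerically: apply the formula with analytic partials, then differentiate z(t) = f(x(t), y(t)) directly. The particular f, x(t), and y(t) below are illustrative:

```python
import math

f = lambda x, y: x**2 * y   # ∂f/∂x = 2xy, ∂f/∂y = x²
x = lambda t: math.cos(t)   # dx/dt = −sin(t)
y = lambda t: t**2          # dy/dt = 2t

t0 = 0.8
# Chain rule: dz/dt = (∂f/∂x)(dx/dt) + (∂f/∂y)(dy/dt)
chain = (2 * x(t0) * y(t0)) * (-math.sin(t0)) + (x(t0) ** 2) * (2 * t0)

# Direct numerical derivative of z(t) = f(x(t), y(t))
h = 1e-6
z = lambda t: f(x(t), y(t))
direct = (z(t0 + h) - z(t0 - h)) / (2 * h)

print(abs(chain - direct) < 1e-6)  # True: the two agree
```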
Applications
In physics, partial derivatives describe how physical quantities depend on multiple variables (pressure, temperature, volume).
In economics, marginal analysis uses partials to measure sensitivity to price, quantity, etc.
In machine learning, gradients (vectors of partials) drive optimization of loss functions.
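A minimal sketch of that last idea, assuming a hypothetical two-point dataset and numerical partials (real systems compute gradients with automatic differentiation):

```python
def loss(w, b):
    """Toy squared-error loss for fitting y = w·x + b. Data are hypothetical; true fit is w=2, b=1."""
    data = [(1.0, 3.0), (2.0, 5.0)]
    return sum((w * x + b - y) ** 2 for x, y in data)

def numeric_grad(g, w, b, h=1e-6):
    """Gradient of g(w, b) as a pair of central-difference partials."""
    return ((g(w + h, b) - g(w - h, b)) / (2 * h),
            (g(w, b + h) - g(w, b - h)) / (2 * h))

w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    dw, db = numeric_grad(loss, w, b)
    w -= lr * dw   # step against the gradient: the direction of steepest descent
    b -= lr * db

print(round(w, 2), round(b, 2))  # converges close to (2.0, 1.0)
```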
Summary
Partial derivatives measure the rate of change with respect to one variable, holding others constant. Compute by treating other variables as constants.