Linear Transformations
Functions that preserve vector structure and reshape space
What is a Linear Transformation?
In the previous chapter, we saw that matrices are grids of numbers whose columns are vectors. Now we discover their geometric soul: a matrix is not just data—it is a linear transformation, a function that reshapes space itself.
A linear transformation takes vectors as input and produces vectors as output, while preserving the fundamental operations of vector addition and scalar multiplication.
Think of it as a way to transform space itself. When you apply a linear transformation, every point in space moves to a new location, but the transformation follows strict rules that keep the structure of space intact.
For a function T to be a linear transformation, it must satisfy two properties:

1. Additivity: T(u + v) = T(u) + T(v)
2. Homogeneity: T(c · v) = c · T(v)
The first property says that transforming a sum of vectors is the same as summing the transformed vectors. The second says that scaling before or after the transformation gives the same result. Together, these constraints mean that lines stay lines, the origin stays fixed, and parallel lines remain parallel.
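Both properties can be checked numerically. Here is a minimal plain-Python sketch (the particular map T and the test vectors are illustrative choices, not taken from the text) verifying additivity and homogeneity on concrete inputs:

```python
# A linear map in 2D: here T is a shear that sends i -> (1, 0) and j -> (1, 1).
def T(v):
    x, y = v
    return (1 * x + 1 * y, 0 * x + 1 * y)

def add(u, v):
    return (u[0] + v[0], u[1] + v[1])

def scale(c, v):
    return (c * v[0], c * v[1])

u, v, c = (2, 1), (-1, 3), 4.0

# Additivity: transforming a sum equals summing the transforms.
assert T(add(u, v)) == add(T(u), T(v))
# Homogeneity: scaling before or after the transform gives the same result.
assert T(scale(c, v)) == scale(c, T(v))
print("both linearity properties hold for these inputs")
```

A non-linear function, such as one that adds a constant offset, would fail the additivity check, since it moves the origin.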
Basis Vectors Determine Everything
Here is a powerful insight: to completely describe any linear transformation in 2D, you only need to know where two vectors land—the basis vectors i and j.
The basis vector i points one unit to the right (1, 0), and j points one unit up (0, 1). Every other vector can be written as a combination of these two, so once we know how i and j transform, we know how every vector transforms.
Interactive: Drag the basis vectors to transform space
Drag the tips of i (blue) and j (red) to transform space
Any vector (x, y) transforms to x copies of the transformed i plus y copies of the transformed j. The transformed basis vectors become the columns of a matrix: if i lands at (a, b) and j lands at (c, d), the matrix is

[ a  c ]
[ b  d ]
This matrix completely encodes the transformation. The first column tells you where i lands, the second column tells you where j lands. This connection between matrices and transformations is one of the most important ideas in linear algebra.
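As a sketch of this idea, the snippet below (helper names like `matrix_from_basis` are invented for illustration) builds the matrix from the images of i and j, then applies it to a point:

```python
# Build the matrix of a 2D linear map from where the basis vectors land.
# The columns are the images of i = (1, 0) and j = (0, 1).
def matrix_from_basis(i_image, j_image):
    return [[i_image[0], j_image[0]],
            [i_image[1], j_image[1]]]

def apply(M, v):
    x, y = v
    return (M[0][0] * x + M[0][1] * y,
            M[1][0] * x + M[1][1] * y)

# Example: i lands at (0, 1) and j lands at (-1, 0), a 90-degree rotation.
M = matrix_from_basis((0, 1), (-1, 0))
print(apply(M, (2, 1)))   # the point (2, 1) lands at (-1, 2)
```

Knowing only two landing points is enough to transform every point in the plane.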
Rotation
One of the most intuitive transformations is rotation. When we rotate space by an angle theta around the origin, every point moves along an arc of a circle centered at the origin.
The basis vectors rotate just like everything else. After rotating by angle theta, the vector i (which started at (1, 0)) lands at (cos theta, sin theta). Similarly, j (which started at (0, 1)) lands at (-sin theta, cos theta).
Interactive: Rotation transformation
The green dot shows the original point (2, 1), the purple dot shows where it lands
Notice how the grid rotates as a whole without stretching or squishing. Rotation preserves distances and angles—it is a rigid transformation. The determinant of a rotation matrix is always 1, which means it preserves area as well.
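The rotation matrix described above, with columns (cos theta, sin theta) and (-sin theta, cos theta), can be sketched in a few lines of Python (helper names are illustrative):

```python
import math

# Rotation by angle theta: the columns are the rotated basis vectors.
def rotation(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s],
            [s,  c]]

def apply(M, v):
    x, y = v
    return (M[0][0] * x + M[0][1] * y,
            M[1][0] * x + M[1][1] * y)

def det(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

R = rotation(math.pi / 2)          # 90 degrees
x, y = apply(R, (2, 1))
print(round(x, 9), round(y, 9))    # (2, 1) rotates to (-1, 2)
print(round(det(R), 9))            # the determinant is always 1
```

The determinant cos² theta + sin² theta equals 1 for every angle, matching the claim that rotation preserves area.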
Shear
A shear transformation slides space in one direction while keeping lines parallel. Imagine pushing the top of a deck of cards to the side while keeping the bottom fixed—that is a shear.
In a horizontal shear, i stays where it is, but j tilts sideways. The amount of tilt is the shear factor. Vertical shear does the opposite—j stays fixed while i tilts up or down.
Interactive: Shear transformation
Shearing slides one axis relative to the other, like tilting a deck of cards
A horizontal shear by factor k has the matrix:

[ 1  k ]
[ 0  1 ]
Shears are interesting because they change angles but preserve area. The unit square becomes a parallelogram with the same area. This is reflected in the determinant, which equals 1 for any shear matrix.
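The area-preserving behavior is easy to see numerically. This plain-Python sketch (an illustrative shear factor of 1.5) maps the unit square's corners and checks the determinant:

```python
# Horizontal shear by factor k: i stays at (1, 0), j tilts to (k, 1).
def shear(k):
    return [[1, k],
            [0, 1]]

def det(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

M = shear(1.5)
corners = [(0, 0), (1, 0), (0, 1), (1, 1)]
sheared = [(M[0][0] * x + M[0][1] * y, M[1][0] * x + M[1][1] * y)
           for x, y in corners]
print(sheared)    # the top edge slides right by 1.5; the bottom edge is fixed
print(det(M))     # 1: area is preserved for any shear factor
```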
Scaling
Scaling stretches or compresses space along the coordinate axes. Uniform scaling (same factor in both directions) makes everything bigger or smaller. Non-uniform scaling stretches differently in each direction, turning circles into ellipses and squares into rectangles.
Interactive: Scale transformation
Scaling stretches or squishes space along each axis independently
When a scale factor is negative, it flips space across that axis. A scale of (-1, 1) reflects across the y-axis, while (-1, -1) rotates by 180 degrees. The determinant of a scaling matrix is the product of the two scale factors, which tells you how areas change—negative when there is a reflection.
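A short plain-Python sketch (scale factors chosen for illustration) confirms how the determinant tracks area change and reflection:

```python
# Scaling by (sx, sy): i lands at (sx, 0), j lands at (0, sy).
def scaling(sx, sy):
    return [[sx, 0],
            [0, sy]]

def det(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

print(det(scaling(2, 3)))     # 6: areas grow by a factor of 6
print(det(scaling(-1, 1)))    # -1: reflection across the y-axis flips orientation
print(det(scaling(-1, -1)))   # 1: two flips combine into a 180-degree rotation
```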
Matrices as Transformations
We have now seen how several common transformations correspond to matrices. This connection goes both ways: every 2x2 matrix describes a linear transformation, and every linear transformation in 2D can be represented by a 2x2 matrix.
To apply a transformation matrix to a vector, we multiply:

[ a  c ] [ x ]   [ ax + cy ]
[ b  d ] [ y ] = [ bx + dy ]
This formula says: take x copies of the first column, add y copies of the second column. The result is where the point (x, y) lands after the transformation. The columns of the matrix are the transformed basis vectors, and matrix multiplication just expresses the vector as a combination of those transformed bases.
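The two readings of the formula—dot products with rows, or a combination of columns—give the same answer, as this plain-Python sketch shows (the matrix and point are illustrative):

```python
# Two equivalent views of applying a 2x2 matrix to the vector (x, y).
M = [[1, -1],
     [2,  0]]
x, y = 3, 2

# Row view: each output coordinate is a dot product with a row of M.
row_view = (M[0][0] * x + M[0][1] * y,
            M[1][0] * x + M[1][1] * y)

# Column view: x copies of the first column plus y copies of the second.
col1 = (M[0][0], M[1][0])
col2 = (M[0][1], M[1][1])
col_view = (x * col1[0] + y * col2[0],
            x * col1[1] + y * col2[1])

print(row_view, col_view)   # both give (1, 6)
```

The column view is the geometric one: it expresses the input as a combination of the transformed basis vectors.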
This perspective—matrices as transformations—is the key to understanding why matrix multiplication works the way it does, why the determinant measures area scaling, and why eigenvectors are so important. In the chapters ahead, we will see how composing transformations corresponds to multiplying matrices, and how the special vectors that do not change direction under a transformation reveal its fundamental structure.
Key Takeaways
- A linear transformation is a function that preserves vector addition and scalar multiplication
- Linear transformations keep the origin fixed, preserve lines, and maintain parallel relationships
- The entire transformation is determined by where the basis vectors i and j land
- A 2x2 matrix encodes a transformation: columns are the transformed basis vectors
- Common transformations include rotation, scaling, shear, and reflection—each with a characteristic matrix form
- Matrix-vector multiplication computes where a point lands after the transformation