Dijkstra's Algorithm
Finding the cheapest path
The Greedy Insight
BFS finds the shortest path when every step costs the same. But what if some steps are harder than others? Imagine crossing a mountain range: you could go straight over the peak, or take a longer route through a valley. The valley path has more steps, but might be far less exhausting.
Dijkstra's algorithm solves this by being greedy in the right way. At each step, it always expands the node with the smallest known distance from the start. Not the most recently discovered node (like DFS), not the first discovered (like BFS), but the closest.
The key data structure is a priority queue ordered by distance. Unlike a regular queue where the first item in is the first out, a priority queue always serves the item with the highest priority—in our case, the lowest distance.
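In Python, for instance, the standard library's heapq module provides exactly this behavior: a min-heap that always pops the smallest entry, no matter when it was inserted. A quick sketch (node names are invented for illustration):

```python
import heapq

# A min-heap serves the smallest entry first, regardless of insertion order.
# Each entry is a (distance, node) pair, so the heap orders by distance.
pq = []
heapq.heappush(pq, (7, "C"))
heapq.heappush(pq, (2, "A"))
heapq.heappush(pq, (5, "B"))

print(heapq.heappop(pq))  # (2, 'A'): lowest distance, not first inserted
```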
Here is the crucial guarantee: when we pop a node from the priority queue, we have found its shortest path. We will never discover a cheaper way to reach that node later. This is what makes Dijkstra's algorithm correct.
Relaxation
The heart of Dijkstra is a simple question asked for each neighbor: "Is this path cheaper than what I knew before?"
When we visit a node and examine its neighbors, we calculate the cost to reach each neighbor through the current node. If this cost is less than the best known cost to that neighbor, we update our records. This operation is called relaxation—we are relaxing our estimate of the shortest distance.
Think of it like this: you initially assume every destination is infinitely far away. As you explore, you gradually "relax" these pessimistic estimates whenever you find evidence of a shorter path.
Interactive: Relaxation in Action
Click the Relax buttons to update distances. The edge is relaxed only if the new path is shorter. This is the core operation that makes Dijkstra work.
The relaxation step looks like this in pseudocode:
for each neighbor of current:
    new_distance = distance[current] + edge_weight
    if new_distance < distance[neighbor]:
        distance[neighbor] = new_distance
        parent[neighbor] = current
        add neighbor to priority queue with priority new_distance
Notice that we might add the same node to the priority queue multiple times with different distances. That is fine—when we pop a node that has already been visited, we simply skip it.
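As a concrete sketch, the relaxation step might look like this in Python (the relax helper and the sample distances are illustrative, not part of any library):

```python
import heapq

def relax(current, neighbor, weight, distance, parent, pq):
    """Try to improve the best known distance to `neighbor` via `current`."""
    new_distance = distance[current] + weight
    if new_distance < distance.get(neighbor, float("inf")):
        distance[neighbor] = new_distance
        parent[neighbor] = current
        heapq.heappush(pq, (new_distance, neighbor))  # stale entries are skipped on pop
        return True   # the edge was relaxed
    return False      # the known path was already at least as good

# Example: reaching B was believed to cost 5, but going via A costs 1 + 2 = 3.
distance = {"S": 0, "A": 1, "B": 5}
parent, pq = {}, []
relax("A", "B", 2, distance, parent, pq)
print(distance["B"])  # 3
```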
Watching Dijkstra Explore
Now let us see the algorithm in action on a weighted grid. Each cell has a traversal cost shown by its shade—lighter cells are cheap to cross, darker cells are expensive.
Interactive: Dijkstra's Algorithm
Watch Dijkstra explore the weighted terrain. Numbers show accumulated distance from start. The priority queue always processes the node with the smallest distance first.
Watch the priority queue: the node at the front always has the smallest distance. As nodes get relaxed, their distances update and they reorder in the queue. The algorithm expands outward, but it favors cheap terrain—it will explore further along easy paths before tackling expensive ones.
Why It Works
The correctness of Dijkstra relies on one critical assumption: all edge weights are non-negative. Let us see why this matters.
When we pop a node u from the priority queue, we know its distance is minimal among all nodes still in the queue. Could there be a cheaper path to u that we have not discovered yet? Such a path would have to go through some other node v still in the queue. But v's distance is already greater than or equal to u's distance. And since all edges have non-negative weight, any path through v could only add to the cost, never reduce it.
This is the key insight: non-negative weights ensure we never find a cheaper path to an already-extracted node. Once a node is popped, its distance is final.
If edges could be negative, this guarantee breaks. A path through a node with a larger initial distance could end up shorter after traversing a negative edge. For graphs with negative edges, we need more sophisticated algorithms like Bellman-Ford.
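A tiny counterexample makes the failure concrete. In the three-node graph below (node names S, A, B are invented for illustration), the true cheapest path to A goes through a negative edge, but Dijkstra finalizes A before that path is discovered:

```python
import heapq

# S→A costs 2, S→B costs 5, and B→A costs -4,
# so the true cheapest path to A is S→B→A with total cost 1.
graph = {"S": [("A", 2), ("B", 5)], "A": [], "B": [("A", -4)]}

def dijkstra_distances(graph, start):
    final = {}                      # distance fixed at the moment a node is popped
    pq = [(0, start)]
    while pq:
        dist, node = heapq.heappop(pq)
        if node in final:
            continue
        final[node] = dist          # Dijkstra treats this as the answer
        for neighbor, weight in graph[node]:
            if neighbor not in final:
                heapq.heappush(pq, (dist + weight, neighbor))
    return final

print(dijkstra_distances(graph, "S")["A"])  # 2, but the true shortest cost is 1
```

A is popped at distance 2 and finalized; the cheaper route through B (cost 1) is only found afterwards and is ignored.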
Dijkstra vs BFS
How much difference does considering weights make? Let us run both algorithms on the same weighted terrain.
Interactive: Dijkstra vs BFS
BFS
Ignores weights
Dijkstra
Respects weights
BFS finds the path with the fewest edges (steps), but ignores terrain cost. Dijkstra finds the path with minimum total cost, even if it means taking more steps. The shaded region has high traversal cost.
BFS charges straight ahead, finding the path with the fewest steps. Dijkstra takes a detour to avoid the expensive terrain. The BFS path is shorter in distance (number of edges), but Dijkstra's path is shorter in cost (sum of weights).
When all weights are equal to 1, Dijkstra and BFS find the same path—Dijkstra degenerates into BFS. But when weights vary, only Dijkstra guarantees the minimum-cost route.
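The contrast can be sketched in a few lines of Python (the graph and helper names are illustrative): one heavy single-edge shortcut versus a light three-edge detour.

```python
import heapq
from collections import deque

# One heavy shortcut (S→G, cost 10) versus a light detour (S→A→B→G, cost 3).
graph = {
    "S": [("G", 10), ("A", 1)],
    "A": [("B", 1)],
    "B": [("G", 1)],
    "G": [],
}

def bfs_edge_count(graph, start, goal):
    """Fewest edges from start to goal, ignoring weights."""
    queue, seen = deque([(start, 0)]), {start}
    while queue:
        node, steps = queue.popleft()
        if node == goal:
            return steps
        for neighbor, _ in graph[node]:
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, steps + 1))

def dijkstra_cost(graph, start, goal):
    """Minimum total edge weight from start to goal."""
    pq, final = [(0, start)], set()
    while pq:
        dist, node = heapq.heappop(pq)
        if node == goal:
            return dist
        if node in final:
            continue
        final.add(node)
        for neighbor, weight in graph[node]:
            if neighbor not in final:
                heapq.heappush(pq, (dist + weight, neighbor))

print(bfs_edge_count(graph, "S", "G"))  # 1: takes the direct edge
print(dijkstra_cost(graph, "S", "G"))   # 3: detours through A and B
```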
The Algorithm
Here is Dijkstra's algorithm in pseudocode:
Dijkstra(graph, start, goal):
    priority_queue ← empty min-heap
    distance ← map with all nodes set to ∞
    parent ← empty map
    visited ← empty set
    distance[start] ← 0
    add start to priority_queue with priority 0
    while priority_queue is not empty:
        current ← extract_min from priority_queue
        if current in visited:
            continue
        add current to visited
        if current = goal:
            return reconstruct_path(parent, goal)
        for each neighbor of current:
            if neighbor in visited:
                continue
            new_dist ← distance[current] + weight(current, neighbor)
            if new_dist < distance[neighbor]:
                distance[neighbor] ← new_dist
                parent[neighbor] ← current
                add neighbor to priority_queue with priority new_dist
    return "no path found"
The priority queue operations are the performance bottleneck. With a binary heap, both insert and extract_min take O(log n) time. Since we might add each edge to the queue once, the total time complexity is O(E log V).
With a Fibonacci heap, we can achieve O(E + V log V), which is better when the graph is dense. But for most practical purposes, a binary heap works well.
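The pseudocode translates almost line for line into Python, using heapq as the min-priority queue. A minimal sketch (the sample graph at the bottom is an invented example):

```python
import heapq

def dijkstra(graph, start, goal):
    """graph: node -> list of (neighbor, weight). Returns (cost, path) or None."""
    distance = {start: 0}
    parent = {}
    visited = set()
    pq = [(0, start)]
    while pq:
        dist, current = heapq.heappop(pq)
        if current in visited:
            continue              # stale entry: a shorter path was already finalized
        visited.add(current)
        if current == goal:
            # Walk parent pointers back to the start to reconstruct the path.
            path = [goal]
            while path[-1] != start:
                path.append(parent[path[-1]])
            return dist, path[::-1]
        for neighbor, weight in graph.get(current, []):
            if neighbor in visited:
                continue
            new_dist = dist + weight
            if new_dist < distance.get(neighbor, float("inf")):
                distance[neighbor] = new_dist
                parent[neighbor] = current
                heapq.heappush(pq, (new_dist, neighbor))
    return None                   # goal unreachable

graph = {"S": [("A", 1), ("B", 4)], "A": [("B", 2), ("G", 6)], "B": [("G", 3)]}
print(dijkstra(graph, "S", "G"))  # (6, ['S', 'A', 'B', 'G'])
```

Note the lazy-deletion idiom: rather than updating an entry's priority in place, we push a fresh entry and discard stale ones when they surface at the top of the heap.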
Terrain Exploration in 3D
To build intuition for how Dijkstra navigates weighted terrain, let us visualize it in three dimensions. The height of each cell represents its traversal cost—climbing is expensive.
Interactive: 3D Terrain Pathfinding
Drag to rotate the terrain. Watch Dijkstra find the lowest-cost path across the 3D landscape. Taller cells cost more to traverse—the algorithm avoids climbing when possible.
Notice how the algorithm prefers the valleys. It expands across flat terrain quickly but hesitates to climb hills. The final path often winds around obstacles rather than going over them, trading extra distance for reduced total cost.
Dijkstra always expands the closest unexplored node, not the most recent. This greedy choice—always picking the minimum—is what makes it find optimal paths.
Key Takeaways
- Dijkstra's algorithm finds minimum-cost paths in weighted graphs with non-negative edge weights
- It uses a priority queue ordered by distance from the start node
- The relaxation step updates distances whenever a shorter path is found
- When a node is extracted from the priority queue, its shortest distance is finalized
- Unlike BFS (which minimizes edge count), Dijkstra minimizes total path cost
- Time complexity is O(E log V) with a binary heap
- This algorithm forms the foundation for A*, which we explore in the next chapter