The Problem Every Pathfinding Algorithm Solves (With One Critical Difference)
You need to find the shortest path from node A to node B in a weighted graph. Dijkstra’s algorithm is the default choice — it’s fast, elegant, and taught in every CS course. But the moment your graph has a single negative-weight edge, Dijkstra fails spectacularly.
That’s where Bellman-Ford comes in. It handles negative weights, detects negative cycles, and works on any graph. The tradeoff? It’s slower. The question isn’t which algorithm is “better” — it’s which constraints you’re working under.
What Actually Breaks in Dijkstra With Negative Weights
Dijkstra operates on a greedy assumption: once you’ve found the shortest path to a node, you’re done with it. The algorithm marks nodes as “visited” and never reconsiders them. This works perfectly when all edge weights are non-negative because there’s no way to improve a path by adding more edges.
But introduce a negative edge, and this breaks.
Consider a graph where A connects to B with weight 5, A connects directly to C with weight 2, and B connects to C with weight -10. Dijkstra finalizes C first (distance 2, the smallest tentative distance) and marks it visited. When it later processes B and discovers the path A → B → C with total weight -5, it refuses to update C because C is already visited. The algorithm produces the wrong answer, silently.
Bellman-Ford doesn’t have this problem because it relaxes all edges repeatedly, allowing it to update distances even after a node has been processed.
Implementing Both: A Direct Comparison
Let’s build both algorithms on the same graph and see where they differ. I’m using a dictionary-based adjacency list because it’s closer to how you’d actually represent graphs in production (not an academic matrix).
```python
import heapq

def dijkstra(graph, start):
    """Dijkstra's algorithm - only works with non-negative weights"""
    # Priority queue: (distance, node)
    pq = [(0, start)]
    distances = {node: float('inf') for node in graph}
    distances[start] = 0
    visited = set()

    while pq:
        current_dist, u = heapq.heappop(pq)
        if u in visited:
            continue
        visited.add(u)
        # This is the greedy step — we assume current_dist is final
        for v, weight in graph[u]:
            if v in visited:
                continue
            new_dist = current_dist + weight
            if new_dist < distances[v]:
                distances[v] = new_dist
                heapq.heappush(pq, (new_dist, v))
    return distances
```
```python
def bellman_ford(graph, start):
    """Bellman-Ford algorithm - handles negative weights and detects negative cycles"""
    # Collect every node, including ones that appear only as edge targets
    # (a list keeps insertion order, so printed results are deterministic)
    nodes = list(graph.keys())
    for u in graph:
        for v, _ in graph[u]:
            if v not in nodes:
                nodes.append(v)

    distances = {node: float('inf') for node in nodes}
    distances[start] = 0

    # Relax all edges V-1 times
    for iteration in range(len(nodes) - 1):
        updated = False
        for u in graph:
            if distances[u] == float('inf'):
                continue
            for v, weight in graph[u]:
                if distances[u] + weight < distances[v]:
                    distances[v] = distances[u] + weight
                    updated = True
        # Early termination if no updates occurred
        if not updated:
            break

    # Check for negative cycles
    for u in graph:
        if distances[u] == float('inf'):
            continue
        for v, weight in graph[u]:
            if distances[u] + weight < distances[v]:
                raise ValueError(f"Negative cycle detected involving edge {u} -> {v}")
    return distances
```
```python
# Test graph with positive weights only
graph_positive = {
    'A': [('B', 4), ('C', 2)],
    'B': [('C', 1), ('D', 5)],
    'C': [('D', 8), ('E', 10)],
    'D': [('E', 2)],
    'E': []
}

print("Dijkstra (positive weights):", dijkstra(graph_positive, 'A'))
print("Bellman-Ford (positive weights):", bellman_ford(graph_positive, 'A'))
```

Output:

```
Dijkstra (positive weights): {'A': 0, 'B': 4, 'C': 2, 'D': 9, 'E': 11}
Bellman-Ford (positive weights): {'A': 0, 'B': 4, 'C': 2, 'D': 9, 'E': 11}
```
Both agree. Now add a negative edge:
```python
# Graph with negative edge B -> C: -1
graph_negative = {
    'A': [('B', 4), ('C', 2)],
    'B': [('C', -1), ('D', 5)],  # Negative edge here
    'C': [('D', 8), ('E', 10)],
    'D': [('E', 2)],
    'E': []
}

print("\nDijkstra (negative edge):", dijkstra(graph_negative, 'A'))
print("Bellman-Ford (negative edge):", bellman_ford(graph_negative, 'A'))
```

Output:

```
Dijkstra (negative edge): {'A': 0, 'B': 4, 'C': 2, 'D': 9, 'E': 11}
Bellman-Ford (negative edge): {'A': 0, 'B': 4, 'C': 2, 'D': 9, 'E': 11}
```
They match here because the negative edge doesn’t create a scenario where Dijkstra’s greedy choice fails. The key insight: Dijkstra’s failure with negative weights is not guaranteed to happen on every graph with negative edges — it depends on the graph structure and processing order. The theoretical unsoundness means you cannot trust Dijkstra’s output when negative weights exist, even if it sometimes produces correct results.
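One practical consequence of that untrustworthiness: if edges are easy to scan, you can check for negative weights in O(E) and dispatch to the safe algorithm up front. A minimal sketch (the helper names are mine, and the two algorithm functions are passed in as parameters so the snippet stands alone):

```python
def has_negative_edge(graph):
    """Scan every edge once (O(E)) to decide which algorithm is safe."""
    return any(weight < 0 for edges in graph.values() for _, weight in edges)

def shortest_paths(graph, start, dijkstra_fn, bellman_ford_fn):
    """Dispatch: Dijkstra when all weights are non-negative, else Bellman-Ford."""
    algo = bellman_ford_fn if has_negative_edge(graph) else dijkstra_fn
    return algo(graph, start)
```

In practice you would call it as `shortest_paths(graph, 'A', dijkstra, bellman_ford)` with the implementations above.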
Here’s a clearer demonstration of the concept:
```python
# Graph where a negative edge creates a shorter path to an already-visited node
graph_clear = {
    'S': [('A', 1), ('B', 5)],
    'A': [],
    'B': [('A', -20)],  # Total path S->B->A = -15, shorter than S->A = 1
}

print("\nClear negative weight case:")
print("Dijkstra:", dijkstra(graph_clear, 'S'))
print("Bellman-Ford:", bellman_ford(graph_clear, 'S'))
```

Output:

```
Clear negative weight case:
Dijkstra: {'S': 0, 'A': 1, 'B': 5}
Bellman-Ford: {'S': 0, 'A': -15, 'B': 5}
```

Here we see the divergence. Because A's tentative distance (1) is smaller than B's (5), Dijkstra pops A first, marks it visited, and never reconsiders it, so it reports distance 1 via the direct edge S→A. Bellman-Ford correctly found that S→B→A (-15) is shorter.
Path Reconstruction: Getting the Actual Route
Both algorithms only compute distances by default. To get the actual shortest path, track predecessors:
```python
def dijkstra_with_path(graph, start):
    """Dijkstra with path reconstruction"""
    pq = [(0, start)]
    distances = {node: float('inf') for node in graph}
    distances[start] = 0
    predecessors = {node: None for node in graph}
    visited = set()

    while pq:
        current_dist, u = heapq.heappop(pq)
        if u in visited:
            continue
        visited.add(u)
        for v, weight in graph[u]:
            if v in visited:
                continue
            new_dist = current_dist + weight
            if new_dist < distances[v]:
                distances[v] = new_dist
                predecessors[v] = u  # Track predecessor
                heapq.heappush(pq, (new_dist, v))
    return distances, predecessors

def reconstruct_path(predecessors, start, target):
    """Reconstruct path from start to target using predecessor map"""
    path = []
    current = target
    while current is not None:
        path.append(current)
        current = predecessors[current]
    path.reverse()
    return path if path[0] == start else None  # Return None if no path exists

# Example usage
dist, pred = dijkstra_with_path(graph_positive, 'A')
print("\nPath from A to E:", reconstruct_path(pred, 'A', 'E'))
print("Distance:", dist['E'])
```
Output:

```
Path from A to E: ['A', 'B', 'D', 'E']
Distance: 11
```
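Bellman-Ford can track predecessors the same way: record the parent whenever a relaxation succeeds. A sketch mirroring the functions above (same graph representation; the negative-cycle check is omitted for brevity):

```python
def bellman_ford_with_path(graph, start):
    """Bellman-Ford that also records each node's predecessor."""
    nodes = set(graph)
    for u in graph:
        for v, _ in graph[u]:
            nodes.add(v)
    distances = {node: float('inf') for node in nodes}
    distances[start] = 0
    predecessors = {node: None for node in nodes}

    for _ in range(len(nodes) - 1):
        updated = False
        for u in graph:
            if distances[u] == float('inf'):
                continue
            for v, weight in graph[u]:
                if distances[u] + weight < distances[v]:
                    distances[v] = distances[u] + weight
                    predecessors[v] = u  # Record parent on every successful relaxation
                    updated = True
        if not updated:
            break
    return distances, predecessors
```

The same `reconstruct_path` helper works unchanged on the returned predecessor map, even for paths that go through negative edges.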
The Mathematics: Why Bellman-Ford Needs V-1 Iterations
Bellman-Ford relaxes all edges V − 1 times, where V is the number of vertices. Why exactly V − 1?

The shortest path between any two nodes in a graph with V vertices can have at most V − 1 edges (if it has V or more edges, it must contain a cycle, and if the cycle has non-negative weight, we can remove it without making the path longer).

Each iteration i of Bellman-Ford guarantees that distances are correct for all shortest paths using at most i edges. After V − 1 iterations, every shortest path has been found. The recurrence relation is:

d_i(v) = min( d_{i−1}(v), min over edges (u, v) of [ d_{i−1}(u) + w(u, v) ] )

where d_i(v) is the shortest distance to v using at most i edges.

The final check (the V-th pass over the edges) detects negative cycles: if any distance can still be reduced, a negative cycle exists.
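You can watch the V − 1 bound in action by counting passes on a worst-case input: a simple chain whose edges happen to be stored in reverse order, so each pass propagates the correct distance exactly one hop further. A small instrumented sketch (the edge-list form and the counting wrapper are mine, for this demo only):

```python
def bellman_ford_count(edges, num_nodes, start):
    """Bellman-Ford over an edge list; returns (distances, productive passes)."""
    INF = float('inf')
    dist = [INF] * num_nodes
    dist[start] = 0
    for iteration in range(1, num_nodes):
        updated = False
        for u, v, w in edges:
            if dist[u] != INF and dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                updated = True
        if not updated:
            return dist, iteration - 1  # this pass changed nothing
    return dist, num_nodes - 1

# Chain 0 -> 1 -> 2 -> 3 -> 4, edges listed in reverse order:
# each pass extends the frontier of correct distances by one edge,
# so all V - 1 = 4 passes are needed.
chain_reversed = [(3, 4, 1), (2, 3, 1), (1, 2, 1), (0, 1, 1)]
dist, passes = bellman_ford_count(chain_reversed, 5, 0)
```

With the same edges listed in forward order, a single pass settles every distance and the early-termination check fires on pass two, which is why the early exit on `updated` is worth keeping.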
Complexity: Where Dijkstra Dominates
Dijkstra with binary heap: O((V + E) log V)
Dijkstra with Fibonacci heap: O(E + V log V) (theoretical best, rarely used in practice)
Bellman-Ford: O(V · E)
For dense graphs where E ≈ V², Bellman-Ford becomes O(V³), which is brutal. Dijkstra is roughly O(V² log V) with a binary heap, much faster.
For sparse graphs (common in real applications like road networks, where E is close to V), Dijkstra still dominates: O(V log V) versus Bellman-Ford's O(V²).
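To get a feel for the gap without benchmarking, a back-of-envelope operation count from the asymptotic bounds is enough (constants ignored; the function name and the example sizes are arbitrary choices of mine):

```python
import math

def op_estimates(V, E):
    """Rough operation counts implied by the asymptotic bounds (constants ignored)."""
    return {
        'dijkstra_binary_heap': (V + E) * math.log2(V),  # O((V + E) log V)
        'bellman_ford': V * E,                           # O(V * E)
    }

# A modest sparse graph: 10,000 nodes, 40,000 edges
est = op_estimates(10_000, 40_000)
# Dijkstra: ~6.6e5 heap operations; Bellman-Ford: 4e8 relaxations
```

The hundreds-fold gap on even this small sparse graph is why Bellman-Ford is reserved for graphs that actually need it.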
Negative Cycle Detection: Bellman-Ford’s Unique Capability
Bellman-Ford isn’t just a slower Dijkstra — it solves a problem Dijkstra can’t: detecting negative cycles. This matters in arbitrage detection (currency exchange), economic modeling, and game theory.
```python
# Graph with negative cycle
graph_cycle = {
    'A': [('B', 1)],
    'B': [('C', -3)],
    'C': [('A', 1)],  # Cycle A -> B -> C -> A has weight 1 + (-3) + 1 = -1
}

try:
    print("\nTesting negative cycle detection:")
    print(bellman_ford(graph_cycle, 'A'))
except ValueError as e:
    print(f"Caught: {e}")
```

Output:

```
Testing negative cycle detection:
Caught: Negative cycle detected involving edge A -> B
```

(Which edge gets reported depends on iteration order; any edge that is still relaxable after V − 1 passes triggers the check.)
No other single-source shortest path algorithm gives you this for free.
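The arbitrage connection is worth spelling out: take each exchange rate r and use −log(r) as the edge weight. A cycle whose weights sum to a negative number is then exactly a cycle whose rates multiply to more than 1, i.e. free money. A self-contained sketch (the rates below are made up for illustration):

```python
import math

# Hypothetical exchange rates: rates[a][b] = units of b per unit of a
rates = {
    'USD': {'EUR': 0.9, 'GBP': 0.8},
    'EUR': {'GBP': 0.95, 'USD': 1.15},
    'GBP': {'USD': 1.30, 'EUR': 1.06},
}

# Edge weight -log(rate): log turns the product of rates around a cycle
# into a sum, and product > 1 becomes sum of weights < 0.
edges = [(a, b, -math.log(r)) for a, nbrs in rates.items() for b, r in nbrs.items()]
nodes = list(rates)

def has_arbitrage(nodes, edges):
    """Bellman-Ford negative-cycle check over an edge list."""
    dist = {n: 0.0 for n in nodes}  # start every node at 0 to find cycles anywhere
    for _ in range(len(nodes) - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # If any edge is still relaxable, a negative cycle (arbitrage loop) exists.
    return any(dist[u] + w < dist[v] - 1e-12 for u, v, w in edges)
```

Here USD → EUR → USD multiplies out to 0.9 × 1.15 = 1.035 > 1, so `has_arbitrage(nodes, edges)` reports True. The 1e-12 tolerance guards against floating-point noise in the log-weights.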
When I’d Actually Use Each
Use Dijkstra when:
– All edge weights are non-negative (GPS routing, network latency, cost optimization)
– You need speed and have a large graph
– You’re in a coding interview and the problem says “weighted graph” without mentioning negatives
Use Bellman-Ford when:
– Negative edge weights exist (rare, but real: currency arbitrage, some game theory models)
– You need to detect negative cycles explicitly
– Graph is small enough that O(V · E) runtime is acceptable
– You’re implementing distance-vector routing protocols (like RIP, which is based on Bellman-Ford)
In 95% of real-world shortest-path problems, Dijkstra is the right choice. But that 5% — currency trading, constraint satisfaction, certain economic simulations — genuinely needs Bellman-Ford.
The Gotcha I Wish Someone Had Told Me
Bellman-Ford’s iteration count assumes you know all vertices upfront. If you’re building the graph on the fly (streaming edges, dynamic graph), you can’t use the standard implementation. You’d need a modified version or switch to a different algorithm entirely (like SPFA, which is Bellman-Ford with a queue-based optimization).
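For reference, SPFA is a small change to Bellman-Ford: keep a queue of nodes whose distance just improved and only re-relax their outgoing edges. A sketch under the same graph representation as above (the dequeue-count cycle guard is one common variant, not the only one):

```python
from collections import deque

def spfa(graph, start):
    """Bellman-Ford with a queue: only re-relax edges out of updated nodes."""
    nodes = set(graph)
    for u in graph:
        for v, _ in graph[u]:
            nodes.add(v)
    dist = {n: float('inf') for n in nodes}
    dist[start] = 0
    in_queue = {n: False for n in nodes}
    count = {n: 0 for n in nodes}  # dequeues per node, for cycle detection
    q = deque([start])
    in_queue[start] = True

    while q:
        u = q.popleft()
        in_queue[u] = False
        count[u] += 1
        if count[u] > len(nodes):  # relaxed too often: must be a negative cycle
            raise ValueError("Negative cycle detected")
        for v, w in graph.get(u, []):
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                if not in_queue[v]:
                    q.append(v)
                    in_queue[v] = True
    return dist
```

On graphs where few nodes get re-queued this does far less work than blindly relaxing every edge V − 1 times, but adversarial inputs still push it to O(V · E).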
Also, if you’re implementing this in an interview, remember:
– Check for disconnected nodes: If distances[node] == float('inf'), there’s no path from the source
– 0-indexed vs 1-indexed: Graph problems often use 1-indexed nodes; adjust accordingly
– Dense graphs: Consider using an adjacency matrix if E ≈ V²
– Memory: Bellman-Ford needs O(V) space for distances; Dijkstra needs O(V) for distances plus up to O(E) for the heap
What I’m Still Unsure About
I’ve never benchmarked Bellman-Ford on graphs with 100K+ nodes and negative weights. My intuition says it would be unusable, but I haven’t profiled it. The complexity is polynomial, so it’s theoretically tractable — but in practice, modern routing and pathfinding systems avoid negative weights entirely, making Bellman-Ford a rare choice at scale.
There’s also SPFA (Shortest Path Faster Algorithm), which is Bellman-Ford with a queue-based optimization. On many inputs it is reported to run in roughly O(E) time, but the worst case is still O(V · E). I haven’t tested whether it actually beats Dijkstra on graphs where both are applicable.
For now, I stick with Dijkstra unless the problem explicitly forces me to handle negatives. When it does, I reach for Bellman-Ford and accept the performance hit.