# Directed Acyclic Graphs: Depth-First Search

Explore the different techniques used to implement the depth-first search algorithm efficiently.


## Shortest paths in acyclic graphs

Shortest paths are also easy to compute in directed acyclic graphs, even when the edges are weighted, and in particular, even when some edges have negative weight. (We don’t have to worry about negative cycles because, by definition, dags don’t have any cycles!) Indeed, this is a completely standard dynamic programming algorithm.

Let $G$ be a directed graph with weighted edges, and let $s$ be the fixed start vertex. For any vertex $v$, let $dist(v)$ denote the length of the shortest path in $G$ from $s$ to $v$. This function satisfies the following simple recurrence:

$dist(v) = \begin{cases} 0 & \text{if } v = s \\ \underset{u\rightarrow v}{\min}\bigl(dist(u) + w(u\rightarrow v)\bigr) & \text{otherwise} \end{cases}$

In fact, this identity holds for all directed graphs, but it is only a recurrence for directed acyclic graphs. If the input graph $G$ contained a cycle, a recursive evaluation of this function would fall into an infinite loop; however, because $G$ is a dag, each recursive call visits an earlier vertex in topological order.

The dependency graph for this recurrence is the reversal of the input graph $G$: subproblem $dist(v)$ depends on $dist(u)$ if and only if $u\rightarrow v$ is an edge in $G$. Thus, we can compute the distance to every vertex in $O(V + E)$ time by performing a depth-first search in the reversal of $G$ and considering vertices in postorder. Equivalently, we can consider the vertices of the original graph $G$ in topological order.
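One way to sketch this computation in Python (the graph representation and function names here are illustrative, not fixed by the text): a depth-first search produces a reverse postorder, which is a topological order of the dag, and a single pass over the vertices in that order evaluates the recurrence.

```python
from math import inf

def topological_order(graph):
    """Topological order of a dag = reverse of DFS postorder."""
    seen, postorder = set(), []
    def visit(u):
        seen.add(u)
        for v in graph[u]:
            if v not in seen:
                visit(v)
        postorder.append(u)          # u finishes after all its successors
    for u in graph:
        if u not in seen:
            visit(u)
    return postorder[::-1]

def dag_shortest_paths(graph, weight, s):
    """Evaluate dist(v) for every vertex, scanning vertices in topological order.

    graph maps each vertex to its list of successors; weight maps each
    edge (u, v) to its (possibly negative) weight.
    """
    dist = {v: inf for v in graph}
    dist[s] = 0                                       # dist(v) = 0 if v = s
    for u in topological_order(graph):
        for v in graph[u]:                            # every edge u -> v
            dist[v] = min(dist[v], dist[u] + weight[u, v])
    return dist
```

For example, on the dag with edges $s\rightarrow a$ (weight 1), $s\rightarrow b$ (4), $a\rightarrow b$ ($-2$), $a\rightarrow t$ (6), $b\rightarrow t$ (3), this returns $dist(b) = -1$ and $dist(t) = 2$: the negative edge causes no trouble, because every edge is processed only after its tail's distance is final.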

## Bellman-Ford algorithm with predecessor pointers

The resulting dynamic-programming algorithm is another example of Ford’s generic relaxation algorithm! To make this connection clearer, we can move the initialization of $dist(v)$ outside the main loop and add the computation of predecessor pointers, as shown below.
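A sketch of this relaxation form in Python (assuming a topological order of the vertices has already been computed, say by the depth-first search described earlier; the names are illustrative): the initialization sits outside the main loop, and each tense edge is relaxed, recording where each vertex's best-known distance came from.

```python
from math import inf

def dag_sssp(graph, weight, order, s):
    """Ford-style relaxation on a dag, with predecessor pointers.

    `order` is a topological order of the vertices of the dag.
    Because vertices are scanned in topological order, every edge is
    relaxed exactly once, after its tail's distance is already final.
    """
    # Initialization, moved outside the main loop:
    dist = {v: inf for v in order}
    pred = {v: None for v in order}
    dist[s] = 0
    # Main loop: relax every tense edge.
    for u in order:
        for v in graph[u]:
            if dist[v] > dist[u] + weight[u, v]:      # u -> v is tense
                dist[v] = dist[u] + weight[u, v]      # relax u -> v
                pred[v] = u
    return dist, pred

def path_to(pred, v):
    """Recover a shortest path to v by walking the predecessor pointers."""
    path = []
    while v is not None:
        path.append(v)
        v = pred[v]
    return path[::-1]
```

The predecessor pointers encode a shortest-path tree rooted at $s$: following $pred$ back from any vertex and reversing the result yields an actual shortest path, not just its length.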