# Solution: Network Delay Time

Let's solve the Network Delay Time problem using the Graph Algo pattern.

## Statement

A network of `n` nodes labeled $1$ to $n$ is provided, along with a list of travel times for directed edges represented as $times[i] = (x_i, \space y_i, \space t_i)$, where $x_i$ is the source node, $y_i$ is the target node, and $t_i$ is the delay time from the source node to the target node.

Considering we have a starting node, `k`, we have to determine the minimum time required for all the remaining $n - 1$ nodes to receive the signal. Return $-1$ if it's not possible for all $n - 1$ nodes to receive the signal.

**Constraints:**

- $1 \leq$ `k` $\leq$ `n` $\leq 100$
- $1 \leq$ `times.length` $\leq 6000$
- `times[i].length` $== 3$
- $1 \leq x, y \leq$ `n`
- $x \space != y$
- $0 \leq t \leq 100$
- Unique pairs of $(x, y)$, which means that there should be no multiple edges

## Solution

So far, you have probably brainstormed some approaches and have an idea of how to solve this problem. Let's explore some of these approaches and figure out which one to follow based on considerations such as time complexity and any implementation constraints.

### Naive approach

The naive approach is to use a simple brute force method. The algorithm would start by initializing all distances (time to travel from source to target node) to infinity, representing disconnection between the two nodes. Then, each node would use a nested loop to go through all other nodes and update their distances using the given travel times. If there is no connection between a source and a target node, its distance will not be updated. After updating all distances, the algorithm would find the maximum distance among all nodes, representing the minimum time it takes for all nodes to receive the signal. If a node that cannot be reached from node `k` exists, its distance remains infinity, and the algorithm returns $-1$.

This approach has a time complexity of $O(n^2)$, where $n$ is the number of nodes in the graph.
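The naive approach described above can be sketched as follows; the function and variable names here are our own, and the repeated edge relaxation is one straightforward way to realize the nested-loop distance updates:

```python
from math import inf

def network_delay_brute_force(times, n, k):
    # Initialize all distances from k to infinity (disconnected),
    # except the starting node itself.
    distances = {node: inf for node in range(1, n + 1)}
    distances[k] = 0

    # Repeatedly relax every edge; after at most n - 1 passes,
    # every reachable node holds its shortest delay from k.
    for _ in range(n - 1):
        for x, y, t in times:
            if distances[x] + t < distances[y]:
                distances[y] = distances[x] + t

    # The answer is the maximum distance among all nodes; an
    # infinite distance means some node is unreachable from k.
    max_delay = max(distances.values())
    return -1 if max_delay == inf else max_delay
```

For example, with `times = [(2, 1, 1), (2, 3, 1), (3, 4, 1)]`, `n = 4`, and `k = 2`, the signal reaches every node within `2` units of time.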

### Optimized approach using Dijkstra's algorithm

Dijkstra's algorithm is widely used for finding the shortest path between nodes in a graph. This makes it ideal for finding the minimum delay time in a network.

We will use an adjacency dictionary. The source node will be used as the key, and the value is a list of tuples that hold the destination node and the time for the signal to travel from the source to that destination node. A priority queue is initialized with the starting node `k` and a delay time of $0$ as a tuple. The priority queue ensures that the node with the minimum time is retrieved in each iteration. We will iterate over the priority queue to traverse the nodes in the network. If the node is not visited, the time of the retrieved node is compared to the current delay time and updated accordingly. The neighbors of the retrieved node are found using the adjacency dictionary and are added to the queue with their times updated by adding the delay time from the retrieved node.

Finally, if all the network nodes have been visited, we will return the computed time. Otherwise, $-1$ will be returned.
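The steps above can be sketched in Python using the standard-library `heapq` module as the priority queue; the function name is our own:

```python
import heapq
from collections import defaultdict

def network_delay_time(times, n, k):
    # Build the adjacency dictionary: source -> list of (destination, time).
    adjacency = defaultdict(list)
    for x, y, t in times:
        adjacency[x].append((y, t))

    # Priority queue of (accumulated delay, node), seeded with the
    # starting node k at time 0.
    pq = [(0, k)]
    visited = set()
    delay = 0

    while pq:
        time, node = heapq.heappop(pq)
        if node in visited:
            continue
        visited.add(node)
        # The popped time is the shortest delay to this node, so the
        # overall answer is the largest such value seen so far.
        delay = max(delay, time)
        # Push each unvisited neighbor with its updated total delay.
        for neighbor, travel_time in adjacency[node]:
            if neighbor not in visited:
                heapq.heappush(pq, (time + travel_time, neighbor))

    # If every node in the network was visited, return the computed
    # time; otherwise, some node is unreachable, so return -1.
    return delay if len(visited) == n else -1
```

Because each edge can be pushed onto the heap once, this runs in $O(E \log E)$ time for $E$ edges, which is a clear improvement over the naive approach on sparse networks.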

