Jul-02-2020, 04:06 PM
Hello,
I have a graph G with n nodes and m edges. I generated a random float between 0 and 1 as the delay for each edge; then, after finding the shortest path between two nodes, the total delay is the sum of the delays along that path.
The question: does that make sense? Or am I wrong, and should I look for another way to compute the delay?
Thank you !
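Edit: here is a minimal sketch of what I mean, in plain Python. The node names and edge list are just a made-up example; the edges get uniform random delays in (0, 1), and Dijkstra's algorithm (using the delays as weights) finds the path whose summed delay is smallest.

```python
import heapq
import random

def dijkstra(graph, source, target):
    """Return (path, total_delay) for the minimum-delay path.
    graph: dict mapping node -> list of (neighbor, delay) pairs."""
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    done = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in done:
            continue
        done.add(u)
        if u == target:
            break
        for v, w in graph.get(u, []):
            nd = d + w
            if v not in dist or nd < dist[v]:
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    if target not in dist:
        return None, float("inf")
    # Walk the predecessor map back from target to recover the path.
    path = [target]
    while path[-1] != source:
        path.append(prev[path[-1]])
    path.reverse()
    return path, dist[target]

# Example graph: each edge gets a random delay in (0, 1).
random.seed(42)
edges = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "D"), ("B", "D")]
graph = {}
for u, v in edges:
    graph.setdefault(u, []).append((v, random.random()))

path, total_delay = dijkstra(graph, "A", "D")
```

So the "total delay" I compute is exactly `total_delay` here: the sum of the per-edge delays along the returned path.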