Python Forum

Full Version: Measure delay time in a network (graph)
Hello,

I have a graph G with n nodes and m edges. I generated a random float between 0 and 1 as the delay time for each edge; then, after finding the shortest path between two nodes, the total delay is simply the sum of the delays along that path.
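For concreteness, here is a minimal sketch of that setup using networkx (the graph size, the endpoints 0 and 5, and the use of a complete graph are illustrative assumptions, chosen so a path is guaranteed to exist):

```python
import random
import networkx as nx

random.seed(0)

# Hypothetical example graph: complete so any two nodes are connected
G = nx.complete_graph(6)

# Assign each edge a random delay in [0, 1]
for u, v in G.edges():
    G[u][v]["delay"] = random.random()

# Delays are non-negative, so Dijkstra applies
path = nx.dijkstra_path(G, 0, 5, weight="delay")
total_delay = nx.dijkstra_path_length(G, 0, 5, weight="delay")

# The path length reported by Dijkstra is exactly the sum of the
# edge delays along the path
assert abs(total_delay - sum(G[u][v]["delay"]
                             for u, v in zip(path, path[1:]))) < 1e-12
```

Note that with `weight="delay"`, Dijkstra finds the path minimizing total delay, which may differ from the path with the fewest hops.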

The question: does that make sense? Or am I wrong, and should I look for another way to compute the delay time?

Thank you !