Time Complexity for Computing the Average of a List
If computing the average of 10 numbers takes 0.31 seconds, how long would it take to compute the average of 100 numbers? What would it take for 1,000,000 numbers?
Computing an average requires visiting every element once to form the sum, so the running time grows linearly with the length of the list (O(n)).
For 100 numbers (10 times as many), it would take 0.31 s × 10 = 3.1 seconds.
For 1,000,000 numbers (100,000 times as many), it would take 0.31 s × 100,000 = 31,000 seconds, roughly 8.6 hours.
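To check the linear scaling empirically, here is a minimal Python sketch that times the average computation for lists of a few different sizes. The absolute times depend on the machine (the 0.31-second figure above is just the example's assumption), but the measured time should grow roughly in proportion to the list length.

```python
import random
import time

def average(numbers):
    # Summing n numbers takes O(n) time, so the average is O(n) overall.
    total = 0.0
    for x in numbers:
        total += x
    return total / len(numbers)

# Time the same O(n) computation for increasing list sizes.
for n in (10, 100, 1_000_000):
    data = [random.random() for _ in range(n)]
    start = time.perf_counter()
    avg = average(data)
    elapsed = time.perf_counter() - start
    print(f"n = {n:>9,}: average = {avg:.4f}, time = {elapsed:.6f} s")
```

On any given machine the printed times will be far smaller than the figures in the exercise, but the ratio between successive runs should reflect the same linear growth used in the answers above.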