NOTE: This topic has strong mathematical themes, but don't run away just yet: it is a fundamental topic for algorithms and computer science as a whole.


Now we look at ways of estimating the running time of a program, and of comparing the running times of two programs, without ever implementing them. It is vital to analyse the resource use of an algorithm well before it is implemented and deployed. Space is also important, but we focus on time in this course.

How do you estimate the running time of an algorithm?

An algorithm consists of some operations executed a number of times. Hence, an estimate of the running time (time efficiency) of an algorithm can be obtained by determining what these operations are, how long each takes to execute, and how many times they are executed. These operations are called basic operations, and the number of times they run is expressed in terms of the input size of the problem.

What is a basic operation?

Put simply, these are the operations that contribute the most towards the total running time. This is often the arithmetic component of an algorithm, for example a compare, add, multiply, divide or assignment. It is typically the operation executed most frequently, although this also depends on the cost of each operation. Consider the following...

What is the input size? 

This is a characteristic of the size of the problem: if we are searching through an array of size n, the input size is n. We want to estimate the running time of an algorithm in terms of the problem size, not in absolute terms. When analysing algorithms, we state the number of times the basic operation is executed as a function of the input size.

````
// INPUT  : a, n
// OUTPUT : s = a^n
set s = 1
for i = 1 to n do
    s = s * a
end for
return s
````

What is the basic operation? Multiply.
What is the input size? n.
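The pseudocode above can be sketched in Python, with an added counter to make the basic-operation count explicit (the function name and counter are illustrative, not part of the original):

```python
def power(a, n):
    """Compute a**n by repeated multiplication, mirroring the pseudocode above.

    Also counts how many times the basic operation (multiply) executes.
    """
    s = 1
    count = 0
    for _ in range(n):
        s = s * a       # basic operation: one multiplication per iteration
        count += 1
    return s, count

# The multiply runs exactly n times, so C(n) = n:
# power(2, 10) returns (1024, 10)
```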

The running time is approximately the time to execute a basic operation × the number of times the basic operation is executed:

t(n) ≈ cop × C(n)
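As a purely hypothetical illustration of this formula (the 10 ns figure is an assumption, not a measured value), suppose each multiplication in the power loop costs about 10 nanoseconds:

```python
c_op = 10e-9            # ASSUMED time per basic operation, in seconds (10 ns)

def C(n):
    # Basic-operation count for the power loop: one multiply per iteration.
    return n

n = 1_000_000
t = c_op * C(n)         # t(n) ≈ c_op × C(n)
# t works out to 0.01 s: about a hundredth of a second for a million multiplies
```

The point is not the exact number but the shape: doubling n doubles C(n), and hence doubles the estimated running time.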

* t(n) is the running time.
* n is the input size.
* cop is the execution time of a basic operation.
* C(n) is the number of times the basic operation is executed.

You can imagine that the runtime of an algorithm can vary greatly depending on the input it receives. For this reason, we want to consider a number of runtime scenarios...

**Worst Case** - Given an input of n items, what is the maximum running time over all possible inputs?

**Best Case** - Given an input of n items, what is the minimum running time over all possible inputs?

**Average Case** - Given an input of n items, what is the average running time across all possible inputs?

# Asymptotic Complexity

We now have a way to analyse the running time (aka time complexity) of an algorithm, but every algorithm has its own time complexity. How do we compare them in a meaningful way? The solution is to group them into equivalence classes (for easier comparison and understanding) with respect to the input size.
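Best and worst cases are easiest to see in linear search, where the basic operation is the comparison. This is a minimal sketch (function name and sample data are illustrative):

```python
def linear_search(items, target):
    """Return (index, comparisons) for target in items, or (-1, comparisons).

    The comparison x == target is the basic operation being counted.
    """
    comparisons = 0
    for i, x in enumerate(items):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

data = [3, 7, 1, 9, 5]
# Best case: the target is the first element, so only 1 comparison is made.
# Worst case: the target is absent, so all n = 5 comparisons are made.
```

So for the same input size n, C(n) ranges from 1 in the best case to n in the worst case, which is exactly why the three scenarios are analysed separately.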