# How to Analyse Loops for Complexity Analysis of Algorithms

We have discussed Asymptotic Analysis, Worst, Average and Best Cases, and Asymptotic Notations in previous posts. This post discusses the analysis of iterative programs with simple examples.

## Constant Time Complexity O(1):

The time complexity of a function (or set of statements) is considered O(1) if it doesn't contain a loop, recursion, or a call to any other non-constant-time function, i.e., it is a set of non-recursive, non-loop statements.

**Example:**

- The swap() function has O(1) time complexity (a sketch is shown after the loop example below).
- A loop or recursion that runs a constant number of times is also considered O(1). For example, the following loop is O(1).

## C

```c
// Here c is a constant
for (int i = 1; i <= c; i++) {
    // some O(1) expressions
}
```
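
For instance, a swap() of the kind mentioned above does a fixed amount of work regardless of the input size. A minimal sketch (the signature below is illustrative, not taken from the original post):

```c
// Swaps the values pointed to by a and b.
// Three assignments, no loops or recursion, so the work is constant: O(1).
void swap(int* a, int* b)
{
    int temp = *a;
    *a = *b;
    *b = temp;
}
```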

## Linear Time Complexity O(n):

The time complexity of a loop is considered O(n) if the loop variable is incremented/decremented by a constant amount. For example, the following functions have O(n) time complexity (each loop runs about n/c times, and n/c is O(n) for a constant c).

## C

```c
// Here c is a positive integer constant
for (int i = 1; i <= n; i += c) {
    // some O(1) expressions
}

for (int i = n; i > 0; i -= c) {
    // some O(1) expressions
}
```
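
As a concrete illustration (a hypothetical helper, not part of the original post), summing an array touches each of its n elements exactly once and therefore runs in O(n):

```c
// Returns the sum of the n elements of arr.
// The single loop runs n times and does O(1) work per iteration, so it is O(n).
int arraySum(int arr[], int n)
{
    int sum = 0;
    for (int i = 0; i < n; i++)
        sum += arr[i];
    return sum;
}
```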

## Quadratic Time Complexity O(n^2):

An algorithm has quadratic time complexity when its running time is directly proportional to the square of the input size. For nested loops, the time complexity is equal to the number of times the innermost statement is executed; two nested loops that each run O(n) times give O(n^2), and in general c nested loops give O(n^c). For example, the following sample loops have O(n^2) time complexity.

## C

```c
for (int i = 1; i <= n; i += c) {
    for (int j = 1; j <= n; j += c) {
        // some O(1) expressions
    }
}

for (int i = n; i > 0; i -= c) {
    for (int j = i + 1; j <= n; j += c) {
        // some O(1) expressions
    }
}
```

**Example:** Selection Sort and Insertion Sort have O(n^2) time complexity (for Insertion Sort, this is the worst case).
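
A minimal sketch of Selection Sort shows why: the inner loop runs about n-1, n-2, ..., 1 times, which sums to roughly n^2/2 comparisons, i.e. O(n^2) (the code below is illustrative, not taken from the original post):

```c
// Sorts arr[0..n-1] in ascending order by repeatedly selecting the minimum.
// The nested loops perform (n-1) + (n-2) + ... + 1, about n^2/2, comparisons: O(n^2).
void selectionSort(int arr[], int n)
{
    for (int i = 0; i < n - 1; i++) {
        int minIndex = i;
        for (int j = i + 1; j < n; j++) {
            if (arr[j] < arr[minIndex])
                minIndex = j;
        }
        // place the smallest remaining element at position i
        int temp = arr[i];
        arr[i] = arr[minIndex];
        arr[minIndex] = temp;
    }
}
```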

## Logarithmic Time Complexity O(Log n):

The time complexity of a loop is considered O(Log n) if the loop variable is divided/multiplied by a constant amount in each iteration. Similarly, a recursive function whose argument is divided by a constant factor in each call has O(Log n) time complexity, as in the second example below.

## C

```c
for (int i = 1; i <= n; i *= c) {
    // some O(1) expressions
}

for (int i = n; i > 0; i /= c) {
    // some O(1) expressions
}
```

## C

```c
// Recursive function: n is divided by a constant c > 1 in each call,
// so the depth of the recursion is O(Log n)
void recurse(int n)
{
    if (n <= 0)
        return;
    else {
        // some O(1) expressions
    }
    recurse(n / c);
}
```

**Example:** Binary Search (refer to the iterative implementation) has O(Log n) time complexity.
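
A minimal sketch of such an iterative Binary Search (the exact code is illustrative, not taken from the original post): the search range is halved in every iteration, so the loop runs at most about Log2 n times.

```c
// Returns the index of key in the sorted array arr[0..n-1], or -1 if absent.
// Each iteration halves the range [low, high], so the loop runs O(Log n) times.
int binarySearch(int arr[], int n, int key)
{
    int low = 0, high = n - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2;  // avoids overflow of (low + high)
        if (arr[mid] == key)
            return mid;
        else if (arr[mid] < key)
            low = mid + 1;
        else
            high = mid - 1;
    }
    return -1;
}
```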

## Double Logarithmic Time Complexity O(Log Log n):

The time complexity of a loop is considered O(Log Log n) if the loop variable is increased/decreased exponentially by a constant amount in each iteration (for example, raised to a constant power or reduced by a constant root).

## C

```c
// Here c is a constant greater than 1
for (int i = 2; i <= n; i = pow(i, c)) {
    // some O(1) expressions
}

// Here fun is sqrt or cuberoot or any other constant root
for (int i = n; i > 1; i = fun(i)) {
    // some O(1) expressions
}
```

To see why, note that after k iterations of the first loop the variable has the value i = 2^(c^k); the loop stops once this exceeds n, i.e. after roughly log_c(log_2 n) iterations, which is O(Log Log n). The second loop shrinks the variable from n towards 1 by taking a constant root each time and can be analysed the same way.

**How to combine the time complexities of consecutive loops?**

When there are consecutive loops, we calculate time complexity as a sum of the time complexities of individual loops.

## C

```c
for (int i = 1; i <= m; i += c) {
    // some O(1) expressions
}

for (int i = 1; i <= n; i += c) {
    // some O(1) expressions
}
```

The time complexity of the above code is O(m) + O(n), which is O(m + n). If m == n, the time complexity becomes O(2n), which is O(n).


**How to calculate time complexity when there are many if-else statements inside loops?**

As discussed in the earlier post on worst, average and best cases, the worst-case time complexity is the most useful of the three. Therefore we consider the worst case, i.e. the situation in which the values in the if-else conditions cause the maximum number of statements to be executed.

For example, consider the linear search function: the worst case occurs when the element being searched for is present at the end or not present at all.
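
A sketch of such a linear search (illustrative, not taken from the original post): in the best case the key is found at index 0 after one comparison, but in the worst case (key at the end or absent) the loop body runs all n times, so the worst-case complexity is O(n).

```c
// Returns the index of key in arr[0..n-1], or -1 if it is not present.
// Worst case: key is last or absent, so the loop body runs n times -> O(n).
int linearSearch(int arr[], int n, int key)
{
    for (int i = 0; i < n; i++) {
        if (arr[i] == key)  // this branch succeeds at most once
            return i;
    }
    return -1;
}
```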

When the code is too complex to consider all if-else cases, we can get an upper bound by ignoring if-else and other complex control statements.

**How to calculate the time complexity of recursive functions?**

The time complexity of a recursive function can be written as a mathematical recurrence relation. To calculate the time complexity, we must know how to solve recurrences. We will soon be discussing recurrence-solving techniques in a separate post.
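
As a simple illustration (a hypothetical helper, not from the original post), the function below halves its argument in each call and does constant work per call, so its running time satisfies the recurrence T(n) = T(n/2) + O(1), which solves to O(Log n).

```c
// Counts how many times n can be halved before reaching 1.
// Recurrence: T(n) = T(n/2) + O(1)  =>  T(n) = O(Log n)
int countHalvings(int n)
{
    if (n <= 1)
        return 0;                         // O(1) base case
    return 1 + countHalvings(n / 2);      // one O(1) step plus a call on half the input
}
```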

### Algorithms Cheat Sheet:

| Algorithm | Best Case | Average Case | Worst Case |
|---|---|---|---|
| Selection Sort | O(n^2) | O(n^2) | O(n^2) |
| Bubble Sort | O(n) | O(n^2) | O(n^2) |
| Insertion Sort | O(n) | O(n^2) | O(n^2) |
| Tree Sort | O(n log n) | O(n log n) | O(n^2) |
| Radix Sort | O(dn) | O(dn) | O(dn) |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) |
| Quick Sort | O(n log n) | O(n log n) | O(n^2) |
| Bucket Sort | O(n+k) | O(n+k) | O(n^2) |
| Counting Sort | O(n+k) | O(n+k) | O(n+k) |


For more details, please refer: Design and Analysis of Algorithms.

Please write comments if you find anything incorrect, or you want to share more information about the topic discussed above.