# Analysis of Algorithms | Set 2 (Worst, Average and Best Cases)

• Last Updated : 22 Aug, 2022

In the previous post, we discussed how asymptotic analysis overcomes the problems of the naive way of analyzing algorithms. In this post, we take a closer look at asymptotic notation and learn what the worst, average, and best cases of an algorithm are:

## Popular Notations in Complexity Analysis of Algorithms

### 1. Big-O Notation

We define an algorithm’s worst-case time complexity using Big-O notation. O(expression) is the set of functions that grow slower than or at the same rate as the expression; it gives an upper bound, i.e., the maximum amount of time an algorithm can require over all inputs of a given size.
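As a minimal sketch of what "f(n) is in O(g(n))" means, the check below verifies numerically that f(n) = 3n + 2 is in O(n): there exist constants c and n0 with f(n) <= c·n for all n >= n0. The constants c = 5 and n0 = 1 here are our own illustrative choices, not part of the article.

```python
# Sketch: f(n) = 3n + 2 is in O(n) because a constant c exists
# (here c = 5) such that 3n + 2 <= c * n for every n >= n0 = 1.
def f(n):
    return 3 * n + 2

c, n0 = 5, 1
for n in range(n0, 1000):
    assert f(n) <= c * n  # the upper bound holds for every tested n

print("3n + 2 is bounded above by 5n for all n >= 1")
```

Any larger c (or larger n0) would work just as well; Big-O only requires that *some* such constants exist.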

### 2. Omega Notation

Omega notation defines the best case of an algorithm’s time complexity. Ω(expression) is the set of functions that grow faster than or at the same rate as the expression; it gives a lower bound, i.e., the minimum amount of time an algorithm requires over all inputs of a given size.

### 3. Theta Notation

Theta notation bounds a running time from both above and below: a function lies in Θ(expression) when it lies in both O(expression) and Ω(expression). This tight bound is what we typically use to state the average-case time complexity of an algorithm.

## Measurement of Complexity of an Algorithm

Based on the above three notations of Time Complexity there are three cases to analyze an algorithm:

### 1. Worst Case Analysis (Mostly used)

In worst-case analysis, we calculate the upper bound on the running time of an algorithm: we must find the input that causes the maximum number of operations to be executed. For linear search, the worst case happens when the element to be searched (x) is not present in the array; search() then compares x with every element of arr[] one by one. Therefore, the worst-case time complexity of linear search is O(n).

### 2. Best Case Analysis (Very Rarely used)

In best-case analysis, we calculate the lower bound on the running time of an algorithm: we must find the input that causes the minimum number of operations to be executed. In the linear search problem, the best case occurs when x is present at the first location. The number of operations in the best case is constant (not dependent on n), so the best-case time complexity is Ω(1).

### 3. Average Case Analysis (Rarely used)

In average-case analysis, we take all possible inputs, calculate the computing time for each, sum the values, and divide by the total number of inputs. This requires knowing (or predicting) the distribution of inputs. For the linear search problem, assume all cases are uniformly distributed: x at each of the n positions, plus the case of x not being present, giving n+1 equally likely cases. We sum the costs of all cases and divide by (n+1). Following is the value of the average-case time complexity.

Average Case Time = \sum_{i=1}^{n+1}\frac{\theta (i)}{(n+1)} = \frac{\theta \left(\frac{(n+1)(n+2)}{2}\right)}{(n+1)} = \theta (n)
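The three cases can also be checked empirically. The sketch below counts how many elements linear search examines in the best case (x first), the worst case (x absent), and on average over all n+1 equally likely cases; the array contents and size are arbitrary choices for illustration.

```python
# Count the number of elements linear search examines for a given query.
def count_comparisons(arr, x):
    for i, v in enumerate(arr):
        if v == x:
            return i + 1      # found after i+1 comparisons
    return len(arr)           # absent: every element was compared

n = 100
arr = list(range(n))

best = count_comparisons(arr, 0)       # x at the first index -> 1
worst = count_comparisons(arr, -1)     # x absent -> n
# Average over the n "present" positions plus the one "absent" case:
total = sum(count_comparisons(arr, x) for x in arr) + count_comparisons(arr, -1)
average = total / (n + 1)

print(best, worst, round(average, 1))  # prints: 1 100 51.0
```

The measured average is about (n+2)/2, matching the Θ(n) result of the formula above.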

## Which Complexity analysis is generally used?

Below, the three kinds of analysis are ranked by how often they are used in practice:

### 1. Worst Case Analysis:

Most of the time, we do worst-case analysis to analyze algorithms. Worst-case analysis guarantees an upper bound on the running time of an algorithm, which is useful information.

### 2. Average Case Analysis

The average case analysis is not easy to do in most practical cases and it is rarely done. In the average case analysis, we must know (or predict) the mathematical distribution of all possible inputs.

### 3. Best Case Analysis

The best-case analysis is of little practical use. Guaranteeing only a lower bound on an algorithm provides no useful guarantee: in the worst case, the same algorithm may still take years to run.

### Interesting information about asymptotic notations:

A) For some algorithms, all the cases (worst, best, and average) are asymptotically the same, i.e., there are no separate worst and best cases.

• Example:  Merge Sort does Θ(n log(n)) operations in all cases.
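
To illustrate, the sketch below counts the element comparisons merge sort makes on sorted, reverse-sorted, and shuffled inputs of the same size; all three counts stay within a small constant factor of n·log2(n). The counter-in-a-list mechanism and the input sizes are our own illustrative choices.

```python
import random

# Merge sort that tallies element comparisons in counter[0].
def merge_sort(a, counter):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid], counter)
    right = merge_sort(a[mid:], counter)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        counter[0] += 1                  # one element comparison
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

n = 1024
for name, data in [("sorted", list(range(n))),
                   ("reversed", list(range(n, 0, -1))),
                   ("shuffled", random.sample(range(n), n))]:
    c = [0]
    merge_sort(data, c)
    # each count lies between (n/2)*log2(n) and n*log2(n): Theta(n log n)
    print(name, c[0])
```

The counts differ only by constant factors, never in order of growth, which is why merge sort has no asymptotically distinct best or worst case.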

B) Whereas most other sorting algorithms have distinct worst and best cases.

• Example 1: In the typical implementation of Quick Sort (where the pivot is chosen as a corner element, i.e., the first or last element), the worst case occurs when the input array is already sorted, and the best case occurs when the pivot always divides the array into two halves.
• Example 2: For insertion sort, the worst case occurs when the array is reverse-sorted, and the best case occurs when the array is already sorted in the output order.
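
The insertion sort example can be confirmed by counting comparisons directly. In this sketch (input size is an arbitrary choice), a sorted input needs n-1 comparisons (linear), while a reverse-sorted input needs n(n-1)/2 (quadratic).

```python
# Count the key comparisons insertion sort performs on a given input.
def insertion_sort_comparisons(a):
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # compare key with a[j]
            if a[j] > key:
                a[j + 1] = a[j]       # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

n = 100
print(insertion_sort_comparisons(range(n)))         # sorted: n-1 = 99
print(insertion_sort_comparisons(range(n, 0, -1)))  # reversed: n(n-1)/2 = 4950
```

This is the gap between the Ω(n) best case and the O(n²) worst case mentioned above.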

## C

```c
// C implementation of the approach
#include <stdio.h>

// Linearly search x in arr[].
// If x is present then return the index,
// otherwise return -1
int search(int arr[], int n, int x)
{
    int i;
    for (i = 0; i < n; i++) {
        if (arr[i] == x)
            return i;
    }
    return -1;
}

/* Driver's code */
int main()
{
    int arr[] = { 1, 10, 30, 15 };
    int x = 30;
    int n = sizeof(arr) / sizeof(arr[0]);

    // Function call
    printf("%d is present at index %d", x,
           search(arr, n, x));

    getchar();
    return 0;
}
```

## C++

```cpp
// C++ implementation of the approach
#include <iostream>
using namespace std;

// Linearly search x in arr[].
// If x is present then return the index,
// otherwise return -1
int search(int arr[], int n, int x)
{
    int i;
    for (i = 0; i < n; i++) {
        if (arr[i] == x)
            return i;
    }
    return -1;
}

// Driver's Code
int main()
{
    int arr[] = { 1, 10, 30, 15 };
    int x = 30;
    int n = sizeof(arr) / sizeof(arr[0]);

    // Function call
    cout << x << " is present at index "
         << search(arr, n, x);

    return 0;
}
```

## Java

```java
// Java implementation of the approach

public class GFG {

    // Linearly search x in arr[]. If x is present then
    // return the index, otherwise return -1
    static int search(int arr[], int n, int x)
    {
        int i;
        for (i = 0; i < n; i++) {
            if (arr[i] == x) {
                return i;
            }
        }
        return -1;
    }

    /* Driver's code */
    public static void main(String[] args)
    {
        int arr[] = { 1, 10, 30, 15 };
        int x = 30;
        int n = arr.length;

        // Function call
        System.out.printf("%d is present at index %d", x,
                          search(arr, n, x));
    }
}
```

## Python3

```python
# Python 3 implementation of the approach

# Linearly search x in arr[]. If x is present
# then return the index, otherwise return -1


def search(arr, x):
    for index, value in enumerate(arr):
        if value == x:
            return index
    return -1


# Driver's Code
if __name__ == '__main__':
    arr = [1, 10, 30, 15]
    x = 30

    # Function call
    print(x, "is present at index",
          search(arr, x))
```

## C#

```csharp
// C# implementation of the approach
using System;

public class GFG {

    // Linearly search x in arr[]. If x is present then
    // return the index, otherwise return -1
    static int search(int[] arr, int n, int x)
    {
        int i;
        for (i = 0; i < n; i++) {
            if (arr[i] == x) {
                return i;
            }
        }
        return -1;
    }

    /* Driver's code */
    public static void Main()
    {
        int[] arr = { 1, 10, 30, 15 };
        int x = 30;
        int n = arr.Length;

        // Function call
        Console.WriteLine(x + " is present at index "
                          + search(arr, n, x));
    }
}
```


## Javascript

```javascript
// javascript implementation of the approach

// Linearly search x in arr. If x is present then
// return the index, otherwise return -1
function search(arr, n, x) {
    var i;
    for (i = 0; i < n; i++) {
        if (arr[i] == x) {
            return i;
        }
    }
    return -1;
}

/* Driver program to test above functions */
var arr = [ 1, 10, 30, 15 ];
var x = 30;
var n = arr.length;
document.write(x + " is present at index " + search(arr, n, x));
```

Output

30 is present at index 2

#### Time Complexity Analysis: (In Big-O notation)

• Best Case: O(1). This occurs when the element to be searched is at the first index of the given list, so only 1 comparison is made.
• Average Case: O(n). Assuming each position (and absence) is equally likely, about half the list is scanned on average, which is still linear in n.
• Worst Case: O(n). This occurs when:
• The element to be searched is at the last index
• The element to be searched is not present in the list

2. In this example, we take an array of length n and handle the following cases:

• If n is even, the output is 0.
• If n is odd, the output is the sum of the elements of the array.

Below is the implementation of the given problem:


## C++

```cpp
// C++ implementation of the approach
#include <iostream>
using namespace std;

int getSum(int arr[], int n)
{
    if (n % 2 == 0) // n is even
    {
        return 0;
    }
    int sum = 0;
    for (int i = 0; i < n; i++) {
        sum += arr[i];
    }
    return sum; // n is odd
}

// Driver's Code
int main()
{
    // Declaring two arrays, one of even length and
    // the other of odd length
    int arr[4] = { 1, 2, 3, 4 };
    int a[5] = { 1, 2, 3, 4, 5 };

    // Function call
    cout << getSum(arr, 4)
         << endl; // prints 0 because n is even
    cout << getSum(a, 5)
         << endl; // prints the sum because n is odd
}
// This code is contributed by Suruchi Kumari
```

## Java

```java
// Java implementation of the approach

public class GFG {
    static int getsum(int arr[], int n)
    {
        if (n % 2 == 0) // if n is even
        {
            return 0;
        }
        int sum = 0;
        for (int i = 0; i < n; i++) {
            sum += arr[i];
        }
        return sum; // if n is odd
    }

    /* Driver's code */
    public static void main(String[] args)
    {
        // Declaring an array of even length
        int arr1[] = { 1, 2, 3, 4 };
        int n1 = arr1.length;

        // Declaring an array of odd length
        int arr2[] = { 1, 2, 3, 4, 5 };
        int n2 = arr2.length;

        // Function call
        System.out.println(getsum(arr1, n1)); // prints 0 because n is even
        System.out.println(getsum(arr2, n2)); // prints the sum because n is odd
    }
}

// This code is contributed by Syed Maruf Ali (Sdmrf)

## Python3

```python
# Python 3 implementation of the approach


def getsum(arr, n):
    if n % 2 == 0:  # if n is even
        return 0

    Sum = 0
    for i in range(n):
        Sum += arr[i]
    return Sum  # if n is odd


# Driver's Code
if __name__ == '__main__':
    arr1 = [1, 2, 3, 4]  # Declaring an array of even length
    n1 = len(arr1)
    arr2 = [1, 2, 3, 4, 5]  # Declaring an array of odd length
    n2 = len(arr2)

    # Function call
    print(getsum(arr1, n1))  # prints 0 because n is even
    print(getsum(arr2, n2))  # prints the sum of the array because n is odd

# This code is contributed by Syed Maruf Ali
```

Output

0
15

#### Time Complexity Analysis:

• Best Case: The order of growth is constant, because in the best case n is even and the function returns immediately.
• Average Case: Assuming even and odd lengths are equally likely, the order of growth is linear.
• Worst Case: The order of growth is linear, because in the worst case n is odd and the entire array is summed.
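
The case analysis above can be confirmed by counting loop iterations directly. This is a sketch; the function name `get_sum_counted` and the step counter are our own additions for illustration, not part of the implementations above.

```python
# Variant of getSum that also reports how many loop iterations ran.
def get_sum_counted(arr):
    if len(arr) % 2 == 0:      # even length: constant-time exit, no loop
        return 0, 0
    total, steps = 0, 0
    for v in arr:              # odd length: one iteration per element
        total += v
        steps += 1
    return total, steps

print(get_sum_counted([1, 2, 3, 4]))     # even length -> (0, 0)
print(get_sum_counted([1, 2, 3, 4, 5]))  # odd length  -> (15, 5)
```

Zero iterations for even n and exactly n iterations for odd n match the constant best case and the linear worst case stated above.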