# Time Complexity (Detailed Discussion)

## What is Time Complexity?
Time Complexity measures how much time an algorithm takes as a function of the input size (n).
It tells us how fast an algorithm runs as the input grows.
It does not measure actual time (seconds, milliseconds).
Instead, it measures the number of steps/operations the algorithm performs.
## Why is Time Complexity Important?
- Helps compare different algorithms
- Predicts performance for large inputs
- Helps choose the most efficient algorithm
- Avoids slow software systems
- Removes dependency on hardware speed
For example:
- Searching in a sorted list using Binary Search is much faster than Linear Search.
- Sorting using Merge Sort is more efficient than Bubble Sort.
## Types of Time Complexity
We analyze time complexity using:
1. Big-O Notation (O) → Worst Case
Upper bound
Example: “At most this much time”
2. Big-Ω (Omega) → Best Case
Lower bound
Example: “At least this much time”
3. Big-Θ (Theta) → Tight Bound
Upper and lower bound together (exact growth rate)
Example: “Exactly this order of growth”
In most exams/interviews, Big-O is used.
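For reference, these notations have standard formal definitions, written here as a sketch in which c, c₁, c₂, and n₀ denote positive constants:

```latex
% Standard asymptotic definitions (c, c_1, c_2, n_0 are positive constants)
f(n) = O(g(n))      \iff \exists\, c, n_0 > 0 : f(n) \le c \cdot g(n) \text{ for all } n \ge n_0
f(n) = \Omega(g(n)) \iff \exists\, c, n_0 > 0 : f(n) \ge c \cdot g(n) \text{ for all } n \ge n_0
f(n) = \Theta(g(n)) \iff \exists\, c_1, c_2, n_0 > 0 : c_1 g(n) \le f(n) \le c_2 g(n) \text{ for all } n \ge n_0
```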
## Common Time Complexities (with Examples)
1. Constant Time → O(1)
Time does not depend on input size.
Examples:
- Accessing array element by index
- Pushing/popping from stack
- Checking if a number is even
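A minimal C++ sketch of such constant-time operations (the array contents and variable names are only illustrative):

```cpp
#include <iostream>
#include <stack>

int main() {
    int arr[] = {10, 20, 30, 40, 50};

    // Accessing an element by index: one operation, independent of array size -> O(1)
    int third = arr[2];

    // Pushing onto / popping from a stack: O(1) per operation
    std::stack<int> st;
    st.push(third);
    st.pop();

    // Checking if a number is even: a single modulo comparison -> O(1)
    bool isEven = (third % 2 == 0);

    std::cout << third << " " << std::boolalpha << isEven << "\n";
    return 0;
}
```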
2. Logarithmic Time → O(log n)
Time grows slowly even for large input.
Examples:
- Binary Search
- Searching in a balanced BST (its height is O(log n))
- Divide-and-conquer steps that discard half of the input each time
The input size halves at each step, as in the binary search sketch below.
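Here is a hedged sketch of iterative binary search in C++, assuming the input std::vector is already sorted (function and variable names are illustrative):

```cpp
#include <vector>

// Returns the index of target in a sorted vector, or -1 if absent.
// Each iteration halves the search range, so the loop runs O(log n) times.
int binarySearch(const std::vector<int>& a, int target) {
    int lo = 0, hi = static_cast<int>(a.size()) - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;       // avoids overflow of (lo + hi)
        if (a[mid] == target) return mid;
        if (a[mid] < target)  lo = mid + 1; // discard the left half
        else                  hi = mid - 1; // discard the right half
    }
    return -1;
}
```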
3. Linear Time → O(n)
Time grows proportionally to the input size.
Examples:
- Linear Search
- Finding maximum/minimum in an array
- Traversing a list
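For instance, finding the maximum of an array has to examine every element once, so its work grows linearly with n. A minimal sketch, assuming a non-empty vector:

```cpp
#include <vector>

// Scans every element exactly once -> n comparisons -> O(n).
int findMax(const std::vector<int>& a) {
    int best = a[0];                  // assumes a is non-empty
    for (size_t i = 1; i < a.size(); ++i) {
        if (a[i] > best) best = a[i];
    }
    return best;
}
```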
4. Linearithmic Time → O(n log n)
Combination of linear and logarithmic.
Examples:
- Merge Sort
- Heap Sort
- Quick Sort (average)
Most efficient sorting algorithms use O(n log n).
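As a sketch of why Merge Sort is O(n log n): the array is halved recursively (about log n levels) and each level does O(n) merging work. A minimal C++ version, with illustrative helper names:

```cpp
#include <vector>

// Merges two sorted halves a[lo..mid] and a[mid+1..hi]: O(n) work per level.
static void merge(std::vector<int>& a, int lo, int mid, int hi) {
    std::vector<int> tmp;
    tmp.reserve(hi - lo + 1);
    int i = lo, j = mid + 1;
    while (i <= mid && j <= hi) tmp.push_back(a[i] <= a[j] ? a[i++] : a[j++]);
    while (i <= mid)            tmp.push_back(a[i++]);
    while (j <= hi)             tmp.push_back(a[j++]);
    for (int k = 0; k < static_cast<int>(tmp.size()); ++k) a[lo + k] = tmp[k];
}

// Recursively halves the array: O(log n) levels of O(n) merging -> O(n log n).
void mergeSort(std::vector<int>& a, int lo, int hi) {
    if (lo >= hi) return;
    int mid = lo + (hi - lo) / 2;
    mergeSort(a, lo, mid);
    mergeSort(a, mid + 1, hi);
    merge(a, lo, mid, hi);
}
```

This matches the recurrence T(n) = 2T(n/2) + n discussed under Rule 5 below.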
5. Quadratic Time → O(n²)
Time increases rapidly as input grows.
Examples:
- Bubble Sort
- Selection Sort
- Insertion Sort (worst case)
- Nested loops (a loop inside another loop), as in the bubble sort sketch below
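A minimal bubble sort sketch; the two nested loops each run up to about n times, giving roughly n² comparisons:

```cpp
#include <utility>
#include <vector>

// Outer loop runs n-1 times, inner loop up to n-1 times -> O(n^2) comparisons.
void bubbleSort(std::vector<int>& a) {
    int n = static_cast<int>(a.size());
    for (int i = 0; i < n - 1; ++i) {
        for (int j = 0; j < n - 1 - i; ++j) {
            if (a[j] > a[j + 1]) std::swap(a[j], a[j + 1]);  // bubble the larger value right
        }
    }
}
```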
6. Cubic Time → O(n³)
Triple nested loops.
Example:
- Matrix multiplication (brute force)
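Brute-force multiplication of two n × n matrices uses three nested loops, hence about n³ multiplications. A minimal sketch using nested std::vector (the Matrix alias is just for readability):

```cpp
#include <vector>

using Matrix = std::vector<std::vector<int>>;

// Three nested loops of length n -> n * n * n multiplications -> O(n^3).
Matrix multiply(const Matrix& A, const Matrix& B) {
    int n = static_cast<int>(A.size());
    Matrix C(n, std::vector<int>(n, 0));
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j)
            for (int k = 0; k < n; ++k)
                C[i][j] += A[i][k] * B[k][j];
    return C;
}
```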
7. Exponential Time → O(2ⁿ)
Very slow: the running time grows extremely fast as n increases.
Examples:
- Fibonacci (recursive)
- Subset generation
- Tower of Hanoi
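For example, the naive recursive Fibonacci makes two recursive calls per invocation, so the number of calls grows exponentially with n (commonly quoted as O(2ⁿ)). A minimal sketch:

```cpp
// Each call spawns two more calls, so the call tree has exponentially many nodes.
long long fib(int n) {
    if (n <= 1) return n;           // base cases: fib(0) = 0, fib(1) = 1
    return fib(n - 1) + fib(n - 2); // two recursive calls -> roughly O(2^n) calls
}
```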
8. Factorial Time → O(n!)
Used in brute-force permutation problems.
Examples:
- Traveling Salesman Problem (TSP) brute force
- Generating all permutations
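A minimal sketch of visiting all n! orderings with std::next_permutation (which expects the sequence to start in sorted order); the small cities vector is only illustrative:

```cpp
#include <algorithm>
#include <iostream>
#include <vector>

int main() {
    std::vector<int> cities = {1, 2, 3};   // must be sorted before the loop
    long long count = 0;
    do {
        ++count;                           // visit one ordering; n! orderings in total
    } while (std::next_permutation(cities.begin(), cities.end()));
    std::cout << "Permutations visited: " << count << "\n";  // prints 6 for n = 3
    return 0;
}
```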
## Time Complexity Table (Most Important)
| Complexity | Name | Example |
|---|---|---|
| O(1) | Constant | Access array element |
| O(log n) | Logarithmic | Binary Search |
| O(n) | Linear | Linear Search |
| O(n log n) | Linearithmic | Merge Sort |
| O(n²) | Quadratic | Bubble Sort |
| O(n³) | Cubic | Matrix operations |
| O(2ⁿ) | Exponential | Fibonacci (recursion) |
| O(n!) | Factorial | TSP permutations |
## How to Calculate Time Complexity (Rules)
Rule 1: Drop constants
O(2n) → O(n)
O(100) → O(1)
Rule 2: Keep only the highest-order term
O(n³ + n² + n + 5) → O(n³)
Rule 3: For loops
- Single loop → O(n)
- Nested loops → O(n²)
- Loop that halves n each iteration → O(log n)
Rule 4: Consecutive statements
Add their costs; the dominant term wins (equivalently, take the maximum).
O(n) + O(n²) = O(n²)
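A small illustration of this rule: a single loop followed by a nested loop performs n + n² basic steps, which is O(n²) overall (the function name demo is only illustrative):

```cpp
#include <cstdio>

void demo(int n) {
    long long steps = 0;

    // First block: a single loop -> about n steps, O(n)
    for (int i = 0; i < n; ++i) ++steps;

    // Second block: nested loops -> about n * n steps, O(n^2)
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j) ++steps;

    // Total = n + n^2, dominated by the n^2 term -> O(n^2)
    std::printf("n = %d, steps = %lld\n", n, steps);
}
```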
Rule 5: Recursive algorithms
Use recurrence relations.
E.g., Merge Sort: T(n) = 2T(n/2) + n → O(n log n)
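A short worked expansion of that recurrence, assuming n is a power of 2 and ignoring constant factors:

```latex
% Unrolling T(n) = 2T(n/2) + n, with T(1) = O(1) and n a power of 2
T(n) = 2T(n/2) + n
     = 4T(n/4) + 2n
     = 8T(n/8) + 3n
     = \dots
     = 2^k T(n/2^k) + kn
% The recursion stops when n/2^k = 1, i.e. k = \log_2 n:
T(n) = n\,T(1) + n \log_2 n = O(n \log n)
```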
## Examples of Calculating Time Complexity
Example 1:
for (int i = 0; i < n; i++)
    cout << i;
Complexity → O(n)
Example 2:
for (int i = 0; i < n; i++)
    for (int j = 0; j < n; j++)
        cout << i + j;
Complexity → O(n²)
Example 3:
while (n > 1)
    n = n / 2;
Complexity → O(log n)
Example 4:
cout << "Hello";
Complexity → O(1)
## Best, Worst, and Average Case Example
Consider Linear Search:
- Best Case (Ω(1)): the item is found at the first position
- Worst Case (O(n)): the item is at the last position or not present at all
- Average Case (Θ(n)): the item is somewhere in the middle, so about n/2 elements are checked
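A minimal linear search sketch in C++ to make the three cases concrete (vector contents and names are illustrative):

```cpp
#include <vector>

// Best case:  target is at index 0              -> 1 comparison        -> Omega(1)
// Worst case: target is last or absent          -> n comparisons       -> O(n)
// Average:    target is somewhere in the middle -> about n/2 comparisons -> Theta(n)
int linearSearch(const std::vector<int>& a, int target) {
    for (size_t i = 0; i < a.size(); ++i) {
        if (a[i] == target) return static_cast<int>(i);
    }
    return -1;  // not found
}
```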
## Graphical Comparison (Understanding Growth)
Fastest → Slowest:
O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!)
## 🎯 Summary (Exam-Oriented)
- Time Complexity measures performance of an algorithm.
- Uses Big-O to express worst-case performance.
- Most common complexities: O(1), O(log n), O(n), O(n log n), O(n²).
- Nested loops → often O(n²).
- Divide & Conquer sorting (e.g., Merge Sort) → O(n log n).
- Recursion → use a recurrence relation to find the complexity.
- Efficiency becomes crucial as input size grows.
