
📘 Different Orders of Growth of Algorithms

1️⃣ Introduction

The order of growth of an algorithm describes how its time or space complexity increases as the input size (n) increases.

It is expressed using asymptotic notations such as Big-O (O), Omega (Ω), and Theta (Θ), and helps in:

  • Comparing algorithms
  • Predicting performance for large inputs
  • Selecting efficient solutions

2️⃣ Constant Order – O(1)

🔹 Meaning

The execution time is independent of input size.

🔹 Example

Accessing an array element:

x = arr[5]

🔹 Characteristics

  • Slowest possible growth: runtime stays flat as n increases
  • Highly efficient
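As a concrete sketch (variable names here are illustrative), list indexing and dictionary lookup in Python are both constant-time operations:

```python
# O(1): accessing an element by index takes the same time
# no matter how long the list is.
arr = [10, 20, 30, 40, 50, 60]
x = arr[5]           # direct index access

# Dictionary lookup by key is also (amortized) O(1).
lookup = {"a": 1, "b": 2}
y = lookup["b"]
```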

3️⃣ Logarithmic Order – O(log n)

🔹 Meaning

The problem size is reduced by a constant factor at each step.

🔹 Example

  • Binary Search
  • Balanced binary tree operations

🔹 Characteristics

  • Very efficient for large n
  • Common in divide-and-conquer algorithms
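Binary search illustrates the idea: each iteration halves the remaining search interval, so at most about log₂ n iterations are needed. A minimal sketch:

```python
def binary_search(arr, target):
    """Return the index of target in sorted arr, or -1 if absent.

    Each loop iteration halves the search interval,
    so the loop runs O(log n) times.
    """
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1   # discard the left half
        else:
            hi = mid - 1   # discard the right half
    return -1
```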

4️⃣ Linear Order – O(n)

🔹 Meaning

Time grows proportionally with input size.

🔹 Example

  • Linear Search
  • Traversing an array

🔹 Characteristics

  • Reasonable efficiency
  • Common in basic algorithms
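Linear search is the textbook O(n) example: in the worst case every element is examined exactly once. A minimal sketch:

```python
def linear_search(arr, target):
    """Return the index of target in arr, or -1 if absent.

    Worst case scans every element once: O(n).
    """
    for i, value in enumerate(arr):
        if value == target:
            return i
    return -1
```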

5️⃣ Linearithmic Order – O(n log n)

🔹 Meaning

Combination of linear and logarithmic growth.

🔹 Example

  • Merge Sort
  • Heap Sort
  • Quick Sort (average case)

🔹 Characteristics

  • Best possible complexity for comparison-based sorting
  • Scales well for large data
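Merge Sort shows where the n log n comes from: the array is split about log₂ n times, and each level of splitting does O(n) merging work. A minimal sketch:

```python
def merge_sort(arr):
    """Sort a list in O(n log n): log n levels of splitting,
    O(n) merging work per level."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```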

6️⃣ Quadratic Order – O(n²)

🔹 Meaning

Time grows in proportion to the square of the input size.

🔹 Example

  • Bubble Sort
  • Selection Sort
  • Nested loops

🔹 Characteristics

  • Inefficient for large n
  • Acceptable for small inputs
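Bubble Sort makes the quadratic cost visible: two nested passes over the data yield on the order of n² comparisons. A minimal sketch:

```python
def bubble_sort(arr):
    """Sort via repeated adjacent swaps.

    The two nested loops perform O(n^2) comparisons in total.
    """
    a = list(arr)          # work on a copy
    n = len(a)
    for i in range(n):
        # After pass i, the last i elements are in final position.
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a
```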

7️⃣ Cubic Order – O(n³)

🔹 Meaning

Time grows in proportion to the cube of the input size.

🔹 Example

  • Matrix multiplication (naïve method)
  • Triple nested loops

🔹 Characteristics

  • Very slow
  • Rarely used for large problems
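The naïve method multiplies two n × n matrices with three nested loops over n, giving O(n³) scalar multiplications. A minimal sketch:

```python
def matmul(A, B):
    """Naive multiplication of two n x n matrices.

    Three nested loops, each running n times: O(n^3).
    """
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C
```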

8️⃣ Exponential Order – O(2ⁿ)

🔹 Meaning

Time roughly doubles each time the input size increases by one.

🔹 Example

  • Recursive Fibonacci (naïve)
  • Subset generation

🔹 Characteristics

  • Extremely inefficient
  • Only suitable for very small n
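The naïve recursive Fibonacci makes the doubling concrete: each call spawns two more, so the call tree has on the order of 2ⁿ nodes. A minimal sketch:

```python
def fib(n):
    """Naive recursive Fibonacci.

    Each call spawns two further calls, so the call tree
    has roughly 2^n nodes: O(2^n) time.
    """
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)
```

Memoization or an iterative loop reduces this to O(n); the exponential version is shown only to illustrate the growth rate.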

9️⃣ Factorial Order – O(n!)

🔹 Meaning

Time grows factorially with input size.

🔹 Example

  • Traveling Salesman Problem (brute force)
  • Permutation generation

🔹 Characteristics

  • Worst growth rate
  • Computationally impractical
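Permutation generation shows why O(n!) is impractical: n items have n! orderings, so both the running time and the output size grow factorially. A minimal recursive sketch:

```python
def permutations(items):
    """Generate every ordering of items.

    There are n! orderings of n items, so both the time and
    the size of the result are O(n!).
    """
    if len(items) <= 1:
        return [list(items)]
    result = []
    for i in range(len(items)):
        rest = items[:i] + items[i + 1:]
        # Prefix each permutation of the rest with items[i].
        for perm in permutations(rest):
            result.append([items[i]] + perm)
    return result
```

The standard library offers the same functionality as `itertools.permutations`; the explicit version above is only to show the factorial structure.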

🔟 Order of Growth Hierarchy (Very Important)

O(1) < O(log n) < O(n) < O(n log n)
< O(n²) < O(n³) < O(2ⁿ) < O(n!)

✔ Lower order = better performance
✔ Used frequently in exam comparisons
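The hierarchy can be checked numerically. Evaluating each growth function at a single sample size (n = 16 is an arbitrary choice) reproduces the ordering above:

```python
import math

n = 16  # sample input size, chosen only for illustration
growth = {
    "O(1)":       1,
    "O(log n)":   math.log2(n),
    "O(n)":       n,
    "O(n log n)": n * math.log2(n),
    "O(n^2)":     n ** 2,
    "O(n^3)":     n ** 3,
    "O(2^n)":     2 ** n,
    "O(n!)":      math.factorial(n),
}
values = list(growth.values())
# The hierarchy means each value is larger than the previous one.
assert values == sorted(values)
```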


1️⃣1️⃣ Comparison Table

Order       Name           Example
------------------------------------------------
O(1)        Constant       Array access
O(log n)    Logarithmic    Binary Search
O(n)        Linear         Linear Search
O(n log n)  Linearithmic   Merge Sort
O(n²)       Quadratic      Bubble Sort
O(n³)       Cubic          Matrix Multiplication
O(2ⁿ)       Exponential    Recursive Fibonacci
O(n!)       Factorial      TSP (Brute Force)

1️⃣2️⃣ Importance in Algorithm Design

  • Helps choose efficient algorithms
  • Avoids performance bottlenecks
  • Critical for large-scale applications
  • Foundation for advanced algorithm techniques

🔚 Conclusion

Different orders of growth help in understanding the efficiency and scalability of algorithms.

An algorithm with a lower order of growth is generally preferred for large input sizes, since its advantage widens as n increases.