
Complexity

✅ Complexity of Algorithms


✅ 1. Meaning / Definition of Complexity of Algorithm

The complexity of an algorithm is a measure of the resources the algorithm needs to run.

These resources are mainly:

  • Time (how fast the algorithm runs)
  • Space (how much memory the algorithm uses)

So, algorithm complexity helps us to measure:

  • Efficiency
  • Performance
  • Speed and memory usage

✅ 2. Types of Algorithm Complexity

Algorithm complexity is mainly of two types:


✅ (A) Time Complexity

Time Complexity is a measure of how the running time of an algorithm grows as the input size increases.

📌 It is estimated by counting the number of basic operations (steps) the algorithm performs for an input of size n.

✅ Example:
If the input size is n, the running time may grow in proportion to:

  • n
  • log n
  • n² (and so on, depending on the algorithm)

✅ (B) Space Complexity

Space Complexity is the amount of memory required by an algorithm during execution.

It includes:

  • Memory for input data
  • Memory for variables
  • Memory for auxiliary structures (temporary arrays, stack, etc.)
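
For example, here is a minimal sketch (the function names are only illustrative) of how auxiliary memory affects space complexity: reversing an array in place needs only a few extra variables (O(1) auxiliary space), while building a reversed copy needs a second array of size n (O(n) auxiliary space).

#include <stdio.h>
#include <stdlib.h>

/* Reverses a[0..n-1] in place: only a few extra variables,
   so the auxiliary space is O(1). */
void reverse_in_place(int a[], int n) {
    for (int i = 0, j = n - 1; i < j; i++, j--) {
        int tmp = a[i];
        a[i] = a[j];
        a[j] = tmp;
    }
}

/* Returns a reversed copy: allocates a second array of size n,
   so the auxiliary space is O(n). */
int *reverse_copy(const int a[], int n) {
    int *b = malloc(n * sizeof *b);
    for (int i = 0; i < n; i++)
        b[i] = a[n - 1 - i];
    return b;   /* the caller must free() this memory */
}

int main(void) {
    int a[] = {1, 2, 3, 4, 5};
    reverse_in_place(a, 5);
    int *b = reverse_copy(a, 5);
    for (int i = 0; i < 5; i++)
        printf("%d %d\n", a[i], b[i]);
    free(b);
    return 0;
}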

✅ 3. Asymptotic Notations (Used to Represent Complexity)

To analyze complexity, we use Asymptotic Notations:


✅ 1. Big-O Notation (O)

It represents an upper bound on the running time and is most often used to describe the worst case.

✅ Meaning: The maximum time the algorithm can take.

📌 Example: Linear Search worst case
Element found at the last position (or not present at all) → O(n)


✅ 2. Big-Ω Notation (Omega Ω)

It represents a lower bound on the running time and is most often used to describe the best case.

✅ Meaning: The minimum time the algorithm can take.

📌 Example: Linear Search best case
Element found at the first position → Ω(1)
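
A minimal linear-search sketch (written here only for illustration) shows both bounds at once: the best case hits the first element, the worst case scans the whole array.

#include <stdio.h>

/* Returns the index of key in a[0..n-1], or -1 if it is not present. */
int linear_search(const int a[], int n, int key) {
    for (int i = 0; i < n; i++) {
        if (a[i] == key)
            return i;    /* best case: found at i == 0  -> Ω(1) */
    }
    return -1;           /* worst case: n comparisons   -> O(n) */
}

int main(void) {
    int a[] = {7, 3, 9, 4, 1};
    printf("%d\n", linear_search(a, 5, 9));   /* prints 2 */
    return 0;
}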


✅ 3. Big-Θ Notation (Theta Θ)

It represents a tight bound: the running time grows at the same rate from above and below (it is both O and Ω of that rate). It is often used to describe the average or typical case.

✅ Meaning: The growth rate the running time actually follows.

📌 Example: Linear Search examines about half the elements on average → Θ(n)
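
As a small sketch of what a tight bound means (this example is not from the original text), the loop below always runs exactly n times, no matter what the values are, so its running time is Θ(n): it is both O(n) and Ω(n).

#include <stdio.h>

/* Sums a[0..n-1]. The loop body executes exactly n times for every
   input of size n, so the running time is Θ(n). */
long sum(const int a[], int n) {
    long s = 0;
    for (int i = 0; i < n; i++)
        s += a[i];
    return s;
}

int main(void) {
    int a[] = {1, 2, 3, 4};
    printf("%ld\n", sum(a, 4));   /* prints 10 */
    return 0;
}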


✅ 4. Best, Average and Worst Case Complexity

✅ Best Case

Minimum time taken
Example: element found immediately

✅ Average Case

Typical time taken, averaged over all possible inputs

✅ Worst Case

Maximum time taken
Example: element found at the last position, or not found at all


✅ 5. Common Time Complexities (Very Important)

Complexity  | Name        | Example
O(1)        | Constant    | Accessing an array element a[5]
O(log n)    | Logarithmic | Binary Search
O(n)        | Linear      | Linear Search
O(n log n)  | Linear Log  | Merge Sort, Quick Sort (average)
O(n²)       | Quadratic   | Bubble Sort, Selection Sort
O(n³)       | Cubic       | Matrix multiplication (basic)
O(2ⁿ)       | Exponential | Tower of Hanoi
O(n!)       | Factorial   | Travelling Salesman (brute force)

Fastest to Slowest Order

O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!)
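
To see where the O(log n) row in the table comes from, here is a minimal binary-search sketch (assuming the array is already sorted): every comparison halves the remaining range, so at most about log₂ n comparisons are needed.

#include <stdio.h>

/* Returns the index of key in the sorted array a[0..n-1], or -1 if absent.
   Each iteration halves the search range, so the loop runs at most
   about log2(n) times -> O(log n). */
int binary_search(const int a[], int n, int key) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   /* avoids overflow of lo + hi */
        if (a[mid] == key) return mid;
        if (a[mid] < key)  lo = mid + 1;
        else               hi = mid - 1;
    }
    return -1;
}

int main(void) {
    int a[] = {1, 3, 4, 7, 9, 11};
    printf("%d\n", binary_search(a, 6, 7));   /* prints 3 */
    return 0;
}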

✅ 6. Examples to Understand Time Complexity


✅ Example 1: Constant Time O(1)

x = a[5];

✅ Only one step → O(1)


✅ Example 2: Linear Time O(n)

for (int i = 0; i < n; i++)
  printf("%d\n", i);

✅ Loop runs n times → O(n)


✅ Example 3: Quadratic Time O(n²)

for (int i = 0; i < n; i++)
  for (int j = 0; j < n; j++)
    printf("%d %d\n", i, j);

✅ Outer loop runs n times
✅ Inner loop runs n times for each outer iteration
Total steps = n × n = n² → O(n²)


✅ 7. Space Complexity Example

✅ Example:

An array of size n:

int a[n];

The memory used grows in proportion to the input size n
✅ Space complexity = O(n)
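
Space complexity also counts the call stack. As a sketch (not taken from the original text), the recursive sum below allocates no array, yet at its deepest point it keeps n stack frames alive, so its space complexity is still O(n).

#include <stdio.h>

/* Recursive sum of a[0..n-1]. Each call adds one stack frame, and
   n frames exist at the deepest point -> O(n) stack space. */
long rec_sum(const int a[], int n) {
    if (n == 0) return 0;
    return a[n - 1] + rec_sum(a, n - 1);
}

int main(void) {
    int a[] = {1, 2, 3, 4, 5};
    printf("%ld\n", rec_sum(a, 5));   /* prints 15 */
    return 0;
}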


✅ 8. Why is Complexity Analysis Important?

✅ Complexity analysis helps in:

  • Choosing the best algorithm
  • Improving performance
  • Saving time and memory
  • Handling large input efficiently
  • Designing optimized programs

📌 Example:
For large data, Binary Search (O(log n)) is much faster than Linear Search (O(n)). For n = 1,000,000 elements, linear search may need up to 1,000,000 comparisons, while binary search needs at most about 20, since log₂ 1,000,000 ≈ 20.


✅ Conclusion

The complexity of an algorithm tells us how efficient it is in terms of time and space. It is usually expressed with Big-O, Big-Ω, and Big-Θ notations, which let us compare algorithms and choose the best one for problems with large inputs.