Big O Notation
A mathematical notation describing an algorithm's time or space complexity relative to its input size.
Also: Big-O · asymptotic complexity
Definition
Big O notation is a mathematical notation used to describe the performance or complexity of an algorithm in terms of time (execution time) or space (memory usage) relative to the size of the input (n). It expresses an upper bound on growth, most often quoted for the worst case, and keeps only the dominant term: constant factors and lower-order terms are dropped. Common complexities include O(1) constant, O(log n) logarithmic, O(n) linear, O(n log n) linearithmic, O(n²) quadratic, and O(2^n) exponential. Understanding Big O helps engineers choose efficient algorithms for large-scale problems.
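The complexity classes above can be illustrated with small Python sketches (function and variable names here are illustrative, not taken from any particular library):

```python
def constant_lookup(items, i):
    # O(1): indexing a list takes the same time regardless of len(items)
    return items[i]

def linear_search(items, target):
    # O(n): in the worst case, every element is examined once
    for idx, value in enumerate(items):
        if value == target:
            return idx
    return -1

def all_pairs(items):
    # O(n²): nested iteration produces n * n pairs
    return [(a, b) for a in items for b in items]
```

For an input of 1,000 items, `constant_lookup` does one step, `linear_search` up to 1,000, and `all_pairs` 1,000,000 — the gap that Big O makes visible as n grows.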
Example
“Binary search has O(log n) complexity — finding an element in 1 billion sorted items takes only 30 comparisons, while linear search takes up to 1 billion.”
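The "30 comparisons for 1 billion items" claim can be checked with a sketch of binary search that counts comparisons (the counter and names are illustrative additions):

```python
def binary_search(seq, target):
    # Classic binary search over a sorted sequence, returning
    # (index, comparison_count); index is -1 if target is absent.
    lo, hi, comparisons = 0, len(seq) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if seq[mid] == target:
            return mid, comparisons
        if seq[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

# range(10**9) behaves as a sorted sequence of 1 billion integers
# with O(1) indexing, so no giant list is materialized.
idx, steps = binary_search(range(10**9), 123_456_789)
```

Since each loop iteration halves the search interval, `steps` never exceeds ⌈log₂(10⁹ + 1)⌉ = 30, matching the quote.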
Synonyms
- algorithmic complexity
- time complexity notation
- asymptotic notation
Related Terms
- algorithm
- data-structure
- time-complexity
- optimization
