Section: STEM · Computer Science · Difficulty: Medium

Big O Notation


A mathematical notation describing the limiting behavior of an algorithm's resource usage.

Also: O notation · asymptotic notation

Definition

Big O notation is a mathematical framework used in computer science to describe an upper bound on an algorithm's time or space complexity as a function of input size, typically characterizing its worst-case behavior. Common complexities include O(1) constant, O(log n) logarithmic, O(n) linear, O(n log n) linearithmic, and O(n²) quadratic. It allows comparison of algorithm efficiency independent of hardware or implementation details.
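The classes listed above can be made concrete with a minimal sketch; the function names below are illustrative, not standard:

```python
def first_item(items):
    """O(1): constant time, independent of the length of items."""
    return items[0]

def total(items):
    """O(n): one pass over the input."""
    s = 0
    for x in items:
        s += x
    return s

def has_duplicate(items):
    """O(n²): compares every pair of elements in nested loops."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False
```

Doubling the input leaves first_item's work unchanged, doubles total's, and roughly quadruples has_duplicate's worst case.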

Example

A linear search through an unsorted list of n items has O(n) time complexity because in the worst case every element must be checked, while a binary search on a sorted list takes O(log n) time, making it drastically faster on large datasets.
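The comparison above can be sketched as follows; the comparison counters are added purely to make the growth rates visible:

```python
def linear_search(items, target):
    """O(n): check elements one by one; returns (index, comparisons)."""
    comparisons = 0
    for i, x in enumerate(items):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

def binary_search(sorted_items, target):
    """O(log n): halve the search range each step; the list must be sorted."""
    lo, hi = 0, len(sorted_items) - 1
    comparisons = 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if sorted_items[mid] == target:
            return mid, comparisons
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons
```

On a sorted list of 1,000 items, finding the last element costs linear_search 1,000 comparisons but binary_search at most 10, since each step discards half of the remaining range.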

Synonyms

  • asymptotic notation
  • complexity notation
  • O notation

