Big-O Notation
Big-O notation, often written as O(...), is a mathematical concept used in computer science to describe an upper bound on the growth rate of an algorithm's running time or space usage as the input size n grows. It characterizes asymptotic behavior, focusing on how resource consumption scales rather than on exact values for finite inputs. A function T(n) is O(g(n)) if there exist constants c > 0 and n0 ≥ 1 such that T(n) ≤ c · g(n) for all n ≥ n0. Here, T(n) might represent time or memory, and g(n) is a simple growth function such as 1, log n, n, n log n, n^2, and so on.
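As a concrete illustration of the definition, the sketch below uses a hypothetical cost function T(n) = 3n^2 + 5n (not from the source) and checks numerically that T(n) ≤ c · g(n) with the witnesses c = 4 and n0 = 5, which shows T(n) = O(n^2):

```python
def T(n):
    # Hypothetical running-time function: 3n^2 + 5n steps.
    return 3 * n * n + 5 * n

def g(n):
    # Candidate growth function g(n) = n^2.
    return n * n

c, n0 = 4, 5  # witnesses for the Big-O definition
# 3n^2 + 5n <= 4n^2 holds exactly when 5n <= n^2, i.e. n >= 5.
assert all(T(n) <= c * g(n) for n in range(n0, 10_000))
print("T(n) = 3n^2 + 5n is O(n^2) with c = 4, n0 = 5")
```

Any c and n0 that make the inequality hold for all n ≥ n0 work; the witnesses are not unique.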
Common examples include O(1) for constant time, O(log n) for logarithmic time, O(n) for linear time, O(n log n) for linearithmic time, O(n^2) for quadratic time, and O(2^n) for exponential time.
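As a sketch, the toy Python functions below (the names are illustrative, not from the source) exhibit several of these classes:

```python
from bisect import bisect_left

def first_element(items):
    # O(1): a single indexing operation, independent of len(items).
    return items[0]

def contains_sorted(items, x):
    # O(log n): binary search on a sorted list via bisect.
    i = bisect_left(items, x)
    return i < len(items) and items[i] == x

def total(items):
    # O(n): one pass over the input.
    s = 0
    for v in items:
        s += v
    return s

def count_pairs(items):
    # O(n^2): nested loops touch every unordered pair once.
    pairs = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            pairs += 1
    return pairs

data = list(range(10))
print(first_element(data), contains_sorted(data, 7), total(data), count_pairs(data))
# → 0 True 45 45
```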
Other related notations provide tighter or complementary information. Omega notation, Ω(g(n)), denotes an asymptotic lower bound: T(n) ≥ c · g(n) for some c > 0 and all n ≥ n0. Theta notation, Θ(g(n)), denotes a tight bound: T(n) is Θ(g(n)) when it is both O(g(n)) and Ω(g(n)), i.e., sandwiched between two constant multiples of g(n) for sufficiently large n.
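A minimal numerical check of a tight bound, using a hypothetical cost function T(n) = 3n^2 + 5n and the witnesses c1 = 3, c2 = 4, n0 = 5 (all illustrative assumptions), shows T(n) = Θ(n^2):

```python
def T(n):
    # Hypothetical cost function: 3n^2 + 5n.
    return 3 * n * n + 5 * n

# Theta requires both bounds: c1 * g(n) <= T(n) <= c2 * g(n) for n >= n0.
c1, c2, n0 = 3, 4, 5
assert all(c1 * n * n <= T(n) <= c2 * n * n for n in range(n0, 10_000))
print("T(n) = 3n^2 + 5n is Theta(n^2)")
```

The lower bound 3n^2 ≤ T(n) holds for every n ≥ 0; it is the upper bound that forces n0 = 5.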
Big-O is a tool for comparing algorithms and understanding scalability. It abstracts away constant factors and lower-order terms, because for sufficiently large n the highest-order term dominates: an algorithm in a smaller Big-O class will eventually outperform one in a larger class, whatever the constants, once the input grows past some crossover point.
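A minimal sketch of this crossover effect, using hypothetical step counts of 100n (linear, large constant factor) versus n^2 (quadratic, constant factor 1):

```python
def linear_steps(n):
    # Hypothetical O(n) algorithm with a large constant factor.
    return 100 * n

def quadratic_steps(n):
    # Hypothetical O(n^2) algorithm with constant factor 1.
    return n * n

# For small inputs the quadratic algorithm does less work...
assert quadratic_steps(50) < linear_steps(50)      # 2500 < 5000
# ...but beyond the crossover point the linear one wins forever.
assert linear_steps(1000) < quadratic_steps(1000)  # 100000 < 1000000

crossover = next(n for n in range(1, 10_000) if quadratic_steps(n) > linear_steps(n))
print("quadratic exceeds linear from n =", crossover)
# → quadratic exceeds linear from n = 101
```

This is exactly what "abstracting away constants" means: the constant 100 moves the crossover point but cannot change which algorithm scales better.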