Analysis of algorithms is the calculation of the amount of time and storage necessary to execute them. The term was originally coined by Donald Knuth.
The worst-case runtime complexity of an algorithm is the maximum number of steps taken on any instance of size n. The best-case runtime complexity is the minimum number of steps taken on any instance of size n. The average-case runtime complexity is the average number of steps taken over all instances of size n.
For example, consider a sequential search algorithm that searches an array of n numbers.
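To make the three cases concrete, here is a minimal sketch of sequential search (the function name and return convention are illustrative, not from the original):

```python
def sequential_search(arr, target):
    """Scan the array left to right until the target is found."""
    for i, value in enumerate(arr):
        if value == target:
            return i  # index of the first match
    return -1  # target is not present

# Best case:    target is the first element  -> 1 comparison.
# Worst case:   target is last or absent     -> n comparisons.
# Average case: target equally likely at any
#               position                     -> about n/2 comparisons.
```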
[Figure omitted; image by Cmglee]
The goal of computational complexity is to classify algorithms according to their performance. We will represent the time function T(n) using "big-O" notation to express an algorithm's runtime complexity. For example, the statement
T(n) = O(n²) says that the algorithm has quadratic time complexity.
Big O is a theoretical measure of an algorithm's execution: it gives an upper bound on how the running time grows as the input size increases.
An algorithm is said to run in constant time if it requires the same amount of time regardless of the input size.
An algorithm runs in linear time if its time execution is directly proportional to the input size.
An algorithm runs in logarithmic time if its time execution is proportional to the logarithm of the input size.
An algorithm runs in quadratic time if its time execution is proportional to the square of the input size.
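The four classes above can be illustrated with small sketch functions (the names are illustrative; binary search requires a sorted input):

```python
def get_first(items):
    """O(1), constant time: one operation regardless of input size."""
    return items[0]

def total(items):
    """O(n), linear time: touches each element exactly once."""
    s = 0
    for x in items:
        s += x
    return s

def binary_search(sorted_items, target):
    """O(log n), logarithmic time: halves the search range each step."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def has_duplicate(items):
    """O(n²), quadratic time: compares every pair of elements."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False
```

Doubling the input size leaves `get_first` unchanged, doubles the work of `total`, adds one step to `binary_search`, and quadruples the work of `has_duplicate` in the worst case.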
Big omega (Ω) is the notation used to express a lower bound on an algorithm's running time.
Space complexity is a measure of the amount of working storage an algorithm needs, that is, how much memory, in the worst case, is required at any point during its execution. As with time complexity, we are mostly concerned with how the space requirement grows, in big-O terms, as the input size n grows.
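A small sketch of the distinction (function names are illustrative): the first function uses a fixed amount of extra memory no matter how long the input is, while the second allocates a new list as large as the input.

```python
def sum_in_place(items):
    """O(1) extra space: only a fixed number of variables, regardless of n."""
    total = 0
    for x in items:
        total += x
    return total

def doubled(items):
    """O(n) extra space: builds a new list with one entry per input element."""
    return [2 * x for x in items]
```

Note that space complexity counts working storage beyond the input itself, so `sum_in_place` is constant-space even though its input has n elements.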