In mathematical analysis, asymptotic analysis, also known as asymptotics, is a method of describing limiting behavior.
As an illustration, suppose that we are interested in the properties of a function f(n) as n becomes very large. If f(n) = n^2 + 3n, then as n becomes very large, the term 3n becomes insignificant compared to n^2. The function f(n) is said to be "asymptotically equivalent to n^2, as n → ∞". This is often written symbolically as f(n) ~ n^2, which is read as "f(n) is asymptotic to n^2".
Guidelines for Asymptotic Analysis
1) Loops: The running time of a loop is, at most, the running time of the statements inside the loop (including tests) multiplied by the number of iterations.
//loop body executes n times
for(int i=0;i<n;i++) {
m = m + 5; // constant time c
}
Total time = c x n = cn = O(n)
2) Nested loops: Analyze from the inside out. Total running time is the product of the sizes of all the loops, multiplied by the cost of the innermost statements.
//outer loop executes n times
for(int i=0;i<n;i++) {
//inner loop executes n times
for(int j=0;j<n;j++) {
m = m + 5; // constant time c
}
}
Total time = c x n x n = cn^2 = O(n^2)
3) Logarithmic complexity: An algorithm is O(log n) if it takes constant time to cut the problem size by a fraction (usually by half). Consider the following example:
for(int i=1;i<n;) {
i = i*2;
}
On careful observation, the value of i doubles every iteration. Initially i = 1, in the next step i = 2, and in subsequent steps i = 4, 8 and so on.
If the loop runs for k iterations, then after k doublings i = 2^k, and the loop stops once i reaches n. Now we have
2^k = n
Taking log on both sides:
log(2^k) = log n
k log 2 = log n
k = log n // assuming base is 2
Total time = O(log n)