Time complexity is an approximation of how long an algorithm takes to execute. It is not an actual time (e.g., in seconds or nanoseconds) but an estimate of growth, commonly obtained by counting the number of elementary operations the algorithm performs.

Example: Suppose we have an array of n elements. If we traverse the entire array, we visit each element exactly once, and since there are n elements, we perform n visits in total. Hence the time complexity here is **n, i.e., O(n)**.
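As a minimal sketch of the traversal above (the counter is added purely for illustration), we can count the visits and see that the work is always exactly n:

```c
#include <stddef.h>

/* Traverse an array of n elements and return how many elements were
   visited. Each element is visited exactly once, so the result is
   always n -- linear, i.e., O(n) work. */
size_t traverse(const int *arr, size_t n) {
    size_t visits = 0;
    long sum = 0;                /* stand-in for "do something" per element */
    for (size_t i = 0; i < n; i++) {
        sum += arr[i];
        visits++;
    }
    (void)sum;                   /* result of the work is not needed here */
    return visits;
}
```

Calling `traverse` on a 5-element array performs exactly 5 visits, on a 1000-element array exactly 1000, which is what "time complexity n" expresses.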

**Various Notations to Represent Time Complexity:**

**Big O:** It gives an upper bound on the time taken by an algorithm.

If f(n) <= c\*g(n) for all sufficiently large n, then f(n) = O(g(n)). Here, c is a positive constant and f(n) and g(n) are functions of the input size n.
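As a concrete instance (a standard textbook example, not from the definitions above): f(n) = 3n + 2 is O(n), because 3n + 2 <= 4n for every n >= 2, so the constant c = 4 witnesses the bound. The inequality can be checked mechanically:

```c
/* Verify that f(n) = 3n + 2 stays below c*g(n) = 4n for all n in
   [2, limit], which witnesses f(n) = O(n) with c = 4.
   Returns 1 if the bound holds everywhere in the range. */
int big_o_bound_holds(unsigned limit) {
    for (unsigned n = 2; n <= limit; n++) {
        if (3u * n + 2u > 4u * n)   /* bound violated: f(n) > c*g(n) */
            return 0;
    }
    return 1;
}
```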

**Big Omega:** It gives a lower bound on the time taken by an algorithm.

If c\*g(n) <= f(n) for all sufficiently large n, then f(n) = Ω(g(n)). Here, c is a positive constant and f(n) and g(n) are functions of the input size n.

**Big Theta:** It gives a tight bound on the time taken by an algorithm.

If c1\*g(n) <= f(n) <= c2\*g(n) for all sufficiently large n, then f(n) = Θ(g(n)). Here, c1 and c2 are positive constants and f(n) and g(n) are functions of the input size n.
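Continuing the same illustrative example, f(n) = 3n + 2 is also Θ(n): taking c1 = 3 and c2 = 4 gives 3n <= 3n + 2 <= 4n for every n >= 2, so f is squeezed between two linear functions. A quick sanity check of both inequalities:

```c
/* Verify that f(n) = 3n + 2 satisfies c1*n <= f(n) <= c2*n with
   c1 = 3 and c2 = 4 for all n in [2, limit], witnessing f(n) = Θ(n).
   Returns 1 if both bounds hold everywhere in the range. */
int theta_bound_holds(unsigned limit) {
    for (unsigned n = 2; n <= limit; n++) {
        unsigned f = 3u * n + 2u;
        if (!(3u * n <= f && f <= 4u * n))
            return 0;
    }
    return 1;
}
```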

```c
for (i = 0; i < n; i++) {
    // do something
}
```

The time complexity of the above code is O(n), since the loop runs n times.

```c
for (i = 0; i < n; i++) {
    for (j = 0; j < n; j++) {
        // do something
    }
}
```

The time complexity of the above code is O(n^2), as there are two nested loops: the first loop runs n times, and for every iteration of the first loop, the second loop also runs n times.
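This can be made concrete by counting executions of the inner body (the counter is an illustrative addition): the total is n \* n, which is why the complexity is quadratic.

```c
/* Count how many times the body of the nested loops executes.
   The inner loop runs n times per outer iteration and the outer
   loop runs n times, so the total is n*n -- O(n^2) operations. */
unsigned long nested_ops(unsigned n) {
    unsigned long ops = 0;
    for (unsigned i = 0; i < n; i++) {
        for (unsigned j = 0; j < n; j++) {
            ops++;               /* one elementary operation */
        }
    }
    return ops;
}
```

Doubling n quadruples the count (10 gives 100 operations, 20 gives 400), the signature growth pattern of O(n^2).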