# Explain To Me Like I Am Five: Time Complexity

Hello everyone!

Time complexity is simply a measure of how the running time of a program grows with the size of its input. It plays a crucial role in almost every programming decision, so let's try to simplify it and learn it now.

## CONTENTS

- How to find time complexity?
- Understanding asymptotic analysis
- Analysis of Big-O complexities
- Big-O cheatsheet
- Do we consider time complexity of in-built functions?
- Time complexity of conditional statements

### How to find time complexity?

To check which of two algorithms is better for a task, one naive way is to run both on our computer and note which one takes less time. But this way of measuring is not reliable, because the results depend on factors like the performance of the device, the inputs given, etc. So instead, we estimate time complexity using **asymptotic analysis**.

In **asymptotic analysis**, we evaluate performance based on the input size. In other words, time complexity is estimated by counting the number of elementary steps an algorithm performs to finish execution.
Let's look at an example to understand this better, using two different ways to compute N * N.

**1st Case**

```
// pseudocode: compute N * N by repeated addition
result = 0
for i = 1 to N
    result = result + N
print result
```

**2nd Case**

```
// pseudocode: compute N * N in a single step
return N * N
```

In the first case, the time depends on N: as N increases, the time increases too. In the second case, whatever value of N we take, we get the result in a single step (independent of N).
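To make the difference concrete, here is a small Python sketch of both cases with a step counter added for illustration (the function names are my own, not from the original):

```python
def square_by_addition(n):
    """Case 1: build N * N by repeated addition -- about N steps."""
    steps = 0
    result = 0
    for _ in range(n):
        result += n
        steps += 1
    return result, steps

def square_directly(n):
    """Case 2: one multiplication -- a single step for any N."""
    return n * n, 1

print(square_by_addition(1000))  # (1000000, 1000): steps grow with N
print(square_directly(1000))     # (1000000, 1): steps stay constant
```

Both functions produce the same answer, but the step count of the first grows with N while the second stays fixed, which is exactly what asymptotic analysis captures.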

### Understanding asymptotic analysis

Let's consider an example: suppose an algorithm's running time works out to T(n) = n^2 + 2n + 8. For large values of n, the term (2n + 8) becomes insignificant compared to n^2, so we say T(n) is O(n^2).

- We also ignore constant coefficients, even on higher-order terms. If we have 250n^2 and 300n^3, we consider only n^2 and n^3, dropping the constants 250 and 300.
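A quick numeric check (a Python sketch) shows how the lower-order terms fade as n grows:

```python
def T(n):
    # the example running-time function T(n) = n^2 + 2n + 8
    return n**2 + 2*n + 8

for n in (10, 100, 10_000):
    # ratio of the full function to its dominant term n^2
    # approaches 1 as n grows, so only n^2 matters asymptotically
    print(n, T(n) / n**2)
```

At n = 10 the lower-order terms still contribute noticeably, but by n = 10,000 the ratio is essentially 1.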

### Analysis of Big-O complexities

**O(1)**

Time complexity is O(1) when the running time does not depend on the input size: there are no loops whose iteration count varies with the input, and no input-dependent recursion or function calls. A loop with a fixed number of iterations still counts as O(1):

```
for(int i = 0; i < 25; i++){
    // statements
}
```

**O(N)**

Time complexity is O(N) when the loop variable is incremented or decremented by a constant amount, so the loop body runs about N times.

```
for(int i = n; i > 0; i--){
    // statements
}
```
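A Python sketch of the same countdown loop, with an iteration counter added to confirm it runs exactly n times:

```python
def countdown_steps(n):
    steps = 0
    i = n
    while i > 0:   # mirrors for(int i = n; i > 0; i--)
        steps += 1
        i -= 1
    return steps

print(countdown_steps(100))  # 100 -- the work scales linearly with n
```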

**O(N^k)**

Time complexity is O(N^k) (k = 2, 3, ...) for nested loops, and it is determined by how many times the innermost statement executes. For two loops that each run n times, that is n * n = n^2.

```
for(int i = 0; i < n; i++){
    for(int j = 0; j < n; j++){
        // statements
    }
}
```
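Counting how often the innermost statement runs (a Python sketch) confirms the n * n = n^2 behaviour:

```python
def inner_executions(n):
    count = 0
    for i in range(n):
        for j in range(n):
            count += 1   # the innermost statement
    return count

print(inner_executions(10))  # 100, i.e. 10^2
```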

**O(logN)**

Time complexity is O(logN) when the loop variable is divided or multiplied by a constant c > 1 on each iteration.

```
// i starts at 1, not 0 -- otherwise i * c would stay 0 forever
for(int i = 1; i < n; i = i * c){
    // statements
}
```
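For the loop to make progress, i must start at 1 rather than 0 (otherwise i * c stays 0). A Python sketch with i starting at 1 and c = 2 shows the iteration count grows like log2(n):

```python
import math

def multiplicative_steps(n, c=2):
    steps = 0
    i = 1
    while i < n:      # mirrors for(int i = 1; i < n; i = i * c)
        i *= c
        steps += 1
    return steps

for n in (16, 1024, 1_000_000):
    # the measured step count tracks log2(n)
    print(n, multiplicative_steps(n), round(math.log2(n)))
```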

### Big-O cheatsheet

From best (slowest-growing) to worst (fastest-growing), the common complexities rank as:

O(1) < O(logN) < O(N) < O(NlogN) < O(N^2) < O(2^N) < O(N!)

### Do we consider time complexity of in-built functions?

Yes, we do consider the time complexity of in-built functions that affect the running time. For example, `Collections.sort()` in Java uses a merge sort variant and has a complexity of O(NlogN), and `sort()` in Python uses Timsort, which is also O(NlogN).
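This matters most when a built-in is called inside a loop. In this Python sketch (the function is illustrative), calling `sorted()` once per iteration of an n-step loop turns O(NlogN) work into roughly O(N^2 logN) overall:

```python
def sort_each_prefix(data):
    # sorted() is O(k log k) for a list of length k; calling it
    # once per prefix makes the whole function about O(N^2 log N)
    results = []
    for k in range(1, len(data) + 1):
        results.append(sorted(data[:k]))
    return results

print(sort_each_prefix([3, 1, 2]))  # [[3], [1, 3], [1, 2, 3]]
```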

### Time Complexity of conditional statements

```
// pseudocode
input n
if n < 7
    print "n is less than 7"
else
    for i = 0 to n
        print i
```

Observe the code above: if the input is less than 7, we execute only one print statement, so the time complexity is O(1). If the input is 7 or more, there is a for loop that executes n times, so the complexity is O(N). Therefore, the best-case time complexity is O(1) and the worst case is O(N).
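A runnable Python version of the pseudocode, with a step counter added so the best case (one step) and worst case (n steps) are visible:

```python
def describe(n):
    steps = 0
    if n < 7:
        steps += 1          # best case: one print, O(1)
        print("n is less than 7")
    else:
        for i in range(n):  # worst case: n prints, O(N)
            steps += 1
            print(i)
    return steps

print(describe(3))   # best case: 1 step
print(describe(10))  # worst case: 10 steps
```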

That's it. Now you can find the time complexity of code on your own!