## Lecture 17 - 02/27

### Asymptotics 1

- We want to quantify program execution costs
- We express this symbolically as the number of commands the program executes, written as a function of N, where N is the size of some input to the program
- Common runtime functions, in order from slowest- to fastest-growing: log N, N, N log N, N^2, N^3, ..., 2^N, 3^N, ..., N!
- We compare runtime functions by their rate of growth: we care about performance at scale (very large N), and we prefer programs whose runtime grows as slowly as possible
- Since we are looking at performance at scale, we drop constants, coefficients, and lower-order terms from runtime functions (so a runtime function of 3N^2 + 4N + 7 simplifies to N^2)
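A quick numerical sketch of why dropping lower-order terms is justified: the ratio of the full runtime function to its simplified form settles toward a constant as N grows, so at scale the two grow at the same rate.

```python
def R(N):
    # Full runtime function from the example above
    return 3 * N**2 + 4 * N + 7

def f(N):
    # Simplified order of growth: constants and lower-order terms dropped
    return N**2

for N in [10, 1_000, 100_000]:
    # The ratio approaches the constant 3 as N grows, meaning the
    # 4N + 7 part becomes negligible and R(N) grows like N^2.
    print(N, R(N) / f(N))
```

At N = 10 the ratio is 3.47; by N = 100,000 it is within a few hundred-thousandths of 3, so only the N^2 term matters at scale.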

#### Order of Growth

- O(f(N)) (Big O): f bounds the runtime from above (runtime grows no faster than f)
- Ω(f(N)) (Big Omega): f bounds the runtime from below (runtime grows at least as fast as f)
- Θ(f(N)) (Big Theta): f bounds the runtime both above and below (tight bound: runtime grows exactly as fast as f)
- For a problem of size N, let the true runtime be R(N)
- Suppose R(N) ∈ Θ(f(N))
- This means there exist positive constants k1, k2 such that k1⋅f(N) ≤ R(N) ≤ k2⋅f(N) for all sufficiently large N
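The sandwich definition above can be checked numerically. This sketch uses R(N) = 3N^2 + 4N + 7 with f(N) = N^2; the constants k1 = 3, k2 = 4 are one valid choice (an assumption for illustration, not the only choice), and the bounds only need to hold once N is large enough.

```python
def R(N):
    # True runtime function (from the earlier example)
    return 3 * N**2 + 4 * N + 7

def f(N):
    # Candidate tight bound: R(N) is in Theta(N^2)
    return N**2

k1, k2 = 3, 4  # one choice of constants that works for N >= 6

# Verify k1*f(N) <= R(N) <= k2*f(N) over a range of sufficiently large N
for N in range(6, 10_000):
    assert k1 * f(N) <= R(N) <= k2 * f(N)

print("sandwich holds for all tested N >= 6")
```

Note that the upper bound fails for small N (at N = 5, R(5) = 102 > 4⋅25 = 100), which is why the definition only requires the inequalities for sufficiently large N.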

- In this class we use Big Theta notation applied to a program's worst-case runtime
- Big Theta tells us the most about a program's runtime and gives the strongest statements, since it pins down both an upper and a lower bound
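To make "worst case" concrete, here is a sketch of a hypothetical duplicate-checking function (`has_duplicate` is an illustrative name, not from the lecture) whose runtime depends on its input: in the best case it exits after one comparison, but in the worst case (no duplicates) both loops run fully, so the worst-case runtime is Θ(N^2).

```python
def has_duplicate(items):
    """Return (found, comparison_count) for a list of items."""
    comparisons = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            comparisons += 1
            if items[i] == items[j]:
                # Best case: a duplicate in the first pair -> constant work
                return True, comparisons
    # Worst case: all distinct -> N*(N-1)/2 comparisons, i.e. Theta(N^2)
    return False, comparisons

# Worst case: 100 distinct elements -> 100*99/2 = 4950 comparisons
print(has_duplicate(list(range(100))))   # (False, 4950)
# Best case: duplicate up front -> 1 comparison
print(has_duplicate([5, 5, 1, 2, 3]))    # (True, 1)
```

The gap between the two cases is why we must say *which* case a bound describes: this function is Θ(N^2) in the worst case even though some inputs finish in constant time.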