## Lecture 17 - 02/27

### Asymptotics 1

• We want to quantify program execution costs
• We express this symbolically as the number of commands executed by the program with respect to $N$, where $N$ is the size of some input to the program
• Common runtime functions, in order from smallest to largest: $\log N$, $N$, $N\log N$, $N^2$, $N^3$, ..., $2^N$, $3^N$, ..., $N!$
• We compare runtime functions by rate of growth - look at program performance at scale (for very large $N$); we want programs with the lowest possible runtime, i.e., the best performance at scale
• Since we are looking at performance at scale, we simplify constants, coefficients, and lower-order terms out of runtime functions (so a runtime function of $3N^2 + 4N + 7$ simplifies to $N^2$)
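This simplification can be checked numerically. A minimal sketch (the function name `runtime` and the sample sizes are illustrative, not from the lecture):

```python
def runtime(n):
    # Hypothetical runtime function 3N^2 + 4N + 7 from the example above
    return 3 * n**2 + 4 * n + 7

# At scale the N^2 term dominates: runtime(n) / n^2 approaches the
# coefficient 3, so the 4N and 7 terms become negligible.
for n in [10, 1_000, 1_000_000]:
    print(n, runtime(n) / n**2)
```

For $N = 10$ the lower-order terms still contribute noticeably, but by $N = 10^6$ the ratio is essentially the constant 3, which is exactly what dropping constants and lower-order terms assumes.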

#### Order of Growth

• $O(N)$ (Big O): function bounding the runtime from above (runtime is less than or equal to ...)
• $\Omega(N)$ (Big Omega): function bounding the runtime from below (runtime is greater than or equal to ...)
• $\Theta(N)$ (Big Theta): function bounding the runtime both above and below (a tight bound; runtime equals ...)
• For a problem of size $N$, let the true runtime be $R(N)$
• Suppose $R(N) = \Theta(f(N))$
• Then there exist positive constants $k_1, k_2$ such that $k_1\cdot f(N)\leq R(N)\leq k_2\cdot f(N)$ for all sufficiently large $N$
• In this class we use Big Theta notation to describe a program's worst-case runtime
• Big Theta tells us the most about a program's runtime: a tight bound is a stronger statement than an upper bound (Big O) or a lower bound (Big Omega) alone
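The constants $k_1, k_2$ in the Big Theta definition can be exhibited concretely for the earlier example $R(N) = 3N^2 + 4N + 7$. A sketch, assuming the choice $k_1 = 3$, $k_2 = 4$ (one of many valid pairs):

```python
def r(n):
    # True runtime R(N) = 3N^2 + 4N + 7
    return 3 * n**2 + 4 * n + 7

def f(n):
    # Candidate order of growth f(N) = N^2
    return n**2

# With k1 = 3 and k2 = 4, the sandwich k1*f(N) <= R(N) <= k2*f(N)
# holds for all N >= 6, witnessing R(N) = Theta(N^2).
k1, k2 = 3, 4
assert all(k1 * f(n) <= r(n) <= k2 * f(n) for n in range(6, 10_000))
```

The lower bound $3N^2 \leq 3N^2 + 4N + 7$ holds for every $N \geq 0$; the upper bound $3N^2 + 4N + 7 \leq 4N^2$ only kicks in once $N \geq 6$, which is why the definition requires the inequalities to hold for sufficiently large $N$, not all $N$.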
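As a worked example of counting commands, a doubly nested loop executes its inner body $N \cdot N$ times, so its runtime is $\Theta(N^2)$. A sketch (the helper `count_steps` is illustrative, not from the lecture):

```python
def count_steps(n):
    # Count how many times the innermost statement runs for input size n
    steps = 0
    for i in range(n):
        for j in range(n):
            steps += 1
    return steps

# The inner body runs exactly N * N times
print(count_steps(100))  # prints 10000
```

Here the count is exactly $N^2$ with no lower-order terms, so the simplification step is trivial; in real programs the count is usually a messier polynomial that simplifies to its dominant term.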