# Misc Complexity Tricks

Tags: CS161, Complexity, Examples

# Some important things

If $f(x)$ is $O(g(x))$ and it is not $\Theta(g(x))$, then it is also the case that $k^{f(x)} = O(k^{g(x)})$ (for any constant $k > 1$).

If $f(x) = \Omega(g(x))$ and it is not $\Theta(g(x))$, then $k^{f(x)} = \Omega(k^{g(x)})$ (again for $k > 1$). This is because $f(x)$ grows faster than $g(x)$, and not just by a constant factor (which can mess things up).

And more generally, this holds for any monotonically increasing function composed with $f, g$: in the strict case, eventually $f(x) \leq g(x)$, so monotonicity gives $h(f(x)) \leq h(g(x))$.

However, the difficulty comes when $f(x) = \Theta(g(x))$. When this is the case, you can’t say anything in general about the relationship between $k^{f(x)}$ and $k^{g(x)}$: it all depends on the constant factor between $f, g$.
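As a concrete example (my own, not from the notes): take $f(x) = 2x$ and $g(x) = x$, so $f = \Theta(g)$, yet for $k > 1$

$$k^{f(x)} = k^{2x} = (k^2)^x \neq O(k^x)$$

while $f(x) = x + 5$ (also $\Theta(x)$) gives $k^{x+5} = k^5 \cdot k^x = \Theta(k^x)$. A constant *factor* in the exponent changes the asymptotics; a constant *term* does not.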

In general, when checking whether something satisfies a certain complexity bound, you should look for counterexamples. For $O()$, find cases in which we go above the bound, and for $\Omega()$, find cases in which we go below; a counterexample in either direction disproves $\Theta$.

# Chaining bounds

Say that you wanted to prove that $f(n) = O(g(n))$. You could prove $f(n) \leq cg(n)$ directly, but sometimes that’s hard. Sometimes, you can prove that $f(n) \leq h(n)$, and then prove that $h(n) \leq cg(n)$. Chaining the two inequalities proves that $f(n) \leq cg(n)$.

You can do this same trick with $f(n) = \Omega(g(n))$, but with the inequalities flipped: show that $f(n) \geq h(n)$ and $h(n) \geq cg(n)$.
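To make the chaining concrete, here’s a small worked example (my own choice of functions): to show $n^2 + n\log n = O(n^2)$, chain through $h(n) = 2n^2$. For $n \geq 1$,

$$n^2 + n\log n \leq n^2 + n \cdot n = 2n^2 \leq cn^2 \quad \text{with } c = 2$$

so $f(n) \leq cg(n)$ without ever having to bound the $n\log n$ term tightly.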

# The “distillation” idea

When talking about big O, theta, or omega notation, it’s often accurate to just consider the largest term. So $4n^2$ will be equivalent to $n^2$, etc. However, this breaks down in a few key cases, such as constants sitting inside exponents (see the exponents section below).

# Variable substitution

Sometimes it’s easier to compare after substituting variables. For example, when proving that $\log^7 n \leq c n^{0.7}$, we log both sides and let $k = \log n$, where it becomes obvious that $7\log k \leq 0.7 k + \log c$ holds for large enough $k$.
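A quick numeric sanity check of the substituted inequality (the cutoff $k \geq 50$ is just a value I found by inspection, taking $c = 1$):

```python
import math

def substituted_holds(k, c=1.0):
    """Check 7*log(k) <= 0.7*k + log(c), the substituted form of
    log^7 n <= c * n^0.7 under k = log n."""
    return 7 * math.log(k) <= 0.7 * k + math.log(c)

# Holds comfortably once k is moderately large.
assert all(substituted_holds(k) for k in (50, 100, 500, 1000))
```

The substituted form is easy to eyeball; the original inequality only kicks in at astronomically large $n = e^k$.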

# Ratio test

If you’re not sure which one grows quicker, take the ratio $f / g$ and then take the limit as $n \to \infty$. This is known as the ratio test in sequences, and it does work. If the ratio tends to zero, then we say that $f = o(g)$ (yes, it’s lowercase o); if it tends to infinity, $f = \omega(g)$; and if it tends to a nonzero constant, $f = \Theta(g)$.
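A numeric sketch of the ratio test (the pair $f(n) = n\log n$, $g(n) = n^2$ is my own example):

```python
import math

def ratios(f, g, ns):
    """Evaluate f(n)/g(n) at increasing n to eyeball the limit."""
    return [f(n) / g(n) for n in ns]

# f = n log n, g = n^2: the ratio is (log n)/n, which shrinks
# toward 0, suggesting n log n = o(n^2).
rs = ratios(lambda n: n * math.log(n), lambda n: n * n, (10, 10**3, 10**6))
assert rs[0] > rs[1] > rs[2] > 0
```

This only suggests the limit, of course; for a proof you still evaluate the limit symbolically.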

# Floors and ceilings

With floors and ceilings, the best bet is to bound both cases: the worst case, where the floor/ceiling actually changes the value, and the best case, where it just passes through whatever is on the inside.
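The standard sandwich $x - 1 < \lfloor x \rfloor \leq x \leq \lceil x \rceil < x + 1$ makes this mechanical. A quick check (the $\lceil n/2 \rceil$ example is my own):

```python
import math

def ceil_half_is_theta_n(n):
    """ceil(n/2) is sandwiched: n/2 <= ceil(n/2) <= n/2 + 1,
    so ceil(n/2) = Theta(n) regardless of parity."""
    c = math.ceil(n / 2)
    return n / 2 <= c <= n / 2 + 1

assert all(ceil_half_is_theta_n(n) for n in range(1, 10**4))
```

Since both sandwich ends are $\Theta(n)$, the ceiling can be ignored asymptotically.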

# Exponents

In general, $k^n$ and $q^n$ have different asymptotic behavior when $k \neq q$. Formally, this is because if you take the log of both sides in your derivation of $O$ (that is, of $k^n \leq cq^n$), you get

$$n\log k \leq \log c + n\log q$$

The $\log c$ is no longer a slope, but rather an intercept, so it can’t make up for a difference in slopes. Therefore, if $k \leq q$, then our inequality is satisfied for large enough $n$. Otherwise, it is contradicted.

A couple of identities to note. Remember that

$$2^{n+c} = 2^c \cdot 2^n = \Theta(2^n)$$

and that

$$2^{cn} = (2^c)^n \neq \Theta(2^n) \quad \text{for any constant } c > 1$$

# Logs

We can derive from the ratio test that $\log^b n = o(n^a)$ for any constants $b$ and $a > 0$. This means that a polylogarithm grows slower than any polynomial function.
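A numeric illustration (constants $b = 7$, $a = 0.7$ borrowed from the substitution example above; the evaluation points are just values I tried):

```python
import math

def polylog_ratio(n, b=7.0, a=0.7):
    """Ratio log(n)^b / n^a; tends to 0 as n grows, but only
    at very large n when b is big relative to a."""
    return math.log(n) ** b / n ** a

# The polylog "wins" at moderate n, then collapses.
assert polylog_ratio(1e10) > 1
assert polylog_ratio(1e100) < 1e-50
```

This is a good reminder that asymptotic dominance can kick in far beyond any practical input size.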

## Interesting example with $n\log n$ vs $\sum_i \log i$

We can show that these two are actually related by $\Theta$. First, we show that $\sum_{i=1}^n \log i \leq n\log n$. This is trivial, because each $\log i \leq \log n$ and there are $n$ terms. As such, we conclude that $\sum_i \log i = O(n\log n)$.

Next, we unroll the summation: $\sum_i \log i = \log 1 + \dots + \log(n/2) + \dots + \log n$. Keeping only the top $n/2$ terms, each of which is at least $\log(n/2)$, gives $\sum_i \log i \geq (n/2)\log(n/2) = (n/2)(\log n - \log 2)$. For large enough $n$, this is at least $cn\log n$ for some constant $c > 0$, which means that $\sum_i \log i = \Omega(n\log n)$.

Finally, taken together, this means that $\sum_i \log i = \Theta(n \log n)$. This is a really nice trick!
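Both bounds from the argument above check out numerically:

```python
import math

def bounds_hold(n):
    """(n/2)*log(n/2) <= sum_{i=1}^n log(i) <= n*log(n)."""
    s = sum(math.log(i) for i in range(1, n + 1))
    return (n / 2) * math.log(n / 2) <= s <= n * math.log(n)

assert all(bounds_hold(n) for n in (10, 100, 1000))
```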

# Factorial

A weak upper bound is $n! \leq n^n$, but this isn’t fun. We can use Stirling’s approximation, which states that

$$n! = \sqrt{2\pi n}\left(\frac{n}{e}\right)^n \left(1 + \Theta\left(\frac{1}{n}\right)\right)$$

which means that with large enough $n$, we can disregard the additional $\Theta$ term.

From Stirling’s approximation, we get that $\log(n!) = \Theta(\log(n^n)) = \Theta(n\log n)$, which is very interesting.
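A quick check using `math.lgamma`, which computes $\log\Gamma(n+1) = \log(n!)$ without overflow: the ratio $\log(n!) / (n\log n)$ creeps toward 1, consistent with $\log(n!) = \Theta(n\log n)$.

```python
import math

def log_factorial_ratio(n):
    """log(n!) / (n log n), using lgamma(n + 1) = log(n!)."""
    return math.lgamma(n + 1) / (n * math.log(n))

# The ratio increases toward 1, but slowly: the -n term in
# Stirling contributes a -1/log(n) correction.
assert log_factorial_ratio(10**3) < log_factorial_ratio(10**6) < 1.0
```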