I got curious about what actually counts as a blocker in terms of time complexity in Python. I've always known that smaller complexity is better, but the advice is never quantitative. How bad is n^2? How bad is n^n? How much of an improvement is sqrt(n) or log(n)? What is outrageous to write and an absolute no-go?
Here are some quantitative answers: on my old MacBook Pro, this is what takes 1 second to run in Python.
Code to Execute
```python
def fn(n):
    # swap n ** 2 for the complexity being measured: n, n ** 3, 2 ** n, ...
    for _ in range(n ** 2):
        pass
```
How big n needs to be for fn to take 1 sec:
| Name | fn | n for 1 second |
|------|------|------|
| ? | n^n | between 7 and 8 |
| Factorial | n! | between 9 and 10 |
| Exponential | 2^n | 21 |
| Cubic | n^3 | 130 |
| Quadratic | n^2 | 1,500 |
| Log Linear | n log n | 200,000 |
| Linear | n | 2,500,000 |
| Logarithmic | log n | 10^1,000,000 |
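
If you want to reproduce the table on your own machine, a minimal sketch could look like the following (the expressions mirror the table above; the 1-second cutoffs you get will of course depend on your hardware and Python version):

```python
import math
import time

# Iteration counts for each complexity class, mirroring the table above.
COMPLEXITIES = {
    "n^n": lambda n: n ** n,
    "n!": lambda n: math.factorial(n),
    "2^n": lambda n: 2 ** n,
    "n^3": lambda n: n ** 3,
    "n^2": lambda n: n ** 2,
    "n log n": lambda n: int(n * math.log2(n)),
    "n": lambda n: n,
}

def time_loop(steps):
    """Time an empty loop of `steps` iterations, in seconds."""
    start = time.perf_counter()
    for _ in range(steps):
        pass
    return time.perf_counter() - start

# Example: the quadratic row, n = 1500 -> 2,250,000 iterations.
print(f"n^2 at n=1500: {time_loop(COMPLEXITIES['n^2'](1500)):.2f} s")
```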
So yeah, on my poor little laptop, even plain O(n) is not exactly a walk in the park, not to mention anything above it.
It's very rare to encounter something that's actually n!, but a budget of about 2 million executions per second, which is what caps O(2^n) at n = 21 and O(n^2) at n = 1,500, is a pretty quantitative way to measure how efficient something is.
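
As a quick sanity check, plugging the table's cutoffs back in shows they all land near that same ~2 million iteration budget (back-of-the-envelope numbers, not precise measurements):

```python
# Each cutoff from the table works out to roughly what one second buys:
print(2 ** 21)     # 2,097,152  (exponential at n = 21)
print(1500 ** 2)   # 2,250,000  (quadratic at n = 1,500)
print(2_500_000)   # linear is the budget itself, by definition
```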
