Hacker News

I think you're getting tripped up on what the analysis is. It's a worst-case analysis. That is, at every step of the way, you're figuring out the worst possible behavior. Hence, you won't ever fall into the trap you're worried about. When doing worst-case analysis, if we say "this section of code is n^2" and "this section of code is n^3", those bounds will continue to hold as n grows.
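A minimal sketch of the kind of worst-case bound being described: a quicksort that always pivots on the first element does a quadratic number of comparisons on already-sorted input, no matter how rare such input is in practice. (The counting helper here is illustrative, not any particular library's API.)

```python
def quicksort_comparisons(xs):
    """Count comparisons made by a quicksort that pivots on the first element."""
    if len(xs) <= 1:
        return 0
    pivot = xs[0]
    rest = xs[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    # len(rest) comparisons to partition, plus the recursive work
    return len(rest) + quicksort_comparisons(left) + quicksort_comparisons(right)

# Already-sorted input is this pivot rule's worst case: every partition is
# maximally unbalanced, so comparisons total n(n-1)/2 -- growing quadratically.
for n in (100, 200, 400):
    print(n, quicksort_comparisons(list(range(n))))
```

Quadrupling n multiplies the comparison count by roughly 16, which is the n^2 signature the worst-case bound predicts.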

You're probably thinking not of worst-case analysis (which is what is typically done for Big O), but of average-case analysis. That is, you're not asking about the worst possible running time, but about the typical running time on average. That requires a more complicated analysis. Specifically, we have to start giving probability distributions to our inputs. In worst-case analysis, reasoning about the inputs is easy: just figure out which inputs give us the worst performance, no matter how rare those inputs may be. In average-case analysis, we have to start figuring out what the likely inputs will be. That requires more work and more assumptions. Typically, you do average-case analysis with a particular problem domain in mind - otherwise, you can't reason about which kinds of inputs are frequent or rare.
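A sketch of the distinction, using linear search so the distributional assumption is explicit. The worst case needs no assumption about inputs; the average case only makes sense once we assume a distribution (here, hypothetically, that every position is equally likely to hold the target):

```python
import random

def linear_search(xs, target):
    """Return (index, comparison count); index is -1 if target is absent."""
    comparisons = 0
    for i, x in enumerate(xs):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

n = 1000
xs = list(range(n))

# Worst case: the target is absent, so all n elements are examined.
# This bound holds regardless of how inputs are distributed.
_, worst = linear_search(xs, -1)

# Average case: ASSUME every position is equally likely to hold the
# target. Under that assumption, expected comparisons are (n + 1) / 2.
trials = 10_000
total = sum(linear_search(xs, random.randrange(n))[1] for _ in range(trials))
avg = total / trials
print(worst, avg)
```

Change the input distribution (say, targets clustered near the front) and the average changes, while the worst case stays n - which is the extra modeling burden the comment describes.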


