
Worst-case complexity matters when the input data can be manipulated by someone malicious, who can then intentionally engineer the degenerate worst case, as we have seen historically in denial-of-service attacks exploiting common hash table implementations with bad worst-case behavior.
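
For illustration, a minimal sketch of this kind of attack in Python. The BadKey class is hypothetical and stands in for attacker-controlled input whose hashes all collide; a real attack would instead craft strings that collide under the target's hash function:

    import time

    class BadKey:
        """Key whose hash is constant, forcing every instance
        into the same hash bucket."""
        def __init__(self, value):
            self.value = value
        def __hash__(self):
            return 0  # every BadKey collides with every other
        def __eq__(self, other):
            return isinstance(other, BadKey) and self.value == other.value

    def build(n, key_type):
        """Time inserting n keys of the given type into a dict."""
        start = time.perf_counter()
        {key_type(i): i for i in range(n)}
        return time.perf_counter() - start

    for n in (1_000, 2_000, 4_000):
        t_bad = build(n, BadKey)   # all keys collide
        t_good = build(n, int)     # ints hash well, near O(1) per insert
        print(f"n={n}: colliding={t_bad:.3f}s  normal={t_good:.5f}s")

With every key landing in the same bucket, each insert degenerates into a linear scan of the keys already present, so building the table takes quadratic time overall; doubling n roughly quadruples the colliding build time while the normal one stays flat.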


No, you're throwing away a random 50/50 selection. You would have to flood the algorithm with uniques or commons to steer it into a known probabilistic state.
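
A minimal sketch of such a 50/50 discard step, assuming the algorithm in question halves a buffer by independent coin flips (the function name halve_buffer and the buffer representation are hypothetical):

    import random

    def halve_buffer(buffer, rng=random):
        """Discard each buffered element independently with
        probability 1/2, using the RNG rather than anything
        derived from the element values themselves."""
        return [x for x in buffer if rng.random() < 0.5]

    buffer = list(range(100))
    print(len(halve_buffer(buffer)))  # ~50 on average, varies per run

Because the flips come from the RNG and not from the input values, an attacker's only lever is the composition of the stream; which particular elements survive a halving step stays unpredictable.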



