
Since it's a cache, I don't think losing an HDD and some cache capacity is a big problem. I would expect it to be more efficient to use as much storage as possible, even if an HDD will eventually fail.


I agree the cache nature makes it easier, but consider that a 3 TB drive, at 2 GB per 'movie', holds 1,500 movies. So when you lose a drive, if the only local copy of that content was on that drive, you need to seamlessly switch from serving it off the drive to pulling it down the backhaul network. I can think of a number of ways [1] that might be done with varying degrees of 'hiccup' on the user end, but it is a big chunk of cache to lose at once. Note that 3 TB is also roughly 40 minutes of a fully saturated 10 GbE pipe. But as you say, it isn't as if you lose data, since it's still there back at the head office; you only lose that copy of it.
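The back-of-the-envelope numbers above can be checked with a quick calculation (a sketch; the 2 GB-per-movie figure and decimal TB/Gb units are assumptions from the comment):

```python
# Rough math for losing a 3 TB cache drive (decimal units assumed).
DRIVE_BYTES = 3e12   # 3 TB drive
MOVIE_BYTES = 2e9    # ~2 GB per 'movie'
LINK_BPS = 10e9      # 10 GbE, fully saturated

movies = DRIVE_BYTES / MOVIE_BYTES            # titles lost with the drive
refill_seconds = DRIVE_BYTES * 8 / LINK_BPS   # bytes -> bits over the wire
print(f"{movies:.0f} movies, refill in {refill_seconds / 60:.0f} minutes")
# 1500 movies, refill in 40 minutes
```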

[1] Ways that come to mind: multiple boxes with the content diversified among them so that another box can 'pick up' the stream; keeping a stream to the home office open but not pulling data until you lose the local source; or doing a 'buffer' to disk and streaming from the buffer, then switching disk targets when a disk fails.
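The common shape of those options is "local first, backhaul on failure." A minimal sketch of that failover (the helper names `read_local` and `read_backhaul` are hypothetical stand-ins, not any real CDN's API):

```python
# Sketch: serve from the local cache drive; if that read fails,
# fall back to pulling the same byte range over the backhaul.

def read_local(title: str, offset: int) -> bytes:
    # Stand-in for a read from the local cache drive; here it
    # simulates a failed drive by raising OSError.
    raise OSError("drive failed")

def read_backhaul(title: str, offset: int) -> bytes:
    # Stand-in for fetching the same chunk from the head office.
    return b"chunk-from-origin"

def serve_chunk(title: str, offset: int) -> bytes:
    """Serve one chunk, hiding a failed drive behind the backhaul."""
    try:
        return read_local(title, offset)
    except OSError:
        # Local copy gone: switch streams mid-playback; the viewer
        # sees at most a brief 'hiccup' while the origin catches up.
        return read_backhaul(title, offset)

print(serve_chunk("movie-123", 0))  # falls back to the backhaul
```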




