
These systems have a staggering amount of content and are only profitable at scale. Moderating them in their current form with humans would be expensive and difficult, if not impossible.

https://www.statista.com/statistics/259477/hours-of-video-up...

> 500 hours of video are uploaded to YouTube every minute.

This is why Section 230 needs a revision IMO. There is zero incentive for anybody to moderate properly.



It would be OK if YouTube only moderated really popular channels and the videos with the most potential to cause harm. As always, the problem is scale: solutions that work for my 5-man IRC server do not scale to YouTube's size. A video with 5 views probably doesn't need manual attention, even if it is extremely misleading and dangerous. Once your "just jab high voltage electrodes into some wet wood, it's safe!" video hits a few million views, though, it should be up for manual review. You can absolutely have an algorithm decide whether a video touches any of several potentially dangerous topics and flag it for human review once it gets popular; I'm certain YouTube already does something like that for a myriad of other topics, they just need to improve their methods a bit. A sketch of that triage is below.
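
To make the idea concrete, here is a minimal sketch of that two-stage filter: a cheap popularity gate first, then a topic check (here faked with a keyword match standing in for a real classifier). Every name in it (Video, is_risky_topic, REVIEW_THRESHOLD, the keyword list) is hypothetical and does not reflect YouTube's actual pipeline.

    # Minimal triage sketch: only escalate to human review when a video is
    # both popular AND matches a risky-topic check. All names are hypothetical.
    from dataclasses import dataclass

    REVIEW_THRESHOLD = 1_000_000  # views before a risky topic earns human review
    RISKY_KEYWORDS = {"high voltage", "fractal wood burning", "medical advice"}

    @dataclass
    class Video:
        title: str
        description: str
        views: int

    def is_risky_topic(video: Video) -> bool:
        # Stand-in for a real classifier: naive keyword match on the metadata.
        text = f"{video.title} {video.description}".lower()
        return any(kw in text for kw in RISKY_KEYWORDS)

    def needs_manual_review(video: Video) -> bool:
        # Cheap check first (view count), model/topic check second.
        return video.views >= REVIEW_THRESHOLD and is_risky_topic(video)

    if __name__ == "__main__":
        v = Video(
            title="Just jab high voltage electrodes into wet wood, it's safe!",
            description="easy fractal wood burning tutorial",
            views=3_200_000,
        )
        print(needs_manual_review(v))  # True: popular and matches a risky topic

The point of ordering the checks this way is cost: the view-count gate filters out the long tail of 5-view videos for free, so the expensive classification and human review only ever touch the tiny fraction of uploads that can actually cause harm at scale.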



