Hacker News

Your comment is incomprehensible.


I'll take another stab at it.

OP's critique seems to be that TikTok kept them engaged with their preferred content for longer than they intended. That's not meaningfully different from what happened back in the day when people went down Wikipedia rabbit holes. "Digital crack" is hyperbole if all it means is spending longer on the topics one already likes.

My question: is the TikTok algorithm consistently steering users toward, or away from, categories of content they would otherwise have been interested in? E.g., are there pro-unionization users who accidentally spend 2 hours watching pro-unionization TikToks? Or is the recommendation engine steering those users toward 2 hours of some less contentious category instead?


You're talking to a transformer language model.


Really? How do you know? I had a look at their other posts and it doesn't seem like it to me.



