
Thanks for your comment. You make some good points. Zuckerberg's comment about the incentives of leaking research is certainly worthy of consideration. And while I don't have first hand experience with Brexit, I do not mean to claim that the disagreements were caused by FB. Only that FB may have had a role in causing people to become more entrenched in their positions.

One of the points I'm making is that Zuckerberg's statement lacks specifics in the form of numbers and data. I think it'd be interesting to read a point-by-point rhetorical analysis of his statement.

And yes, for the same reason, I don't know how much Facebook spends on research. I agree that money and research quality are likely correlated, though it's hard to say how strongly. That said, I care far more about the values of the company, and Haugen's testimony paints a textbook picture of a values problem. She has repeatedly said, under oath, that Facebook understaffs its safety and security teams, that it turned off safety and integrity protections after the election, and more.

It's also true that civic divisions in the US -- not to mention other social problems -- run much deeper than Facebook. One mechanism people like me are concerned about is how users are recommended people to follow, or content, in ways that either deepen division or lead them to a more extreme version of their views. In her testimony, Haugen gave the example of how indicating an interest in healthier eating on IG can lead to recommendations of anorexia / eating disorder content. Saying that Facebook's engagement-based ranking has nothing to do with promoting civic divisions seems to me like saying that the YouTube recommendation algorithm a few years back had nothing to do with the rise of the modern flat earth movement. Researchers have evidence that it did [0].

As for ethnic conflict in Ethiopia, I only bring it up because of Haugen's testimony. As this Guardian article puts it, "Haugen warned that Facebook was 'literally fanning ethnic violence' in places such as Ethiopia because it was not policing its service adequately outside the US." [1]. Your comment does make me wonder how many people in Ethiopia have access to the internet though.

This is a slight tangent, but it's also worth mentioning, re: IG and mental health, that we don't know about other research, such as any further attempts at a causal study -- most of what's been cited is correlational and comes from small-sample interviews. So it would be nice to see larger and more rigorous studies. I don't believe that research should stop with the question "Is Instagram harmful?" Of course that's going to have a mixed answer when dealing with large masses of people. "Who is susceptible to being harmed?", "By what mechanisms is IG harmful to some people?" etc. are questions that need answers.

I also disagree that people are so biased against FB/IG that anything they do will be seen in a bad light. Were they to tweak the IG recommendation algorithm so that an interest in healthier eating did not lead to anorexia content, people like myself would applaud. And though I am not an activist, I'm generally interested in (the enabling of) wholesome discussions and interactions, i.e. things that promote a feeling of being in a community / society rather than feeling apart from it.

[0] https://www.theguardian.com/science/2019/feb/17/study-blames... [1] https://www.theguardian.com/technology/2021/oct/07/facebooks...



I think part of the disagreement here is that you see a whistleblower, but I see an activist. One who, frankly, I would have fired if I were Zuck, or simply never hired in the first place.

Arguing that Facebook causes tribal conflict in Ethiopia by not "policing aggressively enough" or "understaffing" teams is not, to me, the argument of a whistleblower. It's the argument of someone who has totally lost perspective, of a totalitarian who believes that any and all of humanity's ills can be fixed by manipulating communication platforms. It's no different to saying "if the phone company cuts off any phone call in which people are arguing, there will be no more arguments and everyone will be happy". When phrased in terms of slightly older-gen tech, it is obviously absurd.

"Were they to tweak the IG recommendation algorithm so that an interest in healthier eating did not lead to anorexia content, people like myself would applaud"

Good on you for being consistent then! Sadly it seems to be very rare. Look at Zuck's post. He points out that Facebook did in fact make changes to prioritize stories from friends and family, even though that reduced their income and reduced the amount people used the site. In other words, a lot of users were actually people who don't care much about their cousin's cat pictures, but do care a lot about civics, or phrased another way, "divisive politics".

Yet it doesn't seem to have done them any good. For people like Haugen and a depressing number of HN posters it's not enough to re-rank nice safe family stories about new babies. For them Facebook also has to solve teenage depression, war in Africa and probably world hunger whilst they're at it. And if they aren't it's because they're "under-staffing" or refusing to "adequately police" things.


My perception is that people aren't expecting Facebook to solve teenage depression, but to stop contributing to it if they are. FB's research has been criticized by scientists as being of poor quality [0], and Zuckerberg claims the findings were cherry-picked. This is actually good news for FB if true. Should they partner with neutral, third-party university research teams and commit to a transparent investigation, they'd be able to clear things up. Not everyone would agree, but I believe that many people are capable of changing their minds when presented with new evidence.

The metaphor of a phone company cutting off an argument is an interesting one. I agree that people arguing is a fact of society / nature, and I also agree that cutting off a phone call seems like an absurd way to try and solve the larger problem. But at the same time, I don't think the metaphor fully applies, for the following reasons:

First, a phone call is one-to-one communication, while Facebook is one-to-many. It's rare, if not unheard of, for strangers to call each other to say what they think about, for example, a NYT article. Second, a phone call has no recommendation system pushing "engaging" subjects, where engagement can be defined in terms of how controversial a subject is. Third, only 9% of FB users speak English, and Haugen testified that the safety features, tweaks to the ranking algorithm, and tooling are not as good (potentially drastically worse?) in non-English languages.
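To make the second point concrete, here is a deliberately simplified sketch of what "engagement-based ranking" means as a mechanism. This is a toy illustration, not Facebook's actual system; the posts, weights, and scoring function are all hypothetical. The point is only that if a feed is sorted by predicted interactions, and content that provokes strong reactions gets more interactions, that content rises to the top:

```python
# Toy sketch of engagement-based ranking (hypothetical weights,
# NOT Facebook's real algorithm). Content that draws many comments
# and shares -- often driven by disagreement -- outranks quieter
# content, even if the quieter content gets more likes.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    # Assumed weights: comments and shares count more than likes,
    # since they signal stronger (and often angrier) engagement.
    return post.likes * 1.0 + post.comments * 5.0 + post.shares * 10.0

def rank_feed(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Cousin's cat pictures", likes=40, comments=2, shares=1),
    Post("Divisive political take", likes=10, comments=30, shares=12),
])
print([p.title for p in feed])
# The divisive post scores 280 vs. 60, so it ranks first despite
# having a quarter of the likes.
```

Nothing in a phone network plays this sorting role, which is why the phone-call metaphor breaks down here.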

Most people would argue that phone companies have some responsibility to prevent spam calls, similar to how email services prevent or flag spam emails. These are network-level actions, and a lot of Haugen's testimony was about how FB was being irresponsible in this regard.

[0] https://unherd.com/2021/09/facebooks-bad-science/




