It’s hard not to look at this performance and conclude that most people really don’t care about privacy. As upsetting as the thought may be to some, Mark’s hypothesis that privacy is no longer a social norm we value seems to be proven more and more right. Despite all the scandals and hearings, their DAU exceeds the population of the largest countries on Earth. Short of serious legislation, what slows them down? Negative press seems to be nothing but pebbles thrown at a steamroller...
Since there's no real-life downside to FB using your data to show you more relevant ads, people are making the rational trade-off. Privacy issues with FB are ideological in nature.
There is research being done on targeted advertising, and apart from the obvious effects of altering your behaviour and your attention, research has shown that it can go as far as changing self-perception.[1]
So to claim that there is no downside to targeted marketing is simply false. We have very little understanding of what sort of effects it has on consumers, and as far as we know they're not all good.
There is no reason to believe any deliberation on the part of consumers here is rational. The relationship between self-harm and social media usage in adolescents is well documented as well.
So it is quite ironic to assert that being concerned about Facebook's behaviour is 'ideological', when in fact advocates seem to willfully ignore evidence suggesting that we're playing dice with people's psychology here.
Fair enough, but it's up to each individual to learn about these details and then decide if they still want to use Facebook. The media is doing its job reporting on this, and everyone else can learn from it and decide that hey, maybe using Facebook products is not worth the harm. Or maybe they are fine with it, like the relevant ads, and continue to use Facebook.
If social media makes you unhappy just stop using it.
>If social media makes you unhappy just stop using it.
This answer is far too simplistic for the reality of the situation.
For example, what of the people who never used it (or did stop using it) and their correlated shadow profiles?[0] What is the supposed answer, then: Don't give your contact information to your friends? Don't use the internet, at all, because of Facebook Pixel?[1]
...but the bulk of the problem, for me specifically, is that treasure troves of information on people are the ripest targets for either exploitation or outright theft. See the OPM hack[2] for a prime example of such theft, and then look at the Snowden leaks, where he shows that <insert three-lettered agency here> was in major tech companies' bases, killing their do0dz.[3]
The potential problems (and their requisite solutions) aren't as simple as you're trying to paint them to be.
To summarise this long diatribe: I don't pretend to have an answer, to be sure, but saying that stopping Facebook usage is sufficient is disingenuous about the realities of what the company does; especially since data about you is still being collected anyway.
If you attribute so little agency to people that becoming a heroin addict becomes someone else's responsibility (barring physical addiction in the womb or being shot up at gunpoint repeatedly) then there's little in this world that we can control anyways.
Your conclusion is entirely consistent with the facts. There is little in this world that we can control. It's really important to stop the bad actors from taking the little away from us.
Curiously, what is that little we can control? One could come up with an argument about how we control exactly zero in life. However even if it's true in a way, it might not be so useful to believe that since having that belief will lead to worse decisions (and a lot of psychology research shows that a belief that you don't have control is highly correlated with depression).
What are you claiming the downside to targeted advertising is? A change in self perception from a single (dubious) study doesn't suggest negative effects.
Why do I say the study was dubious? They showed a small number of undergraduates a luxury watch ad and told them it was being shown to them based on their behavior. Those participants then rated themselves more sophisticated. Of course if someone frames it like that they would say they were sophisticated. What is the negative there?
Most people using FB aren't thinking that though, they're just silently ignoring the ads on the side of the content they actually want.
>Those participants then rated themselves more sophisticated. Of course if someone frames it like that they would say they were sophisticated. What is the negative there?
An altered self-image as the result of merely staring at a luxury watch advertisement is more than a little concerning. If psychological self-assessment changes even in the context of a small study, what do hours of this stuff per day do to the human brain?
The precautionary principle implies that we shouldn't run an unsupervised experiment on the psyche of two billion people on the planet which primarily serves the purpose of funneling money to Facebook.
No obvious downside for now. Discussions about Facebook are usually shortsighted, looking at timescales that are too short.
Large detrimental downstream effects, and shifts in people's perceptions and trends, all happen on timescales on the order of a decade. Give it a few years.
This isn't about ideology, it is about pragmatics. The accumulation and analysis of vast bodies of behavioural data by bad actors will inevitably lead them to develop and deploy increasingly indirect and long-term methods of subverting individual agency, to their own benefit.
How about subtle data-directed lobbying to cause subtle data-directed changes to the educational system to create subtle data-directed vulnerabilities to manipulation in adulthood?
There are plenty of examples in the real world, throughout history. It is going to get a whole lot worse, and it has the potential to never get better.
Facebook is creepy. They engage in deep cloaked surveillance, capturing as much as technologically possible of people's behavior in the world and mining it for profit. They don't want the people to know how and to what extent they're mining their personal lives, because they know people would be disgusted by it.
IMO the core reason people keep using this shady service is that they've just never been exposed to the truth.
It's not that they don't care about privacy; a number of people do. As others have said, their bedroom windows likely have curtains, and they likely close their bathroom and bedroom doors for privacy.
The issue is that they either don't truly understand the technical implications, or they are already so personally invested in Facebook, with no alternative to migrate to in sight (Facebook bought Instagram, which could have been one), that they have no recourse currently. I hear it all the time here about events being vital or just connecting with family.
Well, you're the one pulling the word out of the hat. "Not truly understanding the technical implications" does not equal dumb.
As an anecdote, my father is a non-dumb person who at some point started using Facebook out of a combination of curiosity and a feeling of missing out. He can use a computer proficiently for the things he needs and is quite adept at learning new use cases on his own. Yet he is far from a technological expert with a good overview of the whole picture. Once I explained it to him over the course of several conversations, he was concerned and stopped using it.
I share your fears, but I do have some doubts about their metrics (see the link to a chart in another thread).
But maybe (just dumping some thoughts), after the scandals, DAU and MAU might no longer be as relevant for Facebook's data collection as they were in the past. Users who are now (supposedly) more aware that their data is being shared and actively used for not only good purposes might keep using the site only for trivial posts (e.g. "I bought a bike today" instead of "I support/hate the president blahblah"), or might make AI/aggregation more challenging by using irony (an algorithm would read it literally, while the human interpretation would actually be the opposite), etc. But maybe I'm just a desperate, deluded optimist :)
Alternatively, people do care about privacy, but they care more about the value that FB provides to them. And part of that value can be due to customer lock-in (customers want social networks where the people they interact with are).
Well, though privacy is kind of important to most people, what alternatives do people have (other than stopping all social media use)? I don't think the problems with Facebook are unique to it at all: data leaks, fake news, whatever, these are intrinsically hard problems to solve, and frankly I don't think a lot of companies would be doing much better.
I don't think that's true. People clearly care about privacy otherwise we wouldn't have curtains and account access controls etc.
They just aren't worried enough about Facebook violating their privacy to stop using it. Frankly, for most people that is the right decision. Facebook has done much less immoral stuff than the media would have you believe. Even the Cambridge Analytica type stuff (i.e. the obvious potential for abuse of the Friends API) was totally public at the time, but nobody cared.
In the beginning, FB was college students only, then high school... Then everyone.
There was a FB group with a million users protesting the opening up of FB to their moms and grandmas.
Some friends and I created an alternative social network at our university to capitalize. In my opinion it looked better and had more features (before they opened up their API). We were college students only, and had SSL (can you imagine back then sites without SSL?!).
We were at the first techcrunch 50. We were located in "the pit".
We didn't gain much coverage, we didn't get traction at our university, and, well, we flopped.
I talked to someone this morning who knew that people were vaguely worried about Facebook but was utterly horrified when I took the time to explain facebook/google and android/ios from a “privacy viewpoints” perspective. This is something the press doesn’t much do. It’s effective. Too bad :(
Who is "we"? You and I? Or "most people", the "average person", some abstract thing that doesn't even exist, determined by numbers? To shorten it extremely: Nobody has the right to piss away the future of humanity. If they do, they simply cease to be relevant to me, by definition. I will gladly fight them, but not ever ask for their permission or advice. Resistance to systematic mass surveillance and what hangs off that is not just a mere "social norm" like whether short or long skirts are acceptable. This is way more complex and important than all of the light-weights that don't consider it combined. Just being alive and wanting to be left alone with whatever apathy one ended up with isn't anything in the intellectual or moral arena.
I don't think the conclusion that "most people really don't care about privacy" is proved by people still using Facebook.
Just because my friend buys a pack of cigarettes a day doesn't mean he doesn't care about cancer.
My view: Loss of privacy is a negative externality of using Facebook. And we know companies can use shiny stuff (compelling-engaging products, PR, advertising, etc.) to overcome these kinds of negative externalities and keep customers coming back.
I do think that your point about serious legislation is correct. Cigarette smoking in countries with serious legislation IS down; the question is whether the political will is there and whether Facebook's power can outweigh that political will.
Facebook's earnings are up because advertisers are spending more.
Every court case, summons, and data breach is an instruction manual and tool for advertisers. These are public and much more useful than an outdated Udemy course selling shovels.
Advertisers will continue spending more after each scandal, and the ads will be better targeted, making Facebook more useful to users at the same time. Any proposed legislation ought to make advertisers spend more before the gravy train is over.
Facebook's share price increase can't be seen as validation from its profile creators, who are the product. These numbers didn't jump; they offset and stayed the same.
It is good that people don't care about privacy. It means they are no longer afraid of being seen, that they have nothing to hide from the world, and in a way they are moving toward a more liberating existence.
Keeping so many aspects of your life private comes with a price. There is inconvenience, there is overhead, and there is always fear that one day your private assets will be compromised and laid bare.
The less private you are, the more you have to share, and the more you have to share, the richer and more meaningful your experiences will be: with other people, and even with other businesses.
But if you are very private, you have little to share. You miss out on the social conversation of humanity, and businesses will treat you as another generic faceless entity, throwing whatever crap they can at you hoping it sticks. Why would anyone want that?
I would put a slightly different spin on this thought. Most people's problem is that no one is paying attention to them. We struggle to get people to notice us and listen to us. Even if it's only an algorithm, I think most people would think being looked at and considered is a feature not a bug.
> But if you are very private, you have little to share. You miss out on the social conversation of humanity, and businesses will treat you as another generic faceless entity, throwing whatever crap they can at you hoping it sticks.
And mass surveillance and data collection combined with blackbox machine learning techniques are somehow producing a different outcome?
I think it is very misguided to conclude from all of this that people don't need privacy. Most of what people use Facebook and friends for is private; they just haven't been seriously and obviously burned enough by the fact that a large number of corporate entities are siphoning it up.
Maybe they'll never get burned enough. Maybe it all turns out well. But concluding that people do not have privacy as a basic psychological need is wrong. I know I do, and I don't have a bad social life because of it.
It's not that they don't care about privacy; it's that they don't care for the page tracking and the targeted ads. And if you do want to stop it, you go into private mode.