
This is only the tip of the iceberg. A majority of the "user content" (product reviews, brand engagement, comments on social media sites, etc.) is completely fake, bought and paid for by big-money interests, aka bots. It's no different from automated scans of the IPv4 address space: simple, unsophisticated, low-cost, and without any repercussions.
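The IPv4-scan comparison holds up on cost alone: the whole space is only 2**32 addresses, and enumerating a block takes a few lines of stock Python. A sketch using a reserved documentation range (no actual scanning performed):

```python
import ipaddress

# Enumerating a chunk of the IPv4 space is trivial: a /24 has 254 usable
# hosts, and the entire space is only 2**32 (~4.3 billion) addresses.
# 198.51.100.0/24 is TEST-NET-2, a range reserved for documentation.
block = ipaddress.ip_network("198.51.100.0/24")
addresses = [str(addr) for addr in block.hosts()]

print(len(addresses))   # 254 usable hosts (network/broadcast excluded)
print(addresses[0])     # 198.51.100.1
```

A real scanner would just attempt a TCP connect to each address in turn; tools like that cover the full address space in under an hour on commodity hardware, which is the "low-cost, no repercussions" point the comment is making.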


I keep getting this eerie feeling when seeing product reviews and the rest of the things you list here. I would never in a million years consider writing a paragraph-long glowing review of my car jack on Amazon, and yet there are millions of such interactions all over the web.


I was startled to recently discover that a local gastroenterologist has numerous excellent reviews on Google Maps by people talking about what a lovely colonoscopy they had, some of whom related the specific medical condition that brought them there. Never in a million years would I consider doing this either. It turns out people are pretty weird.


> I was startled to recently discover that a local gastroenterologist has numerous excellent reviews on Google Maps by people talking about what a lovely colonoscopy they had, some of whom related the specific medical condition that brought them there. Never in a million years would I consider doing this either. It turns out people are pretty weird.

I mean, you kinda are talking about visiting a colonoscopist on hacker news, so there's that.


For all we know, they may well have had an upper endoscopy.


Medicine is run by big business in the USA, I wouldn’t trust those reviews at all.


I've been burned to the tune of thousands of dollars because I took star ratings on Google at face value without looking into the actual reviews to see if they were real or not. I still feel like an idiot because of it.


Anonymity on the internet is such a double-edged sword, but it really is becoming a massive problem: you have no idea whether someone is a person or a bot. And now with LLMs, I fear fake review spam will reach the next level, where it'll be almost impossible to tell real from fake.


Is a paid fraudulent review written by a real person so much better than a LLM-generated one?


Simply put: limited liability, lobbying (political donations) and general ignorance of the subject matter by a majority of the population.


Limited liability only protects the "owners", not the employees, including the board.

So while piercing the LLC veil seems unreasonably hard, putting every individual involved in jail seems not unreasonable.


Yet reality defies your "reasonable" expectations.


Are there even IP cameras manufactured domestically to choose from? I have some inside my home (pointed at exterior doors; not inside bedrooms, which seems borderline creepy and/or sadistic), but they're not connected to public networks and thus I don't consider them a risk.


Not at consumer price levels, no.

There are companies that make security hardware here in the US, but the market is mostly very high-end government and military customers. And TBH they are mostly still using Chinese components, just assembling the product here in the US and marking the price up enough that they can justify calling it 'made in USA' by virtue of the value-add. Sometimes the software is written domestically, or at least audited, though.

You can avoid the worst Chinese-made hardware if you look for "NDAA compliant" rather than US domestic manufacture. NDAA compliance means that a product isn't made by a number of prohibited Chinese suppliers who are known to be very thoroughly compromised (as opposed to the average level of compromise that you should assume most companies in China have... but China is a big place, so that difference isn't nothing).

Axis and Bosch both have NDAA-compliant product lines.


I was giving an analogy. You wouldn't risk installing just any camera near your kids, and handing them a mobile phone with a camera carries a similar risk: it is capable of spying on them, via software that manipulates them all the time! Blue Whale was an example that drove people to suicide. In TikTok's case the probability of immoral use is highest, since it has a connection to the CCP, or is obliged to cooperate with the CCP if required. Not just the USA; many countries have had their suspicions about this app. Ignoring all that, why would anyone want to risk their kids with this one specific app?


Social media access to user data (whether foreign or domestic) is bad, but it is a distraction from the real evil: behavioral management at scale. When you can target specific demographics and control what they see 8-10 hours of the day, you can change what they think, say, do, and most importantly, how they vote. People are literally being programmed (euphemistically, "conditioned") by the specific triggers and stimuli with almost surgical precision, and completely unbeknownst to them. This is uncomfortable to recognize and discuss, so the conversation is sadly reduced to "China/AI is bad/evil" to further foment hate and division.


Regarding your comment about behavioral management at scale. Carl Sagan once said: "If we've been bamboozled long enough, we tend to reject any evidence of the bamboozle. We're no longer interested in finding out the truth. The bamboozle has captured us. It's simply too painful to acknowledge, even to ourselves, that we've been taken. Once you give a charlatan power over you, you almost never get it back."

It would be interesting to know if this were true in that case.


Or, as a recent politician evocatively put it: "I could stand in the middle of Fifth Avenue and shoot somebody and wouldn't lose any voters, okay?"

There really is pain associated with acknowledging reality sometimes. I certainly know that from my personal life.

Wisdom comes from willingness to go through pain. Something our general culture ain't great at.


> Something our general culture ain't great at.

What culture is good at it? People in general try to avoid pain. This tendency is not unique to our culture, is it?


The US is uniquely oriented around pain elimination. For example, the opioid epidemic ravaging the US after the overprescription of painkillers is absent in most European and Asian countries, because they were not optimizing patient outcomes for pain elimination.


It may also be that working conditions, poverty, lack of vacation, and differences in the practice of medicine, put Americans in more physical and mental pain than the average European, and so are more likely to turn to painkillers.


Stoicism.


98% of the population gets its reality from authority (the opinions of the hive being #1). Independent thought is unimaginable. Not even on the table. May as well ask a grain of sand lodged in an eastgoing glacier to consider heading south.


This is pretty much Plato’s cave. But yeah, it seems to be absolutely true.


Not sure Plato’s cave applies here.


Upvotes and disagreeing comments... hard to parse. Anyway, my understanding is that Plato's cave refers to being bamboozled by a core lie about the world; when someone tells you how you're wrong, you reject it, because you lack the ability or language to process the alternative view, or it conflicts with your identity and you'd rather stay in the world you know. In Plato's cave, IIRC, it's a former prisoner who tells the other prisoners, i.e. someone they previously trusted. Yet they are still upset and violently reject his message.

Is that the same statement as Carl Sagan’s? No. I’m just casually associating. To me they both seem to cover the same aspect of human psychology, our core beliefs that basically cannot be altered later in life.

Moreover, it’s really interesting that people only seem to agree about this observation when it comes to other people–they are stuck in their ways, they are beyond salvation. Very few accept that they themselves suffer from this.


Forgive my dusty memory but I think the relation is that most people stay in the cave, not because they enjoy the experience or they're completely ignorant of the sky, they stay because they are unprepared, incapable, or unwilling to leave.

I doubt most people believe that TikTok is valuable to them but I'm sure many find it irresistible once they are hooked.


My understanding of Plato's cave is that it is about the disparity between the ideas in our head and reality. For example, you know what a triangle is, but there is no such thing as a triangle in the world, only imperfect shapes with three sides that approximate a triangle.

This means that it is not "most people stay in the cave" - it is, "we all stay in the cave because it is impossible to bridge the gap between theoretical construct and lived experience, but we all know about both".


It’s possible to draw multiple meanings from the same allegory


I thought it was a circle rather than a triangle.


Take that, Pythagoras!


You clearly skipped the footnotes.


Your understanding of the meaning of Plato's cave is questionable.


Hence all the addicts that are so quick to its defense.


In other words, cognitive dissonance.

https://en.wikipedia.org/wiki/Cognitive_dissonance


Indeed. Like thinking Tiktok is "such a serious danger" while having apps such as Spotify, Snapchat, and Amazon or Meta < anything > installed on our phones, using Google < anything >, or using devices such as "smart-tvs" and computers with Windows OS. Let's not forget the mild tinge of prejudice/racism that colors the messaging.

Dissonance is just that.


> When you can target specific demographics

> control what they see

> People are literally being programmed [...] by the specific triggers

I hate marketing/PR as much as the next person, but it has become part of daily life. How do we get rid of it? Outlaw the practice of trying to influence people through marketing?

Edit: Rereading your comment I realize you're talking about something else, but I guess the same applies nonetheless.


> How do we get rid of it?

Can you get rid of it? No. Doing so would result in the loss of many important rights and have unintended consequences. Not even China can get rid of this. BUT that doesn't mean you can't put regulations and limitations on them.

As one example, we may want to make laws that ensure that ads are easily recognizable as ads. I'm referring to native advertising. Take a piece from the NYT[0] that is marked as paid content, to see how nefarious this can actually be: the article itself only mentions the show once, in the middle, and mostly discusses women's lives in prison. It would be easy to believe this is not an ad but an actual news story. It is both, and that's why it is nefarious. Is this ad easily recognizable, even with the notice?

We can talk about dark patterns (native advertising might be one) and prevent many of them: no bait-and-switch, and options that are clearly conveyed. I don't think it matters which side of the political spectrum you're on or what your philosophical ideals are: tricking people into buying things they don't want or need is not ethical. We live in a specialized world and one person can't be an expert in everything. If the game is supercomputers and teams of psychologists and lawyers against individuals, then I think we all know it's an unfair game. We have to talk about how to level this playing field if we want to preserve individual freedoms and safety.

So I know this doesn't really answer your question, and the truth is that I don't have a good answer. I think the topic itself is surprisingly complicated and we need to think carefully about it. The path we're going down clearly isn't acceptable to most people, but overreacting would be similarly bad. We need to have a tough social conversation and figure out what we want together. We have to learn, a lot, because this is nuanced. We have to be open to being wrong, with a focus on learning and improving rather than asserting our positions (because they are all wrong in some form or another). That might be the hardest thing of all, but if we can do it then we can solve a lot more problems. Maybe this is the great filter?

[0] https://www.nytimes.com/paidpost/netflix/women-inmates-separ...


While I think these are ideas worth considering, the answer is perhaps staring us in the face: put regulatory limits on the amount and kind of data that apps and websites can collect.

The real problem with TikTok is not the CCP; it seems likely to me that our own government has equally nefarious techniques at play in other countries, and I think it's unfair to single out one company over this or any other behavior that is otherwise legal.

So cut them off at the knees: make the behavior of TikTok illegal, for them and for any of the thousands of other companies doing basically the same thing. Pointedly, I mean the extra-application data collection, the cross-checking with third-party data miners (which should be illegal already), and the sorts of things we've just become accustomed to being par for the course.


> put regulatory limits on the amount and kind of data that apps and websites can collect.

Yeah, I would be in full support of this. I think there's a double-edged sword that people are playing with without seeing the other edge: any data that you use to control your population can also be used by an adversary for the same purpose. The same is true of encryption. We have two competing forces in our own government, blue teams and red teams, but we know the red team gets a lot more money and is a lot flashier. Focusing everything on the red team is fun and exciting, but it makes you a glass cannon.


You would have to make it illegal to show different content to different users. Get rid of "the algorithm" and every website becomes a simple catalog of content.

I also think if you do any moderation of content, you lose your "common carrier" status and become a publisher, responsible for any content you publish.


Thought experiment,

Has anyone ever considered making a social media site/app like FB, TikTok, Insta, or Twitter where the user can control the algo, or at least have some input into the algo, in so much that the user can "control" what they see? Still have ads [company gets paid], but the user can control those ads to a certain degree... [sort of like Brave browser] [but for social media]

Just wondering. Not saying data collection is good, but perhaps, if it were more transparent and interactive, people would be more open to sharing and capitalizing on their own data. Value for value: the user gets to decide what data to share, and the company gets to push ads based on a known algorithm unique to each user's approved data metrics... perhaps this already exists???

Is this a pipedream? Or a yes, yes, "if you build it, they will come" life-changing moment? I need to know; it is important I change my outfit if it's the latter. Athletic shorts and a t-shirt (in my opinion) don't convey much confidence when shopping around for angel investors... ;)


So distinguish between "you're seeing this content because a company paid us to show it to users like you" and "because users like you watch similar things". What if someone pays to have similar users shown things that give a certain impression? The advertiser didn't create the content or even choose the content; is it an ad?

How would you enforce that? Without open-sourcing it you'd have no way of knowing why a thing was recommended, and giving access only to the government is not possible.
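One hedged sketch of what machine-checkable disclosure could look like: every recommendation carries a provenance tag stating why it was shown. All field names and categories below are invented for illustration, not any real platform's API:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical provenance tag attached to every feed item, so a user
# (or an auditor) can see *why* it was shown. Categories are made up.
@dataclass
class FeedItem:
    content_id: str
    reason: str                    # e.g. "paid_placement", "similar_users"
    sponsor: Optional[str] = None  # who paid, if anyone

def is_paid_placement(item: FeedItem) -> bool:
    # The hard case from the comment above: content the advertiser neither
    # created nor chose, but paid to have surfaced to "similar users".
    return item.reason == "paid_placement" and item.sponsor is not None

feed = [
    FeedItem("v1", "similar_users"),
    FeedItem("v2", "paid_placement", sponsor="BrandCo"),
]
print([is_paid_placement(i) for i in feed])  # [False, True]
```

This doesn't solve the enforcement problem (a platform could lie in the tag), but it does make the disclosure requirement concrete enough to audit against.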


> Can you get rid of it? No

It's possible in some far future: just rewire people's brains to ignore any kind of bias and susceptibility to manipulation. This new society would be 1000x better than what we have now.


Not sure you can eliminate it, but you can certainly reduce the impact and scale by fighting anti-competitive behavior. A major reason this sort of mass manipulation is so lucrative and effective is that you only have to operate on a couple of platforms to reach a majority of eyeballs.


A good approach is to teach something like this in public schools.

https://www.yourarticlelibrary.com/sociology/propaganda-7-mo...

It seems to have vanished from curricula and I wonder why. It's not as though anyone stands to benefit from an electorate that is less capable of identifying propaganda.


I feel like a large push from the Ad Council would be appropriate. (Not sure what the exact message would be.) It feels like some of the largest industries rely on the masses being easily programmable: TV, radio, billboards, basically advertisements. Most are designed to make you give them your money.

It's easy to spot and ignore when you realize it's happening.


> How do we get rid of it?

Put down your phone.


Hold onto the phone and just don't use social media.


You get points for pithy snark, but given that the problem predates smartphones, this isn't likely to solve it.


No, it really is that simple.

Never before has a foreign entity been able to deliver personalized content at the individual level. Not at this scale. The only way a device could get any more embedded would be through rectal insertion.

Not every problem can be solved. Sometimes the best we can do is mitigate.

[Smart]phones are spyware in every pocket...by design.


That solves "your" problem.

It doesn't solve the problems for your democracy.

We want to limit the ability of the wealthy and foreign nations to control the mob.

We want the mob to be well adjusted, well educated, happy, and kind.


The "mob" is uneducated, stupid and prone to manipulation. The "mob" attacked Poland from the east in the 20th century.

We have to start treating the mob as individuals. Putting down the phone is tantamount to having a grasp on reality.


The smartphone enables extreme personalization of the bamboozle. Maybe it's not that simple, but it would go a long way.


> How do we get rid of it?

Start by not letting it get its hooks in children and young adults.


> How do we get rid of it? Outlaw the practice of trying to influence people by marketing?

Perhaps the first step towards getting rid of it is realizing that it is not "we" who legislate things. That is, not a collective including yourself which has your best interest in mind to a significant extent. "We" have to face the reality in which "they" set up this system - legislatively and commercially.

And I don't mean some evil cabal; you (or me) sometimes participate in the activity of "them". It's just that "we" need to stop identifying "their" actions as what "we" decided to do.


> Outlaw the practice of trying to influence people by marketing?

If and when that is an option, yes definitely. What are we waiting for? And how is this even up for discussion?


We handle it through education. Informing people. Making them aware.


This is not something new. History shows that when a government controls the media, it can control people's opinions and beliefs. The only difference is that now the control over TikTok users' minds is not in the hands of the US government, and that's why they are unhappy.


The Century of the Self by Adam Curtis is a great doc on how this took shape in the early 20th century; its primary focus was the double nephew of Sigmund Freud who invented public relations and was a powerful political consultant for many US administrations.

One of his main areas of research was using media as a tool for crowd control... he had a somewhat noble viewpoint about it, believing that large crowds by nature devolved into anarchy: "Intelligent men must realize that propaganda is the modern instrument by which they can fight for productive ends and help to bring order out of chaos".


This guy's a big part of the reason America is so deeply screwed. America's eating habits have been artfully manipulated to the point that dietary common sense in the US is likely among the lowest in the Western world.


> This time it'll be different, we're way better than the last bunch to do it, double-promise!


What do you mean devolved into anarchy?


That the larger the population got, the harder it would be to control, politically or otherwise. That there would be all sorts of competing factions for mindshare and ideology, radicalization, etc, in a fully free society.

He more or less viewed consumerism as a means to pacify these tensions, to indirectly exert control... hence the development of closer ties between big business and the government.


Propaganda is such a destructive tool; it's a one-way trip. Just being forced to ponder whether you have been manipulated in some capacity, for example while working in an office, damages the psyche and undermines someone's sense of the world and even sense of self, and now it's additionally done at literally global scale. We're kicking the ground out from under our own feet, and adamantly keep searching for a reason everything isn't right.


Is there a political party that the Chinese government considered less hostile than the other?


Correct. The other issue is to avoid developing pro-CCP rhetoric out of this false dichotomy too. It's all just varying degrees of social pollution created by some propagandists.


In particular, the lawsuit alleges they are in compliance with a Chinese law that sounds identical to the US CLOUD Act.

Is that terrible? Of course. However, US law is as bad or worse.


how does this differ from newspapers, books and the spoken word?


Newspapers aren't nearly as effective at keeping attention indefinitely/at every opportunity, nor at controlling what you read next.


Fox news seems to have done a pretty good job of this in the US


Presumably most viewers have chosen that echo chamber, so it really isn’t the same to compare it.

Also Fox News has only a few million viewers each night e.g. “Fox News Channel coasted to an easy win in prime time Monday night, delivering an average total audience of 2.351 million viewers”.


Fox News isn’t a newspaper


Newspapers don’t modify what I see in the next 60 seconds based on my response to the previous paragraph.


Reach, cost and effectiveness. Newspapers can't follow you around like social networking sites do. A handful of Twitter/FB accounts can do an amount of damage that newspapers can only dream of.

None of which is to say newspapers are great. After all, Murdoch honed his skills in print media first. Just that they are nowhere near as effective as online platforms.


A newspaper owner, William Randolph Hearst, once started a war with Spain.


Yes I know, that’s why I mentioned Murdoch. He honed his villainy in print, before moving to TV and internet.


It doesn't. "Propaganda" is a western word for any argument coming from the eastern enemies of the state, and "Brainwashing" is a western word for being convinced by them.

The way we control behavior is by depriving people of unfiltered information, not showing them cat videos and remembering if they liked them.


The modern sense of the word "propaganda" emerged during the first World War to describe information deliberately disseminated to influence political opinion. I'm pretty sure that's still what people mean when they use the word.

Usage then and now also conveys a sense of purposeful distortion or fabrication.


Not everywhere, which is a good thing.

A year ago I saw (and thoroughly enjoyed) an exhibition of 80's arts posters in Spain. What struck me were the description labels, and especially the terminology used. Where the English part described something as "advertising", the Spanish descriptions unashamedly used the word "propaganda".

Let's not fool ourselves. The mechanisms of advertising have been lifted, adapted and further weaponised from war-time propaganda, or as we'd call them these days, influence operations.


> ...the Spanish descriptions unashamedly used the word "propaganda"

That's not what you think. At the time it was common to use that word instead of publicidad to mean advertising. A construction typical of Latin, propaganda just meant it's made to be propagated, similar to addenda, Amanda or Miranda.

Now it's limited to politics in Spanish too. People working with ads didn't like the connotations of the term, understandably :)


I was pretty amused when I first visited China and saw that the university communications/marketing department translated its name to "Propaganda Department".


In Brazilian Portuguese advertising is still most commonly called “propaganda”.

But it also has the bad connotation when it’s used in the context of politics.


You have to actively seek it out, or be told about it, rather than having it presented in your face the moment you wake up. I'm referring to folks so addicted that it's the first thing they grab; they see all the notifications and are back at it by morning.


For newspapers, it was relatively easy to see and compare the content of the different papers side by side.

I am not sure how I would jump out of my social media news bubble.


They don't directly tickle the reward center of the brain like social media does, cf. e.g. "Brain anatomy alterations associated with Social Networking Site (SNS) addiction" [0].

This is by design, btw, cf. e.g. "Digital Madness: How Social Media Is Driving Our Mental Health Crisis--and How to Restore Our Sanity " [1]; not itself a primary source but it seems to be well received.

My point being, it's like cigarettes with a message. The message being divisive in all likelihood, in order to override rationality with emotion and increase engagement. [2]

[0] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5362930/

[1] https://www.amazon.com/Digital-Madness-Driving-Crisis-Restor...

[2] How social media shapes polarization: https://drive.google.com/file/d/1cHDXsRpzZ84svIv5L-7tfnOvJ-p...


Just look at how people were programmed into believing that masks were bad. At best they could save lives, at worst they were an inconvenience. But some people acted like you had just punched them in the face if you asked them to wear one

And why? Because certain politicians and media outlets decided to randomly make it a political issue, and then suddenly you have people angrily spouting all kinds of crap rather than be slightly inconvenienced and have to smell their own breath


Also on the front page:

TikTok Feeds Teens a Diet of Darkness

https://news.ycombinator.com/item?id=35932387


I wonder if full homomorphic encryption plus ZKP are the way to go and if time will prove it in this decade.
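For intuition on the "homomorphic" part: some classical schemes already allow computing on ciphertexts. Textbook (unpadded) RSA is multiplicatively homomorphic, which a few lines can show; the tiny, insecure parameters below are purely illustrative and are nothing like a real FHE scheme (BGV, CKKS, TFHE), which supports arbitrary circuits:

```python
# Textbook RSA satisfies Enc(a) * Enc(b) mod n == Enc(a*b): whoever holds
# only ciphertexts can multiply plaintexts without ever seeing them.
# Toy parameters; real keys are thousands of bits.
p, q = 61, 53
n = p * q                          # 3233
e = 17
d = pow(e, -1, (p - 1) * (q - 1))  # modular inverse (Python 3.8+)

enc = lambda m: pow(m, e, n)
dec = lambda c: pow(c, d, n)

a, b = 12, 7
product_ct = (enc(a) * enc(b)) % n  # computed on ciphertexts only
print(dec(product_ct))              # 84 == a * b, recovered "blind"
```

Full homomorphic encryption extends this from one operation to both addition and multiplication (hence any computation), and ZKPs would let a platform prove a property of that computation without revealing the data, which is presumably the comment's idea.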


Furthermore, you can target specific persons: heads of state, government personnel, parliamentary representatives... and their families and friends. You can get them "conditioned", you can drive them to commit errors for the purpose of blackmailing them, etc.


Brings to mind that old leaked Google internal 'thought experiment' video "The Selfish Ledger": https://www.youtube.com/watch?v=QDVVo14A_fo


I think this has always been the case with media, going back as far as language has existed. We are just more aware of it now.

The big innovation recently has been the ability to manage the content of peer-to-peer communication as well, through social media.


Absolutely correct. Along with the usual reduction, there is also the "Actually, it's free speech that is bad; we need censorship" line. Meanwhile, no issues with the operant conditioning from the Skinner-box phone.


This is frequently asserted as fact, and I don't disagree that it is possible. But do you have some specific evidence that you can provide?


8-10 hours a day? Basically nobody is using social media this much.


FOX news and CNN don’t change what people think/how they vote?


When you hear something outrageous on FOX or CNN, you yell "bullshit" at the TV. When you read the same thing on Facebook and see 20 of your friends positively interacting with the news story and showing their approval, you remain quiet at best, join the lunacy at worst. What you don't see is the three shadowbanned accounts explaining why it's lunacy.


If the last decade has taught us anything, it is that a lot of people will not "yell bullshit at the TV".


They obviously just change the news network until they get the bullshit they prefer.


Most people will read a room and refrain from sharing politically-unpopular opinions that invite public and private retribution from a deranged mob.

Silence is not implied agreement. What the internet masses say is popular and what people actually vote for are two very different things.


There is a famous psychological experiment done by Solomon Asch (Asch Conformity) that demonstrates this behavior.

https://en.wikipedia.org/wiki/Asch_conformity_experiments


One of many reasons why shadowbanning is a bad and outdated practice.


How much traction could a lawsuit against Reddit, Twitter, etc. have against the practice of banning or shadowbanning?

If a user could show good faith participation, could they claim they've been prohibited from exercising freedom of expression in a public forum?

Suppose someone was banned from a subreddit for their particular hobby or, worse, city or region. This might be the single biggest forum for that person to address their neighbors and peers, and banning could prohibit their ability to find work, housing, opportunities, etc.

Moderators often ban users on a whim. Sometimes they ban users for merely commenting on other items or subreddits that they deem "wrong", and this practice is often automated.

If you can't sue Reddit, could you sue the moderators?


I'm interested in participating in a class action suit. They are a public forum.


Are you talking about the First Amendment?


Yes. How close are these platforms to being de facto public squares?

If you're banned from /r/sanfrancisco etc., what do you do? Your voice has been muffled and your ability to participate in the community cut off.

Reddit and Twitter are bigger than Reddit and Twitter. If you're banned, you have less ability to participate in modern life. Events, jobs, commentary, and more are gone. There is no alternative, because the platforms hoover up as much as they possibly can.

Ideally these platforms would be protocols, but in the meantime the common carriers that operate them should be held to preserving accessibility.

Moderation isn't easy. It should probably be an order of magnitude more expensive than it already is so that safeguards against "personhood erasure" can be put in place.

You don't want racists, trolls, and bigots spouting hate speech, but you also need to keep the lines open for when these individuals are behaving. Because the pendulum swings and sometimes you find yourself on the other side of the censorship zeitgeist.

Perfectly salient thoughts and people can be memory holed. And that's not just a possibility - it's happening right now.


> If Fox News had a DNA test, it would trace its origins to the Nixon administration. In 1970, political consultant Roger Ailes and other Nixon aides came up with a plan to create a new TV network that would circumvent existing media and provide "pro-administration" coverage to millions. "People are lazy," the aides explained in a memo. "With television you just sit — watch — listen. The thinking is done for you." Nixon embraced the idea, saying he and his supporters needed "our own news" from a network that would lead "a brutal, vicious attack on the opposition."

https://theweek.com/articles/880107/why-fox-news-created


As if that's relevant or excuses this behavior.


But they're domestic companies whose staff members, all the way up to the owners and CEOs, are not under threat of disappearance by the totalitarian regime of a hostile state...


Are you suggesting that FOX and CNN are good things? If it's possible, I'd say we should ban anything working in this way.


Yeah totally possible. We’re deciding this right here on this thread.


They're all doing it. Big tech, social media, left/right.


There is no left in the US


_rolling eyes emoji_

Yes, as has been pointed out countless times, the US left would be considered further right than the Nazi party to all you enlightened Europeans. We know, we get it. Doesn’t change the fact that there is something called left wing politics in the US, and it’s considerably different from right wing politics in the US.


There is nothing called left wing politics in the US. Just people with their heads so far up their own ass they don't realize how far right the US is.


FOX and CNN are at least held to a higher standard. We need to apply those same standards to the internet.

If they tell an outrageous lie on CNN they can be sued.


Whataboutism


It may be new to the extent it's hyper-centralized, indeed.


Memetic warfare I think they call it.


AKA advertising


Highly recommend reading The Age of Surveillance Capitalism on this topic


This is exactly what Russia did in the 2016 U.S. presidential campaign. The Trump campaign gave voter polling data to Konstantin Kilimnik. Russia then proceeded to exploit Facebook to target swing voters, and Trump won because of 80k voters across Michigan, Pennsylvania and Wisconsin.


Russia's reach on Facebook was minimal and likely did not meaningfully alter the outcome of the 2016 Presidential election.

See https://www.intelligence.senate.gov/publications/report-sele...

See also https://www.nytimes.com/2017/11/01/us/politics/russia-2016-e...


That is one of thousands of reasons, yes.


No they didn't. Source.


been saying this for years...

the response is always: "they didn't spend enough money to sway enough people"

but who did they target? which zip codes or DMAs? specific UIDs? Trump won by 80k votes in the swing states.


This is literally just the anti-China "brainwashing" propaganda warmed up for a new paranoid generation. I used to collect John Birch and anti-communist publications from the 50s for their histrionic historical value, but now I'm thinking I should start reprinting them and changing the dates. Both Democrats and Republicans would eat it up.

edit: somehow Cambridge Analytica's bullshit marketing material combined with The Manchurian Candidate in the boomer mind.

-----

edit: https://www.smithsonianmag.com/history/true-story-brainwashi...

> “The basic problem that brainwashing is designed to address is the question ‘why would anybody become a Communist?’” says Timothy Melley, professor of English at Miami University and author of The Covert Sphere: Secrecy, Fiction, and the National Security State. “[Brainwashing] is a story that we tell to explain something we can’t otherwise explain.”

> The term had multiple definitions that changed depending on who used it. For Hunter—who turned out to be an agent in the CIA’s propaganda wing—it was a mystical, Oriental practice that couldn’t be understood or anticipated by the West, Melley says. But for scientists who actually studied the American POWs once they returned from Korea, brainwashing was altogether less mysterious than the readily apparent outcome: The men had been tortured.

> [...]

> Meanwhile, the American public was still wrapped up in fantasies of hypnotic brainwashing, in part due to the research of pop psychologists like Joost Meerloo and William Sargant. Unlike Lifton and the other researchers hired by the military, these two men portrayed themselves as public intellectuals and drew parallels between brainwashing and tactics used by both American marketers and Communist propagandists. Meerloo believes that “totalitarian societies like Nazi Germany and the Soviet Union or Communist China were in the past, and continue to be, quite successful in their thought-control programs… [and] the more recently available techniques of influence and thought control are more securely based on scientific fact, more potent and more subtle,” writes psychoanalyst Edgar Schein in a 1959 review of Meerloo’s book, The Rape of the Mind: The Psychology of Thought Control—Menticide and Brainwashing.


If your democracy can be thwarted by free speech from a bad actor, then your democracy is shit.


If you’re an adult watching TikTok 8-10 hours a day that’s on you to fix. When I was at my “peak”, I was watching 2-3 hours a day and that felt like a lot.

If a kid is watching TikTok 8-10 hours a day, that’s on the parents to fix.

This is like the hot dog man meme. “Who’s responsible for this?” Ultimately you are in control of your eyeballs and your time. Stop watching!

Or not. TikTok is awesome. Watch it 8-10 hours a day if you want.


> Ultimately you are in control of your eyeballs and your time. Stop watching!

There are people making hundreds of thousands of dollars whose sole job is to get you locked into a feedback loop in these apps. We are engineering addiction.


The addiction is a side effect of a society built on exploitation. In other words, capitalism is literally the problem.


Does addiction exist in non-capitalist countries? Could you expand because I'm not sure if I understood the point you're making.


Why would TikTok or FaceBook not engineer addiction? It generates more money for them. If they don't engineer addiction, another company will emerge and engineer addiction.

I think this type of problem has to be solved by the government.


My point was, my parent was in a sense blaming the victim for not peeling their eyes away from the screen when literally millions of dollars of thought and effort went into making sure that they don't.

I agree that in the absence of direct negative consequences it seems unlikely we will see a change in the status quo.


No, to be clear I wasn’t “in a sense” blaming the victim. I’m directly blaming anyone who hates their relationship with TikTok and leaves the app on their phone and continues to watch it.


Do you similarly blame people stuck in abusive relationships or those who abuse substances? You haven't addressed the fact that the app on their phone is an engineered pachinko machine literally designed to glue you to the screen.


No. Just because I blame one set of victims doesn’t mean I blame all sets of victims.

And I don’t need to address that fact because the root of the issue is the person installing the app. It doesn’t matter how well engineered the pachinko machine is if I can long-press it out of my existence.


Should all addictive products be allowed? When does such a product deserve being regulated or outlawed?


You regulate or outlaw products when they negatively impact self or public health or safety.

There's a brief scene on Bojack Horseman involving a commercial for some chicken product. In it, a kid yells at his parents "I don't want to go to school, I want Chicken-4-Dayz!"

Some kids are badly-behaved. They get a lot of validation and reinforcement of their behavior from these platforms, especially since any disciplinary misstep by a parent invites CPS visits. But we should ask why products like "Chicken-4-Dayz" influence children to reject their own actual needs and tone that shit down.

I hear a lot of stories from teachers about teenage boys coming to school exhausted beyond functioning. Child labor abuses from working extra shifts at the factory? No. They're up all night all week playing Call of Duty.

When people opt to consume a product instead of doing the things they need to do to survive past their consumption, it's addiction. All controlled substances have this trait. Given an infinite supply of amphetamines, most people will dehydrate or starve to death.


I'm not sure why you're being downvoted, possibly because the message is uncomfortably paternalistic if taken to its logical conclusion? I don't think that government should serve the role of surrogate parent, but at the same time I recognize that there exist substances (physical or otherwise) from which some humans have an incredibly difficult time tearing themselves away once they've been exposed (and there likely exist substances which any of us would find hard to deny after exposure).

At the same time, is it the role of the state to prevent people from realizing their own destruction? And further, what is the role of the state in regulating things that were designed outright to be as addictive as possible?


Many people here have lucrative careers that depend on maintaining the status quo, so heretics are unpopular.

> I don't think that government should serve the role of surrogate parent

Neither do I, but at some point it needs to be a backstop against implosion of the country. We're being subject to the Opium Wars playbook (brought to you by TikTok: China's Revenge).

What purpose does government serve if not doing something to mitigate?

It seems most interested in facilitating this behavior for the sake of economic growth, but that engine is destined to seize. Money in the hands of the middle/lower classes is the oil that keeps it running.

> And further, what is the role of the state in regulating things that were designed outright to be as addictive as possible?

So far we've managed it with alcohol, tobacco, slot machines and hard drugs. The role of the state on this matter is pretty well-defined.


Reminds me of the famous saying,

> I’m from the government and I’m here to help.


It's also OK to go after the shady character spending all its effort manipulating and spying. I get lots of things do their best to do this, but this is the CCP in your living room. There should be a line somewhere.


OK but if you don’t want the shady character in your living room, don’t let them in. Don’t complain when somebody else doesn’t close the door you left wide open.


To paraphrase the Buddha, you'll only give up your drug when you find a better drug.

TikTok is a heroin airplane with big brother in the pilot seat.

What we need is a better drug.


> I'm starting to become fatigued

> Causing anxiety

The news is literally designed to do that to you. Stop consuming so much of it for your own mental well-being, and that of everyone else you interact with.


Just buy tech used, save money and reduce dependence on slave labor. Swappa has been good for me.


I hoped someone would mention Swappa, I've managed to get some fantastic deals on year-old devices.


I don't even have a case on my phone right now because I know I can just replace it from Swappa for cheap. Thank goodness for people like OP who keep buying brand new phones and giving up most of their value.


And what I do on my phone is some web browsing, email, chat/messaging, YouTube, navigation etc. which has been a completely great experience for many generations. I don’t have any slowdowns on my iPhone SE 2020 and I only upgraded because the 2016 used SEs had bad long term batteries which would die on me.


Swappa looks interesting. I used to use Glyde some years ago which looks like the same concept. Then one day Glyde was gone. I bought and sold a bunch of tech stuff on there.


Most of the nefarious tracking is taking place via canvas fingerprinting these days; clearing cookies does nothing but make the unaware user feel good that they're "doing something".


Care to elaborate?


https://browserleaks.com/canvas

Open it in a regular and "incognito" browser tab. This is a long term identifier of your browser which persists across cookie clearing, IP address changes and other ritualistic totems privacy-seeking individuals still inexplicably cling to.


How does one protect against this?


Simply don't allow Javascript.


Unfortunately, even that might not be enough these days with newer fingerprinting methods.


People who use multiple "angry face" emojis to express themselves are generally moody, petulant and emotionally unfit to work among highly compensated engineers. Instead of whining, adapt and overcome the circumstances you recognize in your professional life.


Manufactured outrage to distract from the far more sinister and subtle behavioral management constantly at work on google users.


Externalizing responsibility is the cornerstone of postindustrial capitalism. To deny that privilege would collapse the entire system.

