Hacker News | ulchar's comments

Emotions are a medium for conveying feelings, but our sensitivity to human emotions can also be a vector for manipulation.

Especially when you consider the bottom line that this tech will ultimately be shoehorned into advertising somehow (read: the field dedicated to manipulating you into buying shit).

This whole fucking thing bothers me.


> Emotions are a medium for conveying feelings, but our sensitivity to human emotions can also be a vector for manipulation.

When one gets to be a certain age one begins to become attuned to this tendency of others' emotions to manipulate you, so you take steps to not let that happen. You're not ignoring their emotions, but you can address the underlying issue more effectively if you're not emotionally charged. It's a useful skill that more people would benefit from learning earlier in life. Perhaps AI will accelerate that particular skill development, which would be a net benefit to society.


> When one gets to be a certain age one begins to become attuned to this tendency of others' emotions to manipulate you

This is incredibly optimistic, which I love, but my own experience with my utterly deranged elder family, made insane by TV, contradicts this. Every day they're furious about some new thing Fox News has decided it's time to be angry about: white people being replaced (thanks for introducing them to that, Tucker!), "stolen" elections, Mexicans, Muslims, the gays, teaching kids about slavery, the trans, you name it.

I know nobody in my life more emotionally manipulated on a day-to-day basis than them. I imagine I can't be alone in watching this happen to my family.


What if this technology could be applied so you can’t be manipulated? If we are already seeing people use this to simulate tough prospects and train salespeople to deal with them, we can squint a bit and see it being used to help people identify logical fallacies and con men.


That's just being hopeful/optimistic. There are more incentives to use it for manipulation than to protect from manipulation.

That happens with a lot of tech. Social networks are used to con people more than to educate people about con men.


[flagged]


> not wanting your race to be replaced

Great replacement and white genocide are white nationalist far-right conspiracy theories. If you believe this is happening, you are the intellectual equivalent of a flat-earther. Should we pay attention to flat-earthers? Are their opinions on astronomy, rocketry, climate, and other sciences worth anyone's time? Should we give them a platform?

> In the words of scholar Andrew Fergus Wilson, whereas the islamophobic Great Replacement theory can be distinguished from the parallel antisemitic white genocide conspiracy theory, "they share the same terms of reference and both are ideologically aligned with the so-called '14 words' of David Lane ["We must secure the existence of our people and a future for white children"]." In 2021, the Anti-Defamation League wrote that "since many white supremacists, particularly those in the United States, blame Jews for non-white immigration to the U.S.", the Great Replacement theory has been increasingly associated with antisemitism and conflated with the white genocide conspiracy theory. Scholar Kathleen Belew has argued that the Great Replacement theory "allows an opportunism in selecting enemies", but "also follows the central motivating logic, which is to protect the thing on the inside [i.e. the preservation and birth rate of the white race], regardless of the enemy on the outside."

https://en.wikipedia.org/wiki/Great_Replacement

https://en.wikipedia.org/wiki/White_genocide_conspiracy_theo...

> wanting border laws to be enforced

Border laws are enforced.

> and not wanting your children to be groomed into cutting off their body parts.

This doesn't happen. In fact, the only form of gender-affirming surgery that any doctor will perform on under-18 year olds is male gender affirming surgery on overweight boys to remove their manboobs.

> You are definitely sane and your entire family is definitely insane.

You sound brave, why don't you tell us what your username means :) You're one to stand by your values, after all, aren't you?


Well said, thank you for saving me from having to take the time to say it myself!


[flagged]


Well, when you ask someone why they don't want to have more children, they can shrug and say "population reduction is good for the climate," as if serving the greater good, and completely disregard any sense of "patriotic duty" to have more children that some politicians, such as Vladimir Putin, would like to instill. They can justify it just as easily as you can be deranged enough to call it a government conspiracy.


[flagged]


You say that but you clearly hate your own race. Why are you contradicting yourself?


Sorry mate I don't engage in weird identity politics like you do. Great Replacement is a conspiracy theory, full stop.

Why did you pick that username?


[flagged]


The question makes no sense. You've just asked me whether I plan to walk off the eastern or western edge of the planet.

Why did you choose that username?


With AI you can do A/B testing (or multi-arm bandits, the technique doesn't matter) to get into someone's mind.

Most manipulators end up getting bored of trying again and again with the same person. That won't happen if you are dealing with a machine, as it can change names, techniques, contexts, tones, etc. until you give it what its operator wants.

Maybe you're part of the X% who will never give in to a machine. But keep in mind that most people have neither critical-thinking skills nor mental fortitude.
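To make the A/B-testing point concrete, here's a toy epsilon-greedy bandit loop in Python. The "arms" (message styles and their response rates) are entirely made up for illustration; real bandit deployments use more sophisticated algorithms, but the loop structure is the same:

```python
import random

# Toy epsilon-greedy bandit: the "operator" tries several approaches
# (hypothetical arms with made-up success probabilities) and converges
# on whichever gets the best observed response rate.
def epsilon_greedy(success_probs, trials=10000, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    n = len(success_probs)
    counts = [0] * n        # pulls per arm
    rewards = [0.0] * n     # successes per arm
    for _ in range(trials):
        if rng.random() < epsilon:
            arm = rng.randrange(n)  # explore a random arm
        else:
            # exploit the arm with the best estimate (unpulled arms first)
            arm = max(range(n), key=lambda i:
                      rewards[i] / counts[i] if counts[i] else float("inf"))
        counts[arm] += 1
        rewards[arm] += rng.random() < success_probs[arm]
    return counts, [rewards[i] / counts[i] for i in range(n)]

counts, estimates = epsilon_greedy([0.02, 0.05, 0.20])
# The most effective "style" ends up pulled far more often than the others.
```

The unsettling part is exactly what the comment says: the loop never gets bored, and swapping in a new persona or tone is just adding another arm.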


Problem is, people aren't machines either: someone who's getting bombarded with phishing requests will begin to lose it, and will be more likely to just turn off their Wi-Fi than allow an AI to run a hundred iterations of a many-armed-bandit approach on them.


Probably there will be more nuance than that. And doomscrolling is a thing, you know.


I think we often get better at detecting the underlying emotion with which a person is communicating, seeing beyond the one they are trying to project in an attempt to manipulate us. For example, they say that $100 is their final price, but we can sense in the wavering of their voice that they might be really worried about losing the deal. I don't think this tech will help us pick up on those cues, because there are no real underlying emotions happening; it may even feed us false impressions and make us worse at gauging underlying emotions.


> Especially when you consider the bottom line that this tech will ultimately be shoehorned into advertising somehow.

Tools and the weaponization of them.

This can be said of pretty much any tech tool with the ability to touch a good portion of the population, including programming languages themselves, or CRISPR.

I agree we have to be careful of the bad, but the downsides in this case are not so dangerous that we should try to suppress it, because the benefits can be incredible too.


This. It’s mind-boggling how many people can only see things through one worldview and see nothing but downside.


The concern is that it's being locked up inside of major corporations that aren't the slightest bit trustworthy. To make this safe for the public, people need to be able to run it on their own hardware and make their own versions of it that suit their needs rather than those of a megacorp.


This tech isn't slowing down. Our generation may hesitate at first, but remember, this field is progressing at astonishing speed; we are literally one generation away.


Why can’t it also inspire you? If I can forgo advertising and have ChatGPT tutor my child in geometry, and they actually learn it at a fraction of the cost of a human tutor, why is that bothersome? Honest question. Why do so many people default to assuming something sinister is going on? If this technology shows real efficacy in education at scale, take my money.


Because it is obviously going to be used to manipulate people. There is absolutely 0 doubt about that (and if there is I'd love to hear your reasoning). The fact that it will be used to teach geometry is great. But how many good things does a technology need to do before the emotional manipulation becomes worth it?


I don't think OpenAI is doing anything particularly sinister. But whatever OpenAI has today, a bad actor will have in October. This horseshit is moving rather fast. Sorry, but going in two years from failing the Turing test to having conversations with an AI agent nearly indistinguishable from a person is going to be destabilizing.

Start telling Grandma never to answer the phone.


AI is going to be fantastic at teaching skills to students that those students may never need, since the AI will be able to do all the work that requires such skills, and do them faster, cheaper and at a higher level of quality.


One may also begin to ask, what's the point of learning geometry? Or anything, anymore?


sounds like a fun project for someone. you could go so far as to direct the user to the proper location on a shelf.


this one's crazy lol.

frankly it sounds kind of fun, but i'm sure it was not in reality.


It was actually pretty entertaining. This was almost 20 years ago.


Did you meet the coworkers who were complaining at the coffee maker or something like that, and overhear anything they said about it? If so, it must have been an odd feeling, and hard not to smile and say something.


Not that I remember. It was a company with about 2,000 employees and I generally stayed on the IT floor.


I know these are the times of big machine-learning models, but my favourite kind of AI is the classics: SAT solvers, minimax with alpha-beta pruning, Markov chains, and good old search trees. These algorithms have superhuman powers, solving so many challenging games and problems just by converting the problem space into a search tree. Love it.
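As a taste of the classics, minimax with alpha-beta pruning fits in a few lines of Python when the game tree is given explicitly. This is a toy sketch over nested lists (leaves are scores for the maximizing player), not a real game engine:

```python
# Minimax with alpha-beta pruning over an explicit game tree.
# A node is either a number (leaf score) or a list of child nodes.
def alphabeta(node, alpha=float("-inf"), beta=float("inf"), maximizing=True):
    if not isinstance(node, list):  # leaf: a terminal score
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:  # beta cutoff: opponent won't allow this branch
                break
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, alpha, beta, True))
            beta = min(beta, value)
            if beta <= alpha:  # alpha cutoff: we already have something better
                break
        return value

# Three-ply example: the optimal value for the maximizing player is 6.
tree = [[[5, 6], [7, 4, 5]], [[3]], [[6, 6], [9]]]
print(alphabeta(tree))  # → 6
```

The cutoffs are the whole trick: whole subtrees get skipped once it's clear neither player would steer the game into them, which is what makes the search-tree approach scale to games like chess.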


Is Delaware a good place for a startup? Or am I missing the joke.


Many companies incorporate in Delaware for its business-friendly corporate law and favorable tax treatment.


that's some mind you have

seeing into the future like that


can you tell us a little bit about what's going on here


Sure! It’s a hierarchical numbered outline of the emotion words according to time (past, present, future), then valence (positive, neutral, negative) and finally, locus of control (internal or external).

That’s cool because it lets us understand emotion in terms of logical relationships between points in a geometric space.

When we understand emotion in this way, it becomes clear how emotion isn’t separate from logic, but rather, emotion IS logic. Logic of the memory, experience, and prediction of pleasure, “zen,” and pain!

Consider Star Trek. Commander Data wouldn’t exist without emotion, he couldn’t, it wouldn’t make sense, that would imply he had no world model about time (duh) and good vs bad.

Future Good: Hope
Future Bad: Fear

I dare you to find one emotion word that doesn’t fit this framework.

The consequences for AI could be huge. Specifically, we may not be able to avoid developing emotional AI, because emotion emerges directly from the structure of reinforcement learning over time.

We might be forced to deal with “scared AI,” whether or not we want to deal with it or want to create that, because fear is a generic term for the apprehension of future loss. Don’t expect AI to be unfeeling. Evolution may work by repeated accidents, but the cumulative result is NEVER an accident.

Sorry for the formatting, but the content is crucial, and I find it infinitely more clear than “emotion as arousal” (the referent of the “James-Lange Theory” eponym) — that’s overly simplistic.
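One slice of the grid described above (time × valence, setting aside locus of control and neutral valence) can be sketched as a simple lookup table. Only "hope" and "fear" come from the comment itself; the other entries are my own illustrative guesses, not part of the original framework:

```python
# Toy lookup for a (time, valence) slice of the emotion grid.
# Only the future/good and future/bad cells are from the comment;
# the rest are illustrative guesses.
EMOTION_GRID = {
    ("future", "good"): "hope",
    ("future", "bad"): "fear",
    ("past", "good"): "nostalgia",   # guess
    ("past", "bad"): "regret",       # guess
    ("present", "good"): "joy",      # guess
    ("present", "bad"): "distress",  # guess
}

def classify(time, valence):
    """Map a (time, valence) coordinate to an emotion word."""
    return EMOTION_GRID.get((time, valence), "unknown")
```

The point of the framework is that each emotion word names a coordinate in this space rather than an irreducible feeling.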


I agree. But platform dependant throttling should be prevented by good regulation, if that means Youtube's business model is less viable I frankly don't care.

All I know is I'm keeping my eyes open for their next swing at remote attestation. If you follow the money, it's a logical step.


Look up Omar Khadr [1]. If the United States couldn't get an extradition for that kid, could India get one for Nijjar? I doubt it. I'm not saying this as justification for the alleged actions of the Indian government, not in the slightest. I only suggest that Canada wouldn't necessarily have co-operated eagerly.

[1] https://en.wikipedia.org/wiki/Omar_Khadr#


Really, Khadr?

The guy was brainwashed as a child by his lunatic father and dragged to Afghanistan to resist America's invasion. There was some kind of firefight, he nearly died, and the Americans decided that since he survived he was guilty of all the crimes.

He was 15. A kid. A child soldier, by international law.

He was held for a decade in Guantanamo Bay, a US military facility, without any real trial.

The US already had him. They abused his rights and ignored international law. Why should Canada give him back to them?


Never said that Khadr should've been extradited; he shouldn't have been at all, and I support the Canadian government's decision. But the Indian government's claims against Nijjar are also dubious, which points to the likelihood that any diplomatic transfer was a non-starter.

Of course, the difference here was that the USA attempted a diplomatic process and did not execute a citizen on Canadian soil.


So India couldn't extradite him. There are many people North Korea would like to punish but can't extradite.

You seem to be implying that if India couldn't extradite him, they should kill him.


He already said that he isn't intending that at all and is just pointing out that an extradition probably wouldn't pan out.


Perhaps the ability to work fewer hours as the number of 'human' jobs reduces (consider the number of bullshit jobs that exist already). I would gladly accept this trade-off. To me, time is more precious than things. Of course, this presupposes that we can live somewhere between basic needs met and the modern luxury of today's North American. I can have lots of fun cheaply: going outside, playing cards, playing music. Maybe I am being naive.

For now, I feel we have so much work ahead of us.

