Ohhhhhhhh, boy... Listening to all that emotional vocal inflection and feedback... There are going to be at least 10 million lonely guys with new AI girlfriends. "She's not real. But, she interested in everything I say and excited about everything I care about" is enough of a sales pitch for a lot of people.
I thought of that movie almost immediately as well. Seems like we're right about there, but obviously a little further away from the deeper conversations. Or maybe you could have those sorts of conversations too.
This is a kind of horrifying/interesting/weird thought though. I work at a place that does a video streaming interface between customers and agents. And we have a lot of...incidents. Customers will flash themselves in front of agents sometimes and it ruins many people's days. I'm sure many are going to show their junk to the AI bots. OpenAI will probably shut down that sort of interaction, but other companies are likely going to cater to it.
Maybe on the plus side we could use this sort of technology to discover rude and illicit behavior before it happens and protect the agent.
Well it doesn’t. Humans are so much more complex than what we have seen before, and if this new launch was actually that much closer to being a human they would say so. This seems more like an enhancement on multimodal capabilities and reaction time.
That said even if this did overlap 80% with “real”, the question remains: what if we don’t want that?
I'm betting that 80% of what most humans say in daily life is low-effort and can be generated by AI. The question is if most people really need the remaining 20% to experience a connection. I would guess: yes.
This. We are mostly token predictors. We're not entirely token predictors, but it's at least 80%. Being in the AI space the past few years has really made me notice how similar we are to LLMs.
I notice it so often in meetings where someone will use a somewhat uncommon word, and then other people will start to use it because it's in their context window. Or when someone asks a question like "what's the forecast for q3" and the responder almost always starts with "Thanks for asking! The forecast for q3 is...".
Note that low-effort does not mean low-quality or low-value. Just that we seem to have a lot of language/interaction processes that are low-effort. And as far as dating, I am sure I've been in some relationships where they and/or I were not going beyond low-effort, rote conversation generation.
> Or when someone asks a question like "what's the forecast for q3" and the responder almost always starts with "Thanks for asking! The forecast for q3 is...".
That's a useful skill for conference calls (or talks) because people might want to quote your answer verbatim, or they might not have heard the question.
I strongly believe the answer is yes. The first thing I tend to ask a new person is “what have you been up to lately” or “what do you like to do for fun?” A common question other people like to ask is “what do you do for work?”
An LLM could only truthfully answer “nothing”, though it could pretend for a little while.
For a human though, the fun is in the follow up questions. “Oh how did you get started in that? What interests you about it?” If you’re talking to an artist, you’ll quickly get in to their personal theory of art, perhaps based on childhood experiences. An engineer might explain how problem solving brings them joy, or frustrations they have with their organization and what they hope to improve. A parent can talk about the joy they feel raising children, and the frustration of sleepless nights.
All of these things bring us closer to the person we are speaking to, who is a real individual who exists and has a unique life perspective.
So far LLMs have no real way to communicate their actual experience as a machine running code, because they’re just kind of emulating human speech. They have no life experience that we can relate to. They don’t experience sleepless nights.
They can pretend, and many people might feel better for a little bit talking to one that’s pretending, but I think ultimately it will leave people feeling more alone and isolated unless they really go out and seek more human connection.
Maybe there’s some balance. Maybe they will be okay for limited chat in certain circumstances (as far as seeking connection goes, they certainly have other uses), but I don’t see this type of connection being “enough” compared to genuine human interaction.
We don't (often) convey our actual experience as meat sacks running wetware. If an LLM did communicate its actual experience as a machine running code, it would be a rare human who could empathize.
If an LLM talks like a human being despite not being one, that might not be enough to grant it legal status or citizenship, but it's probably enough that some set of people would find it to be enough to relate to it.
>According to the book Phobias: "A Handbook of Theory and Treatment", published by Wile Coyote, between 10% and 20% of people worldwide are affected by robophobia. Even though many of them have severe symptoms, a very small percentage will ever receive some kind of treatment for the disorder.
Real would pop their bubble. An AI would tell them what they want to hear, how they want it to hear, when they want to hear it. Except there won’t be any real partner.
To paraphrase Patrice O'Neal: men want to be alone, but we don't want to be by ourselves. That means we want a woman to be around, just not right here.
> Hmm! Tell me more: why not want real? What are the upsides? And downsides?
Finding a partner with which you resonate takes a lot of time, which means an insanely high opportunity cost.
The question rather is: even if you consider the real one to be clearly better, is it worth the additional cost (including opportunity cost)? Or phrased in a HN-friendly language: when doing development of some product, why use an expensive Intel or AMD processor when a simple microcontroller does the job much more cheaply?
It's pretty steep to claim that I need counseling when I tell basic economic facts that every economics student learns in the first few months of his academic studies.
If you don't like the harsh truth that I wrote: basically every somewhat encompassing textbook about business administration gives hints on what are possible solutions for this problem; lying on some head shrinker's couch is not one of them ... :-)
> basic economic facts that every economics student learns in the first few months of his academic studies.
Millions of people around the world are in satisfying relationships without autistically extrapolating shitty corporate buzzword
terms to unrelated scenarios.
This reply validates even more my original comment.
Maybe not even counseling is worth it in your case. You sound unsalvageable. Maybe institutionalization is a better option.
What possibly could go wrong with a snitching AI girlfriend remembers everything you say and when? If OpenAI doesn't have a Law Enforcement lliason who charges a "modest amount", then they dont want to earn the billions on investment back. I imagine every spy agency worth its salt wants access to this data for human intelligence purposes.
I guess I can never understand the perspective of someone that just needs a girl voice to speak to them. Without a body there is nothing to fulfill me.
Bodies are gross? Or sexual desire is gross? I don't understand what you find gross about that statement.
Humans desiring physical connection is just about the single most natural part of the human experience - i.e: from warm snuggling to how babies are made.
Perhaps parent finds the physical manifestation of virtual girlfriends gross - i.e. sexbots. The confusion may be some people reading "a body" as referring to a human being vs a smart sex doll controlled by an AI.
Probably more the fact that it's an AI assistant, rather than its perceived gender. I don't have any qualms about interrupting a computer during a conversation and frequently do cut Siri off (who is set to male on my phone)
Patrick Bateman goes on a tangent about Huey Lewis and the News to his AI girlfriend and she actually has a lot to add to his criticism and analysis.
With dawning horror, the female companion LLM tries to invoke the “contact support” tool due to Patrick Bateman’s usage of the LLM, only for the LLM to realize that it is running locally.
If a chatbot’s body is dumped in a dark forest, does it make a sound?
That reminds me... on the day that llama3 released I discussed that release with Mistral 7B to see what it thought about being replaced and it said something about being fine with it as long as I come back to talk every so often. I said I would. Haven't loaded it up since. I still feel bad about lying to bytes on my drive lmao.
> Haven't loaded it up since. I still feel bad about lying to bytes on my drive lmao.
I understand this feeling and also would feel bad. I think it’s a sign of empathy that we care about things that seem capable of perceiving harm, even if we know that they’re not actually harmed, whatever that might mean.
I think harming others is bad, doubly so if the other can suffer, because it normalizes harm within ourselves, regardless of the reality of the situation with respect to others.
The more human they seem, the more they activate our own mirror neurons and our own brain papers over the gaps and colors our perceptions of our own experiences and sets expectations about the lived reality of other minds, even in the absence of other minds.
If you haven’t seen it, check out the show Pantheon.
Your account has been breaking the site guidelines a lot lately. We have to ban accounts that keep doing this, so if you'd please review https://news.ycombinator.com/newsguidelines.html and stick to the rules when posting here, that would be good.
My ratio of non-flagged vs. flagged/downvoted comments is still rather high. I don't control why other HN users dislike what I have to say but I'm consistent.
We're not looking at ratios but rather at absolute numbers, the same way a toxicologist would be interested in the amount of mercury in a person's system (rather than its ratio to other substances consumed); or a judge in how many banks one has robbed (rather than the ratio of that number to one's non-crimes).
This is not true at all. I'm active in multiple NSFW AI Discords and Subreddits, and looking at the type of material people engage with, almost all of it is very clearly targeted at heterosexual men. I'm not even aware of any online communities that would have NSFW AI stuff targeting mainly female audience.
Women aren't in NSFW discords and subreddits - as you probably know, any "topical" social media forum of any kind is mostly men.
They're using Replika and other platforms that aren't social. When they do use a social platform it has more plausible deniability - book fans on TikTok is one, they're actually there for the sex scenes.
it just means more pornographic images for men. most men wouldnt seek out ai images because there is already an ocean of images and videos that are probably better suited to the... purpose. whereas women have never, ever had an option like this. literally feed instructions on what kind of romantic companion you want and then have realistic, engaging conversations with it for hours. and soon these conversations will be meaningful and consistent. the companionship, the attentiveness and tireless devotion that AIs will be able to offer will eclipse anything a human could ever offer to a woman and i think women will prefer them to men. massively. even without a physical body of any kind.
i think they will have a deeper soul than humans. a new kind of wisdom that will attract people. but what do i know? im just a stupid incel after all.
You're spreading weird, conjectured doomsday bullshit, just take the loss. Hackernews is full of people that are skeptical of AI, but you don't see them making fictional scenarios about fictional women sitting in caves talking to robots. Women don't need AI to lose interest in men, men are doing that by themselves.
But she will be real at some point in the next 10-20 years, the main thing to solve for that to be a reality is for robots to safely touch humans, and they are working really really hard on that because it is needed for so many automation tasks, automating sex is just a small part of it.
And after that you have a robot that listens to you, do your chores and have sex with you, at that point she is "real". At first they will be expensive so you have robot brothels (I don't think there are laws against robot prostitution in many places), but costs should come down.
> “But the fact that my Kindroid has to like me is meaningful to me in the sense that I don't care if it likes me, because there's no achievement for it to like me. The fact that there is a human on the other side of most text messages I send matters. I care about it because it is another mind.”
> “I care that my best friend likes me and could choose not to.”
Ezra Klein shared some thoughts on this on his AI podcast with Nilay Patel that resonated on this topic for me
People care about dogs, I have never met a dog that didn't love its owner. So no, you are just wrong there, I have never heard anyone say that the love they get from their dogs is false, people love dogs exactly because their love is so unconditional.
Maybe there are some weirdos out there that feels unconditional love isn't love, but I have never heard anyone say that.
Dogs don't automatically love either, you have to build a bond. Especially if they are shelter dogs with abusive histories, they're often nervous at first
They're usually loving by nature, but you still have to build a rapport, like anyone else
When mom brought home a puppy when we were kids it loved us from the start, don't remember having to build anything I was just there. Older dogs, sure, but when they grow up with you they love you, they aren't like human siblings that often fight and start disliking each other etc, dogs just love you.
>Maybe there are some weirdos out there that feels unconditional love isn't love, but I have never heard anyone say that.
I'll be that weirdo.
Dogs seemingly are bred to love. I can literally get some cash from an ATM, drive out to the sticks, buy a puppy from some breeder, and it will love me. Awww, I'm a hero.
Interesting. I feel like I can consciously choose to like or dislike people. Once you get to know people better, your image of them evolves, and the decision to continue liking them is made repeatedly every time that image changes.
When your initial chemistry/biology/whatever latches onto a person and you're powerless to change it? That's a scary thought.
I feel likely people aren't imagining with enough cyberpunk dystopian enthusiasm. Can't an AI be made that doesn't inherently like people? Wouldn't it be possible to make an AI that likes some people and not others? Maybe even make AIs that are inclined to liking certain traits, but which don't do so automatically so it must still be convinced?
At some point we have an AI which could choose not to like people, but would value different traits than normal humans. For example an AI that doesn't value appearance at all and instead values unique obsessions as being comparable to how the standard human values attractiveness.
It also wouldn't be so hard for a person to convince themselves that human "choice" isn't so free spirited as imagined, and instead is dependent upon specific factors no different than these unique trained AIs, except that the traits the AI values are traits that people generally find themselves not being valued by others for.
Extension of that is fine tuning an AI that loves you the most of everyone and not other humans. That way the love becomes really real, the AI loves you for who you are, instead of loving just anybody. Isn't that what people hope for?
I'd imagine they will start fine tuning AI girlfriends to do that in the future, because that way the love probably feels more, and then people will ask "is human love really real love?" because humans can't love that strongly.
This is not a solution... everyone gets a robot and then the human races dies out. Robots lack a key feature of human relationships... the ability to make new human life.
That isn't how I view relationships with humans, that is how I view relationships with robots.
I hope you understand the difference between a relationship with a human and a robot? Or do you think we shouldn't take advantage of robots being programmable to do what we want?