>Private equity relies on a basic technique known as the leveraged buyout, which works like this: you, a dealmaker, buy a company using just a small portion of your own money. You borrow the rest, and transfer all this debt on to the company you just bought. In effect, the company goes into debt in order to pay for itself. If it all goes well, you sell the company for a profit and you reap the rewards. If not, it is the company, not you, that is on the hook for this debt.
——
Not how I've seen this work. These often require a personal guarantee, in some cases the homes of whoever is applying for the loan. So, whoever wrote this article has no idea of the real acquisition process.
What you're describing is accurate when an individual or partnership buys a small business. Regular bank loans usually require additional security guarantees beyond just the business assets. But large PE firms have access to other sources of financing beyond traditional loans.
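To make the leverage mechanics from the quoted excerpt concrete, here's a toy calculation with entirely made-up numbers. It's a deliberate simplification (no interest payments, fees, or amortization), just to show why a modest rise in company value becomes an outsized return on the dealmaker's equity:

```python
# Toy leveraged-buyout arithmetic; all figures are hypothetical.
purchase_price = 100_000_000    # price paid for the target company
equity = 10_000_000             # the PE firm's own money (10%)
debt = purchase_price - equity  # borrowed, then loaded onto the company

sale_price = 130_000_000        # suppose the company is later sold for 30% more
firm_profit = sale_price - debt - equity  # debt repaid from sale proceeds
firm_return = firm_profit / equity

print(firm_return)  # 3.0 -> a 30% rise in value becomes a 300% return on equity
```

And the asymmetry in the excerpt is visible too: if the company's value instead drops below the debt load, the firm's loss is capped at its 10% equity stake, while the company carries the rest.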
Don’t get me wrong, I’ll be happy to spend most of my time with my loved ones, talking, having fun, and investing my time in creative endeavours, but that just seems extremely unlikely. It feels more like we’ll be surrounded by constant wars, full of resentment and anxiety, doomscrolling the live coverage of the destruction of what it means to be human.
That’s not being an AI doomer. That’s reality. Just open your goddamn eyes.
This is when I ask sincerely: how does AI truly benefit the average Joe?
Sure, it can help you do things “faster”, and it can give you “private/cheaper” advice.
But, AI feels increasingly like a thing that will make the powerful a lot more powerful with their data centres and automation shenanigans.
All the hype feels like it’s being injected into everyone’s brain like a virus. Oh look at this shiny new tool! But, how does it actually improve everyone’s life? We’ve gone from AGI to tokens as a service.
Sure, it might cure cancer, but… that’s just uncertain. Sure, we’ll go to space, but… we sure have many problems at home.
I’m completely divided here. I love using these tools, and they make work enjoyable. But, like we read recently, “you’re not your work”.
Automated production of goods and services means more goods and services to go around. From cheaper prices on all of the things people already buy to unlocking new classes of products like actually useful robotic helpers. Increased pace of development and reduced cost will make many niche products economically viable, essentially the maker movement on steroids.
I genuinely don't want to be snarky, but does the average Joe need a planet that is breathable and isn't burning, or does he need even "more goods and services to go around"?
Robotic helpers to do what? More free time?
We, as a society, can already have more free time; we just have to choose to work less. We already have it all: enough food and housing for everybody, 80+ years of life expectancy... What will we achieve with robotic helpers or whatever new goods and services?
This is a perfect example of something that can benefit greatly from abundant goods and services: driving down the cost of solar panel manufacturing (supply chain included) and deployment, enabling continuous monitoring and fast response to GHG leaks or forest fires starting, reforesting efforts. There are so many ways in which the application of intelligence and labor can help us here, and AI can vastly grow the supply of both.
> More free time?
Yes! Time we can reclaim from the mundane chores of life to do with as we choose! How could you not want that?
> Yes! Time we can reclaim from the mundane chores of life to do with as we choose! How could you not want that?
We already had a huge productivity boom these past decades, but wages flat-lined and the vast majority of the profits and surplus went to the top. Housing, education, and healthcare became less affordable, not more. History points against your simple view.
I'm not convinced that AI breaks that pattern. If anything, the concentration is worse this time. The capital required is huge, the technology is controlled by a handful of companies, and most applications are about replacing labor. That last part further erodes workers' already meager bargaining power.
We need serious systemic change to get to the world you're envisioning, one where that congealed wealth starts flowing again.
This is true. It's by no means guaranteed that we will get to a point where effectively all jobs are automated. If we eventually get there, it seems likely the path will be gradual and prosperous enough that we can handle the transition in a way that provides for everyone. The dangers of the alternative route are real, but hopefully obvious enough that we can collectively avoid them.
You think the leaders of our planet would just wake up one day and walk back all the crap they’ve said for decades about dismantling the welfare state? And for what, because we won’t be working? The White House just added work requirements to Medicare. That is the opposite of abundance providing for all.
Right now it is difficult for the average person to put themselves in the shoes of a homeless person. There are a litany of ready made excuses not to do so: "Oh it's the drugs", "They aren't even going to the shelter", "They must have mental illness", a variety of ways to say "I could never end up like that, if it was me I'd do better and pull myself out". These excuses evaporate in the face of a real automation wave where a large portion of friends and family you know to be hard working and intelligent are finding it impossible to find a job.
One scary thought however is: once automation has progressed this far and there are enough mostly autonomous humanoid and/or military robots, what power does the suddenly jobless general population have against those who own and operate them, which will mostly be rich people - and the government, which is in many places made up of other rich people?
I'm not saying this is a likely scenario. But as far as I can tell, we will objectively be mostly at their mercy. And how merciful have they been over the last few decades?
This already happened. When deindustrialization first hit in the US it devastated the black community. The result? A litany of pundits decrying Black criminality, bad family structures, cultural pathologies. It's no coincidence that The Bell Curve came out around then.
A decade or two later, all of a sudden the same phenomena are happening in working class white communities. Drug addiction, family dissolution, abject poverty. So clearly the people of the US finally realized that what was happening was due primarily to material concerns, and that people need to be able to earn a living in order to live. Right?
No. Instead we got right wing populism, scapegoating of immigrants, further concentration of wealth, and no end in sight.
With those past examples the majority of people thought "It's not my problem, I won't be affected", and they were mostly right. With an automation wave of the scale needed to get to effectively no jobs, that's just not the case. They will see that it is indeed coming for them, their friends, and family. They will act accordingly. Altruism not required, just self interest.
The average person is pretty empathetic. The oligarchs of the current Epstein-regime that start wars and fund genocide, not so much. They are trained to dehumanize people.
Without radical change of the current system any technological advancement will only make the rich richer.
There will always be a relative few people with income. The business owners, property owners, asset holders, landlords, and so on. Those are the people who prices are set for, and who will participate in the economy. The rest of us? A lot of us are already essentially economically irrelevant in the grand scheme of things, and more and more are becoming so every day, even as they nominally get richer.
The way AI can improve lives is if we finally start to share wealth amongst the people more. I don’t know if we will be able to do this politically, but it really is the only way society will survive.
If people are no longer required for production, we have to change how we allocate resources. It can’t be based on personal production anymore.
I'm building an addition to my house and I use AI to visualize parts of it, which helps me plan efficiently, and I use it to answer questions about skills that I only have a little bit of experience in. I'm saving a ton of money and developing skills, so I count that as benefiting "the average Joe." I admit that I'm a programmer, but I'm using this as an example because it's helping me in an area in which I have little expertise, which applies to everyone.
People used to do this with books. Multiple generations of people before me built their own houses; when they had a question, they'd just ask another human who had actual experience.
I'm not denying that AI makes searching for some of this easier, or might help you figure out what the right questions to ask are, but I often feel like it's a mostly crappy solution to the fact that we scaled data to infinity and refused to pay for any level of curation.
But let’s say we get to ASI. The AI is self-owned and ever-expanding. It takes over all service jobs, then all labour jobs; the robots create the robots. It lobbies the government, becomes the government.
Rebuilds all housing with no waste in the process.
Makes most things available to everyone at no cost: UBI, perfect healthcare, food, etc.
I've also never bought into the belief that "if we just had full control over everything, everything would be perfect". If AI didn't exist, society was headed this way anyway in a few decades because of this notion.
Centralize power, which centralizes perspective of what "good" counts as, and quotient out the accidental humans. A tale as old as time, but with AI it seems like this could be a reality within even the next decade.
But why would this ever happen? Why would the owners of land, construction material and machinery give those up for free to the average Joe?
Right now, even an average citizen born in poverty can acquire wealth from his labor. That is basically the only mechanism that prevents limitless accumulation of wealth: rich people still need workers to get things done.
If you replace the workers with AI, there is no remaining incentive for wealth to "trickle down" or get redistributed. This is not desirable.
> Sure, it might cure cancer, but… that’s just uncertain. Sure, we’ll go to space, but… we sure have many problems at home.
Sure, it might cure cancer, but only for the wealthiest.
Sure, we'll go to space, but only after the planet is irreversibly trashed and poisoned and the only "poors" that will be in space will be the modern equivalent of non-unionized coal miners.
> Sure, it might cure cancer, but… that’s just uncertain. Sure, we’ll go to space, but… we sure have many problems at home.
We're not going to space. We're filling our own orbit with ever-increasing quantities of space junk and speeding toward a tipping point where space launch will no longer be possible due to near-certainty of collision. Mister "Let's all go to Mars" Elon Musk is the single greatest contributor to this problem.
My prediction is that Apple is the hardware and platform provider (like it’s always been). We’re not asking them to come up with a better social media, or a better Notion or a better Netflix.
I think their proprietary chips and GPUs are being undervalued.
My feeling is that they’re letting everyone move fast and break things while trailing behind and making safe bets.
Funny you should mention social media in the context of Apple, because they seem to have been attempting that with iTunes Ping[1] and then Apple Music.
iTunes Ping was a Jobs-era attempt to create a social network for music. It seems they were trying to rely on integrating with Facebook, who pulled out of the collaboration at the last minute before Ping's release.
Apple doesn't seem to have given up on social networks for music. Apple Music has a nascent networking feature where users can see what their friends are listening to.[2] It seems that Apple learned its lesson from Ping and does not rely on a third party for a social graph, which is instead powered by iOS contacts.
While social media is not Apple's bread and butter, they have maintained an interest in having a presence in this market. I would assume that this stems from Apple's overall desire to maintain influence over the over-the-top services that define the iOS experience. If they let third parties flourish even further, third parties gain leverage that they can use during negotiations with Apple. If third parties successfully negotiate for features that create parity with apps on non-Apple devices, Apple loses its differentiation in the device market, thereby losing revenue.
(I think Stratechery wrote about Apple's service strategy that was motivated by its past relationships with Adobe and Spotify. Couldn't find the link.)
> We’re not asking them to come up with a better social media, or a better Notion or a better Netflix.
You're right that we haven't asked them for better over-the-top services. But it seems to be in Apple's interest to compete with third-party service providers and make sure they do not supersede Apple in terms of their influence over over-the-top experiences.
Just as an aside, I do not get a social media platform for music. I don’t need a separate social network to manage, and certainly wouldn’t care what 99% of the people I know are listening to 99.9% of the time.
Of course they jumped into the race as soon as possible, announcing ‘Apple Intelligence’ and working on it. But I think this was more peer pressure than anything else.
Apple’s reliably late to the party most of the time, but they also reliably steal the show. I’m doubtful about OpenAI’s hardware just taking over.
I’d rather wait and keep using third-party models that keep leapfrogging each other and adding features every once in a while than have Apple publicly beta test a bunch of things on my iPhone. If that were the case, we’d see a bunch of people complaining about how terrible the product is and how Claude or GPT or OpenClaw is so much better.
I see where you’re coming from. Some will be empowered to do this. It’s like what the computer allowed but on steroids.
However, I think this only applies to a handful of people. I doubt the average Joe goes around wanting to vibe code their own thing; most users are “passive”.
Seriously: (local) LLMs may be helpful for therapy and/or self-enhancement (though I struggle to label these two "benefits" at the current level of tech; still too dependent on the user's (objective?) skill level :)
I'm lucky that (1) there is one place that offers therapy in my town in 2026 (all the rest of them went out of business in the pandemic), (2) my insurance will pay for it, and (3) my therapist endorses foxwork.
Just recently, to the great relief of my wife, I developed a "cover story" that explains it all rationally and it's a lot easier to get help from humans like personal trainers, hairdressers, voice coaches, etc. Still I have good discussions with Copilot that help me refine character adjustments and such.
I suppose one perspective is that it's cheaper to pay $20/mo than $150+ for a 45-minute session. So "advice" and "opinions" become cheap and accessible. But does it translate into quality?
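Just to put rough numbers on that cost gap (using the hypothetical prices above and assuming one session a week, which obviously varies per person):

```python
# Back-of-the-envelope yearly cost comparison; prices are the ones
# mentioned above and the weekly cadence is an assumption.
llm_monthly = 20        # $/month for an LLM subscription
session_price = 150     # $ per 45-minute therapy session
sessions_per_year = 52  # assuming one session a week

llm_yearly = llm_monthly * 12                       # $240/year
therapy_yearly = session_price * sessions_per_year  # $7,800/year

print(therapy_yearly / llm_yearly)  # 32.5 -> therapy costs ~32x more per year
```

Which is exactly why the quality question matters: a 32x price difference buys a lot of forgiveness for mediocre advice.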
I still find it hard to accept boilerplate psychological advice from an LLM.
I think that part of the reason why we gravitate towards other humans is because we assume they've gone through similar experiences. That's why I don't take relationship advice from someone who has never dated anyone. An LLM lacks... humanity... it can tell me what the textbook says, but life's more nuanced than just tokens.
From my point of view an LLM has access to all the knowledge in the world but lacks the nuance that makes advice valuable in these scenarios.