Every generation of builders believed their tools defined their value. Then the tools got easier, faster, automated, and the definition had to change.
But programming didn’t disappear. Writing didn’t disappear. Designing didn’t disappear.
AI flips the equation: when creation becomes cheap, value shifts from how much you can produce to what changes because you showed up. The ability to have a positive impact has actually expanded.
I don’t think this had much to do with AI at all. It has to do with ‘teaching kids in a gym’ versus ‘sitting in front of a screen in an office’.
“Every generation of builders believed their tools defined their value.”
People anchor identity to the hardest part of their work.
• Assembly → craftsmanship
• Hand coding → engineering skill
• Complex stacks → seniority
• Writing longform → intellectual authority
The difficulty of the tool becomes proof of worth.
“If few people can do this, then my contribution matters.”
So value feels intrinsic to the technical act itself.
⸻
“Then the tools got easier, faster, automated, and the definition had to change.”
Historically, this always happens.
Compilers replaced assembly expertise.
Frameworks replaced boilerplate knowledge.
Cloud replaced infrastructure mastery.
AI replaces a lot of implementation effort.
Each time, people initially interpret it as:
the skill is dying
But what actually dies is the old measurement of importance.
Value moves up a level:
from execution → judgment → direction → taste → responsibility
The activities persist; what changes is why they matter.
You still code, but code is no longer scarce.
You still write, but writing is no longer the bottleneck.
You still design, but layout isn’t the achievement.
The work shifts from producing artifacts to choosing which artifacts deserve to exist.
⸻
“AI flips the equation: when creation becomes cheap, value shifts from how much you can produce to what changes because you showed up.”
This is the core claim.
Old model:
effort → output → value
New model:
judgment → outcome → value
Previously you proved worth by volume, speed, or complexity.
Now production is abundant, so the scarce thing is:
causal impact
Not:
Did you make something?
But:
Did reality change because you were involved?
You’re moving from manufacturing to intervention.
⸻
“The ability to have a positive impact has actually expanded.”
So the post ends on an optimistic note.
AI doesn’t reduce agency; it removes the cost barrier to acting.
Before:
You needed a team, funding, or org permission to affect the world.
Now:
A single person can teach, fix, organize, build tools, or help communities directly.
Meaning shifts from scale (reach) to consequence (effect).
⸻
In one sentence
The post argues that AI doesn’t eliminate human contribution; it removes technical scarcity, forcing value to relocate from producing things to changing outcomes.
There are still huge gaps in our understanding: quantum gravity, dark matter, what happens before Planck time, the thermodynamics of life, and many others.
Part of the problem is that building bigger colliders, telescopes, and gravitational wave detectors requires huge resources and very powerful computers to store and crunch all the data.
Right now we're cutting research instead of funding it, and sending our brightest researchers to Europe and China...
Gravity is coming back to Silicon Valley: workers are realizing that the Bell Labs image they were sold was mostly innovation theater, hoarding talent for websites overstuffed with ads designed to manipulate users into buying junk.
It's good for YC to do this, and it will benefit every startup in the long run. Google has been one of the sources of the AI boom, and provides liquidity by acquiring startups. But, as YC argues, they've monopolised distribution channels to the point where you need to go through the Google toll booth every time you want to access the market. This tax on founders to reach their audience makes many types of businesses unsustainable or outright impossible, especially for products where usage != sharing.
You mean kills potentially successful tech companies of the future by acquiring startups to cement their dominance.
I get why people on this site are in love with the idea of building an unsustainable, money-losing business where the only path to success is being acquired by a tech giant. It's like winning the lottery! But it helps nobody, it hurts your customers/users, and it hurts innovation. It's also stupid, as a successful tech company could potentially grow as big as the giants you're courting (ESPECIALLY now that the FTC has finally started doing its job). Why else do you think they're spending so much money to acquire you? It's easier on the ego to call it an "acquihire", but the truth is that they're just paying a maintenance tax on their monopoly.
Every time people complain about how detrimental big tech is to society, it ultimately comes down to this sad strategy.
Of course it is good for YC to do this. They have significant investments in OpenAI, both directly and indirectly through the countless startups they've funded whose core product is built on OpenAI.
And it's ridiculous to act like (a) you are forced to go through Google to access the 'market' and (b) that this is somehow unusual or untoward. They are an advertising company and not the only one.
The decision can be read in the larger political context. There was some controversy a while back on certain directions the Foundation took like the project on sanitation (aka the toilet challenge) and the backing of charter schools. Regardless of one's opinion of those, he is taking a stand and drawing a line in the sand.
Interesting that some people here are advocating for fewer investments and board seats when angel investing. It's rather the opposite: you need more investments and less time spent on companies.
Angel investors are vital to the startup community, as they take a lot of risk and are able to fund ideas that sound too crazy to get regular funding. In this case, though, given a hit rate of, let's say, 1% for argument's sake, 54 investments in 15 years is too low. You need hundreds of investments to make the math work. Also, doing it for fun and to learn is laudable, but in the end hitting an Uber or an Airbnb is all that matters. It sounds like the job became full-time consulting, which is great if you're wealthy and want to give back to the community, but can become unsustainable otherwise.
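The "hundreds of investments" point can be sanity-checked with a quick back-of-the-envelope calculation, assuming independent bets and the illustrative 1% hit rate above (both are simplifying assumptions, not real portfolio data):

```python
def p_at_least_one_hit(n: int, p: float = 0.01) -> float:
    """Probability of at least one hit among n independent
    investments, each succeeding with probability p."""
    return 1 - (1 - p) ** n

for n in (54, 100, 300):
    print(f"{n} investments -> {p_at_least_one_hit(n):.0%} chance of a hit")
```

With these numbers, 54 bets leaves you more likely than not to miss entirely, while a few hundred bets pushes the odds of at least one hit above 90%.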
OpenAI has a massive brand advantage because the general public equates their products with AI.
Even if they don't 100% figure out agents, they are now big enough that they can acquire those that do.
If the future is mostly about the app layer, then they'll be very aggressive in consolidating, the same way Facebook did with social media; see, for example, Windsurf.