No it doesn't.
If the content at a URL changes, then the only way to get reproducibility is caching.
You tell Nix the expected content hash, and it looks that value up in the Nix store.
Note that it will match anything with that content hash, so it is absolutely possible to tell it the wrong hash.
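A minimal sketch of what that looks like (assuming fetchurl from nixpkgs is in scope; the URL and hash here are placeholders, not real values):

  fetchurl {
    url = "https://example.org/src-1.0.tar.gz";
    # Fixed-output fetch: Nix trusts this declared hash. If any store
    # path already matches it, that path is reused without downloading,
    # even if the URL now serves different content.
    sha256 = "sha256-<replace with the real hash>";
  }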
Producing different outputs isn't the Dockerfile's fault.
A Dockerfile doesn't enforce reproducibility, but reproducibility can be achieved with it (for example, by pinning base images to digests rather than tags).
Nix isn't some magical thing that makes builds reproducible either.
Nix simply pins build inputs and relies on caches.
nixpkgs is entirely git-based, so you end up pinning the entire package tree.
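For example, a common pattern is to import nixpkgs from a specific commit (the rev and hash below are placeholders):

  let
    pkgs = import (fetchTarball {
      # Pins the whole package tree to one nixpkgs commit.
      url = "https://github.com/NixOS/nixpkgs/archive/<rev>.tar.gz";
      sha256 = "<hash of the unpacked tarball>";
    }) { };
  in
    pkgs.hello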
I don't agree that the code is cheap.
It doesn't require a pipeline of people to be trained and that is huge, but it's not cheap.
Tokens are expensive.
We don't know what the actual cost is yet.
We have startups, none of which are turning a profit, buying up all the capacity of the supply chain.
There are so many impacts here that we don't have the data on.
You completely ignored the post you're replying to.
To recap, the author disagrees that writing code is cheap, because we've collectively invested trillions of dollars and redirected entire supply chains into automating code generation. The externalities will be paid for generations to come by all of humanity; it's just not reflected in your Claude subscription.
GP is not totally ignoring the post he replied to: we have open models that are basically six months behind closed SOTA models, we can run them in the cloud, and we know exactly how much they cost to run.
The cat is out of the bag: compute will keep getting cheaper, as it has for the last 60 years or so.
It's always been maintenance that's the killer, and GP is totally right about that.
And if we look at a company like Cloudflare, which went about five years without a serious outage and then had five serious outages in the six months since it drank the AI Kool-Aid, we kind of have a first data point on how amazing AI is from a maintenance point of view.
We all know we're generating more lines of underperforming, insecure, probably buggy code than ever before.
Maintaining it is becoming more costly. The increasing burden of review on FOSS maintainers is one example. AWS going down because an agent decided to re-write a piece of critical infrastructure is another. We are rapidly creating new kinds of liability.
Unlikely. FOSS mostly runs on zero-cost maintenance, but AI tools need money to burn. So only a few FOSS projects will receive sponsored tooling, and some will definitely refuse to use it for ideological reasons (for example, it could be considered a poison pill from a copyright perspective).
We kind of do? Local models (though not state of the art) set a floor on this.
Even if prices are subsidized now (they are), that doesn't mean they will be more expensive later; e.g., if the bubble deflates, hardware, electricity, and talent could all get cheaper.
Or, and bear with me here, there is a problem even if you aren't experiencing it.
I've been using Spotlight since it was introduced, for... everything.
In Tahoe it has been absolutely terrible. Unusable.
Always indexing.
Never showing me applications, which is the main thing I use it for (yes, it is configured to show applications!).
They broke something.
Not seeing complaints doesn't mean they don't exist.
Not to mention the UI latency common in Electron apps, which is a constant low-level annoyance.