> This comes up now as “is vibecoding sane if LLMs are nondeterministic?” Again: do you want the CS answer, or the engineering answer?
Determinism would help you. With a bit of engineering, you could make LLMs deterministic: basically, fix the random seed for the PRNG and make sure none of the other sources of entropy mentioned earlier in the article contribute.
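As a toy illustration of the seeding point (a sketch of the idea, not how any real inference stack works): once the PRNG seed is fixed and no other entropy leaks in, even temperature sampling becomes fully repeatable. The `sample_token`/`sample_run` helpers below are hypothetical names for illustration.

```python
import math
import random

def sample_token(logits, temperature, rng):
    """Temperature-sample an index from a toy logit vector."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]  # numerically stable softmax
    total = sum(exps)
    probs = [e / total for e in exps]
    return rng.choices(range(len(probs)), weights=probs, k=1)[0]

def sample_run(logits, temperature, seed, n):
    """One 'generation': a fresh PRNG seeded identically each time."""
    rng = random.Random(seed)
    return [sample_token(logits, temperature, rng) for _ in range(n)]

logits = [2.0, 1.0, 0.5, 0.1]
# Same seed, same sequence of sampled tokens, every time.
assert sample_run(logits, 0.8, 42, 5) == sample_run(logits, 0.8, 42, 5)
```

The hard part in practice isn't the seed, it's the "other sources of entropy": batching, kernel selection, and scheduling, which is where the GPU discussion below comes in.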
But that barely impacts any of the issues people bring up with LLMs.
That only holds if you've got a fixed GPU that doesn't change at all during the process, I think? If you switch GPUs (even to another one of the same model), or run it long enough, won't accumulated rounding differences feed forward and produce different results?
The rounding itself is deterministic, but the operations leading up to what gets rounded are not associative, and the scheduling of the warps/wavefronts isn't guaranteed.
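The non-associativity point is easy to demonstrate even on a CPU (a minimal sketch; real GPU reductions hit the same effect when warp scheduling changes the summation order):

```python
# IEEE-754 addition is not associative: the grouping changes what gets rounded.
a, b, c = 1e20, -1e20, 1.0

left = (a + b) + c   # (1e20 - 1e20) + 1.0  ->  0.0 + 1.0  ->  1.0
right = a + (b + c)  # -1e20 + 1.0 rounds back to -1e20, so the sum is 0.0

assert left == 1.0
assert right == 0.0
```

So a parallel reduction that sums the same values in a different order can legitimately round to a different result, with each individual rounding step being perfectly deterministic.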
And determinism isn’t particularly helpful with compilers. We expect adherence to some sort of spec. A compiler that emits radically different code depending on how much whitespace you put between tokens could still be completely deterministic, but it’s not the kind of tool we want to be using.
Determinism is a red herring. What matters is how rigorous the relationship is between the input and the output. Compilers can be used in automated pipelines because that relationship is rigorous.
The problem you pointed out is real, but determinism in compilers is still useful!
Suppose you had one of those wildly unstable compilers: concretely, if you change the formatting slightly, you get a totally different binary. It still does the same thing as per the language spec, but it goes about it in a completely different way.
This weak determinism is still useful, because you can still get reproducible builds. E.g. volunteers can audit Debian binary packages by just re-running the compiler with the exact same input and checking that the output matches. So they can verify that no supply chain attack has fiddled with the binaries: at least the binaries correspond to the sources they are claimed to.
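The audit step amounts to little more than rebuilding and comparing hashes. A toy sketch (the `build` function is a hypothetical stand-in for a deterministic compiler, not the real Debian reproducible-builds tooling):

```python
import hashlib

def build(source: str) -> bytes:
    """Hypothetical deterministic compiler: same source, same bytes out."""
    return source.upper().encode()

def audit(source: str, published_binary: bytes) -> bool:
    """Independently rebuild from source and check the shipped binary matches."""
    rebuilt = build(source)
    return hashlib.sha256(rebuilt).digest() == hashlib.sha256(published_binary).digest()

src = "int main(void) { return 0; }"
official = build(src)               # what the distro shipped

assert audit(src, official)         # a volunteer's rebuild matches
assert not audit(src, b"tampered")  # a supply-chain-modified binary is caught
```

Note this only needs input-to-output determinism, not any stability under input perturbation, which is exactly the distinction being argued above.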
The argument is that determinism in compilers isn't particularly important for building software because we did without it for a long time.
Your argument would be... that building software isn't particularly important for building software...?
The actual argument you'd be making would be something like, building software isn't particularly important for survival. Which is pretty obviously true, for the reason you state.
And the reason that relationship can be rigorous is that compilers by definition translate one formal language into another. You can’t have a compiler that translates English to machine code in a rigorous, repeatable manner, because English is ambiguous.
If you're submitting a CVE for a primitive that seems likely to be useful for further exploitation, mark it as such. That's not the case for ReDOS or the vast majority of DoS, it's already largely the case that you'd mark something as "privesc" or "rce" if you believe it provides that capability without necessarily having a full, reliable exploit.
Maybe. But at least everyone being on the same (new) version makes things simpler, compared to everyone being on different random versions of whatever used to be current when they were written.
> The kicker is that updating the dependencies probably just introduces new CVEs to be discovered later down the line because most software does not backport fixes.
I don't understand how the second part of that sentence is connected to the first.
I could have written it more clearly. If you’re forced to upgrade dependencies to the latest version to get a patch, the upgrade likely contains new unrelated code that adds more CVEs. When fixes are backported you can get the patch knowing you aren’t introducing any new CVEs.
Dvorak works really well for me. (Though you might want to pick Colemak or Neo2 these days.) I use Dvorak on both my Kinesis Advantage and on 'normal' keyboards like on a laptop.
My personal experience after switching to Colemak is mostly neutral. Speed is about the same after some training, around 70 WPM. Comfort maybe improved a bit, but nothing life-changing.
Some people claim that they went from 60 WPM on Qwerty to over 100 WPM on some other newly designed layout, but my experience is clear: if you do it for the speed you will be disappointed.
I'd guess the speed improvement in those cases likely came from learning a better technique, like touch typing and using more of your fingers. Afaik a lot, if not most, of the fastest typists are still on qwerty.
It's surprisingly easy to get away with murder (literally and figuratively) without piercing the corporate veil if you understand the rules of the game. Running decisions through a good law firm also “helps” a lot.
A bit over five years ago, someone struck and killed my friend in a crosswalk. He was a fellow PhD student. It was on a road with a 30mph limit but where people regularly speed to 50+mph.
He was an international student from Vietnam. His family woke up one day, got a phone call, and learned he was killed. I guess there was nobody to press charges.
She never faced any accountability for the 'accident'. She gets to live her life, and she now runs a puppetry education program for children. Her name even seems to have been scrubbed from most of the articles about her killing my friend.
So, I think about this regularly.
I was a cyclist at the time so I was aware of how common this injustice was, but that was the first time it hit so close to home. I moved into a large city and every cyclist I've met here (every!) has been hit by a car, and the car driver effectively got only a slap on the wrist. It's just so common.
> Her name even seems to have been scrubbed from most of the articles about her killing my friend.
I'm somewhat surprised there were even articles? Are road fatalities uncommon enough in the US that everyone gets written up? Or was this a special enough one?
Not sure if this is true for every university, but when someone in the community dies, especially a student, there's usually at least an article about it.