It's always extremely funny reading Wikipedia articles about a country's customs. For the UK:
>Bread is always served and can be placed on the table cloth itself
This is extremely rare, to the point where I can't remember the last time I saw it. Is bread really.. always served?
> In the United Kingdom, the fork tines face upward while sitting on the table.
Tines down isn't uncommon in the UK either
>if a knife is not needed – such as when eating pasta – the fork can be held in the right hand
I mean it can be, but it's fairly uncommon
>it is permissible to place a small piece of bread at the end of the fork for dipping
It's also 100% fine to dip bread in a sauce with your fingers. Putting bread on a fork you've licked and then dipping the bread would cause everyone to hate you, so *don't do this*
At any kind of formal dining? Yes, absolutely, I would expect there to be a bread roll & a pat of butter served at the beginning of the meal. Both in restaurants & formal dinners in my experience.
It's not an absolute rule though & you generally wouldn't expect bread to be served like this at home in the UK. I think the French are more likely to serve bread at home as well.
> >if a knife is not needed – such as when eating pasta – the fork can be held in the right hand
> I mean it can be, but it's fairly uncommon
So the norm is that if you're eating one-handed, you use your non-dominant hand? That seems really counterintuitive to me; is it because you're so used to having the fork in the non-dominant hand that it feels awkward the other way? Which hand do you use when eating with a spoon?
Spoons always go in the right hand (e.g. fork and spoon), but yes, I'd say people usually use the fork in the non-dominant hand. Fork in the right hand is slightly 'uncouth', possibly due to its American associations
Far too many people treat AI as a way to launder copyright. It seems likely that a lot of the current state of outright plagiarism won't stand up in court
These cases will be settled out of court long before they ever reach a jury. Anthropic has agreed to pay $1.5bn in a class action suit [0]. Others will follow.
Eh, that's a bit of bullshit; I've seen FLOSS Mario/Puzzle Bobble/Pang clones and such since forever, and no one was sued.
Heck, back in the day Rogue was proprietary and commercial (and thanks to that we got both the roguelike genre and the curses library), and yet Hack was born as a libre clone, and from Hack we got the now uber-known NetHack and forks like Slashem.
Cloning commercial games is older than Windows 95 itself, and probably as old as the NES.
The site https://osgameclones.com has so many examples that your whole point has been invalidated since the first Hack release for Unix. And Tetris for Terminals, MS-DOS, and the like.
Hell, in the '90s everyone in Europe (children of blue-collar workers) got a Russian Tetris clone (oh the irony) called Brick Game, often with several micro low-res clones of commercial games such as Frogger and Battle Tank. No one ever sued that company, even though the Tetris concept itself was for sure patented and copyrighted.
And that game probably sold in the millions, maybe even more than the Game Boy if we count every clone sold with a different plastic case, because you could get one for the price of a book, and today for less than the price of a fast food meal.
At the same time, the open source community has absolutely no responsibility to make Atari a profit here either. The outcome here is simply that open source is getting screwed over
It isn't kind-hearted. Them trying to shut down OpenTTD would lead to a gigantic clusterfuck that would hurt their sales. This is them trying to remove, as much as possible, a direct competitor to the new game they're releasing, without generating community backlash, in order to maximise profits
It might have been "screwed over" if there were no access to the OSS game. But you can still download the game from their website. They just don't want it appearing as a competitor on the Steam/GOG platforms, so they bundled the OSS version. Both sides thought this was a reasonable resolution, so I don't see any "screwing over" here.
Open source is a culture that includes its users. Open source is getting screwed over because, at the first whiff of a capitalist losing a buck, open source retreated and hid.
The game is still distributed freely through the internet, only restricted on the main commercial platforms. I think that commercial platforms and "open source culture" will sometimes inevitably clash. Open source culture requires de facto non-reliance on such platforms anyway.
It took a long time for Python 3 to add the necessary backwards-compatibility features to allow people to switch over. Once they did it was fine, but it was a massive fuck-up until then. The migration took far longer than it should have.
It's widely regarded as a disaster for good reason, and it forced some corrections in Python to fix it. Just because it's fine now does not mean it was always fine
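For what it's worth, the kind of bridging code that eventually made single-codebase migration practical looked roughly like this (a minimal sketch; the `__future__` imports are real, `describe` is just an illustrative name):

```python
# Bridging imports like these (added to Python 2.6/2.7 over time) are what
# eventually let one codebase run under both 2.x and 3.x.
from __future__ import print_function, division, unicode_literals

def describe(ratio):
    # With the division import, 1/2 is 0.5 on Python 2 as well,
    # matching Python 3 semantics instead of truncating to 0.
    return "ratio is {}".format(ratio)

print(describe(1 / 2))  # prints "ratio is 0.5" on both 2.7 and 3.x
```

Early in the migration far less of this existed, which is a big part of why it dragged on so long.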
Fibers are primarily for when you have a problem which is easily expressible as thread-per-unit-of-work, but you want N to be large. They can be useful for, e.g., a job system as well, and in that case the primary advantage is the extremely low context-switch time, as well as the manual yielding
There are lots of problems where I wouldn't recommend fibers, though
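To make the model concrete, here's a toy sketch using Python generators as stand-ins for fibers (real fiber libraries switch stacks in user space; all names here are illustrative, not any particular library's API):

```python
from collections import deque

def fiber(job_id, steps):
    # Each fiber is written as straight-line thread-per-unit-of-work code;
    # yield is the manual context switch back to the scheduler.
    for _ in range(steps):
        # ... do one unit of work ...
        yield
    return job_id  # delivered via StopIteration.value (Python 3.3+)

def run_all(fibers):
    # Round-robin cooperative scheduler: a "context switch" is just
    # resuming a generator, far cheaper than an OS thread switch.
    finished = []
    ready = deque(fibers)
    while ready:
        f = ready.popleft()
        try:
            next(f)
            ready.append(f)  # still has work: reschedule
        except StopIteration as done:
            finished.append(done.value)
    return finished

# N can be large: each fiber is a small heap object, not an OS thread.
jobs = run_all(fiber(i, 3) for i in range(10000))
```

The point of the sketch is the shape, not the performance: each unit of work reads as sequential code, and the scheduler decides when it runs.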
I think this is missing the reason why these APIs are designed like this: because they're convenient and intuitive
It's rare that this kind of performance matters, or that the minor imprecisions of this kind of code matter at all. While it's certainly true that we can write a better composite function, it also means that.. we have to write a completely new function for it
Breaking things up into simple, easy-to-understand, reusable representations is good. The complex part about this kind of maths is not the code, it's breaking up what you're trying to do into a set of abstracted concepts so that it doesn't turn into a maintenance nightmare
Where this really shows up more obviously is in more real-world libraries: axis-angle rotations are probably a strong type with a lot of useful functions attached, to make your life easier. For maths there is always an abstraction penalty, but it's usually worth the time saved, because 99.9999% of the time it simply doesn't matter
Add on top of this that this code would be optimised away with -ffast-math, and it's not really relevant most of the time. I think everyone goes through a period when they think "lots of this trig is redundant, oh no!", but the software engineering generally takes priority
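As a rough illustration of the trade-off (function names are mine, not from any real library): the convenient composite version does twice the trig work of the hand-fused one, and the difference in the result is floating-point noise:

```python
import math

def rotate(point, angle):
    # Simple, reusable building block: rotate a 2D point about the origin.
    x, y = point
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * y, s * x + c * y)

# The convenient, intuitive composite: two rotations, four trig calls.
def rotate_twice(point, a, b):
    return rotate(rotate(point, a), b)

# The hand-fused version: one rotation, two trig calls.
def rotate_fused(point, a, b):
    return rotate(point, a + b)
```

Both agree to within floating-point noise; the fused version only earns its keep in a genuinely hot loop.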
Based on my experience writing many games that work great barring the occasional random physics engine explosion, I suspect that trigonometry is responsible for a significant proportion of glitches.
I think over the years I subconsciously learned to avoid trig because of the issues mentioned, but I do still fall back to angles, especially for things like camera rotation. I am curious how far the OP goes with this crusade in their production code.
Yes, for physics engines I think that's a very good use case where it's worth the extra complexity for robustness. Generally, I think if errors (or especially NaNs) can meaningfully compound, i.e. if you have persistent state, that's when it's a good idea to do a deeper investigation
My experience is that it's really easy to subtly fuck something up if you're doing a bunch of trig in code. If there's something subtly wrong somewhere, everything seems to work for a while, then one day you hit gimbal lock. Then you have to add a special case. Then you hit gimbal lock somewhere else in the code. Or you have tan spit out +/- infinity or NaN. Another special case. Or you have acos or asin in their degenerate regions, where the minor imprecision isn't minor anymore, it's catastrophic imprecision. Another special case.
Trig-heavy code will work 0% of the time if you have an obvious bug, or will work 99% of the time if you have a subtle bug, and once you start chasing that long tail you're just adding 9s and will never get to 100%. And if you have code that will run thousands/millions of times per frame, you need a lot of 9s to make sure a user can get through minutes or hours of using your software without hitting bugs.
Doing the same work sticking strictly to vectors and matrices tends to either not work at all or be bulletproof.
The other thing is that trig tends to build complexity very quickly. It's fine if you're doing a single rotation and a single translation, but once you start composing nested transformations it all goes to shit.
Or maybe you're substantially better at trig than I am. I've only been doing trig for 30 years, so I still have a lot to learn before I stop making the same sophomore mistakes.
I guess the point is: How often do we really need actual angles in the code? Probably only at the very ends: input from users and output to users. Everywhere else, we should just be treating them as sin/cos pairs or dot/cross pairs. So when the user inputs an angle, immediately convert it to what the computer actually needs, store it that way throughout the computation, and then only if/when the user needs to see an actual angle would you need to convert it back.
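A minimal sketch of what that looks like in practice (all names are illustrative): convert once at the boundary, compose rotations with the angle-addition formulas, and only call `atan2` when a human needs to see a number again:

```python
import math

def from_angle(theta):
    # Boundary: the user gave us an angle; convert it once.
    return (math.cos(theta), math.sin(theta))

def compose(r1, r2):
    # Angle-addition formulas (i.e. complex multiplication): no trig calls.
    c1, s1 = r1
    c2, s2 = r2
    return (c1 * c2 - s1 * s2, s1 * c2 + c1 * s2)

def apply_rotation(r, point):
    # Rotate a 2D point using the stored (cos, sin) pair.
    c, s = r
    x, y = point
    return (c * x - s * y, s * x + c * y)

def to_angle(r):
    # Boundary: only when the user needs to see an angle again.
    return math.atan2(r[1], r[0])
```

Everything in the middle is multiplies and adds, so there are no degenerate trig regions to special-case.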
This isn't dissimilar to Deathworld 2, where a futuristic guy crash-lands on a planet and has to reinvent modern technology for a Mongolian-style culture. I'm a big fan
There's also The Lost Regiment[1] series, about a Maine regiment from the American Civil War transported to an alien planet. They discover that medieval Russian peasants were previously transported there and now live as serfs/peasants under nomadic alien warlords (IIRC the aliens periodically cull the humans for food). The Union boys, in tremendously fun if a bit predictable style, lead a peasant rebellion against the aliens.
I work in tech, and I think the worst part is seeing all the pieces of catastrophe that have had to come together to make AI dominate
There are several factors which are super depressing:
1. Economic productivity, and what it means for a company to be successful have become detached from producing good high quality products. The stock market is the endgame now
2. AI is attempting to strongly reject the notion that developers understanding their code is good. This is objectively wrong, but it's an intangible skill that makes developers hard to replace, which is why management is so desperate for it
3. Developers had too much individual power, and AI feels like a modern attempt at busting the power of the workforce rather than a genuine attempt at a productivity increase
4. It has always been possible to trade long term productivity for short term gains. Being a senior developer means understanding this tradeoff, and resisting management pressure to push something out NOW that will screw you over later
5. The only way AI saves time in the long term is if you don't review its output to understand it as well as if you'd written it yourself. Understanding the code, and the large-scale architecture, is critical. It's a negative time saving if you want to write high-long-term-productivity code, because we've introduced an extra step
6. Many developers simply do not care about writing good code unfortunately, you just crank out any ol' crap. As long as you don't get fired, you're doing your job well enough. Who cares about making a product anymore, it doesn't matter. AI lets you do a bad job with much less effort than before
7. None of this is working. AI is not causing projects to get pushed out faster. There are no good high quality AI projects. The quality of code is going down, not up. Open source software is getting screwed
It's an extension of the culture where performance doesn't matter. Windows is all made of React components which are each individually a web browser, because the quality of the end product no longer matters. Software just becomes shittier, because none of these companies actually care about their products. AAA gaming is a good example of this, as are Windows, Discord, anything Google makes, IBM, Intel, AMD's software, etc.
A lot of this is a US problem, because of the economic conditions over there and the prevalence of insane venture capitalism and union busting. I have a feeling that as the EU becomes more independent and starts to become a software competitor, the US tech market is going to absolutely implode
How is this faster than just reading the documentation? Given that LLMs hallucinate, you have to double check everything it says against the docs anyway
I learn fastest from the examples, from application of the skill/knowledge - with explanations.
AIs allowed me to get up to speed with Python MUCH faster than I was managing by myself, and to understand more of the arcane secrets of jq in 6 months than I had in a few years before.
And the AI's mistakes are a brilliant opportunity to debug, to analyse, and to go back to it saying "I beg your pardon, wth is this" :) pointing at the elementary mistakes you now see because you understand the flow better.
Recently I had a fantastic back-and-forth with Claude about one of my precious tools written in Python: I was trying to understand the specifics of a particular function's behaviour, discussing typing, arguing about trade-offs and portability. The thing I really like is that I always get pushback, or things to consider, if I come up with something stupid.
It's a tailored team exercise and I'm enjoying it.
Windows API docs for older Win32 stuff are extremely barebones. WinRT is better, but can still be confusing.
I think AI is really great for getting started with systems programming, as you can tailor the responses to your level, ask it to solve specific build issues, and so on. You can also ask more obscure questions and it will at least point you in the right direction.
Apple docs are also not the best for learning, so I think AI is great as a documentation browser that auto-generates examples.
Human teachers make mistakes too. If you aren't consuming information with a skeptical eye you're not learning as effectively as you could be no matter what the source is.
The trick to learning with LLMs is to treat them as one of multiple sources of information, and work with those sources to build your own robust mental model of how things work.
If you exclusively rely on official documentation you'll miss out on things that the documentation doesn't cover.
If I have to treat LLMs as a fallible source of information, why wouldn't I just go right to the source though? Having an extra step in between me and the actual truth seems pointless
If the WinAPI docs are solid you can do things like copy and paste pages of them into Claude and ask a question, rather than manually scanning through them looking for the answer yourself.
Apple's developer documentation is mostly awful - try finding out how to use the sips or sandbox-exec CLI tools for example. LLMs have unlocked those for me.
If you're good at programming you can usually tell exactly why it worked or didn't work. That's how we've all worked before coding agents came along too - you don't blindly assume the snippet you pasted off StackOverflow will work, you try it and poke at it and use it to build a firm mental model of whether it's the right thing or not.
Sure. A big part of how I'd know that the function I'm calling does what I think it does is by reading the source documentation associated with it
Does it have any threading preconditions? Any weird quirks? Any strange UB? That's stuff you can't find out just by testing. You can ask the LLM, but then you have to read the docs anyway to check its answer
Except you have no idea if what the LLM is telling you is true
I do a lot of astrophysics. Universally, LLMs are wrong about nearly every astrophysics question I've asked them, even the basic ones, in every model I've ever tested. It's terrifying that people take these at face value
For research at a PhD level, they have absolutely no idea what's going on. They just make up plausible sounding rubbish
Astrophysicist David Kipping had a podcast episode a month ago reporting that LLMs are working shockingly well for him, as well as for the faculty at the IAS.[1]
It's curious how different people come to very different conclusions about the usefulness of LLMs.
The answer it gave was totally wrong. It's not a hard question. I asked it this question again today, and some of it was right (!). This is such a low bar for basic questions
Why does it matter? We have table of contents, index and references for books and other contents. That’s a lot of navigational aid. Also they help in providing you a general overview of the domain.
Bam, that's the single source of truth right there. Microsoft's docs are pretty great
If I use an LLM, I have to ask it for the documentation about "GetQueuedCompletionStatus". Then I have to double-check its output, because LLMs hallucinate
Double-checking its output involves googling "GetQueuedCompletionStatus" and finding this page:
I have not done win32 programming in 12 years. Maybe you've done it more recently. I'll use an LLM and you look up things manually. We can see who can build a win32 admin UI that shows a realtime view of every open file by process, with sorting, filtering, and search on both the files and process/command names.
I estimate this will take me 5 minutes
Would you like to race?
This mentality is fundamentally why I think AI is not that useful, it completely underscores everything that's wrong with software engineering and what makes a very poor quality senior developer
I'll write an application without AI that has to be maintained for 5 years with an ever evolving featureset, and you can write your own with AI, and see which codebase is easiest to maintain, the most productive to add new features to, and has the fewest bugs and best performance
Sure let's do it. I am pretty confident mine will be more maintainable, because I am an extremely good software engineer, AI is a powerful tool, and I use AI very effectively
I would literally claim that with AI I can work faster and produce higher quality output than any other software engineer who is not using AI. Soon that will be true for all software engineers using AI.