I agree with your comment. While reading the article, I had sympathy for the author, but also unintentionally pictured them as a mix of all of the "wizard" seniors I have worked with over the years. These are the type of people who, when pair programming, constantly point out what they perceive as problems with your development setup, IDE, keyboard-macro skills, lack of tiling layout, etc etc. Not to mention what they will suggest on your actual PRs.
At the end of the day, I like the mental model of programming, and I am somewhat uninterested in shaving every millimeter of friction off of every surface I touch on my computer. Does that make me a worse programmer? Maybe? I still delivered plenty of high quality code.
> constantly point out what they perceive as problems with...
Yeah, screw those people. I count myself as lucky that I've only worked with one person who was seriously CRITICAL of the way others worked... beyond just code quality. However, I always enjoyed a good discussion about the various differences in how people worked, as long as they could accept there's no "right" way. That's what the article brought up for me, and I wonder how much that happens these days.
One of my fondest memories was sitting around with a few other devs after work, and one had started learning Go pretty soon after its public release... and he would show us some new, cool thing he was playing around with. Of course those kinds of organic things stopped with remote work, and I wonder how much THAT has played into the loss of identity?
> At the end of the day, I like the mental model of programming, and I am somewhat uninterested in shaving every millimeter of friction off of every surface I touch on my computer. Does that make me a worse programmer? Maybe? I still delivered plenty of high quality code.
In every pursuit there are secretly three different communities: those who love doing the thing, those who love tinkering with the gear, and those who love talking about it.
HackerNews and the internet in general are dominated by people who like to nerd out about the gear of programming (IDEs, text editors, languages, agent setups, …) and the people who like to talk about programming. The people doing most of the work are off on the side too busy doing the thing to be noticed.
I came from embedded, where I wasn't able to use agents very effectively for anything other than quick round trip iterative stuff. They were still really useful, but I definitely could never envision just letting an agent run unattended.
But I recently switched domains into vaguely "fullstack web" using very popular frameworks. If I spend a good portion of my day going back and forth with an agent, working on a detailed implementation plan that spawns multiple agents, there is seemingly no limit* to the scope of the work they are able to accurately produce. This is because I'm reading through the whole plan and checking for silly gotchas and larger implementation mistakes before I let them run. It's also great because I can see where the work can be parallelized, where it's blocked, and how much can run at once (see the sketch after the footnote below).
Once I'm ready, I can usually let it start -- not even with the latest models -- because the actual implementation is so straightforwardly prompted that it gets it close to perfectly right. I usually sit next to it and validate it while it's working, but I could easily imagine someone letting it run overnight to wake up to a fresh PR in the morning.
Don't get me wrong, it's still more work than just "vibing" the whole thing, but it's _so_ much more efficient than actually implementing it myself, especially when it's a lot of repetitive patterns and boilerplate.
* I think the limit is how much I can actually keep in my brain and spec out in a well-thought-out manner that doesn't let any corner cases through, which is still a limit, but not necessarily one coming from the agents. Once I have one document implemented, I can move on to the next with my own fresh mental context, which makes the work a lot easier.
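To make the "parallelized at certain parts, blocked at others" idea concrete, here's a minimal sketch (with made-up step names, not my actual plan) of how the plan's dependency structure decides how many agents can run at once:

```python
from graphlib import TopologicalSorter

# Hypothetical plan steps -> their prerequisites (illustrative names only).
plan = {
    "schema migration": set(),
    "API endpoints": {"schema migration"},
    "background jobs": {"schema migration"},
    "frontend forms": {"API endpoints"},
    "integration tests": {"API endpoints", "background jobs"},
}

ts = TopologicalSorter(plan)
ts.prepare()
wave = 1
while ts.is_active():
    ready = list(ts.get_ready())  # every step whose prerequisites are done
    print(f"wave {wave}: {len(ready)} agent(s) in parallel -> {ready}")
    ts.done(*ready)  # pretend those agents finished their steps
    wave += 1
```

Each "wave" is a set of steps with no unmet prerequisites and can be handed to parallel agents; the next wave stays blocked until the current one lands.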
The amount of boilerplate people talk about seems like the fault of these big modern frameworks, honestly. A good system design shouldn't HAVE so much boilerplate. I think people would be better off simplifying and eliminating it deterministically before reaching for the LLM slot machine (a toy sketch of what I mean below).
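To be clear about "deterministically": even a few lines of plain templating can stamp out the repetitive parts with zero model calls. A toy sketch (hypothetical route/service names, no particular framework):

```python
# Toy code generator: stamp out repetitive CRUD route boilerplate.
TEMPLATE = '''\
@app.{verb}("/{resource}")
def {verb}_{resource}():
    return {resource}_service.{verb}(request)
'''

def generate_routes(resources, verbs=("get", "post", "delete")):
    return "\n".join(
        TEMPLATE.format(verb=verb, resource=resource)
        for resource in resources
        for verb in verbs
    )

print(generate_routes(["users", "orders"]))
```

Same output every run, trivially reviewable, and nothing to hallucinate.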
I'm not so sure I agree. To me it's somewhat magical that I can write even this amount of code and have this stuff just magically work on pretty much every platform via docker, the web platform, etc. Maybe this again is me having started with embedded, but I am blown away at the ratio of actual code to portability we currently have.
Dear author,
I suggest trying out a job in a niche part of the field like firmware/embedded. Bonus if it's a company with a bunch of legacy devices to maintain. AI just hasn't quite grokked it there yet and thinking still reigns supreme :)
I disagree. I think the attention economy is a new, parallel universe of fame. Yes, old Hollywood and new social media interact and affect each other, but they are still fairly independent. Most Hollywood actors don't have a very large social media presence -- it is probably mostly managed by their PR team. And few social media influencers make it in Hollywood as actors or actresses.
This became extremely apparent for me watching Adam Curtis's "Russia 1985-1999: TraumaZone" series. The series documents what it was like to live in the USSR during the fall of communism and (cheekily added) democracy. It was released in Oct 2022, meaning it was written and edited just before the AI curve really hit hard.
But so much of the takeaway is that it was "impossible" for top-down government to actually process all of what was happening within the system it created, and to respond appropriately and in a timely manner -- thus creating problems like food shortages, corrupt industries, etc etc. So many of the problems were traced to the monolithic information-processing buildings owned by the state.
But honestly... with modern LLMs all the way up the chain? I could envision a system like this working much more smoothly (while still being incredibly invasive and eroding most people's fundamental rights). And without massive food and labour shortages, where would the energy for change come from?
A planned economy is certainly a lot more viable now than it was in 1950, let alone 1920. The Soviet Union was in many ways just a century too early.
But a major failing of the Soviet economic system was that there simply wasn't good data to make decisions, because at every layer people had the means and incentive to make their data look better than it really was. If you just add AI and modern technology to the system they had, it still wouldn't work, because wrong data leads to wrong conclusions. The real game changer would be industrial IoT, comprehensive tracking with QR codes, etc. And even then you'd have to do a lot of work to make sure factories don't mislabel their goods.
That is, assuming leadership wants good data, as opposed to data that makes them look good, or validates their world model. Certainly in recent history, agencies tasked with providing accurate data are routinely told not to (e.g., the BLS commissioner firing, or the Iraq WMD reports).
> A planned economy is certainly a lot more viable now than it was in 1950, let alone 1920. The Soviet Union was in many ways just a century too early.
If the economy were otherwise stagnant, maybe. But top-down planning just cannot take into account all the multitudes of inputs needed to plan at anywhere near the scale that communist countries attempted. Bureaucrats are never going to be incentivized anywhere near the level that private decision makers are. Businesses (within a legal/regulatory framework) can "just do" things if they make economic sense via a relatively simple price signal. A top-down planner can never fully take that into account, and governments should only intervene in specific national-interest situations (e.g. in a shortage environment, legally directing an important medicine precursor ingredient to medical companies instead of other uses).
The Soviet Union decided that defence was priority number one and shoved an enormous amount of national resources into it. In the West, the US government encouraged defence development that also spilled over into the civilian sector, and vice versa.
> But a major failing of the Soviet economic system was that there simply wasn't good data to make decisions, because at every layer people had the means and incentive to make their data look better than it really was.
It wasn't just data that was the problem, but also quality control, having to plan far, far ahead due to bureaucracy in the supply chain, not being able to get spare parts because wear and tear wasn't properly planned for, etc. There's an old saying even in private business that if you create a metric and measure people on it, they'll game it or over-concentrate on it. The USSR often pumped out large numbers of various widgets, but quality would often be poor (the stories of submarine and nuclear power plant manufacturers having to repeatedly reject and replace bad inputs describe a massive source of waste).
What you're describing is called The Fourth Industrial Revolution in Klaus Schwab's book.
Factory machines transmitting their current rate of production all the way up to an international government which, being all-knowing, can help you regulate your production based on current and forecasted worldwide consumption.
And your machines being flexible enough to reconfigure to produce something else.
Stores doing the same with their sales, and Central Bank Digital Currency tying it all together.
If I'm playing a quick pattern like this and holding down some bass note, then depending on where the pattern starts, the middle two notes will become "synchronized" and play/get recorded at the same time. In my example, the top 4 notes work fine, but shifting down by one note causes the bug. I also switched between holding the bass note and not, for demonstration. I assure you my fingers aren't doing anything different; I messed around with this for a while.
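Purely a guess at the mechanism on my part, but that "synchronized" behavior is exactly what you'd see if note-on timestamps were being snapped to a coarse grid; whether two close notes merge then depends on where the pattern sits relative to the grid. A tiny sketch with made-up numbers:

```python
GRID_MS = 50  # hypothetical quantization step

def quantize(t_ms):
    # Snap an event timestamp to the nearest grid point.
    return round(t_ms / GRID_MS) * GRID_MS

# Two quick notes 15 ms apart; then shift the whole pattern by 10 ms.
for offset in (0, 10):  # "depending on where the pattern starts"
    a, b = 1020 + offset, 1035 + offset
    print(f"offset {offset}: {quantize(a)} vs {quantize(b)}")
# offset 0:  1000 vs 1050 -> the notes stay distinct
# offset 10: 1050 vs 1050 -> both land on the same tick ("synchronized")
```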
I'm not criticizing, sorry; just trying to understand.
I find video more compelling, generally. Obviously video has more ways to communicate - graphically, empirically, etc. It's not that reading works more effectively than video, just far more efficiently.
It has a lot more map data accessible and you can even overlay National Park Service maps, land ownership, accurate cell service grids, mountain biking trails, weather conditions and things like that.
Disclaimer: Just because you see a route on a map, digital or paper, does not mean it is passable today. Or it may be passable but at an extremely arduous pace.
Let me clarify. I meant the following. Assume GHB is found, along with evidence of sex. The woman claims she didn't take it and didn't want to have sex. Wouldn't this be enough for a conviction?
if the jury believed the woman's claims, yes, it's enough for a conviction. conviction rates are high not because it's easy to prove guilt, but because district attorneys don't bring cases that are likely to be lost. the scenario you describe might not be considered strong enough to win, and resources are limited, so this hypothetical case might not get a hearing.
i'm not talking about a plea bargain, i'm talking about declining to prosecute, not filing charges after an arrest, or asking the court to dismiss the charges
i am sure that not every victim allegation leads to a prosecution in Germany
...until about 2020, with COVID, Russia-Ukraine, Oct 7, and Trump's re-election following and burying it below tons of other news cycles.