Does it have to be? The etymology of the word "abstraction" is "to draw away". I think it's relevant to consider just how far away you want to go.
If I'm purely focused on the general outcome as written in a requirement or specification document, I'd consider everything below that as "abstracted away".
For example, this weekend I built my own MCP server for some services I'm hosting on my personal server (*arr, Jellyfin, …) to be integrated with claude.ai. I wrote down everything I want it to do and the environment it has to work in, then let Claude go.
Not once have I looked at the code. And quite frankly, I don't care. As long as it fulfills my general requirements, it can write Python one time and TypeScript another time should I choose to regenerate from that document. It might behave slightly differently, but that is ok to a degree.
From my perspective, that is an abstraction. Deterministic? No, but it also doesn't have to be.
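For illustration, here is a minimal sketch of what one such tool could look like, written with the official Python MCP SDK (FastMCP). The tool name, the Jellyfin endpoint and the environment variables are assumptions I'm making up for the example, not the code Claude actually produced:

    import os
    import httpx
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("homelab")

    # Illustrative assumptions: where Jellyfin lives and how it is authenticated.
    JELLYFIN_URL = os.environ["JELLYFIN_URL"]      # e.g. http://localhost:8096
    JELLYFIN_TOKEN = os.environ["JELLYFIN_TOKEN"]  # API key from the Jellyfin admin dashboard

    @mcp.tool()
    def list_jellyfin_sessions() -> list[dict]:
        """Return the currently active Jellyfin playback sessions."""
        resp = httpx.get(
            f"{JELLYFIN_URL}/Sessions",
            headers={"X-Emby-Token": JELLYFIN_TOKEN},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()

    if __name__ == "__main__":
        # stdio transport by default; a remote claude.ai integration would need an HTTP transport instead
        mcp.run()

Whether it comes back as Python, TypeScript or something else, the tool surface exposed to the client is what matters; everything underneath is the part that's been drawn away.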
It is quite frankly ridiculous that you need to be in the "in-group" to get things like this resolved and it is not the first time this has been reported, be it Google or Meta or any other big tech corpo.
These players MUST be regulated or treated like utilities; hoping the EU will ratchet up the pressure even more.
It used to be that you had to have a strong understanding of the underlying machine in order to create software that actually worked.
Things like cycle times of instructions, pipeline behavior, registers and so on. You had to, because compilers weren't good enough. Then they caught up.
You used to manage every byte of memory and utilize every piece of the underlying machinery, like the different chips, DMA transfers and so on, because that's what you had to do. Now it's all abstracted away.
These fundamentals are still there, but 99.9% of developers neither care nor bother with them. They don't have to, unless they're writing a compiler or a kernel, or doing it just for fun.
I think what you're describing is also going to go away in the future. Still there, but most developers are going to move up one level of abstraction.
Having worked in a very large company for the past two decades now, one of the best pieces of career advice I ever got is about how you measure whether you are a "good employee".
It is very simple: you are a good employee if your boss(es) think you are.
That's it. Nothing else matters in terms of career advancement or retention.
No, you misunderstood. It is not about their output, it almost never is.
Most of the time, the business decision has already been made long before McK is hired. It's all about legitimizing that decision and making it happen.
You can also wield them as a weapon against internal competitors or opponents. Look up how they were used to kill off Cariad for example.
Why is it bizarre? It is inevitable. After all, AI has not ruined creative professions, it merely disrupted and transformed them. And yes, I fully understand my whole comment here being snarky, but please bear with me.
> Actually all progress will definitely have a huge impact on a lot of lives—otherwise it is not progress. By definition it will impact many, by displacing those who were doing it the old way by doing it better and faster. The trouble is when people hold back progress just to prevent the impact. No one should be disagreeing that the impact shouldn't be prevented, but it should not be at the cost of progress.
Now it's the software engineers' turn to not hold back progress.
> [...] At the same time, a part of me feels art has no place being motivated by money anyway. Perhaps this change will restore the balance. Artists will need to get real jobs again like the rest of us and fund their art as a side project.
Replace "Artists" with "Coders" and imagine a plumber writing that comment.
> [...] Artists will still exist, but most likely as hybrid 3d-modellers, AI modelers (Not full programmers, but able to fine-tune models with online guides and setups, can read basic python), and storytellers (like manga artists). It'll be a higher-pay, higher-prestige, higher-skill-requirement job than before. And all those artists who devoted their lives to draw better, find this to be an incredibly brutal adjustment.
Again, replace "Artists" with "Coders" and fill in the analogous replacements.
So, please get in line and adapt. And stop clinging to your "great intellectually challenging job" because you are holding back progress. It can't be that challenging if it can be handled by a machine anyway.
The premise of those comments, just like the premise in this thread, is ridiculous and fantastical.
The only way generative AI has changed the creative arts is that it's made it easier to produce low quality slop.
I would not call that a true transformation. I'd call that saving costs at the expense of quality.
The same is true of software. The difference is, unlike art, quality in software has very clear safety and security implications.
This gen AI hype is just the crypto hype all over again but with a sci-fi twist in the narrative. It's a worse form of work just like crypto was a worse form of money.
I do not disagree, in fact I'm feeling more and more Butlerian with every passing day. However, it is undeniable that a transformation is taking place -- just not necessarily for the better.
Gen AI is the opposite of crypto. The use is immediate, obvious and needs no explanation or philosophizing.
You are basically showing your hand that you have zero intellectual curiosity, or are delusional about your own abilities, if you have never learned anything from gen AI.
I play with generative AI quite often. Mostly for shits and giggles. It's fun to try to make it hallucinate in the dumbest way possible. Or to make up context.
E.g. try to make any image generating model take an existing photo of a humanoid and change it so the character does a backflip.
It's also interesting to generate images in a long loop, because it usually reveals interesting patterns in the training data.
Outside these distractions I've never had generative AI be useful. And I'm currently working in AI research.
Is it though? I agree the technology evolving is inevitable, but the race/rush to throw as much money at scaling and marketing as possible before these things are profitable and before society is ready is not inevitable at all. It feels extremely forced. And the way it's being shoved into every product to juice usage numbers seems to agree with me that it's all premature and rushed and most people don't really want it. The bubble is essentially from investing way more money in datacenters and GPUs than they can even possibly pay for or build, and there's no evidence there's even a market for using that capacity!
It's funny you bring up artists, because I used to work in game development and I've worked with a lot of artists, and they almost universally HATE this stuff. They're not like "oh thank you Mr. Altman", they're more like "if we catch you using AI we'll shun you." And it's not just producers, a lot of gamers are calling out games that are made using AI, so the customers are mad too.
You keep talking about "progress", but "progress" towards what exactly? So far these things aren't making anything new or advancing civilization, they're remixing stuff we already did well before, but sloppily. I'm not saying they don't have a place -- they definitely do, they can be useful. My argument is against the bizarre hype machine and what sometimes seems like sock puppets on social media. If the marketing was just "hey, we have this neat AI, come use it" I think there'd be a lot less backlash than people saying "Get in line and adapt".
> And stop clinging to your "great intellectually challenging job" because you are holding back progress.
Man, I really wish I had the power you think I have. Also, I use these tools daily, I'm deeply familiar with them, I'm not holding back anyone's progress, not even my own. That doesn't mean I think they're beyond criticism or that the companies behind them are acting responsibly, or that every product is great. I plan to be part of the future, but I'm not just going to pretend like I think every part of it is brilliant.
> It can't be that challenging if it can be handled by a machine anyway.
This will be really funny when it comes for your job.
I believe you misunderstood the point of my comment, or rather I didn't make it clear enough. The quotes I quickly picked out feel like they represent a majority opinion on HN, namely that this is progress and disruption. I don't share that opinion.
My own gut feeling is that this sentiment comes out of a position of superiority and a definite lack of empathy. It is software engineers building the technology that is leading to job loss, the sloppification of everything, as well as second-order effects like storage and RAM prices soaring because of the hype.
As such, I find it ironic to complain about being replaced. After all, your profession is the one responsible for all of this, so now please take a look in the mirror and take responsibility for the actions of the industry you choose to work in.
Personally, I think the current trajectory of AI is an overall net negative to society. I sincerely hope it all comes crashing down in another AI winter, but we'll see.
I feel differently: the last line is very important in this context, since it communicates the underlying thoughts and values of the poster.
Asking for "amazing" open source projects in this case is not asking out of genuine curiosity or want for debate, it is a rhetorical question asked out of frustration at the general trajectory of AI and who profits off of it -- namely the boot-wearers.