Hacker News | NamlchakKhandro's comments

why in god's name would you ever pay for Discord?

> The argument is not

Then what is it.

be blunt and obvious in your reply or go home.


> Then what is it.

It's that the erosion and atrophying of the fundamental skill that made you (or, in this case, the GP) valuable is a matter of concern: you (or GP, as the case may be) are willingly accepting that you will be no more valuable than the average office worker, and so can expect compensation to drop to match.

As an example, moving to Python from C was moving to a higher level of abstraction, but it still didn't jettison the need for actually knowing how to program!

Moving to LLMs from Python does jettison any need to know what an object is, what "parse, don't validate" actually means, etc.

If the problem you are solving with the LLM doesn't need that knowledge, then that job doesn't need all those valuable programming skills anyway, and thus you are no more valuable than the average clerk toiling away in the middle of some organisation.

> be blunt and obvious in your reply or go home.

Very classy.


What has made me valuable for 30 years is an ability to go from business goal -> working implementation. They can pay someone a lot less than me (or any American - I’m in no way bragging about comp) to code.

Companies don’t pay my employer the bill rate they charge for me based on how well I code. While I’ve been expected to produce production-level code as part of my job across 5 companies in the past decade, not a single one asked me to write a line of code as part of the interview. They were much more concerned about my ability to get things done.

Ironically, even the job at BigTech that landed in my lap was all behavioral (AWS ProServe). I damn sure didn’t get that job because of my whopping two years of AWS experience at the time. Most of my answers for “tell me about a time when…” were leading non AWS projects.

I’m not bragging - I’m old. My competitive advantage should be more than just my coding ability.



oh man, you actually jokingly made a good one :D

then use podman instead.

So you're walking into this hoping that it's an actual AI and not just an LLM?

interesting.

how much planning do you put into your project without AI anyway?

Pretty much all the teams I've been involved in:

- never did any analysis or planning, and just yolo'd it along the way in their PRs

- every PR is an island, with tunnel vision

- fast forward 2 years, and we have to throw it out and start again.

So why are you thinking you're going to get anything different with LLMs?

And plan mode isn't just a single conversation that you then flip to do mode...

you're supposed to create detailed plans and research that you then use to make the LLM refer back to and align with.

This was the point of the Ralph Loop


what are you comparing this to btw?

My outdated/incorrect mental model of how well databases work.

sounds like the kind of hyperbole you'd hear from someone who's just been forced to set up a linter for the first time

> Debugging is hell

Most people won't care because the extent of their debugging skills is console.log, echo, print. repeat 5000 times.


printf() debugging is still considered a best practice in the eyes of many. I still remember being really surprised when I heard my famous (Turing award-winning) CS professor tell the class this for the first time.

https://tedspence.com/the-art-of-printf-debugging-7d5274d6af...


The thing about printf debugging is that it works universally. All languages, all platforms, all stacks. Even down to the lowest levels of most software, there will always be some sort of log available.

While some tools/frameworks might have more robust debugging tools, if you have a dynamic role within an organization, you may not find it worth the effort to set them up if your target platform is constantly changing.

One real world example of this from my own work in PHP: there is a tool/lang-extension called Xdebug that is great and provides step-through debugging, but historically it has been a pain to configure. It's gotten better, but when I can just as easily add a few `dump()` statements that expose the same data, the debugger is overkill. Very rarely do I need to actually pause the running request to debug something; 99% of the time I just want to know the state of that object within a specific `if()` block, and a debug log gets me that same information.
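The same pattern translates to any language. A minimal sketch in Python (the `apply_discount`/`order` names are hypothetical, purely to illustrate dumping state inside a specific `if` block):

```python
def apply_discount(order):
    """Hypothetical function, just to show print-style debugging."""
    if order["total"] > 100:
        # Dump state exactly where the branch is taken; the f-string
        # '=' specifier prints both the variable name and its value.
        print(f"DEBUG: discount branch hit, {order=}")
        order["total"] *= 0.9
    return order

print(apply_discount({"id": 1, "total": 200}))
```

No tooling, no configuration: one line added, run the program, read stdout, delete the line when done.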


You missed out the important bit - think about the problem and data flow first.

After that it doesn’t matter much which tool you use to verify assumptions.


> Most people won't care because the extent of their debugging skills is console.log, echo, print. repeat 5000 times.

I don't agree. The first thing any developer does when starting out a project is setting up their development environment, which includes being able to debug locally. Stdout is the absolute last option on the table, used when all else fails.


I don't agree! It's easiest to printf() things since you don't have to have tooling to debug every language you want to work with!

while you should know this for anything you're proficient in, I usually reach for printf since it's usually quicker than messing with a debugger :)


> I don't agree! It's easiest to printf() things since you don't have to have tooling to debug every language you want to work with!

Nothing is easier than setting a breakpoint in a line of code and running the app.

When a breakpoint is hit, you also have access to local variables and their state. Some debuggers even allow us to change variable values at runtime.

Using printf is effective, but it's far from the best tool available.
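For what it's worth, in Python the two approaches are one line apart: the stdlib `breakpoint()` hook (PEP 553) drops you into pdb at that exact spot, with the locals the commenter mentions available at the prompt. A minimal sketch (the `running_total` function is made up for illustration):

```python
def running_total(xs):
    total = 0
    for x in xs:
        # Uncommenting the next line pauses execution here under pdb;
        # at the prompt you can inspect total, x, xs, and even reassign
        # them at runtime, as described above.
        # breakpoint()
        total += x
    return total

print(running_total([1, 2, 3]))  # prints 6
```

The hook can also be disabled globally with `PYTHONBREAKPOINT=0`, so stray breakpoints left in the code are harmless in production.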


You guys are probably gonna laugh me out of the room, but I use a combination of both printing and debugging tools when identifying issues.


No one pays for Claude Code, they pay a subscription to access the Claude Models.

lmao who even unironically uses Claude Code when other harnesses exist that eclipse it?


> distribution and manufacturing costs are zero.

???? citation needed


Once the software is written, it is ~free to copy. People pay a lot of money to avoid the software getting copied... and it gets copied anyways.


The thing is that a single piece of software is easy to crack and copy, so it is cheap... but a lot of software working together is not, and that's why we have SaaS and PaaS today.


To clarify, I used "manufacturing" there, as opposed to R&D, to differentiate from physical stuff. Software costs lots to develop, but nothing to make a second copy of.

