Hacker News

Linus Torvalds likes the fact that kernel debugging is not user-friendly, because it does not detract from good programming practice: you have to really understand your code and what it should do before resorting to a debugger.

https://lkml.org/lkml/2000/9/6/65



That note always struck me as pride in self-flagellation. Having been around many talented kernel developers, most are perfectly happy with a debugger or any other tool they can use to debug something.


> pride in

Not really: the linked text shows it is a deliberately costly strategy to avoid later «self-flagellation»: it is meant to discourage mindless coding - overly fast feature insertion, piling up masses of low-quality code to maintain. Torvalds: «I'd rather not work with people who aren't careful».

Also: «I happen to believe that not having a kernel debugger forces people to think about their problem on a different level than with a debugger ... that mindset where you know how it behaves, and then you fix it from there. [...] It's that you have to look at the level _above_ sources. At the meaning of things. Without a debugger, you basically have to go the next step: understand what the program does. Not just that particular line».


Implicit in that argument is that these kernel developers are writing low quality code / coding mindlessly. That’s a bold statement to make when you don’t know who I’m talking about.

At no point has anyone suggested using the debugger in lieu of understanding what the program does; that's a pure strawman. Understanding is step 0. Once you've done that piece, a debugger levels up your game. The same goes for other tools: profilers, printf debugging, eBPF tracing, etc. Hell, even simple things like executing the code in the first place, running tests, or gathering metrics are debugging tools. It's rare that someone stares at a difficult bug until they see the matrix. It usually involves iteration cycles of some kind to test out different hypotheses and collect data, and a debugger is one tool to speed up those iteration cycles.

There was a story on HN a while back about how one of the inventors of Ethernet had to hook up oscilloscopes to see what the electrical signals were doing and figure out why CSMA wasn't working properly. An oscilloscope is an analog debugger, and you can't tell me that the inventor of Ethernet didn't understand Ethernet, or that the oscilloscope hampered that understanding in any way - especially since the bug turned out to be very nuanced, with a chip generating a spike when switching modes.


As the cheerful used to say: "«You» who?"

That is not implicit: it seems explicit and literal. Apparently, Torvalds must have had different experiences - and evidently within a strongly different framework: it looks like a project manager making a drastic effort to impose his will about what is part of the project and what is not.


“Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.”

— Brian Kernighan


That doesn't really refute the point Carmack made: namely, that a lot of the supposed inherent difficulty of debugging is actually incidental, stemming from a general cultural attachment to crappy old tooling.


And I believe Kernighan's point, like Torvalds's, is that the difficulty of debugging serves as a bulwark against over-complicated code.


So that's why ed(1) is the standard text editor. Anything more usable is a dangerous extravagance! :)


Sounds like Brian has never used a good debugger


UNIX has never been a place for cool graphical debugging anyway.


Yes, he also worked his entire life on his own codebase. Reality for most programmers is not like that.


Nokia's Symbian OS was notoriously hard to debug, too. The emulators were half-baked, and debugging on real devices was even worse.

I wonder to what degree this contributed to the eventual downfall of the platform. The learning curve was so steep, and the IDE and tools so relentlessly placed mines under your feet, that it must have driven some talent away.

(An example: you closed the emulator and it left behind some running background processes that you had to fish out and kill in the Windows process manager. If you forgot to do that, you could restart the emulator, but breakpoints would stop working. But if you mistakenly killed another important background process, the restarted emulator would freeze and you had to restart Windows entirely.)

By contrast, I just love working with JetBrains IDEs. The debugging functionality is so good that I feel no one is trying (deliberately or out of neglect) to waste my time. Given that I am 15 years older than I was in the Nokia days, time has become a more valuable resource for me.


That was not the fault of the OS, though, but of embedded dev in general. Techview (the emulator) allowed you to test your code to a point, but because x86 CPUs were more forgiving than ARM ones, you still had to test your code on ARM reference boards (H2 and H4). There were only a handful of Lauterbach debuggers floating around, since they were so expensive. Only one team got to see the actual phones being developed, and that part of the building was off-limits to everyone else.

We went through several IDEs, Visual Studio, CodeWarrior and finally Carbide.

I’m not saying the OS itself wasn’t complicated though.


Don't both perspectives make sense?

If you are working on a relatively small or modular codebase for a long period of time, Torvalds' perspective makes sense, since the codebase will also benefit from that kind of introspection (OK, Linux is not exactly small, but excluding drivers it's on the order of ~100k lines AFAIK, which is small relative to the time it's been active). id's games are on the order of millions of lines of code (according to Carmack), and while the different engine generations are related, they still draw a line under each one after a certain point and move on, so they are also very time-limited.

I enjoy playing computer in my head in fine detail, and find a lot of joy in small code. But I can also respect that on a large and unwieldy enough codebase, it's going to make more sense to lean on a debugger and other tools more often.


There’s an important reason to keep up these computer-in-your-head skills: you don’t have an interactive debugger in production, or on customer devices. If you can’t reproduce the problem, then all you have is code and (hopefully!) enough logs to figure out what’s going on and make a fix.
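That "enough logs" idea can be sketched concretely: log inputs and decisions around a failure-prone step so a production bug can be reconstructed from the log alone. This is a minimal illustration, not anyone's actual code; the function name, log fields, and discount codes are all hypothetical.

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("orders")

def apply_discount(price: float, code: str) -> float:
    # Log the inputs up front: in production, this line may be all you have.
    log.info("apply_discount price=%.2f code=%r", price, code)
    if code == "SAVE10":
        result = round(price * 0.9, 2)
    else:
        # Unknown codes are a common silent failure; log the decision taken.
        log.warning("unknown discount code %r, charging full price", code)
        result = price
    log.info("apply_discount result=%.2f", result)
    return result

apply_discount(20.0, "SAVE15")  # logs a warning; the bug is now visible in the log
```

With logging like this, a customer report of "my discount didn't apply" is answerable from the log file alone, no debugger attached.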


That's all well and fine if all that you see is what will run. These days, so many frameworks and layers sit below and on top of your code that it's hard to know how your (relatively) simple code will behave at runtime. On some level, we've become config and package-install warriors.


That works for him: he is intimately familiar with his software, and he's a programming savant. As a consultant, I often get dropped into poorly documented code to fix it, and a good shell environment and IDE/debugger are invaluable, especially for spaghetti Python code with references that are non-obvious until you hit a breakpoint and decipher where the hell an object is coming from.
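The "where is this object coming from?" case looks something like this. The functions below are a made-up toy, not real client code; the point is that `breakpoint()` (Python 3.7+) drops you into pdb where `p ctx` inspects the live object and `u` walks up the stack to reveal the layers that built and passed it.

```python
# A hypothetical slice of "spaghetti" code: handler() receives `ctx`
# through two layers of indirection, so its origin is non-obvious.

def make_context():        # layer 1: the object is actually built here
    return {"user": "alice", "flags": ["beta"]}

def dispatch(handler):     # layer 2: passes it along opaquely
    return handler(make_context())

def handler(ctx):
    # breakpoint()         # uncomment to drop into pdb here; then `p ctx`
    #                      # shows the object and `u` climbs the stack to
    #                      # dispatch() and make_context()
    return ctx["user"]

print(dispatch(handler))   # prints: alice
```

Reading the source top-down gives you the same answer eventually, but on a real codebase with dozens of such layers, one breakpoint replaces a lot of grepping.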


Yes, I also thought about that. Two programming gurus on opposite ends of the spectrum. I find it refreshing.


Anyone who says a tool is better when it's shittier is either insane or has a huge ego to protect.



