Hacker News

Such a joy to listen to Carmack talk about just about anything.

Not sure if it's an age thing, but there is a particular feeling common to contemporaries of that era, who spent their formative years watching computers grow from their humble beginnings into the incredible machines we have now. At that time, each new improvement brought astounding differences in capability. In the space of 10 years we went from 8K to 64K of memory, and from 4 colors to 256 colors to 65K colors to 16M colors. Resolution went from 320x200 to 640x480 to 1024x768, which was close to photo-realistic (for the time, an amazing capability). I remember how revolutionary it was when I first got a computer that could render 80 columns of characters rather than 40. Storage went from tapes, to floppy drives holding a few hundred KB of data, to hard drives storing MB and then GB of data. Every time one of these changes happened, a vista of opportunity opened up to explore and innovate and figure out what you could do with the new capabilities. Writing code in those times was all about deeply understanding the internal capabilities of the machine you had and how to get the most out of them.

Something about that process forged a deep appreciation for the fundamentals of hardware and software that I don't know how anybody could get these days.



Era aside, Carmack is a special guy; he can talk articulately and gently for hours. He did a talk a few years ago (QuakeCon? I forget), basically a 3-hour monologue about thinking while coding, with only a single 5-minute pause to sip some water. Incredible.


His expertise across a very wide array of subjects is also extremely impressive.

I know spectroscopy very well, and yet hearing him talk about light-matter interaction was still very educational for me at the time. I was impressed by the depth and sheer breadth of knowledge he had on the matter, and I learned about many things I didn't even know existed from him.

The same applies to many other topics related to physics, engineering, and mechanics, not just computer science and programming.

Whoever says that you need a degree to get depth in several topics has never known people like Carmack, who simply study and try to understand.


I can speak just like that. Problem is I'm an uninformed idiot. But people who don't know that seem to think I'm pretty smart.

*this post does not contain an implication about Carmack. He is actually real smart.


Any uninterrupted and coherent hours long monologue is impressive to me.


You should tell that to my annoyed relative. They may disagree on the coherence front.


He was a hero of mine when I was in engineering school, but I had mostly only seen pictures of him at the time and thought of him (probably due to his specs) as an archetypal geek. However, reading about him more and listening to his talks over time has really made me appreciate how well-rounded, low-drama, and high-impact he is.


I'm a 40-year-old guy who taught himself and got his first gig last year, and he's also my hero. I'll sound like a "kids nowadays" rant, but it's surprising how many young developers have never even heard of Carmack.


I read Abrash's Black Book when I was in my late teens and early 20s. It offered some insights into how Carmack operated. That's what made me a fan, but I thought of him as a "rockstar" rather than a professional. That's the important bit that changed over several years.


I remember seeing that video, and others like it at Oculus. The variety and depth of his knowledge is truly astounding, as is his ability to talk about it without rambling. Every time he switched topics, you knew there was so much more he could have said.

When I was a technical evangelist, I regularly gave hour-long technical presentations to relatively large ballrooms, and it's ridiculously hard to maintain focus. The fact that Carmack gives a detailed talk rather than a presentation or speech is already a distinction, but mostly off the cuff, for hours, non-stop, to a packed conference hall? Impressive doesn't begin to describe it.


Does anyone recall which talk specifically this is? I'd love to give it a watch.


One generation removed from that, folks got their start programming on the web. Some similarly developed great understanding about the internals of web browsers and how to get the most out of them. I don't think that's less beautiful :)

Each generation will forge deep appreciation for stuff higher on the stack. And there will always be folks working at the embedded boundary, forging deep appreciation for the folks who blazed the trail you described before them.

Also, to be fair, it's easier to learn stuff today than it was 20 years ago. You don't need to invest as much time and the resources are a lot better (yay for YouTube!)


Current stacks are much bigger and much more opaque than '90s hardware was. On top of that, computers are fast enough that the need to go lower level is significantly lessened. Because of this, it is likely that a much smaller proportion of current programmers will get to learn about low-level stuff, despite the much greater availability of learning material.

Cutting corners is not the only reason why software slows down as hardware speeds up. Many people just have no idea how fast their computer is supposed to run, and their "fast enough" is often much slower than it could be. And in some cases that’s the difference between "fast enough to make a smooth animation", and "slow enough that I need a spinning wheel".


On the flip side, if only the teenage me had had a computer at home in my last year of school. Or more than a single book on Turbo Pascal (with OO support in v5.5!) that I read again and again. Forget about the Internet; if only the college-age me had had access to just Wikipedia :(

It's _really_ painful to remember how scarce information was in the early 90s and how prohibitively expensive the hardware was. I guess it was a better filter, though, when only the people who really loved computers were getting into the industry.


> It's _really_ painful to remember how scarce information was in the early 90s

I think that scarcity of information has helped me quite a lot. I often see others giving up on problems where they lack sufficient information, but I have an ingrained ability to sniff out the information - whether it's reverse engineering, gleaning subtle hints from what is available in the documentation I do have or just "thinking like the computer" to intuit things, I can very often make headway where others have failed.


1024x768 at 16M colors was my default setting for a long time, even though higher resolutions were available; I disliked the need to play with font sizes to get a comfortable layout. With Windows 7 and its awesome scaling feature I finally jumped to 1920x1080. I moved to 4K HDR less than a year ago and had some wow moments in the process.


There was such a magic to EGA games, and then VGA, that I think was lost a bit in the transition to 3D engines. I count myself lucky to have experienced the surprise he mentions about Doom and a monster jumping out at you, the unfolding worlds of Sierra games, and the quirkiness of LucasArts stories. It's a good reminder that graphics capabilities can change, but our human software can be entertained by all the old tricks.


> At that time, each new improvement brought astounding differences in capability.

Yes! There's a post we wrote on precisely this phenomenon with John Carmack as the main character that you might find interesting.[1]

TLDR: There was a time when people were willing to spend (literally) tens of thousands of dollars on ultra-premium hardware in order to compute on hardware a few years ahead of the market.

For example, John Carmack spent $23K on a NeXT workstation, on which he made Wolfenstein 3D, Doom I & II, and Quake. He also spent ~$10K on the world's first 1080p monitor, which he used to program the Quake trilogy.

[1] https://simulavr.com/blog/paying-for-productivity/


>in 1995, Carmack spent $9,995 on one the first 1080p monitors… It provided a 28" screen with 1920x1080 @ 85Hz… Essentially, Carmack was able to trade $9,995 of 1995 money for a monitor that put him decades into the future.

“Decades” is a bit hyperbolic. I got a 22” that would do 1920x1440@80 for under $1000 in 1999. I’m also skeptical of that non-4:3 resolution quoted.


I've seen pictures of his desk with what in the '90s we would have called a "goofy-looking CRT"; I'm pretty sure its native aspect ratio was not 4:3.



