While I agree with the thesis, I also believe it is possible to overemphasize it. A couple of examples from someone who is a product of that generation:
Given my age at the time, programming was accessible but modifying hardware was hard. Part of the reason is that few sensible adults would hand a soldering iron to a six-year-old. Part of the reason is that software is easier to learn than electronics. Part of the reason is that the hardware was the product back then, something you did stuff with rather than something you did stuff to (contrast the 1977 Apple ][ with the 1982 Commodore 64, and you'll get an idea of how quickly the mindset changed). It wasn't until my early 20s that I learned people were not only modifying the hardware but making their own peripheral cards. Someone ten years older than me likely had a different perspective simply because electronics was more relevant ... since software hadn't started eating the world.
Even those people were likely in the same boat as me if they took a moment to think about the generation before them. I remember reading something by one of the grandfathers of computing. He blew my mind when he started talking about hard drives, not in terms of the logical layout of the disk, nor in terms of the electronics, but in terms of the physics. There was a time when people had to think about computers in those terms because they were in the process of developing those lowest layers of abstraction.
I guess what I'm saying is that each generation will have a different perspective on what computers are and how they work. They will also have different perspectives on what "low level" means. While it is sad to see a lot of the old knowledge and old ways fade into obscurity, we shouldn't pity the younger generation for not having what we had. First, the old knowledge hasn't really disappeared for those who choose to pursue it. Second, they are going to be building a new layer on the technological stack anyhow. What they need to understand is the layer directly below them, not what's twenty layers down.