Hacker News

Am I the only one who thinks we don't need Moore's Law anymore? The game has moved from computing to communication.


You need to think bigger and deeper than "using computers for Facebook".

Computers a billion times faster could help us manipulate the weather, terraform planets, solve death, understand physics at a much more fundamental level, create fusion or matter-antimatter engines, etc.


Right. But can't most anything that is computable be done with existing technology operating in parallel? (I'm not sure about solving death, but for high-performance computing in general.)

I'm interested in what's inherent about per-chip transistor doubling that makes it the limit, rather than just using a lot of chips. Is it power related? Is it the von Neumann bottleneck?
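For context on the question above: one classic limit on "just use a lot of circuits" is Amdahl's law, which bounds the speedup from parallelism by a workload's serial fraction. A minimal sketch (the numbers are illustrative, not from the thread):

```python
# Amdahl's law: if a fraction s of a workload is inherently serial,
# the speedup from n parallel processors is at most 1 / (s + (1 - s) / n).

def amdahl_speedup(serial_fraction: float, n_processors: int) -> float:
    """Upper bound on speedup with n processors."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_processors)

# Even with a million processors, a workload that is 1% serial
# tops out at roughly 100x:
print(round(amdahl_speedup(0.01, 1_000_000)))  # -> 100
```

A faster serial processor, by contrast, speeds up the serial fraction too, which is one reason per-chip scaling isn't fully replaceable by adding more chips.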

For better or worse, I think 3-D videoconferencing is the short term use that will drive both bandwidth and computing needs.




