
Nvidia released a toolkit to assist chip design a few months ago:

https://techmonitor.ai/technology/ai-and-automation/nvidia-a...

Google has been working on AIs to optimize code:

https://blog.research.google/2022/07/mlgo-machine-learning-f...

> But it still can't really design things.

What do you really mean by "really" in that sentence?

In any case, the claim you responded to was not that the chips would be designed from the ground up by AI, only that AI will enable us to run code on top of chips that are even more complex than current chips.



The Nvidia toolkit you linked is for place & route, which is a much easier task than designing the actual architecture and microarchitecture. It's similar to laying out a PCB.
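To make the distinction concrete: place & route is "just" a combinatorial optimization problem. Here's a toy sketch of the idea, using simulated annealing to place four made-up cells on a 2x2 grid so that total Manhattan wirelength is minimized. The cells, nets, and parameters are all invented for illustration; real tools solve this at the scale of millions of cells with far more sophisticated methods.

```python
import math
import random

# Hypothetical netlist: four cells connected in a cycle.
NETS = [("a", "b"), ("b", "c"), ("c", "d"), ("a", "d")]
SLOTS = [(0, 0), (0, 1), (1, 0), (1, 1)]  # 2x2 grid of legal positions

def wirelength(placement):
    """Sum of Manhattan distances over all nets."""
    total = 0
    for u, v in NETS:
        (x1, y1), (x2, y2) = placement[u], placement[v]
        total += abs(x1 - x2) + abs(y1 - y2)
    return total

def anneal(steps=2000, seed=0):
    rng = random.Random(seed)
    cells = ["a", "b", "c", "d"]
    cur = dict(zip(cells, SLOTS))       # arbitrary initial placement
    cur_cost = wirelength(cur)
    best, best_cost = dict(cur), cur_cost
    temp = 2.0
    for _ in range(steps):
        # Propose swapping the slots of two random cells.
        i, j = rng.sample(cells, 2)
        cand = dict(cur)
        cand[i], cand[j] = cur[j], cur[i]
        cost = wirelength(cand)
        # Accept improvements always; worse moves with decaying probability.
        if cost < cur_cost or rng.random() < math.exp(-(cost - cur_cost) / max(temp, 1e-6)):
            cur, cur_cost = cand, cost
            if cost < best_cost:
                best, best_cost = dict(cand), cost
        temp *= 0.999  # cool down
    return best, best_cost

placement, cost = anneal()
print(cost)  # each of the 4 nets spans adjacent slots, so the optimum is 4
```

The point is that there's a single well-defined objective (wirelength, timing, congestion) to push down, which is exactly the kind of task optimizers and learned policies are good at. Deciding what the architecture should be in the first place has no such closed-form objective.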

The Google research you linked is using AI to make better decisions about when to apply optimisations. For example, when do you inline a function? Which register do you spill? Again, this is a conceptually simple, low-level task. Nothing like writing a compiler, for example.
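For a sense of how narrow that decision is, here's a toy hand-written inlining heuristic of the kind MLGO replaces with a learned policy: the model answers the same yes/no question the hard-coded rule does. The fields and thresholds below are illustrative, not LLVM's actual cost model.

```python
from dataclasses import dataclass

@dataclass
class CallSite:
    callee_size: int   # instructions in the callee
    call_count: int    # profile-estimated call frequency
    in_loop: bool      # whether the call site sits inside a loop

def should_inline(cs: CallSite, size_budget: int = 50) -> bool:
    # Hand-tuned rule: always inline tiny functions; inline hot calls
    # in loops up to a size budget; otherwise keep the call.
    if cs.callee_size <= 10:
        return True
    if cs.in_loop and cs.call_count > 1000:
        return cs.callee_size <= size_budget
    return False

print(should_inline(CallSite(callee_size=8, call_count=5, in_loop=False)))    # True
print(should_inline(CallSite(callee_size=40, call_count=5000, in_loop=True))) # True
print(should_inline(CallSite(callee_size=80, call_count=5000, in_loop=True))) # False
```

MLGO's contribution is swapping the hand-tuned thresholds for a trained model; the surrounding compiler, the notion of inlining, and the pass that asks the question are all still written by humans.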

> What do you really mean by "really" in that sentence?

I mean successfully create complex new designs from scratch.


I really wouldn't say it's an easier task at all. It's so computationally intense that it's essentially impossible for a human to do from scratch. But it is "just" an optimization problem, which is different from designing an architecture.



