
This is absolutely the first thing I looked for too. They barely mentioned thermal management at all. Maybe they know something I don't, but I know from past posts here that many people share this concern. Very strange that they didn't go into it; or maybe they avoided it because they have no solution and this is just greenwashing for the costs of AI.


No, they just assumed their design fits within the operational envelope of a conventional satellite - the paper (which no one read, apparently) literally says their system design "assumes a relatively conventional, discrete compute payload, satellite bus, thermal radiator, and solar panel designs".

This is not the 1960s. Today, if you have an idea for doing something in space, you can start by scoping out the details of your mission plan and payload requirements, and then see if you can solve it with parts off a catalogue.

(Of course there are a million issues that will crop up when actually designing and building the spacecraft, but that's too low level for this kind of paper, which just notes that (the authors believe) the platform requirements fall close enough to existing systems to not be worth belaboring.)


Since this isn't the 1960s, and it's Google with their resources, maybe they'd go for some superconducting logic based on Josephson Junctions, like RSFQ? In parts, at least?

So they wouldn't have the burden of cooling it down first, like on Earth? Instead they could rely on the cold out there, as long as the hardware stays in shadow or is otherwise isolated from sources of heat? With less of the mess we deal with on Earth, since it's fucking cold up there already? And depending on the ratio of superconducting logic to conventional CMOS, there'd be less heat to reject, because the superconducting parts dissipate less and the remaining 'smartphony' stuff is easy to deal with?

If I had those resources at hand, I'd try.


> Instead being able to rely on the cold out there, as long as it stays in the shadow, or is otherwise isolated from sources of heat?

All the sources of power to run anything are also sources of heat. It doesn't matter whether it's the sun or RTGs; the heat is unavoidable.

> Since it's fucking cold up there already?

Better to describe it as an insulator rather than hot or cold: a vacuum carries no heat away by conduction or convection, so radiation is the only way to dump it.
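
To put a rough number on "radiation is the only way out", here's a back-of-envelope radiator sizing sketch using the Stefan-Boltzmann law. The temperature, emissivity, and power figures are illustrative assumptions of mine, not anything from the paper.

    # Radiator sizing sketch: P = emissivity * sigma * A * T^4
    # All numbers below are illustrative assumptions.
    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

    def radiator_area_m2(power_w, temp_k, emissivity=0.9):
        # Area needed to radiate power_w with the radiator held at temp_k.
        return power_w / (emissivity * SIGMA * temp_k ** 4)

    print(radiator_area_m2(1000, 300))  # ~2.4 m^2 per kW at 300 K

Note the T^4: the colder you want the radiator to run, the more area it takes to reject the same power.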

> If I had those resources at hand, I'd try.

FWIW, my "if I had the resources" thing would be making a global power grid. Divert 5% of current Chinese aluminium production for the next 20 years. 1 Ω the long way around when finished, and then nobody would need to care about the duty cycle of PV.

China might even do it; there was some news a while back about a trans-Pacific connection, but I didn't think to check whether it was some random Chinese company doing a fund-raising round with big plans, or something more serious.
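
For what it's worth, the "1 Ω the long way around" figure is easy to sanity-check with R = ρL/A. The conductor cross-section below is my own illustrative assumption, not the parent's:

    # Back-of-envelope resistance of an aluminium loop around the Earth.
    # Cross-section is an illustrative assumption.
    RHO_AL = 2.65e-8  # resistivity of aluminium, ohm * m
    LENGTH_M = 4.0e7  # roughly Earth's circumference, m

    def loop_resistance_ohm(cross_section_m2):
        # R = rho * L / A for a single solid conductor.
        return RHO_AL * LENGTH_M / cross_section_m2

    print(loop_resistance_ohm(1.0))  # ~1.1 ohm for a 1 m^2 cross-section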


The basic principle is still that you need to shed as heat whatever energy you absorb from the Sun. Electronics don't create extra heat; they convert electricity into it. So, unless I'm missing something, I'd expect any benefit of superconducting logic to manifest as less power required per unit of compute, or more compute per fixed energy budget. Power requirements can't go to zero for fundamental reasons (the ones that make parts of CS into branches of physics).
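
Those "fundamental reasons" presumably refer to Landauer's limit: a minimum dissipation of kT·ln 2 per irreversibly erased bit. A quick sketch with illustrative temperatures (assuming that's the bound the parent means):

    # Landauer's limit: E = k * T * ln(2) joules per irreversibly erased bit.
    # Temperatures are illustrative.
    import math

    K_B = 1.380649e-23  # Boltzmann constant, J/K

    def landauer_j_per_bit(temp_k):
        return K_B * temp_k * math.log(2)

    print(landauer_j_per_bit(300))  # ~2.9e-21 J/bit at room temperature
    print(landauer_j_per_bit(4))    # ~3.8e-23 J/bit at 4 K

So cold hardware lowers the floor but doesn't remove it, which is consistent with the point above: the benefit shows up as more compute per unit of power, not zero power.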

> If I had those resources at hand, I'd try.

I would too, and maybe they will, eventually. This paper is merely exploring whether there's a point in doing it in the first place.



