
American business leaders have (had?) an obsession with gross margin and tech "advancedness." They thought they would be the winners as long as they occupied the high-tech sectors of the supply chain. So they discarded the high-volume, low-margin, low-growth, low-tech businesses like assembly lines and outsourced them. But the reality is that the proximity of the assembly lines creates a cost advantage that attracts more upstream suppliers to cluster around them. Even Intel was seeking to build more fabs in China before being stopped by the US government.


If that were the true secret sauce of the economic success in China, why had it not taken off before the 2000s? Like, they have been that "aligned" and "want the same thing" and "run by engineers" since the 50s, no?


It kind of did. GDP per capita grew at around 6% per year from 1952-1980. It was starting from such a low base that it was still pretty low in 1980, but it was much improved. And Mao was not an engineer.


6% compared to the post-2000s is mediocre, especially given the low baseline. It's not remarkably better than other high-income democratic countries like Japan and West Germany. Even the US could manage ~4% growth at the time.


> why had it not taken off before the 2000s?

This topic has been discussed on Chinese forums and social media like a million times. The short answer is it did. To give you a perfect example: the J-10 fighter jet was first tested in 1998, and it shot down multiple of the best EU-made fighter jets last year.


They did. Developmental state for a huge country = phases measured in generations. 1.4B people can't get away with building out a few industries like the other tigers (JP/SKR/TW/SG), who can capture a few high-end niches and do fine per capita.

TLDR timeline

50s-70s: Soviet engineers / knowledge transfer from post-war wreckage built basic industry.

80s-10s: relentlessly building out every industrial chain in every sector except the leading edge, because of lack of talent.

90s-00s: building out the academic system as the talent pipeline.

2010s-20s: brrrting tertiary talent. Couldn't brrrt tertiary talent without teaching peasants literacy in the 60s, and then having literate parents under 80s family planning (i.e. the one child policy), which filtered generations of 1-2 kid households where the surplus went towards education/tertiary. All the recent highend progress was the result of that, step by step, building on a generational phase/timescale. PRC only passed the US in total STEM a few years ago; now they're on trend to a talent inflection point of 2x-3x STEM vs the US in the next 20 years. People mock the one child policy, but it was exactly choreographed for this outcome, one of the few cases of generational peasant-to-PhD planning. It took 50-year foresight to build up the greatest high-skill demographic dividend in human history, not 100-year foresight, because the cost is a shit TFR for the next 50 years.


The "silicon shield" is a slogan that only became popular in recent years. The potential crisis of war has been there for more than half a century (even before semiconductors became a thing). The real value proposition of the status quo is the freedom of navigation between the northeastern Asian countries and SEA (the Strait of Malacca, aka the lifeline of energy imports), and the consequential domino effect across the entire western Pacific.

Also, not sure why everyone forgets about it. People should have learned from the experience of the pandemic that the cutting-edge foundry nodes are not really the crucial bottleneck of industrial infrastructure. A delay of the next-gen iPhone or RTX gaming card isn't that catastrophic. But a shortage of embedded MCUs, which are actually fabricated on mature nodes, could stall the entire industrial base of a country.


I'd bet, on average, the quality of proprietary code is worse than open-source code. There have been decades of accumulated slop generated by human agents with wildly varied skill levels, all vibe-coded by ruthless, incompetent corporate bosses.


There are only a few very niche fields where closed-source code quality is often better than open-source code.

Exploits and HFT are the two examples I can think of. Both are usually closed source because of the financial incentives.


Here we can start debating what "better code" means.

I haven’t seen HFT code, but I have seen examples of exploit code, and most of it is amateur hour when it comes to building large systems.

They are of course efficient at getting to the goal. But exploits are one-off code that isn't there to be maintained.


It doesn’t matter what the average is though. If 1% of software is open source, there is significantly more closed source software out there and given normal skills distributions, that means there is at least as much high quality closed source software out there, if not significantly more. The trick is skipping the 95% of crap.


In my time, I have potentially written code that some legal jurisdictions might classify as a "crime against humanity" due to the quality.


Not to mention, a team member is (surprise!) fired or let go, and no knowledge transfer exists. Womp, womp. Codebase just gets worse as the organization or team flails.

Seen this way too often.


Developers are often treated as cogs. Anyone should be able to step in and pick things up instantly. It’s just typing, right? /s


> reduce energy requirement by 10-50 times

This is only relevant to compute productivity (how much useful work it can produce); it's irrelevant to the heat dissipation problem. The energy input is fundamentally limited by the sun-facing area (x 1361 W/m^2). So the energy output cannot exceed it, regardless of whether it's useful signal or just waste heat. Even if we just put a stone there, the equilibrium temperature wouldn't be any better or worse.


Assuming we place an iron ball (an ideal sphere with ideal thermal conductivity) in SSO (sun-synchronous orbit), how hot can the object get?

Given the solar constant 1361 W/m^2, you can calculate the temperature range based on the emissivity and absorptivity. With the right shape and “color”, the equilibrium temperature can be cooler than most people thought.

I suppose that a space data center powered 100% by solar is no different than this iron ball in principle.
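To make the iron-ball picture concrete, here is a minimal sketch of the equilibrium-temperature calculation (the solar constant and Stefan-Boltzmann values are standard; the alpha/eps combinations are illustrative assumptions, not measured coatings, and radiation to ~0 K deep space is idealized):

```python
S = 1361.0        # solar constant at 1 AU, W/m^2
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def sphere_eq_temp(alpha, eps):
    """Sphere absorbs over its cross-section pi*r^2 but emits over
    its full surface 4*pi*r^2, hence the factor of 4."""
    return (alpha * S / (4 * eps * SIGMA)) ** 0.25

# A "grey" ball (alpha == eps) sits near 278 K (~5 degC):
print(sphere_eq_temp(1.0, 1.0))   # ~278 K
# A selective "color" (low solar absorptivity, high IR emissivity) runs colder:
print(sphere_eq_temp(0.2, 0.9))   # ~191 K
```

Note that for a grey body the alpha/eps ratio cancels, which is why the bare ball lands near 278 K no matter how dark it is; only a *selective* surface shifts the equilibrium.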


The ideal shape would be a shaded, flat panel perpendicular to the sun right?


That should be better than a sphere. Though I imagine there could be some fancier 3D geometry designs.

Even for a simple sphere, if we give it different surface roughnesses on the sun-facing side and the "night" side, it can have dramatically different emissivity.


About 120 degC.


It makes sense to target a higher operating temperature, like 375K. At some point, the energy budget would reach an equilibrium. The Earth constantly absorbs solar energy and also dissipates the heat only by radiative cooling. But the equilibrium temperature of the Earth is still kind of cool.

I guess the trick lies in the operating temperature and the geometry of the satellites.


Asking for a friend (who sucks at thermodynamics:) could you use a heat pump to cool down the cold end more and heat up the hot end much higher? Heat radiation works better the higher the temperature?


Not sure about the effectiveness of a heat pump in this use case.

>Heat radiation works better the higher the temperature?

The power output is proportional to T^4 according to the Stefan-Boltzmann law.
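Because of that T^4 scaling, even a modest rise in radiator temperature shrinks the required area fast, which is the appeal of pumping heat to a hotter radiator. A rough sketch (the 1 MW load and eps = 0.9 are assumptions for illustration, and the ~0 K sink is an idealization):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area(power_w, temp_k, eps=0.9):
    """Area needed to radiate power_w at temp_k into deep space."""
    return power_w / (eps * SIGMA * temp_k ** 4)

for t in (300, 375, 600):
    print(t, "K:", round(radiator_area(1e6, t)), "m^2")
```

Doubling the radiator temperature from 300 K to 600 K cuts the area by a factor of 16, though the heat pump's own power draw (and its Carnot penalty) eats into that gain.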


https://www.mdpi.com/1996-1073/16/10/4010

40% isn't much in the grand scheme of things, but maybe they can reach a higher reduction with more research/materials. Mass and power are pretty cheap for SpaceX, so shipping more solar panels and a heat pump might not be a deal breaker.

Would, e.g., a 90% reduction in radiator area change the picture on overall feasibility? I think not; it would still be ludicrous, but I'd be happy to be proven wrong.


The radiator area is probably not as big a worry as we thought. When the energy input comes 100% from solar, they just need to optimize the ratio of the sun-facing cross-sectional area to the total surface area of the satellite. If the ratio is low enough (like a fin- or cone-shaped object), it will be hard for it to get hot.
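The geometry argument can be sketched in a few lines: the equilibrium temperature depends only on the ratio of absorbing cross-section to total radiating surface (assuming alpha = eps = 1; real coatings and self-shadowing change the numbers):

```python
S = 1361.0        # solar constant at 1 AU, W/m^2
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def eq_temp(ratio):
    """ratio = A_cross / A_total: sun-facing cross-section over
    total emitting surface area."""
    return (ratio * S / SIGMA) ** 0.25

print(eq_temp(0.50))  # thin plate face-on, radiating both sides: ~331 K
print(eq_temp(0.25))  # sphere: ~278 K
print(eq_temp(0.05))  # long fin/cone mostly edge-on to the sun: ~186 K
```

The fourth-root dependence is the catch: cutting the ratio by a factor of 10 only lowers the temperature by a factor of ~1.78.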


Are we not talking about radiating away the energy used to run the DC? I assume the solar panels should face the sun and radiate leftover energy away, hopefully as little as possible as solar cells get more efficient.

The radiators are for dissipating the waste heat coming from the data center equipment. I'd agree they better not be pointed at the sun ;) In fact, the DC should probably hide behind the solar array to not pick up extra heat from the sun.

The radiators, also behind the solar array, will also be hot because of the DC waste heat being conducted there ;)

I probably completely misunderstood your point.


It's a minor point, but the Earth doesn't radiate away all of that heat; it's not at equilibrium, and that's why we have climate change.


The label 'open source' has become a reputation-reaping and marketing vehicle rather than an informative term since the Hugging Face benchmark race started. With the weights only, we cannot actually audit whether a model is a) contaminated by benchmarks, b) built with deliberate biases, or c) trained on copyrighted/private data, let alone allow other vendors to replicate the results. Anyways, people still love free stuff.


Just accept that IP laws don't matter and the old "free software" paradigm is dead. Aaron Swartz died so that GenAI may live. RMS and his model of "copyleft" are so Web 1.0 (not even 2.0). No one in GenAI cares AT ALL about the true definition of open source. Good.


Good?


Rather than API/ABI stability, I think the problem is the lack of coherence and too many fragile dependencies. Like, why should a component as essential as systemd have to depend on a non-essential service called D-Bus? Which in turn depends on an XML parser lib named libexpat. Just d-bus and libexpat combined take a few megabytes. Last time I checked, the entire NT kernel, and probably the Linux kernel image as well, is no more than single-digit MBs in size. And by the way, systemd itself doesn’t use XML for configuration; it has an INI-style configuration format.
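For reference, that INI-style format looks like this. A minimal sketch of a unit file (the service name and binary path are hypothetical):

```ini
# /etc/systemd/system/example-daemon.service (hypothetical)
[Unit]
Description=Example daemon
After=network.target

[Service]
ExecStart=/usr/bin/example-daemon
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Plain sections and key=value pairs, no XML anywhere in sight.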


That's why they're doing Varlink now.


The difference is that you can customize/debug it or not. You might say that a .EXE can be modified too. But I don't think that's the conventional definition of open source.

I understand that these days, businesses and hobbyists just want to use free LLMs without paying subscriptions for economic motives, that is, either saving money or making money. They don't really care whether the source is truly available or not. They are just end users of a product, not open-source developers by any means.

