shaism's comments

Ilya mentioned in the video that 2012 to 2020 was the “Age of Research”, followed by the “Age of Scaling” from 2020 to 2025. Now, we are about to re-enter the “Age of Research”.


How do I get stuff from my “onesmartphone” to the “onecomputer”?

Or shall I also put the “onesmartphone” in the cupboard?


The phone here basically does IMAP (which is sync, I suppose) and gets plugged into the computer so stuff can be copied around manually as required, which turns out to be rare since it's not the primary device!


Do you mind elaborating?

I agree that weather data should be public but I don’t see why we should restrict innovation in the private market if there is demand for it.

Also more generally, I see no issue in the government outsourcing work to a competitive private market wherever possible.


I have no issue with it if it is part of a legislative/regulatory framework. This is not inside of any framework. There has been no conversation about privatization of NOAA or any of its functions. These things need to be explicit as part of a democracy.

The current regime has upended that process and has created a situation where the government has no choice but to outsource data gathering to third parties. This is corruption and not in the spirit or the letter of the law.

This startup is attempting to take advantage of an illegal situation which is just ridiculous.

I'm happy if they want to sell fancy weather balloons to anyone they want, even the government, but selling data back to a government that should already be collecting that data in the first place BY LAW is just corrupt.


Very cool. I implemented something similar for personal use before.

At that time, LLMs weren't as proficient in coding as they are today. Nowadays, the decorator approach might even go further and not just wrap LLM calls but also write Python code based on the description in the docstring.

This would incentivize writing unambiguous docstrings, and guarantee (if the LLMs don't hallucinate) consistency between code and documentation.
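A minimal sketch of what such a decorator could look like (assuming the openai Python package is installed and an API key is configured; the implement decorator, the gpt-4o model name, and the slugify example are hypothetical illustrations, not a production implementation):

    import functools
    import inspect

    from openai import OpenAI  # assumes openai>=1.0 is installed and OPENAI_API_KEY is set

    client = OpenAI()

    def implement(func):
        """Hypothetical decorator: have an LLM write the function body from its docstring."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            prompt = (
                f"Write a Python function named {func.__name__} with the signature "
                f"{inspect.signature(func)} that does exactly what this docstring says:\n"
                f"{func.__doc__}\n"
                "Return only the code, with no explanations."
            )
            response = client.chat.completions.create(
                model="gpt-4o",  # model name is an assumption; use whatever you have access to
                messages=[{"role": "user", "content": prompt}],
            )
            source = response.choices[0].message.content.strip()
            # Models often wrap code in markdown fences; strip them naively.
            source = source.strip("`").removeprefix("python").strip()
            namespace = {}
            exec(source, namespace)  # no sandboxing here; a real version needs guardrails
            return namespace[func.__name__](*args, **kwargs)
        return wrapper

    @implement
    def slugify(title: str) -> str:
        """Lowercase the title, replace runs of non-alphanumeric characters with a
        single dash, and strip leading and trailing dashes."""

    print(slugify("Hello, World!"))  # generated implementation should print "hello-world"

Calling slugify triggers a round trip to the model on every invocation, so a real version would cache the generated source per function (and ideally test it before trusting it), but it illustrates how the docstring effectively becomes the program.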

It would bring us closer to the world that Jensen Huang described, i.e., natural language becoming a programming language.


People have been talking about natural language becoming a programming language for way longer than even Jensen Huang has been talking about it. Once upon a time, they tried to adapt natural language into a programming language, and they came up with this thing called COBOL. Same idea: "then the managers can code, and we won't need to hire so many expensive devs!"

And now the COBOL devs are retiring after a whole career...


But isn't it actually more like: COBOL lets you talk in COBOL-ese (which is kinda stilted), whereas LLMs let you talk in LLM-ese (which gets a lot closer to actual language)? And then, since the skill cap on language is basically infinite, this becomes a question of how good you are at saying what you want, to the extent it intersects with what the LLM can do.


COBOL was the best attempt they could manage in the 1960s. That's the entire reason COBOL has things like paragraphs, statements that end with periods, etc. They wanted as much of an "English-like syntax" as possible.

The reason it looks so odd today is that so much of modern software is instead the intellectual heir of C.

And yeah, the "skill cap" of describing things is theoretically infinite. My point was that this has been tried before, and we don't yet know how close an LLM's actual limitations let it come to that ideal. People have been trying for decades to describe things in English that still ultimately need to be described in code for them to work; that's why the software industry exists in the first place.


The answer almost certainly is no. While lithography is one of the largest single contributors to manufacturing costs, its contribution to overall cost is still far below 10%.

And one cannot simply swap an optical lithography machine for a nanoimprint machine without redesigning other parts of the process (etch, metrology, etc.).

Investing R&D resources for a (best-case) 10% cost reduction, while still carrying a decent probability of failure, on a big but declining node is not worth it.


Maybe the market hasn’t recognized the value yet.

Hence, buy $GOOG.


Given your requirements, I would rank the frameworks like this:

* SEO -> 1. Astro, 2. Remix, 3. NextJS

* Performant Pages (for SEO) -> 1. Astro, 2. Remix, 3. NextJS

* Easy to Maintain (for a small team) -> 1. Astro, 2. Remix, 3. NextJS

* Self Hostable -> 1. Astro, 2. Remix, 3. NextJS

* Some interactivity and smooth transitions -> 1. Remix and NextJS, 3. Astro

From your description, it seems you are most familiar with NextJS. My question: How about the other maintainers? If everyone is familiar with NextJS, that is a big bonus point.

In conclusion, it comes down to how highly you prioritize "Some interactivity and smooth transitions" and "Familiarity with the chosen framework." Astro, being a multi-page framework first and foremost, will not deliver the smooth transitions that the other two will. However, it shines on pretty much all your other requirements.

If you can live with Astro being a Multi-Page Framework that, by default, does a full-page reload for every route, and your team is comfortable using Astro, I would go with Astro. Otherwise, pick NextJS or Remix, whichever your team is more familiar with.


Thank you, I really appreciate it, that helps me a lot! Currently I would be the only one starting this project and maintaining it. Everyone else is working on PHP and not transitioning to the frontend. I would be the one in charge of getting new frontend devs on board, so yeah. I worked with NextJS a lot and also use Astro a lot for smaller projects.


Actually that is how it works in many companies in Germany. It has its benefits and drawbacks, like every system ;)


Because for most users the costs of getting used to a new OS outweigh the benefit of switching.


No, it’s because people approximate the costs badly. Pretty much everyone I know is biased towards inaction even if the action provides immediate as well as long-term benefits. Especially when it comes to technology. Even the technology-oriented people.


I am not sure how you can be so certain that it would be an immediate benefit to those users… how are you measuring the cost of having to learn something new and the benefit of knowing how everything works?


> No, it’s because people approximate the costs badly.

"Why doesn't everyone switch to Linux? They must be stupid" is just as short-sighted now as it was 30 years ago.

Linux is great, but there are and were plenty of good reasons why it's not optimal for the average user. https://news.ycombinator.com/item?id=41283651


If we're talking purely about the average user, it's perfect for them. They primarily use a browser, which works great.

The trouble is the slightly above average user who is not technical. They think they know computers because they can click around in some random Windows GUI and get something to work. But they don't actually know much in a general "how stuff works" way.

And then there's the highly technical like SWE who thrive under Linux.

So really it's just the weird middle ground that struggles. You know, your MBAs who can use Excel but get scared at plaintext files.


I didn’t say it’s stupid. Everyone’s assessment is inaccurate every once in a while. Or pretty much always, when it comes to certain biases.

You say “average user” and then point to a comment where someone’s disappointed that they can’t get their professional software to work. The average user these days needs a web browser, and doesn’t care which one.


> I didn’t say it’s stupid.

You're claiming that the main reason most people don't switch to Linux is some sort of deficiency of judgment rather than anything practical, and that's not true and it's never been true.

You dismissed the costs of getting used to a new OS right off the bat, and that's a real thing. In my link that you dismissed, the first sentence ("while Windows is very hostile ... to its users, it's rarely broken or buggy") is relevant to most users. Professionals may have trouble getting their apps to work; gaming has gotten better but still has the same problem. If a user literally only cares about running a web browser, then yeah, Linux would be fine; but Windows is already fine, so why bother switching?

I like Linux. I've used it at home and professionally. I'm a huge fan of FOSS principles, and I think it would be better for everyone if more people used open systems. But Windows works well enough to satisfy most people, and they're not going to change purely for ideological reasons. Maybe it'd be a better world if they did! But it's not reasonable to expect them to, or to cast that as a failure of judgment.


What was that about respecting users?


Devil's advocate: If there is no disruption in Taiwan, and TSMC continues to execute as it has, and Intel continues to execute as it has (not), then Intel might go to $0 or survive only because of government subsidies and military contracts.

NVIDIA, AMD, Apple and other companies seeking cutting edge performance will choose the most advanced node to stay competitive. Being the second best foundry has historically not been good business.

This is an extreme scenario, of course.


Does TSMC have sufficient capacity in the works to satisfy all the demand for sota/near sota fabrication?

People sometimes wonder why anyone is buying Boeing planes—it’s because it’s that or nothing. Airbus has no spare capacity.


TSMC today has more than 60% of the foundry market share, and an estimated 80%+ market share for the leading edge.

If you exclude Intel themselves (although they also use TSMC now) and Samsung, TSMC is pretty much the sole supplier to the leading edge.

So, yes, TSMC has the capacity.

Your Boeing / Airbus analogy hinges on multiple factors. First, Airbus and Boeing have comparable capacity. Second, they have comparable products. Neither is true when you compare Intel and TSMC.

NVIDIA is heavily supply constrained. Why haven’t they sourced Intel or Samsung as a second source?

Historically, there are also many other issues with Intel operating as a foundry. Do you think NVIDIA and AMD will be happy to send their CPU and GPU designs to Intel for manufacturing? Independence was one of the main reasons TSMC was founded: to have an independent supplier that does not compete with its customers.


It was a genuine question, I don’t follow the sector closely.

That said to poke at your answer a bit:

If NVIDIA is heavily supply constrained, wildly profitable, and strictly limited to TSMC, doesn’t that dynamic lead to other customers being pushed aside? If so, doesn’t that open an opportunity for Intel if they can execute?

