
This may not be the “right” place to ask this, but I am a software engineer in the entertainment industry and it’s pretty bad right now. What is a pivot I can make? My skills are mostly C++ and Python.


It might be better in its own "Ask HN" post.

If you want to get into AI topics: start learning about LLMs, embeddings, vector stores, datasets, fine-tuning, evals, and how to use CLI dev tools like Claude Code.

Andrew Ng helped build a great free resource with courses: https://www.deeplearning.ai/ (some are sponsored but still free)
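
If it helps make a couple of those terms concrete, here's a minimal sketch of embeddings plus a toy vector search in Python (assuming the sentence-transformers package; the model name and documents are just placeholders):

    # Minimal sketch: embed a few documents, then find the closest one to a query.
    # Assumes the sentence-transformers package is installed; model name is an example.
    import numpy as np
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("all-MiniLM-L6-v2")
    docs = ["render farm scheduling", "shader compilation", "vector databases"]
    doc_vecs = model.encode(docs, normalize_embeddings=True)

    query_vec = model.encode(["how do I search embeddings?"], normalize_embeddings=True)
    scores = doc_vecs @ query_vec.T   # cosine similarity, since vectors are normalized
    print(docs[int(np.argmax(scores))])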


I think a quicker pivot (job-wise) would likely be Python web development. They already know Python. Then layer on LLMs while doing that job (with more "runway").


Yes and no. Some new things are quite fun, like expeditions or derelict freighters, but there’s no “story” per se beyond the base one.


Wait a minute, it’s possible to relicense something from GPL to MIT?


Yes, if you are the only developer and never received nor accepted external contributions, or if you managed to get permission from every single person who contributed, or replaced their code with your own.


> or if you managed to get permission from every single person who contributed

This makes it sound more difficult than it actually is (logistically); it's not uncommon for major projects to require contributors to sign a CLA before accepting PRs.


That depends on how old and big the project is. For example, Linux is "stuck" on GPLv2, and even if they wanted to move to something else it wouldn't be feasible to get permission from all the people involved. Some contributors have passed away, making it even more difficult.


Not exactly “stuck” since they very explicitly do not want to move to GPL 3.


Even if they wanted to move to another license (which they don't), they wouldn't be able to. So it sounds exactly like they're "stuck", regardless of what they want.


How is the problem of "you signed a CLA without authorization by your employer to do so" solved? I'm mostly asking because I saw the following:

"I will not expose people taping out Hazard3 to the possibility of your employer chasing you for your contribution by harassing them legally. A contribution agreement does not solve this, because you may sign it without the legal capability to do so (...)"

https://github.com/Wren6991/Hazard3/blob/stable/Contributing... (this is, I believe, the repo with the design for the RISC-V cores in the RPi Pico 2)


These are the ones I refuse to contribute to.


Yes. Generally you need permissions from contributors (either asking them directly or requiring a contribution agreement that assigns copyright for contributions to either the author or the org hosting the project), but you can relicense from any license to any other license.

That doesn't extinguish the prior versions under the prior license, but it does allow a project to change its license.


There’s this remarkably stupid stereotype that muscle equates to stupidity. If you look at some of the champions, they all have degrees in fields like Accounting (Ronnie Coleman, Phil Heath) or Chemistry (Frank Zane). Schwarzenegger famously did business school at night while he was a bodybuilder.


This is one game I think would be amazing in VR.


You're right, I think they would be fun.

Part of the appeal was definitely the retro look, but this is a very specific retro look that the author worked hard to put into 3D space.

If people tried to bring a "retro" look to VR games, you'd mostly just get lots of blocky, cheap-looking graphics. But this could really be weird, in a good way.


The modding community did look at it, but the first problem was that it's a Unity game built for D3D9, which doesn't allow the VR plugin to be dropped in. They did ask the developers to create a D3D10/11 build to allow that, but I'm not sure it ever happened.

Even if it did, VR relies on getting consistent images for each eye for the stereo view, which might be difficult with this style of rendering. It'd be hard to know if it works without trying it!


I don't know, I start feeling off when just watching a video of the game (a little queasy, a bit of a headache). I think that playing it in VR would be a horrible experience.


Dumb question here, but can X-Ray lasers be used to make CPUs?


That's not a dumb question at all! They're next in line, but there's a long road ahead!

Modern EUV lithography uses 13.5nm light, barely shy of the 10nm cutoff for X-Rays (and even that cutoff is debatable). Many of the problems we'll need to solve are already in play with EUV lithography, but with X-Rays they get turned up to 11. Directing the light is a huge one: most materials are transparent to X-Rays, so lenses aren't going to work, and mirrors are difficult. Building an EUV or X-Ray mirror requires coating stacks of tens to hundreds of layers, each only nanometers thick, and even then they can't manage very high reflectivity. Also, at these energies the light easily ionizes substrate atoms, knocking out electrons which travel around and affect nearby atoms, causing weird non-local stochastic effects.
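
Rough back-of-the-envelope numbers (mine, using hc ≈ 1240 eV·nm) to show why: going from EUV to ~1nm X-Rays multiplies the per-photon energy by roughly an order of magnitude:

    # Back-of-the-envelope photon energies, using hc ~= 1239.84 eV*nm.
    HC_EV_NM = 1239.84

    def photon_energy_ev(wavelength_nm):
        # E = hc / lambda, in electron-volts
        return HC_EV_NM / wavelength_nm

    print(f"EUV, 13.5 nm:     {photon_energy_ev(13.5):6.1f} eV")  # ~92 eV
    print(f"Soft X-ray, 1 nm: {photon_energy_ev(1.0):6.1f} eV")   # ~1240 eV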

We've barely started EUV production and there's plenty of room for optimization there, so I'd bet we're decades away from using X-Rays commercially, but you'd better believe we're trying!

https://www.asml.com/en/products/euv-lithography-systems

https://www.sciencedirect.com/topics/engineering/x-ray-litho...


100s of nanometers is not what I would call thick...


It's even worse than that! There's hundreds of layers, but the individual layers are only a few nanometers each!
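
Rough numbers, back of the envelope: these mirrors work by Bragg-like interference, so at near-normal incidence the bilayer period wants to be about half the wavelength, which is where the "few nanometers per layer" comes from:

    # Back-of-the-envelope bilayer period for a near-normal-incidence EUV mirror.
    # Bragg condition: m * lam = 2 * d * sin(theta); take m = 1, theta ~ 90 degrees.
    import math

    lam_nm = 13.5
    theta_deg = 90.0
    d_nm = lam_nm / (2 * math.sin(math.radians(theta_deg)))
    print(f"bilayer period ~ {d_nm:.2f} nm")   # ~6.75 nm, i.e. a few nm per layer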


AMD has HIP.


Yes, but I don't think it's debatable to say that the entire ecosystem is firmly behind NVIDIA's CUDA. Usually it comes as a surprise when something does support AMD's frameworks, whether ROCm directly or even HIP, which should be easier....

I shouldn't be surprised that AMD's ecosystem is lagging behind, since their GPU division spent a good decade struggling to even stay relevant. Not to mention that NVIDIA has spent a lot of effort on their HPC tools.

I don't want this to be too negative towards AMD; they have been steadily growing in this space. Some things do work well, e.g. Stable Diffusion runs totally fine on AMD GPUs. So they seem to be catching up. I just feel a little impatient, especially since their cards are more than powerful enough to be useful. I suppose my point is that the gap in HPC software between NVIDIA and AMD is much larger than the actual capability gap in their hardware, and that's a shame.


Apple was a few months from bankruptcy during most of the 90s competing with IBM and Microsoft, then turned around to become the most profitable company on the planet. It takes a leader and a plan and a lot of talent and the exact right conditions, but industry behemoths get pulled down from the top spot all the time.


Apple's success is mostly UX and marketing, with a walled garden for an application tax. AMD has to actually deliver on the hardware side, not just marketing. Beyond this, AMD has demonstrated that they are indeed working on closing the gap and pulling ahead: they are well ahead of Intel on the server CPU front, and neck and neck on desktop, having been well ahead at points in the past few years. And on the GPU side, they've closed a lot of gaps.

While I am a little bit of a fan of AMD, there's still work to do. I think AMD really needs to take advantage of their production margins to gain more market share. They also need something a bit closer to the 4090 as a performance GPU plus entry workstation API/GPGPU workload card. The 7900 XTX is really close, but if they had something with, say, 32-48GB of VRAM in the sub-$2000 space, it would really get a lot of the hobbyist and SOHO types to consider them.


Yeah, sure, changing their platform 3 times in the space of some twenty years is just marketing and UX from Apple. They are just a bunch of MBAs. Sometimes I feel like I am reading slashdot.


The platform changes had very little to do with their success. They switched from PowerPC to Intel because PowerPC was uncompetitive, but that doesn't explain why they did any better than Dell or anyone else using the exact same chips. Then they developed their own chips because Intel was stagnant, but they barely came out before AMD had something competitive and it's not obvious they'd have been in a meaningfully different position had they just used that.

Their hardware is good but if all they were selling was Macbooks and iPhones with Windows and Android on them, they wouldn't have anything near their current margins.


You’re not really making sense.

If they hadn’t made platform changes they would have never been able to turn into what they are today. I hardly think that is ‘little to do’.

They would likely barely exist. They have ‘achieved product-market fit’, as the saying goes, which requires more than just a sharp UI, as their history shows.


I think the point here is that their first platform shift onto x86/x86-64 was driven by how far Power had fallen behind. Even their fans were having difficulty justifying the slowness of their comparatively expensive computers.

It was more forced upon them than anything else.

The move to M1 was an actual innovation that came after their success.

The real story of the company though is the iPhone, which is absolutely their own technical innovation.


I would say the turning point was either the iPod or the rainbow Macs.


I didn't say they didn't have technical prowess... I said that wasn't the key to their overall success. The iPod/iPhone/iPad came out ahead of the competition with better UX than what, generally, came before. They marketed consistently and did it well. They built up buzz. They paid for placement throughout TV and movies.

They are a brand first, and a technical company second. That doesn't mean they aren't doing cool technical things. But a lot of companies do cool technical things and still fail.


NV has a huge advantage over AMD: they only do one thing. And that has helped them to relentlessly focus on optimizing that one thing. AMD is fighting on three different fronts at once.


Yeah, but try running ML projects on your AMD card and you'll quickly see that they're an afterthought nearly everywhere, even in projects that use PyTorch (which has backend support for AMD). If consumers can't use it, they're going to learn NVIDIA, and experience has shown that people opt for the enterprise tech they're familiar with, and most people get familiar by hacking on it locally.
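
To be fair, the ROCm builds of PyTorch reuse the torch.cuda device API, so a sketch like the one below often runs unchanged on an AMD card (torch.version.hip is, as far as I know, how ROCm builds identify themselves); the friction is usually projects pinning CUDA-only dependencies:

    # Sketch: the same device-selection code covers CUDA and ROCm builds of PyTorch,
    # because the ROCm build exposes the torch.cuda API.
    import torch

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    is_rocm = getattr(torch.version, "hip", None) is not None
    print(f"device={device}, rocm_build={is_rocm}")

    x = torch.randn(1024, 1024, device=device)
    y = x @ x.T   # runs on whichever GPU (or CPU) was selected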


How many are dead or permanently disabled because of Covid, I wonder.


COVID has killed hundreds of thousands, but overwhelmingly older or weaker folks, the kind who are retired, semi-retired, or on disability. This would have had an impact, but not 10 million.


Is Flutter open source?


Yeah, it is, under a BSD license:

https://github.com/flutter/flutter


It's open source the way Chrome is open source: the source is there, but the development is ~100% by Google.


Google Chrome is not open source, Chromium is.

Reference: https://en.wikipedia.org/wiki/Chromium_(web_browser)#:~:text....


People keep pointing this out as if this made a grain of difference.


Not 100%; at least 30% of PRs come from contributors outside Google.


How relevant are those PRs to the actual development of the product?

That is, fixing minor bugs or translation strings vs adding new web APIs etc.?

