Hacker News | lII1lIlI11ll's comments

> … isn’t this… what you should be doing already?

I still "own" (i.e. I'm the sole user with root access and can install an OS of my choosing) an old machine from the days before everything moved to the cloud, and I guess no one from IT has gotten around to decommissioning it yet. I have no idea where it is located (besides knowing which office it is assigned to), I've never seen it, and no way in hell am I going to attach any tags or waste my time installing enterprise spyware on it or manually encrypting its data. Do engineers do that for development servers at your job? If yes, name and shame!


> Though in the original title AV1 could be anything if you don't know it's a codec.

I'm not trying to be elitist, but this is "Hacker News", not CNN or BBC. It should be safe to assume some level of computer literacy.


Knowledge of all available codecs is certainly not the same tier as basic computer literacy. I agree it doesn't need to be dumbed down for the general user, but we also shouldn't assume everyone here knows every technical abbreviation.

> I suspected that main topic of discussion here would be "is it worth learning CS at all in 2026?"

Considering the current state of the job market, I don't think it is a good idea to go into CS in 2026 expecting a lucrative career. People who just love to program will find a job eventually, of course.

> Does anyone suspect that some HN posts have a lot of astroturfing from AI-adjacent organisations?

Why does it have to be AI? I don't work for OpenAI/Anthropic/etc. and am an "AI skeptic" overall. I don't believe that the current job market conditions are caused by AI. I think the issue is that the field has become saturated with all your regular "fullstack web ninjas" while higher education institutions are still pumping out hordes of CS(-adjacent) grads. Things will get worse (people who went into CS before the downturn are still graduating) before they get better (fewer, more genuinely interested people are choosing the field these days, which will result in fewer graduates of higher average quality in a few years).


Yea, agreed: the job market isn't difficult due to AI. It's difficult because big tech overhired massively for years, making it seem like there was demand for this many developers. Turns out that if a field is somewhat saturated and the big tech corps lay off a lot of people, the market gets difficult. I'll enjoy my current employment for as long as possible, and hopefully the market will be better once I am looking for a job again.

I use my time off to learn a ton about Erlang/Elixir in the hopes of maybe entering the BEAM domain some day. Way less competition compared to JavaScript/Python devs.


I second this. Mathacademy is great, and there is no way OP would be able to just jump into university-level math courses without re-learning the prerequisites, considering they said they forgot most of their school-level math.

I wouldn't bet on LLMs steamrolling the jobs of nurses or military personnel any time soon.

There are so many more reasons to hate AI than just "it is taking my job". But even if we're just sticking to that, some people don't like that it will replace their co-workers', neighbors', or family members' jobs.

> Also, the big part of that issue is people are incentivized to make their GitHub profile look good to have a higher chance of getting hired.

Do people really get hired for a bunch of PRs to random repos on GH, or do they just think they will? My impression has always been that the GH profile is completely ignored by both recruiters and interviewers.


And this kind of behaviour is a red flag for people who actually go digging through the GitHub profile, like technical people in the last stages of a hiring process.

Is this aspirational or anecdotal, or is it what technical people in FAANG/tech actually do? I hope it's true, but it strikes me as the kind of thing that most technical people involved in the interview process would be too tired/overworked to do.

I agree. As a technical person who has been involved in hiring, I never looked at GitHub. My evaluation of a candidate was based on how he/she answered questions in the interview and on my general sense of "could I work with this person every day?" I had no spare time to go beyond that.

> "could I work with this person every day."

Communication skills (or lack thereof) on PRs or issues they opened are something I try to look for if they provide a GitHub profile. Signs of a big ego that will likely get in the way of day-to-day work are the main thing I look out for, and they're sadly not that uncommon.


I've worked at a couple of companies with pay scales on par with FAANG, as well as a startup that was extremely selective in hiring. We rarely looked at GitHub, and it was never the deciding factor in a hire. I could see a situation where good open source contributions might help someone get noticed by a recruiter, but that's so incredibly rare and hard to discover that it's kind of the last place people look. Having a good GitHub profile can't hurt, but LinkedIn is still king here.

Nobody has ever brought up my GitHub, though I did use my private GitLab projects to land my first dev job.

GitHub (or any repo host) is/was pretty much the only thing I care about when looking at people's applications, because it's the only thing that has anything to do with incontrovertible reality. I certainly don't give a shit about whether or where you went to university, and it's unlikely that I value your tenure at some company highly unless I know the people at that company.

I've seen quite a few HR-driven hiring processes where a mediocre HR person (who knows to look for a GH profile and activity on it, but not how to evaluate them) is paired with engineers who have too little input. In those processes, people who game their GH profiles tend to benefit.

I am from a "poor country in Eastern Europe". I'm not sure how you think homelessness is dealt with here, but I assure you it is nothing that a left-wing US liberal would find palatable.


I'm from Poland. Homelessness is not solved, maybe, but it's nowhere near the level of the US.

The solution seems to be public healthcare, education, transport, safety net and cheap housing.

Addiction and mental illness are excuses. Eastern Europe has more mental illness (generational trauma from WW2 is still alive) and alcoholism than the US, and yet it has fewer homeless people.

In the early 90s my parents were earning 20 USD per month each. That was about average. There were still almost no homeless people.

It's a solved problem.


It is not "solved"; it is marginalized, because homeless people are much less tolerated compared to, say, California. In very simple terms: a homeless person caught shitting on someone's porch in Eastern Europe gets punched in the face and kicked out. And no way would homeless people be allowed to just squat in some park or square with a tent encampment in a major city here.

Overall, I don't see either the US or Poland as a big outlier looking at the stats[0]; it just seems like some specific places (SF) made the homeless population a highly visible nuisance through feel-good, unrealistic policies that can't possibly work.

[0]: https://en.wikipedia.org/wiki/List_of_sovereign_states_by_ho...


According to these stats, Poland has 41% as much homelessness as the US.


Celluloid and IINA are "traditional" GUIs for mpv. They work fine.


Apple and Qualcomm _have_ to use the Arm ISA because they don't have an x86 license. Apple would likely have stayed on x86 if they could have used it in their in-house designs. Intel wouldn't issue an x86 license to Qualcomm or Apple, of course.


Very unlikely. Moving to Arm allowed Apple to have a single architecture across all their hardware and leverage iPhone designs as the starting point for Mac SoCs.


If Intel had licensed x86 for use in Apple's own finished computers, Intel would be in a way better position. It was foolish not just to lose that customer but also to legitimize ARM as a desktop and high-end option.

I think Apple would have switched anyway, though. They designed Apple Silicon for their mobile devices first (iPhone, iPad), which I doubt they would have made x86. The laptops and desktops strategically share the same ISA as the iPhone.


There was a rumour on here that AArch64 was actually designed by Apple and given to Arm to standardize.


It's a false rumour. Arm has decades of ISA design experience, and their chief architect has talked about designing AArch64.

Sure, Apple and Arm worked together, but it wasn't developed by Apple and given to Arm.


It was more compelling when the M1 came out, but these days, even accepting every possible x86-related patent as valid, everything pre-AVX should have expired by 2026. So no license needed.


> If they don't add big features every year, the tech press crucifies them as "just putting out another version of the same thing".

And then what? Would Mac users buy some janky Acer with Windows 11 and a bunch of preinstalled malware instead?

