
There will always be value in doing work that other people don't want to do themselves or that requires expertise and skill that isn't conveyed all that well through books or pictures. The economy used to be full of stable masters for horses and carriages, and manual typists, and street lamp lighters, and television repairmen, and any number of jobs that don't exist anymore.

I'm pretty sure we'll survive.


I think it's a stretch to call having to make a living in a career other than your preferred job "suffering". Even before AI, there were surely millions of people who grew up wanting to be an artist, or an astronaut, or an architect, or any number of things that they never had the skills or the work ethic or the resources to achieve. I'm sure before cars there were people who loved maintaining stables of horses and carriages, and lamented the decline of the stable master profession. It's no different now.

At one point, there was a case for it: preventing scammy and fraudulent apps. For a long time, the iOS App Store had much higher quality than Android's.

But now? There are tons of scammy and fraudulent apps on the App Store. If you try to search for any popular app, you'll be presented with a dozen lookalike apps with similar names and logos.


Apple's "manual review" process stopped meaning anything to me when they verified a trojan horse version of LastPass: https://blog.lastpass.com/posts/warning-fraudulent-app-imper...

I don't even know how this is possible. FOSS repos have more security than that...


Yep. And this has been the case for over a decade.

They might do some sampling, but they're definitely not checking everything.

The first app I published in 2012 had a backend, but the Apple team never logged in with the provided credentials, or even tried anything.


Like when you search for anything "AI" and get bombarded with a wall of minimalist goatse

Also: gambling apps. Legal, sure, but also incredibly scammy.

And there are literally app farms pushing hundreds of concealed illegal gambling / casino / betting apps to the App Store daily. Apple approves every single one.

They then get removed within days or weeks, but it just proves their review process is a joke.


License plate holders that obscure the license plate on private property.


Look at any hobby and there are lots of beginners and casuals and far fewer people who are very skilled at it. The Maker hobby is no different. It's certainly not a problem of the microcontrollers available. Arduino is the simplest, but there are plenty of others.

The "blinky LED" roadblock is really just a result of the fact that more complex "maker" projects require some amount of electrical or engineering or fabrication knowledge and skill, which takes some trial and error and practice -- the same thing that limits progress in lots of other hobbies.

The real "Maker" movement is the demand that drives so many consumer-level fabrication tools and components that used to be available only as expensive industrial and commercial orders -- 3D printers, laser cutters, microcontrollers, IC sensors, brushless motors. There are so many options now that just weren't available at all 20 years ago.


I agree that the increased availability of fabrication tools is a good outcome.

Yet if the intent is for the population to be democratically empowered to wield these tools, there needs to be a better pedagogical culture in these communities.

I cannot believe the number of people replying who seem to think that having a path to improvement is gatekeeping. How are people supposed to actually use these tools to make greater-than-novelty-level changes in their lives and communities?

The price of Arduino has not only been going up and up, but there have been IP disputes over the years. At the same time, you can get chips for pennies on the dollar. People in this thread are lamenting the possible demise of Arduino when, as with Cloudflare, GitHub, and so many other things, they should never have been so invested in a single player.

The result of Arduino going away should be: "Ah, it is a sad day that one of our many choices of accessible boards is going away. Let's make sure the other ones are robust against that same fate and keep creating with our remaining tools."

Instead, the conversation is "How dare that big corp change the terms and conditions on our only hobby option!"

I certainly see a structural and cultural problem here.


Wow, I can't believe this is still around! I'm glad to see artifacts from the past like this are still out there on the internet.

Makes me miss Google CodeJam though.


Sounds more like people need to de-UK. It's going to be a problem with any company or technology.


It's more likely to be a problem with Apple (and Google) because they have put themselves in a position where they are a gateway to everybody. There are multitudes of online storage providers outside of the UK's reach and jurisdiction but 0% of iPhone users back up to them because of technical limitations that inhibit iCloud competitors or any compatible storage solution.


> 0% of iPhone users back up to them because of technical limitations that inhibit iCloud competitors or any compatible storage solution.

To clarify, by "technical limitations" here you don't mean "it's not possible with our current technology", you mean "Apple purposely blocks this".


Allegedly it's deliberate, according to a pair of legal actions they face in the UK (hearing in 9 days) and US (hearing in August 2026).

> 13.1 a set of technical restrictions and practices that prevent users of iOS from storing certain key file types (known as “Restricted Files”) on any cloud storage service other than its own iCloud and thus ensuring that users have no choice but to use iCloud (a complete monopolist in respect of these Restricted Files) if they wish to meet all their cloud storage and/or back up needs, in particular in order to conduct a complete back-up of the device (“the Restricted File Conduct”); and/or

> 13.2 an unfair choice architecture, which individually and cumulatively steer iOS Users towards using and purchasing iCloud rather than other cloud storage services, and/or limit their effective choice, and/or exclude or disadvantage rivals or would-be rivals (“the Choice Architecture Conduct”). See further paragraphs 6 to 9 and 97 to 132 of the CPCF.

https://www.catribunal.org.uk/cases/16897724-consumers-assoc... (via summary of ruling of the chair)

> 30. By sequestering Restricted Files, and denying all other cloud providers access to them, Apple prevents rival cloud platforms from offering a full-service cloud solution that can compete effectively against iCloud. The cloud products that rivals can offer are, by virtue of Apple’s restraints, fundamentally diminished because they can only host Accessible Files. Users who want to back up all of their files—including the basic Restricted Files needed to restore their device at replacement—have but one option in the marketplace: iCloud.

> 31. There is no technological or security justification for Apple mandating the use of iCloud for Restricted Files. Apple draws this distinction only to curtail competition and advantage its iCloud product over rival cloud platforms.

https://www.courtlistener.com/docket/68303306/felix-gamboa-v... (via document 1 the complaint)


> There are multitudes of online storage providers outside of the UK's reach and jurisdiction

Not according to the UK, lately. The problem is still domestic. The UK wants to exert this control over any service a UK citizen happens to use, whether it has a UK presence or not. Same with the ID/age verification stuff.

Moving away from Apple and Google probably is something they should do, but it's not going to be a solution to the problem of the UK government's overreach.

UK citizens need to turn their attention inward against their government.


Readers may be interested to know what my MP had to say when I got in touch about this:

Thank you for your email.

The UK has a strong tradition of safeguarding privacy while ensuring that appropriate action can be taken against criminals, such as child sexual abusers and terrorists. I firmly believe that privacy and security are not mutually exclusive—we can and must have both.

The Investigatory Powers Act governs how and when data can be requested by law enforcement and other relevant agencies. It includes robust safeguards and independent oversight to protect privacy, ensuring that data is accessed only in exceptional cases and only when necessary and proportionate.

The suggestion that cybersecurity and access to data by law enforcement are at odds is false. It is possible for online platforms to have strong cybersecurity measures whilst also ensuring that criminal activities can be detected.

It should be noted that the Home Office cannot comment on operational security matters, including confirming or denying the existence of any notices it has issued. This has been the longstanding position of successive UK Governments for reasons of national security.

I support the responsible use of data and technology to drive economic growth, create new jobs, and empower individuals. It is essential that data is used safely and wisely, and that individuals remain in control of how their data is used.

Additionally, I welcome the Government’s transparency regarding how data is used, including on the algorithms that process this information. Several algorithms have already been published for public scrutiny, with more to follow—as resources allow—across multiple departments.

Thank you once again for contacting me about this important issue.


To be clear, Apple and Google both have a huge UK presence. I don't know the extent of Google's, but Apple has offices with thousands of people working in them. Compliance with what the UK wants in this regard is not optional.

What the original poster does is completely misplace blame under the guise of "clever" writing - blame belongs squarely on the idiotic policies of the UK government.


Google has been building a huge new office in London for a bit now, with the apparent intent to move most of their EU presence there.


I'd say more likely, their UK presence. There's an increasing gap between the UK and the rest of Europe, wider than with other non-EU members such as Switzerland.

I see Switzerland as a country that wants complete independence but sees value in cooperating with other countries, and does so. The UK seems to be on the path to becoming an authoritarian hellscape and won't allow any other country to stop its degradation.


I did mean EU presence. From what I've heard, the plan of the construction project seems to have been to relocate people from Munich, Warsaw, etc. to London. Not only is there not much of an equivalence post-Brexit, as you note, but it's also not less expensive in any obvious way. So yes, it's weird.


> they are a gateway to everybody

They are, and most of the time this allows them to abuse you. But what do you think happens once that gateway is blown open? Isn't your front door next?

> There are multitudes of online storage providers outside of the UK's reach and jurisdiction

What I said above means that once you normalize the situation where providers have to open the gate to your yard whenever the state comes knocking, the state will just come knocking directly at your door. In other words, I'm not sure the state will stop in its pursuit of access to your data when it can simply criminalize trying to evade the law by storing it out of reach.


> But what do you think happens once that gateway is blown open? Isn't your front door next?

Yes, this is the way policing should work: if they think you have done something, they knock on your door rather than going to Apple and Google and compromising the entire population all at once through the convenience of their monopolies. Bonus points if a judge needs to grant them the privilege of knocking on your door, too.


> Yes, this is the way policing should work: if they think you have done something, they knock on your door [...] Bonus points if a judge needs to grant them the privilege

How exactly would they come after you if your data is "outside of the UK's reach and jurisdiction"? They went after the gatekeepers because they wanted a one-stop shop for accessing people's data. They will look to take the same easy road in the future, and there's nothing easier than framing any attempt to keep data out of the UK's reach as a crime. They get your data, or they get you for not providing the data.

The law will be "stupid"; tech-savvy people will find ways around it. But it's enough to throw a noose around as many people as possible and tighten it as time goes by. Authoritarianism 101.


> How exactly would they come after you if your data is "outside of the UK's reach and jurisdiction"?

By suspecting you of a crime first, then they can establish access to your device through legal due process and access the data on your device or imprison you for not facilitating it. Same thing they do with computer passwords and whatnot.


> By suspecting you of a crime first

My friend, suspecting you of a crime is the easiest thing to do. Just putting your data outside of UK jurisdiction makes you suspicious. Ever tried going into the US and refusing to unlock your phone if asked at the border because "you have rights"?

> through legal due process

"Legal due process" is literally just what the law says. In this case a backdoor is the legal due process. The UK government took aim at Apple and Google because they wanted a one stop shop for their data access needs, and didn't want to bother going after you "the criminal" individually. If Apple and Google didn't exist and everyone starting backing up their data in some far away, untouchable jurisdiction (should you trust one) you think the UK government wouldn't tighten the noose around individuals the same way? Most governments are going in this direction anyway.

The government showed its intentions with this move: have easy access to your data. They'll keep pursuing that goal no matter what, gatekeepers or not. They define the due process. In this particular case the problem isn't that Apple is a gatekeeper but that the government wants things they shouldn't (by my definition) have.


> because of technical limitations that inhibit iCloud competitors or any compatible storage solution.

ah, that's not quite true, is it now?


It's an Apple problem, because with libre tools you can run your own software to circumvent this law.


You can run your own software, but if asked by UK authorities to provide the keys/password and you don't comply you face prison time.


You can safely leave it to FOSS to implement ways around that.


How does FOSS prevent someone from pulling your teeth out with a wrench to get your passphrase?


Rust solves this.


Although effective, this particular technique does not scale very well. Even if the UK had 100,000 kidnapping wrench torturers, it would take ~2 years for them to get through pulling the teeth of everyone in the UK.


What if they tortured with wrenches in both hands? That would double torturing capacity.


Fortunately for those who like laws having an effect, that's not factoring in the deterrence aspect of the first (few) tooth-pulling event(s).

It's the law that's the issue. Avoiding enforcement only works until people actually care to start enforcing. There are also enough examples in history of people taking matters into their own hands when they disagree with something, doubly so if there's a law against it or something else makes them feel righteous. If you do bad in the eyes of the public (or its prosecutor), good luck swimming against the tide.


You give them access to a fake account.


No, no, you can't. That's just a factually vacuous and obviously incorrect statement...


The majority of Apple and Android users can't run their own libre software, and won't until libre software is as easy and seamless to use as the commercial comparables.


My (grand)parents like their FOSS launcher, gallery, and chat client just fine. I've had zero questions about how Signal works, but a bunch about how to deal with the OS's pre-installed garbage spawning notifications about this or that update. They can't tell the difference between an advertisement pushed by some commercial app they want and a smartwatch firmware update notification.

From my POV, it's the commercial software that has fundamental usability issues due to misaligned incentives (not completely misaligned either, but not as well aligned as FOSS). It just has a better lobby and marketing budget. Chrome didn't become this ubiquitous on mobile by having to be downloaded from F-Droid, but by making a deal that device manufacturers cannot refuse.


Tell me, did they install and set this up themselves, or did you?


The person above said "run", not "install". I don't build my own car either. But in this case it's more of a marketing/discoverability problem than a knowledge gap on setting it up, because it's nearly as simple to use as the commercial apps.

Idk if it's just me, but I do hate these suggestive questions. If you've got something to state, spit it out.


Did you have to state that as a question? (I obviously installed even FB and WhatsApp on there.)

It's unthinkable to set up a phone with WhatsApp and FB here. When Meta had a BGP problem, they (the older ones) asked me why there was "no internet".


The fact that you're answering rather than ignoring shows the problem, I'm very sorry to say.


There are more good reasons to de-Apple besides just residing in the UK.


Thankfully my predecessors de-UK’ed for me!


Legacy features!


People need to hold the UK government responsible for its crimes against humanity. Until the AUMF and similar policies across the Western hemisphere, which resulted in the utterly reprehensible "War on Terror", are rescinded and the crimes committed under their enactment fully prosecuted, the authoritarianism will continue.

Remember, people, these are WAR CRIMINALS driving these policies forward. To expect this class of individuals to adhere to democratic, western values, is naive in the extreme.

The same people who have no problem with genociding a million people in the Middle East enemy-state-du-jour are not going to give one fig of care about the local human rights violations that they are also getting away with.

The West has a war criminal problem. Until we solve that we cannot do a damn thing about our human rights problem.


[flagged]


The actual straightforward fix isn't available to us - namely, we aren't due a general election until 2029, and right now the "good guys" are in power, so it's not at all clear that anyone elected instead, in four years' time, would even offer to reverse this TCN.


At least the US hasn't postponed the general elections to keep the unpopular party in power.

https://www.local.gov.uk/our-support/devolution-and-lgr-hub/...


Neither has the UK government.

* It wasn't the general election.

* They offered local councils the chance to request it if they were going through a reorganisation or devolution process.

* 18 councils requested and 9 were accepted as justified.

* And even those are only delayed until May next year (one year after the rest of the UK).

So, to be clear: the UK government didn't postpone the general election, half the councils that requested a postponement of their local elections were denied, and the other half had accepted justifications and are still holding their elections, just a year later.

And all that is actually covered in the page you link to.


Fact check - the UK hasn't postponed the general election.

Your link points to _some_ local council elections (the people responsible for bin collections, parks and care homes) and the extension has been requested by the local councils themselves.


I wish they would help get as many Reform councils elected as possible. Given how incompetent they have been in the ones where they did get elected, I think it would put a damper on the enthusiasm of their supporters.


Yes, and they're totally not fighting on multiple fronts: against central government, against those in the areas they're trying to run who've decided the civil service can now be a political arm of the left, and against having to boot the actual racists in sheep's clothing from the party. I'm not defending any party here; I'm even highlighting some of Reform's worse issues. But you have to admit it's not an apples-to-apples comparison when you set their performance so far against a cushy little on-the-Thames millionaires' cul-de-sac.


> racists in sheep's clothing from the party.

Could you name a few? I thought you had to be racist to join...


If there was a general election today, who exactly could we vote for?

Trade union organising with the threat of nationwide industrial action would work regardless of which flavour of Tory is in power, though.


> and right now the "good guys" are in power, so

So close and yet so far.


Granted, it would be more impactful than just stopping the use of Google and Apple services.


They need to stay and fix their country, and the means by which they do that is simple: start prosecuting our own war criminals.


I agree with you, but that only works if people value it and are willing to pay for it.

Look at email. It’s technically open, but in reality there are a few large players who control the majority of it.

The only way open source phone software succeeds is if there is real money behind it and it is attractive enough that people will pay for it.


They were, but the actual cuts needed (to entitlements) are politically impossible to make.


Which is why they went the non-democratic/illegal route of avoiding Congress.


So they did a thing they knew wouldn’t work? AKA not serious about solving the problem. The OBBBA budget bill did make some cuts though, anyway.


That's basically compilers these days. It used to be that you could try to optimize your code and inline things here and there, but these days you're not going to beat the compiler's optimizations.



I guess I should mention that https://blog.regehr.org/archives/1515 doesn't dispute that people can pretty much always beat the shit out of optimizing compilers; Regehr explicitly says, "of course there’s plenty of hot code that wants to be optimized by hand." Rather, where he disagrees is whether it's worthwhile to use optimizing compilers for other code.

Daniel Berlin's https://news.ycombinator.com/item?id=9397169 does kind of disagree, saying, "If GCC didn't beat an expert at optimizing interpreter loops, it was because they didn't file a bug and give us code to optimize," but his actual example is the CPython interpreter loop, which is light-years from the kind of hand-optimized assembly interpreter Mike Pall's post is talking about, and moreover it wasn't feeding an interpreter loop to GCC but rather replacing interpretation with run-time compilation. Mostly what he disagrees about is the same thing Regehr disagrees about: whether there's enough code in the category of "not worth hand-optimizing but still runs often enough to matter", not whether you can beat a compiler by hand-optimizing your code. On the contrary, he brings up whole categories of code where compilers can't hope to compete with hand-optimization, such as numerical algorithms where optimization requires sacrificing numerical stability. mpweiher's comment in response discusses other scenarios where compilers can't hope to compete, like systems-level optimization.

It's worth reading the comments by haberman and Mike Pall in the HN thread there where they correct Berlin about LuaJIT, and kjksf also points out a number of widely-used libraries that got 2–4× speedups over optimized C by hand-optimizing the assembly: libjpeg-turbo, Skia, and ffmpeg. It'd be interesting to see if the intervening 10 years have changed the situation, because GCC and LLVM have improved in that time, but I doubt they've improved by even 50%, much less 300%.


Meanwhile, GCC will happily implement bsearch() without cmov instructions, and the result will be slower than a custom implementation for which it does emit cmov instructions. I do not believe anyone has filed a bug report specifically about the inefficient bsearch(), but the bug report I filed a few years ago on inefficient code generation for binary search functions is still open, so I see no point in bothering:

https://gcc.gnu.org/bugzilla/show_bug.cgi?id=110001

Binary searches on OpenZFS B-Tree nodes are faster in part because we did not wait for the compiler:

https://github.com/openzfs/zfs/commit/677c6f8457943fe5b56d7a...

Eliminating comparator function overhead via inlining is also a part of the improvement, which we would not have had because the OpenZFS code is not built with LTO, so even if the compiler fixes that bug, the patch will still have been useful.
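
For readers unfamiliar with the pattern, a hand-written branch-free binary search looks roughly like this -- a minimal sketch with an illustrative name, not the OpenZFS code, and whether the ternary actually becomes a cmov depends on the target and flags:

  #include <stddef.h>

  /* Index of the first element >= key (or the last index if every element
     is smaller). Assumes a sorted array and n >= 1. The data-dependent
     pointer update is the cmov candidate. */
  static size_t lower_bound_idx(const int *arr, size_t n, int key)
  {
      const int *base = arr;
      size_t len = n;
      while (len > 1) {
          size_t half = len / 2;
          base += (base[half - 1] < key) ? half : 0;
          len -= half;
      }
      return (size_t)(base - arr);
  }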


Definitely not true.

You aren't going to beat the compiler if you have to consider a wide range of inputs and outputs, but that isn't a typical setting, and you actually can beat it. Even in general settings this can be true, because it's still a really hard problem for the compiler to infer things you might know. That's why C++ has all those compiler hints and why people optimize with GCC flags other than -O.

It's often easy to beat BLAS at matrix multiplication if you know some conditions on that matrix, because BLAS will first check to find the best algorithm and might not know them (you could call the specialized routine directly, of course, and there you likely won't win, but then you're competing against a person, not the compiler).

Never overestimate the compiler. The work the PL people do is unquestionably useful, but they'll also be the first to say you can beat it.

You should always do what Knuth suggested (the often-misunderstood "premature optimization" quote) and get a profile.
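
To make that concrete: if you know one operand is, say, diagonal, a trivial hand-written loop does O(n^2) work, while a general GEMM call that doesn't know the structure does O(n^3). A sketch, with the row-major layout and the function name just for illustration:

  #include <stddef.h>

  /* C = D * B, where D is an n x n diagonal matrix stored as the vector d,
     and B, C are dense row-major n x n matrices. */
  static void diag_matmul(size_t n, const double *d, const double *B, double *C)
  {
      for (size_t i = 0; i < n; i++)
          for (size_t j = 0; j < n; j++)
              C[i * n + j] = d[i] * B[i * n + j];
  }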


I was disappointed with how bad avr-gcc was at optimizing code. Here's one of several examples I found. https://nerdralph.blogspot.com/2015/03/fastest-avr-software-...


These days optimizing compilers are your number one enemy.

They'll "optimize" your code by deleting it. They'll "prove" your null/overflow checks are useless and just delete them. Then they'll "prove" your entire function is useless or undefined and just "optimize" it to a no-op or something. Make enough things undefined and maybe they'll turn the main function into a no-op.

In languages like C, people are well advised to disable some problematic optimizations and explicitly force the compiler to assume some implementation details to make things sane.


If they prove a NULL check is always false, it means you have dead code.

For example:

  if (p == NULL) return;

  if (p == NULL) doSomething();
It is safe to delete the second one. Even if it is not deleted, it will never be executed.

What is problematic is when they remove something like memset() right before a free operation, where the memset() is needed to sanitize sensitive data like encryption keys. There are ways of forcing compilers to retain the memset(), such as using functions designed not to be optimized out, like explicit_bzero(). You can see how we took care of this problem in OpenZFS here:

https://github.com/openzfs/zfs/pull/14544
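
A minimal sketch of the pattern (illustrative names, not the OpenZFS patch itself; explicit_bzero is available on the BSDs and in glibc 2.25+):

  #include <stdlib.h>
  #include <string.h>

  static void destroy_key(unsigned char *key, size_t len)
  {
      /* memset(key, 0, len); -- a dead store the compiler may delete */
      explicit_bzero(key, len);   /* guaranteed to survive optimization */
      free(key);
  }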


They just think you have lots of dead code because of silly undefined behavior nonsense.

  char *allocate_a_string_please(int n)
  {
      if (n + 1 < n)
          return 0; // overflow

      return malloc(n + 1); // space for the NUL
  }
This code seems okay at first glance: it's a simple integer overflow check that makes sense to anyone who reads it. The addition will overflow when n equals INT_MAX; it's going to wrap around and the function will return NULL. Reasonable.

Unfortunately, we cannot have nice things because of optimizing compilers and the holy C standard.

The compiler "knows" that signed integer overflow is undefined. In practice, it just assumes that integer overflow cannot ever happen and uses this "fact" to "optimize" this program. Since signed integers "cannot" overflow, it "proves" that the condition always evaluates to false. This leads it to conclude that both the condition and the consequent are dead code.

Then it just deletes the safety check and introduces potential security vulnerabilities into the software.

They had to add literal compiler builtins to let people detect overflow conditions and make the compiler actually generate the code they want it to generate.

Fighting the compiler's assumptions and axioms gets annoying at some point, and people eventually discover the mercy of compiler flags such as -fwrapv and -fno-strict-aliasing. Anyone doing systems programming with strict aliasing enabled is probably doing it wrong. You can't even cast pointers without the compiler screwing things up.
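
For reference, the builtin route mentioned above looks roughly like this -- a sketch using GCC/Clang's __builtin_add_overflow, not anyone's production code:

  #include <stdlib.h>

  char *allocate_a_string_please(int n)
  {
      int total;
      if (n < 0 || __builtin_add_overflow(n, 1, &total))
          return NULL;               /* reject negative or overflowing sizes */
      return malloc((size_t)total);  /* space for the NUL */
  }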


I would consider the use of signed integers for sizes to be wrong, but if you insist on using them in this example, just test for (n == INT_MAX). malloc itself uses size_t, which is unsigned.

I have been known to write patches converting signed integers to unsigned integers in places where signed arithmetic makes no sense.
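
For comparison, a sketch of the same function with the size kept unsigned, where the wraparound check is well defined and cannot be deleted:

  #include <stdint.h>
  #include <stdlib.h>

  char *allocate_a_string_please(size_t n)
  {
      if (n == SIZE_MAX)
          return NULL;       /* n + 1 would wrap to 0 */
      return malloc(n + 1);  /* space for the NUL */
  }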


The real problem is the fact compilers constantly screw up perfectly good code by optimizing it based on unreasonable "letter of the law" assumptions.

The issue of signed versus unsigned is tangential at best. There are good arguments in favor of the use of signed and unsigned integers. Neither type should cause compilers to screw code up beyond recognition.

The fact is signed integer overflow makes sense to programmers and works just fine on the machines people care to write code for. Nobody really cares that the C standard says signed integer overflow is undefined. That's just an excuse. If it's undefined, then simply define it.

Of course you can test for INT_MAX. The problem is you have to somehow know that you must do it that way instead of trying to observe the actual overflow. People tend to learn that sort of knowledge by being burned by optimizing compilers. I'd very much rather not have adversarial compilers instead.


I have been known to fix bugs involving signed integer overflow in C code others wrote.

In any case, send your feedback to the C standard committee. They can change the standard. It would not be a bad change to make.


>if (n + 1 < n)

No one does this


Oh people absolutely do this.

Here's a 2018 example.

https://github.com/mruby/mruby/commit/180f39bf4c5246ff77ef71...

https://github.com/mruby/mruby/issues/4062

  while (l >= bsiz - blen) {
      bsiz *= 2;

      if (bsiz < 0)
          mrb_raise(mrb, E_ARGUMENT_ERROR, "too big specifier");
  }
> bsiz*=2 can become negative.

> However with -O2 the mrb_raise is never triggered, since bsiz is a signed integer.

> Signed integer overflows are undefined behaviour and thus gcc removes the check.

People have even categorized this as a compiler vulnerability.

https://www.kb.cert.org/vuls/id/162289

> C compilers may silently discard some wraparound checks

And they aren't wrong.

The programmer wrote reasonable code that makes sense and perfectly aligns with their mental model of the machine.

The compiler took this code and screwed it up because it violates compiler assumptions about some abstract C machine nobody really cares about.
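
For what it's worth, the intended check can be written without relying on signed overflow by testing before doubling -- a sketch reusing the identifiers from the quoted snippet (INT_MAX comes from <limits.h>), not the fix mruby actually shipped:

  while (l >= bsiz - blen) {
      if (bsiz > INT_MAX / 2)
          mrb_raise(mrb, E_ARGUMENT_ERROR, "too big specifier");
      bsiz *= 2;
  }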


I propose we rewrite everything in my yet-unnamed new low-level language:

    loop
        while l >= bsize - blen;
        bsiz, ovf := bsiz * 2;
        if ovf <> 0 then
            mrb_raise(mrb, E_ARGUMENT_ERROR, ("too big specifier\x00"));
        end
    end


Just stop using signed integers to hold sizes. malloc itself takes size_t, which is unsigned.


If they prove a check is always false, it means you have dead code or you made a mistake.

It is very very hard to write C without mistakes.

When not-actually-dead code gets removed, the consequences of many mistakes get orders of magnitude worse.

