Hacker News | catamorphismic's comments

Indeed, I've been successfully running my email server for the past five years and mail has no problem reaching gmail users.

I think people have just become defeatist.


Very good point. I'd add that this isn't just kind of ridiculous, it's plain ridiculous. Also, we have the right to publish any benchmark numbers, not just basic ones. This isn't a time for weak wording.


Wouldn't the better solution be having compiler support for rejecting compilation of constructs that have UB?


  int add(int a, int b)
  {
    return a + b;
  }
Do you want the compiler to reject this because (assuming it has external linkage) there's no guarantee that there is no call with arguments such that the addition would overflow?


This discussion came up recently, but my stance is still that if you know for sure at compile time that some code always results in UB or is simply non-conforming, you should not fuck around and remove a big chunk of code just because the standard allows it. Instead, either refuse compilation outright (my preference) or at least emit code representing what was written (effectively the implementation-defined-behavior route).


Sure, and compilers do now sometimes emit warnings like "shift amount is always out of range".

Overall though, this isn't how compilers work: they aren't "fucking around" with code they know for sure is broken and will be executed. They are just applying a long series of optimization passes, which includes passes like removing unreachable code, and that interacts with an analysis of paths that cannot be taken for various reasons (including paths that end in UB) to remove "big chunks of code".

The same passes that screw you over in any of these famous examples are the ones that help code generation in the 99% of remaining cases including many "obvious" optimizations that you'd want the compiler to perform.

I know the situation with undefined behavior is distressing and the examples make it look like the compiler writers are out to get you, but that's not really the case.


Well just let them continue to do so, then?


So either open source developers have to work for and enrich corporations or work on the side for free?

Why should software development be any different than any other profession where people get paid for their work?


> Rust had potential for proof work, but went off in a different direction.

Which direction is that?


Why is it poorly named? I've always liked it.


So, I also really like it from a "cool statistics reference" point of view. However, I don't think it is informative, unique or google-able, and discoverability is pretty important imho.


> You seem to have a poor understanding of both entropy and markets.

And you're being a bit presumptuous and rude. Your argument also isn't as bulletproof as you want to make it sound. What is the argument here, anyway? That there's no need for the average programmer to improve their technique because an outlier system (Facebook) is written in a language commonly associated with poor programming practices, with some handwaving about markets and entropy sprinkled on top?


Sorry if I came off that way. I was in a rush on the way to an event and I thought I was just being honest about the weakness in his argument.

What's the argument here? That stakeholders have requirements that don't have to do with robustness like budget and deadlines and that your software has a shelf life and sometimes it's ok if it eventually breaks, just like cars and even the laptop I'm typing this on will. Is that an unreasonable perspective?

And Facebook is an outlier? Really? Even when we add Wordpress, Wikipedia, Flickr, MailChimp and a long list of the most successful websites in the world to that list?


> And Facebook is an outlier? Really? Even when we add Wordpress, Wikipedia, Flickr, MailChimp and a long list of the most successful websites in the world to that list?

Yes, FB is an outlier -- one out of millions of companies. Only 5-10 companies out of those millions made this current model work. So their existence and "success" proves absolutely nothing.

You have a strange understanding of the word "successful".

Facebook is certainly not "successful" because it neglects good tech. If anything, they rewrote PHP itself so as not to have to rewrite their customer-facing software. How is that for your "tech excellence is not important" argument? They rewrote the damned runtime and even added a compiler.

So please define what "successful" means to you. "A lot of people using FB" is a temporary metric, even if it lasts for decades. It's not sustainable per se. It relies on hype and network effect. These fade away.

@jacquesm's points are better argued than yours. Throwing words like "free market" and "entropy" does not immediately prove a point.

I will grant you the historical fact that there are many throwaway projects, but he's also right that the fallout from the tech debt they incur is almost never faced by the original author. Throw into the mix the fact that many businessmen are oblivious to what exactly the techies do in their work hours, and one can easily be misled into thinking that technical perfection is not important. It seems that you were.

Final point: I am not arguing for 100% technical excellence. That would be foolish. We would still be refining HTTP and the WWW in general even today and internet at large would not exist. But the bean counters have been allowed to negotiate down tech efforts to the bare minimum for far too long, and it shows everywhere you look.

(An everyday example: the smartphone-like devices the waiters at my favourite local restaurant use for taking orders are faulty to this day, because some idiot bought a cheap consumer-grade router AND made the software non-fault-tolerant.)


> Only 5-10 companies out of those millions made this current model work

Stats? Evidence? I mean, hundreds of thousands of companies use PHP and other forms of less-than-perfect tech.

Websites all over the world seem to get the job done even when JavaScript, with all its warts, is used. I like JS, for the record, but it does have warts.

> even if it lasts for decades.

You're saying the same thing I said. That stuff breaks. That companies come and go in and out of fashion. I also think it's interesting that you're calling FB an example of tech excellence but saying it's going to fade away. Choose one?

> How is that for your "tech excellence is not important" argument?

I never made any such argument. Not even close. I only said quality is not the only requirement and might sometimes not be a requirement at all.

Most of the code I write is high quality. I put a lot of effort into code reviews too. I mentor more junior devs around quality. My original post is actually much more nuanced than you are claiming.

> Final point: I am not arguing for 100% technical excellence. That would be foolish. We would still be refining HTTP and the WWW in general even today and internet at large would not exist.

Exactly. That's in the spirit of my original post. Maybe re-read it to see that we mostly agree instead of making my position into something it really isn't?


> Stats? Evidence? I mean hundreds of of thousands of companies use PHP and other forms of less than perfect tech.

Oh, I meant companies at the scale of Facebook. There aren't too many of them, would you not agree?

> I also think it's interesting that you're calling FB an example of tech excellence but saying it's going to fade away. Choose one?

FB does a lot of open-source work. Their devs are excellent. That doesn't mean their main value proposition isn't composed of code of the kind you speak about. No need to choose one; both can coexist in a company as huge as FB.

> I never made any such argument. Not even close. I only said quality is not the only requirement and might sometimes not be a requirement at all.

Well, alright then. I'm not here to pick a fight, but you should be aware that you came off a bit more extremist, to me and a bunch of others, than you claim. These things happen though; I can't infer your intent from a few comments, that's true.

The point several others and I are making is that quality plays a bigger part than you seem to claim. I've also known many devs who decided not to ask for permission before taking the [slightly / much] longer road, and that decision paid off many times over in the following months and years.

Sometimes businessmen simply must not be listened to. Sure, I can ship it next week. But only by skipping a few vital details, thanks to some stupid micromanagement attempt to teach me how to do my job ("nobody cares about this arcane thing you call 'error logger' or 'fault tolerance', just get on with it already!"). Such toxic workplaces should be left to rot, but that is a separate problem.


Can you give a practical example of such nonterminating subexpressions?


The Dexcom G5 website says you should recalibrate with a meter every 12 hours to ensure accurate readings. Do you really have to recalibrate that often?


Not really, but you should know how the calibrations work and understand when they might be off. It takes time to learn, and understanding the basic math behind it doesn't hurt.

Always calibrate when you are not sure.

G6 promises a no-calibration mode, but has a hard stop for sensors after 10 days. With G5 and calibrations you can double or triple the sensor lifetime (with xDrip).


Behaviorism isn't simply about holding psychology to a higher scientific standard and it is misguided to imply it is the only psychological paradigm to have done so. As an easy counterexample, take a look at cognitive psychology.

