
"If addition doesn't work, I have no way of reasoning about anything in a program."

Hardware addition is filled with edge cases that cause it to not work as expected; I don't see it as that much different from the memory safety edge cases in most programming models. So, as a corollary, is there no way to reason about any program that uses hardware addition?



>Hardware addition is filled with edge cases that cause it to not work as expected

I am not aware of a single case where integer addition does not work as expected at the hardware level. Of course, if you have a different expectation than what the hardware actually does, it could be called "unexpected", but I would classify that more as "ignorance". I think we can reword it as "addition must be predictable and consistent".

This is not the case in standard C, because addition can produce a weird value that the compiler assumes obeys a set of axioms but in fact doesn't, due to hardware semantics. Most C compilers let you opt into hardware semantics for integer arithmetic, at which point you can safely use the result of adding any two integer values. That is really the crux of the matter here: if I write "a = b + c", can I safely examine the value "a"? In C, you cannot. You must first make sure that b and c fulfil the criteria for safe use with the "+" operator. Or, as is more usual, close your eyes and hope they do. Or just turn on hardware semantics and only turn them off for specific very hot regions of the code where you can actually prove that the input values obey the required criteria.
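A minimal sketch of what that opt-in looks like, assuming GCC or Clang and their -fwrapv flag (which makes signed overflow wrap with two's-complement semantics instead of being undefined):

    /* Sketch only: in a default build, the addition below is undefined
     * behaviour when b + c overflows a signed int. Compiled with
     * -fwrapv (GCC/Clang), the result is defined to wrap, so "a" can
     * be examined safely. */
    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        int b = INT_MAX;
        int c = 1;
        int a = b + c;          /* UB by default; wraps to INT_MIN under -fwrapv */
        printf("%d\n", a);
        return 0;
    }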


That's why I implement my own addition using XOR. Gotta build from strong foundations. It ain't fast, but it's honest math.
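(For the curious, a toy version of that idea: a ripple-carry adder built from XOR for the sum bits and AND plus a shift for the carries. The function name is made up for the example, and unsigned arithmetic is used so the shifts and wraparound are well defined.)

    #include <stdio.h>

    static unsigned int xor_add(unsigned int a, unsigned int b)
    {
        while (b != 0) {
            unsigned int carry = (a & b) << 1; /* carry bits, shifted into place */
            a = a ^ b;                         /* sum without the carries */
            b = carry;                         /* feed the carries back in */
        }
        return a;
    }

    int main(void)
    {
        printf("%u\n", xor_add(20u, 22u)); /* prints 42 */
        return 0;
    }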


The difference is that hardware will treat addition in a predictable and consistent manner, but C and C++ will not. In C and C++, if you overflow two signed integers, the behavior of your entire program is undefined, not simply the result of the addition.

That's the difference between something being safe and local but perhaps unexpected because you lack knowledge about how it works, and something being unsafe because there is absolutely no way to know what will happen and the consequences can be global.
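A rough illustration of how that globality can bite, assuming an optimising compiler that exploits the no-signed-overflow rule (the exact outcome varies by compiler and flags):

    #include <limits.h>
    #include <stdio.h>

    /* Intended as an overflow check, but x + 1 > x can only be false by
     * overflowing, which is undefined for signed int. A compiler is
     * therefore allowed to fold the expression to "true", and the check
     * silently disappears. */
    static int increment_would_overflow(int x)
    {
        return !(x + 1 > x);
    }

    int main(void)
    {
        /* May print 1 or 0 depending on optimisation level. */
        printf("%d\n", increment_would_overflow(INT_MAX));
        return 0;
    }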


> The difference is that hardware will treat addition in a predictable and consistent manner, but C and C++ will not. In C and C++, if you overflow two signed integers, the behavior of your entire program is undefined, not simply the result of the addition.

On the same hardware, yes, but the same C or C++ program may behave differently on different hardware, specifically because the C abstract machine doesn't define what's supposed to happen. This leaves it up to (to your point) the compiler or the hardware to decide what happens in, e.g., an overflow condition.

If you're planning on running the program on more than one CPU revision, then I'd argue it introduces a similar level of risk, although one that's probably less frequently realised.


Leaving the behavior up to the compiler (or hardware) is not undefined behavior; that is unspecified behavior or implementation-defined behavior.

Undefined behavior really does mean that the program does not have valid semantics. There is no proper interpretation of how the program is to behave if signed overflow happens. It's not simply that the interpretation of the program is beyond the language and left to the hardware or the operating system; it's that the program has no defined behavior at all.
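A rough sketch of the distinction in C (the function names are made up for the example):

    #include <limits.h>

    /* Implementation-defined: converting an out-of-range unsigned value
     * to int must produce some documented, consistent result (or raise an
     * implementation-defined signal). The program still has meaning. */
    int narrowed(unsigned int u)
    {
        return (int)u;
    }

    /* Undefined: if x == INT_MAX, the addition overflows a signed int and
     * the behaviour of the whole program, not just this value, is undefined. */
    int incremented(int x)
    {
        return x + 1;
    }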


Gotcha, thanks for the clarification.



