Yeah, I bike regularly on and off (season/mood/goals dependent, honestly), and knowing what I should expect on my commute to work /and/ back is important... and not something I can predict without looking at the weather in the morning.
I guess I'd say -- I think you're right that you shouldn't (ideally) be able to trigger true deadlocks/livelocks with just serializable transactions + an OLTP DBMS.
That doesn't mean it won't happen, of course. The people who write databases are just programmers, too. And you can certainly imagine a situation where you get two (or more) "ad-hoc" transactions that can't necessarily progress when serializable but can with read committed (ad-hoc in the sense of the paper here: https://cacm.acm.org/research-highlights/technical-perspecti...).
I’m not sure they were _introduced_ by switching to serializable, but the switch meant some processes started taking long enough that the existing possibilities for deadlocks became frequent instead of extremely rare.
I mean, you say that, but systems like Spanner exist & I think the fact that it's used for Gmail, Docs, etc. has demonstrated that for a large range of OLTP workloads, serializable everywhere and also performant /is/ possible to architect for -- with the right combination of both database + application design.
That isn't to say it's correct everywhere, but I'd maybe be a little more suspicious of "We want OLTP but we can't use serializable" in a vacuum.
Of course--there are obvious cases like "We can't use serializable because of how our database implements it" or even "because we can't be arsed to figure out how to structure all our data accesses to avoid conflicts and aren't paid enough to solve that problem", but I feel like those are a bit more honest than "Because of performance reasons, in all cases". :)
You can, with some programming languages, require a proof of this (see: Rocq, formerly 'coq').
I think a more interesting case might be showing functional equivalence on some subset of all inputs (because tbh, showing functional equivalence on all inputs often requires "doing certain things the slow way").
An even more interesting case might be "inputs of up to a particular complexity in execution" (which is... very hard to calculate, but likely would mean combining ~code coverage & ~path coverage).
Of course, doing all of that w/o creating security issues (esp. with native code) is an even further out pipe dream.
I'd settle for something much simpler, like "we can automatically vectorize certain loop patterns for particular hardware if we know the hardware we're targeting" from a compiler. That's already hard enough to be basically a pipe dream.
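To make the "require a proof" point above concrete, here's a tiny machine-checked functional-equivalence proof, sketched in Lean 4 as a stand-in for Rocq/Coq (the function names are illustrative):

```lean
-- Two implementations of "double": one by addition, one by multiplication.
def double1 (n : Nat) : Nat := n + n
def double2 (n : Nat) : Nat := 2 * n

-- Functional equivalence on *all* inputs, checked by the kernel.
theorem doubles_agree (n : Nat) : double1 n = double2 n := by
  unfold double1 double2
  omega
```

For trivial arithmetic a decision procedure (`omega`) closes the goal; for real programs, that last line is where the years of effort go.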
I mean, in C++ (constexpr arrived in C++11, and C++20's consteval makes this cleaner) it's totally possible to create a library that allows you to build a SQL query via the language's string concatenation facilities, but only allows you to do it with static strings unless you use ~shenanigans. (C++ unfortunately always allows ~shenanigans...)
I guess you do wind up needing to potentially re-implement some basic things (or I guess more complex, if you want format string support too). But for basic string concatenation & interpolation, it's reasonable.
That's a pretty useful way to get basic string concatenation while also preventing it from creating opportunities for SQL injection.
For example, you have a class that requires a constexpr input & can be appended to/concatenated/etc.:
consteval SqlStringPart(const char* literal)  // compile-time strings only (constexpr can't qualify a parameter; consteval on the ctor is the idiom)
SqlStringPart operator+(SqlStringPart other)
(and so on)
And you have a Query API that only takes SQL string expressions that are built out of compile time constants + parameters:
SqlQuery(SqlStringPart ..., Parameters ...);
This doesn't solve the problem mentioned in the article around pagination & memory usage, but at least it avoids letting someone run arbitrary SQL on your database.
Proofs are actually incredibly hard to review. Problems are sometimes found in what had been accepted as a "good" proof years, even decades, later. There's a whole movement to move proofs over to something/anything more verifiable (e.g. representing all proofs in Coq--but even then you're relying on the Coq proof assistant to have zero bugs).
Furthermore, the standard for mathematical proof has also changed over time, most significantly in the early 20th century. This led to a number of existing results needing to be re-proved (or thrown out! Some were incorrect!).
Exactly what qualifies as a proof is a FASCINATING debate. Mathematics is created by consensus, just like all other knowledge.
Proving a math problem is mentally challenging, but there are other interesting definitions of hard--in medicine, for instance, where none of it requires particularly fancy logic. But can you cite any instance in math where you had to invest, say, $10-15M and collect data from every available patient at multiple hospitals over 2-3 years in order to replicate a result?
I disagree. Phone encryption should ideally be open source-able, and its security should rely as much as possible on a device-specific key.
I think this makes more sense for a secret project (e.g. the next iPhone), but honestly, as a security person, it seems overkill for anything short of code with national-security implications, like state-sponsored malware.
I also find it strange that the code is apparently somehow accessible outside that building (see the fired comment). If this was anything beyond security theatre, it'd be on an airgapped network and that wouldn't even be a concern (as the employee wouldn't be able to access the code from their laptop). Seems excessive for very little gain.
I wouldn't take SiVal's comment as ground truth. I think it conflates rules for general employees with rules for his friend, and mixes it with a dash of unfounded hyperbole (criminal charges?).
The code isn't available outside the building unless someone takes it outside, which they make clear is not only a fireable offense but might qualify as criminal. They made it quite clear: If you're in crunch mode, don't be tempted to just take a bit of work with you to get a bit more done on the long shuttle ride.
Fair enough. I obviously don't know your friend or his project, so I can't with certainty say anything about his situation. I viewed your post through a critical lens because the details given didn't match my experience or the experience of any of my old colleagues, and you are a second-hand witness.
Yeah, which is hosted on Azure, a data center that Microsoft owns and employs guards for, and secured behind our standard corporate authentication. :) (Source: I work at Microsoft, near the VSTS team.)
> Yeah, which is hosted on Azure, a data center that Microsoft owns and employs guards for, and secured behind our standard corporate authentication. :)
Way back when, Microsoft used to host a bunch of auth servers for banks. A friend of mine mentioned an armed guard in front of the data center for that particular service.
I've worked on teams at MS where there was a (non-armed) guard checking everyone who got off the elevator, but before I joined MS I was once left alone in a room full of computers open to the Windows source tree, wearing my "do not leave guest unattended" badge.
It's definitely true that Meltdown is a more immediate problem--but Spectre is basically the problem that will last. We can unmap kernel memory from the user address space (KPTI), take the perf hit, and correct most of the Meltdown problems.
Spectre-style issues had JS pulling browser-process memory using timing--the patches being "put every page in its own process" (Chrome's site isolation) and "don't let people get accurate timings" (Firefox coarsening its timers). They are way worse in the grand scheme of things, because even if they aren't as easy to exploit, they will continue to show up, probably for the foreseeable future (the next 5-10 years), long after Meltdown is patched and old news.
I'd disagree as far as your position on the word "gender"--I think actually "race" is a much better example of something that has had many meanings over the course of its lifetime in the language, and isn't necessarily well defined even now.
Gender as a term that relates to people, and not words (as in the sense of the gender of a noun in Latin-family languages), has been pretty well defined since it was introduced in the 1950s--specifically as a way to differentiate between biological sex (male, female, intersex, etc.) and the social roles that people play that may or may not actually be related to biological sex (men/women/boy/girl/etc.). It's always been quite well defined, although some people feel that it doesn't make sense to differentiate between biological sex and gender (e.g. because, perhaps, they believe that social gender roles are innate conditions of biological sex).
Just a point: the word "gender" was brought into the English language sometime in the 14th century, so we are looking at 600-700 years of usage, and it has its roots in propagation and breeding.
Your disagreement over the meaning of the word "gender" just highlights the accuracy of my "position".