At the last company I worked at I only used C# + WPF (it was horrible). A couple of jobs ago I only used R. There are companies with entire divisions that never have to touch JavaScript, and I'm certain there are companies that never use it at all. There's a very large world of programmers working in insurance/healthcare/embedded/military/government that is nothing like modern web dev.
I would argue there is money to be made off of people with zero money. Student loans are an example: instead of zero money, they now have negative money. This is terrible, ofc.
I'm really looking forward to SIMD support being released in later versions of Java. If anyone here follows Handmade Hero (by Casey Muratori), he explained SIMD in some of his videos while implementing the software renderer. The speedup from regular code to SIMD is very impressive: you essentially get a 2x-3x speedup, or even more, for free.
The downside is that it makes the code much harder to read, so I think it should be applied only in very small sections of the code.
What does Java have to do with Indian universities? Java/the JVM is widely popular in the enterprise world for backend services, and in a ton of open source software, e.g. Cassandra and Hadoop. You need to learn more stuff, for sure.
I’m not familiar with Durable Objects. When D1 does replication to read replicas, if it’s not doing synchronous replication, then it’s not strongly consistent, is that correct?
Nice project! I have been studying system design recently and read a lot about Dynamo, Cassandra, Kafka, etc., and I realized that they have much more in common than I thought. I'm mainly learning for system design interviews, but I plan to do some implementation myself after getting a new job. So I might use this repo as a starting point! Thanks for the effort, OP.
I'd like to add my anecdote. I have been driving my Tesla for about 2 months now, and it definitely feels safer than the ICE cars I drove before. This is mainly due to the extra information the Tesla gives, such as the distance between my car and other objects; it also shows the lanes around the car, so it's easier for me to keep a safe distance from other cars nearby.
I also tried 'FSD' on freeways, where it can navigate on Autopilot. It really feels amazing and takes a lot of the burden of driving at high speed off me.
However, I was recently entering an acceleration ramp and turned on Navigate on Autopilot. The Tesla asked me to confirm a lane change to the right. I did, and it somehow didn't see a car to my right that was trying to change lanes into the same spot I would have moved into. So I had to manually intervene and take control of the wheel myself.
Manual override isn't difficult at all, though, and there are multiple ways to do it: turning the wheel manually, toggling the right stalk upwards, or hitting the brake. So I think if you still pay attention (maybe not as much as when driving a non-Tesla car), it's going to be okay.
Paying attention while a computer does a task for you automatically and correctly 99.9% of the time is basically impossible though. That point has been made many times before.
I wonder if this will paradoxically lead to Teslas becoming less safe overall as the software improves, because errors will become uncommon enough that people really do stop paying attention.
From the sound of it, it is currently unreliable enough that most people know they really do have to pay attention.
The bottom line is that FSD as sold is nowhere near ready for prime time, still. As in, you always have to supervise it.
Who wants to be their car's supervisor?
For me, it's either fully drive and enjoy your driving ... or full FSD and relax ... anything in between, which is what Tesla offers, is pointless (and don't get me wrong, lane assist and cruise control are handy, but they've been around for eons).
No, not really. I mean, I'll use lane-assist and cruise control for very simple situations on a motorway. That's it. I know what the car is going to do, but I'm still driving and that's fine.
It would drive me nuts to have to supervise an FSD car through a city or any situation beyond the above simple motorway situation.
You'll never convince me otherwise. I don't want to be my car's manager.
> It has been asserted many times without much actual evidence.
Probably because it is so obviously true surely? Do you really doubt it?
We've already seen Teslas crash due to autopilot failures in ways that definitely wouldn't have happened if people were actually paying attention (e.g. driving into the side of lorries) so there is at least some evidence anyway.
> Probably because it is so obviously true surely? Do you really doubt it?
Yeah, I do doubt it. The question is whether the fall-off in attention is larger than the fall-off in the number of incidents. And then, how many of those incidents are critical enough that you can't overcome the not-paying-attention part?
It's not at all clear to me that it's a net negative for safety; in fact, I would suspect the opposite.
> We've already seen Teslas crash due to autopilot failures in ways that definitely wouldn't have happened if people were actually paying attention
Not really. We have seen Teslas crash mostly in situations where cars crash all the time already. And that analysis is proof of existence, not proof of significance or of a net negative for safety. You need to actually show that this happens MORE often when Autopilot is engaged.
I have seen hundreds of videos where Autopilot avoids a vehicle cutting the Tesla off, for example. In many of those cases even a human driver would not have reacted as quickly. Most of those cases would not have resulted in death, but some could have. You need to take all of that into account.
I always wonder how people come up with the designs for various ciphers. When I took a crypto class at uni, it all seemed like different math/bit/word operations randomly mashed up and repeated for several rounds. Is there a good resource for learning how to design ciphers?
I'm not an expert, but I think it comes down to knowing how to break ciphers, and then not doing any of those things.
Most symmetric block ciphers involve something along the lines of mixing in the (sub)key, some linear permutations, and some non-linear permutations, then repeating all three for many "rounds".
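That round structure can be sketched as a toy substitution-permutation network. This is purely illustrative and not a secure cipher; the 4-bit S-box is borrowed from PRESENT, and the bit permutation is an arbitrary choice made up for this sketch:

```python
# Toy SPN on a 16-bit block: key mixing -> nonlinear layer -> linear layer,
# repeated over several rounds. Illustrative only, NOT a secure design.

SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]  # PRESENT's 4-bit S-box

# Linear layer: send bit i to position (4*i) mod 15, bit 15 stays put.
# gcd(4, 15) = 1, so this is a genuine permutation of the 16 bit positions.
PERM = [(4 * i) % 15 for i in range(15)] + [15]

def spn_round(block, subkey):
    block ^= subkey                           # 1. mix in the round key
    # 2. nonlinear layer: apply the S-box to each of the four nibbles
    block = sum(SBOX[(block >> (4 * j)) & 0xF] << (4 * j) for j in range(4))
    # 3. linear layer: permute the 16 bits
    out = 0
    for i in range(16):
        if (block >> i) & 1:
            out |= 1 << PERM[i]
    return out

def toy_cipher(block, round_keys):
    for k in round_keys:                      # repeat for many rounds
        block = spn_round(block, k)
    return block
```

Since every step is invertible, the whole thing is a keyed permutation of the 16-bit block space, which is the defining property of a block cipher.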
I've often thought that alien civilizations would also probably invent asymmetric cryptography pretty much the same as us: lattices, Diffie-Hellman, RSA, etc. Those are all based on very pure mathematical problems. Symmetric cryptography, on the other hand, does share common themes like diffusion, but there's no way an alien civilization would design something that looks close to AES or SHA-2. They'd have symmetric ciphers and hashes, but they'd look quite different, I think. I'd love to be convinced otherwise on this.
The set of operations you have access to as a designer affects immensely what your cipher will look like. If your alien CPUs had a very different instruction set than ours, their ciphers would look very different.
Back in the old days memory lookups were as fast as computation, and S-box based designs were very popular. That is no longer the case, to an extent, both for security (cache-timing side-channels) and performance reasons. S-box designs are still common, but the S-boxes are usually <= 4-bit wide, mostly there to facilitate analysis (counting active S-boxes), and usually implemented as boolean logic instead.
Without S-boxes, the other main approach is to mix operations from incompatible algebraic domains. Like add and xor. When composed many times together, hopefully this results in a very complicated nonlinear expression of very high degree on any of these domains. One of the first popular ciphers to do this was IDEA, which mixed addition, xor, and modular multiplication to pretty good effect. The challenge then is to figure out a set of these operations that is both efficient at eliminating input-output structure (linear, differential, etc characteristics) and efficient at being computed in the widest possible range of machines. This restricts your options to a common set of operations, like add, xor, shift, and so on. Multiplications can be useful, but they don't do very well at the low end, and tend to complicate analysis.
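The "incompatible domains" idea is easiest to see in code. Below is ChaCha's quarter-round (an ARX design in the same family as Salsa20), which mixes addition mod 2^32, xor, and rotation; no single algebraic structure contains all three, which is where the nonlinearity comes from:

```python
# ARX mixing: ChaCha's quarter-round on four 32-bit words. Addition is
# linear over Z/2^32 but nonlinear over GF(2); xor is the reverse; rotation
# shuffles bits between the two views.

MASK = 0xFFFFFFFF

def rotl32(x, n):
    return ((x << n) | (x >> (32 - n))) & MASK

def quarter_round(a, b, c, d):
    a = (a + b) & MASK; d = rotl32(d ^ a, 16)
    c = (c + d) & MASK; b = rotl32(b ^ c, 12)
    a = (a + b) & MASK; d = rotl32(d ^ a, 8)
    c = (c + d) & MASK; b = rotl32(b ^ c, 7)
    return a, b, c, d
```

The full ChaCha permutation just applies this quarter-round to the columns and then the diagonals of a 4x4 word state, repeated for 20 rounds.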
This is only at the very lowest level of the design phase, where you're picking your mixing/diffusion building blocks. You still have to decide on a higher-level structure such as the various Feistel variants, Lai-Massey, SPN, etc, which comes with its own set of tradeoffs.
Yeah, I could believe aliens would have ciphers that mix add, modulo, xor, etc. in some way, but the choices we use in designs do seem to be somewhat random among a large set of possible decisions (though I could be wrong, and it would be really cool to know if there was a "natural" cipher that ought to be universally "discovered", based on how physics, mathematics and computation generally works in this universe). Or at least a class of ciphers where some decisions like constants can be arbitrary without affecting security or efficiency.
Take the polynomial in GCM mode. I'm no expert on this, but I believe it's chosen to be somewhat "awkward", though I'm not sure it was really the only choice out there. SHA-1, I think, and other constructions sometimes have "nothing up my sleeve" numbers in places, like the digits of the square roots of small integers and whatnot, but they're just arbitrarily chosen so as not to be suspicious to others. But those are just constants. What about designs? E.g. AES rounds: if you still did the operations or rounds but in a slightly different order, or added/removed some other operation somewhere, I imagine there are many combinations where this would be just fine. I really have no idea if it's chosen completely optimally. It certainly seems different to RSA and Diffie-Hellman which, aside from the padding perhaps, I think are naturally destined to be discovered due to their close relation to group theory and primes.
GCM is a different type of construction, but the polynomial used there is explainable---it's the irreducible polynomial of the type x^128 + f(x) for which f(x) has the lowest degree (i.e., it's the lexicographically smallest irreducible GF(2) polynomial of degree 128). Any irreducible polynomial picked here would work as well, from a security point of view.
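The irreducibility claim is easy to check with a few lines of GF(2) polynomial arithmetic. Here's a sketch using Rabin's irreducibility test, encoding polynomials as Python ints with bit i holding the coefficient of x^i (scanning every smaller f(x) to confirm minimality works the same way, just slower):

```python
# GF(2) polynomial arithmetic on ints, plus Rabin's irreducibility test.

def pmul(a, b):
    # carryless (GF(2)) multiplication
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

def pmod(a, m):
    # remainder of a divided by m over GF(2)
    dm = m.bit_length() - 1
    while a and a.bit_length() - 1 >= dm:
        a ^= m << (a.bit_length() - 1 - dm)
    return a

def pgcd(a, b):
    while b:
        a, b = b, pmod(a, b)
    return a

def is_irreducible(f):
    # Rabin: f of degree n is irreducible iff x^(2^n) == x (mod f) and
    # gcd(x^(2^(n/p)) - x, f) == 1 for every prime p dividing n.
    n = f.bit_length() - 1
    def x_pow_2k(k):                 # x^(2^k) mod f by repeated squaring
        h = 2                        # the polynomial x
        for _ in range(k):
            h = pmod(pmul(h, h), f)
        return h
    if x_pow_2k(n) != 2:
        return False
    primes = {p for p in range(2, n + 1)
              if n % p == 0 and all(p % q for q in range(2, p))}
    return all(pgcd(x_pow_2k(n // p) ^ 2, f) == 1 for p in primes)

# GCM's reduction polynomial: x^128 + x^7 + x^2 + x + 1
GCM_POLY = (1 << 128) | (1 << 7) | (1 << 2) | (1 << 1) | 1
```

Running `is_irreducible(GCM_POLY)` confirms the polynomial is irreducible, while something like x^128 + 1 (which factors as (x+1)^128) is rejected.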
AES is a good case study, because you _cannot_ remove any operation without ending up with a broken cipher:
- If you remove AddRoundKey, there's no key getting mixed and you don't really have a block cipher in the first place;
- If you remove SubBytes, the entire thing becomes linear and elementary to break;
- If you remove ShiftRows, each column of the 4x4 state is independent of each other, and you could trivially distinguish the cipher from random by observing that only 32 bits change for any single byte change;
- If you remove MixColumns, you're removing most of the diffusion of the cipher, and once again it becomes much easier to break because there's no longer any mixing across bytes.
The order itself doesn't matter all that much. You can reorder AddRoundKey, ShiftRows, and MixColumns any way you like and you end up with the exact same cipher, since those are all linear operations. Ordering them differently with respect to SubBytes would have no meaningful security impact, but it serves no purpose to start with anything other than adding the key and applying SubBytes, since all the other linear stuff can be trivially canceled out by the attacker until the first nonlinear operation.
The general principle in AES-like ciphers is to iterate rounds of the shape `block = nonlinear_operation(block, key[i])` followed by `block = linear_operation(block)`. You will find many (many!) designs like this if you go looking, trying various choices of nonlinear and linear operations that guarantee various useful properties. In the case of AES, SubBytes guarantees that the differential probability going through any byte is upper bounded by 1/64; MixColumns guarantees that for any two distinct 4-byte inputs, between the input and output there are at the very least 5 changed bytes. When you mix these two together, with the help of ShiftRows you can show that useful differential trails are pretty much gone after 4 rounds.
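The 1/64 bound on SubBytes can be verified directly: rebuild the AES S-box from its definition in FIPS 197 (inversion in GF(2^8) followed by an affine map) and compute its difference distribution table:

```python
# Rebuild the AES S-box from its algebraic definition and check that the
# maximum differential probability through it is 4/256 = 1/64.

def gmul(a, b):
    # multiplication in GF(2^8) modulo the AES polynomial x^8+x^4+x^3+x+1
    r = 0
    for _ in range(8):
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & 0x100:
            a ^= 0x11B
    return r

# field inverses by brute-force search (inv(0) := 0, per the AES spec)
INV = [0] * 256
for x in range(1, 256):
    INV[x] = next(y for y in range(1, 256) if gmul(x, y) == 1)

def affine(x):
    # the AES affine map: b_i = x_i ^ x_{i+4} ^ x_{i+5} ^ x_{i+6} ^ x_{i+7} ^ c_i
    r = 0
    for i in range(8):
        bit = (x >> i) ^ (x >> ((i + 4) % 8)) ^ (x >> ((i + 5) % 8)) \
              ^ (x >> ((i + 6) % 8)) ^ (x >> ((i + 7) % 8))
        r |= (bit & 1) << i
    return r ^ 0x63

SBOX = [affine(INV[x]) for x in range(256)]

# difference distribution table: for each input difference dx, count how
# often each output difference dy occurs across all 256 inputs
max_count = 0
for dx in range(1, 256):
    counts = [0] * 256
    for x in range(256):
        counts[SBOX[x] ^ SBOX[x ^ dx]] += 1
    max_count = max(max_count, max(counts))
# max_count == 4: any nontrivial differential holds with probability <= 1/64
```

A maximum DDT entry of 4 is the best achievable for any known 8-bit bijective S-box, which is a big part of why the inversion map was chosen.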
Constructions like SHA-1 (or more precisely, the block cipher underlying SHA-1) do not generally have such clean rationales. You can view it as a Feistel-like cipher with a linear key schedule, so it's hardly a random mashing of operations, but how the particular sequence of operations of the round function is chosen is a bit of a trial and error process. The same applies to Salsa20, Chacha20, and most other ARX constructions.
Arguably the entire reason that Rijndael won the AES competition is because it isn't "randomly mashed up". Each of the individual steps in AES is (relatively) simple and has an explainable design motivation.
Unfortunately I don't have any good advice on where to read up on designing ciphers, as I learned this in a uni course on symmetric cryptography.
It is also very instructive to read the AES selection methodology for the best symmetric encryption algorithm. Multiple criteria were used to find a good compromise between security margin and implementation convenience. Curiously, Rijndael was not the algorithm that provided the largest security margin; Serpent was, yet it didn't get selected.
I think one conclusion the cryptographic community took from previous issues is that the best way to get a solid cipher is a group process.
Even highly skilled cryptographers can fail. Ron Rivest designed RC4, and I don't think anyone would claim that Ron Rivest is not a good cryptographer (he's the R of RSA). But RC4 was not good.
What has been done a number of times now and is currently happening with pqcrypto: you ask everyone in the crypto community to come up with proposals. Then you let them try to find weaknesses in each other's proposals. Then you gradually remove the ones that are considered problematic for any reason.
While you can argue whether this process is perfect (I think some people would argue either Serpent or Twofish should've won the AES competition), it has not produced any major failures.
What makes this even more interesting is that modern ciphers are so much better than older ones. Between Caesar, Enigma, and AES, the growth in robustness seems exponential.
> it all seems like different math/bit/word operations get randomly mashed up and repeated for several rounds
Honestly think you have it down. /s
Need to add some S-Boxes, P-Boxes and SP-Boxes though. The post-quantum stuff people are proposing now gets a bit hairy mathematically but lattices and their hardness properties are fairly easy to understand.
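As a taste of the lattice side, Regev-style LWE encryption of a single bit fits in a few lines. The parameters below are toy values chosen for illustration, nowhere near real-world sizes, and the error distribution is simplified:

```python
# Toy Regev/LWE bit encryption. Security rests on the hardness of solving
# noisy linear systems mod q; the tiny parameters here are ONLY for
# illustrating the mechanics, not for any real security.
import random

q, n, m = 257, 8, 32      # modulus, secret dimension, number of samples

def keygen(rng):
    s = [rng.randrange(q) for _ in range(n)]                  # secret vector
    A = [[rng.randrange(q) for _ in range(n)] for _ in range(m)]
    e = [rng.choice([-1, 0, 1]) for _ in range(m)]            # small errors
    b = [(sum(a * si for a, si in zip(row, s)) + ei) % q      # b = A*s + e
         for row, ei in zip(A, e)]
    return (A, b), s

def encrypt(pk, bit, rng):
    A, b = pk
    subset = [i for i in range(m) if rng.random() < 0.5]      # random subset sum
    u = [sum(A[i][j] for i in subset) % q for j in range(n)]
    v = (sum(b[i] for i in subset) + bit * (q // 2)) % q
    return u, v

def decrypt(sk, ct):
    u, v = ct
    d = (v - sum(uj * sj for uj, sj in zip(u, sk))) % q       # bit*(q//2) + noise
    return 1 if q // 4 < d < 3 * q // 4 else 0
```

Decryption works because the accumulated error (at most m = 32 here) stays well below q/4, so the noisy value still rounds to the correct half of the modulus.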
Gimli is a cool place to start for anyone wanting to dip their toes in the water.
It's a permutation in 30 lines or so. Building a cryptographic hash out of it is another 30 sloc.
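For reference, here is a sketch of that permutation, written from the description in the Gimli paper; the rotation amounts, swap schedule, and round constant below should be double-checked against the spec before relying on them:

```python
# Sketch of the Gimli permutation: a 384-bit state held as 12 32-bit words
# (4 columns x 3 rows), 24 rounds of a columnwise SP-box plus periodic
# swaps and a round constant. Written from the paper's description;
# verify against the official spec/test vectors before real use.

MASK = 0xFFFFFFFF

def rotl32(x, n):
    return ((x << n) | (x >> (32 - n))) & MASK

def gimli(state):
    for r in range(24, 0, -1):
        for col in range(4):                  # nonlinear SP-box per column
            x = rotl32(state[col], 24)
            y = rotl32(state[4 + col], 9)
            z = state[8 + col]
            state[8 + col] = (x ^ (z << 1) ^ ((y & z) << 2)) & MASK
            state[4 + col] = (y ^ x ^ ((x | z) << 1)) & MASK
            state[col]     = (z ^ y ^ ((x & y) << 3)) & MASK
        if r % 4 == 0:                        # small swap + round constant
            state[0], state[1] = state[1], state[0]
            state[2], state[3] = state[3], state[2]
            state[0] ^= 0x9E377900 ^ r
        elif r % 4 == 2:                      # big swap
            state[0], state[2] = state[2], state[0]
            state[1], state[3] = state[3], state[1]
    return state
```

Building a hash on top is then just a matter of wrapping this in a sponge construction: absorb message blocks into part of the state, permute, and squeeze out the digest.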
I second this. I can't use over-the-ear ones because I wear glasses. Then I got the Bose QC20 and it's been working great for me so far. I can use it for a couple of hours without any major issue.
The only minor issue is that moisture starts to accumulate in my ears after a while, which I guess isn't a big deal.