Hacker News | new | past | comments | ask | show | jobs | submit | normaljoe's comments

I wouldn't call Apple's decision a mistake; they knew exactly what they were doing, and their long-term plan required it. The relatively insignificant user base that still needs 32 bit support is dwarfed by all the devices that will never need it. Apple has always been quick to drop backward compatibility to support innovation, at both the hardware and software level. They dropped floppy support and CD/DVD support eons before the rest of the desktop market. Since they own the complete stack at this point, and with everything on a SoC, at some point they will start saving the die space wasted on 32 bit support. Getting there, however, requires that they start pushing the software first.


I recently went through the list of commercial software supporting MacOS that I’ve paid for over the years and checked current platform compatibility.

Twice as many packages run under Linux as under MacOS, specifically because of the lack of 32 bit support.


While that may be tragic, it is still the intentional effect and hardly a mistake. Apple focusing on its $250B iPhone market over its < $50B Mac market very much makes sense. When they removed 32 bit support we might have guessed at Apple Silicon on the desktop, and lo and behold that did come to pass. The Intel to ARM transition was much smoother than the PowerPC to Intel move, in part because Universal Binaries only needed 2 slices instead of 4. I am going to go out on a limb here and say that Apple was also aware of ARM's forthcoming complete removal of AArch32.


Are you talking about games? Or old versions of apps?

I know of very few actively-maintained 32-bit Mac applications that didn’t make it to 64-bit: MathType and AccountEdge are the ones I remember right now.


> Maybe the software integration helps avoid this?

With Autopilot on, the car is watching you. If you take your eyes off the road it issues more "pay attention" nags. Failure to comply removes FSD Beta. So you have a feedback loop where paying attention becomes more important than your phone.


Chrome used WebKit until the Blink fork in 2013. That should make the numbers work. :)


As the other posts point out, it's not just a matter of material, but a trade-off. As you increase the strength you end up with something that is less ductile and more brittle, and the same is true in reverse.

Brittle materials are more temperature sensitive and can fail rapidly. Brittle fracture is no joke; look up the WWII Liberty ships that split in two as an example. We didn't really understand why at that point in time.

The answer is actually pretty simple, and it's why you can't get a perfect material. The lattice structure of the molecules connected to each other can be soft (plastic) or hard (steel).

In the plastic example, just take a milk jug and pour boiling water into it. Don't actually do this without protective gear (an oven mitt should be fine). The molecules holding everything together get excited by the energy (heat), move around, and since they are not closely bonded end up in a different configuration. Your milk jug at this point does not look the same and is deformed.

In the hard example just boil the water you are using for the plastic test. The pot you used does not change because the molecules are tightly bonded.

It turns out that no matter how tightly the molecules are bonded, there is still a point where that bond will break. Hence the brittle fracture, which is every molecule mic-dropping and going home at the same time.

While I could go down another rabbit hole into the atomic, even subatomic, level of how those lattice structures work, I won't, as I think this example gives you what you need.

Finally, in this case we have a very old structure, by modern space technology standards, made of material that was most likely never expected to last this long. If you want a spaceship to last, you just have to make sure the engineers understand what "last" means. :)


And this still happens in Texas, where it is supposed to be illegal as well. I called the provider multiple times and explained the rate from my insurer. They of course sent it to collections; they know they can't enforce it in court, but that doesn't matter. Most people are just going to pay.

What really got to me was that it was an ER visit and the Doc (extortionist) spent maybe 5 minutes with me before having the nurse fix me up. After insurance: $250 to the hospital, and a $2000 bill from the Doc.


A WiFi Teletype that works. I so want that.

Where can I even find a Teletype that might actually be close to working, and if I do find one, can I get a few more notes?


There are a bunch listed on ebay. "Working" may be difficult to verify.


RC vs GC really depends on workload/memory and the volatility of the allocate-to-deallocate timeframe.

I think the better argument here is that iOS and MacOS use RC in the underlying objc libs. Having a CPU that works better around that makes sense to increase performance for those particular OSes.
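To make the per-operation cost of RC concrete, here is a minimal sketch in C with C11 atomics. The names and layout are hypothetical for illustration, not the actual objc runtime; the point is that every retain/release is an atomic counter update, and the last release deallocates immediately and deterministically.

```c
#include <stdatomic.h>
#include <stdlib.h>

// Hypothetical refcounted object header; not the real objc runtime layout.
typedef struct {
    atomic_size_t refcount;
    // ... object payload would follow ...
} Object;

Object *obj_create(void) {
    Object *o = malloc(sizeof(Object));
    atomic_init(&o->refcount, 1);   // the creator holds the first reference
    return o;
}

void obj_retain(Object *o) {
    // Uncontended atomic increment: the hot path whose cost is being
    // discussed when comparing CPUs for refcounting-heavy OSes.
    atomic_fetch_add_explicit(&o->refcount, 1, memory_order_relaxed);
}

void obj_release(Object *o) {
    // Release ordering so all writes to the object happen-before the free.
    if (atomic_fetch_sub_explicit(&o->refcount, 1, memory_order_release) == 1) {
        atomic_thread_fence(memory_order_acquire);
        free(o);   // last reference dropped: deallocate right now
    }
}
```

Note the deterministic teardown at the last release; that immediacy is exactly where the workload/volatility trade-off against GC shows up.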


Oh, I didn’t mean to distract from the improvements to refcounting - it sounds very much like they’ve significantly improved the perf of uncontended atomic increments vs Intel, which is obviously a win on iOS and OS X, as objc and Swift both inline refcounting. I think on Windows/COM it goes through a virtual call? In which case improving the increment itself seems like it would not be a huge win.

As far as perf goes, the general argument is that dropping the need for refcounting saves time, and that removal also helps caching due to reduced per-object size.

That said, I’m not sure if those comparisons are against generational or moving collectors (which are the low-latency collectors), because those start needing write barriers.
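For anyone wondering what a write barrier actually looks like, here is a sketch of card marking in C, one common form used by generational collectors. The card size, table size, and function names are all assumed values for illustration, not taken from any particular runtime:

```c
#include <stddef.h>

#define CARD_SHIFT 9                      // 512-byte cards (assumed size)
static unsigned char card_table[1 << 16]; // one dirty byte per card
static char *heap_base;                   // start of the managed heap

// Every pointer store into the heap goes through this instead of a
// plain assignment; that per-store bookkeeping is the overhead that
// refcounted code does not pay.
static void write_ref(void **field, void *new_value) {
    *field = new_value;                   // the actual pointer store
    // Mark the card containing the updated field dirty so the collector
    // knows to rescan it when collecting the young generation.
    size_t card = (size_t)((char *)field - heap_base) >> CARD_SHIFT;
    card_table[card] = 1;
}
```

The collector then only rescans dirty cards rather than the whole old generation, which is what makes the barrier worth its per-store cost.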


A clock is a voltage going up and down, and there is a period of time it takes to reach either the up or the down level.

If I have to get the clock edge to reach 2 cores/units at the same time, that is easier than, say, 4 cores/units. As you increase the clock speed, the time available to reach all the cores decreases. So if you are wider you need to reach more cores/units in the same period of time, hence "harder".

EDIT (to make it a little more clear): the voltage change is typically represented in books as a vertical line, but that is not the case; it's diagonal and fuzzy. By fuzzy I mean not a straight line, but one with tiny mini-dips on the way up.

Different parts of the circuits are going to respond to the up or the down. They can also vary based on the exact voltage. For example, with a 5V up, one circuit might consider 4.8V to be up and another could use 4.9 or 4.7.

Silicon has improved, but there are still limits of scale based on size, voltage, and timing.


Except it did, by creating a larger market. Some of that was flashy at the start, like the gold watches, but it ultimately created a new market. Personally I quit wearing a watch sometime around 2010 when I found I was checking my phone and didn't need a watch. Now I am checking my watch for the time and weather, and not pulling my phone out.


They did this with FedEx planes out of China years back, I forget which iPhone version, but they basically did the same thing: they booked everything. All the FedEx planes carried only iPhones and everybody complained. I don't think they have tried that stunt again.

