> I fear the current state of our industry eliminated the possibility for not-great, not-skilled juniors to embark in these journeys
I think both sentiments are a product of their times.
Was porting an OS to a new architecture an extremely skilled thing? 100% then and 1000% today. With each new layer of abstraction away from the bare metal, newer developers no longer need to know how to program at the lowest level, such as targeting a processor architecture directly.
Software development from the 1950s until the rise of Windows as the standard targeted not systems, as we do today, but processors and architectures. Processors at that time were simpler to write for. You could get the datasheet for the latest processor from a magazine, understand it inside and out, and start writing software for it. Today I do not think there are more than a few dozen people who understand the x64 line of Intel processors at that level. So times have changed. We write for operating systems now, not processors.
I think that this is neither good nor bad. It just is simply how it is. I'm sure that people who worked on computers in the 1950s at the assembly level would have been complaining in the 1970s about people writing programs in C/Pascal. And so the cycle continues.
In fact, I think that the current state of generative models that output code is the perfect scenario to separate the wheat from the chaff. Their power function nature gives a clear divide between people who worked in software for the paycheck and those who love technology for its own sake.
I think what ultimately led to Sun's downfall is a combination of what ESR [1] and joelonsoftware [2] have previously covered.
1. Sun didn't become the de facto desktop platform because they lost out to WinNT. So they lost out on the consumer market.
2. Custom server hardware and software makers like Sun and Silicon Graphics were in fashion until Google and later Facebook came around and built their own data centers with consumer hardware and specialized software to overcome the inherent unreliability of that hardware. And anyway, ever since web-based software became a thing, your device is practically a console a la Chromebooks. So they lost the server market.
The only option left was to serve the high-end HPC market, like labs or even banks, but that didn't make business sense: it's an increasingly niche market, and those customers would eventually also want the benefits of commoditization.
The real losses were against Windows 2000 (specifically Active Directory) and to Linux.
The loss to Linux was greatly accelerated by Sun's failure to make a deal with Google for Google to use Solaris on their servers. The story I heard was that Scott wanted a server count for the license while Google believed server count was a top secret datum.
If Sun had made a deal with Google in 2002 and worked on OpenSolaris starting in 2001, then Linux might not have been quite the success it became.
It wasn’t Google’s investment that made Linux a viable OS for enterprise applications. Google using Solaris would have made little difference.
Active Directory was a huge win for Microsoft. We’ll see them milk that product for generations. Sun could have captured a part of that, but it would have had to compete against Microsoft when 99.9% of the clients using AD were Microsoft clients. I doubt they would have succeeded.
Another fun alt-history branch is the one where Sun manages to sell thousands of Amigas as low-end Unix workstations, moving Unix down into the personal computer space, and saving Commodore.
Sadly, none of that happened and we live in the crappiest timeline.
>Another fun alt-history branch is the one where Sun manages to sell thousands of Amigas as low-end Unix workstations, moving Unix down into the personal computer space, and saving Commodore.
This never would have happened with the 3000UX, and various websites are guilty of passing on nonsense (like Sun actually having designed the darn thing). Amiga by this time had already fallen behind Apple's 68K offerings. There is no time in history when the 3000UX was competitive with Sun's own products. By this time Sun had three separate offerings (SunOS on SPARC, SunOS on 80386, and PC/IX on 80386) and would not have added another which, again, was technologically behind and incompatible with Sun's own products.
Maybe. Sun could have acquired Commodore in 1984 or 1985 and Jay Miner and the blitter/copper, and gone a bit more the SGI route.
Also, Commodore did the first SVR4 port outside the Labs, and Sun ended up doing the first commercially successful port of SVR4 (Solaris). So it's not that crazy.
(I think the SVR4 porting was probably a mistake. At Sun we had a pejorative for a lot of the garbage in SVR4: "it came from New Jersey".)
> Maybe. Sun could have acquired Commodore in 1984 or 1985 and Jay Miner and the blitter/copper, and gone a bit more the SGI route.
You mean acquire Amiga. Commodore in 1984 was far larger than the brand new Sun. But yes, that is a very intriguing path not taken.
>(I think the SVR4 porting was probably a mistake. At Sun we had a pejorative for a lot of the garbage in SVR4: "it came from New Jersey".)
You obviously are on the West Coast side of the Berkeley/Bell Labs divide. Was there a lot of internal discussion/dissension before/during the SunOS/Solaris transition?
> You obviously are on the West Coast side of the Berkeley/Bell Labs divide.
No, I joined Sun long after the SunOS 4 -> Solaris 2 transition. The "it came from New Jersey" thing was just a pejorative phrase we used for ugly code with ugly code smells that came from SVR4. It was certainly not my coinage, but rather something Sun's greybeards would say. I had occasion to say it myself.
Basically STREAMS and XTI were disasters that took two decades to eradicate. But there was plenty of stuff in userland that wasn't great either. I recall a bug in eqn once that elicited that comment from someone.
> Was there a lot of internal discussion/dissension before/during the SunOS/Solaris transition?
There was plenty of evidence of internal dissent still a decade after the transition. SVR4 just wasn't all that great. And really, Solaris did not resemble SVR4 that much anymore 20 years after the transition. However, Sun was able to make Solaris quite good in spite of SVR4.
Ultimately I think the transition was good for Sun though. More than anything the user-land of SVR4 was fundamentally different from that of BSD primarily because of ELF, and I think ELF was a fantastic improvement over static linking (at the time, and even now because the linkers haven't adopted any of ELF's semantics wins for static linking, though they could).
> Maybe. Sun could have acquired Commodore in 1984 or 1985 and Jay Miner and the blitter/copper, and gone a bit more the SGI route.
Another path not taken is the Commodore 900 <https://en.wikipedia.org/wiki/Commodore_900> running Coherent. If Sun buys Amiga, perhaps Commodore goes ahead with it and eventually dominates the world via Unix(like)!
Sun selling Amigas would have been quite interesting.
As for AD, Sun had an opportunity to buy u/lukeh's XAD, which was compatible, and it could have done the whole embrace-and-extend thing to MSFT. Instead Sun passed on the deal, Novell bought it instead, and then MSFT acquired Novell. At the time the Sun DS folks were not particularly interested in taking on AD -- they had a cash cow and they were milking it, so no need for innovation.
As for Google using Linux or Solaris, it certainly would have been a PR boost for Sun, and one way or another would have improved Sun's position while denying Linux important resources (contributions from googlers).
NT ate the technical workstation space from below. Once NT was good enough on commodity hardware, they were toast.
Unless they went the Apple route and made “luxury workstations” average people would buy. Hindsight is always 20-20, so we now see all the things they could have done then to prevent now from happening.
I would say it's more like the internal Xerox machines that were built in the mid-1970s at PARC. Most people have no idea how to work with those machines. By the time the 128K Mac came out, people understood what a GUI was and knew how to use one - it's just that the Mac in particular was limited by its hardware.
imo, Apple are actually ahead when it comes to the hardware side of the whole thing. Their vertical integration gives them an edge not many can match, when it comes to running ML models. It's a no-brainer for Apple to reduce the barriers for devs to build really cool native Mac/i{Pad}OS applications and incentivize them to leverage the built-in AI/ML abilities to a greater extent. The iPhone in part took off _because_ of the whole app ecosystem that got built around it. Sure, they might take a loss in their services revenue in the short term, but they get to be _the_ AI platform for at least the next decade and a half - both on-device and server side with their new Apple silicon servers.
It's just that most Apple software seems to suck in some fundamental way right now. I don't know if it's a technical issue (SwiftUI being meh when compared to UIKit for example) or a culture issue or the money coming in insulating management from accountability. Software execution has been lagging behind the excellent hardware execution for almost all of the Tim Cook era. They desperately need someone like Scott Forstall to come in, kick butts and get stuff going again.
They ideally have a couple of years, while waiting for Moore's Law to catch up, to turn around their software side. Otherwise, it's a real shame that all that great hardware is just being used to run Electron B2B SaaS apps.
They also had (but squandered) the potential to be ahead on the software side. macOS is the only platform I'm aware of that has every app wired for scripting (AppleScript/Apple Events). And not only that, they already solved the issue of adoption since almost every well-behaved (read: non-electron) application has decent support for AppleScript.
It would take very little effort to put an LLM frontend on top of this, and yet they've not only abandoned applescript (or the underlying apple events) at a time when they could expose it to the masses, but have gone in the exact opposite direction with "Shortcuts".
Oh and the icing on the cake is that apple events can be sent over the network as well, and this infrastructure has existed since the early days of OSX.
I agree. The AppleScript/Apple Event Manager thing is an example of treating macOS like a second class citizen in the ongoing iOS-fication of the Apple ecosystem. The point of macOS for me is that it's simple to use for most beginners but allows more advanced users to add complexity through tools like AppleScript and Automator and the underlying Unix base.
Like you say, an MCP server integrated with AppleScript/Apple Event Manager would instantly hook up any LLM with virtually all Mac apps. More Mac devs will then be incentivized to support these features. For people who find AppleScript un-intuitive, JavaScript is also supported. And in my view, this is a revolutionary way to use my computer - a very Star Trek way of using the computer.
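For what it's worth, the plumbing for this already exists: any language can hand an AppleScript (or JXA) snippet to osascript, which delivers it to running apps as Apple Events. Here's a minimal, purely illustrative sketch in Go of the kind of "tool" an MCP-style server could expose to an LLM - the app and script are arbitrary examples, and a real bridge would obviously need tool schemas, sandboxing and the usual Automation permission handling:

    // Rough sketch, not Apple's API: drive a Mac app by handing an
    // AppleScript snippet to osascript, which sends it as Apple Events.
    package main

    import (
        "fmt"
        "log"
        "os/exec"
    )

    // runAppleScript executes a script via osascript and returns its output.
    // An MCP-style server would expose something like this as a tool the
    // model can call with generated AppleScript (or JXA via -l JavaScript).
    func runAppleScript(script string) (string, error) {
        out, err := exec.Command("osascript", "-e", script).CombinedOutput()
        return string(out), err
    }

    func main() {
        // Illustrative example: ask Finder for the name of its front window.
        out, err := runAppleScript(`tell application "Finder" to get name of front window`)
        if err != nil {
            log.Fatalf("osascript failed: %v (%s)", err, out)
        }
        fmt.Println("Finder says:", out)
    }

Run it with a Finder window open and macOS will prompt for Automation permission the first time, which is exactly the kind of consent flow Apple already has in place for this.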
> Their vertical integration gives them an edge not many can match, when it comes to running ML models.
They have an advantage when it comes to running them locally, but it feels like they're trying to push it onto consumer hardware before consumer hardware is at the point of actually being able to run useful LLMs.
You're right. The hardware right now can't run useful models.
But that's why I think they have a couple of years to sort out their software issues. When useful models can be run on their devices they have to be ready. The hardware advantage can only be an advantage when they have the software to run useful applications. Hopefully they don't get stuck in the typical big company bureaucracy and ego matches and instead can make a change for the better.
They made it less useful because they were greedy. Before the M1 we used to have laptops with 16GB RAM as the base. With the M1 they took the base back down to 8GB. In the PC world, before the M1 and even before Apple started soldering everything, you could easily get a laptop expandable up to 64GB of RAM for cheap. Those RAM sticks are not expensive at retail and should be even less expensive bought wholesale as bare memory modules on the bill of materials rather than full sticks.
If last year they had made 32GB of RAM standard on every MacBook Air, and 16GB of RAM + a 250GB SSD the base on the iPhone, they would have the most capable hardware with a big user base.
Sure, they'd lose some money from people who would otherwise buy upgraded models, but they would sell many more Macs. For reference, they sell only ~20M Macs each year compared to 60M iPads and 240M iPhones. Macs have what, ~10% market share worldwide? They could easily double that, but they protect their profit margin like virginity.
> Before the M1 we used to have laptops with 16GB RAM as the base. With the M1 they took the base back down to 8GB.
You're getting product segments mixed up. From what I can tell, the 13" MacBook Pro in the Intel era never started at 16GB; the last model still started at 8GB. That's what was replaced by the M1. The 15/16" Intel-based MacBook Pro models didn't get a proper replacement until the M1 Pro and M1 Max, which started at 16GB and 32GB respectively. The only regression I can find there is that the last 13" Intel MacBook Pro could be configured with up to 32GB, which wasn't available from the base M series chips until the M4 last year.
A lot of people forget that it was only recently that the Photos app on your iPhone could run OCR text search on pictures in your phone. Google had that feature on their phones many years before Apple.
> Recent advances in text-to-speech (TTS) synthesis, such as Tacotron and WaveRNN, have made it possible to construct a fully neural network based TTS system, by coupling the two components together. [...] However, the high computational cost of the system and issues with robustness have limited their usage in real-world speech synthesis applications and products. In this paper, we present key modeling improvements and optimization strategies that enable deploying these models, not only on GPU servers, but also on mobile devices
Having worked on Apple's TTS for more than a decade, I can state with confidence that this is utter bullshit and you don't have the slightest idea what you are talking about. Both in terms of quality, and of the underlying technology used, Apple's current TTS is in no way comparable to what existed 10 years ago (at Apple, or anywhere else in the industry).
I challenge you to find a 2014 recording that is on par with a contemporary Siri voice.
I have been playing recently with those enhanced TTS models and to me they are of similar quality to the Piper TTS models - not that good. StyleTTS 2 models like Kokoro sound so much better to me and also run in real time on their devices. And when you compare their online models, not even to what OpenAI has but to small recent startups like Sesame or open-source models like Orpheus, Apple's TTS sounds (pun intended) really behind.
I don't dispute your claim; it's just that I still find the Alex voice to be the best, and it's been the same for over 10 years. The other voices have issues; they don't sound too good at 1.5x.
Alex was developed when VoiceOver (the screen reader) was the primary use case for text to speech. Consequently, it was optimized for low latency and robustness under rate changes.
The Siri voices sound much more natural at 1x and have a higher signal quality, but rate changes were a lower priority for this use case.
Fun fact: when we worked on Alex, many VoiceOver users stubbornly hung on to Fred (which is mostly using late 1970s technology). Screen reader users are not fond of switching voices; it appears their hearing locks in to a particular voice, so switching is costly.
>imo, Apple are actually ahead when it comes to the hardware side of the whole thing.
It surely is just your opinion. Nvidia is king, and Apple has found a way to market an integrated GPU and unified CPU/GPU RAM as something magnificent, rather than something that has existed since the dawn of computing.
There is a reason Nvidia is king. There is a reason corporations buy Nvidia and not Apple for their LLM uses.
You seem to think that the AI market consists entirely of the segment that NVIDIA dominates (datacenter training and inference) and that the segment where NVIDIA is absent (inference on battery-powered devices) doesn't exist.
It's really wild how Mister Rogers' Neighborhood had this slow, thoughtful cadence with long pauses, quiet moments, no flashing graphics or over-the-top antics. It engaged the audience (mostly kids, I guess) with something calm and sincere. You don’t see that much anymore.
I sometimes wonder if a show with that same deliberate pacing and emotional intelligence could even succeed today or if it would get buried under all the noise.
True. I was born in the 80s so had a few tastes of the programs of those early days when people still appreciated a smiling face, calm words and long explanations without flashy special effects. They were quite good actually.
> In fact the whole British imperial project was largely glossed over. But lots of coverage of the Romans, Vikings, Normans, the Black Death and the two World Wars.
I feel that this is a major source of why Britain (and Europe to a larger extent) is unable to come to terms with reality on a majority of issues today - immigration, foreign policy, economic policy etc. They simply have not come to terms with the loss of their empires and the wealth they brought. So they choose to not teach it. This leads British institutions today to have a serious colonial hangover whether they know it or not. The operating paradigm is still an outdated one in many cases.
They teach students what they think made Britain great -- the Romans, the Norman invasion, the World Wars, Churchill etc. -- while actually glossing over what made them great: Empire. It really brings to mind a line from the Thor: Ragnarok movie - "Proud to have it; ashamed of how they got it". The British people today might not have an idea of their Empire but the effects still linger on in their former colonies.
> I feel that this is a major source of why Britain (and Europe to a larger extent) is unable to come to terms with reality on a majority of issues today - immigration, foreign policy, economic policy etc. They simply have not come to terms with the loss of their empires and the wealth they brought.
I would be so bold as to assert that no millennial was really taught that a) Britannia had an empire and b) Britain ruled the waves.
The two World Wars and slavery are pretty much all we were taught, unless you specialised.
"modern" immigration was/is much more driven by our former membership of the EU than empire.
Empire is why our friends had Caribbean grandparents, WWII is why they had Polish grandparents, and Idi Amin is why they might also have had Indian parents born in Uganda.
But they were all pretty British to us. They sounded like us, dressed the same.
"modern" immigration when I was growing up was mostly Portuguese and Polish, later more baltics when that opened up to schengen.
But those later countries were also a product of another empire: USSR.
> I would be so bold as to assert that no millennial was really taught that a) Britannia had an empire and b) Britain ruled the waves.
Millennial here from the US. I was taught about the British Empire, extensively, in both high school and college. My high school teacher played "Rule, Britannia" (lyrics include "rule the waves") for us to hammer home the point.
Not a universal experience though: I'm also a US Millennial, and in middle school/high school our history classes were pretty much only US history, and only touched on world history as GP described. We did have a world history elective in high school, but it was an advanced placement class and not everyone could take it. No history classes during college.
Additionally my history classes all ended around the 60s-70s - roughly when the teachers were kids. Seemingly from their perspective "history" didn't include anything they experienced.
Neither. While colonialism didn't _create_ generational poverty, the systemic genocides of the British were new. Colonial policy of prioritizing exports directly led to the deaths of millions. That's a fact.
A similar comparison would be between Roman slavery and the chattel slavery of the Americas. Both are abhorrent practices (just like the genocides caused by Indian rulers in the pre-British period), but the former pales in comparison to the scale and horror of antebellum slavery.
Well, if this is not mentioned at all during history classes, at least it prevents them from being taught that "British brought prosperity and development to all of its colonies, making the world better for everyone" and "they should be thankful that we went there and did all those things, how nice of us, and how rude of them not to thank us again and again!".
I think it comes from the general belief that poverty is bad and a simplistic view of cause and effect. "Before the Empire they were really poor, with high mortality, after the empire they were much wealthier with lower mortality".
It's just going to be the default view if one does not have further information.
Go is at a weird place where the standard library is big enough that third-party libraries aren't really required, but not so big that it can do everything. So, like OP says, there is a huge thing with Go projects (and devs) about reinventing the wheel for each project. While sometimes this is fine (and even welcome, because it offers you flexibility and customization), it mostly really slows down someone who wants to write a full-featured application with auth/ORM/web security etc.
You either have to write your own or try to use something that is actually really far behind what you're used to. And coming from Python's batteries-included ecosystem, it's actually nuts. Our small team would have to dedicate a huge chunk of dev cycles to writing our own plumbing. Dev cycles that we could have used writing business logic that actually made us money, instead of writing yet another auth layer.
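To make "writing your own plumbing" concrete, here's the kind of thing I mean - a sketch of HTTP basic auth done with nothing but the Go standard library (the route, credentials and handler are invented for illustration; real auth means password hashing, sessions, OAuth and so on, which is exactly the part batteries-included ecosystems hand you):

    // A minimal sketch: HTTP basic auth middleware built only on the
    // standard library. Credentials and routes here are made up.
    package main

    import (
        "crypto/subtle"
        "fmt"
        "log"
        "net/http"
    )

    // basicAuth wraps a handler and rejects requests that don't present
    // the expected username/password pair (constant-time comparison).
    func basicAuth(next http.Handler, wantUser, wantPass string) http.Handler {
        return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            user, pass, ok := r.BasicAuth()
            if !ok ||
                subtle.ConstantTimeCompare([]byte(user), []byte(wantUser)) != 1 ||
                subtle.ConstantTimeCompare([]byte(pass), []byte(wantPass)) != 1 {
                w.Header().Set("WWW-Authenticate", `Basic realm="restricted"`)
                http.Error(w, "unauthorized", http.StatusUnauthorized)
                return
            }
            next.ServeHTTP(w, r)
        })
    }

    func main() {
        hello := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            fmt.Fprintln(w, "hello, authenticated user")
        })
        http.Handle("/", basicAuth(hello, "admin", "s3cret"))
        log.Fatal(http.ListenAndServe(":8080", nil))
    }

That's thirty-odd lines for the most trivial possible auth; multiply it by sessions, CSRF, an ORM and the rest and you can see where the dev cycles go.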
But having said that, I don't think Go is the language to write these kinds of apps in. So if you're trying to use Go to do that, you're obviously going to run into roadblocks. It's like trying to cut through metal with a pair of gardening shears. Can it be done? Yes. Can I choose the size and design of the shears? Yes. Can I modify or create my own shears? Yes. But is it the most efficient or appropriate tool for the job? Probably not.
IMO Go is ideal for server/CLI applications or even for web apps that interact with other services completely behind your network. But for most other things you're better using something else. Or you use it if your customer really wants to use Go and wants to pay you.
Go is like a tool used to build the plumbing and internal systems of your house, but not the house itself. Using Go is similar to casting and forging your own water pipes. Some people enjoy getting into the details of crafting those pipes, but most of us just want to install them and focus on the rest of the construction.
> Go is like a tool used to build the plumbing and internal systems of your house, but not the house itself. Using Go is similar to casting and forging your own water pipes. Some people enjoy getting into the details of crafting those pipes, but most of us just want to install them and focus on the rest of the construction.
It actually gives me the house, it just doesn't do decoration. I hate doing decoration anyway. Other languages + their frameworks usually force me to care about that.
This might be a contrarian view to the rest of this thread but I think this is a decisive move by Microsoft. AI-enabled OSes are a part of the future (for mainstream consumers at the very least). By sticking with Recall despite the initial backlash, I think Microsoft is showing they're a serious player.
MSFT would not risk their enterprise and government business with features like Recall if they weren't sure there was a need and requirement for it at some level. Fundamentally, MSFT isn't a company that anticipates the needs of their users, and they haven't been for the past 25 years. They lagged behind on mobile and then cloud because none of their main customers thought those were important. The fact that they're going on the front foot here is indicative of a larger strategic play.
Coming to the product itself there appear to be sufficient controls in place on Recall. It's opt-in even if it cannot be uninstalled. It's all on-device and allocates specific space on the PC. I can specify if I don't want it to take snapshots of certain apps and it doesn't take snapshots of private browsing by default. IT teams can manage Recall through policies AND users can have further fine grained control over their settings beyond that. It's great that MSFT have included these right from the start because if we're frank, not all other tech companies would have thought it through.
Personally, I wouldn't use Recall but I can see the appeal and usefulness -- for both consumers and IT teams. You ask your computer what you did on so-and-so date and it'll tell you? That's great and what computers should do -- take cognitive load off of our minds. Plus it's a great audit trail in the office.
My only gripe with it -- as with anything MSFT really -- is security. I'm not entirely sure MSFT would be able to stop people writing malware that explicitly steals Recall data. I hope they have safeguards, but with it being closed source, hoping is the best we can do unfortunately.
I know HN users are more likely to be anti-MSFT and more tech savvy than the average consumer -- it's a bit like the tech enthusiasts buying smart products versus the senior engineer living alone in a forest off-grid. But what we have to remember is that we're the exception rather than the rule. Most people are tech-illiterate and have no inclination towards learning more or towards spending more time with their computers. Products like this are for them.
> It should be forbidden to ship PCs with Windows preinstalled, unless the user made it a conscious purchase decision (and offered to choose between several options).
I don't know where you're buying PCs but where I'm from there is always an option to buy the PC with FreeDOS at a discount of about ~$20-30 compared to the Windows version. Lately, I also see an increase in Ubuntu computers.
> But then you can bet Microsoft lobbying won't let that happen.
I feel "MSFT lobbying" isn't charitable at all to what MSFT and their devs have achieved. You have to give credit where it's due. MSFT have spent a lot of time, effort and dev years ensuring that their customers can run their software without breakage and downtime. This is a non-trivial aspect that most people who don't use Windows often dismiss. MSFT have made themselves the standard platform because of their broad support. This is no mean feat. Canonical has tried for almost 20 years at this point and have barely made a dent with Ubuntu.
> I don't know where you're buying PCs but where I'm from there is always an option to buy the PC with FreeDOS at a discount of about ~$20-30 compared to the Windows version. Lately, I also see an increase in Ubuntu computers.
I'm not from there, it seems. The best you can do here is build your own PC, or go with a distributor (typically Dell/Lenovo) whose configuration allows opting-out of buying an OS. Needless to say that it's not a mainstream purchasing behaviour.
> MSFT have spent a lot of time, effort and dev years ensuring that their customers can run their software without breakage and downtime.
That wasn't my point at all. It was to stress how Microsoft's ludicrous track record of anticompetitive practices, establishing and sustaining a decades-long monopoly, barred non-experts and non-enthusiasts from experiencing (possibly favourable) alternatives.
> It was to stress how Microsoft's ludicrous track record of anticompetitive practices, establishing and sustaining a decades-long monopoly, barred non-experts and non-enthusiasts from experiencing (possibly favourable) alternatives.
My point served to counter this very statement.
There are alternatives (Linux, macOS, FreeBSD etc.) but, like you say, none are favorable. A big part of why is the excellent job that MSFT did as a technical force looking to consolidate Windows as the OS standard all those years ago. The effort they put into broad-based application support, customer research and support on Windows has contributed to the continued perpetuation of their monopoly. Were they ever the most technically advanced option? No. Are any of their software products absolutely perfect and without deficiencies? Also no. And yet they are possibly the leading software company in the world. This is NOT solely due to their anticompetitive practices. Saying so is a form of denial about the true state of things.
I gave Canonical and Ubuntu as an example of someone else who has tried to step into the breach and failed to force out MSFT as an alternative for non-experts and non-enthusiasts. Ubuntu and the FOSS community are many things, but friendly to beginners and non-technical people is not one of them. There have been tremendous advances in the past decade but we're nowhere close to this being the Year of the Linux Desktop. The bottom line is that mainstream (i.e. non-technical and non-enthusiast) consumers will choose to put their money where they get the best value, and that remains MSFT and Windows.
> MSFT have spent a lot of time, effort and dev years ensuring that their customers can run their software without breakage and downtime
I don’t know about that, but they have spent a lot of hours making sure you can run .exe’s from 30 years back, which is wildly valuable to slow-moving corporations.
> I don’t know about that, but they have spent a lot of hours making sure you can run .exe’s from 30 years back, which is wildly valuable to slow-moving corporations.
Which ones? There are tools like DOSBox to get old DOS programs running again.