Yeah, my mom would slowly forget how to get to my house. It was sad; she would always try but just not quite get it... She eventually died of Alzheimer's some years later...
I was going from SFP+ to SFP28. Not sure about SFP to SFP28. But with that EEPROM change you could also try going from SFP to SFP+, just use different bandwidth values.
You could also try resoldering new/thicker Twinax wires to old SFP connectors and update the EEPROM. My soldering skills apparently aren't that great. If yours are or you have plenty of old connectors you should give that a try.
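If anyone wants to poke at the EEPROM route, here's a rough sketch of what the byte-level edit looks like. Everything here is an assumption: the I2C bus number, that your adapter exposes the module's A0h page at the usual 0x50 address, that the module isn't write-protected, and the SFF-8472 layout (byte 12 = nominal rate in 100 Mb/s units, byte 63 = checksum over bytes 0-62). Sanity-check a dump with `ethtool -m <iface>` first.

    # Sketch: rewrite the advertised rate on an SFP's A0h EEPROM page.
    # Assumes smbus2, /dev/i2c-1, module at the standard 0x50 address,
    # and no write protection -- all of which vary by setup.
    from smbus2 import SMBus

    A0H = 0x50        # standard SFP EEPROM I2C address (A0h page)
    RATE = 12         # SFF-8472 byte 12: nominal rate, 100 Mb/s units
    CC_BASE = 63      # checksum byte: low 8 bits of sum of bytes 0..62

    with SMBus(1) as bus:                        # bus number is a guess
        bus.write_byte_data(A0H, RATE, 0x67)     # 0x67 = 103 = ~10.3G
        csum = sum(bus.read_byte_data(A0H, i) for i in range(63)) & 0xFF
        bus.write_byte_data(A0H, CC_BASE, csum)  # fix CC_BASE checksum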
I have always envisioned an AI server being part of a family's major purchases: when they buy a house, appliances, etc., they also buy an 'AI system'.
Machine hardware evolution is slowing down; pretty soon you'll be able to buy one big-ass server that could last decades, as it would be purpose-built for AI.
Things like 'context-based home security'? Yeah, that's just automatic, free, part of the AI system.
Everyone will talk to the AI through their phones, and it'll be connected to the house; it'll have lineage info of the family that may be passed down through generations, and it'll all be 100% owned, offline, for the family: a forever assistant, just there.
- 6× faster CPU/GPU performance
- 6× faster AI performance
- 7.7× faster AI video processing
- 6.8× faster 3D rendering
- 2.6× faster gaming performance
- 2.1× faster code compiling
Over the span of 5 years.
Plus, realistically, what makes an "AI" server different from a computer? This "lineage info of the family that may be passed down through generations" sounds nice, but do you know anyone passing down a Commodore 64 or Apple II that remains in daily use? I fail to see how "AI" would protect something from obsolescence.
I have a good analogy. 10 years ago, I was convinced that a 24-inch 1080p monitor at arm's length was perfection. There could never be any reason to improve over it. I could do everything I ever wanted to, to a standard I would never need to improve upon.
Yet here we are. The simplest and most obvious improvement is a 24" 4k monitor at 200% scaling. Basically, better in every way.
There's a discussion to be had about whether you need the better setup, which I think is your point, but there's no denying you'd want it (all other variables the same).
At some point specs don’t matter. I don’t wonder about the processor in my thermostat either. I don’t know how many horsepower my XC90 has. I don’t know the rated power of my chainsaw.
All I care about is: do they work, are they ‘safe’, are they comfortable, etc.
Today, not much differentiates them. But as time passes, our only option will be to further specialize the hardware to get realistic gains; at some point, perhaps a 'purpose-built analog computer' kind of thing will get to the point where it is so useful that it would be like the 'Standard Template Constructs' concept in Warhammer 30k. So what if you can make a faster AI; the current one can already 'teach everyone, basically anything'.
It depends on what/how you're comparing. Core to core, according to CPU Benchmark, the M5 is at 5800 vs the M1 at 3600, so at roughly 1.6x we're still not quite at 2x.
Overall system performance is better, at about a 2x improvement, thanks to extra cores and other changes. I could see more specialized benchmarks improving further thanks to core/power/size gains in other components (GPU/NPU/etc.).
If you bought a big-ass server for your home 10 years ago, it probably wouldn't even have had a GPU/AI accelerator at all. If it did, it would have been something with wimpy compute and VRAM, because you needed the video encoder/decoder for security cameras or the like.
I'm not sure that inspires confidence that hardware has slowed down enough to invest in it for decades. Single-core CPU performance has, but that's not really what new things are using.
It really just depends on if the hardware is "good enough" for whatever its purpose is. If the hardware today can locally run whatever models for your security cameras, it's likely they will still be "good enough" in 10 years.
Of course, similar to a 10 year old car or appliance, you will be missing any new features or bells and whistles that have become available in the meantime.
I agree; it's important to recognize that there are lots of use cases where computers have long since reached "good enough" and aren't really going obsolete anymore for those use cases.
My NAS is about 13 years old, the network switches it connects through are even older, and while 2.5GbE now exists I have no need to throw out my "good enough" equipment and replace it with something marginally faster or more power efficient. I don't even really need to expand the storage of that NAS anytime soon, because my music collection could never come close to filling it, my movie/TV collection isn't growing much anymore due to the shift to streaming, and the volume of other stuff I need to back up from my other computers just isn't growing much over the years.
Decades is a long time for hardware, but "years" seems reasonable soon. The commercial models are "good enough" for a lot of things now, so if that performance makes its way into the on-device space at "home appliance"-level cost (<$5k at the start, basically), I'd expect a lot of stuff to start popping up there. In offices too.
Like the PC in the 80s starting to eat up "get a mainframe" or "rent time on a mainframe" uses.
You're kind of undermining your own point. Ten years later, the only thing you'd need to upgrade for your home server might be the GPU, because a new use case emerged. Okay? Spend $500-$1000 on an eGPU. Problem solved. Will that eGPU setup last another ten years? If all it's doing is processing security video and routing claw-like tasks, then yes.
Not sure I follow - that the server from 10 years ago would be completely unfit for purpose now doesn't imply the one you buy today will be the right hardware 10 years from now. Unless you can somehow guarantee that we've reached, in just these last few years, the final set of new requirements we will ever have, the GPUs you buy today will probably be just as irrelevant to the new requirements a decade from now.
Of course, one can always upgrade components piecewise as requirements change, but I don't see why you need to invest in a big-ass server to do that. It'd be cheaper to go the route everyone has gone for decades at this point: upgrade with normal-sized stuff as needed, and don't try to make an up-front multi-decade home investment out of it.
On the flip side, if you intentionally plan to lock in the capabilities to the kinds of things one can run today, and therefore know you'll never need to upgrade, then you can get whatever sized system makes sense for today's needs. You just need to be really sure you won't be interested in "the next big thing" when it comes, too.
Yeah, but how long do mainframes last? Think of the COBOL systems used in government. No reason to update them; they worked forever. Their job is discrete, and they performed it well enough that intense updating wasn't a requirement.
You also need to ask: how much do mainframes cost? They were engineered for backwards compatibility and reliability, with built-in redundancy you don't find in consumer hardware.
AI models are changing every other day. I have to rebuild llama.cpp from source regularly. We are nowhere close to a personal "AI mainframe."
I don’t think there’s anything different between what you’re suggesting and a homelab. Most people do not have a homelab and are happy to offload services like photo storage or security to remote providers.
I think that attitude is (very) slowly changing though and might not be the default forever.
My elderly parents have asked me about "local backups" of their cloud stuff, their Facebook history, etc.
If they're thinking about the risks/tradeoffs of being in the cloud...
I think people use the cloud because there's no better/easier option today.
But at some point there might be. A home appliance (which may be similar to a homelab under the hood but the user experience is where things change) that provides a bunch of automation and home services could be quite attractive if it got to a point of being very turnkey for the average family.
There’s no better option today because it’s impossible to make it a better experience. That machine at home will need upgrades, it could fail, it costs thousands, it sucks lots of power. There is no mass market appeal.
Maybe today.. But my TV has been sitting on my stand for years, and it doesn't need upgrades.
My Raspberry Pi pi-hole is a Pi 2b that has been running for over 5 years and it's totally fine. It has automatic security upgrades turned on but nothing else, and it doesn't need any time or attention. It just does its job.
I have a homelab that's a mini-PC: it's quiet, doesn't suck lots of power, and is tucked away neatly in a closet.
I think it would be completely possible to provide an appliance-like machine that would not have the problems you're outlining.
Homelabs feel wholly different and require custom setup and maintenance.
A home appliance, in the AI-server case, would be like a toaster: a ready-to-go appliance that's preloaded and self-contained, connects to everything in your home, and helps you manage it, likely through voice chat or some minimal interface.
What you’re describing is more likely to manifest as a proprietary product from someone like Samsung or Ring (likely both!) than an open standard AI server that integrates with everything in your home automatically. This is exactly like what we have today with security systems and smart appliances. You have managed services and you have Home Assistant in your homelab.
Strongly agree. Plus, for all but very specific use cases, most people will spend less money by paying for cloud services, with "most" here referring to the general population.
I think something like this actually makes the 'big server' idea feasible - if these are basically plug and play modules for whatever home AI services you use. If the chassis is just handling power distribution and networking for modules, then it really might last a decade or more, and individual modules might also last similarly long times depending on the use case, while having the flexibility to move to something newer and more performant as it comes out.
Even if that happened tomorrow, I suspect we'd have _at least_ a decade of people tweaking/optimizing designs on the same node to squeeze meaningful performance upgrades out. E.g., coming up with hardware support for new int/float formats that make more sense for the models of 2029, running matrix operations on RAM chips directly, etc.
I remember back in the early 2000s when people thought we were running out of steam on the advancements front. This was roughly around the time CPU clocks stopped getting faster. The Pentium 4 hit 3 GHz back in 2003, and Intel Core Ultra 5 performance cores are generally around this exact speed two decades later.
Since at least the 640kb quip, betting against progress or the appetite for progress has been a losing bet.
Honestly, post-2005 things did slow down dramatically for typical single-core workloads.
In the late 90s and early 2000s the mantra was "why waste time optimizing your software? By the time you're done the next gen of CPUs will have made up the difference."
Now the increase is more about moving to GPUs and power efficiency etc. We still have increases, but the rate of speedup has slowed down a lot.
Based on our current trajectory, it seems more likely everyone will upload everything to the cloud and pay perpetual royalties to access their own data.
I really think this is a temporary scenario. There will be advancements in AIs building the next generation of AIs, where model scale continually shrinks, and maybe there will be some breakthrough that allows us to double the use of existing hardware/memory, etc.
10 years ago I couldn't do an Alexa at my house; now I'm pretty close with a Qwen3:8b / Ollama LLM (I mean, I never really wanted Alexa to do anything other than play music, automate stuff, etc.; zero interest in it teaching me how to code).
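For the curious, the whole "local assistant" loop is only a few lines these days. A minimal sketch, assuming the ollama Python client is installed and the model was already pulled; the prompt is just illustrative:

    # Minimal local-LLM query via the ollama Python client.
    # Assumes `pip install ollama` and `ollama pull qwen3:8b` were done.
    import ollama

    reply = ollama.chat(
        model="qwen3:8b",
        messages=[{"role": "user",
                   "content": "Queue up some jazz and dim the lights."}],
    )
    print(reply["message"]["content"])  # runs entirely on the local box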
I'm even thinking that at some point we'll consider access to AI a fundamental human right, since otherwise you are inherently at a disadvantage in terms of wealth prospects relative to those who do have access.
There's a potential case for a subscription model to keep security updated for the connection to users' phones, as well as ongoing support for less tech-savvy users (e.g. "I told my assistant to turn on my smart dishwasher and it turned on my smart washing machine instead"). I'd imagine the HN crowd would lean toward an open-source version, though.
If dishwashers were invented today they would be rented out to homes and businesses with DRM to lock you into buying approved detergent and tableware. Times change, and more exploitative arrangements are normalized. This ratchet is primed to go in one direction, and only moves the other way in fits and starts borne of great effort.
Well, custom/bespoke training for your family's particular needs, perhaps, performed once every 5 years.
I mean, I envision analog/custom/bespoke AI hardware that is fundamentally 'good enough'. As the market's need for these systems increases and time progresses, at some point it'll be like Warhammer 30k, where these 'Standard Template Constructs' are smart enough to basically teach you anything.
This is your reminder we're in a bubble inside of a bubble...
Most people don't even think about running network cables or mesh Wi-Fi when building a house; no one will buy a server to run AI in their physical home.
> These are results for what is bev market auto industry
> Search instead for what is bev market auto industru
> AI Overview
> The Battery Electric Vehicle (BEV) market involves vehicles powered exclusively by electricity via onboard battery packs, without any internal combustion engine. It is a rapidly growing, high-investment sector within the automotive industry aimed at zero-emission transportation. Key aspects include accelerating market share, intense competition, and improvements in charging infrastructure.