If AI continues on this trajectory, sure, the gains likely go to the picks and shovels.

If the AI bubble bursts, you could see a lot of used hardware flood the market, and then companies like WD could have a hard time selling against their previous inventory.


The problem is more likely that companies like WD don't know whether this is a bubble or not. Currently they can milk the market by raising their prices and just rely on their current production facilities, maybe expanding a little. If there's going to be a crash, then it's better to have raised prices, even if just temporarily, than to be left standing with excess production capacity.

If demand is long term, it would be better to be the front runner on additional capacity, but that assumes continuous growth. If it all comes down, or even just plateaus, it's better to simply raise prices.


Given how hard AI is on I/O, even if hardware does go second hand after a crash, I don't see hard drives going second hand. Most of the drives we'd get would likely be worn beyond redemption, even for free.


Actually, used hard disks have gotten pretty popular in the past few years, with sellers like ServerPartDeals building up a reputation by selling drives that are properly tested and recertified. As long as you have redundancy and backups, by all accounts they hold up pretty well.


SPD is out of stock as well for all their high-capacity, manufacturer-recertified drives.

Got the 20TB Seagate recert in May 2025 for $239. Now it's $500 and out of stock [1].

Got another 28TB Seagate recert in Sep 2025 for $398. Now it's $482 and out of stock [2]. The same drive, seller-refurbished, is also OOS, at $700.

I sure hope everything in my homelab NAS outlives the AI bubble. Otherwise, I'll have to shut it down, as getting a replacement for ANY part inside it is going to be stupid expensive. I'm at least thankful I went for a DDR4 motherboard instead of DDR5.

[1] https://serverpartdeals.com/products/seagate-exos-x20-st2000...

[2] https://serverpartdeals.com/products/seagate-exos-st28000nm0...


Yeah AI killed that market just like everything else. I'm sure the supply of used drives has also dried up as the equation of "cost per TB of power and rack space" vs "cost per TB of replacing with newer, denser drives" has gone haywire.


I don't think the HDDs are being used for any intensive loads. They have too much latency for most of that. It's probably just archival storage for their scraped content and generated slop.


For "cold" archival storage you would want to use tape, which is far cheaper per TB at scale.


I don't mean that type of archive, but rather "just in case" data, like "last month's scrape of this website" after we scraped it 5 more times this month, or higher-resolution versions of book scans. You might still want to be able to dump it out quickly if you need it. Money is no object for these companies, and the cost of HDDs is more than low enough for the flexibility they provide.


If demand for hard drives is this high then it sounds like there wouldn't be near enough tape around either.


This is why I am buying a couple of LTO-6 tapes. Thus far I've been able to buy 4 for approx. 120 EUR, 2.5 TB each. They have been around 30 EUR each for the past years, and still are at roughly that price (leaning towards 35 EUR, though). I bought a second-hand drive for about 500 EUR, and an HBA for it.


Tapes are great for true cold storage (will easily last many decades!) but they wear out significantly with more intense use: you only get a couple hundred passes total over their full data capacity, either read or write. In practice, you still need plenty of big hard disks to act as nearline storage for everyday use, with the tape only rarely doing bulk storage and retrieval. This is also why you see mechanical tape libraries with tens or hundreds of tapes for a single read/write unit: you don't really need more drives than that.
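
To put rough numbers on it (the capacity figure is the LTO-6 one from the parent comment, the pass count is just the "couple hundred" above):

    # Back-of-the-envelope tape endurance, not a spec sheet
    capacity_tb = 2.5    # LTO-6 native capacity, per the parent comment
    full_passes = 200    # "a couple hundred" end-to-end passes, roughly
    lifetime_tb = capacity_tb * full_passes
    print(lifetime_tb)   # ~500 TB of combined reads/writes before the media wears out

A tape sitting in a busy nearline tier burns through that budget quickly; one that is written once and read back a handful of times barely touches it.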


Yes, I will use them as cold storage, nothing else. Right now, I have the following scheme:

1x a live server (Proxmox, NAS, firewall, and various other capabilities). 2x RAID1 enterprise NVMe, with NAS storage on RAID1 HDDs. 12th gen Intel, so relatively power friendly apart from the enterprise HDDs. 10 gbit local, remote 1 gbit fiber.

1x a live backup server in same city. Syncs every night. I should disable it otherwise, but I don't as of now, since it also gets a livestream of my doorbell camera (I don't use cloud for it). Has 1 gbit fiber, and RAID-1. Runs OS off RAID-1 (cannot add NVMe, older Synology, well maybe USB3 would work, but I'd rather not).

1x a backup server in same location as my home server which auto starts and syncs every week. This is my main old server, a Xeon so not very power friendly. Also RAID-1 NVMe. 10 gbit local.

1x a remote cloud (Norway), to have another copy of the most important data. Doesn't contain everything. Costs me only 50 EUR/year.

So that is a lot of copies of the same data, and quite frankly I don't need this many. For the HDDs I want to get rid of RAID-1 and use either RAID-0 or JBOD, doubling the available capacity (I'm at the max as it is) while still having great redundancy across the other copies. And I will want to store my tapes off-site, although they wouldn't have the latest and greatest backups. I still have to look up how to do FDE with LTFS, though, but I'll figure it out.

It also seems a good moment to sell some old hardware, given the current prices, but I am not sure if I will. Just something to ponder later. You'd think I'd like to sell off the Xeon with its 30W CPU, but I quite like the machine (HP MicroServer Gen10 Plus). I'd rather sell the Synology, which is still a decent machine, but I use voodoo to run recent software on it, and ZFS (with Homebrew / Nixpkgs). Though neither is useful for ML.


Headline reads like a hot take. Actual recommendation is rather useful. Click-bait used for good.


Basically this.

If companies want people back in the office when they could WFH instead, they're going to have to compensate them, time and costs, for the commute.


When unemployment goes to 7%, they won't be compensating for a damn thing, and WFH will be a thing of the past.


I mostly skimmed the article.

Could things be better? Sure. But saying we're "wasting" all this time seems a little disingenuous. How much time do people spend maintaining tractor equipment to grow crops? Or really any tools of any trade?

The equipment and tools we use let us be more productive overall with certain tradeoffs.


While that's an interesting question, legacy students likely also have access to better resources outside of the classroom and office hours.


Really? Pretty much every school I've ever seen has a special office of tutors that takes care of the various affirmative action admits. And also athletes. Definitely athletes. I don't think I've ever seen any such office for legacy students. Ever. Or even heard about some covert version.

Can you explain a scenario where this might be the case? The only one I can imagine is assuming that legacy students are rich and so they can just toss around more money on private tutors. Perhaps. But then the right statement is to say that rich students have better access.


That's the point... They're accepted because their family resources make them more likely to be successful, which is what Harvard is trying to select for.


She is literally a walking biohazard. She got her due process, lost, and then ignored the court order. From the article it sounds like she can still refuse treatment but she can't keep being a risk to the public.

Edit: And to your point about "no crime," reckless endangerment is a crime. Defying court orders is a crime.


Where do you get that 50% number? Do you mean 50% of all new code in the industry? That seems beyond extremely unlikely.


The number is 40%, and it's 40% of the code in files where Copilot is enabled, in popular languages like Python:

> In files where it’s enabled, nearly 40% of code is being written by GitHub Copilot in popular coding languages, like Python—and we expect that to increase.

https://github.blog/2022-06-21-github-copilot-is-generally-a...


I wonder if this properly counts cases where copilot writes a bunch of code and then I delete it all and rewrite it manually.


From what I remember, they check in at a few intervals after the suggestion is made and use string matching to check how much of the Copilot-written code remains.
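
I don't know GitHub's actual methodology, but a toy version of that kind of retention check could look roughly like this (the function name and snapshot timing are made up for illustration):

    import difflib

    def retention_ratio(suggestion: str, file_snapshot: str) -> float:
        # Rough fraction of the accepted suggestion that still appears
        # in a later snapshot of the file, via matching-block analysis.
        matcher = difflib.SequenceMatcher(None, suggestion, file_snapshot, autojunk=False)
        kept = sum(block.size for block in matcher.get_matching_blocks())
        return kept / max(len(suggestion), 1)

    # e.g. sample the file again some time after the suggestion was accepted
    suggestion = "def add(a, b):\n    return a + b\n"
    later = "def add(a, b):\n    # now with a comment\n    return a + b\n"
    print(retention_ratio(suggestion, later))  # close to 1.0: most of it survived

Deleting the suggestion and rewriting it by hand would push that ratio toward zero, so a scheme like this would at least partly account for the case you describe.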


It's all about the denominator!


There was some discussion by the Copilot team that x% of new code in enabled IDEs was generated by Copilot.

It varies, but here's one post with x=46 from last month. So, very close to half.

https://github.blog/2023-02-14-github-copilot-for-business-i...


Measuring output by LOC is not a very useful metric. The sort of code that's most suited to AI is closer to data than code.


(I read it as 50% of their code)


My first two internships were at an insurance company. Everyone was busy. Not saying your job wasn't what it was, but I can't say it's representative of everything outside Silicon Valley.


We have a project that is basically an internal, stripped-down version of CodePen and others like it. All code samples are stored in git.

The main challenge for something like our app is search. Git is good at creating files and managing versions/branches, but not good at searching files or their content. I'm not enough of a git expert to fully back up that claim, but that's been our experience. You can layer your own search capabilities on top if you need them, but then you might want to start asking if a full DB is better.
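
For what it's worth, plain git grep does cover the exact-match case, even against a bare repo; a minimal sketch (the repo path and query are made up):

    import subprocess

    def search_snippets(repo_path: str, query: str, ref: str = "HEAD") -> list[str]:
        # Naive content search over code samples stored in git.
        # Returns "ref:path:line:match" strings; no ranking, no fuzziness.
        result = subprocess.run(
            ["git", "-C", repo_path, "grep", "-n", "--fixed-strings", query, ref],
            capture_output=True, text=True,
        )
        return result.stdout.splitlines()

    print(search_snippets("/srv/snippets.git", "useEffect"))

Anything beyond that, ranking, metadata filters, fuzzy matching, is where a separate index or a proper DB starts to look like the better tool.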


Perhaps not depressing, but certainly unhelpful. If I spend time thinking about other, more interesting stuff, I tend to get sucked into those things, which just means I've got zero focus on the stuff I need to get done.

