Hacker News | duped's comments

What do you mean? Cold smoked fish and pickled cabbage are great. And you don't have to worry about heart disease when consumption will get ya long before the sodium does.

Silk screen printing is probably the easiest way to introduce the concepts to kids. There are a lot of maker spaces/artist collectives and classes that have the basic tools and resources to do it.

You could also try to replicate something like the Monster 6502: https://monster6502.com/

It's not lithography, but you can build a working processor out of small surface mount chips, and you can solder those chips with lead-free solder. That seems very achievable for a motivated engineer, and probably involves far fewer toxic chemicals?


Or even gel plate printing, where you get to build multiple layers, one of them being a laser printed photo that is used as a resist.

Consumers don't care so much about consolidation as they care about not getting ripped off. When Netflix and Hulu were the only streaming platforms you paid a pretty low price to get virtually everything you wanted. Now you pay more for a worse experience.

Netflix at least has technical chops. Other studios (looking at you, Paramount+) put out barely functional apps because they know consumers will ultimately pay for their content.


Netflix may have the technical ability, but they don't deliver. Their UI just gets worse and worse in terms of usability, and they keep cutting features on top of steadfastly refusing to provide features people have been asking for since they started streaming movies.

Basically every streaming app is minimally functional and obnoxious in its own way. Netflix isn't the worst of them, but it's no exception and getting worse all the time.


>you paid a pretty low price to get virtually everything you wanted

Depends what you wanted.

Both a deep back catalog of TV and film more generally were always pretty lacking on the all-you-can-eat streaming services. Frankly, my biggest complaint with Netflix is that they basically drove local video rental out of business and then shut down their own rental service.


This. I loved the DVD service and I don't think I was alone. Younger folks perhaps didn't use it as much as some of us, but for those who don't have the best internet speed or service, it was great.

Even when I had good service/speeds, the DVD service was amazing because it had way more options than streaming does even now, including some pretty hard-to-find DVDs, and you got the extra features! It was also nice to regularly get something in my mailbox besides spam...

It's an issue for me, a Mac owner. All the games I want to play have buggy graphics on Mac. I have a PC just for playing with my friends.

Apple's decisions are often wrong when it comes to third-party software.


Apple's view on your situation is probably that you still bought / keep the Mac (not intended flippantly).

Shoutout to PACE, who banned scripting in the JUCE 8 license terms, so if you wanted to make this using the leading framework, you can't.

Do you have a source for this? I don't see any indication from a quick Google search, other than this thread showing up as the second result.

The license at: https://github.com/juce-framework/JUCE/blob/master/LICENSE.m...

indicates you can just license any module under the AGPL and avoid the JUCE 8 license (which, to be fair, I'm not bothering to read)


https://forum.juce.com/t/archived-juce-8-eula/60947/149

And sure, you can license under the AGPL. It should be obvious that's undesirable.


Define scripting.

I'm not going to test it, but couldn't you just load a JSON file with all the params?

Various instructions, etc.

I can't believe it's not code!
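
To illustrate what I mean, here's a rough sketch in Python (not JUCE; the preset format and field names are entirely made up): a file like this is just data dispatched against a fixed whitelist of operations the plugin already implements, so whether interpreting it counts as "scripting" really comes down to how PACE defines the word.

    import json

    # Hypothetical preset file: parameter values plus "instructions" that are
    # plain data, dispatched against a fixed set of built-in operations.
    preset = json.loads("""
    {
      "params": {"cutoff": 0.4, "resonance": 0.7},
      "instructions": [
        {"op": "ramp", "param": "cutoff", "to": 0.9, "ms": 500},
        {"op": "set",  "param": "resonance", "value": 0.2}
      ]
    }
    """)

    def apply_instruction(state, inst):
        # Only operations we explicitly implement run -- no user code executes.
        if inst["op"] == "set":
            state[inst["param"]] = inst["value"]
        elif inst["op"] == "ramp":
            state[inst["param"]] = inst["to"]  # a real plugin would interpolate over inst["ms"]
        return state

    state = dict(preset["params"])
    for inst in preset["instructions"]:
        state = apply_instruction(state, inst)

    print(state)  # {'cutoff': 0.9, 'resonance': 0.2}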


The definition is up to them. They don't want people playing around with loopholes, since the whole point of the license change was to force more people to buy license seats.

How different is this from rental car companies changing over their fleets? I don't know, this is a genuine question. The cars cost 3-4x as much and last about 2x as long, as far as I know, and the secondary market is still alive.

> How different is this from rental car companies changing over their fleets?

New generations of GPUs leapfrog in efficiency (performance per watt) and vehicles don't? Cars don't get exponentially better every 2–3 years, meaning the second-hand market is alive and well. Some of us are quite happy driving older cars (two parked outside our home right now, both well over 100,000km driven).

If you have a datacentre with older hardware, and your competitor has the latest hardware, you face the same physical space constraints, same cooling and power bills as they do? Except they are "doing more" than you are...

Perhaps we could call it "revenue per watt"?


The traditional framing would be cost per flop. At some point your total cost per flop over the next 5 years will be lower if you throw out the old hardware and replace it with newer, more efficient models. With traditional servers that's typically after 3-5 years; with GPUs, 2-3 years sounds about right.

The major reason companies keep their old GPUs around much longer now is the supply constraints.
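
A back-of-the-envelope version of that framing (every number below is invented for illustration; plug in real prices, power draw, and throughput to see where the crossover lands, and note that real comparisons also fold in cooling/facility cost per watt and the old card's resale value):

    # Toy cost-per-FLOP comparison: keep running the old GPU vs. buy a new one.
    HOURS_PER_YEAR = 8760

    def dollars_per_flop(purchase_price, watts, tflops, years, usd_per_kwh=0.10):
        energy_cost = watts / 1000 * HOURS_PER_YEAR * years * usd_per_kwh
        total_flops = tflops * 1e12 * 3600 * HOURS_PER_YEAR * years
        return (purchase_price + energy_cost) / total_flops

    old = dollars_per_flop(purchase_price=0, watts=400, tflops=100, years=5)       # already paid off
    new = dollars_per_flop(purchase_price=30_000, watts=700, tflops=500, years=5)  # hypothetical numbers

    print(f"old card: {old:.2e} $/FLOP, new card: {new:.2e} $/FLOP")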


The used market is going to be absolutely flooded with millions of old cards. I imagine shipping will be the most expensive cost for them. The supply side will be insane.

Think 100 cards to 1 buyer as a ratio. Profit for eBay sellers will be in the "handling", or inflated shipping costs.

E.g., shipping and handling.


I assume NVIDIA and co. already protect themselves in some way, either by the fact that these cards aren't very useful after resale, or by requiring them to go to the grinder after they expire.

In the late '90s, when CPUs were seeing the kind of advances GPUs are now seeing, there wasn't much of a market for two/three-year-old CPUs. (According to a graph I had Gemini create, the Pentium had 100 MFLOPS and the Pentium 4 had 3000 MFLOPS.) I bought motherboards that supported upgrading, but never bothered, because what's the point of going from 400 MHz to 450 MHz when the new ones are 600 or 800 MHz?

I don't think nVidia will have any problem there. If anything, hobbyists being able to use 2025 cards would increase their market by discovering new uses.


Cards don't "expire". There are alternative strategies to selling the cards, but if they don't sell the cards, then there is no transfer of ownership, and therefore NVIDIA is entering some form of leasing model.

If NVIDIA is leasing, then you can't use those cards as collateral. You also can't write off depreciation. Part of what we're discussing is that terms of credit are being extended too generously, with depreciation in the mix.

They could require some form of contractual arrangement, perhaps volume discounts for cards, if buyers agree to destroy them at a fixed time. That's very weird though, and I've never heard of such a thing for datacenter gear.

They may protect themselves on the driver side, but someone could still write open source drivers.


Don't they use their own socket for enterprise cards? I can't see consumers buying these cards unless they're PCIe at the very least.

Rental car companies aren’t offering rentals at deep discount to try to kickstart a market.

It would be much less of a deal if these companies were profitable and could cover the costs of renewing hardware, like car rental companies can.


I think it's a bit different because a rental car generates direct revenue that covers its cost. These GPU data centers are being used to train models (which themselves quickly become obsolete) and provide inference at a loss. Nothing in the current chain is profitable except selling the GPUs.

> and provide inference at a loss

You say this like it's some sort of established fact. My understanding is the exact opposite and that inference is plenty profitable - the reason the companies are perpetually in the red is that they're always heavily investing in the next, larger generation.

I'm not Anthropic's CFO so I can't really prove who's right one way or the other, but I will note that your version relies on everyone involved being really, really stupid.


The current generation of today was the next generation of yesterday. So unless the services sold on inference can cover the cost of inference plus training AND turn a profit, they are still operating at a loss.

“like it's some sort of established fact” -> “My understanding”?! A.k.a. pure speculation. Some of you AI fans really need to read your posts out loud before posting them.

You misread the literal first snippet you quoted. There's no contradiction in what you replied to.


Or just "everyone" being greedy

> the secondary market is still alive.

This is the crux: if a newer model comes out with better efficiency, will these data center cards have a secondary market to sell into?

It could be that second-hand AI hardware going into consumers' hands is how they offload it without huge losses.


The GPUs going into data centers aren't the kind that can just be reused by putting them into a consumer PC and playing some video games; most don't even have video output ports, and they put out FPS similar to cheap integrated GPUs.

And the big ones don't even have typical PCIe sockets; they are useless outside of behemoth rackmount servers requiring massive power and cooling capacity that even well-equipped homelabs would have trouble providing!

Don’t underestimate a homelabber’s intention to cosplay as a sysadmin, or ability to set their house on fire ;)

I wonder if people will come up with ways to repurpose those data center cards.


I would presume that some tiered market will arise where the newest cards are used for the most expensive compute tasks like training new models, slightly used cards for inference, and older cards for inference on older models or for other markets that have less compute demand (or spend less $ per flop, like someone else mentioned).

It would be surprising to me if all this capital investment just evaporated when a new data center gets built or refitted with new servers. The old gear works, so sell it and price it accordingly.


Data centre cards don’t have fans and don’t have video out these days.

I don't mean the consumer market for video cards; I mean a consumer buying AI chips to run themselves so they can have it locally.

If I could buy a $10k AI card for less than $5,000, I probably would, if I could use it to run an open model myself.


At that point it isn't a $10k card anymore, it's a $5k card. And possibly not a $5k card for very long in the scenario that the market has been flooded with them.

Ah, well, yes, to a degree that’s possible, but at least at the moment you’d still be better off buying a $5k Mac Studio if it’s just inference you’re doing.

How many "yous" are there in the world? Probably only enough of them to buy what's inside one Azure DC?

Why would you do that when you can pay someone else to run the model for you on newer more efficient and more profitable hardware? What makes it profitable for you and not for them?

Control and privacy?

You need the hardware to wrap that in, and the power draw is going to be... significant.

Print debugging is fine and all, but I find that it pays massive dividends to learn how to use a debugger and actually inspect the values in scope rather than guessing which ones are worth printing. Print debugging is also useless when you need to debug a currently running system and can't change the code.

And since you need to translate it anyway, there's not much downside in my mind to using something like msgpack, which is more compact and still self-describing; you just need a decoder to convert it to JSON when you display it.
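
For what it's worth, the "decoder" step is tiny. A sketch assuming Python and the msgpack-python package (the payload here is made up):

    import json
    import msgpack  # assumes the msgpack-python package is installed

    # Something compact logged on the wire...
    payload = msgpack.packb({"user_id": 42, "events": ["login", "click"]})

    # ...and the one-liner that turns it back into readable JSON at display time.
    print(json.dumps(msgpack.unpackb(payload), indent=2))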


> rather than guessing

I'm not guessing. I'm using my knowledge of the program and the error together to decide what to print. I never find the process laborious and I almost always get the right set of variables in the first debug run.

The only time I use a debugger is when working on someone else's code.


That's just an educated guess. You can also do it with a debugger.

The debugger is fine, but it's not the key to unlocking some secret skill level that you make it out to be. https://lemire.me/blog/2016/06/21/i-do-not-use-a-debugger/

I didn't say it's some arcane skill, just that it's a useful one. I would also agree that _reading the code_ to find a bug is the most useful debugging tool. Debuggers are second. Print debugging third.

And that lines up with the appeals to authority there, some of which are good and some of which are bad (edited to be less toxic)


Even though I'm using the second person, I don't actually care about convincing you in particular. You sound pretty set in your ways and that's perfectly fine. But there are other readers on HN who are already pretty efficient at log debugging or are developing the required analytical skills, and for those people I wanted to debunk the unsubstantiated and possibly misleading claims in your comments about some superiority of using a debugger.

The logger vs debugger debate is decades old, with no argument suggesting that the latter is a clear winner; on the contrary. An earlier comment explained the log debugging process: carefully thinking about the code and well-chosen spots to log the data structure under analysis. The link I posted confirms it as a valid methodology. Overall code analysis is the general debugging skill you want to sharpen. If you have it and decide to work with a debugger, it will look like log debugging, which is why many skilled programmers may choose to revert to just logging after a while. Usage of a debugger then tends to be focused on situations where the code itself is escaping you (e.g. bad code, intricate code, foreign code, etc.).

If you're working on your own software and feel that you often need a debugger, maybe your analytical skills are atrophying and you should work on thinking more carefully about the code.


Debuggers are great when you can use them. Where I work (financial/insurance) we are not allowed to debug on production servers. I would guess that's true in a lot of high security environments.

So the skill of knowing how to "println" debug is still very useful.


I'm seeing homes in my neighborhood sit on the market for 3-4 months before dropping prices and finally selling, about 20-25% off the original listing.

Houses here are sitting on the market, on and off, for 6 months at a time without closing. The market has all of the energy of a banana slug.

They actually started in LISP and rewrote it in Python (and also, apparently, did not pick any of the "mature" web frameworks).

http://www.aaronsw.com/weblog/rewritingreddit


Maybe it's awkwardly worded, but that's implied by the phrasing "since".

