
I am actually surprised at how good our second TV (Samsung) still looks [0] after 17 years. We inherited it from my sister, who bought it for some ridiculous amount of cash for the time. It’s heavy and runs hot, but doesn’t look any worse than cheap TVs of the same size today.

[0] https://www.cnet.com/tech/tech-industry/samsung-touches-lcds...


It very likely depends on what you are playing on it and what size it is.

Even most cheap TVs nowadays are 4k, even if the panel is low end.

There is a large difference between 1080p and 4k, which is usually quite noticeable if the TV is large, but if it is a smaller size I can see how it would be less obvious.


> There is a large difference between 1080p and 4k

I cannot tell the difference at normal viewing distances. Up close, sure.

This is how they get you to buy the 4K version: in the store you are standing 2 feet from the screen and can see the pixels at 1080p. Sitting at a normal viewing distance, 1080p looks great.
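
For what it's worth, a rough back-of-the-envelope using the common ~1 arcminute figure for normal visual acuity (numbers are illustrative, assuming a 50" 16:9 panel):

    height of a 50" 16:9 panel  ~ 24.5 in (622 mm)
    1080p pixel pitch           ~ 622 mm / 1080 = 0.58 mm
    1 arcminute                 ~ 0.00029 rad
    pixels resolvable out to    ~ 0.58 mm / 0.00029 = ~2 m (~6.5 ft)

Past roughly 6.5 feet you physically can't resolve individual 1080p pixels on a 50" set, and typical couch distance is 8-10 feet. From 2 feet away in the store, you can.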


Actually, large LCDs (>65") were pretty uncommon in 2010, but if you ever watched a 1080p DLP television, I would be surprised if you didn't notice the difference when looking at them side by side.

There is also of course the issue where people have bad internet (so Netflix or whatever destroys the bitrate, or they have the cheaper 1080p-only plan... is that still a thing?) or old cable boxes plugged into 4k televisions.

There is a lot that people can do to inadvertently destroy their image quality without knowing, which is not great.


Problem is, I am not looking at TVs side by side. I'm watching a movie or a show, and the enjoyment I get at the correct viewing distance is unlikely to be significantly more on a 50" or smaller TV.

I bought Samsung's top-of-the-line TV in 2011. The 'smart' services went offline after a year or two, which means none of the 'smart' functions work anymore, and I am now happily using it as a dumb TV.

Eventually every smart TV becomes dumb when they inevitably shut down the backend services.


> Eventually every smart TV becomes dumb when they inevitably shut down the backend services.

Except that on newer TVs all the nagging will still be there, all the ads will be "frozen" in time (mine has ads for stuff from 2023, the last time I connected it for a firmware update that _GASPS_ actually fixed some things), and some features may depend on internet connectivity. The manufacturer may care to release a final update that solves these issues, but you know they are much more likely to just fraudulently disable features that worked offline, as a last middle finger.

Repeat after me: SaaS is fraud. Proprietary digital platforms are fraud.


In 2011, smart TVs / phones were not quite the data-harvesting devices they have become.

Those services stopped working because they existed for you, the consumer; there was nothing in it for the manufacturer to keep them running.

The modern smart TV will keep working as long as it is piping data back to the data retailers; they have a vested interest in keeping it going.


One thing I don't hear people talking about very much is how AI is going to make money in any way other than cutting employment.

With the internet, and especially with the internet becoming accessible to anyone anywhere in the world in the late 2000s and early 2010s, that growth was more obvious to me. I don't see where this occurs with AI. I don't see room for "growth"; I see room for cutting. We were already connected before; globalization seems to have peaked in that sense.


That's a pretty significant way to make money though.

I do think at this stage the best analogy is offshore call centres. Yes, the excess in the market is likely because of misunderstanding about what LLMs can actually do and how close AGI is, but the short-term attraction is the labour cost savings. People may not think wages are high enough, etc., but the total cost of one hire to companies, particularly outside the US, is nothing to sniff at. And at current pricing of AI services, the maths would make complete sense for the bottom line.

I don't like it, because I ultimately err on the side of believing that even limited but significant changes to people's livelihoods will make the world a more hostile place (particularly in the current climate), but that's the society we live in.


Why not? AI-assisted shopping, for example, will boost growth. Productivity also boosts growth.

How does AI-assisted shopping create more economic activity? Even assuming you can do it, and people do find it helpful, it likely just shifts who people buy from, not how much.

How did Amazon.com create more economic activity? Same way. It just has to make shopping more efficient.

Amazon.com is a business. AI is a technology; in and of itself, it isn't a business.

Thus far it appears the applications of AI that provide 'benefit' do so by removing or reducing the need for human operators: fewer software engineers, fewer call centres, potentially removing whole areas of work such as paralegals, and in general automating away many white-collar jobs.

By far the largest use case of AI is this.


The internet is also not a business, but it displaced a lot of people while increasing productivity and prosperity.

Well, Amazon makes it take way less time to buy something (almost anything). I don't have to waste time and money going to the hardware store; it only takes 90 seconds of my time, and the price including shipping is less than what the hardware store wants to charge, with its rent and utilities and so on to cover. Amazon cut out a middleman between the warehouse and the household. It made things more efficient, and I can use the time and money savings to spend more money or work harder at my job. (I for one do spend more hours working if I have more time in my day. Not everyone is like this.)

AI could do the same thing. It could cut 90 seconds down to... 10 seconds? It doesn't seem like the same impact as Amazon, where an hour's investment became 90 seconds. And I can't see how AI shopping is going to save me money here. There's no middleman to cut out, except maybe some web site storefront?? There's also a huge downside: with Amazon, I suddenly had access to 100 different pairs of scissors to choose from, instead of the 2 or 3 I could find in Staples. That was a plus. With AI shopping, suddenly I'm down to one choice: whatever Chat chooses for me. If I want to have a say in which pair of scissors I buy, I'm back to shopping for myself.


You've hit the nail on the head.

AI use cases do not appear to be of the type that unlock NEW capabilities.

The main use cases in AI are not about wealth creation but about saving existing wealth (largely through increased automation of human operators).


Yeah, can someone explain how you get rich by making the previously-employed workforce unable to afford your products?

"There is no good outcome for the 99% because of AI" should be talked about more, but the media is owned by the 1%.

Either the bubble bursts and everyone's retirement funds take a hit, 2008 style,

Or a decent chunk of the workforce becomes unemployed and unemployable.


There are a few areas where I have found LLMs to be useful (anything related to writing code, or as a search engine), and then just downright evil and upsetting in every other instance, especially as a replacement for human creativity and personal expression.

Not only am I not shocked, I am in no way offended by who is doing the "stealing".

Are they rejecting the driver because it is open source? There are specific features I use with my AMD card that require closed proprietary driver add-ons, AMF for example.

Not defending the HDMI Forum here, but perhaps Valve / AMD have a way of including a proprietary blob in SteamOS (I don't think most gamers would care).


> Valve strictly adheres to open-source drivers, but the HDMI Forum is unwilling to disclose the 2.1 specification.

I work in enterprise UI, where slight color shade differences between releases can cause an uproar. I cannot imagine the thought process behind Liquid Glass in any sense.

OS X's Aqua was also an insanely bold UI with a lot of gimmicks, but was still usable for the most part. I'm so very curious about the internal discussions around this.


It was a different time, too. I remember being starstruck after seeing the UI. Windows looked overwhelmingly grey in comparison.

I do think Siri is particularly behind, but they were behind long before the AI craze. I also understand you cannot simply make Siri “be smart” with an LLM without all kinds of consequences and edge cases to deal with.

It’s not the same, but PMs and VPs at my company think we can vibe code our way out of migrating a 1.6 million line codebase to a newer language / technology. Or that our problems can be solved by acquiring an AI startup, whose front end looks exactly the same as every other AI startup’s front page, and slapping a new CSS file that looks like that startup on top of our existing SPA because their product doesn’t actually do anything. It’s an absurd world out there.


I was going to say this. It's not a well-balanced map gameplay-wise, but the grand scale of it back in the day was very awe-inspiring.


I think most UT99 maps were balanced for chaos. Different types of chaos depending on the mode and the map. But they were genuinely all great. I don't remember hating any map in particular.


That’s kind of what I miss about classic shooters. No rankings, no matchmaking — just random jank in service of pure pubbie fun. Don’t like the map? No worries, we’ll cycle to a new one in a few minutes. (Or hop on over to a different server.) Versus CS just being endless rounds of perfectly balanced de_dust2 these days. “Competitive,” sure, but freakin’ boring!


It really was all geared towards having fun. And the fact that all the weapons were available on the maps rather than being preselected made the game so easy to play.

There was no wasting time figuring out what style of play you wanted to go for. You just picked a game mode and went for it.

And as you said, maps were changing very quickly depending on how the game mode was set up.

Plus, so many exploding bodies


This is the one that I still fire up with my friends once in a while. The only problem is it sort of lacks the charm and nostalgia, and it's a bit rough around the edges in parts (the gun selection is fairly abysmal and the guns don't feel right IMO).


My NAS:

1) A refurbished Dell Wyse 5070 (8GB of RAM) with a cheap 64GB SSD from 2013

2) An 8-bay USB-C hard drive enclosure

3) 4 used 12TB hard drives from eBay, plus 4 3TB drives from 2010 that still somehow haven’t died

4) A headless Debian install with various Linux ISO trackers / downloaders in Docker, plus Plex (the CPU, though slow, has decent hardware encoding)

5) No RAID, but an rsync script that runs weekly across different drives for important Linux ISOs and other important data (roughly the sketch below). I also keep cold-storage backups by purchasing “Lot of X number” 500GB hard drives on eBay from time to time, which store things like photos and music across two drives each.

The whole setup didn’t cost me much, and is more than enough for what I need it to do.
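
In case it's useful to anyone, here is a minimal sketch of what that weekly rsync job can look like. The mount points and paths are made up for illustration, not my actual layout:

    #!/bin/sh
    # Mirror important data from one drive to another.
    # -a preserves permissions/timestamps; --delete keeps the mirror
    # exact (drop it if you want deleted files to survive on the copy).
    SRC="/mnt/disk1/important"
    DST="/mnt/disk2/backup/important"

    rsync -a --delete "$SRC/" "$DST/" \
      && echo "$(date): backup OK" >> "$HOME/rsync-backup.log"

    # Scheduled weekly from cron, e.g.:
    # 0 3 * * 0  /usr/local/bin/weekly-backup.sh

Note the trailing slashes matter with rsync: "$SRC/" copies the contents of the directory rather than the directory itself.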


No RAID? Have you considered using SnapRAID (+ mergerfs if you fancy)? Seems like an ideal use case for the solution.
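
If you go that route, a minimal snapraid.conf is pretty approachable. A sketch under assumed mount points (disk1-disk3 as data drives, disk4 dedicated to parity):

    # /etc/snapraid.conf
    # One drive holds the parity file
    parity /mnt/disk4/snapraid.parity

    # Content files (the array's metadata), kept on multiple drives
    content /var/snapraid.content
    content /mnt/disk1/snapraid.content
    content /mnt/disk2/snapraid.content

    # Data drives to protect
    data d1 /mnt/disk1/
    data d2 /mnt/disk2/
    data d3 /mnt/disk3/

Then a periodic "snapraid sync" (plus the occasional "snapraid scrub") gives you single-drive-failure protection and bit-rot detection, and the drives stay independent filesystems, unlike traditional RAID.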

