Is there any kind of “hook up” on wholesale large dumb displays?
I know I’m preaching to the choir, but I just want a giant dumb display from my Apple TV. I vaguely remember someone posting a link to tvs restaurants use but I don’t remember exactly what or if it was what I’m looking for.
I wonder about this every time I see a smart TV-related thread on HN. I recently purchased an LG OLED (C5 48") because my old TV died so I'll finally comment. As others have said, just don't connect it to the internet. But you knew this already, so I'll provide my anecdote on the experience of this since I wondered the same thing for years before getting this TV.
When the TV is never connected to internet, and you use a single HDMI source like me, the TV acts completely like a dumb TV. It gets turned on via my AppleTV remote and displays the picture 1-2 seconds later. No LG logo (I disabled this), and no smart interface shown whatsoever.
If you want to change settings, you can display the settings interface via LG remote control and it generally acts like a dumb TV (not blocking the entire screen, so you can adjust picture quality and see the result as expected).
I've had the TV for about two months and never been asked to update it or shown any ad. The only time I've ever seen the smart fullscreen interface is when you unplug a live HDMI source and the TV detects that nothing is there. (If you turn the source off, it tells the TV to turn itself off as well.)
Hope this helps since it's a lot easier to buy a nice smart TV and do it this way than find a truly dumb commercial panel.
They could also make agreements with ISPs where their TVs are whitelisted for access to a public or even unlisted WiFi network, enabling them to connect that way without the vast majority of customers ever being aware.
Similarly, these TVs could connect to any open WiFi hotspot they can find and phone home/download updates that way. Cox, for example, proudly boasts that more than 4M of its residential customers' modem+router+AP combos can be used as "WiFi Hotspots" by anyone with a Cox account, not just the customer/resident. I don't see why Samsung or any other manufacturer wouldn't approach said ISPs to use this network to update devices under some guise of "convenience" or "seamless updates", ostensibly for their less tech-savvy users.
I don't know if these business deals exist, but "smart devices" will often try to phone home/update any way they can, even if you never manually configure a network connection.
Mine is a Vizio from Target that's never been online. I've gotten close to cutting its wifi antenna circuit to prevent this but I think I got it before they started programming anything like this in and I should be safe if it stays offline.
But then I still think about cutting it in case I ever have anyone over that would be stupid enough to sign in to the wifi on it. Better for it to have never happened.
Totally correct and a good call out. I did check this as best as I could for this particular model of TV. But I'd have to do the same in a few years if it was ever to be replaced. I suspect I'll have to desolder the cellular module of my next TV circa 2036...
Most are also larger, heavier, with higher power consumption, and sometimes uncomfortably high minimum brightness. They rarely use the same panels as retail models because they have to support different operating conditions like extreme temperatures and 24/7 operation.
> the cost was double because the target market is "ad agencies" or whatever.
A TV capable of operating in those conditions has to be more expensive, or else it'll need replacing twice as often and cost even more long term. Remember when Tesla used bog-standard laptop screens in their dashes because they were cheaper than automotive grade, leading to a high failure rate?
This makes me wonder if my local McDonald's, which has three big screens mounted vertically in the drive-thru, ended up with the non-commercial-grade ones. They're cooking in the sun in a hot climate all day, so they fail and turn into flickery messes, and they seem to be on a cycle of roughly 3 months newly replaced and working, then a year of flickering.
Yeah, if you want a TV that looks terrible. Those signage panels usually have terrible response times and chase nits at all costs. Try watching anything HDR on one.
Many (most?) "smart" TVs will work fine if their network connection is never set up. Many of those can be set to wake on an HDMI signal from, for example, an outboard streamer box. That means you can take advantage of the subsidy paid by the bloatware that comes inside your TV, for a price that I bet is coincidentally close to the same as the price difference of an unsubsidized dumb TV.
I wondered this as well when I was shopping for a new TV a few years ago. Unfortunately, not only is it really difficult to find a "dumb" modern TV, but the best display panels will always be on smart TVs because that's what sells.
That said, I ended up getting a Sony A95K 55" TV, and it's been great. It has Google TV built-in, but I immediately connected our Apple TV to it, and it's never seen the Internet since. No nags, either. Sony also made it really easy to disable motion interpolation, a feature I really dislike.
Dell has the P5525QC, a 4K 55 inch screen. Here in Denmark they sell it for 8846 DKK (~$1300 USD). I use a predecessor with my Apple TV and it works great.
Just use any TV but don't log into your WiFi or connect an Ethernet cable. It sounds like that won't work with these Vizio TVs, but they're likely junk anyway. This is what I do with my Sony and LG TVs in my house and they work fine as dumb displays attached to my AppleTV box.
I've said it before on HN, but I just want a somewhat trustworthy group to develop a "DUMB" certification. I think enough people would pay extra for a certified DUMB TV for it to be worthwhile. "Don't Upload My Bits"
I wonder if a "woot" style service could work. If 10K like-minded consumers made a group-buy every 2-3 years, a high-end panel vendor might be willing to provision a new SKU with a few firmware tweaks.
For a while, Costco had a reputation as the place where you could buy a TV and be confident that it was usable as a "dumb" TV. The rumor (unconfirmed as far as I know) was that, among the customizations that manufacturers would make for retailer-specific models, the Costco ones included firmware tweaks to pull back on requirements for things like mandatory connectivity, account creation and the like.
I'm not sure how true any of that is, but in any case Costco still has a reputation as a place where it's easy to return a TV, and they pay attention to the stated reason for return.
Somehow the smart ones always seem to be cheaper than these dumb displays. For some mysterious reason, all the chips and stuff needed to run an OS have a negative cost.
We’re in a personal software era. Or a disposable software era, however you want to look at it. I think most people are building for themselves and no longer need to lean on a community to get a lot of things done.
I think this is right. I can now get something built for my own use that I’d have given up on before; getting it to the point of being usable still doesn’t make it shareable.
It is annoying. But I bet they’re effectively being DDoS’d every day by AI agents now. I think the past year of that growth has destroyed any of their spare resources. It’s not really a scale problem I’d like to work on, tbh. Seems very hard.
Wait - are you missing all the context on this? Anthropic pushed back hard against this; there was a whole back and forth. I'm on mobile and can't look it up for you atm, but if you google this scenario, Anthropic definitely comes out of it looking a lot better than OpenAI and xAI.
The highest in the industry for API pricing right now is GPT-5.4-Pro. OpenRouter adding that as an option in their Auto Router was when I had to go customise the routing settings, because it was not even close to providing $30/m input tokens and $180/m output tokens of value (for context, Opus 4.6 is $5/m input and $25/m output).
(Ok, technically o1-pro is even more expensive, but I'm assuming that's "please move on" pricing)
I’ve been tempted to buy one and do “real dev work” on it just to show people it’s not this handicapped little machine.
I built multiple iOS apps and went through two startup acquisitions with my M1 MBA as my primary computer, as a developer. And the Neo is better than the M1 MBA. I edited my 30-45 minute 4K race videos in FCP on that Air just fine.
> I built multiple iOS apps and went through two startup acquisitions with my M1 MBA as my primary computer, as a developer. And the Neo is better than the M1 MBA. I edited my 30-45 minute 4K race videos in FCP on that Air just fine.
Before I was a professional software developer, I used a scrawny second-hand laptop with a Norwegian keyboard (I'm not Norwegian) because that was what I could afford: https://i.imgur.com/1NRIZrg.jpeg
This was the computer I developed PHP backends and jQuery frontends on, and where I published a bunch of projects that eventually led to me getting my first software development job, in a startup, and discovering HN pretty much my first day on the job :)
The actual hardware you use seems to me like it matters the least, when it comes to actually being able to do things.
I still manage and develop my PHP/jQuery SaaS product on a 2011 27" iMac running Linux Mint, with an SSD being the only upgrade. Runs better than most new Windows machines. No complaints.
I switch between Thinkpad T420s and PineBook Pro for all the hobby work.
The T420s has loose USB ports and its power socket is almost falling off, so I plan to replace it with a five-year-old T14 G2 in the coming months.
I can afford the latest MacBook, but I'd rather not generate more e-waste than there already is. More importantly, I feel closer to my users, and my code stays efficient and straight to the point.
My non-hobby laptop is an old cheap Dell from 5-6 years ago.
The best laptop I ever had was a maxed-out Thinkpad P7x, and it came with the most meaningless job ever.
I can only compare that job to the one at a unicorn that gave me the latest and greatest MacBook. Not only was the job meaningless, the whole industry made no sense to me.
I started my business back in 2006 with an ancient 306 laptop - it was practically free, it ran VIM just fine, and that was all I needed it to do to crank out PHP until the cows came home.
My substantially more privileged, but somewhat equivalent, experience was doing mobile app development, Docker, Linux VMs, and UI design, and finding out about Hacker News, on an 11 inch MacBook Air with 4GB of RAM.
i have a computer that benchmarks literally 10x faster and with 32x the amount of RAM, but i miss that little thing that helped me build my career from nothing
My dad spun up my Pentium Deschutes (400MHz!) machine the other day. Same hard drive from when I was 10 years old. “clouds.psd” was on the desktop.
I still remember retiring that computer. The first thing I did when I got my Pentium IV chip a year later was download Macromedia Dreamweaver. Did me well.
I wrote 99% of a large PHP app on a six-year-old laptop with a single 17" LCD. Meanwhile, at my desk, I had a Dell workstation with 3 monitors at the time, but it was easier to squirrel away in a corner somewhere, undisturbed.
After all, the actual server ran the code, I just needed text editors, terminal windows, and web browsers.
Your hardware matters quite a bit if you're doing lower-level things and the architecture is not the same as the one you're developing for. But apparently HN is all web devs.
Unless you want to insmod things in your main kernel like a cowboy, I don't see why you'd need architectures to match. Cross compilation is the proper way (for some architectures it would be quite hard to find a machine capable of compiling the kernel before the heat death of the universe...)
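For the curious, cross-compiling is mundane in practice: a kernel cross-build is just "make ARCH=arm64 CROSS_COMPILE=aarch64-linux-gnu-", and the same idea works for userspace. A minimal Python sketch, assuming the aarch64-linux-gnu-gcc toolchain is installed (the gcc-aarch64-linux-gnu package on Debian/Ubuntu) and, optionally, qemu-user to run the result:

    import os
    import subprocess
    import tempfile

    # A trivial C program to build for aarch64 on an x86_64 host.
    # Raw string so the \n reaches the C compiler untouched.
    C_SOURCE = r'''
    #include <stdio.h>
    int main(void) { printf("hello from aarch64\n"); return 0; }
    '''

    def cross_compile(source, output="hello-aarch64"):
        """Compile C source for aarch64 without any aarch64 hardware."""
        with tempfile.NamedTemporaryFile("w", suffix=".c", delete=False) as f:
            f.write(source)
            src = f.name
        try:
            # -static so the binary doesn't need the target's libc at
            # runtime, which makes it easy to test under qemu-aarch64.
            subprocess.run(
                ["aarch64-linux-gnu-gcc", "-static", "-o", output, src],
                check=True,
            )
            print(f"built {output}; try: qemu-aarch64 ./{output}")
        finally:
            os.unlink(src)

    if __name__ == "__main__":
        cross_compile(C_SOURCE)

You never need the target architecture to compile; you only need it (or an emulator) to run the output.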
I just spent a vacation deciding not to bring a laptop, and instead used my Android phone (a Galaxy S22) with an HDMI adapter and a Bluetooth travel keyboard. Plugged it into the TV in our accommodation and had a lot of fun.
Running neovim on termux was fine. Developing elixir was no problem, the test suite took 5s on my phone, and takes 1s on my laptop. Rust and cargo compiling was slow enough that I didn't really enjoy it though.
It meant I could pack up instantly, and I could have an agent run review workflows from my pocket while I was out and about, without noticing a big battery hit.
I brought my 13 inch MacBook Pro to Japan for three weeks last month for photo editing. I was able to pack up immediately by slipping it into my backpack's laptop pocket.
Not sure what the difference is other than weight, but I wasn't carrying it day to day when I could leave it in my hotel room.
Exactly that, I have too many ideas for side-projects and never enough time for them.
The main activity was still the traveling, hiking and enjoying some calm time. But instead of spending the usual downtime reading or something else, I had a blast coding and experimenting.
According to Google AI:

> A vacation (American English) or holiday (British English) is a designated period of time for rest, recreation, or travel, often taken away from home. It involves a break from work or school routines, usually lasting several days or weeks. Vacations are crucial for mental health, reducing stress, and fostering better relationships.
So maybe different meaning for everyone. For me it’s getting away from technology and into nature.
> a break from work... are crucial for mental health
When I'm hacking on my Linux desktop automation scripts on my free time, I can assure you that my good mood is positively contributing to my mental health.
It's starting to show its age, but I've been using a 2019 MacBook Pro with the Intel chip and 16GB of memory. Still handles multiple terminal sessions with Claude Code and Codex simultaneously, building in Xcode, running Docker in the background, etc.
(Maybe the fans sometimes sound like they're a jet engine taking off…)
Finally just put an order in for a new 16" MBP M5 Max with 48GB memory only because it looks like they're going to stop supporting the Intel stuff this year and no more software updates. It'll probably be obsolete in six months with the rate things are going, but I've been averaging seven years between upgrades so it should be good!
Oh my. All I have to say is cherish the first week of your M* experience. :D When I got rid of my intel MBP (it was an i7) for my MBA it was astonishing how fast and smooth it was.
I agree. It was utterly ridiculous how noticeable the improvement was. I was doing Z3 solving for the ICFP contest the first couple of weeks after getting the M1 Air, and it was consistently smoking my teammate's maxed-out i7 MBP.
Sort of. They have no "hands"; LLMs can only respond that they want to execute a tool/command. So they do that a lot: read files, search for things, compile projects, run tests, run other arbitrary commands, fetch stuff from the internet, etc.
Obviously the LLM inference is super heavy, but the actual work / task at hand is being executed on the device.
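A minimal sketch of that loop, with a stub standing in for the hosted model (the stub and its message format are invented for illustration; real clients speak the provider's tool-calling API):

    import subprocess

    def fake_model(transcript):
        """Stand-in for the hosted LLM call. A real client would send the
        transcript to the provider's API and get back either plain text or
        a tool request; the model itself never touches your filesystem."""
        if not any(t.startswith("tool output") for t in transcript):
            return {"tool": ["ls", "-la"]}  # "let me look around first"
        return {"done": "It's whatever 'ls' just showed us."}

    transcript = ["user: what's in this project?"]
    while True:
        step = fake_model(transcript)
        if "done" in step:
            print(step["done"])
            break
        # The heavy inference happens server-side, but the actual work
        # (reading files, compiling, running tests) happens right here.
        result = subprocess.run(step["tool"], capture_output=True, text=True)
        transcript.append("tool output: " + result.stdout)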
I use a 2015 MacBook Pro all the time--like right now. It does have 16GB of memory. It's what sits on my dining room table where I do most of my writing/browsing and which I take for travel. I do have an Apple Silicon MacBook Pro in my office but my downstairs "office" is a lot lighter and airier.
> I use a 2015 MacBook Pro all the time--like right now.
I have a 2010 MacBook Air that I still use when traveling.
The battery is completely shot, but it works fine when plugged in. And if I'm on the road, I don't use my computer until I get to the hotel anyway. And even then, it's just fine for e-mail, browsing, and even Photoshop.
I think this one had a battery replacement because it was bulging. But it's definitely in the class of devices that, if it gets swiped or lost, is basically in the <ehh> category as opposed to my newer one.
I'm probably giving a newish iPad and magnetic keyboard a spin on my next trip, mostly to see how it goes.
I have one of these (it's my only Mac), but it only has 2GB of RAM, so it's kinda rough. I tried Mint on it, but IIRC it might not have the GPU drivers? I just bought it a new SSD which helped a bit.
Former employee of mine had the 2019 MBP as well. After a few years he had the same problem with the fans -- if you haven't already, pop it open and clean the fans and vents. You'll probably need a little brush along with compressed air. Lots of stuff comes up on Google. Great machine btw. Good luck!
I was using an M1 Mac Mini with only 8GB of RAM to build iOS apps for maybe a year. It's absolutely doable, though it very noticeably gets a little less snappy when building projects. When building in Xcode and then switching to Firefox to browse, for instance, I could tell it took slightly longer to switch tabs, and YouTube playback would occasionally stutter if too much was happening.
I was also using an Intel MacBook Pro with 16GB at the time. Doing the same thing there was much smoother and snappier. On the whole, it actually made me want to just use the laptop instead, since it "felt" nicer. (This isn't measuring build times or anything like that, just snappiness of the OS.)
I'm glad enough people got M1 MacBook Airs now that the broader sentiment within the commentariat is changing and people are pushing back on the dismissals.
8GB has ALWAYS been fine on Apple Silicon macOS. RAM usage on a fresh boot is a meaningless statistic (unused RAM is wasted RAM). And they're just plain capable!
People usually forget 8GB isn't 8GB. Memory compression means you can store roughly 2x (lz4) to 3x (zstd) as much data in memory as you ordinarily could. And in the worst case, reading swap back from disk is much faster with NVMe SSDs (writes matter less, since they can happen asynchronously in the background).
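A rough illustration of why that works, using only Python's stdlib zlib: typical heap contents (source text, structured records) are highly redundant. macOS's real compressor operates on memory pages with much faster algorithms than zlib, so treat the ratios as ballpark only:

    import inspect
    import json
    import zlib

    # Stand-ins for real in-memory data: module source text and
    # structured records, both typical of what sits in app heaps.
    text = inspect.getsource(inspect).encode()
    records = json.dumps(
        [{"id": i, "name": f"user{i}", "active": i % 2 == 0}
         for i in range(5000)]
    ).encode()

    for label, buf in [("python source", text), ("json records", records)]:
        ratio = len(buf) / len(zlib.compress(buf))
        print(f"{label}: {len(buf)} bytes, ~{ratio:.1f}x compressible")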
The worst corner they cut is no keyboard backlighting. That saves them what, $1 of BoM per MacBook Neo? Especially because now they have to set up an entirely new keyboard production line instead of just piggybacking off of the Air keyboard line.
Thank you so much for this! At least now I can see the worst offenders. How did you even find this tool? The internet has almost no records of it. Amazing.
That still does not explain why this balloons over time. I.e., if I restart my Mac right now and reopen the exact same apps with the exact same windows, WindowServer will take 80% less memory.
I just retired my M1 Air to being a server this month. They’re very capable laptops. If the Neo is even comparable in spec, it’s excellent for the price.
My M1 Air with a 1TB SSD and 16GB of RAM is a little champion. I use it during travel to play indie games like Hades II or Slay the Spire, and it works really well, better than my Steam Deck, which broke. The only issue it really has is when I try to plug it into my docking station: it struggles mightily with two 2K screens and a 4K screen, so I just use my desktop in that case.
I am jealous of my wife’s 13” M5 iPad Pro though, that oled screen is gorgeous, a wonder of modern engineering.
I set up a self-hosted runner and use it in my CI workflows. Then I disabled sleep so it can clamshell forever, and now it sits here in my living room silently workin' https://imgur.com/a/EaBICdo
And, presumably, because the Mac build (and hardware) is of niche interest and sits outside the standard Linux workflows, so it's annoying to administer. Plus it serves a money-making audience (iOS app devs) who have a revenue stream and see the extra CI cost as worth it.
I have an older 8GB MacBook Air. This is false. I routinely have Slack, Chrome, iTerm, Visual Studio Code, and more open on it. It’s fine.
Those apps don’t need every single byte of memory you see in Activity Monitor to be active in RAM all of the time. The OS swaps out unused parts to the very fast SSD. If you push it so far that active pages are constantly being swapped out as apps compete then you start to notice, but the threshold for that is a lot higher than HN comments seem to think.
It really isn't. It's a capable machine, but modern software has made it a lemon. And that is the only reason Apple sells it: so that whoever buys it needs to buy another one prematurely, generating another sale.
Everything from apple to modern software is rotten to its core.
…in reply to someone who just said their experience is fine, and included details. If you just want to rant about Apple, have at it, but you’re going to have to do better than “nuh, uh” if you want to be convincing.
Well I could say that it isn't enough for vscode alone. And I'd be right. It all depends on how and what you use vscode for.
8GB really shouldn't be an option in 2026, it is just shortsighted and an insanely uneven build.
I could rant about Dell too. Or most other manufacturers (surprise, greed isn't apple exclusive). But Apple at least tries to keep the appearance of a higher profile.
> Well I could say that it isn't enough for vscode alone. And I'd be right. It all depends on how and what you use vscode for.
Fair enough; though experience says 8GB will run VSCode, it would very much depend on the use case, I agree. OTOH, I would argue that anyone working VSCode that hard probably isn’t buying 8GB machines, but OP did say they’re running it, so it’s up for discussion.
I’m sick to death of this. It’s so divorced from reality in 2026 that I see it as a lowest-common-denominator populist political catchphrase more than a legitimate contribution to any conversation. My min-spec MacBook Pro from 6 years ago doesn’t flinch at this, and it barely flinches at a whole lot more.
Can we please just move on? Maybe get your hardware checked if you’re legitimately still having these issues.
I've been finding it hard to wean myself off the standalone app but another major reason to do so is opening threads in separate tabs. I find as soon as I'm involved in two or more conversations on there it's super easy to start losing track of things.
I am talking from experience with an M1 and 8GB of RAM. I had to restart either the browser or its YouTube tab processes at least once every couple of days to stop the whole system from lagging.
I could have two browser windows open in the late 1990s. I have about a thousand times as much RAM now. So even with 10x more bloat in the pages, I should be able to open 200 tabs just fine.
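The back-of-the-envelope version, with the growth factors as stated:

    # ~2 browser windows in the late 90s, ~1000x the RAM today,
    # and assume each page got 10x heavier in the meantime.
    windows_then = 2
    ram_growth = 1000
    bloat_growth = 10
    print(windows_then * ram_growth // bloat_growth, "tabs")  # -> 200 tabs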
The argument is misrepresented - I think it's about frustration and convenience, not achievability.
I developed some work that keeps tens of thousands of people alive every day on a $100 Acer netbook almost 15 years ago. The tools are always there, I don't think anyone thinks the work is actually impossible to do on a limited machine.
I wrote a fix for node that got upstreamed a few years ago on a Lenovo Thinkpad 3 Chromebook. I'm actually commenting from it now. It's not a workhorse by any means, but for $99, it's not bad. A 1.1GHz Celeron processor with 4GB of memory is able to compile projects like node, python, Erlang, etc. without much hassle. It just takes a lunch break :)
Any modern Mac is more than capable. I had the baseline M1 Macbook Air that I did work on as well, just to see how that fared. Much better than this machine - 10x the price, but more than 10x the performance. This one is great as a "I don't mind if I break it or lose it" device.
I was doing Android development and Verilog synthesis on a mobile Nehalem i5 in 2020. That machine is still totally adequate for anything a "normal person" does with their computer, provided they have good tab hygiene. The reality is that (unless you play video games and/or you want local LLM inference) the demands people place on their computers haven't changed significantly in at least 10 years.
Oh that made it seem like I was the driving factor. Maybe for the first one (Percy.io) I can claim a large part of that success (owning the SDKs and support end to end).
For the other, I just owned the front-end infra and was on the growth team. The rest of the folks were the stars on that one.
Edit: I guess I brought that up because I don't know any more "real work" than that, ha. What is "real work"?
> I’ve been tempted to buy one and do “real dev work” on it just to show people it’s not this handicapped little machine.
But... you can do the same exercise with a $350 windows thing. Everyone knows you can do "real dev work" on it, because "real dev work" isn't a performance case anymore, hasn't been for like a decade now, and anyone who says otherwise is just a snob wanting an excuse to expense a $4k designer fashion accessory.
IMHO the important questions to answer are on the business side: will this displace sales of $350 Windows machines or not, and (critically) will it displace sales of $1.3k Airs?
HN always wants to talk about the technical stuff, but the technical stuff here isn't really interesting. The MacBook Neo is indeed the best laptop you can get for $6-700.
But that's a weird price point in the market right now, as it underperforms the $1k "business laptops" (to avoid cannibalizing Air sales) and sits well above the "value laptop" price range.
No, you can't do real work on a $350 Windows machine. No way is such a setup suitable for anything beyond browsing a tab or two and connecting to servers over SSH.
And the whole shittiness of the experience will distract you from attempting real work: the horrible touchpad, the bad screen, the forced Windows updates when you're trying to start the machine to do something urgent, ads in Windows, the lack of proper programmability in Windows (unless you use WSL)... Add the fact that the toy is likely to break in a year or two. These issues exist on far more expensive Windows machines, let alone a $350 one.
Leaving Windows machines and the OS behind more than a decade ago has been a continuing breath of fresh air. I have several issues with Apple devices and macOS (as I have with Linux too), but on the whole they are far better than Windows. The only good things about Windows that I miss on Macs are the file explorer and window management; not sure why Apple stubbornly refuses to copy those.
A lot of $350-ish Windows machines also don’t have SSDs but instead eMMC storage, which is dog slow and will make modern SSD-mandatory Windows feel even more awful to use.
If Windows/Linux/x86 is non-negotiable and that’s your budget, I would never in a million years recommend anything brand new. This is when you go pick up a $350 used midrange ThinkPad on eBay. It won’t outperform a Neo in terms of CPU and battery life but I guarantee it’ll be a better experience than the garbage routinely sold at this price point.
Of course you can. You can do real work on an $80 Amazon Fire. Yes, some things will be potentially impossible or frustrating but that's also true of the MacBook Neo, just a bit higher of a bar. A lot of this also depends on your definition of "real work".
$350 USD can get you a decent laptop with an SSD, 16GB of RAM, and something like an Intel N100 or N95. And those are pretty comparable to a decent Intel Skylake CPU, which is still quite usable.
Yes, the Neo has a faster CPU, but it also has less RAM, less storage, and fewer ports, and costs more. Besides ray-traced games, what can the Neo do that the others can't? They'll take longer, but they'll get there.
And if you're willing to go used? That $350 goes a lot further.
> Yes, the Neo has a faster CPU, but it also has less RAM, less storage, and fewer ports, and costs more.
8GB on Apple Silicon is far better than 16GB on Wintel, and I don't even trust the quality of 16GB of RAM in a bottom-of-the-barrel Windows machine.
Would you prefer a machine that is still good 7 years from now with less ports, or one with more ports that you have to replace in 2 years? Yes it is more expensive now, but over 7 years it is an absolute bargain.
16GB of physical RAM is just better. Apple isn't magic. Gimme a break. Both devices have SSDs for fast swapping and have RAM compression. You can't spin up a VM with 8GB of RAM on the Neo; you can't load a large spreadsheet or do a decently sized digital painting. I could maybe buy a claim that 8GB is better on a Mac than 8GB on Windows.
Why would you have to replace it in 2 years? How do we know Apple will even be offering updates to Neo in 7 years? Will 8GB still be usable in 7 years really? 8GB is barely on the fence already.
I wouldn't be surprised if Apple drops the Neo from software support in less than 7 years.
The ThinkBook 14 Gen 6 at Costco for $380 has a single-thread PassMark score of 2800. The laptop I use to develop most of my SaaS products, with IDEs and Claude open, etc., has a score of 2000. I run Linux, but Win10 IoT runs fine on it too.
> No, you can't do real work on a $350 windows machine.
Sigh. I mean, even absent the obvious answers[1], that's just wrong anyway. You're being a snob. Want to run WSL? Run WSL. Want to run vscode natively? Ditto. Put it on a cheap TV and run your graphical layout and 3D modelling work. I mean, obviously it does all that stuff. OBVIOUSLY, because that stuff is all cheap and easy.
All the complaining you're doing is about preference, not capability. You're being a snob. Which is hardly weird, we're all snobs about something.
But snobs aren't going to buy the Neo either. Again, the business question here is whether the $350 junk users can be convinced to be snobs for $600.
[1] "Put Linux on it", "All of your stuff is in the cloud anyway", "It's still a thousand times faster than the machine on which I did my best work", etc...
You mean that machine from 30 years ago that was running 30-year-old software that has nothing in common with today’s development? And how well does Linux run on 4GB?
That's a 16GB Windows box that will happily run multiple VMs for whatever your deployment environment is, something the Neo is actually going to struggle with. The Jasper Lake CPU is indeed awfully slow, but again, for routine "dev" tasks that's just not a limit.
You would obviously refuse out of taste, but if you were actually forced to use this machine to do your job... you absolutely could.
It would have been a better fit for me than the M4 Air; I literally use it only for typing and browsing, plus a couple of Mac-only tools. Brilliant machine, but complete overkill for me. It's almost tempting to switch just to get rid of the display notch.
I'm still doing iOS dev on my 2020 M1 MBP, and it's fine! I expect that if I change out its battery and apply new thermal paste, it would run for another 6 years.
Better in terms of raw specs. The original M1 Air also came with 8GB of RAM, and the A18 Pro in the Neo is faster than the version of the M1 that shipped in the base model Air
All of FastComments is/was built on an 8th gen i7 from 2017
Using older hardware has helped me not accidentally build slow stuff. Although at some point I gotta upgrade and just add more performance tests :) but nothing replaces feeling it yourself.
Most dev workflows from pre-2021 can probably run just fine on a Neo - I think once you get into conductor / 8-terminals-with-Claude-Code territory, that's where things start to slow down.
I just got an M5 Max with 128GB of RAM specifically to run local LLMs.
Claude Code still runs things on your local machine. So if you have some pretty expensive transpilation, or dependency-tree resolution that needs musl recompilation, or are doing something in Rust, you still need a reasonable amount of local firepower. More so if you're running multiple instances.
> just to show people it’s not this handicapped little machine
I used to think this way about Apple, and it's jarring to read now, with it 10-15 years behind me.
It reads as aggro and oddly tribalistic / sports fan-y.
(What people? Who thinks it's slower than an M1? Who thinks you can't code on it? What will your coding on it prove to these people that the benchmarks they read can't? And with all that, why get so invested that you're buying a machine you don't want to use day to day? What does "handicapped" mean in this context?)
Only sharing b/c I never understood why people would roll their eyes at me, and apparently I finally reached my own graybeard moment, and I am now rolling my eyes at both of my selves :)
It’s fine if you don’t have any memory-hogging apps. But as soon as you fire up a couple of demanding Docker containers, you’ll feel the pain. 8GB isn’t that much RAM for some applications.
Why do you think people buying the cheapest MacBook available will be running Docker? Do you commonly run Docker containers on the cheapest Windows laptop available? Why not?
I dunno, I'm more afraid of the syndrome where people seemingly stop reading after a word or two, instead of reading/listening to the full context and then understanding that maybe they'll explain what they mean by that later on.
Besides, it's almost starting to feel like something LLMs cannot replicate as easily. It's a bit harder to coax commercial LLMs into being "mean" towards people, so if someone is strongly worded and opinionated, it almost makes the comment feel more human-like. But I digress.
If someone called Apple customers idiots, would they get banned for breaking HN rules? After all, they were the ones who kept buying generations of overpriced, overheating Intel garbage with keyboards that would break from a speck of dust. Would that not also be the truth?
Then why is it OK to call other people, Mac Neo skeptics in this case, idiots?
>If someone would call Apple customers idiots, would they get banned for breaking HN rules?
No. There's no shortage of comments of HN that do exactly that.
In fact, you just implied exactly that regarding them buying "overpriced, overheating Intel garbage with keyboards that would break from a speck of dust".
Been using Wintel and Mac laptops for 30 years, and, crap keyboard aside (itself an overblown concern; I had those for 5 years and never got a stuck key), my Intel MBPs were still great laptops, and nothing in the Wintel world was as good except some Lenovo machines (which were good in ways the Mac laptops weren't, and vice versa).
>Then why is it OK to call other people, Mac Neo skeptics in this case, idiots?
> There's no shortage of comments of HN that do exactly that.
Then you should have no shortage of examples to show as proof.
>In fact, you just imply exactly that
I didn't call anyone an idiot. The implication is your interpretation. Meanwhile, the comments I was replying to directly called people idiots. If you can't make that distinction, I don't know what else to say.
>For the same reason the previous case is.
The previous case isn't. You get banned if you call people idiots on HN. Double standard, hypocrisy.
>Then you should have no shortage of examples to show as proof.
I'm not your search engine. If you've been on HN long enough, you'll have seen a fair share of comments calling Apple users (and basically any other tech target group) names. Here's one from a random 1-second search:
(Sorry, being lazy here)