Hacker News | tass's comments

I interpreted this as meaning it's not as good a way to learn.

I learn the most from struggling through a problem, and reading someone’s code doesn’t teach me all the wrong ways they attempted before it looked like the way it now does.


Exactly. And vice versa: one of the biggest benefits of code review is calling out pitfalls that you, the reviewer, have run into and the reviewee isn't aware of. LLM addicts won't have any experience with what works and what doesn't, so their reviewing will be pretty useless.


I was thinking in situations where a coworker might send me something to review, and I might have thought "hmm, I wouldn't have done it like that, but this is a great way to do it too". Also, a good source of teachable code is to participate in a programming contest, and then review the repositories of the teams who scored better than me after the contest.

I agree that if I don't already know how to implement something, seeing a solution before trying it myself is not great, that's like skipping the homework exercises and copying straight from the answer books.


Yeah, I learn from reading other people's work too, but it doesn't stick as well as when I work through it myself.

The problem now is that the pressure to use LLMs means creating more code but understanding so much less.


This is why programming tutorials don't really teach much: you get the finished version, not all the wrong steps that were taken, why they failed, or what else was tried.

These steps are what help you solve other issues in the future.


I'm not usually an apologist, and I'd agree with this judgement if the car had been left to its own devices, but the driver held his foot on the accelerator, which is why it blew through those stop signs and lights.

Regarding the Autopilot branding: would a reasonable person expect a plane on autopilot to fly safely if the pilot suddenly took over and pointed it at the ground?


The average person does not know how to fly a plane or what a plane autopilot does. It's a ridiculous superficial comparison. Planes have professional pilots who understand the capabilities and limits of aviation autopilot technology.

Tesla has had it both ways for ages - their stock price was based on "self-driving cars" and their liability was based on "asterisk asterisk the car cannot drive itself".


According to your analogy, a certified pilot corresponds to a certified driving-license holder. It's not like Tesla is advertising that someone without a license, or otherwise ineligible, can drive using Autopilot. I wonder how you can even justify your statement.


Autopilot is part of a private pilot's license, and the systems are approved by the FAA. Tesla Autopilot isn't part of a driving license, nor did it undergo review by the NHTSA prior to launch, because Elon considered it "legal by default".


No. You don't need to know the autopilot to get your PPL. You do however need to know how to follow the POH (pilot operating handbook, which may include manufacturer guidelines for the autopilot) and perform basic instrument flying in an emergency. I don't recall any significant expectations of autopilot usage at the PPL level though.


The trend now is to include as much of the aircraft's capabilities as possible in the checkride.

For an IFR checkride, if you do it in an aircraft equipped with autopilot, at least one approach will have to be flown with the autopilot.

Not sure if that's made its way to the PPL checkride (or been codified), but it's inevitable.


> if you do it in an aircraft equipped with autopilot

There's also a (stupid, imo) tendency for APs to conveniently become inop right before a checkride. It's not accurate to say that all pilots, or even all pilots that have taken an IR ride, are "pilots who understand the capabilities and limits of aviation autopilot technology."


For the PPL specifically, the focus is on basic airmanship in VFR conditions, which means eyeballing the six pack (or its digital equivalent) and looking out of the window. The instrument-flying expectation is primarily for emergencies and preparation for a future instrument rating.

But yes, I understand what you mean.


I'm not sure it's been codified, but I was told I would need to understand how to use the VOR and autopilot if the plane I was in had one.

In the fleet at the school I was learning in (Cessna 162) only one plane had an autopilot, which meant nobody practiced with it, so they never scheduled this plane for a check ride.


“Full Self Driving”


Autopilots in airplanes are kind of dumb (they keep heading, speed, and altitude; they won't do much of anything else), which is why Tesla doesn't use the name as branding for its Full Self-Driving software. People at least know that much.

But then again, even on HN, people like the parent think that autopilot is the same as full self-driving, when it is and always has been just smarter cruise control. The payout was for Autopilot (a feature most new cars have these days under various names), not Full Self-Driving.


> Autopilots in airplanes are kind of dumb (keep heading, speed, and altitude, they won’t do much anything else)

That is absolutely false.

A 20-year-old avionics suite in general aviation (e.g. a Garmin GNS 430) does much more than maintain altitude, speed, and heading: you input a flight plan including an approach, and it will fly the plan and capture the approach signals (VOR/localiser/whatever, which is far more complex than "keeping course") all the way down to approach minimums.

It can go down to 200ft for an LPV approach.


> keep heading, speed, and altitude, they won’t do much anything else

That is absolutely not true. A plane on autopilot can land itself except for applying the brakes.


Autolanders are separate systems from autopilots, and there are definitely planes in production today with autopilots (pretty universal) and no autolander (like almost all Cessnas - you need the Garmin Autoland system, and it's only for emergencies). Autopilots have been a thing since the 1920s, when they were just a rope tied to a stick; they definitely didn't do auto landing back then.

If you are trying to claim that all autopilots come with autolanders, that's absolutely not true - it's not even true of most, and again, they are always separate systems even if the autolander can access the same servos the autopilot uses. Additionally, autolanders, unlike autopilots, require the runway to support them (at least the ones used on commercial airplanes in low-visibility situations).

I really think quite a few people on HN don't get that autopilots are actually very simple systems that have been around forever and do one thing well (keep the plane going in one direction at a specific altitude and speed).


If the average person does not know what an autopilot does, why would they expect Tesla's 'autopilot' to take such good care of them? I am reminded of a case many years ago when a man turned on the cruise control in his RV and went to the back to make himself lunch, after which the RV went off some sort of hill or cliff.

Rudimentary 'autopilots' on aircraft have existed for about a century now, and the earlier versions (before transistorization) only controlled heading and attitude (if conditions and other settings allowed it), with little indication of failure.


> If the average person does not know what an autopilot does

The average person does know what an autopilot does, they're just wrong.

I think the example you provided supports that.


This would be more like they enabled cruise control, hit the brakes, and sued the manufacturer because they were rear-ended.


The original judgement held that the driver was 2/3 responsible, Tesla 1/3 responsible, which seems reasonable. The $243 million wasn't for causing the accident, but was a punitive amount for doing things that looked an awful lot like lying to the court and withholding evidence.


This makes a lot of sense and makes the verdict seem reasonable, thanks for providing the context.


A “reasonable person” in a cockpit is not the same as a “reasonable person” behind the steering wheel.

Pilots undergo rigorous training with exam after exam they must pass.

No one is handed the keys to a Boeing 747 after some weekly evening course and an hour's driving test.


I don't mean a reasonable pilot. Would a reasonable person expect a plane's autopilot to prevent a crash into something the pilot was accelerating towards while physically overriding the controls? The claim is that Autopilot should not have been able to crash even with the driver actively overriding it and accelerating into the crash.

To me, it's reasonable to assume that the "autopilot" in a car I drive (especially back in 2019) is going to defer to any input override that I provide. I wouldn't want it any other way.


Not sure what "autopilot" means in a car. Is the self-parking feature called "landing gear"?


A person can get mistakenly (or not) flagged for special screening and get it over and over again - it happened to me many years ago.

I fixed it by filling out a form requesting a review, after which I received a “redress number” which could be entered into my booking information. It reliably stopped after that.


Tailscale allows you to disable the expiration time - I do this for my gateways.

My other simplifier is giving everything at home a .home DNS name and telling Tailscale to route all of these via the tailnet.


Can you please tell me how to disable the expiration time? I see auth keys have an expiration which says it "Must be between 1 and 90 days." I also use a custom domain name, with a nameservers rule to make all my services reachable as subdomains of my custom domain.


There is some confusion here because while you can disable node key expiration, you can’t disable auth key expiration. But that’s less of a problem than it seems - auth keys are only useful for adding new nodes, so long expiry times are probably not necessary outside of some specific use-cases.

Edit: in fact from your original post it sounds like you’re trying to avoid re-issuing auth keys to embedded devices. You don’t need to do this; auth keys should ideally be single-use and are only required to add the node to the network. Once the device is registered, it does not need them any more - there is a per-device key. You can then choose to disable key expiration for that device.


I want my CI containers, created per branch/PR, to have their own Tailscale domain, so logging them in via a non-expiring key is useful. The only good option I've seen previously is a notification every 90 days when the key expires.


The best way to do that is using an OAuth client. These don't expire, and grant scoped access to the Tailscale API. You use this to generate access keys for the devices that need to authenticate to the network.

We use this for debugging access to CI builds, among other things – when a particular build parameter is set, then the CI build will use an OAuth key to request an ephemeral, single-use access key from the Tailscale API, then use that to create a node that engineers can SSH into.

Access keys ideally should be short-lived and single-use where possible. https://tailscale.com/kb/1215/oauth-clients#generating-long-... has details on this flow.
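
For what it's worth, the request body for that key-creation call might look like the sketch below. It's only illustrative: the endpoint (POST /api/v2/tailnet/-/keys, authorized with the OAuth access token) and the field names are taken from the public Tailscale API docs, so verify them against the current docs before relying on this.

```python
import json

# Hypothetical sketch of the JSON body for Tailscale's key-creation endpoint.
# Field names follow the public API docs; treat them as assumptions to verify.
def ephemeral_key_request(expiry_seconds: int = 300) -> dict:
    return {
        "capabilities": {
            "devices": {
                "create": {
                    "reusable": False,      # single-use
                    "ephemeral": True,      # node is removed when it disconnects
                    "preauthorized": True,  # skip manual approval in the admin panel
                }
            }
        },
        "expirySeconds": expiry_seconds,    # the access key itself stays short-lived
    }

print(json.dumps(ephemeral_key_request()))
```

The CI job would POST this body with the OAuth token, hand the returned key to `tailscale up --authkey=...`, and the node vanishes from the tailnet when the build container goes away.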


Thanks, I'll soon get to try this out hopefully!


You can create an oauth client that can generate keys as you need them.

https://tailscale.com/kb/1215/oauth-clients#generating-long-...


You mean, with Microsoft 365 Copilot App (there’s no more Office)


Jobs was right.


According to a related article: https://www.discovermagazine.com/why-its-nearly-impossible-t...

The death was caused by an "unknown pre-existing condition", but the article doesn't elaborate further.


Almost certainly a typo: the first Walkman was 390 g, or about 14 ounces (not 1.4 oz).


Yes, it seems you are correct. For example this one is 180g including batteries: https://www.1001hifi.info/2025/01/sony-wm-20-1983-worlds-sma...

Though I stand by my implied argument that older devices were not as heavy as we might remember them to be. And it is okay to consider 240g a bit too heavy in the context of a digital music player with no need for cassettes or mechanical parts.


I read the advert as claiming that the headphones for that Walkman are 1.4oz, which seems plausible (they're a very flimsy design).


Yes, I can’t even use many 10.x subnets at home because my work VPN configures a huge routing table including many of them.

Basically I had no choice but to redo my home network if I wanted to use my new work laptop at home (and I work 100% remote).


I "solved" this by running a separate VLAN for work machines that provides addresses in a slightly weird /24 carved out of the 172.16.0.0/12 [0] range. Is it as collision-resistant as a ULA address? No. But -sadly- I've yet to see an Enterprise VPN that wasn't run as an IPv4-only thing, so it's the best I can do.

[0] The netmask there really is /12: 172.16.0.0/12 spans 172.16.0.0 through 172.31.255.255.
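
The carve-out idea can be sanity-checked with Python's stdlib `ipaddress` module. The /24 and the VPN routes below are made-up examples, not the actual networks from the comment:

```python
import ipaddress

# Hypothetical example: a "slightly weird" /24 carved out of 172.16.0.0/12,
# checked against the routes a work VPN might push (both made up here).
RFC1918_172 = ipaddress.ip_network("172.16.0.0/12")  # 172.16.0.0 - 172.31.255.255
home = ipaddress.ip_network("172.28.93.0/24")
vpn_routes = [ipaddress.ip_network(n) for n in ("10.0.0.0/8", "172.16.0.0/16")]

print(home.subnet_of(RFC1918_172))                 # the /24 sits inside the /12
print(any(home.overlaps(r) for r in vpn_routes))   # does it collide with VPN routes?
```

Picking an unlikely-looking /24 this way doesn't guarantee anything, but it makes collisions with a route dump easy to check.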


I'd be tempted to shove that VPN into a network namespace together with jool, and NAT64 their 10.x subnets into, let's say, 2001:db8:a:b::/96, so that their 10.1.2.3 becomes 2001:db8:a:b::10.1.2.3. Then there's no overlap as viewed from outside the namespace.

And if you ever need to use another VPN that also clashes on 10.x, you can do the same thing but map that one into 2001:db8:a:c::/96. Then you've got 2001:db8:a:b::10.1.2.3 and 2001:db8:a:c::10.1.2.3, neither of which clash with either each other or your 10.1.2.3.
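
The address embedding described above (RFC 6052 style: the IPv4 address in the low 32 bits of a /96 prefix) can be sketched with Python's `ipaddress` module. The /96 prefixes are the illustrative ones from the comment; the actual translation would be jool's job inside the namespace:

```python
import ipaddress

# RFC 6052-style embedding: place the IPv4 address in the low 32 bits
# of a /96 IPv6 prefix, as a NAT64 translator like jool does.
def embed_ipv4(prefix96: str, v4: str) -> ipaddress.IPv6Address:
    net = ipaddress.IPv6Network(prefix96)
    return ipaddress.IPv6Address(int(net.network_address) | int(ipaddress.IPv4Address(v4)))

mapped = embed_ipv4("2001:db8:a:b::/96", "10.1.2.3")
print(mapped)  # same address as 2001:db8:a:b::10.1.2.3, just printed in hex groups
```

Mapping a second clashing VPN under a different /96 (say 2001:db8:a:c::/96) yields a distinct address for the same 10.1.2.3, which is exactly why the overlap disappears.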


They’ve done loud, in-store presentations for longer than Apple Intelligence has been a thing, but you’re right that it’s a captive audience of mostly disinterested people.


I’ve had this music in my head for 30+ years. I could never get very far in the game though!

