evolve2k's comments

> Notably, the rollout will be handled by an “intelligent” update system that leverages machine learning to determine when a device is ready to receive the update.

> Curiously, there seems to be a lack of transparency around how Microsoft’s machine learning system decides when a device is ready to receive the automatic update.

The open secret is that the LLM has been prompted to make the call, and no human at Microsoft is able to interrogate why the agentic AI is pushing updates to some machines and not to others.


A few articles like this one were doing the rounds early last year. Curious, are we any closer to communicating with animals?

This is an Ask HN with a linked article from early last year, but the bot removed the Ask HN text. Oh well, curious about people's thoughts.

Yes, the title I put doesn't match the article, as my question is to the HN community now, a year or so on.


Comparing the final two images taken of Earth in 1972 and 2026 respectively: does the 2026 (left) image look murkier and less crisp to anyone else?

Surely our camera gear is exponentially better now? Is the new image 'murkier' because of light, pollution, or something else?


> Surely our camera gear is exponentially better now

They are better, but not exponentially. You can't beat physics: film cameras can still compete in terms of dynamic range and resolution, and the optical elements haven't changed that much. The 1972 photo was taken on medium format film, which has twice the area of the sensor in the modern camera, which means more photons and less noise. The recent image was also taken at a really high ISO, which adds to the noisiness.
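
As a rough back-of-the-envelope illustration of why area and ISO matter (my own toy numbers, not measurements of either photo): photon shot noise is Poisson, so signal-to-noise goes with the square root of the photons collected, and photons collected scale with capture area and how much light the exposure actually gathers.

  // Toy shot-noise model; relative units only, all values assumed for illustration.
  // SNR ≈ sqrt(photons collected); photons ∝ capture area × light gathered.
  function relativeSnr(relativeArea: number, relativeLight: number): number {
    return Math.sqrt(relativeArea * relativeLight);
  }

  const filmSnr   = relativeSnr(2, 1);    // medium format film: ~2× the capture area
  const sensorSnr = relativeSnr(1, 0.25); // modern sensor at high ISO: less light gathered per shot

  console.log((filmSnr / sensorSnr).toFixed(1)); // "2.8" — film wins in this toy model

The exact numbers don't matter; the point is that both the smaller capture area and the high ISO push the noise up.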


1972 -> taken during daytime
2026 -> taken during nighttime

I’ve looked at Casa. It wasn't CasaOS, but now I'm remembering: I think they supported Casa's marketplace conventions, so that's how, as a new project, it had a big marketplace; they sort of piggybacked all the apps Casa supported onto their own marketplace.

This event is giving “banana math” from the show Arrested Development (S01E02 Top Banana).

I like these conventions. Another personal practice I use the body for is noting where I’ve relied on any webpages (blogs, issue reports, Stack Overflow pages, etc.) to help the commit come together.

The example of using one library over another, especially if research has gone into which to choose, regularly involves, say, finding a good article that compares the alternatives.

I’ll say, though, that I usually only include links to the more notable references; I won't usually commit refs to a library's own docs or other obvious material. The references to resources that actually helped get it done are what I keep and add to the commit body.

Maybe there’s space for useful references to be added to the spec/conventions. Personally, I usually show links like this after the body message.

Example of the commit body:

refs(oauth-library):

www.something.com/picking-a-thing
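
For what it's worth, a fuller (entirely made-up) version of the same idea, with the subject and body up top and the references last (the subject and body here are placeholders, and the URL is the same dummy one as above), might look like:

  fix(auth): switch to oauth-library for token handling

  Compared the main alternatives before committing to one; the article
  below covers the trade-offs that drove the decision.

  refs(oauth-library):
  www.something.com/picking-a-thing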


I’m guessing "hand coding" means not vibe coding.

"Did you use AI?" .. "Nah, I hand coded it."


Well, yes. I was trying to comment in the spirit of the parent comment.


Real programmers use butterflies. https://xkcd.com/378/


Firstly, I really want this also and am supportive of an opinionated decision to put something at, say, Temporal.DateTime() that would be logical for developers to use ‘most of the time’.

However, my guess is that the spec designers saw this lack of specificity as part of the problem.

A key issue with dates and times is that, culturally, we use them in very imprecise ways day to day, and much is inferred from the context of use.

The concepts of zoned time and “wall clock” time are irreducible, and it’s likely much code will be improved by forcing the developer to be explicit about which form of time they want and need for their particular use case.

I think this is why it’s so explicitly specified right now.
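
To make that concrete, here's a minimal sketch (assuming a runtime or polyfill that ships the Temporal proposal) of the two forms you're forced to choose between:

  // Minimal sketch; assumes Temporal is available (runtime support or a polyfill).
  // "Wall clock" time: no time zone attached, just what a clock on the wall reads.
  const meeting = Temporal.PlainDateTime.from('2025-03-15T09:30');

  // Zoned time: the same wall-clock reading pinned to a real time zone, so it
  // identifies an exact instant and survives DST transitions.
  const meetingInMelbourne = meeting.toZonedDateTime('Australia/Melbourne');

  console.log(meeting.toString());            // 2025-03-15T09:30:00
  console.log(meetingInMelbourne.toString()); // 2025-03-15T09:30:00+11:00[Australia/Melbourne]

A hypothetical shorthand like Temporal.DateTime() would have to silently pick one of those meanings, which seems to be exactly the ambiguity the designers wanted to avoid.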

But I agree; I’ve often struggled with how verbose js can be.

Maybe with time (pun intended), more syntactic sugar and shorter conventions can be added to expand what has been an incredible effort to fix deep rooted issues.


My thought is half cynical. As LLM crawlers seek to mop up absolutely everything, companies themselves are starting to worry more about keeping their own data secret. Maybe this is a reason for shifts like this, as encrypted and other privacy-preserving products become more in demand across the board.


It’s still happening.

