Hacker News | brg's comments

Usefulness is here, and has been for a while. I have been consistent in my stance that AGI will be achieved in the demonstration of iterative, stable self-improvement. This can be demonstrated in knowledge creation or skill acquisition.

I like this - calling it "usefulness" - I think that's actually something the community could get behind - AGU - where we are right now, and 2 terms for 2 milestones we have yet to reach - which could be reached in any order:

1. increases in consistency and quality to some specific measure

2. development of human-like qualities - "consciousness"? "Intuition"?

And then we toss the term AGI which is overloaded and ambiguous.


More automation may lead to more freedom for open-ended exploration, stemming the dramatic decline that exploration has suffered. That would be a fantastic result.

And there is a distinct decline in open-ended, personal exploration in academics. There are many reasons for this, but the syndrome is widespread throughout the West and throughout scientific disciplines. The first response to someone interested in recreating a published result or simply asking "What if?" is always "Who will pay for that?" Or, more discreetly, "Will it lead to a grant?" Regardless of whether the cost is $100 in reagents and 8 hours of bench time in an idle lab, or $200 to machine a new reaction chamber, the focus from administration is not enabling exploration but standing in line with your hand out.


It's rot13 encoded to avoid spoilers.

To add to the collection of anecdata, your experience is similar to mine. I have been more exhausted recently by the complaints about AI submissions and the pseudo-analysis of AI comments than by the supposed AI-generated comments themselves.


A good reason to allow it is that people want to do it, and it is no more harmful than many other normative human behaviors. A better reason is that it provides a means of measuring real attitudes and opinions. The revealed prices afforded by prediction markets are interesting at the least, and possibly the best source of information on many topics.


For what topics is it the best source of information?


Election odds, chance of US bombing Iran, and many others


I would argue that the success has more to do with DevDiv being the strongest technical organization at MS than with its provenance.


I STILL REMEMBER WHEN SOME MARKETING IDIOT DECIDED THAT VISUAL STUDIO NEEDED TO SHOUT AT YOU. IT TOOK THREE MAJOR VERSIONS BEFORE VISUAL STUDIO STOPPED LOOKING LIKE THIS COMMENT. OF COURSE THERE WASN'T A SETTING TO MAKE IT NORMAL AGAIN.


Damn, I got frustrated just by reading this comment :D


Until they messed up the whole UWP / WinRT developer experience in Visual Studio.

Also, VS 2026 was released against a hard milestone, so while there is a new settings experience, many options still show a dialog from VS 2022, because the new UI has not yet been implemented for the new experience.

Note that most organisations have to pay for Visual Studio licenses, and get rewarded with such quality.

Slop has also arrived into DevDiv.


I think that after the failure of Metro, Microsoft gave up on native apps entirely, and now the story is web or Electron.


It appears the problem is deeper than that.

From what I could infer from some community talks, podcasts, and so on, I would assert that nowadays they have the problem that new hires have been educated on UNIX-like OSes and the Web.

Thus the Windows team gets lots of folks who have never coded anything for Windows, and management, instead of having proper training in place, just goes with WebView2 and Electron all over the place.

I might be wrong; this is more my perception than anything else.


I would say the web took over as the primary application platform, and Unix-likes provided convenient, low-cost, license-free foundations to build them on.


No excuse for not having training in place for Windows native development for those new hires.

You don't see Apple and Google putting the same WebViews all over the place on their OSes, with the exception of ChromeOS, which appears to be on death row, slated to be replaced by Android anyway.

In fact, at WWDC 2025 Apple executives even spoke publicly against that approach.


> nowadays they have the problem that new hires have been educated on UNIX-like OSes and the Web.

So, in other words, the kids grow up learning and using Linux, right?


More like macOS and Chromebooks, developing mobile apps and Web.


Developers, at least. And Macs.


This is how I learned to code at a young age. I typed in every single program they had. The most challenging was the hex-based music machine they printed, which was difficult for someone who both couldn't type and didn't understand the mechanics of what he was typing.


Same! The worst was when you double and triple checked that you typed it right, only for next month's issue to have an "oops, we made a typo, haha". The K-Power section was always the best!


In the '90s we ordered them from TX.


My opinion is that we have little to no interest in what animals, plants, or even other people are thinking. The vast majority of it would be considered crude and offensive at best.


In at least one case it wasn’t released by management because it was absurdly embarrassing. Productivity compared between 2019 and 2023 had statistics similar to the following: average yearly CLs decreased from approximately 70 to under 10, significant revisions pushed in comparable products dropped from 26 to 4, meeting time increased by a multiple, and email volume decreased similarly. All this with significant increases in seniority and pay among the average employee. The counterfactual scenario argues that there is a huge opportunity cost to the tech efforts from WFH.


What is a CL? What are "significant revisions pushed in comparable products" and what do they measure?


"CL" is a Google-ism for a code change ("change list"). What we'd normally refer to as a pull request I suppose. Googlers like to think the whole world is in on their lingo, but CL is a very unusual acronym outside the Googlosphere.


So they are seriously saying that Google developers on average went from about 70 PRs per year to less than 10 PRs per year when working from home? That seems such an absurdly large decrease that it's hard to believe.

