The number of HNers who were earnestly arguing that this was the party of free speech indicates that this absolutely needs to be on the HN front page.
> the administration’s rhetoric about cracking down on students protesting what we saw as genocide forced me into hiding for three months. Federal agents came to my home looking for me. A friend was detained at an airport in Tampa and interrogated about my whereabouts.
On a side note, it was interesting after Trump was elected that some of my co-workers wanted to go back to using people's old pronouns _in meetings_ after some laws changed, and I realized the only thing that had been stopping them was how awkward it would have been for _them_ in that situation.
In the Before Times, I thought that asking Americans to mind pronouns would never work -- not because they were mean, but because it would require the average American to learn what a "pronoun" was.
Of course, it turned out that the average American had no problem learning what a pronoun was if it gave them the opportunity to be mean. Sigh.
Which industry? Tech? Surveillance? Government? I know my father in law is a MAGA racist who believes whatever makes it easy to justify his own beliefs. I’m not sure you can ever reliably judge someone’s true motives in a professional setting.
I'm seeing it in a lot of younger tech people. We had a NASA presentation at work about air quality and that forest fires are one of our biggest problems in CA. TWO separate people (from maybe 20-25 attending) brought up "do you think that if we managed our forests better, this could help?" (clearly talking about the crazy "raking the forests" Trump rhetoric). It blows my mind how "intelligent" people can be this stupid.
Is that really what you're concerned about: that somebody would ask a softball question about proposed solutions? Why is questioning the buildup of brush a crazy idea? It's been a mainstream concern for years. I really don't think it's healthy for any inquiry to presuppose a particular mindset and shut down alternative thinking. It doesn't seem very scientific or intelligent to me.
The issue is that the rhetorical game being played is that by saying the risk is all due to the buildup of combustible materials, it shifts the blame to California's Democratic politicians and away from Republican fossil fuel donors. Clearly in a good faith discussion we'd suggest better forest management, as well as doing everything possible to combat fossil fuel emissions. The problem is that it's not a good faith discussion.
Am I dumb to think that the main worry from fossil fuels right now is CO2, not air quality? (at least while environmental regulations are still mostly intact) It seems reasonable to me to ask about forest management for air quality.
Maybe there was some other sign they didn't ask in good faith? But I have no idea what dumb thing Trump said that you're even talking about.
Notice how pro-free speech = pro-clearing brush buildup?
It's so weird how people join these partisan factions that have a full package of beliefs that you have to be evil not to share. Woe to your job if you say that you think brush buildup should be cleared; you're obviously racist.
> "do you think that if we managed our forests better, this could help?" (clearly talking about the crazy "raking the forests" Trump rhetoric)
Were they clearly actually talking about that? If that was their question, word-for-word, it's a good question! We are not managing our forests all that well. No, we shouldn't be doing Trump's dumbass raking "idea", but we should be doing controlled burns, at minimum.
>clearly talking about the crazy "raking the forests" Trump rhetoric
Are you sure about that? I've been hearing for at least a decade that the solution to CA's forest fire problem is something along the lines of reducing the amount of potential fuel that is allowed to build up by either allowing smaller fires to run their course without intervention or alternatively aggressively executing controlled burns on a regular schedule.
Not sure how viable that is as a solution but I do know the idea didn't originate with Trump because it predates his entire political career.
I remember hearing about forest mismanagement long before Trump's presidential runs. It's curious how many people complaining about right wing talking points associate it solely with Trump.
While Trump's "raking the forest" take is clearly uninformed and unintelligent, there's a substantial kernel of truth to longstanding forest management policies making some of these wildfires worse than what they could have been. We've been artificially suppressing fires far too long in a lot of these places, for example.
Not that this is the only factor in play here on a lot of these fires, and once again I do agree Trump's take is idiotic and ultimately he's not helping, just pouring gasoline on the fire. Just pointing out, we definitely aren't managing our forests well for a multitude of reasons.
The federal vs state conflict over prescribed burns doesn't help much either. In states with a much lower % of national forest or BLM land or whatever, you get a much larger amount of prescribed burns.
On the West Coast, the state vs federal friction reduces how much of that happens, and there's more uncontrolled growth happening. And there's not always a lot that e.g. the CA government can do about it if it's federal land.
For example, Minnesota (intentionally) burns like 50% more acreage than California on an annual basis, despite being like half the size. But CA also is like half federal land, MN is like 5% or something.
I totally agree with you there. I'm in no way trying to suggest it was specifically a failure of certain states or individual administrations; it's a mixed bag of failures at a lot of different levels, with the federal government bearing a lot of the blame across a wide range of administrations that did nothing to really address the growing problems.
I'm all ears if you've got someone that we can put in power that won't rat fuck us when it comes to privacy or civil liberties. Bonus points if they aren't just slightly less bad than the other guy.
Chase Oliver was the only non-write-in candidate on my ballot who even bothered to put up much pretense of running on a privacy and civil liberties platform.
I do get that. Both parties are clearly bad. But one in particular is and was yelling from the rooftops about how they were going to destroy civil liberties of certain groups, and are now doing exactly what they promised.
Everyone must simultaneously fight for a better system and choose the least-worst option when it comes time for an election.
The one that forced people into their homes, required proof of a medical operation to shop at stores, and tried to abolish my second amendment rights? Or the one that, god forbid, is deporting people who shouldn't be here in the first place?
Also, how do you reconcile your belief in second amendment rights with Alex Pretti's death at the hands of ICE, an organization empowered by the current admin?
Uhhh, that was wrong, duh??? But sometimes bad things happen? I would much rather ICE be empowered and we deport the people who should be deported. It's like how some people died from the vaccine.
That's what they said about Obama, but he got Iran to give up their stockpile of enriched Uranium, give up enrichment beyond 4%, and submit to a severe inspection program. All for unfreezing less money than Trump has spent so far on the Iran War, let alone the $200B that he wants, let alone the economic damage from the Hormuz shutdown, let alone the $5T that happened last time a Republican asked to spend $200B on a quick little war.
At the time, the Republicans whined incessantly about how soft Obama was. But they sure enjoyed dropping those Obama Bombs last year that he commissioned as a Plan B. Obama spoke softly, carried a big stick, and hammered out a brilliant deal. Trump bragged loudly, tore up the deal, swung the stick he inherited, missed, and fell in the oil. Sad.
At the time, Israel whined incessantly about how Iran was going to secretly enrich anyway. But their own intelligence from compromising the enrichment program shows in hindsight that this was not the case and Iran was behaving themselves.
That's why I base my expectations on track records, not on Republican whining.
> At the time, Israel whined incessantly about how Iran was going to secretly enrich anyway. But their own intelligence from compromising the enrichment program shows in hindsight that this was not the case and Iran was behaving themselves.
Then why wouldn't they submit to IAEA inspections?
> Obama spoke softly, carried a big stick, and hammered out a brilliant deal. Trump bragged loudly, tore up the deal, swung the stick he inherited, missed, and fell in the oil.
This is probably the best and most succinct -- and pithy -- take I've read yet.
You're right about a lot above. I would clarify though that Obama's deal was made by paying $150B+ to Iran (releasing frozen assets), which was immediately used to fund terrorists and conflicts in Syria, Yemen, Iraq etc.
US withdrew from JCPOA under Trump (which led to a certain chain of events), but Biden was not able to revive it during his term. Not clear why we think a different president would be able to, and under what terms/concessions.
It was more like $100B, and closer to $50B once you subtract the obligations which unfroze along with the assets. We are quarreling over a pittance compared to what we will spend at the pumps and on the war, though.
I wonder what wonderful things all the Russian and Iranian (!) oil that Trump lifted sanctions on will fund! We will find out in time.
Kamala had a better shot at reviving the deal for the same reason Trump thought he had a chance at regime change: Iran's situation has been deteriorating. I'm quite sure that if she had hammered out a deal comparable to the JCPOA, Republicans would be running around yapping about how Trump would have achieved peace in the middle east by just having the stones to bomb Iran. Lol.
Iran's regime sucked (still sucks), to be sure. This was frankly not all that much of an issue for the US. It was a big issue for other Arab nations in the area (not to mention for Israel), but I'm not sure why we should be doing their dirty work.
If the end result of all this is a large weakening in Iran's regime, a reduction in Iran's influence in the region, and (otherwise) a return to the status quo, I guess that's something of a victory. But it's far from clear that we'll even come out that well, and meanwhile we've murdered civilians, and spent American lives and war materiel. Not great. We should have left well enough alone.
"Iran's regime sucked" because they kicked out our western puppet? or is it because Iran is a a democracy, unlike "other Arab nations" (by the way iranian are as much arabs than you) or Israel ? (Note that I'm not fan of Iran culture, but I'm not fan of ingroup cultures either).
> because Iran is a democracy, unlike "other Arab nations"
Iran isn't a democracy, it's an authoritarian theocracy that spreads terrorism throughout the Middle East, and that brutally oppresses its own people [0]. The only objective of the regime is to stay in power, regardless of the costs imposed on Iran and other countries, and the only language they understand is violence.
I'm not sure if you're joking and this is a backhanded compliment to Harris, or you're sincere in your belief that what Trump will negotiate is going to be better than the Obama deal he ditched in the first term.
>The number of HNers who were earnestly arguing that this was the party of free speech indicates that this absolutely needs to be on the HN front page.
The number of HNers (and people at large) who think that both corporate parties don't vehemently oppose free speech and privacy is disturbing. Right now, today, a massive number of Democrats who have spent years decrying Trump (and Republicans as a whole) as fascists are lining up to support a "clean" reauthorization of section 702 of FISA, which allows (despite the phony claims of its supporters) the warrantless and unconstitutional surveillance of US citizens (and others). If our government were controlled by fascists, why would anyone give them the power to spy on anyone without a warrant? Because it's all kabuki theater and everyone in DC is part of the same team, and you ain't on it.
I don't think "both sides" works very well when one side has been supporting the murder of citizens for exercising their free speech, calling for denaturalization of citizens for expressing the wrong opinions or being from the wrong community, openly suppressing criticism by threatening to revoke broadcast licenses and barring reporters from DoD briefings for not taking sufficiently flattering photos.
I don't think anyone posting here thinks that Democrats are pro-free speech and pro-privacy, and it would be great if we could have politicians that truly support free speech and privacy rights. But of the options currently available, one is much less bad than the other.
>I don't think "both sides" works very well when one side has been supporting the murder of citizens for exercising their free speech
Obama was murdering US citizens, and their children, for exercising their free speech more than a decade ago.
>But of the options currently available, one is much less bad than the other.
If one person says they are going to stab 99 people and the other person says they are going to stab 100 people, you could argue that the guy who stabs 99 people isn't as bad, but I won't ever support either one of them or consider them worthwhile no matter how many others do.
You are entitled to your opinion that I'm supporting the status quo by refusing to support the abject murder and depravity of one group of people over another, who may be slightly less murderous and depraved, and I'm entitled to my opinion that you are complicit in the murder and depravity of the people you support.
Trump's second term has had dire consequences for the US that are simply not on the same level as Obama, and it could have been avoided. The democrats are also terrible, but it's a matter of harm reduction. In this case, I would strongly argue that the difference was greater than 99 and 100.
You should vote to do harm reduction. Elections happen regardless of what you do. Whether 30% or 100% of Americans vote, the winners of elections still get access to the same amount of state power. The system does not require our political participation to continue to exert control over our lives. Abstaining from voting is not an effective tactic in reducing the legitimacy of the system. That tactic might work in other situations, but not in this one.
I hope you will keep your distaste for both parties, but still vote for the lesser evil, even if it's distasteful. Because I think we should help that one person not get stabbed. And if indeed you have voted for the lesser evil and my post has a tone that assumes otherwise, I apologize.
Well, the "worst" side has currently returned to power, the other hasn't. There's no reason to belive that the other side wouldn't become worse in its own way to further solidify its power. Before you talk about priors, remember that Trump's 1st turn also wasn't as unhinged as this.
While it is okay, perhaps advisable, to temporarily support the currently less-bad side, try not to build a house for people that would gladly step on you once your usefulness runs out. As OP said: it's a small club, and you ain't in it.
Yes, the point is to keep picking the option that's better on the things important to you. Blind loyalty is why the current guys are acting with such impunity.
Not that far off from the truth. A number of college students who were protesting for Palestine had their college enrollment suspended, and lost their visas, effectively being deported. Which, yes, the university made that decision, but it didn't come without influence from the government.
Democrats have so far not been led by the nose into bombing Iran and fucking up the global economy so I’m not sure how one can keep saying that with a straight face.
Both sides of Congress passed emergency weapons funding for Israel at the start of this war. Even if some Democrats are scoring political points complaining about it since it's during Trump's term and the war has become a stalemate, they're on board at the end of the day, like they were with Iraq (as some forget) before things unraveled. And during Biden's term, it was Gaza instead.
It totally is. Democrats got led into Israel's wars too. Interestingly the support was different, like Trump got money from the Adelsons and Biden from pro-Israel lobbies.
The US was involved in Gaza? The United States was actively spending billions dropping munitions there? When? Under which administration was the US directly involved in bombing Gaza?
Can you further clarify how the US was involved in the war in Gaza, and how that was the Democrats getting involved? And do you really feel that involvement was anywhere near what is happening or comparable with Iran at the moment?
It's not US servicemembers pulling the trigger, it's not US commanders deciding on targets, it's not the POTUS starting the war. Pretty radically different things in my book.
How many US servicemembers were injured or killed in the US's apparent major war with Gaza?
We've spent ~$20B in grants for weapons procurement on Israel's behalf over several years, with a lot of that being defensive missile systems. I'm not a fan of us spending so much of our money on another country's military, especially when we hear over and over how we can't afford to feed kids or provide transportation to our people. But we've spent over double that so far on Iran in less than two months, and that's ignoring the many billions it'll cost to fix things that were destroyed so far. We're looking at the actual US cost of this war potentially reaching one trillion dollars.
It's a scale that's so radically different. And also, one was in support of a country we have defense agreements with that was attacked, and the other was us deciding to go bomb a country seemingly unprovoked.
I don't know if this is a failure of the bandwidth level of an internet forum message or a nuance thing, but like, you can both wish democrats sent less money to israel (for a whole bunch of reasons) while also acknowledging that this behaviour is preferable to the demonstrated alternative of republican presidents constantly trying to invade middle eastern countries.
Like, if you want less american money (and lives) being spent bombing the middle east, the most rational approach is to vote for a democrat majority/president and then primary anyone who still tries to buy bombs or whatever.
The alternative approach of letting a republican get elected is demonstrably worse.
Parent comment isn't a whataboutism, if anything my comment could be seen as that. My point isn't to defend Trump's actions, only to call out that this is a scarier problem than it gets credit for when someone just blames MAGA. We're dealing with a two-front assault here, and they want you to think it's just the other party you don't like.
We've had a Democrat majority plenty of times in the past several decades, including under Biden, and they continued the unconditional support. When congresswoman Omar called out AIPAC's influence, nearly every other Dem joined in on a written condemnation. If anything they're just less blatant than Republicans about this, also some pro-Israel groups have a niche with one party or the other.
If it helps you feel better, I voted for free speech and feel that the administration did not hold up their end of the deal. The FTC’s recent “debanking” letter to the payment processors is just theater until something changes. I’ll leave it at that.
Who is “they”? I voted on party lines, state, federal, and local. The way that people on the left shut down discussions more readily is what contributed to it, using tactics such as bringing particular candidates into the discussion despite me never volunteering that I was pro-Trump per se in this thread.
It’s probably going to be a while before I’m sympathetic to the “other” side though (it’s still two sides of the same turd after all); seems some things haven’t changed yet.
You voted against free speech. The sooner you can admit to that the better.
Trump has been very clearly against free speech since well before 2015. He's been anti-American and anti-constitution since well before he came down that escalator.
It doesn't make me feel better that you're still pretending otherwise.
I don't really think he's even gotten that much crazier than his admittedly high 2016 baseline. He has gotten a lot better at execution of said craziness, especially after realizing consequences would be slow and few.
Oh don’t get me wrong, I’m unironically looking forward to the other shoe dropping next term, so that we can finally get a generation of politicians that can bring on the great reset. Too much legacy software running on the kernel and it can no longer be maintained.
I do think they earnestly tried to swim against the current, but yeah, they always knew where it was taking them. Removing the yellow background behind paid results was the turning point IMO.
> The goals of the advertising business model do not always correspond to providing quality search to users.
- Sergey Brin and Lawrence Page, The Anatomy of a Large-Scale Hypertextual Web Search Engine, 1998
Such a wise observation from a paper published in the now-defunct journal "Computer Networks and ISDN Systems" after being rejected for the SIGIR conference...
...then BackRub became Google, a misspelling of "googol", and the rest is history.
Idk what they've even done that was not profit-motivated. They loss-led newer products in the 2000s just like everyone else, then in the 2010s started tightening up, then in the 2020s went to maximizing profit and paying out. That's ok in a way, really; they're a corporation after all. But nobody ever took that "don't be evil" slogan seriously unless maybe they were Google employees.
Ok, idk if anyone cares, but I wanted to correct myself: in the 2020s they went to maximizing profit on some things, but are still aggressively spending and growing on others.
Migrating is such a good feeling. You don't have to do it all at once, either: I migrated to fastmail over the course of several years. Each time google did something that got my blood pressure up I went into my password manager and migrated another account. In aggregate it was a hassle, but these days I almost miss the feeling of being able to do something in response to stinky actions from google.
I don't think fastmail is going to help you. They are subject to legal requirements too and probably American jurisdiction also despite what their particular position is. https://www.fastmail.com/blog/fastmails-servers-are-in-the-u.... People love to hate Google but they're just doing what any corporation subject to law is going to do.
> It has been pointed out to us that since we have our servers in the US, we are under US jurisdiction. We do not believe this to be the case. We do not have a legal presence in the US, no company incorporated in the US, no staff in the US, and no one in the US with login access to any servers located in the US. Even if a US court were to serve us with a court order, subpoena or other instruction to hand over user data, Australian communications and privacy law explicitly forbids us from doing so.
In addition to what the sibling comments say, this also puts Fastmail at risk of having their US based service suspended while they attempt to resist government overreach (were they to attempt to do so) which is really not a lot better for their users.
They can say what they like, and I am a customer, but in hand-wavey generalization terms one should be aware that Australian law enforcement has excessively broad access to telecommunications data on request and a long history of doing the bidding of the United States. Carriers are forced to retain your data for 2 years.
Under TIA Act provisions (such as s180), an authorised officer of a criminal law‑enforcement agency can authorise access to prospective telecommunications data [metadata only; not whole messages] if satisfied it is reasonably necessary for investigating an offence punishable by at least three years’ imprisonment. (In other words, ~any time they want)
Example: the data‑retention regime’s records were being accessed over 350,000 times a year by at least 87 different agencies, including non‑traditional bodies such as local councils and the RSPCA [an animal welfare charity].
Given Australia's population is only 28M, that means roughly 1 in every 80 people gets communications metadata pulled by their own government annually.
Yep, I am a fastmail user, born and live in Oz. I just assumed that this data would be collected either on this side or via the US servers. Also, we are still a part of the 5 eyes alliance.
I wasn't looking to dodge US jurisdiction, I was looking to dodge "our craptacular moderation AI had a brainfart when reviewing your account and now you are locked out of your life."
I recently migrated off of my legacy "Google Apps for Your Domain" (now Workspace) account to a mix of self hosting and a regular old vanilla gmail account.
It was a real eye-opener to experience how challenging it was to move my data from one Google account to another. Takeout is nice in theory, but there is no equivalent "Takein" service that accepts data for import into another Google account in the format produced by Takeout! I naively assumed "Export Google calendar from here, import same files to there", but nope, that did not work at all. Maps data was even worse.
I've migrated everything from Google except for Google Voice. I have yet to find an alternative that can match the feature set and ease of use, regardless of the cost.
I have one page with my full history of text messages, full transcription of all voice messages, contacts information connected with every number, and I can search everything. I can configure which of my phones ring.
And, possibly most importantly to me right now, my current phone has only a data connection and I make and receive calls using the Voice app. I think SIP eats too much battery and data and doesn't work well for wifi<->lte switching, but it's been a long time since I used it much.
I'm not sure what the OP does, but at least for me I find myself chained to Google Voice for SMS 2FA use because it's basically the only phone number provider that cannot be exploited with a sim swap attack (same deal with Google Fi). And while I don't necessarily trust Google, their account security is leagues ahead of anyone else imo.
I previously looked at jmp.chat but they didn't really inspire confidence on the security front.
My use cases include 2FA and I like the added security that Voice provides, but it's not really added security, it's just moving the risk from your cell provider to Google. IMHO, Google does security better than the cell providers do.
I like the muti-platform integration of Voice. I use it on my iPad, on my Android phone, and mostly from my desktop. It works well on all platforms.
When I'm at home, I mainly use my VoIP phones. GV forwards to them, and they spoof my GV numbers when I make outgoing calls.
I like the spam text and call protections that GV provides. I believe they're partnered and integrated with Nomorobo.
I also have jmp.chat. It has capabilities that GV doesn't have, but it's not well integrated. (I use Cheogram on my Android phone, but there's no easily usable client on my iPad, or my desktop.)
I don't like they way they've made it harder for me to see what they actually offer vs. what I offer myself (with my FreePBX VoIP client). I wish they would (maybe on a separate page) show the capabilities of their SIP trunk. E.g. Does it support SMS? Does it support video calling? Does the client require a static IP? Etc.
Anticipation of stories like this are why I didn't rely much on Google 20 years ago.
Never used Gmail other than as a throwaway account.
Went many years before I had a Youtube account. Finally made one to upload some videos. I am normally not logged in.
(OK, OK - I was more concerned with them suddenly charging for a "free" service, as well as selling data to commercial enterprises than with them giving to the government).
Does anyone else remember Epic 2014? It was a video made years ago that speculated about the future of the internet and media, with the end game being personalized news written by a computer. The timeline is off but the brand names are mostly the usual suspects. Rewatching it now gives me this uncanny feeling.
Edit: People are not understanding the humor in the question. I implied I predicted this reality 20 years ago, and he's asking for another prediction 20 years out.
The other year I bought some Bose exercise earbuds because the cheap ones weren't staying in my ears. They died, warranty replaced them, they died again, I opened them up and the circuit board wasn't coated or potted or anything! The cheap ones were! The premium brand was penny-pinching harder than the no-name brand!
I'm so glad git won the dvcs war. There was a solid decade where mercurial kept promoting itself as "faster than git*†‡" and every time I tried it wound up being dog slow (always) or broken (some of the time). Git is fugly but it's fast, reliable, and fugly, and I can work with that.
> I'm so glad git won the dvcs war. There was a solid decade where mercurial kept promoting itself as "faster than git".
It wasn't the Mercurial team saying it was faster than Git; that was Facebook, after contributing a bunch of patches and testing Mercurial on their very large monorepo in 2014 [1]:
For our repository, enabling Watchman integration has made Mercurial’s status command more than 5x faster than Git’s status command. Other commands that look for changed files–like diff, update, and commit—also became faster.
In fact they liked Mercurial so much they essentially cloned it to create their own dvcs, Sapling [2]. (An aside: Facebook did all of this because it was taking too long getting new engineers up to speed with Git. Shocker.)
Today, most of the core of Mercurial has been rewritten in Rust; when Facebook did their testing, Mercurial was nearly 100% Python. That's where the "Mercurial is slow" thing came from; launching a large Python 2.x app took a while back in the day.
I was messing with an old Mercurial repo recently… it was like a breath of fresh air. If I can push to GitHub using Mercurial… sign me up.
You can push to GitHub using Sapling. I wish Sapling open source was given more love, as the experience for non-Facebookers is subpar. No bash completion out of the box, no distro packages, no good help pages, random issues interacting with a Git repo...
Sounds like what my teachers used to say: “a personal problem”. Literally nobody outside FB knows what they’re missing and until they fix that, literally nobody cares.
No, the "hg is fast" marketing claim that retreated to "hg is Big-O fast and you are dumb for caring about constant terms and factors even if they clearly dominate your use case" predates 2014 and the Facebook patches. These talking points were old in 2010. Mercurial was always dog slow and always gaslighting about it.
I'm glad BigCo made tools to serve their needs, but their needs aren't my needs or most people's needs.
> Mercurial has been rewritten in Rust
I'm glad they saw the light eventually! Ditto for the rest of the Rust Tooling Renaissance.
What is kind of funny here is that you're right locally. At the same time, the larger tech companies (Meta and Google, specifically) ended up building off of hg and not git because (at the time, especially) git cannot scale up to their use cases. So while the git CLI was super fast, and the hg CLI was slow, "performance" means more than just CLI speed.
I was never a fan of hg either, but now I can use jj, and get some of those benefits without actually using it directly.
>At the same time, the larger tech companies (Meta and Google, specifically) ended up building off of hg and not git because (at the time, especially) git cannot scale up to their use cases.
Fun story: I don't really know what Microsoft's server-side infra looked like when they migrated the OS repo to git (which, contrary to the name, contains more than just stuff related to the Windows OS), but after a few years they started to hit some object scaling limitations where the easiest solution was to just freeze the "os" repo and roll everyone over to "os2".
They wrote something that allowed them to virtualize Git -- can't remember the name of that. But it basically hydrated files on-demand when accessed in the filesystem.
The problem was I think something to do with like the number of git objects that it was scaling to causing crazy server load or something. I don't remember the technical details, but definitely something involving the scale of git objects.
Probably a lot of Googlers don't know. It's ancient history, was called google3 even in 2006 when I first joined.
google1 = code written by Larry, Sergey and employee number 1 (Craig). A hacky pile of Python scripts, dumped fairly quickly.
google2 = the first properly engineered C++ codebase. Protobufs etc were in google2. But the build system was some jungle of custom Makefiles, or something like that. I never saw it directly.
google3 = the same code as google2 but with a new custom build system that used Python scripts to generate Makefiles. I suppose it required a new repository so they could port everything over in parallel with code being worked on in google2. P4 was apparently not that great at branches and google3 didn't use them. Later the same syntax for the build files was kept but turned into a new language called Starlark, and the Makefile generator went away in favor of Blaze, which directly interpreted them.
Yes, the server is based on Perforce, called Piper, but the CLI is based on mercurial. So locally you're doing hg and then when you create a CL, it translates it into what p4 needs.
Right, and I'm glad there are projects serving The Cathedral, but I live in The Bazaar so I'm glad The Bazaar won.
The efforts to sell priest robes to fruit vendors were a little silly, but I'm glad they didn't catch on because if they had caught on they no longer would have been silly.
I might be the outlier, but am I the only one who doesn't care much about the speed of git?
I've been using git since 2011 as my main VCS for personal and professional work as a freelance contractor. Whenever I "wait" for git, it is either limited by bandwidth (git clone) or by the number of commit hooks that I implemented for linting, verification, etc. The percentage of time actually spent in git's internal execution must be a tiny fraction of my day-to-day usage. What IS affecting me (and the teams I work in) is usability and UX. I.e., when people screw stuff up (no matter whether in git or mercurial), we spend far more time fixing that; I don't think the implementation speed would matter here.
The only case I can imagine is when doing a full checkout of a big repo, but even there, there is --depth which is quite practical.
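For example (made-up URL), a shallow clone that only fetches the latest commit looks like:

    git clone --depth 1 https://example.com/big-repo.git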
Isn't it kind of like how you don't care much about the oxygen content of the air around you, but you'd miss it if it was gone? I've done development with Mercurial, simple processes were irritatingly slow, particularly if you stray from the better-supported opinionated path.
I spent a long time educating teams of developers about git's usability quirks. I don't do that as much anymore - partly because the quirks have been worked out, partly because the developers have better guardrails and resources to learn from.
This whole time (the past 15 years) git has been getting faster without most of us noticing, because big companies have been investing in speeding it up. The reason you don't notice or care is that they work on a very different scale. Thousands of users, thousands of PRs per day, millions of CI/CD jobs all hitting the repo.
Now the cycle is repeating again because these numbers are shooting through the roof because of agentic coding.
Mercurial's model is different enough from Git's that the things you list don't make sense there.
Rebase does not make sense in Mercurial because it has the concept of fixed branches. A commit is permanently linked to the branch on which it was made. So you are supposed to use merges.
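A minimal sketch of what that looks like, with a made-up branch name; the branch is recorded in the commit itself:

    hg branch feature-x                  # name the branch for the next commit
    hg commit -m "start feature"
    hg log -r . --template "{branch}\n"  # prints "feature-x", permanently stored in the commit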
I know. It's an opinion about how to develop that a lot of people hold - a declining proportion, mind you, like Mercurial's declining market share - and it's one that they're able to represent in Git's model, with Git's features. They're even able to do it without exposing me to it. But the same isn't true in reverse. Strictly superior?
Believe me, I tried to have an open mind about it. Then one day I was getting ready to go on a work trip with a half-finished feature on my work laptop, and realised there was simply no in-model way for backing that wip up to the repo. If I lost my laptop, I lost the progress. mercurial-scm fails at SCM.
>in-model way for backing that wip up to the repo.
That is because you have this notion of a "clean history" (which IIUC prevented you from making this permanent wip commit), which in reality does not have a lot of use. For most projects, a "useful history" or "real history" is better than a "clean" one.
> one that they're able to represent in Git's model, with Git's features. They're even able to do it without exposing me to it. But the same isn't true in reverse. Strictly superior?
not sure what you mean to say, but for thoroughness' sake, no: git and mercurial concepts are not interchangeable, with git having mostly an inferior model.
To give examples: git has no concept of branching (in the way every VCS but Git uses the term). A branch in git is merely a tag on the tip of a series, meant to signify that all ancestors belong to the same lineage. This comes with the implication that this lineage information is totally lost when two branches merge (you can't tell which side of the merge corresponded to which lineage). The ugly and generalised workaround is to abuse the commit message (e.g. "merge feat-ABC into main") to store an essential piece of the repository history that the VCS itself cannot track.
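For illustration (feat-abc is a made-up branch name, and this assumes the ref isn't packed): a git branch is literally a file holding a commit hash, and the only durable record of lineage after a merge is whatever lands in the merge commit's message:

    cat .git/refs/heads/main     # just a 40-character commit hash, nothing more
    git merge --no-ff feat-abc   # default message "Merge branch 'feat-abc'" is the only lineage record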
Another example is phasing: mercurial records at commit level whether it was exchanged with others or not. That draws a clean line between history that's always safe to rewrite, and history that is subject to conflicting merges if the person you shared those commits with also happened to rewrite it on their end.
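Roughly how phases surface in day-to-day use (a sketch, not exhaustive):

    hg phase -r .   # "draft" while the commit only exists locally
    hg push
    hg phase -r .   # now "public": hg refuses to amend or rebase it unless you force the phase back to draft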
> Then one day I was getting ready to go on a work trip with a half-finished feature on my work laptop, and realised there was simply no in-model way for backing that wip up to the repo. If I lost my laptop, I lost the progress. mercurial-scm fails at SCM.
Sorry to be blunt, but that's a skill issue: hg is no different than every other VCS in that regard. If you want your WIP changes to leave your laptop, you've got to push them somewhere, just like you would in git.
I'd like to clear up some inaccuracies in your response:
- rebasing in Mercurial simply means chopping a subtree off of the history and re-attaching it to a different parent commit. In that sense, rebasing is a very useful and common history-rewriting operation. In fact, it's even simpler and more powerful/versatile than in git, because mercurial couldn't care less if the sub-tree you are rebasing belongs to a branch or not: it's just a DAG. It gets transplanted from A to B (see the sketch after this list). A may or may not be your checked-out commit, or be the tip of a branch, doesn't matter.
- that mercurial requires a configuration toggle before rebasing can be used (i.e. that the user need to enable the extension explicitly) is a way to encourage interested users to learn their tool, and grow its capabilities together with their knowledge. It's opinionated, it may be too much hand-holding for some, but there is an elegant simplicity in keeping the help pages and autocomplete commands just as complex as the user can take it.
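A minimal sketch of both points; the revision numbers (42 and 7) are just placeholders:

    # ~/.hgrc -- the rebase extension ships with Mercurial, it just has to be switched on
    [extensions]
    rebase =

    hg rebase -s 42 -d 7   # transplant the whole subtree rooted at rev 42 onto rev 7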
Sure, but since commits have a branch attribute attached to them, "rebasing" does not appear to be "first class". It is something that has to be bolted on with an extension.
> because mercurial couldn't care less if the sub-tree you are rebasing belongs to a branch or not
IIUC Git also does not care much about the rebase target being a "branch".
I agree that Mercurial provides more value out of the box than git because it preserves branch info in commits.
I can live with Git because Git is "enough" if used carefully and after coming to terms with the non-intuitive UI.
> Sure, but since commits have a branch attribute attached to them, "rebasing" does not appear to be "first class".
Again, that's orthogonal: you may or may not use "named branches" (the kind of which persists at commit level), rebasing works either way consistently and predictably.
> It is something that has to be bolted on with an extension.
The extension ships in core, UX is why it's not enabled by default.
> IIUC Git also does not care much about the rebase target being a "branch".
Indeed, it's just that things likely get weird (for no good reason) when you don't (detached head, "unreachable" commits)
> I can live with Git because Git is "enough" if used carefully and after coming to terms with the non-intuitive UI.
That's our sad state of affairs. JJ helps a bit, though.
It doesn't seem to support Mercurial though (not to imply that you were implying that it did). All I can find is this proxy/mirror thing to integrate it by presenting the Mercurial repo as a Git server:
https://peterlavalle.github.io/post/forgejo-actions/
Whatever your opinion on one tool or another might be - it does seem weird that the "market" has been captured by what you are saying is a lesser product.
So far you've only gotten responses to "how can a worse product win?", and they are valid, but honestly the problem here is that Mercurial is not a better product in at least one very important way: branches.
You can visit any resource about git and branches will have a prominent role. Git is very good at branches. Mercurial fans will counter by explaining one of the several different branching options it has available and how it is better than the one git has. They may very well be right. It also doesn't matter, because the fact that there's a discussion about what branching method to use really just means Mercurial doesn't solve branches.

For close to 20 years the Mercurial website contained a guide that explained only how to have "branches" by having multiple copies of the repository on your system. It looks like the website has now been updated: it doesn't have any explanation about branches at all that I can find. Instead it links to several different external resources that don't focus on branches either. One of them mentions "topic", introduced in 2015. Maybe that's the answer to Git's branching model. I don't care enough to look into it. By 2015 Git had long since won.
Mercurial is a cool toolbox of stuff. Some of them are almost certainly better than git. It's not a better product.
This is so strange, because, at a low level, a branch isn't even a "thing" in git. There is no branch object type in git, it's literally just a pointer to a commit, functionally no different from a tag except for the commands that interact with it.
Meanwhile mercurial has bookmarks. TBF I'm not sure when it got those but they've been around forever at this point. The purpose is served.
I think there are (or perhaps were) some product issues regarding the specifics of various workflows. But at least some of that is simply the inertia of entrenched workflows and where there are actual downsides the (IMO substantial) advantages need to be properly weighed against them.
Personally I think it just comes down to the status quo. Git is popular because it's popular, not because it's noticeably superior.
> I think there are (or perhaps were) some product issues regarding the specifics of various workflows.
I love jumping in discussions about git branching, because that's a very objective and practical area where git made the playing field worse. Less and less people feel it, because people old-enough to have used branch-powered VCSes have long forgotten about them, and those who didn't forget are under-represented in comparison to the newcomers who never have experienced anything else since git became a monopoly.
Yes, every commit is prefixed with the branch name. Because, unlike mercurial, git is incapable of storing this in its commit metadata. That's ridiculous, that's obscene, but that's the easiest way to do it with git.
Just because there is one project apparently using this in a way that indicates someone could perceive something as a weakness... It doesn't mean it's a real weakness (nor that it's serious).
You can just not move branches. But once you can do it, you will like it. And you are going to use:
git branch --contains COMMIT
which will tell you ALL the branches a commit is part of.
Git's model is clean and simple, and makes a whole lot of sense. IMHO.
> Less and less people feel it, because people old-enough to have used branch-powered VCSes have long forgotten about them, and those who didn't forget are under-represented in comparison to the newcomers who never have experienced anything else since git became a monopoly.
I'm old enough to have used SVN (and some CVS) and let me tell you branching was no fun, so much that we didn't really do it.
To me mercurial's branching is closer to the development process and preserves more information, because it records the original branch a commit was made on.
Git does not have such concept. That is a trade off and that trade off works great for projects managed like Linux kernel. But for smaller projects where there is a limited number of people working, the information preserved by mercurial could be very valuable.
It also had some really interesting ideas like changeset evolution, which enabled history rewriting after a branch has been published. I don't know its current status or how well it turned out.
Just FTR - git /can/ store that information, but it requires human input.
If you rebase the feature branch onto the main branch, THEN follow it up with a merge commit that records the branch name, you store the branches (that have been made a part of main) and can see where they are in your log.
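Assuming a feature branch named feature-x (a hypothetical name), that workflow looks roughly like:

    git checkout feature-x
    git rebase main              # replay the feature's commits on top of main
    git checkout main
    git merge --no-ff feature-x  # force a merge commit; its default message records the branch name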
Mercurial's notes can become cumbersome if there are a large number in the repository, but, obviously, humans can sort that out if it gets out of hand
It's interesting that branches, which is a marquee feature of git, became less important at the same time as git ate all the other vcs. Outside of OS projects, almost all development is trunk based with continuous releases.
Maybe branching was an important reason to adopt git but now we'd probably be ok with a vcs that doesn't even support them.
Not sure if that's true. I mean, I do agree with the core of it, but how do you even do PRs and resolve conflicts if there are no branches and a developer cannot efficiently update their code against the latest (remote) version of the master branch?
Trunk based development has every developer in the company committing straight to main - no PRs, supposedly no merge conflicts (but the reality is that main moves fast, and if two people are working in the same files, there will be merge conflicts).
A middle ground is small PRs where people are constantly rebasing to the tip of main to keep conflicts to a minimum
Trunk based development is still a hotly debated topic. I personally prefer branches at this point in time; trunk based development has caused me more trouble than its claimed worth in the past, BUT that could be a me limitation rather than a limitation of the style.
Worse products win all the time. Inertia is almost impossible to overcome. VHS vs Betamax is a classic. iPod wasn’t the best mp3 player but being a better mp3 player wasn’t enough to claw market share.
Google and Meta don’t use Git and GitHub. Sapling and Phabricator are much, much better (when supported by a massive internal team).
I mean, in the fickle world that is TECH, I am struggling to believe that that's what's happened.
I personally went from .latest.latest.latest.use.this (naming versions as latest) to TortoiseSVN (which I struggled with) to Git (where I was also one of those "walk around with a few memorised commands" people who don't actually know how to use it) to reading the fine manual (well, 2.5 chapters of it) to being an evangelist.
I've tried Mercurial, and, frankly, it was just as black magic as Git was to me.
That's network effects.
But my counter is - I've not found Mercurial to be any better, not at all.
I have made multiple attempts to use it, but it's just not doing what I want.
And that's why I'm asking, is it any better, or not.
Mercurial has a more consistent CLI, a really good default GUI (TortoiseHg), and the ability to remember what branch a commit was made on. It's a much easier tool to teach to new developers.
Hmm, that feels a bit subjective - I'm not going to say X is easier than Y when I've just finished saying that I found both tools to have a lot of black magic happening.
But what I will point out, for better or worse, is that people are now looking at LLMs as Git masters, which is effectively making the LLM the UI, which is going to have the effect of removing any assumed advantage of whichever is the "superior" UX.
I do wish to make absolutely clear that I personally am not yet ready to completely delegate VCS work to LLMs - as I have pointed out, I have what I like to think of as an advanced understanding of the tools, which affords me the luxury of not having an LLM shoot me in the foot; that is solely reserved as my own doing :)
Network effects are significantly strengthened by the necessary user buy-in. Version control is hard, and every tool demands that its users spend a non-trivial amount of time learning it. I would guess the time to move from black magic to understanding most of git is ~100h for most people.
The thing is, to understand which one is actually better, you would have to give the same amount of investment in the second tool, which is not something most people are willing to do if the first tool is "good enough". That's how Python became the default programming language; people don't miss features they do not understand.
A little over a decade ago, with only svn experience, I tried both mercurial and git. There was something about how mercurial handled branches that I found extremely confusing (don't remember what), while git clicked immediately - even without reading the manual.
"better" in that sentence is very specific. Worse is also worse, and if you're one of the people for whom the "better" side of a solution doesn't apply, you're left with a mess that people celebrate.
Not always, but in this case the superior product (i.e. VHS) won. At initial release, Beta could only record an hour of content, while VHS could record 2 hours. Huge difference in functionality. The quality difference was there, but pretty modest.
I suppose one lesson could be that there are different dimensions of superiority, different products may be superior in different ways.
Of course, products also can win market dominance for reasons external to the product's quality itself (marketing, monopoly lock-in, other network effects, consumer preferences on something other than product quality itself, etc).
> The issue is solely that OG Mercurial was written in Python.
Are we back to "programming language X is slow" assertions? I thought those had died long ago.
Better algorithms win over 'better' programming languages every single time. Git is really simple and efficient. You could reimplement it in Python and I doubt it would see any significant slowness. Heck, git was originally implemented as a handful of low level binaries stitched together with shell scripts.
Every time I've rewritten something from Python into Java, Scala, or Rust it has gotten around ~30x faster. Plus, now I can multithread too for even more speedups.
Python is absurdly slow - every method call is a string dict lookup (slots are way underused), everything is all dicts all the time, the bytecode doesn't specialize at all to observed types; it is a uniquely, horribly slow language.
I love it, but python is almost uniquely a slow language.
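A rough sketch of the attribute-lookup overhead being described; the class names are made up for illustration, and the numbers vary a lot between CPython versions, so treat it as illustrative rather than a benchmark:

    import timeit

    class WithDict:
        def __init__(self):
            self.x = 1

    class WithSlots:
        __slots__ = ("x",)  # attributes live in fixed slots instead of a per-instance dict
        def __init__(self):
            self.x = 1

    d, s = WithDict(), WithSlots()
    print(timeit.timeit(lambda: d.x, number=1_000_000))  # dict lookup on every access
    print(timeit.timeit(lambda: s.x, number=1_000_000))  # slot access, typically somewhat faster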
Algorithms matter, but if you have good algorithms, or you're already linear time and just have a ton of data, I've seen 500x speedups from rewriting a single-threaded Python program as a multithreaded Rust program, where the algorithms were not improved at all.
It's the difference between a program running overnight vs. in 30 seconds. And if there are problems, the iteration speed from that is huge.
To be fair, Python as implemented today is horribly slow. You could leave the language the same but apply all the tricks and heroic efforts they used to make JavaScript fast. The language would be the same, but the implementations would be faster.
Of course, in practice the available implementations are very much part of the language and its ecosystems; especially for a language like Python which is so defined by its dominant implementation of CPython.
Fair! I guess I didn't mean language as such, but as used.
But a lot of the monkey-patching kind of things and dynamism of python also means a lot of those sorts of things have to be re-checked often for correctness, so it does take a ton of optimizations off the table. (Of course, those are rare corner cases, so compilers like pypy have been able to optimize for the "happy case" and have a slow fall-back path - but pypy had a ton of incompatibility issues and now seems to be dying).
Python has a JIT compiling version in GraalPy. If you have pure Python it works well. The problem is, a lot of Python code is just callouts to C++ ML libs these days and the Python/C interop boundary just assumes you're using CPython and requires other runtimes to emulate it.
You don't even need to go all V8, you could just build something like LuaJIT and get most of the way there. LuaJIT is like 10k LOCs and V8 is 3M LOC.
The real reason is that it is a deliberate choice by the CPython project to prefer extensibility and maintainability to performance. The result is that python is a much more hackable language, with much better C interop than V8 or JVM.
I've rewritten a python tool in go, 1:1. And that turned something that was so slow that it was basically a toy, into something so fast that it became not just usable, but an essential asset.
Later on I also changed some of the algorithms to faster ones, but their impact was much lower than the language change.
I don’t know if people think this way anymore, but Python gained traction to some degree as a prototyping language. Verify the logic and structures, then implement the costly or performance-sensitive bits in a more expensive-to-produce, more performant language.
Which is only to say: that rewrite away from python story can also work to show python doing its job. Risk reduction, scaffolding, MVP validation.
> git was originally implemented as a handful of low level binaries stitched together with shell scripts.
A bunch of low level binaries stitched together with shell scripts is a lot faster than python, so not really sure what the point of this comparison is.
Python is an extremely versatile language, but if what you're doing is computing hashes and diffs, and generally doing entirely CPU-bound work, then it's objectively the wrong tool, unless you can delegate that to a fast, native kernel, in which case you're not actually using Python anymore.
Well, you can and people do use Python to stitch together low level C code. In that sense, you could go the early git approach, but use Python instead of shell as the glue.
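A tiny sketch of that style: the hashing (the hot part of anything git-like) happens in hashlib's C code, and Python is just the glue. The function name is mine, but the blob-hash scheme is the one git actually uses:

    import hashlib

    def git_blob_hash(data: bytes) -> str:
        # git hashes blobs as sha1("blob <size>\0" + contents)
        header = f"blob {len(data)}\0".encode()
        return hashlib.sha1(header + data).hexdigest()

    print(git_blob_hash(b"hello world\n"))  # same hash `git hash-object` prints for this content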
Their point was that by offloading the bottlenecks to C, you've essentially conceded that Python isn't fast enough for them, which was the original point made above
Python is by far the slowest programming language, an order of magnitude slower than other languages
One of the reason mercurial lost the dvcs battle is because of its performance - even the mercurial folks admitted that was at least in part because of python
You barely have to try to have Python be noticeably slow. It's the only language I have ever used where I was even aware that a programming language could be slow.
> Are we back to "programming language X is slow" assertions? I thought those had died long ago.
Yes we are? The slow paths of mercurial have been rewritten in C (and more recently in Rust) and improved the perf story substantially, without taking away from the wild modularity and extensibility hg always had.
> You could reimplement it in Python and I doubt it would see any significant slowness
I doubt it wouldn't be significantly slower. I can't disprove it's possible to do this but it's totally possible for you to prove your claim, so I'd argue that the ball is in your court.
You must belong to the club of folks who use hashmaps to store 100 objects. It's amazing how much we've brainwashed folks to focus on algorithms and lose sight of how to actually properly optimize code. Being aware of how your code interacts with cache is incredibly important. There are many cases of using slower algorithms to do work faster purely because it's more hardware friendly.
The reason that some more modern tools, like jj, really blow git out of the water in terms of performance is that they make good choices, such as doing a lot of transformations entirely in memory rather than via the filesystem. It's also that they're written in a language that can execute efficiently. Luckily, it's clear that modern tools like jj are heavily inspired by mercurial, so we're not doomed to the UX and performance git binds us with.
> You must belong to the club of folks who use hashmaps to store 100 objects.
Apparently I belong to the same club -- when I'm writing AWK scripts. (Arrays are hashmaps in a trenchcoat there.) Using hashmaps is not necessarily the indictment you apparently think it is, if the access pattern fits the problem and other constraints are not in play.
> It's amazing how much we've brainwashed folks to focus on algorithms and lose sight of how to actually properly optimize code. Being aware of how your code interacts with cache is incredibly important.
By the time you start worrying about cache locality you have left general algorithmic concerns far behind. Yes, it's important to recognize the problem, but for most programs, most of the time, that kind of problem simply doesn't appear.
It also doesn't pay to be dogmatic about rules, which is probably the core of your complaint, although unstated. You need to know them, and then you need to know when to break them.
Most code most people work on isn't about algorithms at all. The most straightforward algorithm will do. Maybe put some clever data structure somewhere in the core. But for the vast majority of code, there isn't any clear algorithmic improvement, and even if there was, it wouldn't make a difference for the typically small workloads that most pieces of code are processing.
I'll take it back a little bit, because there _is_ in fact a lot of algorithmically inefficient code out there, which slows down everything a lot. But after getting the most obvious algorithmic problems out of the way -- even a log-n algorithm isn't much of an improvement over a linear scan if n < 1000. It's much more important to get that 100+x speedup by implementing the algorithm in a straightforward and cache-friendly way.
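If you'd rather measure than argue about the cutoff, here's a rough timeit harness (the sizes are arbitrary, and CPython's boxed objects muddy the cache-locality story compared to C, so treat the numbers as ballpark):

    # Rough harness: linear scan vs. binary search at various sizes.
    # Sizes are arbitrary; in CPython, lists hold pointers to boxed objects,
    # so cache-locality effects are weaker than they would be in C or Rust.
    import bisect
    import random
    import timeit

    def bench(n, lookups=1000):
        data = sorted(random.sample(range(10 * n), n))
        needles = [random.choice(data) for _ in range(lookups)]

        def linear():
            for x in needles:
                _ = x in data  # O(n) scan over a contiguous list

        def binary():
            for x in needles:
                i = bisect.bisect_left(data, x)  # O(log n)
                _ = i < len(data) and data[i] == x

        t_lin = timeit.timeit(linear, number=20)
        t_bin = timeit.timeit(binary, number=20)
        print(f"n={n:>6}  linear={t_lin:.3f}s  binary={t_bin:.3f}s")

    for n in (10, 100, 1000, 10000):
        bench(n)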
My core complaint is that folks repeat best practices without understanding them. It's simple to provide API semantics that look like a map without actually resorting to a hashmap underneath. I fear Python-style development has warped people's perception for the sake of simplifying the lives of developers. And all users end up suffering as a result.
They died because everyone knows that Python is in fact very, very slow. And that’s just totally fine for a vast number of glue operations.
It’s amusing you call Git fast. It’s notoriously problematic for large repos such that virtually every BigTech company has made a custom rewrite at some point or another!
Now that is interesting too, because git is very fast for all I have ever done. It may not scale to Google monorepo size, it would be the wrong tool for that. But if you are talking Linux kernel source scale, it absolutely is fast enough even for that.
For everything I've ever done, git was practically instant (except network IO of course). It's one of the fastest and most reliable tools I know. If it isn't fast for you, chances are you are on a slow Windows filesystem additionally impeded by a virus scanner.
The fact that Git has an extremely strong preference for storing full and complete history on every machine is a major annoyance! “Except for network IO” is not a valid excuse imho. Cloning the Linux kernel should take only a few seconds. It does not. This is slow and bad.
The mere fact that Git is unable to handle large binary files makes it an unusable tool for literally every project I have ever worked on in my entire career.
Takes 21 seconds on my work laptop, indeed a corporate Windows laptop with antivirus installed. The majority of that time is simply network I/O. The cloned repository is 276 MB.
Actually checking the kernel out takes 90 seconds. This amounts to creating 99195 individual files, totaling 2 GB of data. Expect this to be ~10 times faster on a Linux file system.
--depth=1 is a hack and breaks assorted things. It’s irritating. No, I can’t tell you what random rakes I’ve stepped on in the past because of this. Yes, they still exist.
If you’d like to argue that version control should be centralized, shallow, and sparse by default then I agree.
Then just do git pull --unshallow whenever you see fit. I normally don't do --depth 1 because cloning repositories is rarely my bottleneck. Just saying that when you need a relatively fast clone time, you can have it.
Git LFS is a gross hack that results in pain and suffering. Effectively all games use Perforce because Git and GitLFS suck too much. It’s a necessary evil.
Define "large"; I've never run into serious performance issues during the ~15 years I've used Git, which either means the projects I've worked on aren't actually large large, or Git is fast enough for most use cases.
not OP, and indeed git is fast enough in many cases, but git not cutting it at Google and Facebook scale, combined with the versatility of mercurial (monkeypatching and its extension system), was the reason they both invested heavily in mercurial instead of git.
Among the tricks used was remotefilelog, a way to "hydrate" content locally on demand, which git mimicked many years later with Microsoft's VFS for Git. The same goes for binary/large files, which git eventually got as git-lfs.
It's funny to think that a big reason for git to be "fast" today is by playing catch-up with mercurial, which carries this "forever stigma" of being slow.
AI, automation, and globalization would all be uncontroversially brilliant if the benefits weren't distributed like "150% of net benefit to capital, -50% net benefit to labor, better hope some of it trickles down brokie!"
Ownership of the economy is split roughly 30/30/30 between the top 10%/1%/.1% with the bottom 90% of people making an entrance as the rounding error. If you picture "the owners" by drawing a representative sample of 10 people:
1 Normal person
3 Doctors / Lawyers / Engineers ($1M+ net worth)
3 Successful Entrepreneurs ($10M+ net worth)
3 Ultrawealthy ($50M+ net worth)
It's worth putting these through the fundamental theorem of capitalism (rich people get paid for being rich in proportion to how rich they are) to solve for passive income from asset appreciation. Plugging in the crude figure of 10%/yr (feel free to bring your own rate):
1 Normal person
3 Professional ($100k+/yr passive)
3 Successful ($1M+/yr passive)
3 Wealthy ($5M+/yr passive)
You get your incentives where you get your money. Most people get most of their money from working, but the wealthy get most of their money and incentives from the assets they own. In between it's in between.
Are the in-betweeners part of the problem? Sure, but we have a foot on either side of the problem. We could get hype for many of the plausible solutions to aggregate labor oversupply (e.g. shorter workweeks) even if it meant our stocks went down. Not so for 6/10 people in that sample. The core problem is still that the economy is mostly inhabited by people who work for a living but mostly owned by people who own things for a living and all of the good solutions to the problem require rolling that back a little against a backdrop that, absent intervention, stands to accelerate it a lot.
EDIT: one more thing, but it's a big one: the higher ends of the wealth ladder have the enormous privilege of being able to engage in politics for profit rather than charity/obligation. A 10% chance of lobbying into place a policy that changes asset values by 10% is worth roughly $10k to a "Professional", $500k to a "Wealthy", and billions to Elon Musk. The fact that at increasing net worths politics becomes net profitable, and then so profitable that you can hire entire organizations of people to pursue it, means the upper edge of the distribution punches above its already-outsized weight in terms of political influence. It goes without saying that their brand of politics is all about pumping assets.
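For anyone who wants to check the arithmetic, here's the back-of-the-envelope version with the assumptions spelled out (the 10%/yr return and the 10% chance of a 10% move are the crude figures from above, not data; net worths are the low end of each bucket):

    # Back-of-the-envelope math for the figures above. The return rate and
    # lobbying odds/effect are the crude assumptions stated in the comment,
    # not data; net worths are the low end of each bucket.
    RETURN_RATE = 0.10    # assumed annual return on assets
    LOBBY_ODDS = 0.10     # assumed chance the lobbying succeeds
    LOBBY_EFFECT = 0.10   # assumed change in asset values if it does

    buckets = {
        "Professional": 1_000_000,
        "Successful": 10_000_000,
        "Wealthy": 50_000_000,
    }

    for label, net_worth in buckets.items():
        passive = net_worth * RETURN_RATE
        lobby_ev = net_worth * LOBBY_ODDS * LOBBY_EFFECT  # expected value of the bet
        print(f"{label:>12}: ~${passive:,.0f}/yr passive, lobbying EV ~${lobby_ev:,.0f}")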
There's more to say, of course. The role of housing, the role of the government, using DCFs for apples-to-apples comparisons of assets, jobs, social services, and the incentives thereof, behavioral economics, and so on. If you reflexively recoil at the notion that assets have returns, however, you aren't even at the starting line.
> mentioning passive income in this context isn't even idiotic, it's a clinical diagnosis
We could use the IRS term if you prefer: "unearned income"
Holy shit how is this the first time I am hearing about this? This should not be my first time hearing about this.
> Suchir Balaji (November 21, 1998 – November 26, 2024) was an American artificial intelligence researcher who was found dead one month after accusing OpenAI, his former employer, of violating United States copyright law -Wikipedia
There was a Tucker Carlson interview with Sam Altman where Tucker probed him on Balaji's murder and Sam quickly got confused and disoriented. Draw your own conclusions.
Saying he got confused and disoriented is an interesting conclusion to draw from that interview. He was defensive from the outset and even turned combative when Carlson continued down a specific line of questioning, which he allegedly did at the request of the victim's family.
Yes, and it's not too late! Plus, sama is one of the only ultra rich I've heard talk about policies that could actually help society cope with reduced aggregate labor demand.
But when I look at how the US handled previous rounds of globalization and automation, I have very sober expectations for our ability to pursue the "happy path." Still, one has to try.