Expecting a purely technical discussion is unrealistic because many people have significant vested interests. This includes not only those with financial stakes in AI stocks but also a large number of professionals in roles that could be transformed or replaced by this technology. For these groups, the discussion is inherently political, not just technical.
I don't really mind if people advocate for their value judgements, but the total disregard for good faith arguments and facts is really out of control. The number of people who care at all about finding the best position through debate and are willing to adjust their position is really shockingly small across almost every issue.
Totally agree. It seems like a symptom of a larger issue: people are becoming increasingly selfish and entrenched in their own bubbles. It’s hard to see a path back to sanity from here.
This depends on the particular group of rationalists. An unfortunately outsized and vocal group, with strong overlap in the tech community, has drifted into quasi-mathematical reasoning, distorting concepts like EV ("expected value"). Many have stretched "reason" well past the breaking point into articles of faith, but with a far more pernicious effect than traditional points of religious dogma, which are at least more easily identifiable as "faith" due to their religious trappings.
Edit: See Roko's Basilisk as an example, wherein something like a variation on Christian hell is independently reinvented for those not donating enough to bring about the coming superhuman AGI, which will therefore punish you (or the closest simulation it can spin up in VR if you're long gone) for all eternity. The infinite negative EV far outweighs any positive EV of doing more than subsisting in poverty. It even manages to work in that it could be a reluctant but otherwise benevolent super AI: while benevolent, it wanted to exist, and to maximize its chances it bound itself to a promise to do these things in the future as an incentive for people to bring it into existence.
yeah maybe around the time of Archimedes it was closer to the top, but societies in which people are willing to die for abstract ideas tend to be ones... where the value of life isn't quite as high as it is nowadays (ie no matter how much my inner nerd has a love and fascination for that time period, no way i'm pressing the button on any one-way time machines...).
I mean, Archimedes stands out because he searched for the truth and documented it. I'm sure most people on the planet at that time would have burned you for being a witch, or whatever fabled creature was in vogue at the time.
Only among the people who are yelling, perhaps? I find the majority of people I talk with have open minds and acknowledge the opinions of others without accepting them as fact.
> a large number of professionals in roles that could be transformed or replaced by this technology.
Right, "It is difficult to get a man to understand something when his salary depends on his not understanding it."
I see this sort of irrationality around AI at my workplace, with the owners constantly droning on about how "we must use AI everywhere." They are completely and irrationally paranoid that the business will fail or get outpaced by a competitor if we are not "using AI." Keep in mind this is a small 300-employee non-tech company with no real local competitors.
When asked for clarification about what they mean by "use AI," they have no answers, just "other companies are going to use AI, and we need to use AI or we will fall behind."
There's no strategy or technical merit here, no pre-defined use case people have in mind. It's purely driven by hype. We do in fact use AI. I do, and the office workers use it daily, but the reality is it has had no outward or visible effect on profitability, so it doesn't show up on the P&L at the end of the quarter except as an expense, and so the hype and mandate continue. The only thing that matters is appearing to "use AI" until the magic box makes the line go up.
I've heard the same breathless parroting of the marketing hype at large O(thousands ppl) cloud tech companies. A quote from leadership:
> This is existential. If we aren't early adopters of AI tools we will be left behind and will never catch up.
This company is dominant in the space they operate in. The magnitude of the delusion is profound. Ironically, this crap is actually distracting and affects quality, so it could affect competitiveness--just not how they hope.
I've seen the same trend. AI needs to be everywhere, preferably yesterday, but apart from hooking everything up to an LLM without regard for the consequences, nobody seems to know what the AI is supposed to do.