If something is a bubble, that does not mean it has no value. It just means there is over-investment and thus inefficient investment. Take the housing bubble: nobody argues that houses are not needed or that they are not a big part of the economy.
"the old economy had changed forever and was dead."
Ahem - what is the difference compared to now? Weren't programmers supposed to be obsolete 6 months ago, with nobody working anymore, so we'd need UBI?
However, your point that if everybody thinks there is a bubble, then there is none, is valid. Ironically, your whole post undermines this point, and you are not alone in your analysis. General bubble wisdom is not as settled as one might think.
Plus, Alan Greenspan's famous "irrational exuberance" remark was in 1996. And AFAIK by 1999 everybody knew there was a bubble, but it only burst in 2000. On top of that, I have seen overlaid plots of stock prices now and before the dot-com crash suggesting there are still 1-2 years of increases to go.
> Ahem - what is the difference compared to now? Weren't programmers supposed to be obsolete 6 months ago, with nobody working anymore, so we'd need UBI?
You're applying an arbitrary time constraint to the realization of AI's promise in order to rubbish it. This is a logical mistake common among critics: not yet, so never. It doesn't seem as if there is a near limit to the tech's development. Until that changes, the potential for job wipeouts and societal upheaval is real, whether in 5 or 50 years.
Sorry, but that was not my point. My point was to refute the thesis (in the comment I am replying to) that nobody is making grand claims about AI, in contrast to the grand claims made about the internet pre-dot-com. Obviously, in both cases grand claims were/are being made.
So the issue is that 220 V is nominal in China, 230 V is nominal in the EU, and 240 V is nominal in the UK and parts of Australia. Anyone preparing a product for the global market (as most are now) will more likely than not support all of these voltages. Thus it is kind of normal (but wrong) to assume that 220 V sounds like 240 V.
When the voltage was unified in the EU, the nominal value was set to the middle value of 230 V, but its tolerance was raised from 10% to 15%, so that the new maximum value of 230 V + 15% would match the old value of 240 V + 10%.
So now all of the 220/230/240 V standards share the same maximum voltage value used for electrical designs (about 265 V effective), which makes them equivalent, regardless of the name.
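To illustrate the arithmetic, here is a quick sketch of the tolerance bands, using the figures from the comments above (a simplification - the real harmonized standard uses asymmetric tolerances, so treat these ranges as approximate):

```python
# Effective (RMS) voltage ranges for the three legacy nominals,
# per the tolerances described above: old standards at +/-10%,
# harmonized 230 V at +/-15% (an assumed simplification).
nominals = {
    "220 V (old China / continental)": (220, 0.10),
    "240 V (old UK)": (240, 0.10),
    "230 V (harmonized EU)": (230, 0.15),
}

for name, (volts, tol) in nominals.items():
    low, high = volts * (1 - tol), volts * (1 + tol)
    print(f"{name}: {low:.1f} - {high:.1f} V")
```

The key numbers: 230 V + 15% gives 264.5 V, which indeed lines up with the old 240 V + 10% ceiling of 264 V, i.e. the same ~265 V design maximum.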
True, however there is also old equipment. For example, I have heard that light bulbs designed for 220 V will last noticeably less long on a 230 V nominal circuit. That is why it is worth checking the supported voltage. But you are right - newer equipment will support all of these voltages.
Many chargers are now rated 100-240 V, 50-60 Hz, which is close to pluggable anywhere on Earth. (I burned one or two a long time ago, when I forgot to check and used a 120 V transformer here on 220 V.)
Same thing happened to PC PSUs. I don't think there is a recent unit that still has the self-destruct voltage selector switch which pops them if you are in 230V land (and the switch is set to the smaller setting).
"But you say 'if it's not allowed', but not allowed by whom?"
Not allowed by EU law, obviously. The role of courts (in general) is to interpret the law and thus decide how it applies case by case. EU law flows down from the EU treaties that were negotiated and signed by the member countries. The big ones (treaties) also needed to be "ratified" by country-wide referenda.
My interpretation of the parent comment is that we shouldn't just be "themwashing" these powers, and should start placing them under technical scrutiny more often.
So laws are made by people, sometimes retired people, sometimes people.
So it's just another thing allowed by a person. Law isn't something magical with the capability to make something not-okay okay. Law is just someone allowing or forbidding something, with that having been incorporated into a sort of system.
I don't know exactly what you mean. Since we have representative democracy, and since governments enter the treaties and have strong influence over many parliaments, it really is very person-focused in the end, even though it really shouldn't be.
A sensible world would have lots of referendums with the general public approving or disapproving of parliamentary decisions, à la Switzerland, but that is not the world we live in.
I do not feel heartened by this sentence, even though I should be. We're choosing from a pre-curated menu rather than truly "hiring" representatives. The real power lies with party gatekeepers, donors, and institutional barriers that determine who even makes it onto the ballot, not with voters making the final selection. It's more like being asked to pick your favorite from two restaurants that a food critic already chose for you, rather than having genuine choice over where to eat.
And it's not incredibly practical. Instead, those who sit at the head of institutions - whether political parties, governments, etc. - have the real power.
It's a bit like saying "so make your own Facebook", but that's pretty useless as a response to someone who feels that some big social media company is influencing public discourse and harming proponents of certain ideas.
You can't make your own Facebook, or organize a political party other than in response to slow phenomena, and here we're talking about something that has until recently been seen as literally illegal, against the founding principles of the EU. So this is a huge, sudden change which people have no chance of resisting in a representative system.
Your choices are basically made for you by parties and other entrenched organizations. New parties are very unusual.
However, none of that really matters. Democracy, laws, etc. don't make this kind of anti-privacy policy more legitimate. If you create a STASI, it doesn't matter if you do so democratically, and that really is what we are talking about.
With software controlled by others going through the stuff on your phone, you have a level of surveillance beyond the Stasi.
No, fear is not why the Soviet Union allied with the Nazis. The Molotov–Ribbentrop Pact was an agreement in which the Nazis and Soviets divided central/eastern Europe between themselves. They even held a joint parade in Brest (Brześć) after conquering Poland. And yes, they were allies.
They were fighting Japan at the time, were unable to fight a war on two fronts and Britain had at that point chosen to follow a strategy of appeasement towards the Nazis.
And your idea is that they had zero reason to fear invasion from the west? Even though that is precisely what happened just a few years later?
First of all, these were USSR-Japan border skirmishes, not a war. Second, they did not have to worry about Japan, as shown by the Soviet–Japanese Neutrality Pact of April 1941. Third, if they were worried about Japan, then "spending" the army on invasions of Poland, Finland, the Baltics, and Bessarabia was counterproductive. Fourth, by the time of the Molotov-Ribbentrop pact Britain had ceased following the appeasement strategy, as shown by its declaration of war on Germany on 3 September 1939 in fulfillment of the security guarantees given to Poland in March 1939.
It is totally ahistorical to pin the USSR's actions on fear or mere reaction to external events. If WWII was a continuation of WWI (in my opinion, and many others', it was), both Germany and the USSR were revanchist powers that wanted to reverse the outcome of WWI. Many forget that Russia, later the USSR, lost WWI badly. Plus, after his very, very bloody consolidation of power in the '30s, Stalin was ready (in fact, it was imperative for regime stability) to begin outward aggression/expansion.
Furthermore, historians believe that Stalin knew confrontation with Germany was inevitable, but either (the more popular opinion) estimated it would happen at least a year later, or (a less popular, even fringe opinion) was amassing forces to attack Germany himself and was caught by the Nazis with his "pants down". Either scenario would explain the initial successes of Operation Barbarossa.
Fun fact: the last train carrying grain from the USSR to Germany crossed the border a few minutes before the start of Operation Barbarossa.
In summary: the Soviets and Nazis were allies until 1941. Both parties knew it was a tactical alliance, not unlike the USSR's alliance with GB/USA against Germany and, at the very end, Japan. Note that after WWII there was a cold war between the former allies, not unlike the hot war between the former alliance partners, the Nazis and Soviets.
Second fun fact: Orwell's "Oceania was always at war with Eastasia" from 1984 is a direct reference to how alliances changed during WWII.
> First of all, these were USSR-Japan border skirmishes, not a war. Second, they did not have to worry about Japan, as shown by the Soviet–Japanese Neutrality Pact of April 1941
...two years after Molotov Ribbentrop.
If they had nothing to worry about from Japan, it logically follows that they had nothing to worry about from Hitler either, as was shown by the Molotov-Ribbentrop pact.
In 1939 the Soviet military was a disaster, also. It's difficult to overstate just how exposed they were.
> Furthermore, historians believe that Stalin knew confrontation with Germany was inevitable
They were right to be afraid.
> In summary: the Soviets and Nazis were allies until 1941.
In summary: out of fear, which was entirely legitimate. Fun fact: the only difference between them and Finland is that Finland gets excused by its western allies for allying with Hitler out of fear.
Yeah, I almost called that out. Probably should have. GPU/NPU feels new (at least for us folks who could never afford a Cray). Probably the biggest change in the last 20 years, especially if you classify it with other multi-core development.
I am a fan of the Jeff Geerling YouTube series in which he tries to make GPUs (AMD/Nvidia) run on the Raspberry Pi. It is not easy, and he has the Linux kernel source code available to modify. Now imagine all of Qualcomm's clients having to do similar work with their third-party hardware, possibly with no access to the drivers' source code. Then debugging and fixing, for 3 years, all the bugs that pop up in the wild. What a nightmare.
Apple at least has full control over its hardware stack (Qualcomm does not, as it only sells chips to others).
Hardware drivers certainly can be annoying, but a hobbyist struggling to bring big GPUs’ hardware drivers to a random platform is not at all indicative of how hard it would be for a company with teams of engineers. If NVidia wanted their GPUs to work on Raspberry Pi, then it would already be done. It wouldn’t be an issue. But NVidia doesn’t care, because that’s not a real market for their GPUs.
Most OEMs don’t have much hardware secret sauce besides maybe cameras these days. The biggest OEMs probably have more hardware secret sauce, but they also should have correspondingly more software engineers who know how to write hardware drivers.
If Qualcomm moved their processors to RISC-V, then Qualcomm would certainly provide RISC-V drivers for their GPUs, their cellular modems, their image signal processors, etc. There would only be a little work required from Qualcomm’s clients (the phone OEMs) like making sure their fingerprint sensor has a RISC-V driver. And again, if Qualcomm were moving… it would be a sea change. Those fingerprint sensor manufacturers would absolutely ensure that they have a RISC-V driver available to the OEMs.
> If NVidia wanted their GPUs to work on Raspberry Pi, then it would already be done. It wouldn’t be an issue. But NVidia doesn’t care, because that’s not a real market for their GPUs.
It's weird af that Geerling ignores nVidia. They have a line of ARM-based SBCs with GPUs from Maxwell to Ampere. They have full software support for OpenGL, CUDA, etc. For the price of an RPi 5 + discrete GPU, you can get a Jetson Orin Nano (8 GB RAM, 6 A78 ARM cores, 1024 Ampere cores), all in a much better form factor than a Pi + PCIe hat + graphics card.
I get the fun of doing projects, but if what you're interested in is a working ARM based system with some level of GPU, it can be had right now without being "in the shop" twice a week with a science fair project.
“With the PCI Express slot ready to go, you need to choose a card to go into it. After a few years of testing various cards, our little group has settled on Polaris generation AMD graphics cards.
Why? Because they're new enough to use the open source amdgpu driver in the Linux kernel, and old enough the drivers and card details are pretty well known.
We had some success with older cards using the radeon driver, but that driver is older and the hardware is a bit outdated for any practical use with a Pi.
Nvidia hardware is right out, since outside of community nouveau drivers, Nvidia provides little in the way of open source code for the parts of their drivers we need to fix any quirks with the card on the Pi's PCI Express bus.”
I mean in terms of his quest for GPU + ARM. He's been futzing around with Pis and external GPUs and the entire time you've been able to buy a variety of SBCs from nVidia with first class software support.
But we absolutely must ensure that it is a free market. Since the '90s the biggest threat to free markets has not been socialism but monopolies and oligopolies. (Even a good social safety net enhances freedom of economic activity.)
How some mighty monopolies have fallen. Your examples don't refute GP's point, unless you'd like to assert that monopolies and oligopolies don't exist anymore.
> The greatest lifting of people out of poverty ever seen.
How do you square that with the fact that the United States, with your platonic ideal of economic policies, has the highest rate of poverty in the developed world?
>How the mighty monopolies have fallen:
>IBM Sears RCA Intel GE Disney Kmart Kodak Novell Lotus Wordstar AT&T Microsoft
>without any push from the government.
How has Disney fallen? Last I checked, they seem to be doing better than ever, and have gobbled up so many IP franchises they look more like a monopoly than ever (Star Wars, Marvel, etc.).
Microsoft sure hasn't fallen either. Windows isn't quite as dominant as before, but MS got even bigger by moving to cloud stuff instead of just relying on Windows/Office.
Sears was never a monopoly of any kind, that I remember. Neither was Kmart, or Wordstar, or GE. AT&T, formerly Bell, was a monopoly and got a huge push from the government in the form of a forced break-up in the 80s. The AT&T of modern times (the cellular company) was never a monopoly of any sort.
Who cares? The only thing that's important to a company is profitability. I can't be bothered to do any research, but I'm pretty sure MS is just as profitable as ever, if not more so.
If I can try to explain it: it is an overcorrection from the times when Tesla and SpaceX were cool places to work, with appealing missions. Because of that, they could get away with smaller salaries too (although Tesla's meteoric rise in value compensated for that if you got options, as I understand it).