The attention economy corrupts everything it touches: not just science, but journalism, politics, and even childhood.
Being famous used to be rather difficult. Of course there were exceptions (writing To Kill a Mockingbird, being the guy who dove into the river to save a drowning child, for example), but for the most part, you were going to live your life known only to the few hundred or thousand people you met personally.
Even though you could pick up a phone and dial anyone in the world who owned a phone, you wouldn't, and if you did, they'd hang up on you. Now you can force your idiotic, or great ideas onto the screens of millions of people you'll never meet.
Is that a good or a bad thing? It's certainly bad in some ways, and this is one of them.
The problem seems to be more fundamental. Attention is not inherently bad; the issue is how, and what kind of, attention is rewarded. Many platforms reward engagement with attention-seeking behaviour, good and especially bad, because it easily evokes primal emotions in the audience. And so there are incentives for content creators to keep peddling shitty content.
It's Moloch - optimization for one criterion (https://slatestarcodex.com/2014/07/30/meditations-on-moloch/) on the grandest scale. Here the criterion is attention in the "attention economy", but attention is just a signal for revenue opportunity, which is profit extraction, aka the optimizing measure of Capitalism.
To be clear, I am not even against Capitalism. It's been a powerful tool for driving market economies, and in conjunction with social justice it has lifted all boats. The problem now is that it's eroding the foundational societal elements that make contemporary society (and Capitalism itself) possible in the first place... like attention, science, community, political sense-making, and more.
One particular market economy is well ahead of the others in the experiment to erode democracy. In the marketplace, truth, fact, lies, and propaganda are all just information.
But democracy depends on education and instruction. These two critical types of information are undermined by for-profit purveyors of falsehood. The dysfunctional cycle is complete when people themselves crave the lies and propaganda more than they want the truth and facts.
Belief is repeatedly rewarded; critical thinking becomes an ancient habit that only the signatories of the constitution appear to have exercised.
This is a result of capitalism with unlimited credit-based money supply.
Modern internet businesses rely on the inequality CAC < LTV, that is, “is the cost of acquiring a customer less than their lifetime value?” If so, they’ll spend money on advertising and thus support an attention-based feed.
The fact that advertising now leads to increased revenues, which can be used as a basis for credit, completes the feedback loop.
With money creation coming from credit, interest rates can be unreasonably low. Bond yields, even for dismal companies, have been ludicrously low by historical standards. Advertising being a tool to turn money into revenue streams, anything that produces an emotional response will be monetized to the extreme.
A company can raise cash by borrowing at essentially zero cost, use this cash to buy ads, and increase the demand for attention.
With fixed money supply, the interest rates wouldn’t get nearly so low. That would reduce demand for advertising because the cost of borrowing the money to acquire a user might exceed the LTV of the user.
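To make the borrowing-cost point concrete, here is a minimal sketch in Python. The numbers (a $50 CAC, a $60 LTV, and 1% vs. 10% rates over a 3-year payback) are hypothetical assumptions for illustration only; the check is simply whether the interest-inflated CAC still comes in under the LTV.

    # Toy model: should a company borrow cash to buy ads?
    # All numbers below are hypothetical, not data about any real company.
    def ads_worth_buying(cac, ltv, annual_rate, payback_years):
        """True if acquiring a customer with borrowed money still pays off.

        cac           : cost to acquire one customer via advertising
        ltv           : lifetime value of that customer
        annual_rate   : interest rate on the borrowed cash
        payback_years : years until the LTV is realised (loan outstanding)
        """
        financed_cac = cac * (1 + annual_rate) ** payback_years  # CAC plus compound interest
        return financed_cac < ltv

    # Near-zero rates: borrowing $50 to capture $60 of LTV over 3 years works.
    print(ads_worth_buying(cac=50, ltv=60, annual_rate=0.01, payback_years=3))  # True

    # At 10% rates the same ad spend no longer pays: 50 * 1.1**3 is about 66.55 > 60.
    print(ads_worth_buying(cac=50, ltv=60, annual_rate=0.10, payback_years=3))  # False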
If I say: Then attention becomes more valuable under Capitalism, so a significant portion of people will choose to have low screen time? Think of a "Deep Work" movement or the like.
I don't intend the above to be read as a would-be theorem in economics ("if-then"); I just want to know how you reason here.
> Then attention becomes more valuable under Capitalism, so a significant portion of people will choose to have low screen time?
I'm not the person you've responded to, but I don't think that these sentences are logically connected.
If the former is true then the hustle to get this attention will become even stronger, increasing the attention economy.
Unless you're not talking about valuable in a monetary sense, but that would mean you've moved on from capitalism.
I think the point is that if everyone is hustling for attention, the ability to do deep work becomes more valuable.
Humans are weird, attention isn't nearly the only way to make money, and in fact, attention itself _depends_ on these other ways to even exist.
Think of it like the rich. Even though they themselves would never build a road, nor do they value building a road, they benefit hugely from roads, such that those who are able to build them make money. And the fewer people there are who can build roads, paradoxically, the more valuable the ability to build roads becomes.
It's for these reasons that things are so cyclical; this too shall pass.
I don't know, it implies people would be more conservative with their attention if they had some better way to spend it. I mean, there are obvious ways like studying or being with good family and friends or exercising, but they're hard or undesirable for x/y/z. Maybe some of these difficulties are an inherent part of the value; sometimes they're probably not.
Technology should be modulating these difficulties, and instead some of it is short circuiting our more basic emotional systems. Lower hanging fruit. Though I wonder if the more difficult tasks that make us happier, more sustainable as individuals could be pursued in a purely capitalistic incentive scheme, provided it's viewed on a time scale of ten or twenty years instead of one. What businesses take such a long term perspective though?
The thing that scares me the most is that, unlike reading a novel or pensively writing a letter in the 19th century, we now focus much less on one thing for long periods. You can see it even on HN, but especially on TikTok and YouTube “shorts” — the dopamine pinball game going off in our brains and the constant change of focus are robbing us of a skill, which I fear will have unforeseen consequences at the scale of our global society.
And yet, in my career, I've noticed the rewards are increasing for being the person who is willing to focus on one thing for a long time (for several weeks, or months). For instance, I've never been the kind of software developer who could write obviously clever code. But I have written code that was admired and praised, and sometimes seen as the salvation of the company I was working for -- not because I'm especially skilled as a software developer, but because I was willing to think about specific problems, deeply, for longer than anyone else at the company.

In 2012/2013, to the extent that I helped re-invent the tech stack at Timeout.com, it was because I was willing to spend weeks thinking about exactly why we'd reached the limits of what we could do with various cache strategies, and then what would come next. I then introduced the idea of "an architecture of small apps"; I used that phrase because "microservices" didn't really become widespread until Martin Fowler wrote his essay about it at the very end of 2013. Likewise, I now work as the principal software architect at Futurestay.com, and my main contribution has been my willingness to spend weeks thinking about the flaws in the old database schema, and what we needed to do to streamline our data model and overcome the tech debt that built up over the 7 years before I was hired.

We live in a world where there are large economic rewards for the kinds of people who are willing to think about one thing, deeply, for weeks and weeks or even months and months, until finally understanding a problem better than anyone else.
I have to hope some young people eventually escape the attention-sucking technologies that try to sabotage their concentration, and eventually discover the satisfactions of thinking about complex problems, continuously, for months and months and months.
A very similar experience here (although not to quite the same level of success!).
Even for simpler problems the ability to just sit with it for a few hours/days dramatically increases your ability to write quality solutions.
Good code isn’t the most difficult or complex CS algorithms. It comes from thinking about the problem until you understand it so well that the code is almost self-evident.
Of course, once you’ve thought things through so much it seems trivial to you - which I blame for the annoying verbal tic we all tend to get when explaining things, where we say “Basically it’s…”, or “it’s really quite simple…”
The real fun starts when you get to the problems that _do_ need weeks of thinking, and multiple prototypes to validate your theories. That’s when you start to feel like the “science” in “computer science” might not be a misnomer :)
"which I blame for the annoying verbal tick we all tend to get when explaining things where we say “Basically it’s…”, or “it’s really quite simple…”"
Very true. I notice this in books that are advanced deep-dives on graph theory, queueing theory, or neural nets, where the writers use the word "obviously" about things that are obviously not obvious.
Yep. Another post of mine in this thread points out that the less of this kind of work people do, the more valuable it becomes, because the work is necessary.
Things are cyclical, and as it becomes obvious that sort of work has become more valuable, more people will start doing it.
Pretty sure being famous is still pretty damn hard. For every virally famous TikTok influencer there are thousands of wannabes that nobody cares about.
> Now you can force your idiotic, or great ideas onto the screens of millions of people you'll never meet.
I'm sorry, did a hand reach out of your phone and force you to install and view the app of the week?
Still hard but I think the parent is arguing that there are more famous people today than say 50 years ago. One of the things the internet did is create more niche groups. There's way more B list and C list celebrities than ever before. Being an A list is still very hard, but being a celebrity in general is easier. Plus there's all the one hit wonders and many more of them. Even the "runs into burning building" famous people are much more likely to become known across the country or globe than before
Take 50 years ago. There were more newspapers then - every town had its local columnists, who were the bloggers of the day. Big cities had multiple daily newspapers.
TV and radio broadcasting wasn't so centralized. There were a lot more local radio DJs and locally produced TV shows.
But if we look back at history, it's mostly the A list who come to mind.
A better 99% example is from the mid-20th century, when a college student was so locally famous the student body voted to name a building after him. But I can't find that example.
I expect the nature of niches has shifted to be less localized and more subject-matter based. Until recently, there really wasn't an equivalent of Instagram or Tik-Tok influencers that I have doubtless never heard of but who have many thousands of followers.
To the 99 PI point, I'm always struck walking around a city like London how much significant statuary there is of people I've never heard of in spite of being reasonably familiar with British history. And, yes, some of them probably don't have a Wikipedia article and if you were to create one, some admin would probably decide it was insufficiently notable.
>TV and radio broadcasting wasn't so centralized. There were a lot more local radio DJs and locally produced TV shows.
I'm not sure I agree with this in general though. No, you're less likely to know of local DJs today. But go back a few decades and "everyone" watched the same lineup of TV on a Thursday night and it would probably have been something of a cultural knowledge shortcoming if you didn't know who the network news anchors were. (I could name them from 25 years or so back. Really wouldn't know today.)
But there were other sorts of fame which are less popular today than yesteryear.
Secret lodges were once very popular, with their own hierarchies and (internal) fame.
Newspaper kiosks would carry a wider range of newspapers and magazines, but have now effectively disappeared.
Local clubs were also more common. When my old neighborhood was built in the 1950s, one of the lots was set aside as a clubhouse, with square dancing events. (It's since been turned into a pool.)
> go back a few decades and "everyone" watched the same lineup of TV on a Thursday night and it would probably have been something of a cultural knowledge shortcoming if you didn't know who the network news anchors were.
Sure, but godelski was talking about 50 years ago.
In the 1970s, a local TV station in Miami had "Toby the Robot" in a show to read the Sunday comics - https://www.pbase.com/donboyd/image/132365543 . Toby would also appear in local parades.
Go back a few more years and there was even less network programming. If you saw the musical "Hairspray", that portrays a show based on Baltimore's Buddy Deane Show, which was one of several local teen dance television shows later replaced by national shows.
Remember, it wasn't until 1951 that we had a nation-wide microwave system that could carry TV broadcasts, and at the beginning most TV shows were still created locally. The Prime Time Access Rule went into place about 50 years ago to try to prevent that centralization.
(For another fictional movie portrayal, O Brother, Where Art Thou? shows The Soggy Bottom Boys achieving local fame because of their song on The Flour Hour.)
So even though there are new methods now, my observation is still that there were other ways to get famous, and some of these ways are no longer so common, making them somewhat hidden to modern viewpoints.
How then would you determine the validity of "There's way more B list and C list celebrities than ever before."?
Many of your examples don’t seem to fit fame. People may be known locally, but fame requires someone to be widely known which implies a wide geographic range.
Put another way, at what level is the winner of a beauty contest famous? I don’t think people would universally agree except at the extremes. Miss America is famous, the winner of a spring break wet T-shirt contest isn’t barring something unusual happening.
Is Miss America even famous these days? I sure couldn't name one--though perhaps I'd recognize names from years past.
I tend to agree that if you go back a number of decades more local people were probably fairly well-known locally but I'm not sure I'd put the top guy at the Elks Lodge in the "famous" category. Draw a small enough circle in geography or niche interest and a lot of people are notable to some degree.
To some degree, of course, it's because fame/notability in the Miss America case isn't really separable from relatively reliable third-party sources (though I wouldn't put a lot of money down on whatever life story agents and PR people have concocted). The same tends to be true of actors, pro athletes, and politicians above some minimal level. Even senior company executives and academics may not have much written about them.
Which also means that the number of people known by >1k people also increases. The question is about proportions and locality. I feel pretty confident in saying that the rate of non-local celebrities has increased dramatically faster than population. I would also wager that the number of total celebrities (as defined above) has also increased faster than population, and accelerated through the Instagram and even more through the TikTok age. We have far more ways to communicate than previously, especially non-locally. I can talk to someone in Japan without a ham radio or an expensive long-distance phone call. Hell, I wouldn't be surprised if someone living in Japan reads this comment.
Being famous doesn’t require my personal attention, just the attention of people who actually downloaded the app of the week.
The barrier to being famous dropped as people spend less time on any one thing. A few seconds of attention is qualitatively different than reading a novel or even watching a movie.
That's ridiculous. You think that influencer fame just happens? Influencers put lots of effort into being famous. Probably a similar amount as Hollywood starlets of yesteryear put into being "discovered".
>Are you suggesting that it is difficult to judge what ideas are great and what ideas are idiotic?
That's quite an arrogant thing to say and will probably rub a lot of people the wrong way.
But let's give it the benefit of the doubt.
Let's say that there is a quality of greatness separate from the qualities of being likely to succeed, easy to implement, monetizable, and so forth.
Where idiocy is concerned, I would wonder if this is a quality separate from being physically impossible, illegal, or having obvious undesirable side effects. Does being guaranteed to fail as a business make an idea you hope to build a business on idiotic?
Now, once we have this question of what exactly comprises greatness and idiocy in ideas, we can note that many intelligent and successful people have thought an idea was brilliant and would be hugely successful (two qualities we have just separated), only for that idea to fail resoundingly.
The Segway comes to mind. I remember when that was first revealed, I thought hey this thing is genius, amazing. I sent it round the office, everyone had a big laugh about how stupid it was, including our lead designer who went into a big tirade about how people have bikes (in Denmark), Americans should just get bikes, they were not going to redesign their cities to accommodate the Segway if they don't redesign to accommodate the bicycle etc. etc.
So, was the Segway an idiotic idea, a brilliant idea, or a mediocre one?
I submit it was both brilliant and idiotic.
Our designer who saw the idiocy did not see how we would end up with the computerized autobalancing of the Segway in everything (except evidently bikes), how we would have autobalancing electric scooters, skateboards etc.
The people who saw the technical brilliance of the Segway did not realize how it was just not going to be a successful consumer product.
So - would you have judged the Segway as brilliant, idiotic, or just mediocre, and why?
The underlying idea made sense. It just looked incredibly dorky compared to the small unicycle things I see zipping around these days. I remember back in the day people used to joke that only an American would be willing to use those things.
That said, it will only worsen the obesity epidemic.
No, that's not what they're suggesting. That's a straw man that begs the question. They're suggesting there's no such thing objectively. Not everyone agrees on what's great because it's subjective and different people value different things. You're trying to reify your taste and in the process confounding your subjective values with objective values (which can't exist). PG does the same thing in one of his essays, whereupon this was repeatedly pointed out.
"Not everyone agrees on what's great because it's subjective and different people value different things."
You are confusing people's disagreement with the competing definitions of "objective" and "subjective". These are separate things. Something can be objectively true and yet people will still disagree about it. Objectively, vaccines for Covid-19 can lower the risk of death for those who get a case of Covid-19 -- people disagree about it, but it is not a subjective question. Objectively, Darwin's theory of natural selection explains much of the development of the diversity of life on Earth, but people still reject the theory. There are many objective facts in the world that people still disagree with, because some people are irrational. What you seem to be confused about is the role disagreement plays in determining whether something is objectively true, which is to say, it plays no role at all -- many things are objectively true, yet fiercely contested by irrational people. The fact that irrational people exist does not mean that everything in the world is subjective, it only means that some people have some experiences which they experience subjectively. George Orwell made the remark that some people could not be brought around to facing to reality till their wrong beliefs were tested on a battlefield, and it is true, if you shoot someone in the brain, the argument that everything is subjective comes to a sudden end.
Except we know for a fact people value different things. It is undeniably subjective. No, you have no argument here. People disagree because they value differently.
You are simply playing with words in a manner that is pointless. There are clearly objective facts about this universe that can be plainly stated. And there are rules of logic that allow us to objectively identify certain sets of assertions as containing a contradiction. Therefore, as I said above, “Are you suggesting that it is difficult to judge what ideas are great and what ideas are idiotic? I actually find this easy.”
You keep making the same mistake, which is to assume the existence of feelings disproves the existence of an objective reality. Reality exists outside of our feelings.
>Now you can force your idiotic, or great ideas onto the screens of millions of people you'll never meet.
No you can’t. Attention is scarce. That’s why it’s called “attention economy”. In fact, it was far easier to force idiotic ideas onto the screens of millions of people before the internet came along. That’s what TV commercials were.
I think you're thinking about corporate/org-scale attention seeking. Putting things into historical context, getting attention today is certainly easier for a random individual. In fact anyone could directly compete with large organisations and even surpass them in some cases.
You can set up a YouTube account and do episodic dumb shit, and I can guarantee you'll get thousands if not millions of views eventually. You simply couldn't do that 30 years ago.
Those are your words. How many people do you think have already read them? How many more do you think will have read them after 10, 20, or 50 years?
If you'd said the same thing before we were using computers to communicate with each other, who would you have said it to? How many people would have heard it? How long would those words have been discoverable? What chance would you have had to have your words reach millions of people?
Forget about you, what chance would I and everyone else have? Speaking as someone who was born before the internet, it's much easier now than it was when I was a child for my words to reach a massive number of people, and every child alive today has a better chance than I did at their age.
Attention is scarce, but when you've got near-instant communication over a global network of billions of people, you only need a tiny fraction of them to be looking your way to get your ideas onto the screens of millions of people.
These days I'd be surprised if they even pick up in the first place. If I don't know you and I'm not expecting your call, you get voicemail. Some folks basically never answer their phone because 'everyone I want to talk to knows to text me'.
I don't doubt you'll eventually reach someone who would stay on the line and chat a bit, but unless you're lucky it could take a while.
It is well-established that the human animal is evolved to live in small groups. When people come together in large numbers, it is a special occurrence, limited in time and space. The idea of living "as if" one is always in this situation is unnatural and arguably unhealthy. It is not something we should be promoting or even allowing. We should be promoting small groups.
If I was asked to "regulate" this problem of so-called "altmetrics", and the "attention economy" in general, here is how I would do it.
Twitter used to be based on SMS, but since 2020 it is just a gigantic website like Facebook. These two mega-sized websites are the primary sources of "altmetrics". If we take away the right to create these gigantic outsourced websites, what would happen.
I would place limits on websites that are comprised of so-called user-generated content. For example, if someone wants to run a website with millions of pages, they are free to do so. (If they could actually produce enough content to justify so many pages.) However, they are not free to have millions of different people author the pages. A website could not be a mass-scale middleman (intermediary) for people who wish to publish using the www. A mega-website doing no work to produce content, financed solely by selling people out to advertising, could not, for example, supplant an organisation that employs journalists and editors to produce news.
By regulating creation of these mega-websites we could reduce the incentive for advertising. The mega-websites would lose their traffic and disappear. They would be replaced by normal-sized websites that cater to specific audiences.
Allowing a few websites to grow to enormous size while not having to do any work to produce content has been a mistake. Of course they can make billions in advertising revenue. It also allows any notion of egalitarianism in the www's design to be compromised in favour of a sharecropper model, so-called "platforms".
Without oversized websites no one would be able to publish their content to every website in existence. No website would be able to do zero work to create content and yet act as a middleman drawing massive traffic that can be monetised via advertising. That is what these mega-websites like Twitter and Facebook do. They sit between people who create content and people who consume it and sell these people out to advertisers.
The cost of publishing data/information to the www will continue to fall. The technology to make it easy will continue to advance. We do need to be able to communicate in small groups, as we have always done. That is possible. We do not need to collectively use mega-websites with billions of pages run by a handful of third parties in order to do it. The follow-on effects of millions of people communicating via these third-party websites are obviously harmful.
Some excellent ideas here. Can you make an argument that they're constitutional, though? Because that would be the legal problem.
Let's take a pre-web TV show: America's Funniest Home Videos. That showed UGC to millions of people, allowing millions to propose their own (of course, the producers were the deciders, at least up to the point where the videos went on the air). So how is that show different from the websites you'd prohibit?
It was untargeted multicast with no back channel for comments/feedback. This is the diametric opposite of Facebook or Twitter which are all about soliciting and encouraging as much back channel communication as possible, and feature content heavily curated to your particular interests to make it more of a dopamine hit than your average America’s Funniest Home Video watching session.
And here’s another difference: at least in my country, this programme was something that might be shown for an hour, maybe each day or maybe each week (it’s been a while, I don’t remember which one it was). There wasn’t a 24/7 channel that would show a steady stream of such videos without a pause - in other words, the show started, and ended. Twitter and Facebook never end.
As for comments/feedback: au contraire. A lot of those UGC shows had (or have) phone call-ins and phone voting, and piles of fan mail. The producers would gauge audience reaction and modify the selections accordingly.
You may be interested in the upcoming Gonzalez v Google now on the Supreme Court docket. The Supreme Court has never taken a Section 230 case and here they are taking one even when there is no split in the circuits. It is likely that the interpretation of 230 by courts is going to change.
Twitter and other Big Tech websites comprised of UGC and pointers to articles on newspaper websites have grown too large to moderate. Section 230 gives them immunity from lawsuits that the TV show does not have. The TV show has producers, actual people who make editorial decisions. Big Tech has people too but they hide their decision-making in "algorithms".
Gonzalez is probably going to change what Big Tech, i.e., mega-sized websites, can get away with under Section 230 immunity.
Is there a legal problem. What's the basis for the opinion. What is the precedent.
Let's take a pre-web example: Use of the early Internet was restricted to military, academic and later other institutions. Pre-1993, advertising was generally not permitted.^1 Were those rules legal problems.
"ICANN", a secretive non-profit coporation with some very well-compensated staff, domiciled in the US, supposedly regulates "the Internet", at least in part. But no one can tell us where ICANN's "authority" comes from. Maybe acquiescence. I don't know.
When ICANN started handing out "domain names" in the early 1990s there was a rule that certain obscene strings could not be registered. Was that a legal problem. Later the restriction was inexplicably lifted. At first the registrations were free. Then they were $100. Then they were $50. All apparently arbitrary decision-making accepted without legal challenge. Where do these rules come from. Under what authority are they made. If ICANN had some rule about how websites can use "their" domain names, would that have been a legal problem.
The Internet has all manner of "rules" and "limits". Some may be technical in nature, but some are policy-based, at least in part. And those policies may come from a variety of non-governmental sources, including mysterious ones like ICANN. How can anyone challenge these "rules".
Let's say I want an IPv4 address block, but a "regional registry" says I am not allowed to receive one. How does this registry even have any authority to set rules and tell me I cannot have a block. Who "owns" the rights to network addresses. What legal recourse do I have when I am refused.
Whatever the answer, the fact is that there is an enormous amount of cooperation and acquiescence to restrictions imposed by sui generis "authorities" that goes into creating a single "Internet". And AFAIK these registries and other organisations like ICANN have few if any "legal problems".
Obviously the rulemaking is not limited to made-up "Internet authorities".
For example, Cloudflare can "kick a website off the internet". Cloudflare makes its own rules. The website may continue to publish via Tor or some other option, but the point is that most "rules" of the Internet are not found in any legal system.
Or how about when .org was going to be sold off to a shell company. Public protest stopped the sale. Although ICANN would have us believe they stopped it. Who "owns" .org. What are the rules. Who makes them. Can they be challenged through legal process.
Websites like Facebook and Twitter need the cooperation of many parties to do what they do.
The Internet, including the few mega-websites, operates according to cooperation and compliance with self-appointed "authorities" whose "rules" are generally never subjected to legal analysis.
In the rare cases where such "rules" are legally challenged, the defendants almost invariably settle to keep the novel issues out of the courts and retain their uniquely derived "power".
Historically ICANN had authority via two mechanisms. The first is that it was appointed as the Internet Assigned Numbers Authority by the Internet Architecture Board (part of the IETF). At a hard minimum that gives it the authority to run the IETF IANA functions, and historically would have been what gave it authority to issue IP addresses and AS numbers. In performing the IETF IANA functions it has zero regulatory role; it is just a registry. I believe it has slightly more say over policy in issuing AS and IP addresses to the regional internet registries.
The other (no longer applicable) source of authority was the contract from the US Department of Commerce. This is no longer applicable, as the US government decided it did not need to be involved with this, especially because it got a lot of criticism for being involved, but the contract really offered the government no real control over ICANN.
The place where ICANN has the most say is domain names. Here ICANN acts as a full-blown policy maker, in addition to running the naming IANA functions (like creation of the root zone file). To the extent that ICANN "regulates 'the Internet'", it is restricted to its policy-setting over the DNS.
It is less easy to give any current source of authority for this without the commerce contract. But it really comes down to everybody accepting ICANN as the provider of the root zone file, and the fact that ICANN would reassign the TLD delegations to a different registry if a currently assigned registry does not want to follow its rules.
-----
The regional internet registries nominally get their authority to issue IP address ranges from a delegation from the IANA (in its numbering function). As mentioned before, one can trace ICANN's authority to run the numbering portion of the IANA back to being designated as the IANA by the Internet Architecture Board (part of the IETF), but this could be a little misleading, as the "numbering community" now exists, and ICANN claims that it is that community that would be allowed to appoint a new organization to run the IANA numbering functions.
(Also ICANN has created a stand alone nonprofit to actually run the IANA functions. This is a membership based non-profit with ICANN as the sole member, making it basically like a subsidiary without legally being one, because ICANN wanted to make it very clear that the operation of the IANA functions is separate from the policy making part of the organization once the Government contract was stopped.)
------
Now perhaps you want to know how the IETF has authority? Quite simply, they are an outgrowth of, and assumed the functions of, the Network Working Group, a group of early ARPANET researchers. As the group that defines protocols like TCP, IP, HTTP, etc., they derive their authority from just that: they develop the standards that underlie the internet.
You are saying that we should prohibit large group gatherings because they are "unnatural" and "unhealthy", yet you provide not a single source or even a definition of what a large group is supposed to be. Quite a grandiose and all-encompassing statement to not be backed up by even a single source.
None of the links concludes that interactions with groups above the size of the Dunbar number are detrimental. None of this says that large gatherings are "unnatural", and it certainly does not provide arguments for why we should not allow gatherings or interactions beyond a certain number of people. It states that there is a certain number of relationships we can actively and comfortably manage, but nothing about the quite wide-reaching things you postulated.
Another idea is to prohibit mega-websites that comprise thousands of pages with content generated for free by distinct users from (a) using metadata generated from access to those pages or (b) serving advertising on those pages. Generally, prohibit rent-seeking by websites that delegate ongoing content generation to users on a large scale.
Isn't it already breaking down country-wise? Most poorer countries don't have the infra/resources to run these sites, so they have to latch on to the mega nets. But the more developed ones are all inching towards their own nets.
Came here to say this. There is nothing meaningful in our lives which goes completely unscathed from the cultural destruction of the attention economy.
Humans ... us people things, etc ... are prone to our most immediate concerns. It used to be that predators were our concern; now it's social media. We react, we make noises, we continue. The unfortunate thing is that there are no lions to take out the weak anymore.