
At some point people won't be needed in this crazy cycle.


How much energy will the chain of AIs and counter-AIs between content production and content consumption/discovery burn? More or less than Bitcoin does? Because we complain about that, but this crap would be even less beneficial than Bitcoin is. But how can one opt out of this game? If you do it unilaterally, it just means you lose. Government regulation, the usual solution to this kind of coordination problem? What would that even look like?


>But how can one opt out of this game?

Move to alternative webs such as Gopher or Gemini, which make it impractical / impossible to monetize based on clicks / engagement / amount of content. The issue isn't that you can generate spam. The issue is that you can make money by doing so.


This saves you from nothing.

While making money is a big game, it's not the only game. Your thoughts and actions lead to behavior off your computer. This is why we see all kinds of political spam and propaganda aimed at affecting your choice at the ballot box.


As long as we assume that the HTML web doesn't completely die and remains significantly larger, it will always remain the biggest target for those kinds of activities. Better to have a potential target of millions than a target of hundreds or thousands.


Depends what you're targeting. For example, targeting a much smaller, self-selected group of 'computer experts' with a watering-hole attack of one type or another is a great way to get into some juicy systems.

If a system is machine readable and writable (a category AI is rapidly expanding), you must assume not only that you can be attacked, but that you are being attacked.


Gemini is an anti-intellectual artistic project that forbids interactivity and rich media like inline images, which massively increase learning and comprehension. If you're looking to learn, you'd want to avoid Gemini (and Gopher, which has similar restrictions).


Gemini has interactivity; it just has to be server-side. Of course it is limited, but that is the point. I'll take "anti-intellectual artistic project" over "cesspool of lowest common denominator" any day.
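
To make "server-side interactivity" concrete: Gemini has no client-side scripting, so the only interactive primitive is the server asking for input (a status-10 response) and the client re-requesting the same URL with the answer as the query string. Below is a rough, illustrative sketch of that handshake in Python; it omits the TLS layer and server loop the real protocol requires, and the paths and strings are made up for the example.

    from urllib.parse import urlparse, unquote

    def handle_request(request_url: str) -> str:
        # Toy Gemini response handler: status 10 asks the client for input,
        # status 20 returns a text/gemini body, status 51 is "not found".
        url = urlparse(request_url.strip())
        if url.path == "/search":
            if not url.query:
                return "10 Enter a search term\r\n"          # prompt for input
            term = unquote(url.query)
            body = f"# Results for {term}\nNo results in this toy example.\n"
            return f"20 text/gemini\r\n{body}"                # success + body
        return "51 Not found\r\n"

    # First request: the server prompts for input.
    print(handle_request("gemini://example.org/search"))
    # The client re-sends the same URL with the user's answer as the query.
    print(handle_request("gemini://example.org/search?retrocomputing"))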


Server-side interactivity is clearly not what anyone means by "interactivity" - you're intentionally twisting the usage of words.

> I'll take "anti-intellectual artistic project" over "cesspool of lowest common denominator" any day.

If everyone else thought like you, humanity would be set back millennia.

Not to mention that it's just flat-out insane to discard the massive amounts of educational content and knowledge present on the internet just because it happens to also contain some undesirable content.

This mindset is not one that we want to spread throughout our society, full-stop. Gemini is a threat to human progress.


> Government regulation, the usual solution to this kind of coordination problem? What would that even look like?

Which government is the question you should be asking.

For example, let's say I'm an authoritarian government that is hostile to other nations. There is a strong imperative to block GPT use in my own country except by approved users making propaganda. At the same time, it is a very useful weapon to use against my enemies: I can exploit the 'bullshit asymmetry principle' against them and drown their population in conflicting propaganda.

But you see, unlike nuclear weapons, we really don't need multibillion-dollar facilities to make these things; midsized companies can build them easily enough. There is no opting out, unless of course you want a worldwide police state to ensure that you're not making anything naughty with your computer. And of course you should realize that your government, along with everyone else's, is making its own naughty versions of this kind of application anyway.

So, yeah, some smaller EU countries won't make one because of the law, but the US, China, and likely Russia will do it anyway, out of the same "screw you, we have nukes" attitude that drives their other actions.


> How much energy will the chain of AIs and counter-AIs between content production and content consumption/discovery burn?

While I have plenty of cynicism about this and also expect it to at least partially play out like this, let me offer a more optimistic perspective on the same thing.

People come into media with different amounts of background knowledge and context. Currently, this is basically solved by a tiered system of 'knowledge' distribution. As an example (though I think something similar exists outside of the sciences too): scientists write papers, which are read by science communicators, who put out press releases, which are read by journalists, who write articles, all of which is read by various content creators who remix it into their own content tailored to their specific following. Part of this tailoring is knowing what context/knowledge your audience already has and giving them enough new information for the new content to be digestible without their needing to seek out other sources.

So when ChatGPT-N is re-encoding the content for you, it can personalize it to your level of knowledge: not wasting your time rehashing stuff you're already aware of, while still including context that you wouldn't necessarily have known you were missing.

This of course means that ChatGPT will need to know what you do/don't know...
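
As a rough illustration of what that "re-encoding for your level of knowledge" could look like (purely a sketch; every name here is hypothetical and the actual model call is left out): the system keeps a profile of what the reader already knows and only asks the model to introduce the prerequisites they're missing.

    from dataclasses import dataclass, field

    @dataclass
    class ReaderProfile:
        # Concepts this reader is assumed to already understand.
        known_concepts: set = field(default_factory=set)

    def build_personalized_prompt(source_text: str, prerequisites: dict,
                                  reader: ReaderProfile) -> str:
        # Only explain the prerequisites the reader is missing.
        missing = {name: summary for name, summary in prerequisites.items()
                   if name not in reader.known_concepts}
        context = "\n".join(f"- {name}: {summary}" for name, summary in missing.items())
        return (
            "Rewrite the following for a specific reader.\n"
            f"Concepts they already know (do not re-explain): {sorted(reader.known_concepts)}\n"
            f"Concepts to briefly introduce first:\n{context or '- (none)'}\n\n"
            f"Source:\n{source_text}"
        )

    reader = ReaderProfile(known_concepts={"neural network"})
    prompt = build_personalized_prompt(
        "A new paper reports better sample efficiency via distillation.",
        {"neural network": "a trainable function approximator",
         "distillation": "training a small model to mimic a larger one"},
        reader,
    )
    print(prompt)  # this string would then go to whatever model does the re-encoding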


> Because we complain about that, but this crap would be even less beneficial than Bitcoin is.

This is an issue of capital allocation. Enormous amounts of private money have been wasted chasing dreams of monopolizing currency (crypto), monopolizing taxis (Uber/Lyft), and of course monopolizing the food delivery industry. There is little difference between a government picking winners and private industry doing it. How many cumulative billions has Uber lost so far? Likely far more than what the Chilean government lost in its futile attempt to develop a domestic automobile in the early '70s.

Now even more of that monopoly money is going to be shoveled into consumer AI plays that will continue to waste energy, not to mention pollute the internet further with half-baked 'content' with the stench of Wikipedia and ArtStation all over it.


>Government regulation, the usual solution to this kind of coordination problem? What would that even look like?

Maybe we should get an AI to write it.


Someone has to pay to run the AI though. With Bitcoin it's a bit different: you mine and you get money directly. Even if you have an AI burning GPU cycles all day generating content, you still gotta turn it into cash.


The motivation to pay is to generate better spam on the one side, and to spot that spam on the other. Plausibly, a lot of content will go through multiple iterations of both of those before reaching any actual human. Same game that's going on now, just cheaper (even more automated) and at higher volume.
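
Sketching that loop (everything here is a stand-in; neither function is a real spam model, and the "detector" is just a keyword check) shows where the extra compute goes: every generator/detector round trip is another pair of model invocations before anything reaches a person.

    def spam_score(text: str) -> float:
        # Stand-in for the receiving side's filter: naively penalizes salesy phrases.
        salesy = ("buy", "free", "act now")
        hits = sum(phrase in text.lower() for phrase in salesy)
        return min(1.0, hits / 2)

    def generate_variant(text: str) -> str:
        # Stand-in for the sending side's model rewriting content to slip past the filter.
        replacements = {"buy": "consider", "free": "complimentary", "act now": "when convenient"}
        lowered = text.lower()
        for old, new in replacements.items():
            lowered = lowered.replace(old, new)
        return lowered

    def rounds_until_delivery(text: str, threshold: float = 0.5, max_rounds: int = 5):
        # Each iteration burns one more rewrite plus one more filtering pass.
        rounds = 0
        while spam_score(text) >= threshold and rounds < max_rounds:
            text = generate_variant(text)
            rounds += 1
        return text, rounds

    final_text, n = rounds_until_delivery("Buy now! Free trial, act now!")
    print(f"Reached a human after {n} rewrite/filter round trips: {final_text}")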


Billions of dollars are spent on each US presidential election cycle. If you think advertisements are the only use for AI, you're underestimating the scope of the problem.


This is exactly what the philosopher Nick Land has said.



