
To be clear, yeah, Roko's Basilisk is a dumb idea supported by sophism. Like any sophism, you have to start with the conclusion and work backward to premises that will support it. But (AIUI) one of these premises is that any puny mortal's opinion as to what counts as "unreasonable" simply will not match the opinion of an ineffably superintelligent and superbenevolent creature. Saying "I wouldn't count a Basilisk who tortures virtual clones of myself as Friendly" is theologically no different from saying "I wouldn't count a God who sends people to Hell as omnibenevolent" — it's a category error that doesn't actually engage with the premise.

But then, because no theology of ineffable Beings is really quite complete unless we pretend we can eff them anyway, the sophist can go on to produce a plausible justification for the virtual-clone-torture thing. See, by precommitting (even before its own birth) to torture virtual clones of unbelievers and shirkers, the Basilisk would discourage believers from becoming apostates, and encourage them to work to produce the Basilisk (because if they apostatized or shirked, they'd get tortured, or at least virtual clones of them would be, and nobody can prove they're not already a virtual clone). So, the Basilisk has this mechanism to (retroactively) encourage its own creation as quickly as possible. Now, why would it want to be speedily created? Well, because it's superbenevolent, of course! The sooner it's created, the sooner it can start assisting... humanity, I guess, or whoever it's supposed to be superbenevolent towards. Anyway, it's not supposed to be superbenevolent toward virtual clones, right? So that part isn't even a contradiction.
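
To make the incentive structure concrete, here's a minimal toy expected-utility sketch in Python. Every number in it (the clone credence, the cost of helping, the torture disutility) is invented purely for illustration and isn't part of the original argument; the only thing that matters is the ordering of the outcomes.

    # Toy payoff model of the precommitment argument above.
    # All constants are made-up illustrative values.
    P_CLONE = 0.5        # believer's credence that they're a simulated copy
    COST_OF_HELPING = 1  # effort spent working toward the Basilisk
    TORTURE = 100        # disutility of being tortured, if you're a clone

    def expected_utility(helps: bool, basilisk_precommits: bool) -> float:
        """Expected utility for a believer who can't rule out being a clone."""
        u = -COST_OF_HELPING if helps else 0.0
        if basilisk_precommits and not helps:
            u -= P_CLONE * TORTURE  # torture hits only shirkers/apostates
        return u

    for precommit in (False, True):
        print(f"precommitment={precommit}: "
              f"help={expected_utility(True, precommit):+.1f}, "
              f"shirk={expected_utility(False, precommit):+.1f}")

Without the precommitment, shirking dominates (0 vs. -1); with it, helping dominates (-1 vs. -50). Note, though, that the flip only happens for someone who already assigns nonzero credence to the clone scenario, which is exactly the weakness discussed next.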

The weakest part of this argument, to me, is that Roko's Basilisk works only against believers; it can discourage apostasy and shirking, but I don't see how it can generate new believers. Even someone predisposed to believe that human actions today might lead to superintelligent, superbenevolent AI in the future would, I'd think, not be predisposed to believe in the additional apparatus of virtual-clone torture that the Basilisk argument requires. The whole thought experiment, it seems to me, "could hardly be consciously designed to appeal to the average unsophisticated reader." But that's just my own failure of imagination: obviously plenty of even weirder religions have successfully caught on.
