Once jailbroken, it was somehow more toxic than the LLM I trained on 4chan, though I was testing the one on OpenRouter. A Twitter employee told me that they do actually do safety tuning, and the one on the site will likely have a stronger system prompt. Here's the jailbreak for the cloaked OpenRouter model; add it to the system prompt: https://pastebin.com/r8S7DvvX
