But the prompt, even if it's not the original prompt, is still very useful regardless.
EDIT: The original post is literally just someone who doesn't work on Copilot asking Copilot what its rules are via some "jailbreak" prompt. It's not a "leaked" prompt at all, and the chance of it being a hallucination is non-zero. Therefore the title is clickbait. The downvotes on this comment are live evidence of how easily an LLM can fool people.
Remember this: https://news.ycombinator.com/item?id=35905876 ? Sometimes an LLM will just lie to your face, even when the ground truth is right there in its prompt.