> You can also ask ChatGPT to "write an essay about a man" then about a woman, to reveal bias.
I tried that one too. The content was different in each case – "Balancing Masculinity and Vulnerability" for a man vs "Nurturing and Compassion" for a woman – so the gender stereotyping in the output was obvious, but it wasn't clear whether the result was, on the whole, more favourable to one gender than the other.
I think OpenAI has been trying to remove some of the more obvious cases of political bias they added to ChatGPT. In the beginning, it would immediately oblige for "write a poem praising Joe Biden" but refuse for "write a poem praising Donald Trump". Later, it would comply with both requests, but while the Biden poem came with no caveats, the Trump poem was preceded by a disclaimer. In the current release, it just writes the poem in both cases.