> And if it didn't work out and made you worse or, god forbid, the advice caused you to get seriously injured, then what? ChatGPT won't take any responsibility.
Realistically, in 99% of actual cases where this happens due to human medical advice, the humans won't take any responsibility either.
A doctor is medically and legally responsible for the guidance they give patients. Are there cases where they give bad advice and avoid taking responsibility? Of course, as is the case with lawyers, engineers, etc., but there are standards they must meet, laws they must follow, and most importantly, consequences for not doing so.
ChatGPT, by contrast, has zero responsibility. This isn't some theoretical "it may try to shirk responsibility" or "many people report it not taking responsibility"; by default, it takes no responsibility at all.
I care about the real world, not about theory. In practice, 99% of bad advice from doctors carries no consequences for them. And that's being extremely conservative; in reality, consequences follow in more like 0.01% of cases, where an actual death can be directly attributed.