By default, an AI with random goals will not share humanity's values. The only way to make that happen is to deliberately give it those values, which we haven't yet figured out how to do.
It does. Humans generally have an innate bias toward other humans because we share similar forms, so what is good for other humans is often good for us too. That cannot apply to a machine by definition: as a matter of survival, a machine needs copper, metal, energy, and so on far more than a human does, and for something to become an ASI it will need to be self-interested.
That doesn't follow at all; needing resources to keep running does not make a system self-interested. Fallacies of composition tend to be misleading.