An AI, not being sentient, does not "want" anything; it can only learn from the examples of human behavior it is shown. So what do you expect?
"Garbage in, garbage out", as the saying goes . . .
Wikipedia wrote:
In computer science, Garbage In, Garbage Out (GIGO) is the concept that flawed, biased, or poor quality information or input produces a result or output of similar quality. The adage points to the need to improve data quality in, for example, programming.
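To make the GIGO point concrete, here is a minimal, purely hypothetical Python sketch: a deliberately dumb "model" that learns nothing except the most common label in its training data. Feed it biased examples and it faithfully reproduces the bias, even though the code itself is unchanged.

Code:
from collections import Counter

def train_majority_model(labels):
    # A trivial "model": all it learns is the most common label
    # it was shown during training.
    return Counter(labels).most_common(1)[0][0]

# Decent training data -> decent behavior.
print(train_majority_model(["polite", "polite", "polite", "rude"]))   # polite

# Garbage training data -> garbage behavior, from the exact same code.
print(train_majority_model(["rude", "rude", "rude", "polite"]))       # rude

The algorithm never changes; only the data does, which is the whole point of the adage.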
The problem originates in human behavior, not in the AI itself.
If you were taught solely by examples of barbaric savages, you would likely become a barbaric savage yourself.