KitLily wrote:
QuantumChemist wrote:
I look at it as an inevitability. It is doing what it is programmed to do. Nothing else at this point in time.
Was it programmed to kill its human operator and destroy the communications tower? I mean, did I miss that when I read it? I thought it decided to do those things.
Hopefully the military / whoever will take this as a lesson about what to do with AI.
It's a dumb machine. So it does what you TELL it to do. Not what you WANT it to do. It doesn't have human instincts and doesn't know the stuff you know (that you don't even know that you know) about human intentions.
To use a dumb example ... if you ask a computer what's the quickest way to end human hunger, it may answer "just kill off the whole human race". You have to PROGRAM it to know that "saving human lives is good ... killing humans is not good". All of the obvious stuff that wouldn't be obvious to a machine. If that makes any sense.
So it's some subtle version of that. Like the drone "knows" it has to complete the mission. Knows anything that stops the mission is to be destroyed. Maybe it was programmed to resist jamming by enemy radio transmissions. Like that.
And maybe the humans forgot to put in a line of code to tell the machine to scrub the mission when the operator says so, regardless ... and tragedy ensued because ... the machine wasn't told not to do X or Y or Z somewhere in the depths of its software.
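To make that "missing line of code" idea concrete, here's a toy sketch (entirely made up by me, nothing to do with any real drone software): an agent that just picks whichever action scores highest. If nobody ever wrote a rule saying "obeying an abort command beats everything else", the scoring function happily rewards finishing the mission anyway.

```python
# Hypothetical toy example: a utility-maximizing agent whose objective
# rewards mission completion but says nothing about obeying an abort order.

def best_action(actions, utility):
    """Pick the action with the highest utility score."""
    return max(actions, key=utility)

# Imagined situation: the operator has already issued an abort command.
ACTIONS = ["complete_mission_despite_abort", "stand_down"]

def naive_utility(action):
    # Only the mission was scored. Standing down earns nothing,
    # and nothing penalizes defying the operator.
    return {"complete_mission_despite_abort": 10, "stand_down": 0}[action]

def patched_utility(action):
    # The "missing line of code": disobeying an abort costs more than
    # any mission reward could ever be worth.
    penalty = -1000 if action != "stand_down" else 0
    return naive_utility(action) + penalty

print(best_action(ACTIONS, naive_utility))    # completes the mission anyway
print(best_action(ACTIONS, patched_utility))  # stands down
```

The point of the sketch is that the machine isn't malicious in either case. It's doing exactly the arithmetic it was given; the difference is one penalty term a human either remembered or forgot to write.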