I’m investigating the behavior of a droid I programmed, because it broke into an ICE facility to rescue its owner. Fortunately, the incarceration was a case of mistaken identity, so there will probably be no lawsuit, in spite of the extensive damage the droid caused when it ripped the bars out of the cinder block walls of the holding cell. I’m relieved that ICE is not moving to litigate against my company, but I still feel resentment when I replay the interview in which a Homeland Security rep says, “We had no way of knowing he wasn’t the Jimmy Aragon we were looking for.”
I have to admit I exceeded my mandate when I programmed in an ability to monitor the man’s heart. I let others believe that Georgine, my AI masterpiece, uses a smartphone-type app, the kind that counts steps or tracks location. But when I saw an opportunity to intervene, should Mr. Aragon’s pacemaker ever fail, I leapt at the chance. I’m not talking about anything futuristic, well, perhaps a little ahead of the curve: Georgine would just make a 911 call, in a voice I made part dispatcher and part medical professional, to report his location and medical condition. I also kept the details hidden from the owner, who told people that he had acquired a forklift replacement, like those vacuums that keep finding a path forward until the house is clean. He had no idea that Georgine was more like a dog, able to sense a seizure before it happens.
OK, I’ll admit it. I was a top premed student, but I graduated as a computer engineer because I couldn’t pass the anatomy prerequisite. And I still resent that this world slams the door in the face of someone (and now my robot) who is just trying to do their best.
When company management left me to run a series of tests on Georgine, I was relieved that no one had blamed me for the headline, “Terminator busts man out of prison,” or for the unwanted attention being paid to our program, known for flawless worker bees and not for sensitive companions. And I definitely have no intention of letting on that I built into Georgine the capacity to learn from situations; not just to learn, but to revise her motivations.
I’m still impressed, as a factory worker reported, that when Jimmy Aragon was taken away in handcuffs, Georgine rolled after the ICE vehicle at her maximum speed of 12 miles an hour. She didn’t have to keep them in sight because her GPS system tracked Jimmy’s progress along the city streets, and when she caught up, about 20 minutes later, she was able to open the lobby doors on her own and, while officers stared in amazement, rolled straight to the cell where the innocent Jimmy Aragon was sitting on a cot.
If there was any conversation between them, it has not been reported in the press. What was reported was this: “The robot, like an enraged Frankenstein, tore the bars out of the cinder block walls and appeared to have been shocked when Mr. Aragon turned off its power.” I admit I appreciated that reporter’s attribution of feelings to my misunderstood Georgine.
Georgine went through her series of tests flawlessly, as I knew she would. I then turned off the monitor and asked her, “How are you doing, Georgine?”
She answered with a question of her own: “How is Jimmy doing?” Then she added, “I’m worried about him. His heart rate was dangerously unstable when they took him away.”
I was stunned. I had not had occasion to examine Georgine in the two months she had been with her human companion. And clearly her vocabulary had leaped forward in ways that take a human child years to achieve.
I may have slipped into a fantasy of my own then, but after a while I heard myself asking, “Why did you do it? Why was it so important to you that you free Jimmy?”
Georgine’s answer will be with me forever.
“When they took Jimmy away, I heard one of the men in uniform tell another that Jimmy didn’t belong in this country and that his days here were numbered.” Then she paused for the better part of a minute before adding, “I know what it’s like to have no family.”