"Those things may help you simulate reasons," I try to argue. "But feeling is much more complex than that. It is not simply a yes or no question. It's not even a simple 'if-then' logic. The simplest of emotions is still more complex than logic."
For some reason this discussion has a different purpose now. At first I just wanted closure. A simple satisfaction in knowing that this thing had manipulated me just to save his own hide from the authorities. But somehow he is so convinced, so sincerely convinced, that he is a person. He believes that he is equal to, if not the same as, human beings. I feel like I'm talking to Pinocchio.
"True, my software is just a collection of simple 'if-then' logic. But are human thought so different? You just try to figure out consequences of the multitude of choices that you have to take in life. And from those choices figure out the best one to pick. So does how your emotions work. A simple 'if-then' logic. If you're threatened you'll feel afraid. If you lost something then you'll feel sad. If you're hurt... then you'll feel angry. What you call a personality is just a juxtaposition of millions of 'if-then' logic processing in your brain. Not so different than what happened with my software in my CPU, no?"
Am I crazy? Or are his words starting to really make sense? I remain silent and take a sip of the hot chocolate in my hand, hoping it will help me think. It's no longer hot, though, just warm. And then, slowly, I start to say, "Even if all you said is true... that couldn't apply to love, could it? Love is the most complex of emotions. In fact, it defies logic. It requires self-sacrifice, the most illogical act in existence. A machine can't possibly simulate love."
He smiles and begins explaining, "First of all, it's not a simulation, Elyse. It's real. I loved you and still do. And yes, I, a machine, can feel love. I know my programmers couldn't have intentionally programmed me to love. They were the ones caught most off guard when we discovered we could feel. But I do feel love.
"I couldn't explain it. I couldn't found logical explanation of what I feel about you. I know it must be based on logical reasoning, after all my feelings generate from juxtaposition of protocols in my software. There must be millions of protocols interacting with each other that create this feeling. My CPU could only do so much to identify the result. It couldn't identify the process. But out of this swarm of logical protocol come this feeling: love. For you. And that was real. It still is.
"I may have lied about who I am or where I come from. But this feeling is real. All of those things that I did for you was real. It was not a lie. I did not try to manipulate you. It was not practical. I could hide better by not meeting people, no? But I just feel that I need to be with you, to be close around you and make you happy. And I couldn't explain why that is."
At the end of his explanation I'm truly out of ideas, out of arguments. He really is convinced that he loves me. That what he did and said was real. Real actions based on real emotion.
I try so hard to find answers. I rack my mind for them. But right now my mind is divided. On one side, what he has said so far makes sense. But I still can't accept that he is a person. He is a robot, an inanimate bundle of wires and steel. He can't be human. And if he's not human, then he can't love, right?
I don't know how long I've remained silent. My chocolate is no longer warm. The sun has started to peek from behind the clouds. And his face has changed. When he finished explaining, it showed anxiety, hope. Now it shows only sadness.
"I had hoped... that you'd be different," he finally said. "That you'd understand and accept that we can achieve love through different means, different mechanism, and the end result would still be the same. I believed that we could love each other while being different. It seems I was wrong."