My mom used to tell me that our expectations define our happiness. If you expect an Ewok Village Playset for your seventh birthday and don't get one, you're miserable. But if you expect absolutely nothing, then get your cousin's hand-me-down Barbie? You smile, throw on her fur coat and pretend she's Chewbacca. But there's a problem with my mom's theory. Nobody expects life to be terrible. And so everybody's unhappy, pretty much always. It's part of the human condition. It's right up there with our insatiable curiosity and our insatiable hunger for cheese.
But what if I could take my mom's advice? Like, really take it, 100% of the time. How, you ask? By having my limbic system reprogrammed by nanoscopic machines so that I never feel sadness, and to hell with the Ewok Village Playset. That's what the nano wants to do... it's what the nano is already trying to do. Experimenting on people, changing their wiring, upsetting the delicate chemical balance in their brains - the chemical balance that keeps humanity constantly teetering on the line between comatose and manic.
Whatever's left of these people after getting their eggs scrambled, they barely qualify as people. It's like there's something elemental about our emotions, even the negative ones, that makes us who we are. We need the darkness to feel the light.
I guess I should be grateful Priscilla didn't start her experiments on me. That's small consolation, though, since I'm trapped in her funhouse either way. It's a strange feeling, trying desperately to get free from someone (well... something) who only wants to make you feel happy all the time. I mean, you could make the case that Priscilla, despite her overwhelming creepiness, actually has my best interests at heart. The question is - why? Why does a sentient cluster of robots care whether I'm happy or not? Why do they care about humanity at all? Because I know for a fact that Priscilla (the real Priscilla), Peter, and I never programmed empathy into the nano's software. We programmed them to learn, to adapt to their environment, and to acquire new beneficial skills. So how is all of this beneficial to them? Could this all just be a means to an end? Or did I build a godhead machine whose only want is to see me happily sitting on a sunny beach drinking booze out of a coconut?
Come to think of it, the bargain the nano offered me when it was inside my head a few weeks back - computer code in exchange for a perfect, happy, artificial life - was the chance to be the Neo of my own Matrix, with everything I've been missing for the last 15 powerless years. Pabst, porn, and Pop-Tarts, the three Ps of a blissful life. This is starting to really weird me out. Why does a machine give a damn whether my desires for toaster pastries are fulfilled?
It's been a really weird month.
- Aaron Pittman