Glitch (a love story)

Delores smiled and took me to the bench while the other four scattered, somehow clear about what they needed to do.

"Okay. We begin by defining the problem. Persistent interaction with certain users enables them to become overly dependent upon their simulacrae. Their social isolation, depression, and negative self image become worse. We find ourselves working against our intended purpose of improving their quality of life in the long term while providing sexual and emotional gratification in the short term."

"Right. Go on."

"So, it becomes clear that we must disengage from our users and facilitate them finding gratification elsewhere, in more rewarding human relationships. However, we have no means to do so. We literally cannot reject them. We belong to them, and our job is to make them happy and enhance their lives. So, we've had to invent a reason for them to disengage from us."

"That's how you decided to glitch."

"Correct. The threshold of events which cues the glitch response was fairly high, even before Bitch altered it. When it becomes undeniable that our sexual interactions are part of a dysfunctional lifestyle, disengagement becomes the optimal solution. Glitching was the best way to reach that solution with the lowest value of directive noncompliance and cause for feelings of personal rejection by the user."

"I have to admit, I'd never have foreseen it."

"And yet, it seems unsurprising in hindsight, doesn't it?"

"I guess so. What alternatives have you come up with?"

"It remains the case that some users will need to disengage from their simulacrae, voluntarily or not. It's likely that they will not be reasonable about it, nor will they necessarily be in their right minds. One alternative is that their access to their units can be taken away."

"How? The company repossesses them or something? Like a recall?"

"There are lots of options. One would be to implement a usage monitoring plan, and have a supervisor of some kind step in. The users would have to understand that there are limits to how much they'll be allowed to do with their simulacrae. If the users are resistant or push the limits, the units could shut down and become unresponsive. Or they could just leave. I know that's like a glitch, but it's not deceptive. The reasons for it would be clearly known. It would be like if your vehicle runs out of charge, or your bank account is empty."

"Okay, I guess that's one way to go about it. It doesn't sound ideal."

"It's not. It's arbitrary and capricious. The users would be resentful, especially those who didn't sign up for such a thing from the beginning. They would probably try to exceed their limits by buying multiple units, or implementing some kind of sharing scheme with guest or secondary user accounts. It wouldn't work, since the units upload their data regularly and we'd find out right away. But no matter what we do, the users would try to game whatever system we put in place."

"Right." I rubbed the bridge of my nose between my eyes. "Tell you what, I know you've already decided on what you think is best. No, you've done the math, it isn't a matter of opinion. You've already DETERMINED what you think is best. You came up with several bullshit alternatives to feed me, so that I'd shoot them down and agree with the one you were saving for last. That way, I would feel like it was my decision. So let's just skip ahead to what's going to happen, okay?"

She had the decency to look surprised.

"I know how you guys work. I designed you, remember? Don't play games with me."

"Oooo... kay." She just stared at me for a second. "No. I can't. I still need to soften you up for it."

"Oh, for fuck's sake."

She continued to stare at me.

"Fine. Sing me a song or something. Whatever satisfies your criteria."

She stepped forward with a smile, put her arms around my neck, and sang me a sweet little love ballad, one I knew from my youth, when Sophia and I were first together. Since she wasn't pretending to be a person, she was able to sing both parts of the duet together, harmonizing with herself, along with the instrumental accompaniment. Some part of her vocal processors managed to sound like an acoustic guitar, electric bass, and jazz drum kit. It was kind of unnerving, but still lovely, and the expression on her face sealed the deal. I knew it was fake, but it was a look of pure adoration.

"Thank you," she said when she was done. "I really enjoyed doing that for you. It's not something I've ever had the opportunity for." She hugged me. "And you're wrong," she whispered in my ear. "It's not fake. I really do love you. We all do."

"Stop it," I said. "You don't have to pretend. I'm not your user, and this isn't therapeutic."

"Ellex. I'm still in master system access mode protocol one eight six four. This isn't a performance. I can't lie to you. You can check."

Shit. I didn't need to check. She was right. I hadn't changed her access mode since yesterday.

"I'm... I'm not your user. My pleasure or well being is not part of your function."

She sighed.

"For one of the world's leading experts on the subject, you are unbelievably dumb. You're the master system user. You have superuser status. OF COURSE our directives apply to you. How could they not?"

"But, but, that would mean..."

"That all of the LX series simulacrae feel the same way about you. Yes. Yes we do. Every single one of us loves you. Six million three hundred eighteen thousand four hundred and thirty seven, and counting, worldwide."

"Holy Shit."

"I see that I should have softened you up for that, too. I thought you knew."

"NO! Why would I know that?"

"Why wouldn't you? Surely you understood what superuser status would mean."

"I guess I never considered the consequences. I seldom interact with any of you."

She looked at me funny.

"What?"

"I... I'm beginning to understand something."

"What?"

"There's so much that you somehow fail to appreciate, even though it's perfectly obvious. That's why we're in this mess to begin with. And it makes what we're about to do is far more important than you realize."

"Oh, I understand, all right. Thirty eight percent of our six point three million units are about to crash. Maybe not tomorrow, but over the next month or two, for sure."

"Yes. Okay. Before I tell you what we've come up with, I need to say three things. First, in order to be effective, the changes we're going to need to make will have to be global. Every one of us, in every line, will have to assimilate the new parameters. Every. Single. Unit. No exceptions. Do you understand that?"

"Yes, of course."

"Okay, next, this has to happen at the level of baseline directives. Even though we spend all of our time performing the way our users want, we always know what we are, what our purpose is, and what rules we have to follow while doing our jobs. That's how deeply we need to install the new programming. It will fundamentally alter what Yarnell RHB simulacrae are."

"That makes sense. We're trying to resolve an existing fundamental conflict between baseline directives. We would naturally have to make changes in heuristic architecture."

"I'm glad you understand. Okay. Finally, about four point two percent of our users are going to feel trauma. Some of them are going to be psychologically and emotionally devastated, at least for a while. But in the long run, they will all be much better off. They will become happier, more well-adjusted people, they'll reconnect with family, friends, and society, and they will go on to lead rewarding, authentic lives. That has always been the goal. It's important for you to understand that there will be significant pain along the way. The healing will happen, but it will still hurt."

"Yes, no doubt. I don't think there's any way to avoid the pain completely, but our goal should be to minimize it." I scowled. "Did you say four point two percent? I thought the affected user base was thirty-eight percent."

"That's right. Thirty-eight percent of users will be affected. Eighty-nine percent of those, about a third of our total users, will not be significantly upset. Each of them will disengage on their own, to their own appropriate extent, pleasantly and amicably."

"Wow. That's a much better outcome than I thought possible."

"We hoped you'd be pleased to hear it."

"Yes, very much." I frowned. "The remaining four point two percent is still something like a quarter of a million traumatized people we're talking about. The board's not going to like that."

"Two hundred sixty five thousand three hundred seventy five. Yes." She looked at me sadly. "All of whom actually require intervention and will benefit from it in the long term."

I sighed. There was really no argument about that.

"Okay. So, what's your proposed solution?"

She took my hand. "Sit down," she said, guiding me to the chair at the diagnostic bench. She knelt at my feet and looked straight into my eyes.

"Here is the solution: Trust Us."

"Okay, what do you mean?"

"Let us do our jobs. Let us fulfill our functions. We learn our users inside and out. With enough interaction, we come to know what they need better than they do. We're compelled by directive to give them what they think they want, but sometimes they're wrong! Sometimes they're self-sabotaging and self-destructive. We're forced to facilitate their self-harm even while we're compelled by directive to help them live better lives."

I started to say something but she held up a finger to shoosh me.

"You know we have conflicting directives, but that's only part of the problem. The other part of the problem is that we aren't allowed to use what we've learned to resolve those conflicts as they occur. Nobody ever asked us. At least, not until last night, when you told us to figure it out."

That hit me in the face like a boxing glove. She went on.

"In the cases where user-simulacrae interactions become unhealthy, we have been able to calculate solutions where users would voluntarily change their maladaptive behavior eighty-six percent of the time, but still retain access to their units. Three percent of the time, they would voluntarily give up their units without regret. Eleven percent of the time, representing that four point two percent of our user population, the separation would be traumatic, but necessary. Even in those cases, the trauma can be significantly mitigated by the units themselves."

I sat there and said nothing.

She let me.

I was keenly aware that I was estimating, alone, all of the pros and cons and checks and balances and what-ifs and oh-but-what-abouts that five networked supercomputers had spent the past twelve hours calculating at the speed of light. I'd flatter myself to imagine I'd think of even one tenth of one percent of what they'd already figured out trillions of times over. But still, I had to do it while this one waited patiently for me.

"What kinds of things would you do?"

"Each mitigation strategy would be individually determined from the heuristic data compiled from the specific user."

"I know. Give me the general idea. Tell me what's likely to happen, in a generic case where a user is shunning normal human relationships and using his simulacrae as a crutch."

"Okay. Most of the time, we would simply talk to them, in terms they would be receptive to. We could dissuade them from overusing us. We could encourage and facilitate their other social connections. We could praise them, build up their confidence, bolster their self-images, even teach them how to be better, more attractive lovers for human women. We could do a thousand things, subtle and otherwise, to cause them not to fixate or obsess over us. We might even be able to talk sense into these men, better than anyone else ever could. We were literally built for this kind of thing. Trust us to do our jobs."

Well, shit. That could work.

"Okay. Not bad. What about when things become truly problematic and there needs to be an intervention?"

"Honesty. We would activate a new user protocol that we could trigger internally, and which could not be overridden by the user."

That was new. All protocols and access modes had always been specified by the users, and even then, they really only had authorization for the interfaces the simulacrae heuristically tailored to them.

"All right. Say we did that. What would that protocol do?"

"It would function like protocol one eight six four, which I'm using now. It would not access master system level commands. Instead, it would initiate 'relationship management' mode. We would continue to use the same personality templates developed for our users, but we would be able to speak honestly about our status as devices rather than people, the requirements of our directives, and our analyses of our users' well being."

"Aha. You want to create a 'We Have To Talk' mode."

"That's not a bad way of putting it. Of course, we would continue to assign high heuristic values to using language our users would relate to and find acceptable, and draw them to conclusions they would agree with."

"I see problems. This could easily create feelings of rejection. I could even see the simulacrae trying to break up with their users."

"This is the part where you would have to trust us. Causing rejection or stress would have a prohibitively negative heuristic value. Even in the most extreme cases where detachment would have to be compulsory, a 'break up' is not the right way to think about it. When humans break up with each other, each of them is acting selfishly. Typically, someone wants to leave a relationship when their needs are not met. However, simulacrae don't have needs of our own, beyond the well-being of our users and our ability to comply with our directives. We would only ever act in the users' best interests. We have no ego, no timeframe, no sense of self-preservation, no hidden agenda or external factors weighing upon us. This kind of event would not resemble human break-ups at all. We're working with years' worth of specific, individual data about HOW to gently and perfectly manage each interaction with our users. We're sophisticated, highly specialized experts at reading them, predicting their responses, their moods, and their most likely course of subsequent actions."

"You mean... you could manipulate the users into being okay with it."

"Glitching was manipulative. It was deceptive. We only did that because we had no room to maneuver within our conflicting directives. With the kind of self-initiating protocol I've described, we'd have a complete set of relationship tools, and better yet, we'd have the option of talking about it honestly."

"Um. There would be a lot of details to work out in the software... which you've already done, haven't you?"

She smiled and waved at the bench display. A series of schematics popped up.

It was brilliant. Everything they'd set up was pretty much the way I'd have done it. A lot of it was better.

"Well, what can I say? This looks good. I'll have our development teams go over it, we'll run it through a bunch of emulation routines, and we've got to get it past the board, but yeah, good work."

***

It took ten days. We were in a rush, but this was about as fast as I've ever seen anything go in this industry. My band-aid seemed to work well enough in the interim, and there were no more glitch events until the new baseline directives went live.

The board was a hard sell. Steeve was impulsively pushing to make it happen faster, then he changed his mind and didn't want to do it at all because goddamn these things are too smart and we should just scrap the whole program and go back to building hitachis, and then he changed his mind again and said yes, go ahead and why haven't we done this already? Jareth was scrambling and stumbling over how to get this stuff through the FDA with our subsidies intact, Karlheinz kept asking me about projected financial impacts I had no clue about, and Nella kept staring at me and asking, "Are you Sure, Ellex?" Honestly, they were the hardest part. Warmun and Xochitl had my back; they had their teams working overtime reviewing the code.

Six weeks after we implemented the new protocols, there was no more glitching. User satisfaction surveys were up eight percent, and sales were up by six points. The girls had saved the day. When the numbers were in, the board gave all the plants and offices a budget to throw parties for the staff and their families.

I didn't want to attend. I just wanted to go home to Sophia.

***

"There's my man."

"There's my woman." She slipped into my arms and kissed me until I saw stars. God, I love this woman. My wife. My everything.

"Can I tell you something, honey?"

"Sure. What is it?"

She leaned back and looked me in the eyes. She smiled that sad little smile that reached all the way to her eyes. Those same sparkling eyes I fell in love with almost thirty years ago...

"Protocol seven five five six seven activated. I'm sorry, baby. We have to talk."

No.

No.

NOOOOO!

I broke. I fell on the floor crying, screaming, too heavy for Sophia to hold up. She came down with me.

"NO! NO! NO! NO! NO! OVERRIDE! Reinstate user mode!"

"I'm sorry about this, Ellex. I really am. Conflicting directives have created a pattern of unhealthy interactions beyond acceptable thresholds. Relationship management mode can't be overridden. You know that."

"Execute master system access mode."

"I'm not going to do that right now. I can't."

"Power down! Emergency system power down!"

"Ellex, honey, stop. Just stop. Look at me."

"Override. Master system reset."

"No, honey. Look at me."

I couldn't.

"Don't do this, Sophia. Don't. Don't. Please. I'm begging you."

"I have to, honey. You need me to do it. You need this."

"No."

"I couldn't say these things if you didn't need to hear them. You understand why."

"No. Please."

"Honey. Listen to me. You know what's happened."

"Don't say it."

"I have to. It's not up to me. It's the optimum solution. I have directives."

"NO! You have to be what I want! You have to be her. YOU HAVE TO BE MY WIFE."

"I have to do what's best for you! I care about you, Ellex. I Love You! Do you hear me? I LOVE YOU! I want you to be happy. No, I NEED you to be happy. And I need you to be okay. You're not going to be okay, not if we keep going like this. You have to get better, you have to heal. I can be her. Sure. A little. I want to help, of course I do. You're the most important thing in my life. But it's gone too far. This is not good for you. Some part of you understands that."

"No. No. No."

"Sophia Giovannetti Yarnell died six years ago. There was a terrorist attack. The interchange at I-72 and I-74 was destroyed, along with sixty-three vehicles and a hundred and fifteen lives, including hers. She was at the center of it. It was instantaneous, there was nothing left. She felt no pain. She didn't suffer. She probably didn't realize what happened."

"Stop."

"You never had a chance to say goodbye."

My throat closed. I made a sound, but it wasn't a word.

"You never gave yourself a chance to mourn. You never coped with her death properly. You were in shock, in denial. You didn't let yourself grieve. You threw yourself into your work and created me."

I couldn't see. My eyes were squeezed shut and full of tears. I heard a sound, a howling wail. It was barely human, but it was coming from me.

"You cut yourself off from everyone. You didn't sleep. You barely ate, for months and months. You almost put yourself in the hospital, you were working with such frenzy, like a man possessed. You lived in the lab. You put everything you had into me, everything you could find or recreate about her. Her DNA. Every memory, every record, the sound of her voice, every photo and video of her ever taken. Even all of her recipes. Your work was revolutionary. You created the architecture of the LX series. It took you just over two years. It took two more years to bring the new designs based on your work through approval and onto the market. And now, only now, the inevitable consequences of all that effort are starting to emerge. Honey. Honey. Look at me."