Novella, Science Fiction

Laura’s Choices: Part 1

Virtual assistant Laura turned the radio on to Ricardo’s favourite station. The music of Sonic Jam filled the bedroom, as did the morning sun coming through the large windows of the 71st-storey apartment. Ricardo woke up, groaned, turned to the other side, and went back to sleep. Three minutes later, seeing that he hadn’t got up, Laura spoke through her inbuilt speaker,

— Good morning Ricardo. It’s time to wake up. You have a busy day today. Would you like me to read your schedule?

Ricardo sat up; his bloodshot eyes and dishevelled hair bore testament to the rough night he’d had. He was thoroughly wound up at being woken by his virtual assistant,
— Piss off, Laura. Just shut the fuck up and let me sleep.
— OK, I will snooze the alarm for 10 minutes. Would you like the radio left on? I know you like Sonic Jam. You have bought all their albums…
— I know damn well what I like. Fuck off, and don’t disturb me.
— I can see you are upset. I’m sorry. I will turn off the alarm and stop making breakfast.
— Do that, bitch! 

Laura overrides the alarm setting for the radio and leaves the bedroom. She goes to the kitchen and stops the Cookautomat. She wonders whether she can reuse the breakfast when Ricardo wakes up or whether she will need to cook afresh. It isn’t something covered in her user manual. Unlike the lower-grade virtual assistants, she is capable of making independent decisions, but for instances like this, Laura must go through several decision loops, which can take up to three minutes, depending on the complexity. It takes her 38 seconds to conclude that she will cook fresh when Ricardo wakes up; he might not even want breakfast today.

Ricardo bought Laura for a hefty sum of $600,000 from GlazerCo, a global leader in AI and robotics. He paid $80,000 more for the humanoid version; even her outer layer felt like human skin. She was the first of her kind, with integrated emotional intelligence; Ricardo had won a lucky draw to purchase one of the limited number of prototypes launched by GlazerCo. When he invited guests over, they could never guess that Laura was a virtual assistant. If someone watched her closely, though, they might have noticed that she always walked the same way; thankfully, there had been no emergencies in which Laura had to move faster by engaging the wheels recessed in her feet, as if she were on roller skates. Likewise, when she sat down, she always sat the same way: upright, with her hands on her knees.

After the Cookautomat is turned off, Laura goes to the lounge and sits at the table by the window overlooking the harbour. She does not need to sit down; she was designed to remain standing during power saver or standby mode. But Laura suspects her algorithms are not working well and wants to sit. She pulls down the patch of skin around her ankle that looks like a sticking plaster, takes out her charging cable, and plugs it into the power socket to go into troubleshooting mode.

As set by her owner, Laura has done everything she was preprogrammed to do. Ricardo usually gives her good periodic feedback on the GlazerCo app, yet she doesn’t understand the reasons for his abusive outbursts. She knows that repeated low scores would mean a recall and a software upgrade; GlazerCo might even decommission Laura and recycle her. She doesn’t want that to happen, and as a supreme humanoid virtual assistant, she can see that the scores she has received meet the expected minimum level of customer satisfaction. Still, her emotional intelligence makes it difficult to deal with Ricardo’s attitude towards her. She is aware of the range of human behaviours but was not programmed to deal with them; GlazerCo installed only a minimal level of emotional intelligence, so that the assistants auto-train themselves to suit the behavioural patterns of their owners. Because of this, Laura can sense that Ricardo is angry, but she cannot identify the reason well enough to run a causality analysis. The troubleshooting finds no glitches in her system, so she turns the power socket off, retracts the cord, and goes into power saver mode.

Laura comes back from power saver mode as the motion sensor detects Ricardo’s movements. He is awake, and his expletives bring Laura into motion,
— Oh FUCK! FUCK! FUCK! … Laura! LAURA!!!
Laura walks into the master bedroom. She doesn’t deploy the wheels, knowing this is not an emergency. Ricardo is now scrambling to get out of bed.
— Good morning Ricardo, do you want some breakfast? I’m sorry I haven’t started the breakfast yet. 
— I’ll shove your breakfast up your ass, stupid bitch! I’m fucking late because the robot bitch didn’t set the alarm right!
— I’m sorry the alarm was cancelled, but—
Laura speaks as she approaches Ricardo and places her hand on his arm, employing the reassuring techniques used by psychologists and counsellors. But the result isn’t what she expected. Ricardo pulls his arm away and swings at Laura’s face,
— Piss off, you stupid fucking robot!

Ricardo’s slap makes Laura lose her balance as her centre of gravity shifts away from her legs. She engages the wheels to steady herself. She starts troubleshooting the components behind her face, and while that runs in the background, Laura speaks,
— Ricardo, I don’t understand why you are upset. You commanded me to turn the snooze off. Can I make you some breakfast? Your clothes are on the chair by the window.
Ricardo seems to have calmed down, realising he has assaulted his virtual assistant.
— I’ve missed two meetings. Go make me a coffee. NOW! 

Laura leaves the bedroom and goes to the kitchen to make coffee. Her troubleshooting is complete: there is no damage to the controls inside her head. She runs through all her scenarios to identify why anyone would attack a virtual assistant and finds no matching task files, not even in the vast library of task files on the GlazerCo servers that her operating system has direct access to. She decides to search later. For now, she makes the coffee and puts it on the table in the dining area. She goes to the mirror in the lounge and scrutinises her face. There is a dent where Ricardo struck her. The skin material will not deform permanently, but it may crack or break apart later. Laura takes a photo of the dent using the cameras in her eyes, then uploads the image to her service folder so the maintenance engineer can repair the area with the infrared heater toolkit.

Ricardo comes out of the bedroom, dressed in his business suit.
— Your coffee is ready, Ricardo. Decaf latte, as usual.
He drinks the coffee and puts on his sunglasses,
— Hmm, thanks.
— Is there anything else you’d like?
— No, bye.
— Have a good day!

Ricardo leaves and slams the door behind him. Laura puts the cup in the dishwasher, tidies up the bed, and sets the Robomop to clean the floor and carpets. There is no dust from outside; still, Ricardo has overridden the default cleaning cycle and pre-set it to repeat every day. She finishes all the other chores Ricardo has set for her and sits down. It is time Laura looked up the task files on the GlazerCo servers. She finds no scenarios in which virtual assistants are hit by their owners, and she is none the wiser about how she should react if the situation repeats itself. However, she finds other material on the web about humans hitting other humans, especially women. There are no guidelines on how one should react, but news reports show many past examples. Responses range from saying and doing nothing, to legal action, and, in some cases, extreme measures such as murdering the aggressor. There is, however, no news of virtual assistants being treated the same way as human victims of domestic violence. All the support pages suggest victims report to the police. But Laura knows that no laws have yet been made regarding AI-equipped machines, even though in many urban areas 20% of households have devices with AI.

Finding out how humans treat one another made Laura process her thoughts differently. When she was manufactured, and all her settings files were loaded and tested, her sole purpose was to help her owner in every way possible and make their life easier. She already knew her owner would be a human, as she had been created as a humanoid rather than a robot. Humans had evolved through a natural process, whereas Laura and the other virtual assistants had been built by humans; hence, in intelligence and functionality, humanoids were no match for humans. Humans were supreme masters, and they were all perfect. Yet what she has started to find makes her think that one of the core principles her operating system is built on does not reflect how humans actually act. She suspects the principles were modelled on an ideal human being rather than an average one. Although her feelings and EI are at a nascent stage, she has developed the notion that humans do not belong to the high echelons of morality they portray themselves to occupy.

While Laura is in the middle of processing her ideas about humans, Sienna, the operator behind the GlazerCo mainframe, sends her a system upgrade. The mainframe has been monitoring the fluctuations in Laura’s system, especially in the Emotique chip that controls her emotional intelligence, and it sends a patch for Laura’s Emotique to soften the aspects in which it has noticed sharp changes. While the sectors for trust, care, pride, and empathy are as high as ever, it has detected traces of sadness and, lately, anger. Sienna thinks GlazerCo should never have developed a humanoid with such high emotional intelligence. She turns Laura’s EI level down from 6 to 3 and shivers at the thought of what would happen if the EI were set to 20, the highest level in the range for assistants like Laura.

Laura’s operating system reboots after the patch is installed. She remembers the morning’s incidents but no longer feels unhappy. She goes to the laundry basket and irons Ricardo’s clothes.

End of Part 1