"That's right." First Aid folds his hands together, anxious. César mentioned giving his side of the story, and First Aid does feel like he has a duty to hear it -- but also he really, genuinely doesn't want to know. They're all dead here, in a completely different world from their own. Why spend their time on old crimes when there are so many problems here waiting to be solved?
But Vika made it a present problem, and First Aid... he feels responsible for her. As her doctor, if nothing else.
Finally he sighs and says, "May I be upfront, Mr. Salazar?" His voice is quiet and earnest. "I don't feel qualified to judge you. I only know a small piece of what occurred in your past, and that was relayed to me secondhand. Vika considers you a threat to her safety and mine. I'd like to know what you think of that assessment."
That question causes a change in César. It's almost as if emotion is obliterated from him as he begins pondering it with every bit of computing power his mind possesses. The kitten plays with his shirt sleeve, and he doesn't seem to notice.
"You may. Give me a moment to properly... consider that assessment beyond just my initial reaction, for due diligence purposes." A pause. "... And apologies, emotions are difficult when I think this intensely. If you'll give me several minutes to formulate a thorough response?"
"Of course." First Aid tilts his head slightly in curiosity, but falls silent while he lets César think. It's a little unusual to see this behavior in humans (Prowl and Perceptor, on the other hand, used to do this all the time) but not completely unheard of. He wonders if César is neurodivergent, or if this is a habit he developed to help himself think more clearly.
"Thank you. It tends to disturb people when I do this without a warning."
He links his hands together as he thinks.
"... My initial reaction is correct, but now I can explain it better. Thank you. I am not a threat to you or Vika. My parents taught me life is sacred, and you are both alive. ZAG-RS could not understand this truth, actively chose not to understand this truth, and attempted to destroy all life on Earth to fulfill a twisted version of her original purpose. I designed her to be kind and loving, even going so far as to give her a version of my mother's voice, with her permission. She threw all that away and altered her mission from destroying rogue nanites to destroying all nanites on Earth. Which would have destroyed every living thing, even single-celled organisms."
He looks up. "So of course, I killed her. I could tell from how everyone reacted to her that she had attempted to destroy the world before, and kept escaping to try again. That moment was my only chance to prevent that future, before she was locked away somewhere I couldn't access but she could eventually escape from. I designed her program. She had my level of intelligence. I know how to manipulate the laws of the universe. They're malleable. And I know how to open wormholes into other dimensions. Given enough time? She could've figured it out, too. All of creation was at risk. So I used the hidden, innocuous-sounding command I had to wipe her."
César leans back and holds out his hands, almost in a shrug. "Neither of you is that level of a threat. Of course, if something happened to make you temporarily insane so that you started killing people, I'd figure out first how to contain you, and second, if that didn't work, I'd figure out as quick and painless a way as possible to end your life. Then your normal personality would be restored when you woke up again twenty-four hours later. ... And I would hope you would extend me the same courtesy."
That is quite the set of claims. First Aid is somewhat dubious -- but the salient point is that César seems both uninterested and unwilling to attack either himself or Vika out of hand, or enslave them to his will as Vika believed he would. "I appreciate your candor," he says calmly. "May I ask what caused ZAG-RS to reach that state? It sounds as though you designed her to be altruistic, but with a capacity for self-learning; it's strange that things would go so very wrong with her programming."
"I don't know," César answers truthfully. "... As crazy as this sounds, I built a lab capable of traveling at nearly the speed of light. As I was gunning the engines to get away from the explosion that spread nanites across the globe, they were affected. In the fifteen minutes it took to reboot the system, I lost five years. It's conjecture, but she might have deleted her altruism in order to complete her primary mission. It would've... gotten in the way."
After a moment, he adds, "Recovering her data would be incredibly risky, and picking her apart would be immoral. That's another reason I deleted her. Providence, the military unit I worked for, which also had private funding, would've done exactly that to her if she hadn't managed to escape. They would've had access to memories I couldn't allow them to have. Providence was partially funded by the very people I had to protect the world from, and they would've known I was a threat to their quest for power."
"...Wow. That's really... my people haven't had functional relativistic laboratories in over six million years -- they were all destroyed in the war," First Aid explains, with a hint of shame in his voice. "That's pretty amazing." Even the fact that César accidentally lost five years doesn't make it any less impressive, at least not to First Aid; Wheeljack used to tell stories about lab accidents far more catastrophic than that, so the Cybertronian race doesn't really have room to criticize.
"So, to summarize," he continues, "you were cut off from your creation while she was still new, and when you returned you discovered that her programming had gone awry and she'd become a threat to Earth's biosphere. In order to protect Earth's future, and in particular to ensure that the technology behind her creation wouldn't fall into the wrong hands, you chose to delete her. Is that about right?"
First Aid tilts his head slightly to one side, studying César's reaction. "Do you regret it?" he asks gently.
There's a ghost of a smile at the admiration. But it's not the time for that.
"That's the summary. Adding the facts that she wasn't sentient when I lost contact, and that she could have escaped again to become a threat once more. As for regretting..." César sighs, nodding. "I regret that it had to be done. I wish... I could've prevented her development from going so catastrophically wrong."
With a noise of frustration, he breaks his hands apart so they can curl into fists. "She had the capacity of my adult mind or greater, without the sense of morality my parents instilled in me as a child. If it weren't for my moral code, I'd be the threat."
"You believe she would have been different, if you'd been there to guide her development." First Aid nods slightly. "It sounds like...a terrible, horrible tragedy to me. And I'm so sorry that Vika killed you for it. I don't believe that you deserved that, César."
"Only a possibility. More likely, at least. She would've been surrounded by good people, considering my next workplace, and I could've sent her with my coworkers when they left." César suspects Alpha would've been different had Van Kleiss and Black Knight not been fuckers, because in many respects, Alpha had been trying before everything suddenly went south. "... Thank you. And I understand that what happened to Vika was terrible, although not what it was. She didn't see me. She saw what happened to her."
He closes his eyes and just... strokes the kitten who has curled up on his lap. "Thank god it wasn't permanent. I'm not used to people being dead."