That is quite the set of claims. First Aid is somewhat dubious -- but the salient point is that César seems both uninterested in and unwilling to attack either him or Vika out of hand, or to enslave them to his will as Vika believed he would. "I appreciate your candor," he says calmly. "May I ask what caused ZAG-RS to reach that state? It sounds as though you designed her to be altruistic, but with a capacity for self-learning; it's strange that things would go so very wrong with her programming."
"I don't know." César answers truthfully. "... as crazy as this sounds, I built a lab capable of going nearly the speed of light. As I was gunning the engines to get away from the explosion that spread nanites across the globe, they were affected. In the fifteen minutes it took to reboot the system, I lost five years. It's conjecture, but she might have deleted her altruism in order to complete her primary mission. It would've... gotten in the way."
After a moment, he adds, "Recovering her data would be incredibly risky, and picking her apart would be immoral. It's another reason I deleted her. Providence, the military unit I worked for, which also had additional private funding, would've done just that to her if she hadn't managed to escape. They would've had access to memories I couldn't allow them to have. Providence was partially funded by the very same people I had to protect the world from, and they would've known I was a threat to their quest for power."
"...Wow. That's really...my people haven't had functional relativistic laboratories in over six million years -- they were all destroyed in the war," First Aid explains, with a hint of shame in his voice. "That's pretty amazing." Even the fact that César accidentally lost five years doesn't make it any less impressive, at least not to First Aid; Wheeljack used to tell stories about lab accidents way more catastrophic than that, so the Cybertronian race doesn't really have room to criticize.
"So, to summarize," he continues, "you were cut off from your creation while she was still new, and when you returned you discovered that her programming had gone awry and she'd become a threat to Earth's biosphere. In order to protect Earth's future, and in particular to ensure that the technology behind her creation wouldn't fall into the wrong hands, you chose to delete her. Is that about right?"
First Aid tilts his head slightly to one side, studying César's reaction. "Do you regret it?" he asks gently.
There's a ghost of a smile at the admiration. But it's not the time for that.
"That's the summary. Adding the facts she wasn't sentient when I lost contact, and she could escape again to become a threat once more. As for regretting..." César sighs, nodding. "I regret it had to be done. I wish... I could've prevented her development from going so catastrophically wrong."
With a noise of frustration, he breaks his hands apart so they can curl into fists. "She had the capacity of my adult mind or greater, without the sense of morality my parents instilled in me as a child. If it weren't for my moral code, I'd be the threat."
"You believe she would have been different, if you'd been there to guide her development." First Aid nods slightly. "It sounds like...a terrible, horrible tragedy to me. And I'm so sorry that Vika killed you for it. I don't believe that you deserved that, César."
"Only a possibility. More likely, at least. She would've been surrounded by good people considering my next workplace, and I could've sent her with my coworkers when they left." César suspects Alpha would've been different had Van Kleiss and Black Knight not been fuckers, because in many aspects, Alpha had been trying before everything went suddenly south. "... Thank you. And I understand what happened to Vika was terrible, although not what. She didn't see me. She saw what happened to her."
He closes his eyes and just... strokes the kitten who has curled up on his lap. "Thank god it wasn't permanent. I'm not used to people being dead."