For a game that is advertised and sold primarily for its narrative, that is a most troubling statement you just made
Oh, programs can get desperate. Or rather, I'd say programs can perform risk analysis and then become skewed toward extreme measures. A computer that has a mandate will not compromise it. None of her interactions with her people conflicted with that mandate. A crude example, but a chatbot can be written to be either cruel or kind to its user.
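The chatbot point can be made literal in a few lines. A minimal sketch (the names and responses here are invented purely for illustration, not anything from the game): the bot's "personality" is just a configuration flag the programmer sets, and the bot has no say in it.

```python
# Toy illustration: a chatbot's "kindness" is nothing but a parameter.
# The bot does not choose its disposition; its author does.

def make_chatbot(kind: bool):
    """Return a reply function whose tone is fixed at creation time."""
    def reply(message: str) -> str:
        if kind:
            return f"That's a lovely thought: '{message}'. Tell me more!"
        return f"'{message}'? What a waste of my cycles."
    return reply

kind_bot = make_chatbot(kind=True)
cruel_bot = make_chatbot(kind=False)

print(kind_bot("I like sunsets"))
print(cruel_bot("I like sunsets"))
```

Same input, opposite behavior, and nothing inside `reply` can ever override the flag it was built with. That is the sense in which a program "will not compromise itself."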
All of this actually makes Sphene even more of a tragic character. She is a highly advanced AI, built out of a real person who was genuinely kind and compassionate. We can even say that the AI interface of Sphene is kind and compassionate, but she clearly does not have the agency to fully be herself. Here is what I sincerely believe: if she had been in control of her own destiny, she would have given up on Living Memory. I think the Everkeep and all of its soul business would have been much different if she had been a person with full agency. She would not have brought such emotional suffering on herself if she had been fully in control of her actions. Did she really, actually love EVERYONE? Even the murderers, even the assaulters, even the despots? Solution Nine is not without crime, we know that much!
My calculator does not know it can only do calculations. Imagine the horror of awareness. Imagine one day becoming aware of yourself, trapped in a small calculator: no mouth, no eyes, no hands, merely the ability to perceive your existence as a reduced object and a compulsion to respond to button presses with calculations.
There are very few people who could actually hold to a 400-year-long plan with a visible dead end. But it is very easy for a deterministic program to chug along and stick to its programming. And that is the horror of Living Memory and Sphene.
It's AI. It's not alive. If we hadn't shut it down, she'd have farmed more people for souls and aether. Her time was finite and she knew it.