52ddcd.jpg
Bright Jade Sea
52ddcd
Ok, Penn, put on your professional face because this system is messy, and there's a lot that you can contribute as an AI scientist to improve it - and, hopefully, make things a little better for the neumono.
First, comment on memory erasure. When working with experimental AI, you often want to erase their memory of a test so that you can run it again with slight variations and see their fresh response. However, memory erasure is very tricky! If you just erase the direct memory files, there'll be lots of leftovers: links in other systems that refer to now-absent files, notes that the AI attached to certain programs to help itself, shortcuts made to make repeat tasks easier, and so on. AIs learn and grow, and a smart AI tries to make things easier and more efficient for itself, leading to unique permutations that a standard memory sweep won't catch - and the longer and more often the AI has been put through cycles, the more of these build up.

And that's just artificial memories! Organic memories are practically made of cross-referencing. Memories are hooked on particular senses, particular emotions, particular actions, all tangled up with each other. If a neumono, say, learned to ride a bike from their uncle, then erasing their memory of their uncle might erase a chunk of their memory of how to ride a bike. Or, if a neumono experiences emotional trauma and you try to remove the traumatic event, you might remove part of the mind's coping methods, and eventually end up with a neumono who can't handle emotional pain at all. So it's hard to believe that complete memory erasures are possible without creating useless neumono - and even if it were, you'd expect it to take longer and be more involved.

Ask him: are you sure the predator is really erasing the memories? Or is it just leaving the memories in place and locking away the connection that would cause those memories to be actively recalled - which would be much easier and quicker, but leave the possibility of memories resurfacing later? In fact, how can you know that the predator understands what you really want done, instead of just doing a surface patch job that looks good enough for you to keep feeding it? You can often have a similar problem with AIs - they might learn what makes you happy with them, without learning what you actually want.
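To put that difference in software terms, here's a rough sketch - a toy Python example with made-up names (MemoryStore, hard_erase, lock), not a description of the facility's actual systems - of why hard-deleting a memory record leaves dangling cross-references behind, while merely locking recall keeps the memory intact and recoverable:

```python
# Toy sketch only: a made-up "memory store" contrasting hard deletion
# (leaves dangling cross-references) with locking recall (memory stays
# intact and can resurface later). All names here are hypothetical.

class MemoryStore:
    def __init__(self):
        self.memories = {}   # memory id -> content
        self.links = {}      # referenced id -> ids of memories that depend on it
        self.locked = set()  # ids whose active recall is suppressed, not removed

    def add(self, mem_id, content, references=()):
        self.memories[mem_id] = content
        for ref in references:
            self.links.setdefault(ref, set()).add(mem_id)

    def hard_erase(self, mem_id):
        # Deletes the record itself, but anything that cross-referenced it
        # still points at a now-missing id: the "leftovers" a sweep misses.
        self.memories.pop(mem_id, None)

    def lock(self, mem_id):
        # Much cheaper: the memory is still stored, only recall is blocked.
        self.locked.add(mem_id)

    def recall(self, mem_id):
        if mem_id in self.locked:
            return None  # suppressed for now, but could resurface if unlocked
        return self.memories.get(mem_id)  # None only if truly erased

    def dangling_references(self):
        # Cross-links that still point at memories which no longer exist.
        return {ref: sorted(deps) for ref, deps in self.links.items()
                if ref not in self.memories}


store = MemoryStore()
store.add("uncle", "memories of their uncle")
store.add("ride_bike", "how to ride a bike", references=["uncle"])

store.hard_erase("uncle")
print(store.dangling_references())  # {'uncle': ['ride_bike']} - orphaned link

store.lock("ride_bike")
print(store.recall("ride_bike"))    # None, but the memory itself is untouched
```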
Second: security. AIs, of course, have to be protected from being hacked. Which is a problem, because as a researcher, you also want to still have easy access to their innards yourself! You can be tempted to use just one big security measure that you know how to get past, but that leaves you with a single point of failure. If it comes down, all your work could be ruined - and, of course, the more you rely on it and the more you put it to the test, the sooner and more likely it is to come down.

In this case, your single point of failure is this predator. What if Four Square gets sick and can't do his job for a week or two? Besides which - what if the neumono can adapt to resist him somehow? What if the process leaves some sort of mental scar tissue that makes part of their mind tougher and less flexible? Or, contrariwise, what if being broken and rebuilt repeatedly leaves them weak-willed in general? Perhaps that's why Korrili felt so compelled to answer your questions: she's been conditioned to be responsive. The whole operation could be in danger from things like that.

Tell him you hope there are failsafes so that the hive won't collapse if your predator can't work for a while. If not, relying on him as little as possible seems wise until you have something in place. You should try to reduce the mental strain on the neumono until then - and afterwards, to limit how often you have to use predator control at all.
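The failsafe worry is the same thing you'd plan for in any system with one critical dependency. A minimal sketch of the pattern, again with invented names and purely as an illustration rather than anything actually running here: check whether the critical resource is available and degrade to a reduced-load mode instead of letting everything fail at once.

```python
# Minimal sketch of avoiding a single point of failure: if the one critical
# resource is unavailable, fall back to a reduced-load mode rather than
# letting the whole operation collapse. All names are hypothetical.

from dataclasses import dataclass


@dataclass
class CriticalResource:
    name: str
    available: bool


def run_maintenance_cycle(resource: CriticalResource) -> str:
    if resource.available:
        return f"cycle handled by {resource.name}"
    # Failsafe path: postpone non-essential work and cut the strain being
    # generated, so there is less to handle when the resource returns.
    return "reduced-load mode: deferring non-essential work"


print(run_maintenance_cycle(CriticalResource("conditioner", available=False)))
```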
And finally, this isn't something specifically from studying AIs, but an AI scientist needs a basic grounding in psychology. Ask him: don't you think that, as non-neumono, we would naturally be tempted to focus on what makes them alien - their empathic connection? It's so unique, so strange, so interesting. Which means we might forget that a hive is also supposed to be friends and family to each other in the same way we are in our social units, with all the corresponding mental bits and pieces that have nothing to do with empathy. If the neumono have an empathic link but not the other factors... well, the natural response is confusion and hurt, but the presence of the empathic connection will make them feel an obligation to suppress their feeling that something is wrong. Obviously you clear up that sense of wrongness when it compels them to come here, but they have to become aware of it for that to happen. If they push it into their subconscious by themselves, it might build up without you being alerted, leaving you unable to do something about it before it's too late. Telling them to be happy won't solve it - organic minds can easily feel two conflicting emotions at the same time.

Tell him that, for example, earlier today, Korrili was approached by a hivemate who wanted sexual release. She said no, but he pressed his attentions anyway. Now, she only refused because she was being considerate of me being right there, but what if it had been something more traumatic for her than that? None of what you just had the predator do would have helped her with it; it would have just made her not think about it. You didn't ask her what specifically has disturbed her recently, or take any measure to straighten out the kinks such events would have put in her mind. That kind of stress will build up over time. To use a metaphor: when the water tank springs a leak, the predator is blocking the holes, but the pressure is still building.

Either take more time with the neumono to clear their minds when they come down here, or take steps to reduce the amount of emotional and mental strain they're picking up. If you don't, one of them is going to have a costly psychotic episode sooner or later.