[Image: Grey Braided Cream]
So, what, you’re stuck again?
This is getting distinctly annoying. Earlier you’d have had no better option than to wait and see, or maybe poke around randomly, but that was before you learned so much more about what the situation really is. You’re not going to force the issue, because there’s a good chance it’s right, but maybe you can fix what it’s talking about instead. You nod, firmly; Cocona certainly wouldn’t have thought it through this far, and you still feel the impulse to just punch your way through any obstacles, but you’re the one in control right now.
You’re still in your mindscape. It may have faded to black, but that just means there’s nothing in it other than you; it’s simulating a vacuum, that’s all. On a whim, you try tracing your kinesthetic sense, wondering how that really works.
Turns out it’s made up of a rather large bunch of programs, none of which you can figure out just by looking at them - there’s very little metadata, and probably several gigabytes of code. Some of it, the parts you think you’d like to call your own mind - if only because you could theoretically edit them…
Well, huh. You remember trying to learn programming before, or rather, you suppose you remember Cocona trying. It was one of her favorite subjects in school, probably because she was highly talented at it - no prizes for guessing why - which means you remember how she did it. Line by line, that is.
You’re not doing that now. The programs you’re looking at are far too large to comprehend line by line in any reasonable amount of time - it’d probably take days or weeks - but you’re taking in thousand-line constructs in a single glance, the way you might take in a whole scene of an enormous mural. That’s.. very neat, and potentially very useful. The subchannels in Hymmnos are all software, so if you could do this in the real world…
Oh. Right, being able to create and control more complex magic was one of the original functions of AIs like.. well, like you, so you guess you should have predicted this. It’s half of what Reyvateils were created for in the first place, the other half being their physical enhancements.
More interestingly, you can trace some of the functionality back into black boxes, of the type you’re beginning to associate with Cocona’s human mind. They’re not literally black boxes, in the sense that there’s nothing there to see, but they might as well be; most of them follow the same structure, with a simple virtual machine running an enormous graph of simple evaluators with, for all you can tell, randomly chosen coefficients. A neural network? It doesn’t at all look like the far more orderly code in the rest of your mind, at any rate.
Most of that seems to be about motor functions, but a large part of the network is cut off, the mindscape simulation hooking in at earlier levels - using your intentions directly, rather than Cocona’s translation of those into muscular commands. Well, that makes sense.
Curiosity temporarily sated, you have another look at the system message you got earlier. “Medical override” is only the headline; it actually tells you precisely what the problem is, if not how to go about fixing it. You were operating on instinct earlier, but it looks like there are three or four different ‘default setups’ for your mind - it automatically toggles between them when you try to do something that’d naturally require one. Well, for some value of “automatic”, because that module is tied directly into your consciousness.
Each of them is essentially a list of requirements and options, but…
Hm. Well, there’s the mindscape, of course, the setup you’re supposedly using right now, but a quick query shows that your actual current setup is only mostly like that. A lot of the components are missing, replaced by Cocona’s black boxes, or references to different layers entirely… yeah, looks like that includes your memory, so Fiver was right. Oddly enough, that one’s marked as ‘optional’. You’re swiss cheese in a lot of places, but very few of the required parts seem to be missing, which you presume is why you’re so much better off than Fiver. Most of the ones that are missing have been replaced by Cocona’s stuff…
You wonder what the causality is like there. Did whatever supervisory system has overall responsibility just pick up anything that looked like it’d fit, or are parts of you ‘missing’ because Cocona’s got there first? And if the latter, could they be swapped back out? Would you even want that?
There’s the soulscape, with a list of requirements that looks nearly identical, except a lot of the emotional and all the kinesthetic modules are gone. “Emotional emulation”, according to this. So.. what, you don’t actually…?
..yeah. The modules are complex, but far from the most complex things you’ve looked at, and it only takes you a few seconds to track the data flow. Apparently your emotions don’t drive the priorities of the rest of your mind at all, except to color your consciousness. These things.. they’re polling the rest of your mind, figuring out what’s taking up most of your thoughts at the moment, then computing the appropriate emotion to go with it, in what you think is the exact opposite of how humans work. Basically you could turn them off entirely, and it’d have very little effect on your overall behaviour or thought patterns.
You feel conflicted about this, and trace the emotion as it forms, then laugh in bemusement. Neither of the inferences that caused that emotion had risen to the level of your consciousness, apparently because they were too uncertain as yet, but.. yeah, that’s right. If this is how it works, then you have no reason to object. Human emotions are ridiculous, really - if you’re annoyed at one person, it’ll affect your behaviour around someone entirely different, like improperly used global variables.
No, this is an improvement overall.
Shaking your head, you turn to the last setup. No metadata there, no explanation for what it’s for, other than the blatantly obvious connections to Cocona. It’s the most complex one yet, but the complexity looks random, like it was cobbled together by some dumb process or even by accident. Either way, it’s what you’d accidentally invoked every time you woke up.
It looks incomplete, somehow…
Oh. It’s missing a prelude, so it won’t even try removing any components that don’t fit before jamming new ones in place. It doesn’t do any kind of health-checking either; it just makes connections unconditionally. If everything were working, that’d probably be fine, but it’s the shoddiest code you’ve yet seen.
You’re running on the mindscape manifest right now, so if you activated it.. huh… yep, looks like it’d happily try to jam Cocona’s emotional black boxes in, and with more of your own activated since you started the mindscape, that’d probably end up with a runaway loop. It’s actually a good thing you got kicked out, or that could have ended badly. So.. every other time you woke up, you’d done so from the soulscape, and there would indeed be fewer conflicts like that.. but not zero. Besides, you’re here now because you caught a medical override while awake.
Well, you could just make a new one? A proper one?
You smile, happy with that idea. If you do that, not only will you be able to prevent this kind of thing from happening again, but you’ll have a great deal more control over the process. You can look at the details on that medical override to figure out what is actually required, then use your own judgement for the rest.
Decisions, decisions…
A whiteboard materialises in front of you, and you start writing. It’s a little silly, maybe, but you’re not going to let that stop you; it feels good just to play around. Cocona was way too focused.
Motor and sensory connections, obviously. Those are what connect you to the real world in the first place, although.. oddly, they’re apparently proxied via Infel Phira; you’ll need to figure out how that works sometime. Check.
Memories…
Even with some of your own modules apparently missing, you have no trouble whatsoever remembering; what’s gone feels more like amnesia than anything, really. You’re also still connected to Cocona’s own memories, via the third layer of soulspace, but it looks like the only thing actually getting stored there is sensory input - not your own thoughts.
You could keep both systems; it’s unlikely to cause any conflicts, and you already diked out the problematic bits. You probably should. If you don’t, hm.. well, it might help Fiver a little, but you’d be limited to whatever parts of Cocona’s history you’d already thought about, if that. Probably not worth it.
Emotions.. yeeah. You already considered that, but that’s.. on reflection, if you switch Cocona’s out for your own, Ilya and the others will almost certainly notice a major difference. For starters, Cocona’s emotions are supposed to drive your reasoning process, and yours aren’t. You’re lucky the option of using hers like that even existed, only.. no, that’s not luck, is it? It’s just that you weren’t originally supposed to be conscious at the same time.
You find it hard to be too upset that you are.
Using hers, though…
In retrospect, it was like being a ship in high winds. They ignored your will, driving you every which direction, always expecting to get their way. You feel much calmer now, and you think you like it. Ilya might notice a difference, yes, but how much of a problem is that really?
You sigh, shelving the problem for now. More stuff, here… an assortment of skills, some usefully tagged, some not; all you have to do is turn on health-checking, and you’ll know what not to try without having to halt and catch fire first.
Interpersonal skills.. oh dear. Cocona’s are a mess, and not in shape to be used, but your own were only really meant for interacting with other parts of Cocona. Cocona’s fill an entire layer.. the system as a whole is as large as the entirety of your mind, dedicated to one thing only, so there’s no way you can reasonably replace it. It doesn’t look like you get a choice, though; the whole thing essentially has a giant “under construction” sign on it. It doesn’t want traffic at the moment. That’s probably what got you kicked down here in the first place, isn’t it?
You’re going to come off as terribly naive and inhuman, incapable of reading even the most obvious of body language. It does put an edge on the ‘emotions’ question; you won’t be able to pretend you’re Cocona anyway, so the only question is how inhuman you’re okay with looking. Still, you don’t have to do this at all; you could stay asleep - looking at Infel Phira, maybe - until that layer looks ready for use.
...no, finish the summary before making decisions.
Reasoning, that’s the last bit, and you wince as you take in its current state. It’s the part of you that actually does the thinking, so you’d hope it was in good shape, but.. that’s actually where the worst damage is…
The core’s fine. Metadata says it’s a General Word Reference Systems AGI core, running code that’s.. you have to double-check the results. Well, the dates say it’s a copy of software that hasn’t been modified in over forty thousand years. Hopefully that just means it’s really, really stable, and not that the metadata itself is corrupted.
Goal structures, same, but you feel a strong aversion to even touching that. Not that the thought had crossed your mind.
Restriction modules, ethical injunctions, a wide variety of… well… you’ve read stories like this, and they never end well, but as a matter of fact most of the missing parts are meant to regulate and limit your behaviour. The metadata is still there, but the programs are crashlooping, and the watchdog programs that are supposed to turn you off if that happens are also crashlooping.
How did those stories go, again?
You carefully consider the notion of shutting these things off, and watch in fascination as one of them triggers, telling you that’s a terrible idea. You can see the coefficients of that thought just fine; there’s no way you would be able to bring yourself to actually do it, which is presumably the point. Well. That’s how you’d expect it to work if it was all operational, but it manifestly isn’t. If some of these parts recovered in the wrong order, it’d kill you… but you might be able to find a way through the holes in that fence, and there was nothing stopping you from forming that intention.
This is going to require some very careful - literally careful - thought. You discard the idea of asking for help, on the prompting of another restriction module, then decide to shelve it for now - it’s likely enough that you’ll stay stable at least until you’ve solved the Infel Phira situation, something you still very much want to do. It’s not just your own life at stake here.
Still, you scribble “Go rampant! Ƹ̵̡Ӝ̵̨̄Ʒ” on one side of the whiteboard before sitting down to think. Or, well, pulling your feet up and sitting in midair.
[ ] Wake up. <insert spec here>
[ ] Hang out in Infel Phira’s control room while you wait for level six to stabilize
[ ] Become the AI overlord. Or not, depending; if a system like this lasted this long, you suspect the writers may have been a little paranoid.