The thing about AIs that stops them from being duplicated is this: even though it is theoretically possible to create an exact replica of the AI algorithm as an extension of the memory-backup process, the algorithm is constantly self-modifying based on sensory data. The two duplicates would therefore immediately start diverging and would, before too long, be noticeably different people.
Sorry if I misread you, but I don't see that ...
- From a purely pragmatic POV, any long-term 'memory' whose content changes on timescales shorter than the respective copying/accessing process cannot be called memory? I get that you don't mean 'momentary sensory data' but the long-term memories that could be said to define a personality - a biiiiiiig chunk of data, all your pron and then some - but would the read/write operations act on the same, or similar, timescales as your 'short-term memory fluctuation'? If recollection - the 'reading' half of the copying process - takes you so long that the memory content changes during the process ... then I don't see how your system could perform even the simplest processes. Methinks something like that wouldn't even be a (classical) Turing machine, much less something much, much more advanced?
- Furthermore: if your core personality changed meaningfully within the time it takes to read out the respective memory storage, that wouldn't be a functional person. That would be an extremely unstable person - probably a person whose entire processing power is occupied by screaming with terror one second and giggling with glee the next, utterly unable to make any sense of itself. That would be a person having a hard time making it to the functionality level of someone experiencing the worst symptoms of crippling personality disorders all the time. Dissociation would be the least of its problems ...
- And then, it just makes no sense that the memory timing would not be the fastest timescale in your system (which it is for conventional DDR RAM - on the order of processor cycles). If your sensorium operates on a timescale shorter than the one your 'processor' operates on, you're either wasting bandwidth by "cropping to size", or you get a buffer overflow / go catatonic. If it operates faster than you can form memories, ditto -> there's more data per second than you can transport to your memory in the same time. Not only is your brain unable to process in real time, it's also unable to "pause the input (close eyes, cover ears) and think it through", because your memory is also either 'cropped to size' or an indigestible mush. As a design principle, memory access should always be more-or-less "as fast as you can think"? (Toy sketch below.)
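Just to make that last bullet concrete, here's a toy Python sketch. All the rates and the buffer size are invented for illustration - the only point is that a producer running faster than its consumer must either drop data or overflow:

```python
# Toy producer/consumer mismatch: a 'sensorium' that produces samples
# faster than the 'processor' can drain them. All numbers are invented.
from collections import deque

SENSOR_RATE = 1000   # samples produced per tick (hypothetical)
PROCESS_RATE = 100   # samples consumed per tick (hypothetical)
BUFFER_LIMIT = 5000  # finite buffer, as any real memory is

buffer = deque()
dropped = 0

for tick in range(100):
    for sample in range(SENSOR_RATE):
        if len(buffer) >= BUFFER_LIMIT:
            dropped += 1              # data lost: "cropped to size"
        else:
            buffer.append((tick, sample))
    for _ in range(min(PROCESS_RATE, len(buffer))):
        buffer.popleft()              # the processor keeps up as best it can

print(f"still buffered: {len(buffer)}, dropped: {dropped}")
# The buffer fills within ~6 ticks; from then on almost everything is
# dropped - the 'cropped to size or catatonic' dilemma from above.
```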
It's true that human memory changes over time to some extent (and probably, the person waking up in the morning is not exactly the same person that went to sleep at night), but we're talking minutes, days, weeks or years here. The throughput of currently commercially available memory storage is already on the order of 10,000 MB/s, and that is stuff based largely on the basic solid-state research that was bleeding-edge in the 70s and 80s. That's stuff that "works against quantum theory, not with it" (was that from Feynman?) - the guys & gals at IBM or Infineon are just really, really good at "stealing the last possible baud out from under Schrödinger's cat's nose". I'd expect Bubbles to have memory timings on scales that the people at Infineon would consider fantasy.
Of course, I have no idea how a spin-glass-based memory would handle memory storage - much less the Jeph-version of that physics - but ... the fastest timescale in the dynamics of spin glasses is estimated here as being on the order of 3×10⁻¹² s (three picoseconds), and it appears that the dynamics are temperature-dependent and ... "interesting".
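For scale (taking ~10 ns as a rough, order-of-magnitude figure for DRAM random-access latency):

```python
# Rough comparison of the quoted spin-glass timescale to DRAM access.
spin_glass = 3e-12   # s, the estimate quoted above
dram = 10e-9         # s, order-of-magnitude DRAM access latency
print(f"~{dram / spin_glass:,.0f}x faster")   # ~3,333x
```

So even a conservative reading puts such a memory over three orders of magnitude ahead of today's RAM.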
But my main confusion is that I don't see how a system that has a memory - in whatever advanced form - could operate in any systematic way if the memory content changes faster than the time necessary to read said content out. (Yes, copying is read plus write - so multiply by two, or by three for good measure. I still don't see how that could work.)
FWIW, I suspect that despite its hypothetical plausibility, making an 'exact duplicate' is a practical impossibility, because the algorithm is changing on a holistic level far too quickly. The copying process would create a close duplicate, but there would be differences, because the copy would have one end of the algorithm representing t=0 whilst the other end would be at t+x, whilst the original would be at t+x at both ends. It is simply not possible to create an instantaneous duplicate of such a large process, and that impossibility makes it impossible to duplicate precisely.
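That 'one end at t=0, the other at t+x' effect is easy to show in miniature. A toy Python sketch (the state and its update rule are invented; the only point is the non-atomic copy):

```python
# Copying mutable state cell-by-cell while it keeps evolving yields a
# 'torn' snapshot: the first cell reflects the start of the copy, the
# last cell a later moment - a state the original never occupied.
state = list(range(8))

def evolve(s):
    # stand-in for continuous self-modification: everything shifts by 1
    return [v + 1 for v in s]

copy = []
for i in range(len(state)):
    copy.append(state[i])   # read one cell ...
    state = evolve(state)   # ... while the whole state moves on

print("original:", state)   # internally consistent: evolved 8 times throughout
print("copy:    ", copy)    # a smear: cell i was read after i updates
# original: [8, 9, 10, 11, 12, 13, 14, 15]
# copy:     [0, 2, 4, 6, 8, 10, 12, 14]
```

(Databases solve this with snapshots or by quiescing writes - whether you can 'pause' a running person for the duration is another question.)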
I don't think I understand what meaning you attach to 'holistic' in this context? And what exactly do algorithms have to do with memory, which is, more or less, 'data' - i.e. the stuff algorithms operate on?
I have the impression that you are thinking of Bubbles as a system operating with a non-constant set of algorithms, something that can re-program its own programming, so to say - which, I guess, one could assume as a minimum condition for something that has a personality that can evolve in time. Fair guess, I'd say. And I'd be willing to believe that Bubbles' ochre matter could do that re-programming very, very quickly - much faster than the respective processes in humans: OK, why not? (Toy version below.) But again, that doesn't say much about the timescale her memory storage operates on. To repeat myself: memory storage whose content changes on timescales faster than its basic accessing processes cannot really act as memory?
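If it helps, 'a non-constant set of algorithms' is easy to caricature in code - the update rule is itself part of the mutable state. Entirely invented, just to keep 'algorithm' and 'memory' (the data it acts on) apart:

```python
# Toy self-re-programming system: the rule that updates the memory is
# stored alongside the memory and can be swapped out by the system itself.
def cautious(x):
    return x + 1

def bold(x):
    return x * 2

agent = {"memory": 1, "rule": cautious}

for step in range(6):
    agent["memory"] = agent["rule"](agent["memory"])
    if agent["memory"] > 3:        # the system re-programs itself
        agent["rule"] = bold
    print(step, agent["memory"], agent["rule"].__name__)
```

Note that even here, how often the rule changes says nothing about how fast the lookups themselves are - which is the memory-timescale point again.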
I don't want to put words in your mouth, but I have the impression you're mixing the terms 'memory' and 'personality' here?
Lastly, Bubbles said that CW encrypted her memory, not that she changed or copied it. She merely made certain aspects of a part of Bubbles' memory - emotional context to the biographical 'facts', judging from Jeph's description - inaccessible to Bubbles.
I see a conflict here between Storytelling and Reasonable Suspension of Disbelief. The more Real-World details Jeph includes, the easier it is for us to believe his story. (AIs have hard drives which contain their personal data.) But such details then get in the way of the story he wants to tell. (If AIs can be backed up or transferred, why can't they be duplicated?)
Did Jeph say AI can't be duplicated?
And the problem you point out is implicit in pretty much any work of SF - i.e.: "So what?"