
WCDT Strips 3376 - 3380 (19th to 23rd December 2016)


BenRG:
The thing about AIs that stops them from being duplicated is that, even though it is theoretically possible to create an exact replica of the AI algorithm as an extension of the memory backup process, the algorithm is constantly self-modifying based on sensory data, so the two duplicates would immediately start diverging and would, before too long, be noticeably different people.

FWIW, I suspect that despite its hypothetical plausibility, making an 'exact duplicate' is a practical impossibility, because the algorithm is changing on a holistic level far too quickly. The copying process would create a close duplicate, but there would be differences, because the copy would have one end of the algorithm representing t=0 while the other end represented t+x, whereas the original would be at t+x at both ends. It is simply not possible to create an instantaneous duplication of such a large process, and that makes a precise duplicate impossible.
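
A toy sketch of what I mean, in Python (the thread, the sizes and the timings are all made up purely for illustration - the point is just that a cell-by-cell copy of something that keeps rewriting itself ends up with one end older than the other):

--- Code: ---
import threading
import time

SIZE = 50
state = [0] * SIZE          # stand-in for the mind's long-term memory
running = True

def keep_mutating():
    """The 'mind' keeps rewriting its whole state as it thinks."""
    tick = 0
    while running:
        tick += 1
        for i in range(SIZE):
            state[i] = tick
        time.sleep(0.001)   # one 'thought' per millisecond

mutator = threading.Thread(target=keep_mutating)
mutator.start()

# A naive sequential copy that is slower than the rate of change:
# read one cell, and the state has already moved on before the next read.
snapshot = []
for i in range(SIZE):
    snapshot.append(state[i])
    time.sleep(0.01)        # pretend each read takes 10 ms

running = False
mutator.join()

# One end of the copy is older than the other: internally inconsistent.
print("first cell was copied at tick", snapshot[0])
print("last cell was copied at tick ", snapshot[-1])
--- End code ---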

Case:

--- Quote from: BenRG on 23 Dec 2016, 23:43 ---The thing about AIs that stops them from being duplicated is that, even though it is theoretically possible to create an exact replica of the AI algorithm as an extension of the memory backup process, the algorithm is constantly self-modifying based on sensory data, so the two duplicates would immediately start diverging and would, before too long, be noticeably different people.

--- End quote ---

Sorry if I misread you, but I don't see that ...  :-\


* From a purely pragmatic POV, any long-term 'memory' whose content changes on timescales shorter than the respective copying/accessing process can hardly be called memory? I get that you don't mean 'momentary sensory data' but the long-term memories that could be said to define a personality - a biiiiiiig chunk of data, all your pron and then some - yet you seem to assume the read/write operations would act on the same, or similar, timescales as short-term memory fluctuation? If recollection - the 'reading' half of the copying process - takes so long that the memory content changes during the process ... then I don't see how your system could perform even the simplest operations. Methinks something like that wouldn't even be a (classical) Turing machine, much less something much, much more advanced?


* Furthermore: if your core personality changed meaningfully within the time it takes to read out the respective memory storage, that wouldn't be a functional person. That would be an extremely unstable person - probably a person whose entire processing power is occupied by screaming with terror one second and giggling with glee the next, utterly unable to make any sense of itself. That would be a person having a hard time reaching the functionality level of someone experiencing the worst symptoms of crippling personality disorders, all the time. Dissociation would be the least of its problems ...


* And then, it just makes no sense that the memory timing would not be the fastest timescale in your system (which it is for conventional DRAM - on the order of processor cycles). If your sensorium operates on a timescale shorter than your 'processor' does, you're either wasting bandwidth by "cropping to size", or you'd get a buffer overflow / go catatonic. If it operates faster than you can form memories, ditto -> there's more data per second than you can transport to your memory in the same time. Not only is your brain unable to process in real time, it's also unable to "pause the input (close eyes, cover ears) and think it through", because your memory is also either 'cropped to size' or an indigestible mush. As a design principle, memory access should always be more or less "as fast as you can think"?
It's true that human memory changes over time to some extent (and probably, the person waking up in the morning is not exactly the same person who went to sleep at night), but we're talking minutes, days, weeks or years here. The throughput of currently commercially available memory is already on the order of 10,000 MB/s, and that is stuff based largely on the basic solid-state research that was bleeding-edge in the 70s and 80s - stuff that "works against quantum theory, not with it" (was that Feynman?); the guys & gals at IBM or Infineon are just really, really good at stealing the last possible baud out from under Schrödinger's Cat's nose. I'd expect Bubbles to have memory timings on scales that people at Infineon would consider fantasy.
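
Just to put a number on that - a back-of-envelope in Python, with both the state size and the bandwidth pulled out of thin air purely for illustration:

--- Code: ---
# Back-of-envelope: how long would a full read-out take at a given bandwidth?
# Both figures below are invented purely for illustration.
state_size_bytes = 10 * 1024**4            # pretend the mind-state is 10 TiB
bandwidth_bytes_per_s = 10_000 * 1024**2   # ~10,000 MB/s, roughly 2016 DRAM

copy_time_s = state_size_bytes / bandwidth_bytes_per_s
print(f"full read-out: ~{copy_time_s:.0f} s (~{copy_time_s / 60:.0f} min)")

# For the copy to come out meaningless, the *long-term* content would have
# to change significantly within that window - minutes, not picoseconds.
--- End code ---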

Of course, I have no idea how a spin-glass-based memory would handle memory storage - much less the Jeph version of that physics - but ... the fastest timescale in the dynamics of spin glasses is estimated here to be on the order of 3×10⁻¹² s (three picoseconds), and it appears that the dynamics is temperature-dependent and ... "interesting".

But my main confusion is that I don't see how a system that has a memory - in whatever advanced form - could operate in any systematic way if the memory content changes faster than the time necessary to read said content out. (Yes, copying is read plus write - so multiply by two or three, for good measure. I still don't see how that could work.)


--- Quote from: BenRG on 23 Dec 2016, 23:43 ---FWIW, I suspect that despite its hypothetical plausibility, making an 'exact duplicate' is a practical impossibility, because the algorithm is changing on a holistic level far too quickly. The copying process would create a close duplicate, but there would be differences, because the copy would have one end of the algorithm representing t=0 while the other end represented t+x, whereas the original would be at t+x at both ends. It is simply not possible to create an instantaneous duplication of such a large process, and that makes a precise duplicate impossible.

--- End quote ---

I don't think I understand what meaning you attach to 'holistic' in this context. And what exactly do algorithms have to do with memory, which is, more or less, 'data' - i.e. the stuff algorithms operate on?

I have the impression that you are thinking of Bubbles as a system operating with a non-constant set of algorithms, something that can re-program its own programming, so to say - which, I guess, one could assume as a minimum condition for something with a personality that can evolve in time. Fair guess, I'd say. And I'd be willing to believe that Bubbles' ochre matter could do that re-programming very, very quickly - much faster than the respective processes in humans: OK, why not? But again, that doesn't say much about the timescale her memory storage operates on. To repeat myself: memory storage whose content changes on timescales faster than the basic accessing processes cannot really act as memory?

I don't want to put words in your mouth, but I have the impression you're conflating the terms 'memory' and 'personality' here?

Lastly, Bubbles said that CW encrypted her memory, not that she changed or copied it. She merely made certain aspects of a part of Bubbles' memory - the emotional context to the biographical 'facts', judging from Jeph's description - inaccessible to Bubbles.



--- Quote from: Perfectly Reasonable on 23 Dec 2016, 19:49 ---I see a conflict here between Story Telling and Reasonable Suspension of Disbelief. The more Real World details Jeph includes, the easier it is for us to believe his story. (AIs have hard drives which contain their personal data.) But such details then get in the way of the story he wants to tell. (If AIs can be backed up or transferred, why can't they be duplicated?)

--- End quote ---

Did Jeph say AI can't be duplicated?  :-o

And the problem you point out is implicit in pretty much any work of SF - i.e.: "So what?"

BenRG:
In the terms I'm using, 'holistic' means that the entire intelligence algorithm is always changing in multiple places, in unsynchronised and usually unpredictable ways. Furthermore, changes in one place will trigger changes in multiple other places, and even if you had two identical start points, randomising factors in the algorithm's matrix mean there is no guarantee identical algorithms would process identical data in identical ways. What that means is that you do not know in advance what part of the algorithm may change due to processing of sensory data (and further reprocessing of existing data, what we call 'memories').

Just because the AI must by necessity have a memory space large enough to run the entire process does not mean that any copy process is powerful enough to take a 'snapshot' of the entire memory in one go. Much would depend on the data-processing rate of the copying device but, in practice, it would copy a series of parts one after the other, allowing differences to creep in, since parts of the algorithm outside the current copy bracket may be - and, in fact, probably are - changing.

Just to make things interesting, most duplicating processes go back to the original to verify copy integrity before finalising the copy. That is impossible in this case, because the active AI process will necessarily have changed from its state when the original copy took place. So there is no safeguard to prevent corrupted copies with delightful personality disorders from being created. Basically: "Yes, you can copy an AI, so long as you don't mind creating something profoundly damaged, likely to self-destruct and possibly even dangerously behaviourally unstable."
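
A sketch of why that verify step falls apart against a live source (pure toy Python; the hashing and the random 'drift' are just stand-ins for the real thing):

--- Code: ---
import hashlib
import random

def checksum(cells):
    """Hash a list of byte-sized ints, stand-in for a copy-integrity check."""
    return hashlib.sha256(bytes(cells)).hexdigest()

# The AI's state at the instant the copy was taken, and a flawless copy of it.
original = [random.randrange(256) for _ in range(1024)]
duplicate = list(original)

# ...but the original keeps running while we try to verify.
for _ in range(10):
    original[random.randrange(1024)] = random.randrange(256)

# The classic verify step compares the copy against the *current* original,
# which has already moved on - so even a perfect copy fails verification.
print("copy matches source:", checksum(duplicate) == checksum(original))
--- End code ---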

Mr_Rose:
Hm. What if the copy program is built into the blank chassis and uses much of the core hardware to do its job: networking the two AI cores, initiating the copy process, and writing "live" data to the new brain as it's generated by the mind it's "copying", via something equivalent to symlinks?

That is, instead of a read/write process, the program simply maps the current core onto the new core, and as individual segments are written by the active process they are written to the equivalent parts of the new core instead of the old one.
In the background, the copy program is writing itself into the old core and removing itself from the new one.

This would mean that you can't easily take one robot and put it into two spare chassis, it would give the individual AI full continuity of consciousness during the transfer, and it would make zombie robots horrifyingly possible:
Like, what if there was a rogue copy program out there that cut off halfway through the process, leaving itself embedded in two grossly corrupted minds?
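
In toy code, the write-redirect idea might look roughly like this (nothing here is meant to be how QC robots actually work - 'old_core' and 'new_core' are just names I made up for the sketch):

--- Code: ---
class MigratingMemory:
    """Toy write-redirect migration: reads fall back to the old core until a
    segment has landed in the new one; every new write goes to the new core."""

    def __init__(self, old_core):
        self.old_core = old_core       # the original brain's storage
        self.new_core = {}             # the blank chassis being filled
        self.cursor = 0                # background copier's position

    def read(self, addr):
        # Prefer whatever has already landed in the new core.
        return self.new_core.get(addr, self.old_core[addr])

    def write(self, addr, value):
        # The live mind only ever writes to the new core from now on.
        self.new_core[addr] = value

    def background_step(self):
        # Meanwhile, quietly move not-yet-touched segments across.
        if self.cursor < len(self.old_core):
            if self.cursor not in self.new_core:   # don't clobber fresher data
                self.new_core[self.cursor] = self.old_core[self.cursor]
            self.cursor += 1

    def done(self):
        return self.cursor >= len(self.old_core)


old = {addr: f"memory-{addr}" for addr in range(8)}   # the old brain
mem = MigratingMemory(old)
mem.write(3, "new thought")            # live activity during the move
while not mem.done():
    mem.background_step()
print(mem.read(3), "/", mem.read(5))   # -> new thought / memory-5
--- End code ---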

Morituri:

--- Quote from: BenRG on 24 Dec 2016, 07:06 ---Just because the AI must by necessity have a memory space large enough to run the entire process does not mean that any copy process is powerful enough to take a 'snapshot' of the entire memory. Much would depend on the data processing rate of the copying device but, in practice, it would copy a series of parts one after the other, allowing for differences to creep in as parts of the algorithm outside of the copy bracket may and, in fact, probably are changing.

--- End quote ---

Even in a system that can't shut down to copy everything using a static copy or "stop and copy" algorithm, you can always use a dynamic copy or "snapshot" copy algorithm.

With large dynamic systems, you can't use static copy algorithms where you assume everything is standing still while you copy it. But that doesn't mean you can't make a copy of anything.  It just means you have to use a dynamic copy algorithm, and what you'll get is a 'snapshot' of the system as it existed when the copy process started.

What would happen, in this case, would be that by the time the copy is completed, the 'mind' underlying it might have experienced a few seconds to a few minutes that the copy has no record of.
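
A minimal sketch of such a 'state as of copy-start' snapshot - the same trick databases and VM migration use - with all the names invented for the illustration:

--- Code: ---
class SnapshotCopy:
    """Point-in-time snapshot: any cell overwritten after the snapshot starts
    has its *old* value preserved first, so the finished copy reflects the
    state exactly as it was when the copy began."""

    def __init__(self, live):
        self.live = live        # the running system's memory
        self.frozen = {}        # pre-images of cells changed mid-copy

    def write(self, addr, value):
        # Every write during the copy preserves the original value first.
        if addr not in self.frozen:
            self.frozen[addr] = self.live[addr]
        self.live[addr] = value

    def copy(self):
        # Snapshot = preserved pre-image where one exists, else current value.
        return [self.frozen.get(addr, self.live[addr])
                for addr in range(len(self.live))]


live = list(range(10))          # the running 'mind'
snap = SnapshotCopy(live)
snap.write(4, 999)              # the mind keeps thinking mid-copy
print(snap.copy())              # index 4 still shows 4: the state at t=0
print(live)                     # the live mind has already moved on
--- End code ---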
