
Robots & Immortality


Bob_Mozark:
>There are issues that spring to mind, such as limits on accessible memory, and thus a need to prune memory over time, and thus reaching a point where enough memory is offline that the resulting personality is no longer the same person who started.

Moore's Law would cover that.  By the time a given A.I.'s memory was overflowing, cheaper and larger memory drives would be available.  That assumes that they or their owner is gainfully employed, of course.
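Back-of-the-envelope, assuming affordable capacity doubles roughly every two years (the doubling period and the example numbers here are assumptions, not measured figures):

--- Code: ---
# Rough sketch: years of Moore's-law-style growth before a fixed budget
# covers a memory footprint that has outgrown today's hardware.
import math

def years_until_affordable(required_tb, affordable_tb_today, doubling_years=2.0):
    """Years until affordable capacity grows from today's level to the requirement."""
    if required_tb <= affordable_tb_today:
        return 0.0
    doublings = math.log2(required_tb / affordable_tb_today)
    return doublings * doubling_years

# e.g. an A.I. whose memory footprint is 8x over budget today:
print(years_until_affordable(required_tb=80, affordable_tb_today=10))  # -> 6.0
--- End code ---

So an A.I. running 8x over its memory budget would only need to wait about six years at that rate.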

Morituri:
We're actually getting pretty close now.  I do mad-science in AI at home on my own time, and I think I finally cracked neural plasticity, localized pain, and strategic choice among temporally exclusive courses of action to satisfy base needs one at a time.  Of course, this mostly raises the level of "consciousness" in my AI from about clam level to about goldfish level, so we're not talking about "people" yet, but there is real progress being made, by me and others.

By my calculations, which are admittedly based entirely on mad science, human-level consciousness will require between eighty and eight hundred terabytes of live data.  That's surprisingly close -- 20 to 200 modern hard drives.  There's more storage than that in one of the rack servers at work where I do my "sane science."

But there's also a big difference between storage and live data.  The hard drives are not enough; you've got to have it in RAM with a whole lot of CPU power keeping it working.  So we're talking about several dozen server racks. 
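To make that arithmetic concrete -- the drive, RAM, and rack sizes below are assumed round numbers, not specs:

--- Code: ---
# Worked version of the estimate above. Drive capacity, RAM per server,
# and servers per rack are assumed round numbers.
DRIVE_TB = 4           # "modern hard drive", assumed
RAM_PER_SERVER_TB = 1  # high-end server RAM, assumed
SERVERS_PER_RACK = 20  # assumed

for live_data_tb in (80, 800):
    drives = live_data_tb / DRIVE_TB
    servers = live_data_tb / RAM_PER_SERVER_TB
    racks = servers / SERVERS_PER_RACK
    print(f"{live_data_tb} TB: {drives:.0f} drives of storage, "
          f"but {servers:.0f} servers (~{racks:.0f} racks) to hold it live in RAM")
# 80 TB:  20 drives,  80 servers (~4 racks)
# 800 TB: 200 drives, 800 servers (~40 racks)
--- End code ---

At the 800 TB end that works out to roughly forty racks, which is where "several dozen" comes from.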

And then it gets worse, unless I develop something better than what I have now.  I've got a neural-plasticity system that does self-organizing neural network wiring, but it isn't fast.  It needs to process LOTS of input over LOTS of time (eighty terabytes would take YEARS of training; eight hundred would take CENTURIES) on that much RAM and that many processors, along with massive amounts of raw data which I just don't have, before it starts to organize into the kind of higher structures a mammal would use.
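For the curious, the core of a self-organizing wiring rule can be as simple as a Hebbian-style update.  This is a generic textbook sketch, not the actual system described above:

--- Code: ---
# Minimal Hebbian-plasticity sketch: connections between co-active units
# are strengthened while the rest slowly decay. A generic illustration of
# "self-organizing wiring", not the system from the post.
import numpy as np

rng = np.random.default_rng(0)
n_units = 64
weights = rng.normal(scale=0.01, size=(n_units, n_units))

def hebbian_step(weights, activity, lr=0.01, decay=1e-3):
    # strengthen weights between co-active units; let everything else decay
    return (1.0 - decay) * weights + lr * np.outer(activity, activity)

for _ in range(1000):  # one step per input pattern -- the slow part scales with data
    pattern = (rng.random(n_units) < 0.1).astype(float)  # sparse random input
    weights = hebbian_step(weights, pattern)
--- End code ---

The point is that every bit of structure comes from grinding through input, one pattern at a time, which is why the training time blows up with the size of the network.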

Of course, by the time that much RAM and that much CPU comes within my mad-science budget, Moore's law will have cut those years down to a couple of weeks each?  Maybe?  Until then I'm working on the software....
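A quick sanity check on that hope, again assuming a two-year doubling period for affordable compute:

--- Code: ---
# How many years of Moore's-law waiting turn a multi-year training run
# into a couple of weeks? Doubling period is an assumption.
import math

def wait_until_fast_enough(train_years, target_weeks, doubling_years=2.0):
    speedup = (train_years * 52.0) / target_weeks  # required speedup factor
    return math.log2(speedup) * doubling_years     # years of waiting

print(f"{wait_until_fast_enough(5, 2):.0f} years")  # ~14
--- End code ---

Compressing a five-year run into two weeks takes a ~130x speedup, about seven doublings, so roughly fourteen years of waiting at that rate -- which is exactly why working on the software in the meantime makes sense.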

Morituri:
Incidentally, I really hate the fact that when I'm doing what's essentially epistemological philosophy - literally trying to create intelligent, self-willed beings - that's "mad science" and gets no support from anywhere; in fact, mostly it gets people scared or freaked out.

But "sane" systems?  Systems that many thousands of people are working on, sweating over, and spending their whole lives trying to tune one percent better -- are NOTHING to do with consciousness.  They're just function optimizers that happen to make money. 

Big. Fucking.  Deal.  Those are just ways to do the same business faster and more efficiently; they don't fundamentally change anything.  If you want to fundamentally change anything, they call you crazy, and think you're "wasting" effort that could have been spent on "MORE OF THE SAME ONLY FASTER, BIGGER, BETTER, FASTER, CHEAPER, WHATEVER JUST DO IT BEFORE THAT OTHER GUY!!!" 

Which, you know, I kind of like the competition, and sometimes the ideas from that plebeian kind of thing are applicable to things that will actually change the world fundamentally.  But their vision is so small....


katsmeat:

--- Quote from: ReindeerFlotilla on 21 Oct 2015, 16:42 ---AIs are probably not immortal if you take immortality in the same sense as omnipotence, because that kind of immortality would violate entropy laws. Over a long enough timeline, an AI will find a terminal end.

--- End quote ---

Somebody said that immortality without omnipotence would suck, as it's inevitable that sooner or later you would find yourself buried under a few million tons of rock, with no option other than to wait for it to erode.

snubnose:

--- Quote from: katsmeat on 05 Nov 2015, 02:59 ---Somebody said that immortality without omnipotence would suck, as it's inevitable that sooner or later you would find yourself buried under a few million tons of rock, with no option other than to wait for it to erode.

--- End quote ---
Actually, without omnipotence those million tons of rock would smash you to pieces.
