Comic Discussion > QUESTIONABLE CONTENT

WCDT Strips 3571 to 3575 (18th to 22nd September 2017)


zisraelsen:
That proposal doesn't seem that unbelievable to me. There could easily be protocols in the software to parse the movement of the foot based on what the AI's consciousness is trying to do, much like you don't need to consciously decide to move every individual muscle involved when you decide to walk somewhere. That would allow an AI to walk without intimate knowledge of their mechanics.

OldGoat:

--- Quote from: zisraelsen on 19 Sep 2017, 11:57 ---That proposal doesn't seem that unbelievable to me. There could easily be protocols in the software to parse the movement of the foot based on what the AI's consciousness is trying to do, much like you don't need to consciously decide to move every individual muscle involved when you decide to walk somewhere. That would allow an AI to walk without intimate knowledge of their mechanics.

--- End quote ---
Yup.  My friend and I both walk just fine, but I just know there's bones and stuff down there.  He, OTOH, is licensed by the state to take people's feet apart and put them back together.   For money, no less!  (He's a DPM and licensed podiatric surgeon.)  I'll ask next time I see him, but I very much doubt he thinks about what each muscle and joint is doing unless they start yelling at him about something going wrong.

Ever been awkward in a new pair of shoes?  That happens because your brain and sense of proprioception haven't yet recalibrated where the ends of your toes are, and scraping the curb the first time you park an unfamiliar car is an extension of the same thing.  Locomoting AIs need to have these functions, too.

JimC:

--- Quote from: swapna on 19 Sep 2017, 11:05 ---So, the idea you're proposing is that AIs can use those protocols/their bodies without consciously accessing them, like organics, or even without being _able_ to access them.
--- End quote ---
Think APIs and object-oriented programming.
I think the chassis and its components would have a lot of local processing built in, so the brain would issue a command like "Put fingertip at co-ordinates x, y, z", but it would be the local control on the limb that selects which muscles to contract and release to obtain the desired result. And even if the hardware can't actually do it, the command could still be issued. I can imagine a sixth finger on my hand, and I can think "wiggle that finger", but nothing happens, whereas if I think "wiggle 4th finger" then it moves.
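
Just to make the layering concrete, here's a rough Python sketch of what I mean - every class, method, and co-ordinate in it is invented on the spot, not anything from the comic:

--- Code: ---
# Rough sketch of the layered-control idea: the "brain" issues high-level
# intents, and local processing on the limb decides which actuators to drive.
# All names are invented for illustration.

class Actuator:
    """Stand-in for one motor/'muscle' on the chassis."""
    def move_to(self, x, y, z):
        print(f"actuator moving toward ({x}, {y}, {z})")


class LimbController:
    """Local processing on the limb: turns intents into actuator commands."""
    def __init__(self, actuators):
        self.actuators = actuators  # e.g. {"finger_4": Actuator(), ...}

    def put_fingertip_at(self, finger, x, y, z):
        actuator = self.actuators.get(finger)
        if actuator is None:
            # The "imaginary sixth finger" case: the command is accepted,
            # but nothing happens because the hardware isn't there.
            return
        # Inverse kinematics, torque planning, etc. would live down here,
        # invisible to whoever issued the command.
        actuator.move_to(x, y, z)


class Brain:
    """The conscious layer: thinks in goals, not in individual muscles."""
    def __init__(self, limb):
        self.limb = limb

    def wiggle(self, finger):
        self.limb.put_fingertip_at(finger, 0.0, 0.0, 1.0)


hand = LimbController({"finger_4": Actuator()})
brain = Brain(hand)
brain.wiggle("finger_4")  # the local controller drives the actuator
brain.wiggle("finger_6")  # no such actuator, so the command quietly does nothing
--- End code ---

The point is that the brain only ever talks to the top layer, the same way a program calls an API without caring how the library underneath is implemented.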

swapna:

--- Quote from: JimC on 19 Sep 2017, 13:06 ---
--- Quote from: swapna on 19 Sep 2017, 11:05 ---So, the idea you're proposing is that AIs can use those protocols/their bodies without consciously accessing them, like organics, or even without being _able_ to access them.
--- End quote ---
Think APIs and object-oriented programming.
I think the chassis and its components would have a lot of local processing built in, so the brain would issue a command like "Put fingertip at co-ordinates x, y, z", but it would be the local control on the limb that selects which muscles to contract and release to obtain the desired result. And even if the hardware can't actually do it, the command could still be issued. I can imagine a sixth finger on my hand, and I can think "wiggle that finger", but nothing happens, whereas if I think "wiggle 4th finger" then it moves.

--- End quote ---


The idea of masking the functions under layers makes sense to me; what doesn't is that the AI wouldn't have access to them even when they needed to. Wouldn't they need a very detailed interface with their chassis? Run diagnostics? Read error logs? How would you know if an update worked? Is "it feels weird" really the most accurate diagnostic you can get from a sophisticated mechanical device with highly sensitive and diverse sensors?
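
To make that concrete (and this is entirely my own invention, nothing established in the comic), I'd expect the chassis to expose something closer to this than to a vague feeling:

--- Code: ---
# Invented sketch of the kind of detailed chassis interface I'm imagining:
# diagnostics, error logs, and a way to verify that an update actually took.

from dataclasses import dataclass, field


@dataclass
class Chassis:
    firmware_version: str = "4.2.1"
    error_log: list = field(default_factory=list)

    def run_diagnostics(self):
        # Exercise each subsystem and report per-component status
        # instead of just "it feels weird".
        return {
            "left_foot_servo": "ok",
            "proprioception_calibration": "drifted",
        }

    def read_error_log(self):
        return list(self.error_log)

    def apply_update(self, new_version):
        old = self.firmware_version
        self.firmware_version = new_version
        # Verifying the update is then the boring part: check the version
        # string and re-run diagnostics.
        return {"old": old, "new": self.firmware_version}


chassis = Chassis(error_log=["2017-09-18 03:12 ankle torque limit exceeded"])
print(chassis.run_diagnostics())
print(chassis.read_error_log())
print(chassis.apply_update("4.2.2"))
--- End code ---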

As for humans not having to think about our foot movements: yes, I know that's how it works for us. I am not confused about how humans know how to walk, thank you, but I was thinking about how robot bodies, especially exchangeable ones, might work within the logic of the comic. If "well, it's just like it is for humans! Duh!" is the answer, that's just very boring.

Pilchard123:
Maybe "it feels weird" is all they can do on their own, but they need - for want of a better term - a debugger for more detailed analysis. A bit like how we could say "it feels weird" but may need further testing with "human debuggers" (blood tests; EEG; X-ray; lung function test, though what you might need all four of those for at once I dread to think) for a better result.

AIs might have simple tests built in. Current computers have POST, and to quote the late, great Terry Pratchett: "Most people, on waking up, accelerate through a quick panicky pre-consciousness check-up: who am I, where am I, who is he/she, good god, why am I cuddling a policeman's helmet, what happened last night?" But any further diagnostic tools may not be commonly needed, so they aren't part of a common chassis: they'd be extra expense and extra things to go wrong.
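
As a throwaway sketch of that split (all of this invented, none of it canon): the cheap checks live in the chassis and run at wake-up, and anything deeper needs external kit attached.

--- Code: ---
# Invented sketch of the POST-vs-external-debugger split.

QUICK_CHECKS = {
    "who_am_i": lambda state: state.get("identity") is not None,
    "where_am_i": lambda state: state.get("location") is not None,
    "limbs_respond": lambda state: state.get("limb_ack", False),
}


def power_on_self_test(state):
    """Cheap built-in checks run at wake-up; only pass/fail per item."""
    return {name: check(state) for name, check in QUICK_CHECKS.items()}


def detailed_diagnosis(state):
    """Anything deeper would need an external rig attached - the chassis
    itself doesn't carry the tooling, the same way most PCs don't ship
    with a logic analyser."""
    raise NotImplementedError("attach external diagnostic rig")


report = power_on_self_test(
    {"identity": "robot", "location": "bedroom", "limb_ack": True}
)
print(report)  # {'who_am_i': True, 'where_am_i': True, 'limbs_respond': True}
--- End code ---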


Also, I'm not sure the Thumblord Incident is a reliable indicator of how AIs work. It was a funny set of strips, and may have been true at the time, but Jeph has made AIs and their functions a lot more realistic since then.
