Something bothering me a lot
Thrudd:
I'll throw in a little QC term here ... Pareto Analysis.
Pareto Analysis is a statistical technique in decision-making used to select the small number of tasks that produce a significant overall effect.
It uses the Pareto Principle (also known as the 80/20 rule): the idea that by doing 20% of the work you can generate 80% of the benefit of doing the entire job.
AI system optimizations are based on the principles of this tool.
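To make that concrete, here is a minimal sketch of the selection step; the defect categories and counts are made up purely for illustration.
--- Code: ---
# Minimal sketch of a Pareto analysis: rank causes by impact and keep the
# few that account for roughly 80% of the total.
# The category names and counts below are invented for this example.
defect_counts = {
    "solder bridging": 412,
    "missing component": 198,
    "wrong polarity": 77,
    "scratched housing": 31,
    "label misprint": 12,
}

def pareto_select(counts, cutoff=0.8):
    """Return the causes that together account for `cutoff` of the total impact."""
    total = sum(counts.values())
    selected, running = [], 0
    for cause, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
        selected.append((cause, n))
        running += n
        if running / total >= cutoff:
            break
    return selected

for cause, n in pareto_select(defect_counts):
    print(cause, n)
# The top two causes alone cover about 83% of all defects -- the "vital few".
--- End code ---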
The problem with such a tool is that you are limited to the tasks/parameters you listed in the first place.
This is where creativity and thinking outside the box come into play, using something called a fishbone diagram.
Often referred to as a cause-and-effect diagram, or Ishikawa diagram, it is a simple root-cause-analysis tool used for brainstorming the issues and causes behind a particular problem or outcome.
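Structurally it is just a tree of guesses, so it can be sketched as a plain data structure; the problem statement, branch categories, and causes below are invented examples, not anything canonical.
--- Code: ---
# A fishbone (Ishikawa) diagram as data: one problem at the head, category
# branches along the spine, and brainstormed candidate causes on each branch.
# Everything here is an invented example.
fishbone = {
    "problem": "Chatbot gives confidently wrong answers",
    "branches": {
        "Data": ["training set missing domain facts", "stale snapshots"],
        "Method": ["no retrieval step", "decoding favours fluent guesses"],
        "Measurement": ["accuracy scored only on an easy benchmark"],
        "People": ["labellers are not domain experts"],
    },
}

for category, causes in fishbone["branches"].items():
    print(category)
    for cause in causes:
        print("  -", cause)
--- End code ---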
Unfortunately, AI systems are just not capable of such analysis, and a goodly number of humans are just as limited.
ckridge:
I know fuck-all about programming, so my intuitions aren't likely to be of much use here, but, just for the sake of adding ideas to the mix, I will contribute this.
It doesn't seem to me that computers' problem-solving or reasoning abilities are likely to be inferior to humans'. We have been analyzing how we reason since Aristotle invented logic, and we are pretty good at turning our various techniques into sets of rules. This quote from the first article I posted seems to me to summarize computers' primary problem.
--- Quote ---“What machines are picking up on are not facts about the world,” Batra says. “They’re facts about the dataset.”
--- End quote ---
What is astute here is the distinction between dataset and world.
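To make that distinction concrete, here is a toy sketch (all features and numbers invented) of a learner picking up a fact about its dataset that is not a fact about the world:
--- Code: ---
import random

# In this made-up "world", feature 0 genuinely predicts the label about 90%
# of the time. In the collected dataset, feature 1 happens to match the label
# perfectly -- a sampling artifact -- so a naive learner prefers it.
random.seed(0)

def make_example(artifact_strength):
    label = random.randint(0, 1)
    real = label if random.random() < 0.9 else 1 - label         # causal signal
    artifact = label if random.random() < artifact_strength else 1 - label
    return (real, artifact), label

train = [make_example(1.0) for _ in range(2000)]   # artifact holds in the dataset
world = [make_example(0.5) for _ in range(2000)]   # artifact is noise in the world

def accuracy(data, feature):
    return sum(x[feature] == y for x, y in data) / len(data)

# "Training": keep whichever single feature scores best on the dataset.
chosen = max(range(2), key=lambda f: accuracy(train, f))

print("feature chosen from the dataset:", chosen)                           # 1, the artifact
print("its accuracy out in the world:", round(accuracy(world, chosen), 2))  # ~0.5
print("the ignored causal feature:", round(accuracy(world, 0), 2))          # ~0.9
--- End code ---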
It seems to me that computer development and brain evolution have proceeded along opposite courses. Computer development has been a matter of constructing data processors, feeding datasets into them, and then, maybe, hooking them up to sensors. Brain evolution appears to have started with organisms that were little more than sensors hooked to switches, improved those sensors until the organisms were almost perfectly connected to their environments, and only then begun to add data-processing ability. The computer's primary problem is often dataset deficiency. The organism's primary problem is always dataset excess, that is, so much information coming in that much or most of it has to be edited out before data processing is possible.
This is, I surmise, the difference between having a dataset and having a world. Worlds are the unprocessably large body of information that organisms' senses deliver. Datasets are worlds edited down to processable size. Humans are inferior to computers in many respects, but they do have the advantage that if their dataset lacks essential information, they can change the editing rules and extract a new dataset from the world.
In QC, the robots have sensoria comparable in size to those of humans, though different in nature and scope. They live in worlds, though different worlds. Living with them would be like living with a dog that could tell you what it is like to have lingering smells so present that they do the work of memory, or a cat that could tell you what it is like to be sharply aware of every small movement that might be prey.
Is it cold in here?:
Then there are the big AIs Momo told Emily about, which have millions of high-bandwidth inputs, utterly beyond the imagination of us, the carbon-based.