Shift happens

Kacey (Sr. Grandmaster, MTS Alumni — Denver, CO)
I'm not sure where some of the statistics in this clip come from, but it definitely provides food for thought.

Comments?
 
Interesting numbers, Kacey. But the stuff about cheap computers exceeding the computational capacity of the human brain, and then of the human race, is deceptive. The problem is that these machines are still going to be largely serial in architecture, because that's really the kind of algorithm design we understand; parallel algorithms are still relatively primitive. The human brain, by contrast, computes in massively parallel fashion, in ways we don't understand, and that means it is capable of cognitive tasks we still have only the most basic grasp of, even in terms of the basic sensoria: vision, kinesthetic awareness (balance, body and body-part orientation, etc.), and language (which is best viewed as an extremely complex sense, a preprogrammed, hard-wired data-processing capability that gets triggered during a certain developmental window). There are no models at all for higher-order thinking of the kind that leads to the discovery of the axioms of quantum theory, or the creation of Bach's Mass in B minor.

There is strong evidence that what counts is not processing based on neuron-to-neuron or neuron-group-to-neuron-group interaction, but on synapse(-group)-to-synapse(-group) interaction. That means the number of connections entering into the multilevel 'stacked' parallel computational architectures the brain runs is not, say, (10^11)!, which is big enough, but some vastly greater number proportional to the factorial of the number of synaptic connections in those 10^11 brain cells. (I know, I know, this is a gross oversimplification, because you actually have to work out all the combinations of connections possible among all subsets of those synapses; this is just to get a feel for the orders of magnitude. The true number of connections is clearly much larger than the factorial of the number of synapses, but that's big enough, eh? :erg:)
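Just to put some rough numbers on that factorial argument: here's a quick sketch that estimates how many *digits* those factorials have, using the log-gamma identity lgamma(n+1) = ln(n!). The neuron and synapse counts are the usual ballpark figures (10^11 neurons, and an assumed ~10^3 synapses per neuron giving ~10^14 synapses), not measurements.

```python
import math

# Ballpark counts (assumptions, not measurements):
NEURONS = 10**11    # ~10^11 neurons in a human brain
SYNAPSES = 10**14   # ~10^14 synapses, assuming ~1000 per neuron

def log10_factorial(n):
    """Number of decimal digits of n!, roughly: log10(n!) = lgamma(n+1) / ln(10)."""
    return math.lgamma(n + 1) / math.log(10)

# (10^11)! already has on the order of 10^12 digits...
print(f"log10((10^11)!) ~ {log10_factorial(NEURONS):.3e}")
# ...but (10^14)! has on the order of 10^15 digits -- about a
# thousand times more digits, i.e. unimaginably larger again.
print(f"log10((10^14)!) ~ {log10_factorial(SYNAPSES):.3e}")
```

So even the "small" number, (10^11)!, dwarfs anything a serial machine's clock speed buys you, and counting synapses instead of neurons blows it up by orders of magnitude on top of that.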

I'm not saying that we won't someday understand the nature of these massive stacked parallel computations, but no one should be under the impression that those $1000 laptops a half century down the line will be able to duplicate what the brain of the ordinary person is capable of, let alone the thinking of geniuses...
 
Given how bloated Microsoft software is....that's just telling us how powerful our machines will need to be just to boot a future copy of Windows. :lol:
 
Right—MS Word is now so big you need to ship it using 18-wheelers, and it's still not a tenth as good as LaTeX! :D
 
The complexity of our vehicles is ever increasing. The issue of serial versus parallel is very real. Yet more and more of the subsystems are making more and more decisions without having to go through a single master controller. That kind of distributed, parallel system could be a format for such a projected computer.

Yet I expect there will be limitations in some physical area, while at the same time the human brain will keep growing, learning, and adapting even more.
 