Technological Singularity

sgtmac_46

In recent weeks I've become fascinated by the concept of the Technological Singularity. Simply put: "A technological singularity (also referred to as just the Singularity) is a predicted future event when technological progress and societal change accelerate due to the advent of superhuman intelligence, changing our environment beyond the ability of pre-Singularity humans to comprehend or reliably predict."

http://en.wikipedia.org/wiki/Technological_singularity
http://www-rohan.sdsu.edu/faculty/vinge/misc/singularity.html

The questions that come to mind are:

What is your opinion on how long, if ever, it will be before we reach the Technological Singularity?

What will the effect be on the average human's life in a Post-Technological Singularity world?

What are the ramifications for the human race and the world as we know it?

In fact, many questions come to mind. I'd like to hear any thoughts you folks have.

I believe we've already seen the first edges of what the Technological Singularity will do to the human race in the vast explosion of information-sharing technology over the last 30 years. It's hard to remember life before the internet. What's more, I see many of the problems we have with modernity as symptoms of this coming event. What is globalization but a consequence of increasing technological sophistication?
 
I struggle with the idea of technology, which is entirely created by human intelligence, culture, and society, suddenly one day going beyond our ability to control it. At best, it seems that the "human intelligence being left behind by technology" (paraphrasing) really only refers to a certain lifestyle.

Take, for example, cars and airplanes. In the late 1800s there were very few automobiles and no airplanes; now, a mere hundred years later, our entire economic infrastructure is built around the existence of automobiles and air travel. Our society has become dependent on this technology--seemingly the standard for this Technological Singularity taking place--yet we still remain in control of it. The internet is no different.

Personally, I think that "going beyond human capacity" (again paraphrasing) really only means requiring a new paradigm to adjust, and that's something we've been doing for new technologies for quite a while now.
 
RandomPhantom700 said:
I struggle with the idea of technology, which is entirely created by human intelligence, culture, and society, suddenly one day going beyond our ability to control it. At best, it seems that the "human intelligence being left behind by technology" (paraphrasing) really only refers to a certain lifestyle.

Take, for example, cars and airplanes. In the late 1800s there were very few automobiles and no airplanes; now, a mere hundred years later, our entire economic infrastructure is built around the existence of automobiles and air travel. Our society has become dependent on this technology--seemingly the standard for this Technological Singularity taking place--yet we still remain in control of it. The internet is no different.

Personally, I think that "going beyond human capacity" (again paraphrasing) really only means requiring a new paradigm to adjust, and that's something we've been doing for new technologies for quite a while now.
I understand the perspective that you are referring to, but the issue goes far deeper.

What we mean by "Technological Singularity" is a point in time at which humans, or most humans, become irrelevant. That could come about through the creation of ultrahuman artificial intelligence that makes the human mind unnecessary, or through intelligence enhancement and genetic modification that allow an elite to far surpass the average human, making the average human obsolete.

Again, the signs that these things are fast moving OUT of the realm of fantasy and into the realm of possibility are all around us.
 
Hmm, sounds like the basic premises behind movies like Terminator, Matrix, or Stealth (the one with the computer-controlled fighter jet going rogue). Sadly, I never saw that last movie.

I take comfort in the thought that these remain science fiction fantasies. We really can't create artificial intelligence until we find some way of creating an artificial lifeform, which, as I understand it, we're nowhere near doing. Artificial systems require humans to instill them with values, goals, and drives, leaving them very much within our control. In order for them to surpass us far enough to drive us into obsolescence, artificial systems would need the programming to do so. And as far as the enhancement concern goes, I don't think we have the capacity at the moment to go around creating any X-Men. :borg: The internet seems to be the only real concern, and that's just a powerful database.

It's an intriguing idea, and as our technology becomes more precise and more powerful there will be definite reason to make sure there are safeguards to ensure human control over the technology we create. However, I don't think we've set ourselves on any path to oblivion with our innovations.

That being said, if anyone hears of any Skynet being created, make sure to give me the bomb-shelter instructions.
 
Sci-fi speculation is all good and fun. Plus, it makes for a great movie plot! ;)

Of course, the reality is very different. The most advanced computers we have today don't even match the complexity of a eukaryotic cell, let alone the human brain. A single human brain has more synaptic pathways than there are estimated stars in this galaxy. The human brain also possesses a level of organizational and phenotypic plasticity that is, quite literally, unmatched by anything else in nature (that we know of, anyway --- I mean, who knows what's out there, Mulder?).
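For a rough sense of scale, here's a quick back-of-envelope comparison in Python using commonly cited ballpark figures (roughly 10^14 to 10^15 synapses in an adult human brain versus roughly 10^11 to 4x10^11 stars in the Milky Way); these are estimates of my own choosing, not precise numbers:

Code:
# Rough back-of-envelope comparison using commonly cited ballpark estimates.
synapses_low, synapses_high = 1e14, 1e15   # synapses in an adult human brain
stars_low, stars_high = 1e11, 4e11         # stars in the Milky Way

# Even the most conservative pairing puts synapses ahead by a few hundred to one.
print(f"conservative ratio: {synapses_low / stars_high:,.0f} to 1")   # ~250 to 1
print(f"generous ratio:     {synapses_high / stars_low:,.0f} to 1")   # ~10,000 to 1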

Artificial Intelligence isn't even vaguely close to imagining something like that.

I should also point out that many of these speculations are based around rather silly and archaic ideas about "intelligence" to begin with. The idea being that there is this single, monolithic variable we call "intelligence" (Howard Gardner would argue otherwise --- as would most cognitive scientists) and that it is somehow "flat".

I would be interested in hearing what this "superintelligence" is supposed to be like. Human beings are already capable of relativistic and dialectical post-formal reasoning. Some authors have suggested our capability for genuinely trans-rational stages of cognitive development, as well.

So, then, what's this "superintelligence" supposed to be?? So far, all I've heard are fantasies that suppose logico-mathematical formal operations are the end-all, be-all of the universe.

Puh-leeze.
 
heretic888 said:
Sci-fi speculation is all good and fun. Plus, it makes for a great movie plot! ;)

Of course, the reality is very different. The most advanced computers we have today don't even match the complexity of a eukaryotic cell, let alone the human brain. A single human brain has more synaptic pathways than there are estimated stars in this galaxy. The human brain also possesses a level of organizational and phenotypic plasticity that is, quite literally, unmatched by anything else in nature (that we know of, anyway --- I mean, who knows what's out there, Mulder?).

Artificial Intelligence isn't even vaguely close to imagining something like that.

I should also point out that many of these speculations are based around rather silly and archaic ideas about "intelligence" to begin with. The idea being that there is this single, monolithic variable we call "intelligence" (Howard Gardner would argue otherwise --- as would most cognitive scientists) and that it is somehow "flat".

I would be interested in hearing what this "superintelligence" is supposed to be like. Human beings are already capable of relativistic and dialectical post-formal reasoning. Some authors have suggested our capability for genuinely trans-rational stages of cognitive development, as well.

So, then, what's this "superintelligence" supposed to be?? So far, all I've heard are fantasies that suppose logico-mathematical formal operations are the end-all, be-all of the universe.

Puh-leeze.
Sorry if I didn't make myself clear. This is a discussion of futurism in a theoretical near-future world where such things as gene manipulation, technological enhancement, and high technology create a disparity between the average human and those elites who are able to purchase those enhancements, one so great that it is difficult for the average person to compete. Couple this with automation and technology that make human workforces completely unnecessary, thereby ultimately making most humans unnecessary.

You're right, work forces becoming increasingly unnecessary because of automation...Sounds like science fiction...Silly me.
 
sgtmac_46 said:
Sorry if I didn't make myself clear. This is a discussion of futurism in a theoretical near-future world where such things as gene manipulation, technological enhancement, and high technology create a disparity between the average human and those elites who are able to purchase those enhancements, one so great that it is difficult for the average person to compete. Couple this with automation and technology that make human workforces completely unnecessary, thereby ultimately making most humans unnecessary.

You're right, work forces becoming increasingly unnecessary because of automation...Sounds like science fiction...Silly me.

There's a world of difference between a manual workforce becoming unnecessary because of increasing technology and the claim that there will be some sort of technologically-induced "superintelligence" leaving us regular ol' humans in the dust. As I said before, such speculation is largely based on outmoded ideas of what "intelligence" is in the first place, and it ignores the incredible, irreducible complexity of the human brain (which hasn't seen any significant structural changes for the better part of 50,000 years).

As for "elites" only having access to this new technology, it'd certainly be a novelty in human history. The exact same arguments could be leveled at computers some 50 years ago, but guess what I'm sitting in front of right now?? It always happens whenever there's this new, breakthrough technology --- only the "elites" have it at first, but eventually it becomes commonplace. My guess is a similar thing will happen with this new technology (assuming the government doesn't ban some of it).

Laterz.
 
heretic888 said:
There's a world of difference between a manual workforce becoming unnecessary because of increasing technology and the claim that there will be some sort of technologically-induced "superintelligence" leaving us regular ol' humans in the dust. As I said before, such speculation is largely based on outmoded ideas of what "intelligence" is in the first place, and it ignores the incredible, irreducible complexity of the human brain (which hasn't seen any significant structural changes for the better part of 50,000 years).

As for "elites" only having access to this new technology, it'd certainly be a novelty in human history. The exact same arguments could be leveled at computers some 50 years ago, but guess what I'm sitting in front of right now?? It always happens whenever there's this new, breakthrough technology --- only the "elites" have it at first, but eventually it becomes commonplace. My guess is a similar thing will happen with this new technology (assuming the government doesn't ban some of it).

Laterz.
Of course, your assessment of the "impossibility" of the technology is the same one that was leveled at the vast majority of the scientific advancements of the 20th century.

1902 "It's clear that man cannot gain powered flight, that's been proven and is a silly concept"

1950 "Man space flight is purely the realm of science fiction, and is never likely to happen."

1990 "The idea that multi-celled organisms can be cloned is clearly preposterous."

Prediction 2005 "It has been proven that Strong Artificial Intelligence is an impossibility."

Moore's 1965 observation that the complexity of an integrated circuit doubles roughly every 24 months has held true to the present day. What's more, the introduction of nanotechnology will likely see this trend continue or even accelerate.
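As a purely illustrative sketch of what a strict 24-month doubling implies (my own arithmetic, not a citation of actual chip data), starting from a hypothetical baseline of about 2,300 transistors, roughly the scale of the earliest microprocessors:

Code:
# Illustrative projection under a strict 24-month doubling assumption.
def projected_transistors(baseline: float, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count forward assuming one doubling per period."""
    return baseline * 2 ** (years / doubling_period)

baseline_1971 = 2_300   # hypothetical starting point, circa the first microprocessors
for year in (1981, 1991, 2001, 2005):
    count = projected_transistors(baseline_1971, year - 1971)
    print(year, f"~{count:,.0f} transistors")
# 1981 -> ~73,600; 1991 -> ~2.4 million; 2001 -> ~75 million; 2005 -> ~300 million

Whatever the exact figures, the point is that a fixed doubling period compounds into orders-of-magnitude growth over a few decades.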

With the ever-increasing complexity of machines, it is likely that in the near future we will, at the very least, reach a state in which humans are no longer involved in manual labor on any real scale. The result will be a situation where only a relatively small labor force of well-trained technicians is required.

We are seeing this trend in stores, where self-service checkouts are beginning to replace human service workers.

The creation of Strong AI isn't even necessary to reach a Technological Singularity. Simply achieving a state in which auxons, von Neumann machines, or other self-replicating systems exist could bring it about, in conjunction with the direction of a societal elite class who, by virtue of no longer needing human involvement in physical labor, could remain extremely small in number.

The von Neumann machine is NOT far-fetched futuristic science fiction; it is likely to be a reality in the extremely near future.

Couple this with nanotechnology and advancements in human genetics and we have world-changing technology on the horizon.
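To make the labor argument concrete, here is a toy simulation (entirely my own illustration, with made-up parameters, not a model of any real system) of a self-replicating fleet in which a fixed, small supervisory staff oversees output that compounds every production cycle:

Code:
# Toy model: part of the fleet builds copies each cycle, the rest does useful work,
# while the human supervisory headcount stays constant.
def simulate(fleet: int, cycles: int, replication_share: float = 0.2,
             output_per_machine: float = 1.0, supervisors: int = 10) -> None:
    for cycle in range(1, cycles + 1):
        copies_built = int(fleet * replication_share)
        output = fleet * (1 - replication_share) * output_per_machine
        fleet += copies_built
        print(f"cycle {cycle:2d}: fleet={fleet:6d}  output per supervisor={output / supervisors:8.1f}")

simulate(fleet=100, cycles=10)
# Output per supervisor grows roughly 20% per cycle while the supervisor count never changes.

The specific numbers are arbitrary; the point is that once machines build and maintain machines, output scales with the size of the fleet rather than with the human headcount.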

And what will the cultural and political implications be of a world where mechanical labor is far cheaper than manual human labor? We've already seen the effect that automation and cheap labor have had on US labor forces.

It has all but eliminated a segment of society made prosperous by heavy industry, and forced a sharp division between a skilled upper-middle and upper class and the service industry. Now, even in the service industry, technology is phasing out human beings.
 
sgtmac_46 said:
Of course your assessment of the "impossibility" of the technology has been leveled at the vast majority of the scientific advancements of the 20th century.

The difference is that, in the 20th century, scientists actually had a pretty good idea of how eukaryotes and aerodynamics actually work.

The arguments being presented for AI and similar technologies, as well as these speculations for a metahuman "superintelligence", all indicate a fundamental ignorance of the way the human brain and human consciousness actually operate. And, as many psychologists and neuroscientists will tell you, there's a helluva lot we just don't know. Ergo, the idea of artificially replicating these processes is rather silly.

But, whatever the case, at the very least these guys are not taking into account:

1) The domain-specific nature of "intelligence", as opposed to the mistaken impression that there is a single, unilateral, cross-domain variable you can call "intelligence".

2) The hierarchical-emergent and discontinuous nature of all domains of "intelligence", as opposed to the archaic notion that "intelligence" is something of a continuous variable like heat or elevation that simply goes up or down across a given wavelength or scale.

3) The irreducible complexity of the human brain, which: a) has more synaptic pathways than there are stars in the galaxy, b) is the most phenotypically plastic structure in nature, and c) is perhaps the most organically plastic structure in nature.

4) The irreducible influence that cultural and socioeconomic variables have on individual "intelligence", including in non-human organisms.

At present, much of this speculation is based on archaic, anemic, monopolar, "flat", and ridiculously simplistic notions about how intelligence actually operates.

sgtmac_46 said:
And what will the cultural and political implications be of a world where mechanical labor is far cheaper than manual human labor? We've already seen the effect that automation and cheap labor have had on US labor forces.

It has all but eliminated a segment of society made prosperous by heavy industry, and forced a sharp division between a skilled upper-middle and upper class and the service industry. Now, even in the service industry, technology is phasing out human beings.

Once again, if such a phenomenon happened, it'd be a first in human history.

Who would have guessed the majority of people in America would own --- or at least have easy access to --- computers, television sets, telephones, cellular phones, automobiles, and so forth?? If you had asked the individuals pioneering the original technology for these instruments, they'd have likely told you something similar to what you're saying about nanotechnology, genetic engineering, cybernetics, and so on. I see what you're predicting happening only in the short term.

In any event, the importation of new technology into the workforce does make many 'manual' or 'physical labor' jobs obsolete --- but it also creates a whole new array of jobs that need to be done. Overseeing the operation of the new technology, maintenance of the new technology, networking of the new technology into the jobsite's overall infrastructure, and probably dozens I have missed.

Assembly lines are becoming less and less common these days, it's true. But how many new jobs involving the use of computers, or the maintenance and oversight of the technology now on assembly lines, have arisen??

It's a bit more complex than all this wild speculation is making it out to be --- whether we're talking about socioeconomics or intelligence.

Laterz.
 
heretic888 said:
The difference is that, in the 20th century, scientists actually had a pretty good idea of how eukaryotes and aerodynamics actually work.

The arguments being presented for AI and similar technologies, as well as these speculations for a metahuman "superintelligence", all indicate a fundamental ignorance of the way the human brain and human consciousness actually operate. And, as many psychologists and neuroscientists will tell you, there's a helluva lot we just don't know. Ergo, the idea of artificially replicating these processes is rather silly.
It is not necessary to replicate every process in the human brain. The most accurate statement you made was "We just don't know." Furthermore, you assume that there is consensus on the level of technology that will be available in the next 30 to 50 years, and the idea that you can predict its path is itself rather silly.

heretic888 said:
But, whatever the case, at the very least these guys are not taking into account:

1) The domain-specific nature of "intelligence", as opposed to the mistaken impression that there is a single, unilateral, cross-domain variable you can call "intelligence".

2) The hierarchical-emergent and discontinuous nature of all domains of "intelligence", as opposed to the archaic notion that "intelligence" is something of a continuous variable like heat or elevation that simply goes up or down across a given wavelength or scale.

3) The irreducible complexity of the human brain, which: a) has more synaptic pathways than there are stars in the galaxy, b) is the most phenotypically plastic structure in nature, and c) is perhaps the most organically plastic structure in nature.

4) The irreducible influence that cultural and socioeconomic variables have on individual "intelligence", including in non-human organisms.


At present, much of this speculation is based on archaic, anemic, monopolar, "flat", and ridiculously simplistic notions about how intelligence actually operates.
Your entire argument is based on the inability of "Strong AI" to reproduce every aspect of the human brain. That argument is irrelevant. Reaching a species-altering technological point can occur without replicating human intelligence.


heretic888 said:
Once again, if such a phenomenon happened, it'd be a first in human history.
Is your argument really "It can't happen because it hasn't happened"?

heretic888 said:
Who would have guessed the majority of people in America would own --- or at least have easy access to --- computers, television sets, telephones, cellular phones, automobiles, and so forth?? If you had asked the individuals pioneering the original technology for these instruments, they'd have likely told you something similar to what you're saying about nanotechnology, genetic engineering, cybernetics, and so on. I see what you're predicting happening only in the short term.
They would have told me something similar? Why is that? What about TVs and telephones would lead to replacing the bulk of human labor?

Just exactly what technology that we have previously had available even remotely equates to genetic engineering, cybernetics, and nanotechnology on the scale of altering the human species? Genetic engineering itself has the potential to alter the very definition of what 'is' human. In fact, so do nanotechnology and cybernetics.

What you fail to grasp is the snowballing effect of technology: how each new step alters humanity further and faster than the steps before, rapidly increasing in speed and effect.

Make a list of the technological advancements that have fundamentally altered human life in the last 10,000 years.

How many were in the last 2000 years?
The last 1,000 years?
The last 500 years?
The last 200 years?
The last 100 years?
The last 60 years?
The last 50 years?
The last 20 years?

Notice how little each list shrinks, even as the time span it covers collapses? The printing press was societally altering; it changed the face of Europe in a couple of centuries.

How does access to printed words compare to near-instant access to millions upon millions of pages of information at the click of a mouse? The last 100 years of human history have produced more species-altering discoveries and inventions than the 10,000 years before them. What's more, the advancement of science and technology is speeding up, not slowing down.

heretic888 said:
In any event, the importation of new technology into the workforce does make many 'manual' or 'physical labor' jobs obsolete --- but it also creates a whole new array of jobs that need to be done. Overseeing the operation of the new technology, maintenance of the new technology, networking of the new technology into the jobsite's overall infrastructure, and probably dozens I have missed.
Again, how many jobs are necessary at the point at which robotics becomes self-replicating and self-repairing? The only job necessary is that of the initial programmer. Even maintenance can be done more cheaply and efficiently with other automation. Supervision of their function is ultimately superfluous as well. Again, give me a list of the types of jobs necessary in that environment and who will fill them. I can tell you they will fall only in creative areas.

Anything that requires any sort of repetition and very little creativity will be done more cheaply and efficiently with automation. Everything from factory work, much of which is already automated, to all manner of service jobs.

So, let's hear this large laundry list of jobs that will still require human labor or be opened up by this level of technology. They will fall solely into the realm of professional services, as they increasingly do now.

heretic888 said:
Assembly lines are becoming less and less common these days, it's true. But how many new jobs involving the use of computers, or the maintenance and oversight of the technology now on assembly lines, have arisen??

It's a bit more complex than all this wild speculation is making it out to be --- whether we're talking about socioeconomics or intelligence.

Laterz.
Not as many as you would argue. There are a number of professional service jobs, but the bulk of those who would formerly have filled relatively high-paying manufacturing jobs have been relegated to low-paying unskilled service jobs. As automation continues to advance, with software capable of ever more advanced decision-making, unskilled service jobs will disappear, leaving only professional service jobs. What happens to the majority of people unfit or unprepared for professional service jobs at that point?


Think about this the next time you walk into a store with automated checkout. It is designed to replace the function of human employees at a fraction of the cost. E-commerce is pushing service employees out even further. What, then, happens to those employees? Do we create a massive welfare state where they are simply given money to remain consumers?

In short, your dismissive tone appears to be a knee-jerk reaction to the suggestion of strong AI; that much is made clear by your focus on declaring that single point pure fantasy.

This isn't simply about a "superintelligent computer"; that is one, and only one, path to a technological singularity, and it is neither the sole path nor the most likely one. This is about a technological singularity beyond which the human species and our social structure are no longer recognizable. Your declaration that Strong AI is impossible does not answer the core questions.

Even without the implications of Strong AI, it should be clear to many that we are moving toward this phenomenon. The complaint that many new jobs in the US are increasingly low-paying is blamed solely on politics and the failures of political parties, but the truth is that, examined from a bigger perspective, it is likely a natural result of the advancement of human technology and its ability to allow cheaper and more efficient use of resources.

Consider the outsourcing of telemarketing to India. It is the advancement in communications technology that has made that an attractive, cheaper alternative. Thirty years ago, it would not have been remotely an option.

Further, when discussing intelligence, it is not necessary to mystify it and catalogue all the varied ways in which intelligence is truly complex. The measure of an intelligence-enhancing technological advancement is its ability to enhance performance and creativity. Take the computer and the computer database: they allow an intelligent person to perform tasks that would have required many people 100 years ago. For that reason, they are intelligence-enhancing. Declaring the issue of intelligence 'too complex' to discuss in those terms is really a dodge.

The point has been made that any moderately intelligent person with a master's-level education and a computer can max out any written IQ test on the planet. Arguing that IQ tests are not a valid measure of the varied facets of what has come to be known as "intelligence" doesn't really alter the intelligence-enhancing advantage of computer technology.

Further, as computer technology becomes more integrated with human beings, the effect will be that much more pronounced. What took many people just a few years ago will take only a very limited number when coupled with intelligence-enhancing technology and self-replicating robotics.

Truly self-replicating robotics can mine raw material, transport it, process it, create products with it, create more self-replicating robots, and repair itself and other self-replicating robots. It does not require Strong AI to create a technological-singularity effect. Couple such robotics with humans utilizing technological intelligence enhancement, and an extremely small number of human beings can perform what has taken many thousands of human beings up to the present day. So, again, the focus on the "impossibility of Strong AI" is irrelevant.

In fact, once human intelligence has begun the process, it can be carried out without further human interaction. Only the most cursory supervision of the process would be necessary to ensure that no problems develop within the system.

If this type of technology becomes available to individuals, what is going to prevent them from utilizing it to their own advantage? Some sort of species-wide agreement that such technology would be harmful to humans as a whole? Is that going to prevent individuals and nations from exploiting the advantages of such technology? Such an idea hardly grasps human nature. Any technology that is likely to gain individuals and groups an advantage will never remain bottled up, no matter how undesirable humanity as a whole declares it. We need look no further than the nuclear genie for an example of that.
 