Will Artificial Intelligence (A.I.) Bring About the Technological Singularity?

in hive-175254 •  2 months ago 

Do you remember the birth of Skynet in the movie The Terminator? I guess you do. One of humanity's greatest fears about technology is the technological singularity: a hypothetical point in time when we depend so completely on technology for civilization that we cannot revert from it, effectively allowing technology to rule us because we can no longer do anything by ourselves. In this scenario, humans rely on artificial intelligence for assistance and for building the next generation of technology, where everything can and will be done by AI. The singularity would be reached when AI becomes smart enough to upgrade itself indefinitely, which could lead to machines ruling humans. Are you scared? Many people have worried about the day AI will take over the world, but this fear didn't start now; it was laid out as early as 1993 in Vernor Vinge's essay The Coming Technological Singularity.


The Genesis

Many people believe that AI will lead the world to technological singularity soon, perhaps as early as 2050. Before the birth of the computer, we had to do everything ourselves: writing, washing, packing, driving and much more. When the computer was invented, it was built to process simple equations, but as things kept evolving through the era of supercomputers, tasks kept getting easier. Currently we are at a point where a phone, a watch or even a pen can compute things in seconds. To make life even easier and simpler, humans created artificial intelligence (A.I.): programs with a degree of autonomy that allows them to learn and improve their own functions.


Technological Singularity

Every generation of computer is better than its predecessors, but in the current world of A.I., artificial intelligence may be the one creating the next generation of computers. Since it can learn fast and grow its own knowledge, it could reach a level that allows it to go beyond its programming, so that humans can no longer predict its limits. This has been mankind's fear: "A technological explosion in A.I. could lead to the death of mankind."

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an "intelligence explosion," and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make, provided that the machine is docile enough to tell us how to keep it under control. It is curious that this point is made so seldom outside of science fiction. It is sometimes worthwhile to take science fiction seriously. – Irving John Good


Consider Watson, an AI so capable that it beat two of Jeopardy!'s greatest champions, Ken Jennings and Brad Rutter. Watching artificial intelligence become very smart is a cool thing, but it is also scary: if a machine can decide for itself how to advance its own technology and knowledge, it could grow more intelligent than its creator. That could lead to a technological singularity in which we depend on machines for everything and can do nothing without them. Many people believe the emergence of artificial intelligence has already brought us to the edge of that point.


Although A.I. is still in its infancy, and robots haven't reached the Terminator stage where they can mimic us and use A.I. to think like us, the great fear about this technological advancement is a singularity that could lead to the death of humanity.

Will there ever be a point when A.I. leads to technological singularity? Do you think machines could wipe out humanity?


A very renowned proponent of emerging technology, Ray Kurzweil, has not only predicted that advances in technology will lead to the singularity but has even laid out a roadmap to it. Just like you said, the technology is still in its infancy, and we hope to see it become more advanced.
Nice piece


Skynet is one of the things that reminds me of my youth hahaha, the Terminator saga always brings back good memories.

The fear of humanoid or intelligent creations spinning out of control is very old; in fact, I remember that the old golem stories were focused on exactly that fear.

I believe that the advancement of AI technology, coupled with the development of biological computing, may eventually give rise to a new species of non-human intelligent life. As to whether it would be hostile to the race that created it, I'm not sure, but without a doubt humans can do horrible things to such beings (slavery, humiliation, discrimination, etc.), things we in fact already do to each other as a species.

I mean, if we mistreat members of our own species and in many cases kill beings of other species for "fun", would the artificial beings we manage to create have better luck? I don't think so.

They would have to be exceedingly patient and peaceful intelligent beings to endure abuse without retaliation. So in the long run they may snap and rebel, and we would blame them for rebelling without accepting that it was our mistreatment that drove them to it.


Good post, great theme. One certainly doesn't know what to expect from such a breakthrough. The benefits of using AI are tangible, but beyond that, 30 years from now, who knows what the reality will be?

Hello again, @eni-ola!

Your post is very interesting, and apocalyptic. However, I think that no machine will ever surpass the intelligence of a human being.

Obviously, someone will be able to create a machine that has more "knowledge" than many of the people one knows, people who seem about to drive themselves extinct all on their own...

Especially those who practice extreme sports or think they can steal the electric light from street lights. Hehehe.

But "intelligent", I don't think. The concept of "intelligence" is impossible to simulate.

Exterminate humanity? Not at all. We do that ourselves. We don't need any AI to tell us how to wipe ourselves out...

And that's a fact! XD

Thanks for sharing :D

Stay Safe & Stay strong