Sojourner ("Thinks s/he gets paid by the post" · Joined: Jan 8, 2012 · Messages: 2,603)
Anyway, the way I see it: biological immortality, brain emulation, at-will gene programming, and AI.
Any one of those would mean the end of the human race as we know it. And all of them are quite close to reality.
Hmmm... what do you mean by "quite close"? 25 years? 40 years? I also follow this subject with some interest, and I'd say the consensus view is that "the singularity" is certainly not going to occur within the next 25 years, and probably not even the next 50. If you think differently, please share some links that support that point of view.*
The problem, as I understand it, is primarily that true AI -- that is, machine intelligence that is deeply analytical, can generalize and model problems, and that self-adapts and improves as it learns -- is nowhere near being a reality. Of course, there are many AI research projects ongoing around the world that are attempting to advance the field in fundamental ways, but a tremendous amount of progress will need to be made before we're close to creating an AI with something akin to a human-like brain (or even a lizard-like brain).
* According to a poll of attendees at a "Future of Humanity" conference in 2011, there is about a 50% chance of developing an artificial human-level intelligence by the year 2050, and a 90% chance of developing it by 2150. Only 10% thought it would happen by 2028. I imagine the people at this conference would tend to be "fast AI" enthusiasts, so IMHO these poll results could be a bit optimistic... but I could be wrong. The results were posted on the Kurzweil AI website, after all.
Machines will achieve human-level intelligence in the 2028 to 2150 range: poll | KurzweilAI