luis is a co-founder and social software architect at SyndeoLabs, and a director at Exist Global. he likes building small web toys a whole lot. More ...

quick links to the good stuff

  • 25 First Dates 25 May 2009
  • True Crime: Confessions of a Criminal Mastermind 17 February 2009
  • Finding Your Soul Mate: A Statistical Analysis 27 January 2009
  • Sex and Schrodinger's Cat 07 January 2009
  • An Extended Rant on Heroes 26 September 2008
  • Zero Barrier 05 May 2008
  • Sweatshop Blogging Economics 08 April 2008
  • The Doomsday Singularity 25 February 2008
  • Piracy and Its Impact on Philippine Music 21 January 2008
  • The Manila Pen-etration by the Hotelier Antonio Trillanes 29 November 2007
  • Journey of a Thousand Heroes 17 December 2006
  • Shake, Rattle & LOL 30 December 2005

    elsewhere online

    • Last.FM
    • Del.icio.us
    • Flickr
    • Plurk
    • Multiply
    • Stumbleupon

    guttervomit


      The Doomsday Singularity

      25 Feb 2008

      A supplement to my recent article about the End of the World:

      I was reading today about a potentially apocalyptic event in our future referred to as the Technological Singularity, and have come to the conclusion that if the Mayan deadline passes by uneventfully, this may be the most likely way humanity will eventually wipe itself out.

      It’s no accident that the term “singularity” appears in other branches of science as well. The best-known is the spacetime singularity, a point at which gravity approaches infinity and the laws of modern physics break down (the study of black holes mentions this a lot). Here’s the CliffsNotes version: when we fall, our speed keeps climbing for as long as we fall. If it were possible to fall from an infinite height (with nothing to slow us down), our speed would grow without bound given enough time.
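      (If you want the arithmetic behind that intuition, here’s a quick back-of-the-envelope sketch in Python. The 9.8 m/s² figure is just standard Earth gravity, air resistance is ignored entirely, and the numbers are only there to show that the speed never levels off.)

          # Speed of a falling object under constant gravity, ignoring
          # air resistance. The point: the speed keeps growing with time
          # and never levels off.
          g = 9.8  # m/s^2, standard Earth gravity
          for t in range(0, 70, 10):  # seconds of free fall
              print(f"after {t:2d}s: falling at {g * t:6.1f} m/s")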

      In a similar vein, a “technological singularity” refers to a point in time when all of our knowledge and innovations occur at such speed that the potential (and consequences) become “infinitely unpredictable.” Specifically, we’re looking at the moment when man builds an ultra-intelligent machine that can surpass the intellect of its makers, at which point things spin out of control.

      When the statistician I. J. Good first hypothesized this in 1965, he declared that such an ultra-intelligent machine would be “the last invention that man need ever make,” because at that point we will have rendered ourselves obsolete.
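      (Good’s “intelligence explosion” is essentially a feedback loop, and a toy sketch makes it easy to see why it runs away. Every number below is invented purely for illustration; nobody knows the real growth rate, which is rather the point.)

          # Toy sketch of Good's feedback loop: each generation of machine
          # designs a successor slightly smarter than itself, so capability
          # compounds. All numbers are made up for illustration only.
          capability = 1.0            # start at roughly human level
          gain_per_generation = 1.1   # assume each successor is 10% smarter
          for generation in range(1, 101):
              capability *= gain_per_generation
              if capability >= 100:   # 100x human level
                  print(f"Reaches 100x human level by generation {generation}")
                  break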

      As you can imagine, this is the kind of theory that makes for some incredibly dramatic science fiction. In the Terminator movies, the Singularity is reached on July 25th, 2004, and is promptly followed by a massive nuclear launch that nearly wipes out the entire human race. In the Matrix trilogy, the robots subjugate humanity at the turn of the 22nd century. Other writers have even worked out methods of prevention: in William Gibson’s Neuromancer, artificial intelligences are regulated by “Turing Police” to make sure they never become smarter than us. And leaping even further beyond that, in Dan Simmons’ Hyperion, a group of artificial intelligences debate whether to design a new technology that will render themselves obsolete, suggesting that even the AIs may have to face their own subsequent singularity.

      That it’s possible to build this ultra-intelligent game-ender at all seems obvious to me, which is why the theory is so troubling. We’re already building machines that are better than us at specific tasks, and it’s only a matter of time before we build a machine that is better than us at everything. (For example, although the chess computer Deep Blue only barely plays the game better than the best players in the world, it totally trounces its own programmers. So it is definitely possible to build an entity that exhibits more intelligence than its creators; the key difference is that, at the moment, this is only possible in very narrow applications like chess.)

      Precisely when this world-changing event will occur is, of course, unknown, although the leading voices on the topic have published some opinions. The rather dramatic first paragraph of mathematician Vernor Vinge’s 1993 treatise reads:

      Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended.

      If this is the kind of analysis that intrigues you, check out the Singularity Institute for Artificial Intelligence, and this riveting article on friendly (and un-friendly) AI. 15 years to go, folks :)

      12 Responses to “The Doomsday Singularity”

      1. Ryan Says:
        February 26th, 2008 at 8:41 am

        I had an STS professor once who argued that an AI that’s smarter than us would never be created, because humans aren’t stupid enough to create something that they know would wipe our species out.

        Obviously, he has much greater faith in humanity than I do.

      2. luis Says:
        February 26th, 2008 at 11:09 am

        Yeah, he probably belonged to the camp opposing the Singularity theorists. They’re the first ones the machines will crush, I reckon.

      3. Ramil Says:
        February 26th, 2008 at 1:40 pm

        Why so morbid, man? Succumbing to this theory would mean losing hope for all of humanity. I don’t care how intelligent these machines could get; the triumph of the human spirit always has prevailed and always will. :-)

      4. Ryan Says:
        February 26th, 2008 at 4:52 pm

        The human spirit cannot prevail against mechanized might!

        It isn’t so much morbidity as a thoughtful look at what the future may bring. Suddenly, I’m wondering if maybe my STS professor was right. Not so much that we wouldn’t create an intelligence smarter than us, but that some entity, say a corporation, would hold it back in the same way that oil companies have stunted research into hydrogen cars or renewable fuel sources.

      5. luis Says:
        February 26th, 2008 at 6:04 pm

        >corporation would hold it back in the same way that oil
        >companies have stunted research into hydrogen cars

        Greedy corporations eventually save the human race from extinction. Haha, how deliciously ironic.

      6. Ryan Says:
        February 27th, 2008 at 9:18 am

        haha exactly! But it makes sense, in a way, that corporations are designed to crush anything that gets in their way, be it mom-and-pop stores or world-dominating AI. If anything, it’d make for humorous science fiction.

      7. morgan de sade Says:
        February 28th, 2008 at 9:12 am

        robot builders should and must build a self-destruct button into all AI machines, just in case. hehehehe.

      8. morgan de sade Says:
        February 28th, 2008 at 9:53 am

        the following documentaries are a must-watch.

        ok, this one is about the development of the electric car and, basically, its destruction by its own developers; apparently electric cars were too efficient for profit.
        http://www.whokilledtheelectriccar.com

        this next one is about the peak oil debate, which covers the coming fall of oil supplies around the world.
        the scary thing is that the turning point for the dramatic drop in oil supply is between 2010 and 2012.
        http://www.oilcrashmovie.com/

      9. j4s0n Says:
        March 1st, 2008 at 8:48 am

        The mention of Hyperion makes me wanna grow young than old. Good read =)

      10. Ryan Says:
        March 5th, 2008 at 5:10 pm

        isn’t Hyperion one of the battlecruisers in StarCraft? The one Raynor has?

      11. Of a Thousand Truths » Blog Archive » Hyperion Says:
        March 10th, 2008 at 7:36 am

        [...] read a post named The Doomsday Singularity today, the author luis briefly described the danger of a superhuman AI taking over and render [...]

      12. sikor9845 Says:
        November 12th, 2010 at 10:19 am

        thanks. good article


    categories

    • Home
    • No categories

    archives

    • April 2011
    • March 2011
    • February 2011
    • January 2011
    • August 2010
    • May 2010
    • April 2010
    • February 2010
    • January 2010
    • December 2009
    • November 2009
    • October 2009
    • September 2009
    • August 2009
    • July 2009
    • June 2009
    • May 2009
    • April 2009
    • March 2009
    • February 2009
    • January 2009
    • December 2008
    • November 2008
    • October 2008
    • September 2008
    • August 2008
    • July 2008
    • June 2008
    • May 2008
    • April 2008
    • March 2008
    • February 2008
    • January 2008
    • December 2007
    • November 2007
    • October 2007
    • September 2007
    • August 2007
    • July 2007
    • June 2007
    • May 2007
    • April 2007
    • March 2007
    • February 2007
    • January 2007
    • December 2006
    • November 2006
    • October 2006
    • September 2006
    • August 2006
    • July 2006
    • June 2006
    • May 2006
    • April 2006
    • March 2006
    • February 2006
    • January 2006
    • December 2005
    • November 2005
    • October 2005
    • September 2005
    • August 2005
    • July 2005
    • June 2005
    • May 2005
    • April 2005
    • March 2005
    • February 2005
    • January 2005
    • December 2004
    • November 2004
    • October 2004
    • September 2004
    • August 2004
    • July 2004
    • June 2004
    • May 2004
    • April 2004
    • March 2004
    • February 2004
    • January 2004
    • December 2003
    • November 2003
    • October 2003
    • September 2003
    • August 2003
    • July 2003
    • June 2003
    • May 2003
    • April 2003
    • March 2003
    • February 2003
    • January 2003
    • December 2002
    • November 2002
    • October 2002
    • September 2002
    • July 2002
    • May 2002
    • April 2002
    • February 2002
    • January 2002
    • December 2001
    • November 2001
    • October 2001

    friends

    • Dementia
    • Gabby
    • Gail
    • Gibbs
    • Helga
    • Ia
    • Ina
    • Jason
    • Kaye
    • Lauren
    • Lizz
    • Luna
    • Mae
    • Migs
    • Mike
    • Ryan
    • Sacha
    • Vicky
    • Vida
    • Yuga

    notes

    Guttervomit v3 went online in January 2008. It uses WordPress for publishing, and was built largely with Adobe Illustrator and TextMate. The logotype and navigation are set in Interstate.