I’m giving you a link to a Business World article that argues we should cease all work on artificial intelligence (AI). It cites a piece that appeared last week in Time, by Eliezer Yudkowsky, who apparently has the background to make the following dramatic claim:
If somebody builds a too-powerful AI, under present conditions, I expect that every single member of the human species and all biological life on Earth dies shortly thereafter.
Both authors agree the proposed six-month moratorium is a step in the right direction but insufficient. The cynical side of me suggests the outcome he predicts is no more than we deserve, although I’m not ready to die just yet.
The sci-fi aficionado in me wonders whether this is why we’ve never been contacted by a space-faring species: perhaps every such civilization invented the machines that eliminated it before attaining interstellar travel, as we appear ready to do.
If somebody designs that too-powerful AI, they should name it Orkin, as we’ll be the pests it exterminates. Who would have guessed that one of the early prophets of the Butlerian Jihad would be named Yudkowsky?