Monday, November 17, 2008

I H8 Robots

Now that the election is done, we can get back to our favorite pastime: ridiculing those who are engineering the downfall of the human species. "Whole Brain Emulation," you say?

The best quote is this one:

"The only way you can emulate a person with a computer is by first defining the person to be a machine. The Future of Humanity Institute would seem to be misnamed."

Several problems immediately present themselves with the idea that human scientists can replicate the behavior of an individual brain (funny, they don't say "mind"...). One is that, based on the description given, the process being employed to capture a singular brain's behavior is reductive and structurally rigid. In short, it's no better than an approximation of any given brain's function (because it is limited by its own assumptions, which become built-in structural constraints), and it would likely produce nothing more than output that resembles the output of a living, human brain. Or, to put it another way: the Whole Brain Emulation of, say, your brain would probably do little more than remind, say, your mom of you. Evocative does not equal authentic. Sorry, scientist-guys.

Another issue is that the people heading up this endeavor apparently don't believe that randomness is very important to the human brain-condition. That notion is so crazy that I submit it proves just how central irrationality in fact is. QED.

Finally, their proposal to introduce an element of irrationality into the simulation of a brain is roughly equivalent to the "difficulty setting" designed by the Sega Genesis programmers circa 1994: to make the game more "lifelike," one need only instruct the computer to change the rules. In College Football '94, if you turned up the difficulty setting, the game would simply stop letting the human player affect the outcome: all passes would be incomplete, all runs would result in fumbles, all penalties would be on you, etc. In the Whole Brain Emulation, it sounds like the irrational would be captured, to a limited extent, by introducing more "noise" into the program, thus increasing the chance that a connection would fail or that an error (literally, a programming error) would result. In other words, there would be a rule that suspends all rules.
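To make that concrete, here is a minimal sketch, in Python, of what "adding noise" to an emulated connection might amount to. Everything in it is hypothetical: the function name, the noise_level parameter, and the failure behavior are my inventions for illustration, since the emulation proposal doesn't specify an implementation.

    import random

    def fire_connection(signal, noise_level=0.05):
        # Hypothetical stand-in for irrationality: with some fixed
        # probability, the connection simply fails to transmit.
        # This is itself a rule, the rule that suspends the rules.
        if random.random() < noise_level:
            return 0.0  # dropped connection ("noise")
        return signal

Note that the misfire rate is just another parameter, chosen in advance by the programmer.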

The problem with this, of course, is that it's based on a set of rules: rules that don't exist in the human mind.

At some point, we have got to take stock of future-science and separate, for the scientists, what is science fiction from what is science actual. They don't seem able to do it themselves (remember, they believe robots will rule the world).