Computers, by their very nature, don’t need to have a point of view. However, for our purposes, it is often preferred that they do.
In the days before natural language processing, this manifested as a bias toward other computers: Macintosh hardware didn’t run Windows software until 2006, for example, and PCs didn’t recognize printers without deliberate driver installation until Windows 7 arrived in 2009.
But lately, computers have become capable of a new kind of ‘bias’: a ‘biased’ opinion about human beings, and about the world at large.
This past year, computers began working as journalists, writing articles about data-intensive topics such as weather and sports.
For articles generated by the software program Statsheet, over 80% of the time sports readers cannot tell whether a computer or a human wrote the article. Say what you will about sports fans, but a large part of this software’s success comes from the incorporation of ‘bias’ into the articles.
In contemporary society, a major portion of the journalism industry is devoted to producing ‘biased’ articles. Sports fans, for instance, like to read articles that favor their home team rather than ones that give an objective account of the situation. As Statsheet demonstrates, computer-generated articles that sympathize with the local team’s shortcomings and overemphasize its successes are more likely to fool readers into thinking that ‘someone’, rather than ‘something’, wrote the article.
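As a rough illustration only (Statsheet’s actual methods aren’t described here, and the team names and templates below are invented), a template-based generator might inject home-team ‘bias’ along these lines:

```python
import random

# Hypothetical sketch only -- not Statsheet's actual code or templates.
# It shows one way a template-based recap generator could lean toward
# the home team: play up wins, soften losses.

def biased_recap(home, away, home_score, away_score):
    if home_score > away_score:
        # Overemphasize the home team's success.
        opener = random.choice([
            f"{home} dominated from the opening tip",
            f"{home} put on a clinic",
        ])
        return f"{opener}, cruising past {away} {home_score}-{away_score}."
    # Sympathize with the home team's shortcomings.
    excuse = random.choice([
        "despite a gutsy late comeback",
        "after the bounces went the other way all night",
    ])
    return (f"{home} fell just short against {away}, "
            f"{away_score}-{home_score}, {excuse}.")

# Invented example teams and scores.
print(biased_recap("Tar Heels", "Blue Devils", 78, 74))
print(biased_recap("Tar Heels", "Blue Devils", 70, 75))
```

The point isn’t the templates themselves but the editorial stance baked into them: the same box score yields a triumphant or a sympathetic story depending on which side the generator ‘roots’ for.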
As is often emphasized, part of being human and not a computer, at least in 2011, is being ‘conscious’. With consciousness comes subjectivity: a point of view, a knowledge gap between how things look to you and how things really are.
It has long been recognized that in order to pass the Turing Test, a computer will have to imitate human strengths as well as human weaknesses. So in 2029, or whenever the first computer passes the Turing Test, we will still want computers to have a ‘point of view’.
But will the first computer that exceeds human intelligence have a point of view?
Despite the incompatibility of ‘subjectivity’ and ‘objectivity’ in human experience, perhaps a conscious computer smarter than we are will become the first real entity to possess both at once. The closest analogy, though it doesn’t quite capture the notion, might be Doublethink from Orwell’s 1984: holding two conflicting ideas in mind at once and accepting them both.
Empirical inquiry tells us that a Singularity will likely happen, but it can tell us little about the likely ‘subjectivity’ of that Singularity. If it is indeed conscious, will subjectivity restrict a computer as it restricts the human mind?
In many ways, computers will become more than we are, capable of more than we can even imagine, literally. This may be just one more way in which that is true: once computers are as smart as we are, we will not be able to think the way they do. They will likely approach questions in ways completely foreign to the human mind.
About the Author:
Nikki Olson is a writer/researcher working on an upcoming book about the Singularity with Dr. Kim Solez, as well as relevant educational material for the Lifeboat Foundation. She has a background in philosophy and sociology, and has been involved extensively in Singularity research for 3 years. You can reach Nikki via email at [email protected].