Computer chatbots are always easy to spot, aren't they? They repeat themselves, don't understand conversational tools like sarcasm or irony, and are never, ever funny. But it looks like that might not be the case for much longer. A pair of researchers has managed to create a computer that tricked enough people into thinking it was a real boy that it successfully passed the Turing Test, claimed as the first computer ever to do so.
At a recent test of five computing systems at the Royal Society in London, human judges were asked to hold a five-minute text conversation with each of the machines and decide whether they thought they were chatting with a person or a computer. This is the measure of the Turing Test – introduced by Alan Turing in 1950 – which, under the event's rules, requires convincing 30 per cent or more of the judges to pass, and this particular computer, known as Eugene Goostman if you asked it, managed just that.
Developed by Russian-born Vladimir Veselov and Ukrainian Eugene Demchenko, Goostman tricked 33 per cent of those testing it into thinking that it was a real 13-year-old boy. While other computers have been hailed as the first to pass the Turing Test, they often relied on pre-ordained questions or extracts from real conversations, so the machines weren't thought to be really thinking for themselves. Goostman, on the other hand, does. Or at least appears to.
Veselov said of his machine's victory: “It's a remarkable achievement for us and we hope it boosts interest in artificial intelligence and chatbots.”
He and his partner Demchenko also believe that developments like this will one day raise profound questions for our society, as we try to judge whether a computer that behaves just as we do is alive or not.
Discuss on our Facebook page, HERE.
KitGuru Says: Anyone who's watched 'Her' will no doubt have pondered these same sorts of things. What do you guys think of artificial intelligence? Can it ever be considered alive as we are? [Thanks, The Guardian]