pmb: (Default)
[personal profile] pmb
When people aren't trying to screen out people vs. non-people, then we have many many instances of computers successfully passing themselves off as people and having long conversations with unwitting strangers who never caught on. If I told you that a particular AIM account was actually a perl script designed to pass the Turing test, and, when you initiated a chat with that account it said "No, man. That's just one of my friends playing a trick on me - I'm totally real, and that's totally a hoax", how could it convince you of its humanity without resorting to out-of-band methods "call me on the phone" or "check out my webpage"?

If you can't think of a method, then I submit that computers have already passed the Turing test.

Date: 2005-06-29 11:34 pm (UTC)
From: [identity profile] amoken.livejournal.com
I cannot think of a standardizable test, nor could I give you a time limit on how long a test could take. I think it's indeterminate, and a moving target anyway. But for starters, I'd go through the random sorts of stuff that computers tend to have difficulty with (common sense—when Alice goes to the store does her head go with her; amateur chatter in several domains; mindless chitchat; tell me about yourself, your parents, your home, your job, etc; and so on), ask more in-depth questions in response, and possibly go through some of the things computers tend to excel at.

Date: 2005-06-29 11:39 pm (UTC)
From: [identity profile] pmb.livejournal.com
Right, but I contend that the person (who is irritated with the prank) would not give you the in depth answers you want - they've had enough deep probing from strangers. And I'm pretty sure I could write a program that answered most facile questions, and made pleasant chitchat and then quickly became pissed off.

Our biggest Turing Test successes have been in simulating conversations as people who have mental problems - paranoid schizophrenics turn out to be the easiest ones of all. I think irritation from a bunch of people asking if you are really human might lead to a simulatable frame of mind (getting pissed off when the answer isn't obvious for example).

Date: 2005-06-29 11:47 pm (UTC)
From: [identity profile] amoken.livejournal.com
Oh! If they claim to be intentionally uncooperative, then sure. But then we get to the point where I don't care, cuz it's not a very interesting game for me. :)

Date: 2005-06-30 06:36 am (UTC)
From: [identity profile] patrissimo.livejournal.com
but then they haven't passed the turing test. They have rendered an unconclusive answer. Or rather, once computers are good enough to simulate a human who has gotten annoyed, we need to make sure that we don't judge "looks like an annoyed human" as passing the Turing Test, we judge it as an ambiguous answer.

I don't think that means that computers have passed the turing test. It just means that we were giving a weak test because computers were so bad.

Look, by a similar argument to yours, the null response could be considered "passing the turing test", because hell, it could just be a person who doesn't feel like talking. A reasonable definition for the turing test has to include "After talking co-operatively for as long as the interviewer desires..." Heck, it might, for all I know. Turing was a smart fella.

Date: 2005-06-30 05:13 pm (UTC)
From: [identity profile] ouro.livejournal.com
I'm not so sure that the simulation of a deranged interlocutor can be considered a valid pass of the Turing Test. More interesting, then, would be able to simulate a number of moods and mood shifts in the conversation.

Profile

pmb: (Default)
pmb

October 2009

S M T W T F S
    1 23
45678910
11121314151617
18192021222324
25262728293031

Most Popular Tags

Style Credit

Expand Cut Tags

No cut tags
Page generated Jul. 16th, 2025 12:40 am
Powered by Dreamwidth Studios