> I don't think the Turing Test is teaching us anything about AI system capabilities.
Sure, it is.
A system only really passes the Turing test (you might call this the "focused Turing test") to the degree that it passes the regular Turing test when taken by judges whose experience of AI systems matches the system being evaluated.
That is, when someone who has experience with both humans and that kind of AI system, and who knows specifically that they are trying to distinguish humans from that kind of AI system, still cannot do so better than chance.
Anything less and the system can still be distinguished from humans; even if it gets by, that is only because human expectations for the particular test are primed in a way that has the judges looking the wrong way.