Abstract: John Searle's Chinese room thought experiment militates against strong artificial intelligence, illustrating his claim that syntactical knowledge by itself is neither constitutive of nor sufficient for semantic understanding as found in human minds. This thought experiment was put to a behavioural test concerning the syntax of a finite algebraic field. Input, rules, and output were presented with letters instead of numbers. The set of rules was first presented as a table but was eventually internalized by the participants. Quite in line with Searle's argument, uninformed participants mastered the syntax but did not explicitly report semantic knowledge. In order to test the virtual mind reply to the Chinese room argument, the reaction...
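
As a sketch of the task structure (the specific field size and letter coding below are assumptions, not taken from the abstract), consider the finite field $\mathbb{F}_5 = \{0, 1, 2, 3, 4\}$ under the relabelling $0 \mapsto a$, $1 \mapsto b$, $2 \mapsto c$, $3 \mapsto d$, $4 \mapsto e$. A purely syntactic rule such as

$$c \oplus e = b$$

encodes the arithmetic fact $2 + 4 \equiv 1 \pmod{5}$: a participant can master such letter rules and produce correct outputs without ever recognizing them as modular arithmetic, which is precisely the syntax-versus-semantics gap the experiment probes.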