In his 1980 paper "Minds, Brains, and Programs," John Searle developed the Chinese room argument. A key premise (C4) holds that the way human brains actually produce mental phenomena cannot be solely by virtue of running a computer program. Since programs do not have the "equivalent causal powers" that produce minds, and brains do produce minds, it follows that brains do not produce minds merely by running programs.
John Searle, "Minds, Brains, and Programs"
Even with the additional input and output channels, the man locked inside the Chinese room or robot would still be unable to understand Chinese.
Suppose instead that the instructions in the book in effect have the man simulate all of the neural activity of a genuine speaker of Chinese. Simulations need not possess the same powers or properties as that which they simulate.
A simulated hurricane, for instance, lacks the capacities of a real hurricane; it can only flatten unreal, simulated cities.
Even if they might not succeed singly, perhaps some combination of the replies will nevertheless undermine the conclusion of the Chinese Room thought-experiment.
Strong AI claims that an appropriately programmed computer is a mind; Searle argues, to the contrary, that there are many functions of a human mind that could never be performed by a computer simply by running a program. What does Searle's Chinese Room thought-experiment allegedly show about the Turing test?
Searle is not asserting that the situation is impossible, but rather that it is difficult or impossible to explain how this system can have subjective conscious experience.
As Searle writes, "the systems reply simply begs the question by insisting that the system must understand Chinese." These arguments attempt to connect the symbols to the things they symbolize.
These replies address Searle's concerns about intentionality, symbol grounding, and syntax vs. semantics. Robot reply: Suppose that instead of a room, the program was placed into a robot that could wander around and interact with its environment.
This would allow a "causal connection" between the symbols and the things they represent. Nevertheless, the person in the room is still just following the rules, and does not know what the symbols mean.
Searle writes, "he doesn't see what comes into the robot's eyes."
Derived meaning: Some respond that the room, as Searle describes it, is connected to the world: the symbols Searle manipulates are already meaningful; they're just not meaningful to him.
The meaning of the symbols depends on the conscious understanding of the Chinese speakers and the programmers outside the room.
The room, like a book, has no understanding of its own. Commonsense knowledge would provide a "context" that would give the symbols their meaning.
Hubert Dreyfus has also criticized the idea that the "background" can be represented symbolically.
His actions are syntactic, and this can never explain to him what the symbols stand for. Searle writes, "syntax is insufficient for semantics."
While Searle is trapped in the room, the virtual mind is not. Brain simulation and connectionist replies: Note that the "robot" and "commonsense knowledge" replies above also specify a certain kind of system as being important.
Brain simulator reply: Suppose that the program simulated in fine detail the action of every neuron in the brain of a Chinese speaker.