Computational theory of mind

In philosophy, the computational theory of mind is the view that the human mind is an information-processing system and that thinking is a form of computing. The theory was proposed in its modern form by Hilary Putnam in 1961[citation needed] and developed by Jerry Fodor in the 1960s and 1970s.[1] This view is common in modern cognitive psychology and is presumed by theorists of evolutionary psychology.

The computational theory of mind is a philosophical concept that the mind functions as a computer or symbol manipulator: the mind computes input from the natural world to create outputs in the form of further mental or physical states. A computation is the process of taking an input and following a step-by-step algorithm to produce a specific output. The computational theory of mind claims that certain aspects of the mind follow such step-by-step processes to compute representations of the world.
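This notion of computation can be illustrated with a minimal sketch (not the theory itself): a deterministic, step-by-step mapping from input representations to output representations. The rule table here is invented for the example:

```python
# A computation in the relevant sense: take symbolic input, apply a
# fixed step-by-step rule, produce symbolic output.

def compute(symbols):
    """Map each input symbol to an output symbol, one step at a time."""
    rules = {"dark_clouds": "expect_rain", "green_light": "go"}
    output = []
    for s in symbols:                  # each loop pass is one "step"
        output.append(rules.get(s, "unknown"))
    return output

print(compute(["dark_clouds", "green_light"]))
# ['expect_rain', 'go']
```

The point of the sketch is only that the input is a representation (a symbol standing for clouds, not the clouds themselves) and that the output follows mechanically from the rules.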

The computational theory of mind requires representation because the 'input' into a computation comes in the form of symbols or representations of other objects. A computer cannot compute an actual object; it must interpret and represent the object in some form and then compute the representation. The computational theory of mind is related to the representational theory of mind in that both require that mental states are representations. The two theories differ, however, in that the representational theory claims that all mental states are representations, while the computational theory leaves open that certain mental states, such as pain or depression, may not be representational and therefore may not be suitable for a computational treatment. These non-representational mental states are known as qualia. The computational theory of mind is also related to the language of thought, which allows the mind to process more complex representations with the help of semantics (see "Semantics of mental states" below).

"Computer metaphor"

Computational theory of mind is not the same as the computer metaphor, according to which the mind literally works like a computer.[2] Computational theory just uses some of the same principles as those found in digital computing.[2]

'Computer' is not meant to mean a modern-day electronic computer. Rather, a computer is a symbol manipulator that follows step-by-step functions to compute input and form output. Alan Turing described this type of computer in his concept of a Turing machine.

Causal picture of thoughts

At the heart of the computational theory of mind is the idea that thoughts are a form of computation, and a computation is by definition a systematic set of laws for the relations among representations. On this picture, a mental state represents something if and only if there is some causal correlation between the mental state and that thing. For example, seeing dark clouds and thinking "clouds mean rain" involves a correlation between the thought and the rain, because the clouds cause the rain; this is known as natural meaning. Conversely, there is another side to the causality of thoughts: the non-natural representation of thoughts. Seeing a red traffic light and thinking "red means stop" involves nothing about the color red that intrinsically indicates stopping; the representation is simply a convention that has been invented, similar to languages and their ability to form representations.

Semantics of mental states

The computational theory of mind states that the mind functions as a symbolic operator and that mental representations are symbolic representations. Just as the semantics of language are the features of words and sentences that relate to their meaning, the semantics of mental states are the meanings of representations: the definitions of the 'words' of the language of thought. If these basic mental states can have particular meanings just as words in a language do, then more complex mental states (thoughts) can be created, even ones never encountered before, just as new sentences can be understood on first reading so long as their basic components are understood and they are syntactically correct. For example: "I have eaten plum pudding every day of this fortnight." While few have seen this particular configuration of words, most readers can nonetheless understand the sentence because it is syntactically correct and its constituent parts are understood.
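The compositional point can be sketched in code: if the meanings of the parts and the rule for combining them are known, the meaning of a never-before-seen combination can be computed. The lexicon and the 'meaning' labels below are invented stand-ins for entries in a language of thought:

```python
# Compositionality sketch: familiar parts + a combination rule yield
# interpretations of novel wholes.

lexicon = {
    "I": "SPEAKER", "have": "POSSESS", "eaten": "EAT(past)",
    "plum": "PLUM", "pudding": "PUDDING",
}

def interpret(sentence):
    """Compose word meanings, in order, into a structured representation."""
    return [lexicon[w] for w in sentence.split() if w in lexicon]

# A sentence never seen before is interpretable from its known parts:
print(interpret("I have eaten plum pudding"))
# ['SPEAKER', 'POSSESS', 'EAT(past)', 'PLUM', 'PUDDING']
```

Real compositional semantics involves syntactic structure rather than a flat word list, but the sketch shows the productivity the theory appeals to: a finite lexicon plus rules covers indefinitely many novel combinations.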

Criticism

There are arguments against the computational theory of mind. Some of the most compelling[citation needed] concern the physical realization of a computational process. Gallistel writes in Learning and Representation about some of the implications of a truly computational system of the mind; essentially, Gallistel is concerned with the thermodynamic limits on the circuits of the brain. Given the high volume of information to be processed and the low level of loss such a system could tolerate, we have to ask where the energy comes from and how the heat would be dissipated.[citation needed]

John Searle has offered a thought experiment known as the Chinese Room that raises a further problem. Imagine a man in a room with no way of communicating with anyone or anything outside the room except for a piece of paper passed under the door. With the paper, he is to use a series of provided books to "answer" what is on the paper. The symbols are all in Chinese, and all the man knows is where to look in the books, which then tell him what to write in response. It so happens that this generates a conversation that the Chinese speaker outside the room can actually understand, but can the man in the room really be said to understand it? This is essentially what the computational theory of mind presents us with: a model in which the mind simply decodes symbols and outputs more symbols. It is argued that perhaps this is not real learning or thinking at all. However, it can be argued in response that it is the man and the books together that understand Chinese, albeit in a rudimentary way owing to the rudimentary nature of the system, as opposed to the man learning Chinese, which would create a sophisticated system of communicating in Chinese.
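Searle's setup can be sketched as a pure lookup procedure. The rule-book entries below are invented stand-ins for the books in the room; the point is that the procedure produces appropriate replies with no understanding anywhere in the lookup itself:

```python
# Chinese Room sketch: symbol-in, symbol-out, by rule lookup alone.

rule_book = {
    "你好": "你好!",              # "hello" -> "hello!"
    "你会说中文吗?": "会.",        # "do you speak Chinese?" -> "yes."
}

def man_in_room(slip_of_paper):
    """Match the incoming symbols against the books; copy out the reply."""
    return rule_book.get(slip_of_paper, "???")

print(man_in_room("你好"))
# 你好!
```

The translations in the comments are for the reader; the man in the room, like the function, never consults them.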

Searle has further raised questions about what exactly constitutes a computation:

the wall behind my back is right now implementing the Wordstar program, because there is some pattern of molecule movements that is isomorphic with the formal structure of Wordstar. But if the wall is implementing Wordstar, if it is a big enough wall it is implementing any program, including any program implemented in the brain.[3]

Putnam has similarly claimed that "every ordinary open system realizes every abstract finite automaton."[4] Computationalists have responded by aiming to develop criteria describing what exactly counts as an implementation.[5][6] Additionally, Roger Penrose has proposed that the human mind does not use a knowably sound calculation procedure to understand and discover mathematical intricacies, which would mean that a normal Turing-complete computer could not ascertain certain mathematical truths that human minds can.[7]

Prominent scholars

  • Daniel Dennett proposed the Multiple Drafts Model, in which consciousness seems linear but is actually blurry and gappy, distributed over space and time in the brain. Consciousness is the computation; there is no extra step or "Cartesian theater" in which you become conscious of the computation.
  • Jerry Fodor argues that mental states, such as beliefs and desires, are relations between individuals and mental representations. He maintains that these representations can only be correctly explained in terms of a language of thought (LOT) in the mind. Further, this language of thought itself is codified in the brain, not just a useful explanatory tool. Fodor adheres to a species of functionalism, maintaining that thinking and other mental processes consist primarily of computations operating on the syntax of the representations that make up the language of thought.
  • David Marr proposed that cognitive processes have three levels of description: the computational level (which describes the computational problem, i.e., the input/output mapping, computed by the cognitive process); the algorithmic level (which presents the algorithm used for computing the problem postulated at the computational level); and the implementational level (which describes the physical implementation of the algorithm postulated at the algorithmic level in biological matter, e.g. the brain). (Marr 1981)
  • Ulric Neisser coined the term 'cognitive psychology' in his book published in 1967 (Cognitive Psychology), wherein Neisser characterizes people as dynamic information-processing systems whose mental operations might be described in computational terms.
  • Steven Pinker described a "language instinct," an evolved, built-in capacity to learn speech (if not writing).
  • Hilary Putnam proposed functionalism (philosophy of mind) to describe consciousness, asserting that it is the computation that equates to consciousness, regardless of whether the computation is operating in a brain, in a computer, or in a "brain in a vat."
  • Bruno Marchal, professor at the Free University of Brussels, claims in a Ph.D. thesis (University of Lille, France, 1998, Calculabilité, physique et cognition[8]) that physical supervenience is not compatible with computational theory, using arguments such as the Universal Dovetailer Argument and the Movie Graph Argument.
  • Georges Rey, professor at the University of Maryland, builds on Jerry Fodor's representational theory of mind to produce his own version of a Computational/Representational Theory of Thought.

Notes

  1. ^ Horst, Steven (2005), "The Computational Theory of Mind", in The Stanford Encyclopedia of Philosophy
  2. ^ a b Pinker, Steven (2002), The Blank Slate, New York: Penguin
  3. ^ Searle, J.R. (1992), The Rediscovery of the Mind
  4. ^ Putnam, H. (1988), Representation and Reality
  5. ^ Chalmers, D.J. (1996), "Does a rock implement every finite-state automaton?", Synthese 108 (3): 309–333, doi:10.1007/BF00413692, http://cogprints.ecs.soton.ac.uk/archive/00000226/00/199708001.html, retrieved 2009-05-27
  6. ^ Edelman, Shimon (2008), "On the Nature of Minds, or: Truth and Consequences" (PDF), Journal of Experimental and Theoretical AI 20: 181–196, http://kybele.psych.cornell.edu/~edelman/Edelman-JETAI.pdf, retrieved 2009-06-12
  7. ^ Penrose, Roger (1994), "Mathematical intelligence", in Jean Khalfa (ed.), What is Intelligence?, chapter 5, pages 107–136, Cambridge: Cambridge University Press
  8. ^ Bruno Marchal argues that physical supervenience is not compatible with computational theory

Wikimedia Foundation. 2010.