Human's Capability of Rational Thought
Discussing the work of Tversky and Kahneman, Stanovich (2009) stated that "being rational means acting to achieve one's own life goals using the best means possible". A long history of research has explored rational human thought and decision making: Tversky and Kahneman (1973) proposed that humans use simple mental shortcuts, known as heuristics, to reach answers to difficult problems. Rationality is regarded as one of the most valued human traits, and some psychologists (such as Stanovich, 2009) argue that a Rationality Quotient (RQ) should be as important in psychological assessment as the Intelligence Quotient (IQ). Individuals certainly differ in their patterns of cognition, which means that some think and make decisions more rationally than others. However, as Tversky and Kahneman have shown, the basic structure of human cognition makes all of us susceptible to errors of judgement, although individuals are at times able to override these errors and make rational decisions. Whether humans are capable of rational thought remains an open question, best investigated through the relevant theories and experiments.
Classical decision theory is a collection of the earliest models of decision making. One of these is the model of the economic man and woman. It makes three assumptions: that decision makers are fully informed about the options, that they are sensitive to even the subtlest distinctions between those options, and that they choose rationally. An alternative account, subjective expected utility theory, states that when people make decisions their ultimate objective is to bring about pleasure (positive utility) and to avoid pain (negative utility). They weigh subjective utility, the personal value of each outcome rather than any objective criterion, against subjective probability, their own estimate of how likely each outcome is. Compared with the economic man and woman model, this theory takes into account the complexity and subjectivity of the human mind: it is unrealistic to suppose that anyone could be truly fully informed about a difficult choice and sensitive to every minor difference between the options (Sternberg & Sternberg, 2006).
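The calculation that subjective expected utility theory describes can be made concrete with a minimal sketch. The umbrella scenario, the probabilities, and the utility values below are invented purely for illustration:

```python
# A minimal subjective-expected-utility sketch with invented numbers: each
# option's score is the sum of subjective probabilities times subjective
# utilities over its possible outcomes.
options = {
    "take umbrella":  [(0.3, 5), (0.7, -1)],    # (P(rain), stay dry), (P(dry), carrying cost)
    "leave umbrella": [(0.3, -10), (0.7, 2)],   # (P(rain), get soaked), (P(dry), travel light)
}

def seu(outcomes):
    # Subjective expected utility: probability-weighted sum of utilities.
    return sum(p * u for p, u in outcomes)

best = max(options, key=lambda name: seu(options[name]))
print({name: seu(o) for name, o in options.items()}, "->", best)
```

On these invented numbers, "take umbrella" scores 0.8 against -1.6, so the theory's rational chooser takes the umbrella.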
Taking this into consideration, one could argue that humans are not capable of fully rational thought: we do not always make the best decisions because we cannot process and weigh such vast amounts of information. Kahneman (2011) proposed that the mind works by means of two systems. System 1 is automatic and fast-responding; it handles simple tasks such as answering 2 + 2, reading, and identifying objects. System 2 is slower and handles more difficult and demanding questions. It requires attention and is engaged in tasks such as picking out a voice in a crowd or filling out a form; if attention is lacking in these situations, System 2 does not perform as well. Individuals use a mixture of both systems in daily decision making.
Sometimes when individuals focus their attention too intensively, they become blind to other stimuli. This was shown in a study by Chabris and Simons (2010), in which people had to count how many passes of a ball were made between basketball players. Most people missed a very obvious stimulus: a woman in a gorilla suit appearing mid-way through the video. This demonstrates flaws in both systems, as individuals can become blind to stimuli and remain oblivious to their blindness. Biases in cognition, such as hindsight bias and overconfidence (Sternberg & Sternberg, 2006), may underlie these mistakes. Deductive reasoning, the process of drawing conclusions from general statements, offers valuable information about rational human cognition. Conditional reasoning, a type of deductive reasoning, occurs when a person reaches a conclusion from an "if-then" proposition (Sternberg & Sternberg, 2006). It works effectively for simple conditional arguments such as if "p" then "q"; "p"; therefore "q" (or "if it rains she gets wet; it rains; therefore she gets wet"). This is known as the modus ponens form, which is a logically valid inference. Individuals usually have no difficulty quickly reaching the correct conclusion when asked to complete statements in this form.
However, it takes longer to complete a statement in the modus tollens form: if "p" then "q"; "q" is false; therefore "p" is false (or "if it rains, she gets wet; she did not get wet; therefore it did not rain"). Modus ponens and modus tollens are the two valid inferences from simple conditional arguments (Eysenck & Keane, 2010). Two further inferences can be drawn from these arguments, but they are invalid (fallacies). One is the "affirmation of the consequent": if "p" then "q"; "q"; therefore "p" (or "if it rains she gets wet; she gets wet; therefore it was raining"). This is invalid because the statement does not say that the reverse holds; nevertheless, some individuals complete the statement this way, although the only correct response is that no conclusion can be drawn.
In the case of "denial of the antecedent", some individuals complete the statement if "p" then "q"; not "p"; with therefore not "q" (or "if it rains then she gets wet; it did not rain; therefore she did not get wet"), which is also invalid (Eysenck & Keane, 2010). In one experiment (Marcus & Rips, 1979), 100% of the subjects made the modus ponens inference, only around 50% made the modus tollens inference, and even fewer made the affirmation of the consequent and denial of the antecedent inferences. That people often fail to draw the valid inferences, and sometimes endorse the fallacies instead, suggests that human reasoning does not strictly follow the rules of formal logic. It is interesting to note that when an alternative condition is given, very few subjects make affirmation of the consequent or denial of the antecedent inferences. An example is if "p" then "q"; if "r" then "q"; "q"; therefore ...? (or "if it rains, she gets wet; if it snows, she gets wet; she got wet; therefore ...?"). Individuals are more likely to say that no conclusion can be drawn when given a clear alternative condition; extra information can thus aid logical reasoning (Eysenck & Keane, 2010).
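Because these four inference forms are defined over the material conditional, their validity can be checked mechanically by enumerating truth values. The following minimal Python sketch (our own illustration, not taken from the cited studies) confirms that modus ponens and modus tollens are valid while the two fallacies are not:

```python
from itertools import product

def implies(p: bool, q: bool) -> bool:
    # Material conditional: "if p then q" is false only when p is true and q is false.
    return (not p) or q

def valid(premises, conclusion) -> bool:
    # An inference is valid iff the conclusion holds in every case where all premises hold.
    return all(conclusion(p, q)
               for p, q in product([True, False], repeat=2)
               if all(prem(p, q) for prem in premises))

forms = {
    "modus ponens":             ([implies, lambda p, q: p],     lambda p, q: q),
    "modus tollens":            ([implies, lambda p, q: not q], lambda p, q: not p),
    "affirming the consequent": ([implies, lambda p, q: q],     lambda p, q: p),
    "denying the antecedent":   ([implies, lambda p, q: not p], lambda p, q: not q),
}

for name, (premises, conclusion) in forms.items():
    print(f"{name}: {'valid' if valid(premises, conclusion) else 'invalid'}")
```

Running this prints "valid" for the first two forms and "invalid" for the two fallacies, matching the pattern described above.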
This can also be observed in the Wason selection task: subjects are shown four cards (A, B, 2, 3) and told that every card has a letter on one side and a number on the other. The task is to select exactly those cards that must be turned over to test the rule "if a card has the letter A on one side, then it has the number 2 on the other side". Most people select the A card, the 2 card, or both. The correct selection is the A card and the 3 card: only these could reveal a violation, since a 3 behind the A, or an A behind the 3, would break the rule, whereas whatever lies behind the B or the 2 is consistent with it. When the task was framed in a more realistic context (such as envelopes and stamps instead of cards), subjects were more likely to answer correctly (Wason, 1966); a short sketch below makes the card logic explicit. Mental heuristics serve humans well: they are fast, reduce cognitive effort, and usually give the correct answer. Heuristics allow us to process less information and keep our cognition working efficiently. Sometimes, however, they are affected by biases, which causes mistakes when we try to reason rationally. Simon (1957) proposed that people sometimes think rationally and sometimes do not: we show bounded rationality. Unlike classical decision theory, the bounded-rationality view takes into account that we are limited and do not use every piece of available information to make decisions, but instead come close to it by using heuristics.
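To make the selection-task logic above concrete, here is a minimal Python sketch (the card representation and function names are ours, purely for illustration). A card is worth turning over only if its hidden face could combine with its visible face to violate the rule:

```python
# A toy check of the Wason selection task: which visible faces could reveal
# a violation of "if A on one side, then 2 on the other"?
letters, numbers = {"A", "B"}, {"2", "3"}

def worth_turning(visible: str) -> bool:
    # The rule is broken only by a card pairing A with 3, so a card is
    # informative only if its hidden face could produce that pairing.
    hidden_options = numbers if visible in letters else letters
    return any({visible, hidden} == {"A", "3"} for hidden in hidden_options)

for card in ["A", "B", "2", "3"]:
    print(card, "-> turn over" if worth_turning(card) else "-> uninformative")
```

Only A and 3 come out as worth turning; the popular choice of the 2 card can never falsify the rule.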
Satisficing (Simon, 1957) is one of the first heuristics described by researchers: reasoners consider their options one by one until they reach an option they are satisfied enough with. Individuals weigh the benefits of getting more information about each choice against the added cost, time, and effort needed to obtain it. Much moral behaviour is interconnected with bounded rationality, and with satisficing in particular. When faced with a difficult moral decision, people depend on satisficing rather than on maximising (finding the provably best course of action), because maximising would only work in a world less full of unpredictable possibilities; in fact, satisficing can reach better results than maximising (Gigerenzer, 2010). Another way of coming to a decision is to eliminate choices based on their characteristics until a single choice remains that meets the decision maker's criteria. This is called elimination by aspects. For example, when buying a new car, the buyer's main criterion might be automatic transmission, which would eliminate all cars without it (Tversky, 1972). Both strategies are sketched below.
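Both strategies are easy to express as procedures. The following sketch uses an invented car-buying example; the data, thresholds, and function names are assumptions for illustration only:

```python
# Two bounded-rationality strategies on invented car data.
cars = [
    {"name": "Hatch", "transmission": "manual",    "price": 14000, "mpg": 38},
    {"name": "Sedan", "transmission": "automatic", "price": 18000, "mpg": 34},
    {"name": "Coupe", "transmission": "automatic", "price": 26000, "mpg": 29},
]

def satisfice(options, good_enough):
    # Consider options one at a time; stop at the first that clears the
    # aspiration level (Simon, 1957), rather than searching for the best.
    for option in options:
        if good_enough(option):
            return option
    return None

def eliminate_by_aspects(options, aspects):
    # Apply each criterion in turn, discarding options that fail it
    # (Tversky, 1972); keep the previous set if a criterion would empty it.
    for keep in aspects:
        remaining = [o for o in options if keep(o)]
        if remaining:
            options = remaining
    return options

print(satisfice(cars, lambda c: c["price"] <= 20000 and c["mpg"] >= 35))
print(eliminate_by_aspects(cars, [lambda c: c["transmission"] == "automatic",
                                  lambda c: c["price"] <= 20000]))
```

Here satisficing stops at the first acceptable car (the Hatch) without ever examining the rest, while elimination by aspects whittles the set down to the Sedan.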
The use of mental shortcuts can lead to biased decisions that are not rational, because individuals often have distorted views of probability. Sometimes we believe that one event has a bearing on another even when the underlying probabilities are unchanged. The "gambler's fallacy" names this problem: people believe that the occurrence of one random event affects the probability of another. Its opposite is the "hot hand", where individuals think an event is more likely to occur again because it has just occurred (Sternberg & Sternberg, 2006). This fallacy is often linked with the representativeness heuristic, which involves estimating the probability of an event "by the degree to which it is (i) similar in essential properties to its parent population and (ii) reflects the salient features of the process by which it is generated" (Kahneman & Tversky, 1972, p. 431). We rely on this heuristic a great deal, and quite often it works. For example, we judge what the weather will be like on a particular day from characteristic cues such as the time of year, the area, and the presence of clouds, and may then correctly or wrongly estimate the probability that it will rain that day (Sternberg & Sternberg, 2006). We also use the representativeness heuristic because we are inclined to believe that the characteristics of a small sample represent the entire population (Tversky & Kahneman, 1971). Base rates, which refer to the prevalence of an event within its population, are often ignored in everyday decision making, yet they are vital for effective decisions (Sternberg & Sternberg, 2006); a worked example follows below. Mistakes caused by the representativeness heuristic do not mean that people are incapable of rational thought, as there are many situations in which this cognitive process is very useful.
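The cost of ignoring base rates can be shown with a worked Bayesian example; the numbers below (a 1% base rate and a 90% accurate test) are invented for illustration:

```python
# Base-rate illustration: how likely is the condition given a positive test?
base_rate = 0.01      # P(condition): the often-ignored base rate
hit_rate = 0.90       # P(positive | condition)
false_alarm = 0.10    # P(positive | no condition)

# Bayes' rule: P(condition | positive) = P(pos|cond) P(cond) / P(pos).
p_positive = hit_rate * base_rate + false_alarm * (1 - base_rate)
posterior = hit_rate * base_rate / p_positive
print(round(posterior, 3))  # ~0.083
```

Representativeness-driven intuition suggests an answer near 90%, because a positive result "resembles" having the condition; the low base rate pulls the true probability below 10%.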
Individuals also tend to estimate the probability of events using the availability heuristic. Information about events that occur frequently is more readily available to us, so when assessing the probability of an event, individuals use the information that comes to mind most easily. However, ease of retrieval does not depend solely on frequency, which means that use of this heuristic sometimes gives rise to systematic biases (Tversky & Kahneman, 1973). When asked whether there are more words beginning with the letter "r" or words that have "r" as the third letter, most individuals say the first option, because words that begin with "r" come to mind more readily than words with "r" in third position. There are actually more words with "r" as the third letter, so this is an example of the availability heuristic leading judgement astray (Tversky & Kahneman, 1973). The availability heuristic has also been associated with fears about certain risks: if a particular risk (such as terrorism or climate change) is more "available" to a person's cognition, they will be more likely to fear it (Renn & Rohrmann, 2000; Sunstein, 2006). A related heuristic is anchoring-and-adjustment. Individuals form their estimates from reference points known as anchors, adjusting their final answer from the first value they see; different starting points therefore lead to different estimates (Tversky & Kahneman, 1974).
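The letter-"r" question itself can be checked empirically against a word list. This sketch assumes a Unix-style dictionary file at /usr/share/dict/words is available; exact counts will vary with the corpus, and Tversky and Kahneman's claim concerns words as they occur in running English text:

```python
# Rough empirical check of the letter-"r" question against a word list.
with open("/usr/share/dict/words") as f:
    words = [w.strip().lower() for w in f if len(w.strip()) >= 3]

first = sum(w[0] == "r" for w in words)   # words beginning with "r"
third = sum(w[2] == "r" for w in words)   # words with "r" in third position
print(f"'r' first: {first}, 'r' third: {third}")
```

The point is not the precise counts but that the question is answerable by counting, whereas the availability heuristic substitutes ease of recall for counting.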
An additional fallacy in people's reasoning is the "sunk-cost fallacy". It is linked to the tendency of people to be loss-averse: Kahneman (2011) proposed that early humans who focused on avoiding threats rather than pursuing opportunities had greater chances of survival, so we evolved to be loss-averse. People tend to invest more resources into something they have already invested in (to avoid the loss), even though more resources are spent in total. An example is when people spend money on a holiday, realise on the first day that they are not enjoying it, but still stay and continue spending money to "get their money's worth" (Sternberg & Sternberg, 2006). The existence of such fallacies means that people have a tendency to think irrationally, but it does not mean we are incapable of rational thought; sometimes heuristics work excellently in solving problems (Cohen, 1981).
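Why the sunk cost should be ignored can be seen in a toy calculation; the figures are invented:

```python
# The money already paid is the same whichever option is chosen, so a
# forward-looking choice depends only on the remaining days, not on
# "getting one's money's worth".
already_paid = 2000                 # sunk either way; should not affect the choice
stay_per_day, leave_per_day = -50, 0
days_left = 6

stay = stay_per_day * days_left     # -300: six more unpleasant days
leave = leave_per_day * days_left   #    0: neutral days at home
print("leave" if leave > stay else "stay")  # prints "leave"
```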
Biases and fallacies in decision making might lead some to believe that people are not capable of rational thought, but as discussed earlier, individuals show bounded rationality (Simon, 1957; Gigerenzer & Selten, 2002): we are rational, but only to an extent. Individual differences show that some people are more capable of rational thought than others, yet learning to suppress biases (by taking base rates into account, for example) can lead to better decision making (Gigerenzer, 1996). Sometimes people make decisions that follow no logical pattern; in conditional reasoning some make affirmation of the consequent or denial of the antecedent inferences, but fewer individuals reach these conclusions than make the correct, logical inferences (Marcus & Rips, 1979). The question that remains is whether rational thought is simply logical thought, or the ability to make the best decision in a given situation. Either way, a vast body of research has shown that people are indeed capable of making rational decisions, and the next step lies in recognising that rationality is as valuable as intelligence in the modern world (Stanovich, 2009).
References
- Stanovich, K. E. (2009). What intelligence tests miss: The psychology of rational thought. Yale University Press.
- Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207-232.
- Sternberg, R. J., & Sternberg, K. (2006). Decision making and reasoning. In Cognitive Psychology (6th ed., pp. 507-509).
- Kahneman, D. (2011). Thinking, fast and slow. Macmillan.
- Chabris, C., & Simons, D. (2010). The invisible gorilla: And other ways our intuitions deceive us. Harmony.
- Eysenck, M. W., & Keane, M. T. (2010). Reasoning and deduction. In Cognitive Psychology (6th ed., pp. 571-605). Hove and New York: Psychology Press.
- Marcus, S. L., & Rips, L. J. (1979). Conditional reasoning. Journal of Verbal Learning and Verbal Behavior, 18(2), 199-223.
- Wason, P. C. (1966). Reasoning. New Horizons in Psychology, 135-151.
- Simon, H. A. (1957). Models of Man: Social and Rational. Wiley.
- Gigerenzer, G. (2010). Moral satisficing: Rethinking moral behavior as bounded rationality. Topics in Cognitive Science, 2(3), 528-554.
- Tversky, A. (1972). Elimination by aspects: A theory of choice. Psychological Review, 79(4), 281-299.
- Kahneman, D., & Tversky, A. (1972). Subjective probability: A judgment of representativeness. Cognitive Psychology, 3(3), 430-454.
- Tversky, A., & Kahneman, D. (1971). Belief in the law of small numbers. Psychological Bulletin, 76(2), 105-110.
- Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychological Review, 90(4), 293-315.
- Renn, O., & Rohrmann, B. (2000). Cross-cultural risk perception: State and challenges. In Cross-Cultural Risk Perception (pp. 211-233). Springer US.
- Sunstein, C. R. (2006). The availability heuristic, intuitive cost-benefit analysis, and climate change. Climatic Change, 77(1), 195-210.
- Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131.
- Cohen, L. J. (1981). Can human irrationality be experimentally demonstrated? Behavioral and Brain Sciences, 4(3), 317-331.
- Gigerenzer, G., & Selten, R. (Eds.). (2002). Bounded rationality: The adaptive toolbox. MIT Press.
- Gigerenzer, G. (1996). On narrow norms and vague heuristics: A reply to Kahneman and Tversky (1996). Psychological Review, 103(3), 592-596.