The Rationality Dialogue Between Economics and Psychology
Jing Qian*
Economics and psychology take distinct approaches to predicting and formalizing human behavior. Economics focuses on the normative view of rationality, while psychology emphasizes its descriptive nature. This article reviews models of rationality relevant to understanding human decision making, including notions of complete rationality, bounded rationality, and ecological rationality. By examining the aims and functions of these models, the author draws attention to the utility of the ecological rationality and adaptive rationality approaches within a unified theoretical framework for understanding human rationality. The ecological rationality approach evaluates behavior not only by its performance but, more importantly, by how well it fits the individual's environment. The adaptive rationality approach completes this framework by additionally incorporating changes in the environment.
Key words: Rationality debate; Bounded rationality; Ecological rationality; Adaptive rationality
Economics and psychology, two important branches of the social sciences, take distinct approaches to predicting and formalizing human behavior. As a normative science, economics is mostly concerned with how people should make optimal decisions. Working from a different angle, psychology is mainly concerned with describing how decisions are actually made. A variety of work has been done at the interface between psychology and economics, including studies of bounded rationality (Gigerenzer & Selten, 2001, 2003; Simon, 1955), heuristics and biases in decision making (Kahneman & Tversky, 1996; Tversky & Kahneman, 1975), experimental economics (e.g., Hertwig, 1998; Starmer, 1999), and behavioral economics (e.g., Camerer, Loewenstein, & Rabin, 2003; Rabin, 1998).
In this article, I will review the debate and dialogue about rationality between economists and psychologists. First, a historical review of the notion of rationality in economics will show that the idea of rationality in economics is closely tied to the notion of the "economic man", whose capability in making decisions goes beyond the actual capability of the human mind. Second, I will review criticisms of the classic economic view of rationality, represented by the work of three different research groups: the heuristics and biases program of Kahneman, Tversky, and others (Gilovich, Griffin, & Kahneman, 2002); the notion of bounded rationality advocated by Herbert Simon (1955, 1956, 1982); and the more recent view of adaptive rationality proposed by Anderson (1990), Gigerenzer (2000), and Oaksford and Chater (1998). The bounded rationality and adaptive rationality approaches, though distinct, share the assumption that many aspects of human behavior can be understood as adaptively rational for an organism with limited resources operating in a structured environment. It is concluded that the adaptive rationality approach, and in particular its focus on the structure of the environment, is of vital importance to understanding behavior.
Rationality is a broad concept that typically encompasses the appropriate use of logic as well as "uncertain but sensible arguments" based on probability, expectation, personal experience, and the like. The rationality debate between economists and psychologists mainly concerns complete (unbounded) versus bounded rationality, but it also includes debate over whether observed deviations from conventional normative standards should be interpreted as "adaptive rationality" or as "irrational biases". The view of rationality in economics is undergoing some changes as a result of these debates.
The assumption of rationality lies at the heart of modern economic theory. The concept of rationality in economics, first introduced by Adam Smith (1776), is now generally viewed as the choice of optimal means to achieve a given end (Gerrard, 1993). In one key textbook of microeconomics (Frank, 2003), being rational is defined as making a choice if and only if its benefits exceed its costs. This notion of rationality is based on the four assumptions of neoclassical economics, personified in the concept of homo economicus (economic man): a decision maker characterized by self-interest, omniscience (having complete information), conscious deliberation (mental calculation of an optimized "as if" equivalent), and representativeness (i.e., homo economicus is representative of all decision makers). The assumption of rationality in normative economics amounts to the claim that agents should optimize. In positive economics, where the focus is on developing and testing economic theories, the premise of rationality is the hypothesis of maintained consistency (Gerrard, 1993). The concept of rationality has, however, changed over the course of the development of economics, and the following summary reviews the different notions of rationality that have accompanied the development of economic theories.
The concept of rationality as it appeared in The Wealth of Nations (Smith, 1776) was viewed in terms of the standards of economic production and trade. It implied that when people see a clear advantage in a particular course of action, they will act upon it. This notion of rationality is the rationality of everyday common sense. The logic behind Smith's work is that when every individual pursues solely his or her selfish goals, the market will reach its maximum efficiency. This assumption of rationality does not depend on an elaborate calculus of utility, nor does it assume any consistency in the factors that are taken into consideration when moving from one choice situation to another (Simon, 1997; Smith, 1937).
In Alfred Marshall's Principles of Economics (1920), a wider notion of rationality was developed to incorporate not only the study of wealth but also the study of the economic agent. Rationality requires the ability to forecast the future and to shape one's course with reference to distant aims. The emphasis is placed upon deliberation in decision making, which involves marginal analysis (Book IV) and the maximization of utility (Book III). From this point on, neoclassical economics was established, and economics became more mathematical in nature. The assumption of rationality approximated the assumption of optimality in choices and decisions. Marshall's neoclassical contemporaries, William Stanley Jevons (1871), Carl Menger (1871), and Léon Walras (1954), proclaimed that rationality is exemplified by utility maximization in a general equilibrium framework.
With John Maynard Keynes's General Theory of Employment, Interest and Money (1936), the concept of rationality departed from the key assumptions of the neoclassical framework. Keynes claimed that the neoclassical system represented "the way in which we should like our Economy to behave…But to assume that it actually does so is to assume our difficulties away" (p.34). He asserted that people do not possess complete rationality, that lapses from rationality of this sort bring about departures from the full employment of resources, and that these lapses could be remedied by appropriate governmental policies. In The General Theory, Keynes comments:
There is the instability due to the characteristic of human nature that a large proportion of our positive activities depend on spontaneous optimism rather than on a mathematical expectation, whether moral or hedonistic or economic…Most, probably, of our decisions to do something positive, the full consequences of which will be drawn out over many days to come, can only be taken as the result of animal spirits—a spontaneous urge to action rather than inaction, and not as the outcome of a weighted average of quantitative benefits multiplied by quantitative probabilities (pp.161-162).
Keynes pointed out the unrealistic nature of the rationality assumption in mainstream economics, but this line of thought was not pursued by others. In Essays in Positive Economics (1953), Milton Friedman returned to the neoclassical version of rationality. Acknowledging the unrealistic assumption underlying economic models—that decision makers would have to be "Laplacean Demons" to carry out the required optimization calculations—he proposed instead that economic agents make decisions "as if" they were applying complicated optimizations. He used examples such as the Newtonian laws governing freely falling objects, biological phenomena concerning the density of leaves around a tree, and the manner in which pool players strike the ball as analogies for how economic agents "appear to" make decisions "as if" they were following strategies derived from precise optimal calculations:
It is only a short step from these examples to the economic hypothesis that under a wide range of circumstances individual firms behave as if they were seeking rationally to maximize their expected returns, and had full knowledge of the data needed to succeed in this attempt; as if, that is, they knew the relevant cost and demand functions, calculated marginal cost and marginal revenue from all actions open to them, and pushed each line of action to the point at which the relevant marginal cost and marginal revenue were equal. (p.21).
Even though the assumption that an economic agent employs mathematical optimization in making every decision is clearly unrealistic, mainstream economists do accept that agents behave "as if" they are using optimization. Friedman's view of rationality became very popular among economists, who took the "as if" rational model as a useful approximation of human behavior.
The modern notion of rationality in Decision Theory (or Rational Choice Theory) is based on Savage's formalization of Expected Utility Theory (1954), which states that the decision maker chooses between risky or uncertain prospects by comparing their expected utility values. Expected utility is formalized as the multiplicative combination of outcome utilities and their respective probabilities. To be rational in decision making, one must have complete and transitive preferences; von Neumann-Morgenstern theory additionally requires that preferences satisfy independence, and Savage's framework extends these requirements from decision under risk to decision under uncertainty by replacing objective probabilities with subjective ones. These consistency assumptions allow expected utilities to be calculated and compared across all alternatives, and allow the most preferred option to be chosen by utility maximization. In order to have consistent preferences, a decision maker is assumed to have all information about all the options and their probabilities of occurrence (drawn from a probability distribution in the case of decision under risk, or from subjective probability in the case of decision under uncertainty), and to have the time and ability to weigh every choice against every other choice.
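In standard notation (a schematic illustration rather than Savage's own axiomatic presentation), a prospect offering outcomes x1, …, xn with probabilities p1, …, pn is evaluated as

```latex
EU(x_1, p_1; \ldots; x_n, p_n) \;=\; \sum_{i=1}^{n} p_i \, u(x_i),
```

where u(·) is the decision maker's utility function; under uncertainty, the probabilities are the agent's subjective probabilities, and the rational choice is the prospect with the highest expected utility.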
In general, the approaches reviewed above share a concept of rationality developed within economics bound by the idealistic, logical, deductive, and normative qualities of homo economicus. This notion of rationality is "useful in generating solutions to theoretical problems, but it demands much of human behavior—much more in fact than it can deliver" (Arthur, 1994, p.406). Criticisms about the general assumptions that this "economic man" represents in economics have come mainly from three areas. The following sections review these criticisms.
Around the time that Expected Utility Theory (EUT) became the dominant model of individual behavior in the economics literature, a large body of evidence was accumulating that human behavior deviates systematically from the idealized behavior assumed by economists who believe that decision makers maximize expected utility.
3.1 Heuristics and Biases
This evidence (Gilovich, Griffin, & Kahneman, 2002; Kahneman, Slovic, & Tversky, 1982), collectively dubbed Kahneman and Tversky's Heuristics and Biases program, used a broad array of problems to demonstrate experimentally that, under quite ordinary circumstances, people reason and make decisions in ways that systematically deviate from what would be predicted by the basic rules of logic and probability theory. Specifically, people do not have consistent preferences, and their preferences may vary with contextual settings or mental representations. Phenomena such as the endowment effect (Kahneman, Knetsch, & Thaler, 1991), loss aversion (ibid.), the status quo bias (ibid.), framing effects (Tversky & Kahneman, 1986), and preference reversals (Slovic & Lichtenstein, 1983) are well-established anomalies that violate the consistency assumption of Expected Utility Theory. On the basis of a series of studies, Tversky and Kahneman (1974) concluded that "people rely on a limited number of heuristic principles which reduce the complex tasks of assessing probabilities and predicting values to simpler judgmental operations. In general, these heuristics are quite useful, but sometimes they lead to severe and systematic errors" (p.1124).
The Heuristics and Biases program showed that, because people employ a handful of heuristics when making decisions, deviations from normative models are systematic and predictable. Kahneman and Tversky (1973) argued that human choices are not consistent and transitive, as they would be if a utility function existed. Their studies undermine the justification for the "as if" approximation in normative economic models and call for alternative models to be proposed.
Prospect Theory (Kahneman & Tversky, 1979) offered an alternative framework for judgment and choice under risk. According to Prospect Theory, the decision process consists of an editing phase and an evaluation phase. In the editing phase, prospects are coded in terms of gains and losses, common features are combined, and riskless components are segregated. The core of Prospect Theory, however, lies in the evaluation phase. A reference-dependent value function, together with a probability weighting function, is used to transform outcomes and probabilities into subjective values and decision weights. The value function is concave for gains and convex for losses. The weighting function assumes the overweighting of small probabilities and the underweighting of large probabilities typical of subjective judgments. The outputs of these two functions are multiplied in a fashion similar to the calculations used in EUT, and preferences are predicted by comparing the resulting values. Prospect Theory accounted for a range of empirical observations on the differential weighting of gains and losses, as well as of high and low probabilities; its formulation thus resolved several violations of Subjective Expected Utility Theory (SEU). The theory provides a good descriptive account of decision making under risk, but it remains only a descriptive account, because it does not explain why people employ such heuristics.
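Schematically, the evaluation phase can be written as follows; the explicit functional form shown for the value function is a later, commonly used parameterization given here only to illustrate the shape of the curves, not part of the 1979 formulation itself:

```latex
V(x_1, p_1; \ldots; x_n, p_n) = \sum_{i} w(p_i)\, v(x_i), \qquad
v(x) =
\begin{cases}
x^{\alpha}, & x \ge 0,\\
-\lambda\,(-x)^{\beta}, & x < 0,
\end{cases}
\quad 0 < \alpha, \beta \le 1,\ \lambda > 1,
```

where v is the reference-dependent value function (concave for gains, convex for losses, and steeper for losses because λ > 1) and w is the probability weighting function that overweights small probabilities and underweights moderate and large ones.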
Similar to work by Evans and Over (1996), which distinguished between two types of rationality, Kahneman (2003) emphasized a distinction between two types of mental processes: those belonging to the intuitive, automatic system (which is error-prone), and those belonging to the serial, effortful, deliberate system that follows strict rules.
When decision heuristics are treated as biases, the evident conclusion is that humans are not rational, because they systematically display reasoning errors and inconsistency in preferences relative to normative standards. As with optical illusions, humans are easily susceptible to cognitive illusions that cannot be overcome (Kahneman & Tversky, 1996). The Heuristics and Biases program preserved the normative standards of neoclassical economics while developing the view that the human mind normally operates using heuristics. This extreme conclusion of the heuristics and biases program, that deviations from SEU theory are biased and irrational, has met with severe criticism from several researchers. Rather than detailing these numerous criticisms, I will focus instead on the alternative approach offered by bounded rationality.
3.2 Bounded Rationality
In an attempt to understand complex human decision making, Herbert Simon (1955, 1979, 1982, 1992, 1997a) was the first to chart both how and why cognitive reality departs from the formalized ideal decision environment assumed by normative theories of Economics. As Simon (1975) noted:
The capacity of the human mind for formulating and solving complex problems is very small compared with the size of the problems whose solution is required for objectively rational behavior in the real world—or even for a reasonable approximation of such objective rationality (p.198).
This fundamental limitation on human information processing gives rise, according to Simon, to satisficing behavior: the tendency to settle for satisfactory, rather than optimal, courses of action. In terms of bounded rationality, people satisfice with respect to their aspiration level instead of optimizing with respect to all information about the world.
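As a minimal illustration of the contrast (the option values, aspiration level, and function names below are assumptions made for the example, not drawn from Simon's papers), a satisficer searches alternatives one at a time and accepts the first that meets the aspiration level, whereas a maximizer must evaluate every alternative on a common scale before choosing:

```python
from typing import Iterable, Optional


def satisfice(options: Iterable[float], aspiration: float) -> Optional[float]:
    """Return the first option whose value meets the aspiration level.

    Search stops as soon as a "good enough" option appears, so the
    remaining alternatives never need to be evaluated or compared.
    """
    for value in options:
        if value >= aspiration:
            return value
    return None  # no satisfactory option was encountered


def maximize(options: Iterable[float]) -> float:
    """Return the best option; every alternative must be evaluated."""
    return max(options)


if __name__ == "__main__":
    payoffs = [3.2, 5.1, 4.8, 9.0, 2.5]
    print(satisfice(payoffs, aspiration=5.0))  # 5.1 (accepted after two inspections)
    print(maximize(payoffs))                   # 9.0 (requires all five inspections)
```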
"One requirement of optimization not shared by satisficing is that all alternatives must be measurable in terms of a common utility function" (Simon, 1986, p.210). Simon (1956; 1979) pointed out that blocks of an organism' s time can be allocated to activities related to individual needs (separate means-ends chains) without creating any problem of overall allocation or coordination, or the need for any general utility function.
Simon (1983) described Subjective Expected Utility Theory as "a beautiful object deserving a prominent place in Plato's heaven of ideas" (p.13), but pointed out several ways in which real-world decision making falls a long way short of this ideal. Whereas SEU assumes that decision makers have an undisturbed view of all possible scenarios of action, real human decision making is almost invariably focused upon specific matters. The theory requires that the decision maker comprehend the entire range of possible alternatives, yet decision makers are most likely to contemplate only a few of the available alternatives (Fischhoff, Slovic, & Lichtenstein, 1978). Human decision making is constrained by its "keyhole" view of the problem space, which is what Simon (1975) coined "bounded rationality".
The bounds of rationality are dictated by the complexity of the world in which we live: the incompleteness and inadequacy of human knowledge, the inconsistencies of individual preference and belief, the conflicts of value among people and groups of people, and the inadequacy of the computations we can carry out, even with the aid of the most powerful computers (see Simon, 1956).
In contrast with the assumptions of an economic man, Herbert Simon proposed a model of a thinking man, who makes decisions by "satisficing" rather than "maximizing". Simon (1979) emphasized the following qualities of a thinking man:
Thinking Man is capable of expressing his cognitive skills in a wide range of task domains: learning and remembering, problem solving, inducing rules and attaining concepts, perceiving and recognizing stimuli, understanding natural language, and others. An information-processing model of Thinking Man must contain components capable of humanly intelligent behavior in each of these domains; and, as these models are created, they must gradually be merged into a coherent whole. (p.10).
This account of bounded rationality, with which Thinking Man is equipped, is more closely related to psychological theories of perception, memory, learning, and cognition. It calls for theories that address not only the cognitive mechanisms of a decision maker with limited time and knowledge, but also the structure of the environment to which the decision maker adapts. In Simon's (1956) terms, "Human rational behavior is shaped by a scissors whose two blades are the structure of task environments and the computational capabilities of the actor" (p.129). Thus, one emphasis of the bounded rationality approach is the role of the decision environment. Recent accounts of adaptive rationality focus particularly on this aspect of rationality and evaluate rational behavior in light of the structure of the environment, ecological as well as contextual.
3.3 Adaptive Rationality
The concept of adaptive rationality, or ecological rationality, is related to the notion of bounded rationality. In particular, great emphasis is placed upon evaluating human behavior in terms of its success in its natural environment rather than against normative standards. The central idea behind adaptive rationality is that people use heuristics to solve everyday problems, and that human memory and reasoning, which evolved to facilitate the use of these heuristics, are adaptive and successful within a representative natural environment, even though these heuristics sometimes produce behaviors that are discordant with the laws of logic, probability theory, expected utility theory, and rational choice maxims. As Cosmides and Tooby (1994) pointed out, "Forms follow function: the properties of an evolved mechanism reflect the structure of the task it evolved to solve" (p.328). These heuristics are "smart" because they exploit the structure of the environment; they dispense with optimization and, for the most part, with calculations of probabilities and utilities. The fact that such heuristics do not fit into the framework of decision theory leads to the question of whether traditional normative standards should be used to evaluate human rationality.
In particular, Gerd Gigerenzer and colleagues (e.g., Simple Heuristics That Make Us Smart, 1999; Adaptive Thinking, 2000; Bounded Rationality: The Adaptive Toolbox, 2001) established research programs investigating the adaptive nature of human behavior, with a focus on the use of fast and frugal heuristics. Gigerenzer (2000) compared the human mind to an "adaptive toolbox", a repertoire of such heuristics. Their central argument is that "fast and frugal" strategies can perform as well as, if not better than, full optimization, while operating at a much lower cognitive cost.
Fast and frugal heuristics such as the recognition heuristic and the "take-the-best" strategy are extremely effective in tasks such as judging which of two cities is larger, where the distribution of information in the environment is skewed and the cognitive resources of the decision maker are limited. Gigerenzer and Goldstein (1996) argued that the sheer success of human inferential ability in evolutionary terms is "an existence proof" for adaptive rationality, and that such rationality need not be judged in terms of rational norms. They further challenged the validity of classical rationality as the standard for measuring rationality, and suggested that rationality should instead be measured by its success in solving ecologically relevant problems.
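The following sketch illustrates how these two heuristics can be combined for a paired comparison such as the city-size task; the cities, cue names, cue values, and validity ordering are hypothetical, chosen only to make the example self-contained, and do not reproduce Gigerenzer and Goldstein's materials.

```python
"""A minimal sketch of the recognition heuristic and the take-the-best (TTB)
heuristic, loosely following the logic described by Gigerenzer and Goldstein
(1996). All data below are illustrative assumptions."""

# Hypothetical knowledge base: whether each city is recognized, plus binary
# cues (1 = positive, 0 = negative). Unknown cues are simply absent.
CITIES = {
    "Munich":  {"recognized": True,  "cues": {"capital": 0, "team": 1, "exposition": 1}},
    "Hamburg": {"recognized": True,  "cues": {"capital": 0, "team": 1, "exposition": 0}},
    "Herne":   {"recognized": False, "cues": {}},
}

# Cues ordered by (assumed) ecological validity, highest first.
CUE_ORDER = ["capital", "team", "exposition"]


def choose_larger(a: str, b: str) -> str:
    """Infer which of two cities is larger, using recognition first, then TTB."""
    rec_a, rec_b = CITIES[a]["recognized"], CITIES[b]["recognized"]

    # Recognition heuristic: if exactly one alternative is recognized, choose it.
    if rec_a != rec_b:
        return a if rec_a else b

    # Take-the-best: inspect cues in order of validity and stop at the first
    # cue that discriminates; missing cue values are treated as negative here,
    # a simplification of the original scheme.
    for cue in CUE_ORDER:
        va = CITIES[a]["cues"].get(cue, 0)
        vb = CITIES[b]["cues"].get(cue, 0)
        if va != vb:
            return a if va > vb else b

    return a  # no cue discriminates: guess (deterministically, for brevity)


if __name__ == "__main__":
    print(choose_larger("Munich", "Herne"))    # recognition alone decides
    print(choose_larger("Munich", "Hamburg"))  # TTB decides on the third cue
```

The point of the sketch is the lexicographic stopping rule: cues are searched in order of validity and the decision is made on the first cue that discriminates, without weighting or summing the remaining cues.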
Chater, Oaksford, Nakisa, and Redington (2003) examined the viability of fast and frugal heuristics and argued that the adaptive rationality approach can be consistent with classical rationality assumptions (such as probability theory and decision theory). They further assessed why fast and frugal heuristics are rational heuristics, using the rational analysis method (Anderson, 1990). Evaluating the take-the-best heuristic (TTB) against normative criteria, Chater et al. found that TTB performs impressively, especially in a frugal information environment, compared with other standard algorithms such as exemplar-based models (e.g., Nosofsky, 1990), connectionist networks (Rumelhart & McClelland, 1986), and decision trees (Quinlan, 1993). TTB is also impressive because it represents a process that is comparable to human performance. However, detailed analysis revealed that, although the TTB heuristic is fast and frugal, in some cases other algorithms are just as cognitively plausible.
Payne, Bettman, and Johnson (1993) showed that fast and frugal heuristics lie on a continuum of strategies that people employ daily. These heuristics are applicable to a wide range of reasoning and judgment tasks (see Czerlinski, Gigerenzer, & Goldstein, 1999; Dhami & Ayton, 2001 for some interesting examples). Fast and frugal heuristics like TTB thrive in decision environments where information is scarce and time is pressing; but given enough time and resources, people may not necessarily choose them. Oppenheimer (2003) questioned why the recognition heuristic, which is fundamental to the adaptive toolbox, succeeds. He suggested that its success may be attributable to people using knowledge associated with the non-compensatory cue of recognition rather than to recognition alone. In the city-size example, Oppenheimer hypothesized that the recognition heuristic works because people know that the cities they recognize are large. To test this hypothesis, he used local cities that are recognized but known to be small, together with fictional cities that participants could not recognize. He found that, on average, the recognized local cities were chosen as the smaller city, contrary to what Gigerenzer and Goldstein's finding would predict if taken at face value. Oppenheimer's finding highlights the importance of the ecological validity of cues: the structure of the environment in which information is retrieved may determine both the choice of heuristics and their success rate.
Our memory system is one aspect of adaptive cognition that is optimized to the structure of the environment (Anderson & Schooler, 1991). The rate of forgetting an item in memory is optimized to the likelihood of encountering that item in the world. A rational analysis of information encoding in memory reveals that forgetting is adaptive, because it reflects the pattern with which certain information appears and reappears in the environment (Schooler & Hertwig, 2005).
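As a rough formalization of this relationship (the notation here is illustrative, not Anderson and Schooler's own), the probability that a piece of information will be needed again, like the retention function of memory itself, falls off approximately as a power function of the time since the item last occurred:

```latex
P(\text{item needed at delay } t) \;\approx\; A\, t^{-d}, \qquad A > 0,\ d > 0,
```

so a forgetting curve with the same power-law shape keeps an item's retrievability roughly matched to the environmental demand for it.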
Oaksford and Chater (1998) reviewed the adaptiveness of human behavior in light of the structure of the task environment. Aspects of adaptive cognition, including categorization, information search, and the selection of evidence in reasoning, may all be viewed as optimizing the amount of information gained at a fixed cost. The rational analysis approach to cognition can be seen as both descriptive and normative, because its hypotheses can be tested against empirical data; it can explain both how the mind works and why it is successful. This direction of research differs from deductive neoclassical rationality in the a priori assumptions it holds: rational analysis requires that accounts of the mind be not only normatively justified but also descriptively adequate.
The important differences in conceptualizations of rationality rest on a fundamental distinction: in economics, rationality is viewed in terms of the choices it produces; in the other social sciences, it is viewed in terms of the processes it employs (Simon, 1976, 1982, 1997).
To be rational from the point of view of economics, with its embodiment in modern decision theory and probability theory, is to be deductive, logical, and consistent. When human behavior is evaluated against these standards, systematic deviations from normative answers are often observed (as revealed by the heuristics and biases program). The heuristics and biases program treated these deviations as human biases and suggested modifications to existing rational models (such as Expected Utility Theory) to incorporate them. In contrast, Gigerenzer (1996) argued that the norms for evaluating reasoning and decisions have been drawn too narrowly, and that the reasoning tasks used to evaluate human rationality are devoid of context and content (Hertwig, Ortmann, & Gigerenzer, 1997). It is not irrational to make such errors, because people are adaptive thinkers who draw inferences from a statistical world in which inductive information is valuable (e.g., Gigerenzer & Goldstein, 1996; Oaksford & Chater, 1994). Recent work has shifted the focus to notions of bounded and adaptive rationality, which aim to explain why people use heuristics. Crucial to this shift is the emphasis on the structure of the environment. In a world of uncertainties, rational judgment will largely depend on making correct inferences about the distribution of information in the environment.
[1] Anderson, J. R. The adaptive character of thought[M]. Hillsdale, NJ: Erlbaum, 1990.
[2] Anderson, J. R., & Schooler, L. J. Reflections of the environment in memory[J]. Psychological Science, 1991(2), 396-408.
[3] Arthur, W. B. Inductive reasoning and bounded rationality[J]. American Economic Review, 1994(84), 406-411.
[4] Camerer, C. F., Loewenstein, G., & Rabin, M. (Eds.). Advances in behavioral economics[M]. Princeton, NJ: Princeton University Press, 2003.
[5] Chater, N., Oaksford, M., Nakisa, R., & Redington, M. Fast, frugal and rational: How rational norms explain behavior[J]. Organizational Behavior and Human Decision Processes, 2003(90), 63-86.
[6] Cosmides, L., & Tooby, J. Beyond intuition and instinct blindness: Toward an evolutionarily rigorous cognitive science[J]. Cognition, 1994(50), 41-77.
[7] Czerlinski, J., Gigerenzer, G., & Goldstein, D. G. How good are simple heuristics? In G. Gigerenzer, P. M. Todd, & The ABC Research Group (Eds.), Simple heuristics that make us smart[M]. Oxford, UK: Oxford University Press, 1999: 97-118.
[8] Dhami, M. K., & Ayton, P. Bailing and jailing the fast and frugal way[J]. Journal of Behavioral Decision Making, 2001(14), 141-168.
[9] Evans, J. S. B. T., & Over, D. E. Rationality and reasoning[M]. Hove, Sussex: Psychology Press, 1996.
[10] Fischhoff, B., Slovic, P., & Lichtenstein, S. Fault trees: Sensitivity of estimated failure probabilities to problem representation[J]. Journal of Experimental Psychology: Human Perception and Performance, 1978(4), 330-334.
[11] Frank, R. H. Microeconomics and behavior (5th ed.)[M]. New York: McGraw-Hill, 2003.
[12] Friedman, M. The methodology of positive economics. In M. Friedman (Ed.), Essays in positive economics[M]. Chicago: University of Chicago Press, 1953.
[13] Gerrard, B. The economics of rationality[M]. London: Routledge, 1993.
[14] Gigerenzer, G. On narrow norms and vague heuristics: A reply to Kahneman and Tversky[J]. Psychological Review, 1996(103), 592-596.
[15] Gigerenzer, G., & Goldstein, D. G. Reasoning the fast and frugal way: Models of bounded rationality[J]. Psychological Review, 1996(103), 650-669.
[16] Gigerenzer, G. Adaptive thinking: Rationality in the real world[M]. London: Oxford University Press, 2000.
[17] Gigerenzer, G., & Selten, R. Bounded rationality: The adaptive toolbox[J]. Psychology and Marketing, 2003(20), 87-92.
[18] Gilovich, T., Griffin, D., & Kahneman, D. (Eds.). Heuristics and biases: The psychology of intuitive judgment[M]. New York: Cambridge University Press, 2002.
[19] Hertwig, R. Psychology, experimental economics, and the question of what good experimentation is[J]. Zeitschrift für Experimentelle Psychologie, 1998(45), 2-19.
[20] Hertwig, R., Ortmann, A., & Gigerenzer, G. Deductive competence: A desert devoid of content and context[J]. Current Psychology of Cognition, 1997(16), 102-107.
[21] Jevons, W. S. The theory of political economy[M]. Harmondsworth: Penguin, 1871/1970.
[22] Kahneman, D. A perspective on judgment and choice: Mapping bounded rationality[J]. American Psychologist, 2003(58), 697-720.
[23] Kahneman, D., Knetsch, J. L., & Thaler, R. H. The endowment effect, loss aversion, and status quo bias[J]. Journal of Economic Perspectives, 1991(5), 193-206.
[24] Kahneman, D., Slovic, P., & Tversky, A. (Eds.). Judgment under uncertainty: Heuristics and biases[M]. Cambridge, UK: Cambridge University Press, 1982.
[25] Kahneman, D., & Tversky, A. Prospect theory: An analysis of decision under risk[J]. Econometrica, 1979(47), 263-291.
[26] Kahneman, D., & Tversky, A. On the reality of cognitive illusions[J]. Psychological Review, 1996(103), 582-591.
[27] Kahneman, D., & Tversky, A. (Eds.). Choices, values, and frames[M]. Cambridge, UK: Cambridge University Press, 2000.
[28] Katz, M. L., & Rosen, H. S. Microeconomics (3rd ed.)[M]. New York: McGraw-Hill, 1998.
[29] Keynes, J. M. The general theory of employment, interest and money[M]. London: Macmillan, 1936.
[30] Marshall, A. Principles of economics (8th ed.)[M]. New York: Macmillan, 1920.
[31] Menger, C. Principles of economics[M]. Institute for Humane Studies Series in Economic Theory, 1871/1994.
[32] Mosteller, F., & Nogee, P. An experimental measurement of utility[J]. Journal of Political Economy, 1951(54), 371-404.
[33] Nosofsky, R. M. Relations between exemplar similarity and likelihood models of classification[J]. Journal of Mathematical Psychology, 1990(34), 393-418.
[34] Oaksford, M., & Chater, N. A rational analysis of the selection task as optimal data selection[J]. Psychological Review, 1994(101), 608-631.
[35] Oaksford, M., & Chater, N. (Eds.). Rational models of cognition[M]. New York: Oxford University Press, 1998.
[36] Oppenheimer, D. M. Not so fast! (and not so frugal!): Rethinking the recognition heuristic[J]. Cognition, 2003(90), B1-B9.
[37] Payne, J. W., Bettman, J. R., & Johnson, E. J. The adaptive decision maker[M]. New York: Cambridge University Press, 1993.
[38] Quinlan, J. R. C4.5: Programs for machine learning[M]. Los Altos, CA: Morgan Kaufmann, 1993.
[39] Rabin, M. Psychology and economics[J]. Journal of Economic Literature, 1998(36), 11-46.
[40] Rumelhart, D. E., McClelland, J. L., & The PDP Research Group. Parallel distributed processing: Explorations in the microstructure of cognition (Vols. 1 & 2)[M]. Cambridge, MA: MIT Press, 1986.
[41] Savage, L. J. The foundations of statistics[M]. New York: Wiley, 1954.
[42] Schooler, L. J., & Hertwig, R. How forgetting aids heuristic inference[J]. Psychological Review, 2005(112), 610-628.
[43] Shepard, R. N. Evolution of a mesh between principles of the mind and regularities of the world. In J. Dupré (Ed.), The latest on the best: Essays on evolution and optimality[M]. Cambridge, MA: MIT Press, 1987.
[44] Simon, H. A. A behavioral model of rational choice[J]. Quarterly Journal of Economics, 1955(69), 99-118.
[45] Simon, H. A. Rational choice and the structure of environments[J]. Psychological Review, 1956(63), 129-138.
[46] Simon, H. A. Information processing models of cognition[J]. Annual Review of Psychology, 1979(30), 363-396.
[47] Simon, H. A. Models of bounded rationality[M]. Cambridge, MA: MIT Press, 1982.
[48] Simon, H. A. Rationality in psychology and economics[J]. Journal of Business, 1986(59), S209-S224.
[49] Slovic, P., & Lichtenstein, S. Preference reversals: A broader perspective[J]. American Economic Review, 1983(73), 596-605.
[50] Smith, A. The wealth of nations (5th ed.)[M]. New York: The Modern Library, 1776.
[51] Starmer, C. Experimental economics: Hard science or wasteful tinkering?[J]. Economic Journal, 1999(109), F5-F15.
[52] Tversky, A., & Kahneman, D. Judgment under uncertainty: Heuristics and biases[J]. Science, 1974(185), 1124-1131.
[53] Tversky, A., & Kahneman, D. Rational choice and the framing of decisions[J]. Journal of Business, 1986(59), S251-S278.
[54] Walras, L. Elements of pure economics (W. Jaffé, Trans.)[M]. London: George Allen & Unwin, 1954.
*Jing Qian, Department of Psychology, School of Social Sciences, Tsinghua University, Beijing, China.
Fund project: This research was funded by the Chinese National Science Foundation (CNSF: 71401089) and Tsinghua Research Initiation Fund (20151080447).
Contemporary Social Sciences, 2016, No. 1