The Computational Economists
Economics faces six fundamental questions: (i) will a society organized on the principles of exchange (the Market Economy) stay composed, or will it fall apart (the question of the existence of equilibrium)? (ii) will such an equilibrium be unique (a multiplicity of equilibria poses difficult and embarrassing questions)? (iii) will such an equilibrium be robust (the question of the stability of equilibrium)? (iv) will such an economy (society) be efficient? (v) will it grow or expand forever? and (vi) will it be just? The classical economists, Adam Smith in particular, answered all these questions affirmatively, using a characteristic methodology. Karl Marx, however, challenged the entire structure of faith in the merits of the exchange economy and shattered all optimism regarding the said order. The neoclassicists, mostly using their own new (mathematical, marginalist, rationalistic, atomistic, hedonistic, etc.) methodology, set out to prove that the answers to all six questions were in the affirmative. Keynesian economics explored the possibilities and implications of disequilibrium as well as low-level equilibria (such as the under-employment equilibrium), and Neo/New-Keynesian economics, too, used the concept of equilibrium as a basic tool of analysis. All these variants of economic theory were conceptual. The heterodox variants of economics (e.g. Institutional economics, Marxian economics) also were theoretical and used the top-down approach to develop their theories.
Prompted mainly by the economic malfunctioning experienced in the aftermath of World War I, culminating in the Great Depression, econometrics was born amid dissatisfaction with deductively formulated economic theories. In the early days of econometrics (during the 1930s) there arose two schools of thought: one that accepted the given Neo-Classical theories and set out to test them by developing and using statistical methods on empirical data, and another (led by W.C. Mitchell, who was an Institutional economist and valued the Neo-Classical economic theories very little) that proposed to develop and use statistical methods on empirical data so as to formulate economic theories inductively. However, the latter (Mitchell's) approach was ridiculed (by economists of the former school, such as T.C. Koopmans) as "measurement without theory" (also see Simkins, 1999) and thus remained marginalized. Indeed, economists like Joseph Schumpeter did not support the then contemporary trends in quantitative/econometric analysis. For Schumpeter: "...the most serious shortcoming ...[is]... that nobody seems to understand or even to care precisely how industries and individual firms rise and fall and how their rise and fall affects the aggregates." Schumpeter felt that without understanding how individual actions form the aggregates, "this abstraction would be revealed as unwarranted, and much econometric work on unanalysed aggregates would have to be rethought or scrapped." (E.S. Andersen, 2004). Ragnar Frisch, who was perhaps the founding father of econometrics, was himself unhappy with the post-1950 development of econometrics, which, in his view, had become more of a "playometrics". Francisco Louçã's "The Years of High Econometrics" presents the early history of econometrics in a very engaging style (also see Mary Morgan's (1990) History of Econometric Ideas).
It would not be inappropriate to comment that the econometricians of the post-1950 period stifled econometrics by binding it tightly to the dominant Neo-Classical school.
It is pertinent to note, however, that Mitchell's points (which had some roots in Peirce, 1876/1958) did not die out completely, though they remained largely marginalized by economists. Box and Wilson (1951) and Box and Draper (2007) developed response surface analysis, which was well received in statistics and engineering. Machine learning processes, especially unsupervised learning and reinforcement learning procedures, are used to find hidden patterns in data and may be used to formulate new economic theories. However, one does not find their significant acceptance in economic analyses.
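The unsupervised-learning idea mentioned above - letting the data reveal hidden structure rather than imposing a theory on it - can be sketched with a toy example. The snippet below is a hypothetical illustration only: it applies a plain k-means procedure (written from scratch, no external libraries) to a small synthetic series containing two hidden "regimes"; the data values, cluster count, and function names are all assumptions made for illustration.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain 1-D k-means: group observations into k clusters by nearest mean."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)          # pick k distinct starting centers
    for _ in range(iters):
        # Assignment step: each point joins the cluster of its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda i: abs(p - centers[i]))
            clusters[i].append(p)
        # Update step: each center moves to the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Synthetic observations with two hidden regimes
# (think of, say, low- and high-inflation periods).
data = [1.0, 1.2, 0.9, 1.1, 5.0, 5.3, 4.8, 5.1]
centers = kmeans(data, 2)
print(centers)  # two cluster means, recovered from the data alone
```

Nothing here tells the algorithm that two regimes exist at particular levels; the grouping emerges from the data, which is the inductive, bottom-up spirit the paragraph describes.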
Thus, the development of economics from below - the bottom-up approach - could not take firm root. It is pertinent to observe that W.C. Mitchell as well as J. Schumpeter foreshadowed the agent-based methodology - as well as the data-mining and artificial neural network approaches - to develop economic theories (as is being done nowadays), but in their time this was not possible, since computers capable of supporting research in that direction had not yet been invented.
It may be noted that the Neo-Classical economists set out to derive the laws of economics axiomatically, but in so doing they undermined themselves by distancing their economics from reality. The real world is too complex to be axiomatized. They also minimized the framework of rules within which individuals work and make decisions. The game-theoretic economists did recognize the rules of behaviour explicitly, but soon found that near-reality game-theoretic models could be mathematically intractable. Game theory exhibited the zenith of deductive reasoning and its failure to capture reality. On the other hand, the heterodox economists were also deductive, and too descriptive, and thus permissive of a loose structure of reasoning. The failure of deductive reasoning therefore invited an application of inductive reasoning, in which it may be possible to observe the evolution of economic laws within an artificially given framework of rules of conduct - rules that may be changed at will to observe, once again, their effects on the conduct of individuals and on the laws of the evolving economy. Such an approach required an intensive use of computation.
Computational economics explores the intersection of economics and computation. Areas encompassed under computational economics include agent-based computational modeling, computational econometrics and statistics, computational finance, computational modeling of dynamic macroeconomic systems and of transaction costs, computational tools for the design of automated Internet markets, programming tools specifically designed for computational economics, and pedagogical tools for the teaching of computational economics. Some of these areas are unique to computational economics, while others extend traditional areas of economics by solving problems that are difficult to study without the use of computers.
Of particular interest is the use of computation for building economic theories (or testing given theories) by the bottom-up approach. This approach uses simulation (with a design suited to the behavioural assumptions) to observe whether a model yields the expected results. Agent-Based Computational Economics (ACE) is the computational study of economic processes modeled as dynamic systems of interacting agents. Here "agent" refers broadly to a bundle of data and behavioral methods representing an entity that constitutes part of a computationally constructed world. Agents can represent individuals (e.g. people), social groupings (e.g. firms), biological entities (e.g. growing crops), and/or physical systems (e.g. transport systems). The ACE modeler provides the initial configuration of a computational economic system comprising multiple interacting agents; starting from these initial conditions, the model develops forward through time driven solely by agent interactions, while the modeler steps back to observe the development of the system without further intervention. In particular, system events should be driven by agent interactions without the external imposition of equilibrium conditions (which means that ACE parts with the methodological equilibrium presumed by Neo-Classical economics). In brief, agent-based models incorporate five building blocks: (i) the agents and their dispositions, (ii) the social rules and norms to be followed by the agents in interaction with fellow members or institutions, (iii) the environment, including the resources, (iv) the technology, including the rules that govern the transformation of inputs into output, and (v) emergence, encompassing all four preceding blocks. These blocks and their elements act and interact with each other.
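The ACE workflow just described - specify agents and their behavioural methods, give an initial configuration, then let pairwise interactions drive the system forward with no equilibrium imposed from outside - can be sketched minimally. The model below is a hypothetical toy (a random bilateral-transfer economy), not any particular published ACE model; all class names and parameter values are illustrative assumptions.

```python
import random

class Agent:
    """A minimal 'agent': a bundle of data (wealth) and a behavioural method (trade)."""
    def __init__(self, wealth):
        self.wealth = wealth

    def trade(self, other):
        # Interaction rule: transfer one unit of wealth to the partner,
        # provided the giver can afford it. No equilibrium is imposed.
        if self.wealth > 0:
            self.wealth -= 1
            other.wealth += 1

def run_model(n_agents=100, steps=10_000, seed=42):
    """Initial configuration set by the modeler; thereafter, interactions only."""
    rng = random.Random(seed)
    agents = [Agent(wealth=10) for _ in range(n_agents)]  # equal endowments
    for _ in range(steps):
        giver, receiver = rng.sample(agents, 2)  # random bilateral encounter
        giver.trade(receiver)
    return sorted(a.wealth for a in agents)

wealths = run_model()
# Total wealth is conserved by construction; the final (typically unequal)
# distribution is an emergent, bottom-up outcome of the interaction rule.
print(sum(wealths), wealths[0], wealths[-1])
```

Note how the sketch mirrors the five building blocks in miniature: the agents and their disposition (the `Agent` class), the rule of interaction (`trade`), the environment and endowments (the initial configuration), and the emergent wealth distribution that the modeler merely observes.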
The ACE has yielded many interesting and useful conclusions.
| The Computational Economists |  |
|---|---|
| W. Brian Arthur | William A. Brock |
| Santa Fe Institute | Leigh Tesfatsion |
| Kenneth L. Judd | Hans M. Amman |
| Joshua M. Epstein | Robert Axtell |
| Mitchel Resnick | Robert Axelrod |
Computational Economics is attracting many new authors. Most of their works may be freely downloaded.
References:
Peirce, C. S. (1876). "Note on the Theory of the Economy of Research". Coast Survey Report: 197-201 (Appendix No. 14). Reprinted in Collected Papers of Charles Sanders Peirce, Vol. 7 (1958), paragraphs 139-157, and in Operations Research 15(4), July-August 1967: 643-648.
Box, G. E. P. and Wilson, K. B. (1951). On the Experimental Attainment of Optimum Conditions (with discussion). Journal of the Royal Statistical Society, Series B, 13(1): 1-45.
Box, G. E. P. and Draper, N. (2007). Response Surfaces, Mixtures, and Ridge Analyses, Second Edition [of Empirical Model-Building and Response Surfaces, 1987]. Wiley.