## Saturday, January 29, 2011

### Entrepreneurial Profits In A Classical Model

I came upon this passage recently:
"Thus, in the short run, the entrepreneurs introducing new techniques would reap supernormal profits. [Suppose] there is continuous technical change in the system, which Marx assumes to be in the nature of capitalist competition and a requirement for the maintenance of the 'reserve army of labour', then there arises a permanent income category that cannot be accounted for by labour-time accounting... Though Pareto does not clearly separate such entrepreneurial income from returns to capital in general, or from the notion of productivity of capital, it is plain that he foreshadows the idea that was later developed by Schumpeter...as an explanation for positive profits in a capitalist economy." -- Ajit Sinha (Theories of Value from Adam Smith to Piero Sraffa, Routledge (2010) pp. 214-215.)
This suggests to me a puzzle: can I create a model in which entrepreneurs make profits even though the workers are paid what would be the entire net output if the technology in use during the period in which they are paid were to persist unchanged? This post demonstrates that Sinha's comment is well-founded.

2.0 A Model
2.1 The Technology
Consider a simple economy in which a single commodity, corn, is produced each year. Workers produce the annual output from inputs of (seed) corn and their labor. The technology is defined by:
• a0(t): the labor (in person-years) needed as input per bushel corn produced in the tth year.
• a1(t): the (seed) corn needed as input per bushel corn produced in the tth year.
The coefficients of production evolve as in the following two equations:
a0(t) = e^(-λ0 t)
a1(t) = c e^(-λ1 t)
where the positive constants λ0 and λ1 are the rate of decrease in the labor and (seed) corn inputs, respectively. I impose the condition that the quantity harvested must exceed the quantity of seed corn planted in the spring:
0 < c < 1

2.2 Quantity Flows
Let Q(t) be the bushels of corn produced during the tth year and available after the harvest at the end of the year. Assume:
Q(t) = e^(λ0 t)
The labor employed each year is a0(t)Q(t), that is, one person-year. The seed corn, K(t), required for planting at the start of the tth year is:
K(t) = a1(t)Q(t) = c e^((λ0 - λ1) t)
The seed corn decreases each year if and only if the rate of decrease of the (seed) corn input per bushel corn produced exceeds the rate of decrease of the labor input per bushel corn produced:
λ0 < λ1

The surplus of the corn harvest, Y(t), over the seed corn planted at the start of the year is:
Y(t) = Q(t) - K(t) = e^(λ0 t)(1 - c e^(-λ1 t))

2.3 Prices, Wages, And Distribution
Assume the labor hired during a given year is paid at the end of the year out of the harvest. The Sraffian price equations for this model are:
a1(t) + a0(t) w(t) = 1
where w(t) is the wage per person-year, and I have taken a bushel of corn as the numeraire. Solving, w(t) = (1 - a1(t))/a0(t) = e^(λ0 t)(1 - c e^(-λ1 t)), which is the net output produced by the person-year employed:
w(t) = Y(t)

If the seed corn required for a constant labor force declines year-by-year, this model provides a source of entrepreneurial profit:
π(t) = K(t) - K(t + 1) = c e^((λ0 - λ1) t)[1 - e^(λ0 - λ1)]
This profit is positive when λ0 < λ1.
What happens if the condition on technological progress is not met? I haven't worked out this case, but two possibilities seem to arise. In the first case, workers cannot consume the entire surplus each year. Perhaps the capitalists obtain some accounting profits on their capital and save those profits as additions to the seed corn each year. In the second case, the number of hours worked declines.
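The accounting above can be checked numerically. Below is a minimal sketch in Python (the parameter values are my own illustrative choices, not from the post): it verifies that employment is one person-year each year, that the wage exhausts net output, and that entrepreneurial profit is positive when λ0 < λ1.

```python
import math

# Illustrative parameters (my choice): lam0 and lam1 are the rates of
# decrease of the labor and seed-corn coefficients; c is the initial
# seed-corn coefficient, with 0 < c < 1 and lam0 < lam1.
lam0, lam1, c = 0.02, 0.05, 0.5

def a0(t): return math.exp(-lam0 * t)        # labor per bushel corn
def a1(t): return c * math.exp(-lam1 * t)    # seed corn per bushel corn
def Q(t):  return math.exp(lam0 * t)         # gross output of corn
def K(t):  return a1(t) * Q(t)               # seed corn planted
def Y(t):  return Q(t) - K(t)                # net output (surplus)
def w(t):  return (1.0 - a1(t)) / a0(t)      # wage from the price equation
def profit(t): return K(t) - K(t + 1)        # entrepreneurial profit

for t in range(5):
    # Labor employed, a0(t)Q(t), is always one person-year.
    assert abs(a0(t) * Q(t) - 1.0) < 1e-12
    # The wage exhausts net output: w(t) = Y(t).
    assert abs(w(t) - Y(t)) < 1e-12
    # With lam0 < lam1, seed corn declines and profit is positive.
    assert profit(t) > 0.0

print("checks pass; profit in year 0:", profit(0))
```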

## Wednesday, January 26, 2011

### No Crisis In Mainstream Economics...

...Instead, the situation is chronic. The long-festering state of economics is seen in the constant literature on the "crisis" in economics, going back maybe half a century.

The book The Crisis in Economic Theory (1981) is a timely example. It was edited by Daniel Bell and Irving Kristol, and Daniel Bell's obituary was published in the New York Times today. (See also this Crooked Timber post.) The editors wrote the introduction, and each wrote a chapter. Other contributions include Frank Hahn on general equilibrium theory, Israel Kirzner on the Austrian school, Paul Davidson on Post Keynesianism, and Edward Nell on a Sraffa-influenced interpretation of Marxist economics.

## Tuesday, January 25, 2011

### Blah, Blah, Jevons Paradox, Blah, Blah, Backfire

 Figure 1: The Carrier Dome, Named After A Manufacturer Of A/C Equipment No Longer In Syracuse, NY

The Jevons paradox arises when increased efficiency in the use of a resource results in greater overall use of that resource. This is an extreme example of "rebound", in which the effects of increased efficiency are partly offset by increased use. Jevons wrote about coal.

Some may have noticed David Owen's recent article in The New Yorker focused on air conditioning (A/C). By synchronicity, Computer, the flagship journal of the Institute of Electrical and Electronics Engineers (IEEE) Computer Society, also published an article on rebound at about the same time. Tomlinson, Silberman, and White recommend mindfulness in the pursuit of energy efficiency in Information Technology and reference a report from an organization in the United Kingdom. (I'm not as dismissive of the Jevons paradox as my title may suggest; I just wanted a template I could apply to both articles.)

The New Yorker published three letters on the article. I single out Amory Lovins' letter, and not just because he is a well-known advocate of increased energy efficiency. He points out that much of the increased use of A/C that David Owen describes is due to increased wealth, not rebound. He uses the example of oil to assert that increases in efficiency can drop energy use in absolute terms, even with economic growth. Apparently, between 1977 and 1985, United States Gross Domestic Product rose 27 percent while oil use fell 17 percent.

For utility-sponsored research in the United States, I look to the Electric Power Research Institute (EPRI). I don't know if they have a take on the Jevons paradox. I think Leontief input-output analysis and Luigi Pasinetti's structural economic dynamics provide empirical tools for investigating the question.


## Friday, January 21, 2011

### Krugman On The Importance Of Austrian Business Cycle Theory

Paul Krugman writes:
"Someone, I don’t know who at this point, sent me to this post by Robert Murphy, which is the best exposition I’ve seen yet of the Austrian view that’s sweeping the GOP..."
I think the Republicans are not as coherent as Krugman makes them out to be. They are even less coherent than Austrian Business Cycle Theory.

I don't know that I will pursue trying to publish my rebuttal of Austrian Business Cycle Theory. Some have previously brought up my working papers in discussions of Krugman's blog and column. So far, I have not seen my name mentioned in discussions of this Krugman blog post.

## Tuesday, January 18, 2011

### ASSA, Not AEA In Denver?

 Denver In Some Other Month

The proceedings of the recent American Economic Association annual conference are online. Preliminary versions of some of the papers are available for download.

I have trouble getting a sense of what mainstream economists are about from such a massive list. Doubtless I would find some of these papers of interest, despite the forbidding technical titles. I regret that downloads are not available for panel discussions (for example, "What's Wrong (and Right) with Economics? Implications of the Financial Crisis", "Lessons for Economics from the Great Recession", "Grand Challenges for Social Science...", "History, Crisis, Institutions and Economic Analysis", and "The Ethics of Professional Economic Practice"). (Actually, doing a search on the word "Panel" does cut the list of titles down to a manageable size that may give a feel for the direction of the profession.)

I also look in this list for economists of schools of thought and fields in which I'm interested. The Union of Radical Political Economics has only one session co-sponsored with the AEA. Steve Keen provides videos and download links. The History of Economics Society also has one co-sponsored session. As far as I can see, the Association for Evolutionary Economics (AFEE) and the International Association for Feminist Economics have no sessions. Brad DeLong's presentation is the only one with the word "Keynes" or "Keynesian" in the title. This sample should not lead one to conclude that heterodox economists were not represented in Denver, for they appear in the program of the Allied Social Science Associations. Explicitly identified heterodox economists just seem to be banished from the AEA, to the AEA's shame.

## Friday, January 14, 2011

### And The Life Of Man, Solitary, Poore, Nasty, Brutish, And Short

I am hostile to the notion of logically deriving an ideal society from first principles. Ideal norms can only be understood by us humans in a context that will evolve over time, not from some timeless, interest-free view from nowhere. Humans, in arguing about society, invariably base themselves on some partial perspective or faction. And to understand how to apply some principle articulated from some interest, one will have to draw on empirical results. (Is this an institutionalist, pragmatic view?)

Perhaps, nevertheless, some (not necessarily inconsistent) norms can be stated in keeping with these ideas. I suggest the following:
• Human society should be able to reproduce itself (Karl Marx).
• Unnecessary suffering should be alleviated (Karl Popper).
• Everybody should have the freedom to develop their capabilities to the best of their abilities (Aristotle, Marx).
Other norms don't seem to be compatible with my views on how to think about political philosophy. "People should be able to keep what they make" is a meaningless norm, maybe common among current intellectually and ethically impoverished mainstream economists.

Robert Nozick, as I understand him, begs the question of how natural rights are to be defined. He then argues for the following principles on that basis:
• A person who acquires a holding in accordance with the principle of justice in acquisition is entitled to that holding.
• A person who acquires a holding in accordance with the principle of justice in transfer, from someone else entitled to the holding, is entitled to the holding.
• Unjust acquisitions or transfers should be rectified.
He ends up arguing for a more-or-less night-watchman state.

Perhaps these sorry days are ripe for an immanent critique of the idea of equality of opportunity.

I did not manage to mention above the Austrian-school economist Israel Kirzner's defense of returns to entrepreneurship with what he calls a "finders keepers" ethic. I have not read John Rawls.

## Tuesday, January 11, 2011

### Both Sides Now

Eric Schoenberg, of Columbia Business School, has a very confused response to the usual idiocy from Greg Mankiw. Schoenberg quotes Mankiw as saying
"under a standard set of assumptions... the factors of production [i.e., workers] are paid the value of their marginal product... One might easily conclude that, under these idealized conditions, each person receives his just deserts."
Schoenberg thinks the above is coherent. He merely questions whether the standard assumptions apply in our economy.

But marginal productivity is not a theory of the distribution of income, as I have demonstrated. It is a theory of the choice of technique. Income distribution can be anywhere on the factor price frontier, and all agents will be receiving the value of the marginal product of the factors of production that they own. (I don't know whether anybody in the comments at the Huffington Post points out that Mankiw does not know what he is talking about, even granting all of his assumptions.)
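The point can be illustrated with a minimal sketch in Python (the coefficients are invented for illustration, not drawn from any data). In a one-commodity circulating-capital model, the price equation (1 + r) a1 + a0 w = 1 gives each technique a wage-profit trade-off w(r) = (1 - (1 + r) a1)/a0. Which technique is cost-minimizing varies with the rate of profits, while nothing in the technology pins down where on the frontier the economy sits:

```python
# Wage-profit frontiers for two hypothetical corn-producing techniques,
# each given as (a0, a1) = (labor per bushel, seed corn per bushel).
techniques = {"alpha": (1.0, 0.30), "beta": (0.60, 0.45)}

def wage(a0, a1, r):
    # From the price equation (1 + r)*a1 + a0*w = 1.
    return (1.0 - (1.0 + r) * a1) / a0

def chosen(r):
    # Cost-minimizing firms adopt the technique paying the higher wage at r.
    return max(techniques, key=lambda k: wage(*techniques[k], r))

for r in [0.0, 0.5, 1.0]:
    best = chosen(r)
    print(f"r = {r}: technique {best}, w = {wage(*techniques[best], r):.4f}")
```

With these numbers, "beta" is chosen at low rates of profits and "alpha" at high ones: the technique in use depends on distribution, rather than distribution being determined by the technique.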

By the way, Bill Mitchell has a recent post that wanders into my theme.

## Sunday, January 09, 2011

### Canonical Statements Of Current Mainstream Price Theory?

I find puzzling where contemporary mainstream economists would point for good statements of a theory to explain prices in, say, the United States economy. I can think of a couple of possibilities.

The first would be General Equilibrium models in which a complete set of spot and future markets exist and in which goods are distinguished by location, date of delivery, and contingent events. Canonical statements of such a theory include:
• Gerard Debreu (1959). Theory of Value: An Axiomatic Analysis of Economic Equilibrium, Cowles Foundation Monograph.
• Kenneth J. Arrow and Frank Hahn (1971). General Competitive Analysis, Holden-Day.
But is this the current theory? Even though more textbooks have been written, the references would seem old to most young economists. The Sonnenschein-Mantel-Debreu results show that dynamic equilibrium paths are not limited by the theory. The theory does not allow for the existence of money to have any effect. Since the quantities of initial endowments are givens of the theory and any groping out of equilibrium, particularly with production going on, would change the data defining the equilibria, any time to reach equilibrium is too long. I think many mainstream economists would tell me General Equilibrium theory was abandoned in the 1980s.

Another possibility is models of temporary equilibrium. I look at the following as canonical statements of this theory:
• J. R. Hicks (1946). Value and Capital: An Inquiry into Some Fundamental Principles of Economic Theory, 2nd edition, Oxford University Press.
• J. M. Grandmont (1977). "Temporary General Equilibrium Theory", Econometrica, V. 45, N. 3 (Apr.): pp. 535-572.
In this theory, only spot markets and maybe future markets for the numeraire commodity clear in equilibrium. The plans of different agents for future activities are not brought into agreement in equilibrium. Maybe this approach is related to Samuelson's model of overlapping generations. In addition to all the problems of General Equilibrium Theory, this approach has the difficulty of modeling how expectations alter, something that hardly seems observable or easy to model in a mechanical fashion.

Maybe game theory is the foundation of contemporary mainstream price theory. But cannot a game be found to rationalize almost any observed behavior? Is it not more a bag of tricks than a theory? Nevertheless, I have heard mainstream economists say good things about the following text:
• David M. Kreps (1990). A Course in Microeconomic Theory, Princeton University Press.

As I understand it, the dominant introductory graduate textbooks in mainstream microeconomics remain:
• Andreu Mas-Colell, Michael D. Whinston, and Jerry R. Green (1995). Microeconomic Theory, Oxford University Press.
• Hal R. Varian (1992). Microeconomic Analysis, 3rd edition, W. W. Norton.
When I have perused these, I have found them to be more a collection of mathematics than a clear presentation of price theories. And I have found them very unclear on why the student should accept anything in them as applicable to actually existing capitalism.

## Tuesday, January 04, 2011

### Increase In The Feasibility Of Economic Planning

Mathematical programming is a key technique for economic planning. And we can solve linear programs much better now:
"Even more remarkable - and even less widely understood - is that in many areas, performance gains due to improvements in algorithms have vastly exceeded even the dramatic performance gains due to increased processor speed.

The algorithms that we use today for speech recognition, for natural language translation, for chess playing, for logistics planning, have evolved remarkably in the past decade. It's difficult to quantify the improvement, though, because it is as much in the realm of quality as of execution time.

In the field of numerical algorithms, however, the improvement can be quantified. Here is just one example, provided by Professor Martin Grötschel of Konrad-Zuse-Zentrum für Informationstechnik Berlin. Grötschel, an expert in optimization, observes that a benchmark production planning model solved using linear programming would have taken 82 years to solve in 1988, using the computers and the linear programming algorithms of the day. Fifteen years later - in 2003 - this same model could be solved in roughly 1 minute, an improvement by a factor of roughly 43 million. Of this, a factor of roughly 1,000 was due to increased processor speed, whereas a factor of roughly 43,000 was due to improvements in algorithms! Grötschel also cites an algorithmic improvement of roughly 30,000 for mixed integer programming between 1991 and 2008.

The design and analysis of algorithms, and the study of the inherent computational complexity of problems, are fundamental subfields of computer science." -- Report to the President and Congress - Designing a Digital Future: Federally Funded Research and Development in Networking and Information Technology, Executive Office of the President, President's Council of Advisors on Science and Technology (December 2010)
I don't know how such a speedup was accomplished. I assume it cannot merely be a tradeoff between Dantzig's simplex algorithm and interior point methods (such as Karmarkar's algorithm). The simplex algorithm, from what I recall, has never struck me as naturally parallelizable, though parts of it can be done in parallel: choosing a pivot element, multiplying a vector by a scalar, and taking the inner product of two vectors, for example. I think these improvements must have been accomplished by customizing an implementation, behind a well-defined Application Programming Interface, for a specific architecture.

(H/T: Noam Nisan)

Update: I've been reading Robert E. Bixby's "Solving Real-World Linear Programs: A Decade and More of Progress" (Operations Research, V. 50, N. 1 (2002)). Apparently speedups were accomplished by algorithmic improvements such as matrix operations that exploit the sparsity of the matrices, removal of redundant constraints, aggregation of decision variables under specified conditions, and many improvements I do not understand. The simplex algorithm, the dual simplex algorithm, and interior point methods all remain competitive on different problems. Bixby considers example problems with millions of decision variables and constraints. I think a couple more orders of magnitude of improvement can be achieved with parallelization. Maybe somebody has tried that since Bixby's publication.
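For a sense of what a linear program for planning looks like, here is a toy example solved with SciPy's linprog (assuming SciPy is installed; the crops, coefficients, and resource limits are all invented for illustration):

```python
from scipy.optimize import linprog

# Toy planning problem: choose quantities x1, x2 of two crops to
# maximize output value 3*x1 + 2*x2, subject to a labor constraint
# (x1 + x2 <= 4) and a seed constraint (x1 + 3*x2 <= 6).
# linprog minimizes, so the objective coefficients are negated.
result = linprog(
    c=[-3.0, -2.0],
    A_ub=[[1.0, 1.0], [1.0, 3.0]],
    b_ub=[4.0, 6.0],
    bounds=[(0.0, None), (0.0, None)],
    method="highs",
)
assert result.success
print("plan:", result.x, "value:", -result.fun)
```

The "highs" method hands the problem to the HiGHS solvers, which include both simplex variants and an interior point code, in line with Bixby's observation that all remain competitive.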