Tuesday, January 04, 2011

Increase In The Feasibility Of Economic Planning

Mathematical programming is a key technique for economic planning. And we can solve linear programs much better now:
"Even more remarkable - and even less widely understood - is that in many areas, performance gains due to improvements in algorithms have vastly exceeded even the dramatic performance gains due to increased processor speed.

The algorithms that we use today for speech recognition, for natural language translation, for chess playing, for logistics planning, have evolved remarkably in the past decade. It's difficult to quantify the improvement, though, because it is as much in the realm of quality as of execution time.

In the field of numerical algorithms, however, the improvement can be quantified. Here is just one example, provided by Professor Martin Grötschel of Konrad-Zuse-Zentrum für Informationstechnik Berlin. Grötschel, an expert in optimization, observes that a benchmark production planning model solved using linear programming would have taken 82 years to solve in 1988, using the computers and the linear programming algorithms of the day. Fifteen years later - in 2003 - this same model could be solved in roughly 1 minute, an improvement by a factor of roughly 43 million. Of this, a factor of roughly 1,000 was due to increased processor speed, whereas a factor of roughly 43,000 was due to improvements in algorithms! Grötschel also cites an algorithmic improvement of roughly 30,000 for mixed integer programming between 1991 and 2008.

The design and analysis of algorithms, and the study of the inherent computational complexity of problems, are fundamental subfields of computer science." -- Report to the President and Congress - Designing a Digital Future: Federally Funded Research and Development in Networking and Information Technology, Executive Office of the President, President's Council of Advisors on Science and Technology (December 2010)
I don't know how such a speedup was accomplished. I assume it cannot merely be a matter of switching between Dantzig's simplex algorithm and interior point methods (such as Karmarkar's algorithm). The simplex algorithm, as I recall it, has never struck me as naturally parallelizable, though parts of it can be done in parallel: choosing a pivot element, multiplying a vector by a scalar, and taking the inner product of two vectors are all parallelizable operations. I suspect these improvements were accomplished by customizing implementations, behind a well-defined Application Programming Interface, for specific architectures.
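To make the kind of problem under discussion concrete, here is a toy production-planning linear program. The numbers, and the choice of SciPy's linprog (which today defaults to the open-source HiGHS solver), are my illustration, not anything from the report: a sketch of what "solving a linear program" means, not the benchmark model Grötschel measured.

```python
from scipy.optimize import linprog

# Maximize profit 3*x1 + 5*x2 subject to capacity limits on three
# machines. linprog minimizes, so we negate the profit coefficients.
c = [-3.0, -5.0]
A_ub = [[1.0, 0.0],   # machine A hours: x1 <= 4
        [0.0, 2.0],   # machine B hours: 2*x2 <= 12
        [3.0, 2.0]]   # machine C hours: 3*x1 + 2*x2 <= 18
b_ub = [4.0, 12.0, 18.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)
print(res.x)     # optimal production plan: [2. 6.]
print(-res.fun)  # maximum profit: 36.0
```

Real planning models have the same shape, only with millions of variables and constraints rather than two and three.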

(H/T: Noam Nisan)

Update: I've been reading Robert E. Bixby's "Solving Real-World Linear Programs: A Decade and More of Progress" (Operations Research, V. 50, Iss. 1 (2002)). Apparently the speedups were accomplished by algorithmic improvements such as matrix operations that exploit the sparsity of the matrices, the removal of redundant constraints, the aggregation of decision variables under specified conditions, and many refinements I do not understand. The primal simplex algorithm, the dual simplex algorithm, and interior point methods all remain competitive, each on different classes of problems. Bixby considers example problems with millions of decision variables and constraints. I think a couple more orders of magnitude of improvement could be achieved with parallelization. Maybe somebody has tried that since Bixby's publication.
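The sparsity point can be illustrated in a few lines. Realistic constraint matrices are overwhelmingly zero, and modern solvers store and factor only the nonzeros. The example below is my own sketch, again using SciPy's linprog (whose HiGHS backend accepts a scipy.sparse matrix directly), not anything from Bixby's paper: each constraint here touches only two of the 500 variables, so the matrix is about 0.4% dense.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.sparse import lil_matrix

n = 500  # decision variables; real models run to millions
A = lil_matrix((n, n))
A.setdiag(1.0)       # each constraint involves one variable...
A.setdiag(0.5, k=1)  # ...plus its neighbor: ~2 nonzeros per row
b = np.ones(n)
c = -np.ones(n)      # maximize the sum of the variables

# Passing the sparse matrix lets the solver skip the zeros entirely.
res = linprog(c, A_ub=A.tocsr(), b_ub=b, bounds=[(0, None)] * n)
print(res.status)  # 0 means an optimal solution was found
```

Storing this matrix densely would take 250,000 entries for roughly 1,000 nonzeros; the gap only widens as models grow, which is why exploiting sparsity buys so much.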

6 comments:

Anonymous said...

But... what do you think about the Socialist Calculation Debate? Didn't Mises and Hayek win?

Tomboktu said...

It occurs to me that Red Plenty by Francis Spufford might interest you. Not technical, and an unusual mixture of fiction and fictionalised fact, it deals with economic planning (in the USSR).

Anonymous said...

Hayek and Mises proved that "Socialism" won't work. If "Socialism" means complete central planning, then it's a non sequitur; seriously, is there anyone who believes in COMPLETE central planning?

Robert Vienneau said...

I think a hostile reading of Mises could say that he was just ignorant of linear programming and duality theory. I think Hayek has a better argument with his points about tacit knowledge and difficulties in gathering the data needed for central planning. But I am re-evaluating my opinion as I read about applying the theory of computational complexity to the problem.

I look forward to reading Red Plenty when it becomes available in the U.S. as a Kindle edition.

Anyways, I find the math of linear programming elegant and am impressed by the improvements noted in my post. Apparently, linear programming can be applied in planning in a decentralized fashion too.

Emil Bakkum said...

I have not read any works of Mises or Hayek, because they seem to be without interest. Evidently centralised planning has resulted in highly competitive economies. The information problem was solved by means of sophisticated planning networks. It is simply a matter of organization. Many productive activities are repetitive, and do not need rigorous planning. Perhaps the best case for comparing the systems of free enterprise and central planning is the two Germanies. The preference seems to be a matter of taste. Do you prefer material possessions (car, telephone, computer) or social gains (a guaranteed existence, free child care and culture)? Many explanations for the collapse of the planned economies are available. There was always resistance from within the system, notably from the intellectuals. And planning across borders is very difficult, because of nationalism and undefined international prices.

Emil Bakkum said...

I should add that linear programming on the macro-economic level leads to some obvious difficulties, for instance with respect to the objective function. I have not yet seen a rigorous approach to this theme.