"Even more remarkable - and even less widely understood - is that in many areas, performance gains due to improvements in algorithms have vastly exceeded even the dramatic performance gains due to increased processor speed.I don't know how such speedup was accomplished. I assume it cannot be merely a tradeoff between Dantzig's simplex algorithm and interior point methods (such as Karmarkar's algorithm).
The algorithms that we use today for speech recognition, for natural language translation, for chess playing, for logistics planning, have evolved remarkably in the past decade. It's difficult to quantify the improvement, though, because it is as much in the realm of quality as of execution time.
In the field of numerical algorithms, however, the improvement can be quantified. Here is just one example, provided by Professor Martin Grötschel of Konrad-Zuse-Zentrum für Informationstechnik Berlin. Grötschel, an expert in optimization, observes that a benchmark production planning model solved using linear programming would have taken 82 years to solve in 1988, using the computers and the linear programming algorithms of the day. Fifteen years later - in 2003 - this same model could be solved in roughly 1 minute, an improvement by a factor of roughly 43 million. Of this, a factor of roughly 1,000 was due to increased processor speed, whereas a factor of roughly 43,000 was due to improvements in algorithms! Grötschel also cites an algorithmic improvement of roughly 30,000 for mixed integer programming between 1991 and 2008.
The design and analysis of algorithms, and the study of the inherent computational complexity of problems, are fundamental subfields of computer science." -- Report to the President and Congress - Designing a Digital Future: Federally Funded Research and Development in Networking and Information Technology, Executive Office of the President, President's Council of Advisors on Science and Technology (December 2010)
(H/T: Noam Nisan)

I don't know how such a speedup was accomplished. I assume it cannot be merely a switch from Dantzig's simplex algorithm to interior point methods (such as Karmarkar's algorithm).
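As a quick sanity check on the quoted numbers (my own arithmetic, not from the report): one minute instead of 82 years is indeed a speedup of roughly 43 million, and the two quoted factors compose to the same total.

```python
# 82 years expressed in minutes, and the hardware x algorithms split.
minutes = 82 * 365 * 24 * 60
print(f"{minutes:,}")         # 43,099,200 -> "roughly 43 million"
print(f"{1_000 * 43_000:,}")  # 43,000,000 = 1,000 (hardware) x 43,000 (algorithms)
```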
Update: I've been reading Robert E. Bixby's "Solving Real-World Linear Programs: A Decade and More of Progress" (Operations Research, Vol. 50, No. 1 (2002)). Apparently the speedups were accomplished by algorithmic improvements such as matrix operations that exploit the sparsity of the matrices, removal of redundant constraints, aggregation of decision variables under specified conditions, and many improvements I do not understand. The simplex algorithm, the dual simplex algorithm, and interior point methods all remain competitive on different problems. Bixby considers example problems with millions of decision variables and constraints. I think a couple more orders of magnitude of improvement can be achieved with parallelization. Maybe somebody has tried that since Bixby's publication.
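To make the sparsity point concrete, here is a minimal sketch (my own illustration, not from Bixby's paper) using SciPy's linprog with the HiGHS backend. The problem data is random and made up; the point is only that the solver works directly on a sparse representation of the constraint matrix and runs a presolve pass that drops redundant rows and columns before the simplex or interior point iterations begin.

```python
# A toy LP stated sparsely: the solver stores and factorizes only the
# nonzero entries, one of the improvements Bixby describes.
import numpy as np
from scipy.optimize import linprog
from scipy.sparse import random as sparse_random

m, n = 2_000, 5_000                     # constraints x variables (made up)
A = sparse_random(m, n, density=0.001,  # ~10,000 nonzeros out of 10 million
                  random_state=0, format="csc")
b = np.ones(m)
rng = np.random.default_rng(0)
c = -rng.random(n)                      # maximize c'x  ==  minimize -c'x

# method="highs" dispatches to the HiGHS simplex / interior point codes,
# which exploit the sparse structure and apply presolve by default.
res = linprog(c, A_ub=A, b_ub=b, bounds=(0, 1), method="highs")
print(res.status, res.fun)
```

On a problem this sparse, the dense representation would waste almost all of its arithmetic on zeros, which is why exploiting sparsity alone accounts for a large share of the practical gains.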