Nine Algorithms That Changed the Future

By JOHN MacCORMICK

In our increasingly digitally dominated world, any book that attempts to explain for the layperson “the ingenious ideas that drive today’s computers” should find a ready audience and become required reading for the curious, enthusiastic, responsible and attentive netizen — a category more and more of us find ourselves in these days, willy-nilly. 

John MacCormick’s new volume Nine Algorithms That Changed the Future — which bears as subtitle the quoted phrase above — does indeed go a long way toward satisfying that need, assuming the cooperation of a reader who possesses a modicum of patience, diligence and brain-teaser-friendly applied intelligence.  And, amazingly, MacCormick does it all for this willing reader “without assuming [on the reader's part] any knowledge of computer science.”

With the clear-eyed precision and logical rigor of the computer science professional that he is, MacCormick begins with a handy definition of an algorithm as a third component of computer architecture, neither software nor hardware, but rather an almost abstract entity:  “a precise recipe that specifies the exact sequence of steps required to solve a problem.”  Having neatly and cleanly described his subject, he lays out his criteria for choosing contenders for the title of nine most consequential algorithms.  The winners prove to be those processing tricks associated with 1) search indexing; 2) search result ranking; 3) encryption; 4) error correction; 5) pattern recognition; 6) data compression; 7) database structure and management; 8) digital signatures; and 9) the limits of the computable.  Each algorithm — or cluster of allied recipes — gets a chapter of its own, with a concluding look at the future of such “aha” shortcut inventions.

MacCormick’s two main techniques for conveying his insights are metaphor and a stepwise progression of complexity, moving from usefully oversimplified examples to the actual algorithmic realities.

In the metaphor department, MacCormick exhibits a real talent for picking understandable real-world analogues to his algorithms that do not betray the nature of the digital processes.  For instance, in the chapter on public-key encryption, he hits upon an allegory involving mixing paints.  Likewise, when discussing verification of digital “signatures,” he uses the trope of padlocks and keys.  These very tangible constructs allow the reader to intuitively comprehend the actualities of the computer code.
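The paint-mixing allegory corresponds to Diffie–Hellman key exchange: two parties each blend a private “color” into a publicly shared base, swap the mixtures, and blend again to arrive at the same secret shade.  A toy sketch of the underlying arithmetic (with illustratively tiny numbers — real systems use values hundreds of digits long — and invented variable names):

```python
# Toy Diffie-Hellman key exchange: the math behind the paint-mixing
# analogy.  P and G are public ("the base paint"); each secret number
# is a private color that is never transmitted.

P = 23   # small public prime modulus (real ones are enormous)
G = 5    # public generator

def mix(public_part, secret):
    """Blend a public value with a private 'color': public_part^secret mod P."""
    return pow(public_part, secret, P)

alice_secret = 6                      # Alice's private color
bob_secret = 15                       # Bob's private color

alice_public = mix(G, alice_secret)   # mixtures exchanged in the open
bob_public = mix(G, bob_secret)

# Each side blends the other's public mixture with its own secret;
# both arrive at the same shared value, which an eavesdropper who saw
# only P, G, alice_public and bob_public cannot easily recover.
alice_shared = mix(bob_public, alice_secret)
bob_shared = mix(alice_public, bob_secret)

assert alice_shared == bob_shared
```

The security of the real scheme rests on the difficulty of “unmixing” — recovering the secret exponent from the public value — just as separating blended paints is hard.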

The author also exhibits an admirable ability to conjure up naïve, distilled schematics of real problems.  For instance, his little table of friendship relations among three imaginary people sets the stage brilliantly for describing how enormous databases like those in banking work.
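In the same distilled spirit, a relational table is at bottom just a list of rows, each recording one fact, from which queries derive answers.  A minimal sketch (the names and the `friends_of` helper are invented here, not taken from the book):

```python
# A miniature "friends" relation: each row is one fact, (person, friend).
friends = [
    ("Alice", "Bob"),
    ("Bob", "Charlie"),
]

def friends_of(name):
    """Query: everyone paired with `name` in either column of the table."""
    return sorted({b for a, b in friends if a == name} |
                  {a for a, b in friends if b == name})

print(friends_of("Bob"))   # ['Alice', 'Charlie']
```

Scaled up to millions of rows — with transactions to keep concurrent updates consistent — this is the skeleton of the banking databases the chapter describes.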

MacCormick is no slouch when it comes to history and the human element, either.  He gives little snapshots of such computer luminaries as Alan Turing, Claude Shannon and Alonzo Church that illuminate the personal dimensions behind these geniuses, and a real sense of the steady progression of computer science emerges.  Moreover, the reader will pick up an astonishing new set of handy buzzwords.  You might like such mouthfuls as “stochastic gradient descent,” but my personal favorite is “idempotent,” which among computer professionals refers to an operation that can be applied to the same data any number of times without changing the result beyond its first application.

MacCormick’s concluding chapter speculates on the generation of future algorithms and the decay of present ones, while reaffirming his generous sense of wonder that such near-cosmic conceptualizations are within the scope of human intelligence at all.


The Speculator

Paul Di Filippo’s column The Speculator appears monthly in the Barnes & Noble Review.  He is the author of several acclaimed novels and story collections, including Fractal Paisleys, Little Doors, Neutrino Drag, and Fuzzy Dice.