Monday, May 28, 2018

When Math Applies to Politics

Being a conservative, I want to minimize the power that Washington has over our lives.  In general I want to minimize government at all levels, but especially at the federal level.  The laboratory of democracy has served us well for a couple centuries, and every assumption of power by Washington moves us closer to a national as opposed to a federal government.

One of my favorite classes in my 5-year odyssey to get my master's degree was Discrete Optimization.  One of the concepts I learned in that class involved describing the efficiency of algorithms.  If functions grow like polynomials, algorithms have an easier time handling them than they do functions that grow exponentially or even faster.  The notation used for describing that is "Big-O"; a function that grows "polynomially" might be O(n-squared) (read: big-O of n-squared) or O(n-cubed).

How do those previous paragraphs fit together?  This morning I was reading Why Government Doesn't Scale, an article on why central governments can't be very efficient, and what makes an appearance but Big-O:
But local networks were becoming big. We knew our office would need one. But what kind? The big debate then was client-server vs. peer-to-peer. In P2P, every machine on the network carried some responsibility for the management. It was difficult. It was messy. It was OBVIOUS to me that the superior solution was client-server: a really powerful central server made all the decisions and routed all the traffic, and the rest of the machines were mere clients that talked to it. It was simple. It was elegant. If the network got bigger, you just needed a bigger, smarter server. And I confidently told management to go with client-server.

And then slowly, inexorably, the industry went peer-to-peer. (The modern internet has elements of both. That’s a subject for another discussion…) And I could only scratch my head and wonder why.

So I read. I researched. I tried to understand. And eventually I found articles that pointed out O(n^2) (order n squared), a familiar concept from computer science. It describes a system where the work grows as the square of the number of elements.

And I realized: every node in a network has to potentially speak to every other node. A classic bit of communications theory is that such a network has O(n^2) potential relationships between the nodes. 2 nodes have 1 potential relationship. 4 nodes (twice as many) have 6 potential relationships (6 times as many). 8 nodes (twice again) have 28 potential relationships.

100 nodes = 450 relationships [I believe this should be 4950--Darren]
1,000 nodes = 499,500 relationships—nearly half a million.

And in a client-server network, the server has to be ready to manage every single relationship. In peer-to-peer, each node has a hand in managing its own relationships, and maybe a few others along the way. The work is distributed.

And it finally got through my head: NO server, no matter how large and how powerful, could keep up with O(n^2). Even if it were perfectly designed and never broke down, there was some number of nodes that would crash the server. It was mathematically unavoidable. You HAVE TO distribute the management as close as possible to the nodes, or the system fails.

And this is the part that is just crystal clear in my memory over a quarter century later: in an instant, I realized that the same is true of governments.

You just never know when that math education will be useful!
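As a quick check on the arithmetic in the quoted passage, here is a short sketch (the function name is mine; the formula is just "n choose 2," the number of ways to pair up n nodes):

```python
def potential_relationships(n):
    """Potential pairwise relationships among n nodes: n * (n - 1) / 2."""
    return n * (n - 1) // 2

for n in (2, 4, 8, 100, 1000):
    print(f"{n:>5} nodes = {potential_relationships(n):,} relationships")
```

Running it reproduces the article's small cases (1, 6, 28) and confirms the bracketed correction: 100 nodes give 4,950 relationships, not 450, and 1,000 nodes give 499,500.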

2 comments:

cthulhu said...

I don’t dispute the general point being made here, but I disagree with the use of O() notation. Specifically, algorithms of O(n^c), where n is a measure of the size of the input and c is a positive constant, are not exponential time; they are polynomial time algorithms. An exponential time algorithm would be O(c^n), where once again n is a measure of the size of the input and c is some constant greater than one. In general, for sufficiently large n, exponential time problems will take longer (usually many orders of magnitude longer) to solve than polynomial time problems.

Exponential time is characteristic of the hardest problems in what are called NP problems: those problems which can be solved in polynomial time on a non-deterministic Turing machine; on a deterministic Turing machine (which is all we currently have), NP-hard problems are not known to be solvable in polynomial time. Problems involving combinatorial connections among elements, such as the central government problem you're discussing, typically may be NP-hard, which I think is what you're trying to say.
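[Editor's note: the gap cthulhu describes between polynomial and exponential growth can be seen numerically in a few lines; the sample values of n are illustrative only.]

```python
# Contrast polynomial growth (n^2) with exponential growth (2^n).
# For small n the two look comparable; exponential quickly dwarfs polynomial.
for n in (10, 20, 30, 40):
    print(f"n={n:>2}: n^2 = {n**2:>6,}   2^n = {2**n:>17,}")
```

By n = 40, n^2 is only 1,600 while 2^n exceeds a trillion, which is why polynomial-time algorithms remain tractable at sizes where exponential-time ones do not.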

Darren said...

The typo is mine. I said polynomial in one place but exponential in another. Thanks for the correction.