Thursday, February 16, 2017

Anthropomorphizing Government

If you were to create an extremely powerful android, one that appears to have feelings and inclinations, what would that look like? And would you subject yourself to such a creation?

We want our android creation to “care” about people. An android can’t actually care; it doesn’t, by definition, have feelings. But it can appear to have them.

For example, if people are starving, our android must be programmed to give food, or money for food. If people are homeless, our android must be programmed to build or otherwise provide housing for those in need. If people need medical care that is beyond their ability to pay for, our android must be programmed to pay their medical bills.

There might be other “feelings” we’d like our android to seem to feel. Maybe it should be free from bigotry. We tell it to ignore skin color, gender, age, and other human attributes when making decisions. It should instead give preference based on character, perhaps, or sometimes on need. There could be an algorithm along the lines of the sketch below.
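To make the metaphor concrete, here is a purely hypothetical sketch, in Python, of what such an algorithm might look like. Every name, field, and weight in it is invented for illustration; the only point is that the decision rule literally never sees skin color, gender, or age, and that the android stops giving when its resources run out.

```python
# A purely hypothetical sketch of the android's "caring" algorithm.
# All names, fields, and weights are invented for illustration.

from dataclasses import dataclass

@dataclass
class Applicant:
    name: str
    need: float       # 0.0 (no need) .. 1.0 (desperate)
    character: float  # 0.0 .. 1.0, however the android is told to judge it
    # Skin color, gender, and age are deliberately not fields here,
    # so the decision literally cannot depend on them.

def priority(a: Applicant, need_weight: float = 0.7) -> float:
    """Blend need and character into a single score; higher gets helped first."""
    return need_weight * a.need + (1.0 - need_weight) * a.character

def allocate(applicants: list[Applicant], budget: float, cost_per_grant: float) -> list[str]:
    """Hand out grants in priority order until the resources run out."""
    helped = []
    for a in sorted(applicants, key=priority, reverse=True):
        if budget < cost_per_grant:
            break  # the android's dilemma: programmed goals exceed resources
        budget -= cost_per_grant
        helped.append(a.name)
    return helped

if __name__ == "__main__":
    people = [Applicant("A", need=0.9, character=0.4),
              Applicant("B", need=0.3, character=0.9),
              Applicant("C", need=0.8, character=0.8)]
    print(allocate(people, budget=200.0, cost_per_grant=100.0))  # -> ['C', 'A']
```

A different programmer might weight character more heavily than need, or rank applicants some other way entirely; the analogy doesn’t depend on the particular numbers.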

We might program our android to “care” about the next generation with a passion about educating children.

If our android creation now has the inclination to provide food, shelter, medical care, and education to anyone in need, it must either suffer the “sadness” or “frustration” of failing to meet its programmed goal (which would mean blowing its circuits or suffering some other technical breakdown), or it must have the resources with which to do those things.

We will need to program our android to obtain those resources. Will we make it super talented, so it can do some job that people are willing to pay a lot of money for? Maybe it will be a superstar performer? A great entrepreneur who happens to succeed in many business ventures? Or will it do physical labor to earn a nest egg, then invest it and earn a surprising return?

It needs to be something that keeps resources coming in while the android spends the proceeds on those charitable things we assigned it to care about.

And here’s another detail: we can’t program it to do something we aren’t capable of doing. It won’t sing or act better than the programmer. It won’t invent and take entrepreneurial risks better than the programmer. It won’t make better investments than the programmer.

Maybe we could program the android to come up with its own means of finding resources. “Just find a way,” we could say. What could go wrong? Well, if you’re into sci-fi, the rule is that androids always turn on the people. Always. (This has come up in recent story lines on Marvel's Agents of S.H.I.E.L.D., in which very human-like androids have been popping up--with feelings, like anger, jealousy, and vengeance. And Mack says something like, "Don't you ever watch the movies? The robots always turn on the humans.")


[Image: Android Aida starts doing damage to humans on Marvel's Agents of S.H.I.E.L.D.]

Let’s say this android creation of ours is our government. People talk as if it actually has feelings. A “caring” government must provide free health care. A “caring” government must get rid of bigotry. A “caring” government must “feel” guilty about its success compared to other countries and give away its wealth to them to level the playing field. A “caring” government must give everyone the same opportunities for education, not just K-12 but all the way through college.

This creation we call government doesn’t actually earn a living, let alone a living plus proceeds to pay for all of this largesse. So we’ve programmed it to “just find a way,” including the permission to take resources from the creators, who actually do work for a living.

Since government has been programmed to do so many specific things to show that it “cares,” it finds a way. It seizes our resources, usually through taxes, fees, tariffs, licenses, and sometimes outright theft—because somewhere in the algorithm it prioritizes who it decides is deserving, regardless of who earned or built up the property.

The android, supposedly programmed to “care” for people, turns on them—at least on some of them—and asserts its power. It takes and gives as it chooses. And we, its creators, find ourselves at its mercy. Because, as sci-fi lit tells us, the more human-like you make the android, the more likely it is to turn on humans—with super-human power.

Then, when the creature is out of control and on a rampage, what must be done? Die or take it down.

It would be preferable to avoid the problem. Instead of saying, “This time it will be different,” say, “This time I will limit the powers of the creation to specific purposes.”

With government, the limits of its power should be drawn where it can do what we could do ourselves, only better; and those are very few things. The creators of our government—the writers of the US Constitution—spelled out the limited purposes in the Preamble:

·        Establish Justice (law enforcement and adjudication).
·        Insure domestic Tranquility (avoid civil wars, settle interstate or regional disputes).
·        Provide for the common defence (protect sovereignty from attacks).
·        Promote the general Welfare (handle things that affect everyone, such as coining money, standardizing measurements, providing some infrastructure, and encouraging interstate commerce).
·        Secure the Blessings of Liberty to ourselves and our Posterity (the blessings of protecting life, liberty, and property lead to general economic prosperity and civilization).

The rest of the Constitution, including the Bill of Rights, enumerates the specific things the federal government can do—the limited programming.

The creation is, indeed, power, which, like fire, is good when contained and used for its specific purposes, but dangerous outside those boundaries. There’s nothing in that limited programming about “caring.” Any attempt to program in “caring” goes beyond the bounds of its proper purpose.

We can’t enforce caring in our neighbor; we can only persuade. And we can model the behavior by doing it ourselves. So we cannot program caring into government. We cannot take our neighbor’s property and give it to government to redistribute to someone it deems more deserving—because we do not have the right to take our neighbor’s property in the first place. When you attempt to imbue government with that power, you are creating a monster.


If we want government to be our servant, rather than our tyrannical oppressor, we’d better remove any of the dangerous programming—anything beyond the limited purpose of government.
