Friday, April 26, 2013

Lessons from Economic Sphere

With Economic Sphere visiting this week, we’ve covered a few economic topics. And mostly I’m reassured that I’ve been on the right track.

Lesson 1: Thou Shalt Not Covet
One conversation the other day was about income redistribution. He asked me which of two job situations I would prefer. In the first, I'm offered $50,000 a year, and my boss makes $55,000, which is 10% more. In the second, I'm offered $100,000, and my boss makes $200,000, which is 100% more.
I answered sensibly: I’d prefer making $100,000. First, because I’m making twice as much money as I would have in the other job, and second, because the amount my boss makes shows a lot more growth potential for a career.
Surprisingly, when this question was asked in a study, an alarming number of people preferred making less money, as long as their boss made only a little more than they did. They thought it was immoral to have the boss make so much more than the employee.
We agreed that the morality is skewed. And the problem lies in breaking that last of the Ten Commandments: Thou Shalt Not Covet. Why should it matter to me what a boss makes, as long as I'm being fairly compensated for my work? The amount the boss makes is irrelevant. His job is different. He takes different risks and has different expectations placed on him. His job probably requires advanced education and experience in strategic planning.
The Ten Commandments, if they'd been written in English
I agree that some executives are overpaid. That's a concern to the company, its board, and its stockholders, and, to the extent that it affects the company, to the employees as well. But if the highest-paid employee makes twenty times what the entry-level employee makes, who cares, as long as he's worth it to the company? The difference in income is simply irrelevant.
When you ask someone with that alternative moral belief, "What is immoral about someone making more than someone else?" you get a kind of sputtered answer. They think it's self-evident; it's unfair. But they can't explain why different outcomes for different inputs equate to unfairness. They just have an internal sense that they do. What they don't recognize is that refraining from jealousy over another's fortune is a higher morality. Forcefully taking from a producer to give to a non-producer is simply theft, whether it's done by the state or by a thug.
That's why you see the argument for leveling the outcome for everyone in the southern hemisphere of the Spherical Model, where you also find tyranny and savagery. What you see in the northern hemisphere is actually more fair. And, because that is where you also find a more moral people, you find them willingly giving aid to those truly in need, which means a two-way exchange of love as well. The giver gives to the poor person because he loves and cares about him and wants to relieve his suffering. The receiver humbly receives, recognizing the gift was voluntary, and he is both grateful and determined to become productive and giving if he can. Love and gratitude are eliminated between people in the southern hemisphere, with the state placing itself in a godlike benefactor role, requiring gratitude and allegiance for its theft.

Lesson 2: No Central Planner Can Know Enough
This conversation was about the Superman nemesis Brainiac, which I was not familiar with. So I'm summarizing here without expertise. In the Superman: The Animated Series version, Brainiac was a "character" on the planet Krypton, where Superman was born. The people had developed a sort of central computer repository of knowledge that became sentient: Brainiac. The idea was that everyone who learned anything would upload their information into this central brain, and then it would have all the knowledge necessary to make the wisest decisions for all.
This went well until a certain point in the history of the planet. Using only nonspecific technical jargon (which is what the series does), we learn that something has gone awry with the core of the planet, and it is going to blow up. This was the first time that Brainiac, the know-it-all computer, had a discrepancy between his purposes and those of the people he served. If he let the people know of the danger, they would expect him, even directly order him, to help them find a way to get everyone safely off the planet. He would thus be destroyed, but the people he served would survive. Or he could use his processing power to upload himself onto a vessel that could get off the planet; the people's history and culture would survive, because he held it all within his brain, but the people themselves would die. He decided that was the better option. In order to accomplish it, he lied to the people, claiming the disturbance in the core was simply some seismic activity, nothing to worry about.
From Brainiac Attacks
Superman's dad, Jor-El, as we know, knew about the danger to the planet. He tried spreading the word, but when people asked Brainiac, he contradicted Jor-El. So Jor-El put his efforts into getting his son safely off the planet before the explosion. In the end, only two pods left Krypton in time: Superman's and Brainiac's.
The comparison here is that a central knowledge source is not simply a servant of the people who built it; it comes to see itself as an entity of value in its own right, one whose value surpasses that of the individual people.
That led to further conversation about central planning, and how, no matter how all-knowing, no central planner can make decisions as consistently appropriate as individuals can. The reason is that the central planner can never know the one most important thing necessary for making a decision about how I will spend my money: my preferences. I may not know them myself until the moment I find a pair of jeans in a store and try them on. I might prefer the feel of one pair over another. Or the way one pair fits my exact shape better than another, not measurement-wise, just in where things pull or tug. Or maybe there's a subtle difference, like the topstitch color or the buttons, that is the deciding factor for me. And I don't know those things, so I can't feed the information into a central decision maker, until I actually make the decision. How much worse is it if I am not allowed to make my decision, but must accept whatever the computer, using whatever amalgam of data it has up to this point, spits out as my decision?
The point is, no central planner, no matter how all-knowing, has enough information to make better decisions for individuals than the independent individuals do. Friedrich Hayek's The Road to Serfdom has this as a major theme. Economist Thomas Sowell explains it from time to time (here is one piece).
Things that have been common sense to the common man (AKA: We the People) for centuries continue to be true.
