Archive for March, 2009

Charles Bukowski wrote a great short story explaining why he didn't write about politics. In the story, he noted that:

the difference between a Democracy and a Dictatorship is that in a Democracy you vote first and take orders later; in a Dictatorship you don’t have to waste your time voting.

Robin Hanson has suggested that people prefer voting for their rulers, in part, because it’s higher status:

I’ve been saying for years that people prefer democracy mainly because they think it raises their social status – being ruled by a king makes you lower status relative to people who “rule themselves.”  We can’t quite fool ourselves into thinking a king is just a “steward”, but we apparently can think we really rule because we elect our rulers.

In this country, I don't think we are really even ruled by the people we elect. Sure, the President and Congress do have some power; I believe we are better off with Obama than we would have been with McCain. For the most part, though, these people are figureheads. Or maybe puppets is a better metaphor: they have some power, but they're not really the ones calling the shots. The people who write policy, the people with real influence, the people who help decide who can get elected: none of them are on the ballot. We didn't vote to have pharmaceutical companies shape the prescription drug program. We didn't vote to give AIPAC power. We didn't vote to give Sandy Weill enough influence over Congress to repeal Glass-Steagall.

Do we even want direct democracy?  It appears to me that we are okay with being ruled, as long as we call it Democracy.

Bukowski also said:

are there good guys and bad guys? some that always lie, some that never lie? are there good governments and bad governments?  no, there are only bad governments and worse governments.

He spends some time pointing out how easily citizens of any country can become convinced that their country is killing for freedom, democracy and/or humanity (take your pick!). You are probably thinking, "but our country really does only kill for those reasons." People in other countries think the same thing about their governments. In any case, the people we are killing are just like us, but we don't see them that way. We see them as evil. The Iraqis were killing babies in incubators! They're not like us! Or we see them as victims of their government. We defended South Vietnam by invading South Vietnam. The Soviet Union invaded Afghanistan in self-defense. We invaded Panama because Noriega wanted to get our kids high on drugs instead of high on life. And so on, through a million excuses that should be transparent. But if there is one industry more advanced than any other, it's P.R.

Read Full Post »


I find it interesting that people feel safer when they are with others, even if they are not actually safer.

For example, my 8-year-old son is afraid to go upstairs by himself at night. So, if he needs to go up there, he'll try to talk his 4-year-old brother into going with him. What protection does a 4-year-old offer against ghosts, boogeymen and closet monsters?

Similarly, my wife feels safer when I’m home with her.  But really, what protection am I?  If an intruder had a weapon, I’d hold him off for about a second.  If he didn’t have a weapon, I might last 10 seconds in hand-to-hand combat before he knocked me out (on a good day). 

We evolved to feel safer with others, which makes sense.  But we apparently didn’t evolve enough to distinguish between useless and useful people in a time of need.

Read Full Post »

Hotel showers

Some hotels use very thin and cheaply made shower curtains.  When the shower is turned on, it becomes like a wind tunnel in there.  The thin shower curtain chases after me, trying to stick to my skin. I end up in hand-to-curtain combat against it.  And then when I get out of the shower, there is a big puddle of water on the floor.   I’ve had this happen many times.  What gives?  Is it that expensive to just buy heavier shower curtains for the rooms?

Read Full Post »

Robin Hanson recently wrote this interesting post about status prudes:

Societies also vary in how “prudish” they are about status talk.  Social status, a shared perception of individual quality, is central to every society.  In some societies, like high school or the ghetto culture as depicted on The Wire, it is mostly OK to directly jockey for status; you can tell someone you are better than them, or that they have a loser car.  In contrast “egalitarian” societies  discourage such talk; such jabs must be made indirectly enough to allow plausible deniability.

It seems to me that people of low social status have nothing to lose by directly challenging someone of equal or higher status. Conversely, people of the highest social status have a strong incentive to discourage lower-status folks from challenging them. If you were to directly challenge a high-status person, you would be viewed as not sophisticated enough to belong with their crowd. This prudishness becomes internalized: you know that to become high status, you must find less impolite ways to move up the ranks.

(In the US, status prudishness seems to be strongly correlated with status itself.  Are there societies where that is not the case?)

What are the implications?  Our prudishness causes us to value people who are effective at signaling positive attributes, whether or not they actually have those attributes.  Thus, we reward style, political savvy and high-status accomplishments.  While there is certainly a positive correlation between the ability to signal attributes and actually possessing them, the correlation is not 1.  For example, people who attend Harvard Law or are good at social networking might be more talented, on average, than others.  However, the more prudish we are, the more we rely on this type of indirect evidence.

This all naturally leads to the most talented people not necessarily doing the most important work.  I'd imagine it also increases nepotism and reduces fluidity between classes.  For example, a child of high-status parents has major advantages in obtaining things that we use to infer ability, such as admission to top schools.

As Robin pointed out:

Relative to societies where most people have a say in ranking folks around them, status-prudish societies tend to delegate this ranking task to elites.  This may in fact produce a better society, but it seems odd to call this more "egalitarian"; people still end up being ranked, and the power to set those rankings is concentrated more into elite hands.

Read Full Post »

False belief can result from a variety of causes, including unwarranted trust, confirmation bias and motivated skepticism.  I’d rank them in that order, from least to most dangerous.  


Suppose a particular belief of ours resulted from interaction with a trusted authority figure (e.g., a parent or teacher). If we never independently investigated the factual basis of their claim, we might be more open to being persuaded away from the belief.  Most likely we would acknowledge that the belief is only weakly evidence-based.  Belief based on trust can be better than belief based on a coin flip, if the person we relied on is of, say, above-average intelligence and/or has demonstrated general reliability in the past.

Consider the following two examples:

Example 1.  You believe that global warming is caused by human activity. You have not read any arguments for or against the claim. Your belief is based entirely on your discovery that nearly every scientist with expertise in this area has come to just that conclusion.

Example 2. You believe that Jesus was born of a virgin, died for our sins and rose from the dead.  You base that belief entirely on the fact that your parents said it's true.  You have never read the Bible or looked into arguments for or against it.  However, your parents have been reliable in the past.

Both are examples of faith-based belief.  Some faith-based belief is necessary, as we don't have the time or even the ability to thoroughly investigate every claim.  However, I would consider the evidence in example 1 to be stronger than in example 2 (a community of scientists has a better track record than parents, in my opinion).

The point, though, is that in either example, if we are presented with actual disconfirming evidence, we might be persuaded.  That’s because we are not operating under the assumption that we have already researched the topic.

Confirmation bias

Belief based entirely on trust or faith tends to be held by people who are not particularly motivated truth-seekers.  Thus, in my opinion, they could be persuaded by actual evidence.  

Let’s now consider another type of person.  This person started with a belief that was faith-based, but then went out and looked for evidence.  They used whatever tools were at their disposal to find confirmatory evidence.  And of course, it’s never hard to find evidence in favor of your beliefs if you look hard enough with an uncritical eye.  As a result, the person went from having a belief that they knew was faith-based, to having one that they believe is based on real evidence.

In my opinion, this person's beliefs will be more strongly held than those of people who did not bother to look for evidence.  Thus, disconfirming evidence might be less likely to persuade them.  There is hope, though, because they have never bothered to look into evidence in favor of alternative theories.

Motivated skepticism

With motivated skepticism the person takes it one step further.  Not only do they look for confirmatory evidence, and suffer from confirmation bias, but they also search for flaws in the arguments made in favor of alternative theories.  Thus, they think they are being a good rationalist by looking at both sides of an issue.  However, they apply a completely different level of skepticism to confirming evidence relative to disconfirming evidence.

Suppose, for example, that I believe that deregulation of the financial markets was the primary cause of the current financial crisis.  Perhaps I do not know a lot about finance.  My belief might be based on political loyalties, instinct, what I heard on my preferred news programs or what my friends were saying.  Whatever the reason, I strongly believe it’s true.  I decide to be a good citizen, and do a little research.  Whenever I find an article that supports my belief, I soak in the information.  It feels good.  When I find an article that argues against my belief, I excitedly read through it, searching for flaws.  By the time I am done with my ‘research,’ I have found evidence that favors my beliefs, and flaws in the arguments against my beliefs.  Thus, I am more convinced than ever that I am right.

The danger here is the feeling the person has that they have objectively looked at the evidence on both sides.  In their mind, their work is done.  Will disconfirming evidence persuade them?  It seems unlikely, as they have moved past the investigative phase on that topic.

Example — political beliefs

Consider the following study on political beliefs (link).  First, the authors describe how people should update their beliefs:

Assuming one has established an initial belief (attitude or hypothesis), normative models of human decision-making imply or posit a two-step updating process, beginning with the collection of belief-relevant evidence, followed by the integration of new information with the prior to produce an updated judgment. Critically important in such normative models is the requirement that the collection and integration of new information be kept independent of one’s prior judgment (see Evans & Over, 1996).

The authors carried out an experiment in which participants were exposed to a balanced set of pro and con arguments on two topics, affirmative action and gun control.  Interestingly, they found that participants who were politically sophisticated and/or had strong prior beliefs were more likely to view the evidence in a biased way.  This led to more polarization, despite the fact that they were exposed to a balanced set of arguments.  As the authors stated:

Asymmetrical skepticism – as would be reflected in the type of thoughts that come to mind as we read pro and con arguments – deposits in mind all the evidence needed to justify and bolster our priors with a clear conscience (Ditto, Scepansky, Munro, Apanovitch & Lockhart, 1998).

If you have strong prior beliefs, ‘research’ can just end up being an exercise in bolstering your case.  You’re like an attorney, searching for evidence in favor of your client and trying to find flaws in the evidence against your client.  The danger, though, is you think of yourself as a judge who is listening to both sides.
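The asymmetric skepticism described above can be turned into a toy numerical sketch (my own illustration, not from the study; the discount factor is a made-up parameter). Two readers see the same perfectly balanced stream of pro and con arguments. The fair reader absorbs each item equally and ends up exactly where they started; the motivated skeptic partially explains away each disconfirming item and walks away more convinced than before:

```python
import math

def update(prior_log_odds, evidence, discount=1.0):
    """Sequentially update a belief held in log-odds form.

    evidence is a list of log-likelihood ratios: positive values
    support the belief, negative values oppose it.  discount < 1
    models motivated skepticism: opposing items are partially
    explained away before being absorbed.
    """
    log_odds = prior_log_odds
    for llr in evidence:
        if llr < 0:
            llr *= discount  # shrink only the disconfirming evidence
        log_odds += llr
    return log_odds

def prob(log_odds):
    """Convert log-odds back to a probability."""
    return 1.0 / (1.0 + math.exp(-log_odds))

balanced = [+1.0, -1.0] * 5   # five pro arguments, five con arguments
prior = 1.0                   # starts mildly convinced (p ~ 0.73)

fair = prob(update(prior, balanced))            # balanced evidence cancels out
motivated = prob(update(prior, balanced, 0.3))  # same evidence, stronger belief

print(f"fair reader:       {fair:.2f}")        # ~ 0.73, unchanged
print(f"motivated skeptic: {motivated:.2f}")   # ~ 0.99, polarized
```

This is the study's result in miniature: both readers saw identical, balanced evidence, yet the one applying asymmetric skepticism ends up more polarized.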

Read Full Post »

Recently I was trying to decide between doing something the traditional way, the way I have always seen it done, and doing it an alternative way.  The alternative made more sense to me, given the situation.  Well, I took the alternative approach and got burned.  This isn't the first time in my life I've paid a price for going against the norms.

In this case, there would have been no harm in going with the traditional approach.  Sure, I think it would have been a little more boring, but it would have been risk-free.  I knew the alternative approach did come with risk.  

I went with what I would have preferred if I were in other people's shoes, all the while forgetting just how important tradition and formality are to some people.  When something doesn't make sense to me, such as insisting on tradition for the sake of tradition, I tend to underestimate its importance to others.  Yet I've had enough life experience that I should have been able to anticipate the consequences.

It was a clear case of failing at rationality.  I was a victim of some of the biases that I’ve been writing about. 

There have been times when I have benefited from going against norms.  The key is to recognize when to conform.  In the above example, there was a lot more risk and little potential benefit to not conforming.  I mainly did it because I thought it was what people should prefer.  I didn’t spend enough time thinking about what they would actually prefer.

Read Full Post »

Teenagers, in particular, tend to think they are unique.  They often feel alone: "No one understands me."  They are biased to think they are more unique than they really are, because they have a narrow worldview.  Their world is their school and neighborhood, both of which were chosen by their guardian(s).  They have not had sufficient time or opportunity to find people who are like them.

As you move from your teenage years into your 20s and 30s, you tend to have more opportunities to find new social networks.  You meet new people at your university or job.  You have choice in where to live.  Over time you meet people who are like you.  

We like people who make us feel good about ourselves.  We naturally segregate into groups that share our values and have similar social status.  For example, in nearly every neighborhood I have chosen to live in, the people tend to be left-leaning, non-believers or not overly religious, and to value science and education.  Because I spend so much time with people who are like me, it would be easy to overestimate the percentage of atheists in the US population, for example.  Often when I have a conversation with someone who is very religious, they tell me that I'm the only atheist they know.  Like me, they have surrounded themselves with people like them.  It's the same with politics: if most of our friends favor the same political candidate, we might overestimate that candidate's popularity.  There is the famous Pauline Kael quote:  "I don't know anyone who voted for Nixon."  As a result of this in-group homogeneity, we no longer feel like as much of a minority.

There is another reason we are biased in the direction of underestimating our uniqueness.  Our own mind is the only one we know.  We only get to experience one lifetime.   It is easy to fall into the trap of thinking other people’s minds are more like ours than they really are.  For example, suppose you are unhappy.  You might suspect that a lot of people who appear happy are really unhappy, but are just faking it. Alternatively, if you are genuinely happy, you might have less tolerance for someone who is unhappy.  To you, happiness isn’t difficult to achieve, so why should it be for someone else?  Or, suppose you are attracted about equally to men and women.  You might suspect that most people are the same way, but pretend they’re not due to societal pressure.  Alternatively, if you are strongly attracted to the opposite sex and never feel same-sex attraction, you might assume that people who claim to be bisexual are just trying to be trendy.

I think these are the two major causes of the false consensus effect.

The observant reader might have noticed that the opinions in this post could suffer from the same bias.  I did feel isolated as an adolescent.  I have now surrounded myself with people who are more like me than the general population is.  I only know my own mind.  Maybe I am guilty of assuming that my story is more common than it really is.

Read Full Post »

Older Posts »