Bayesian Truth Serum: A Simple Trick To Catch Liars

Have you told a lie yet today?  Did you just lie about your answer to that question?  According to this study, the average number of lies consciously told per day by participants was 1.65.  This number seems low to me, likely because people lie about how often they lie in these studies.  Regardless, my point is simply that myriad lies are hurled at you on a daily basis (and this doesn’t include what Harry Frankfurt calls “bullshit” either).

Relax, all this talk about lying isn’t meant to get you all worked up.  Lying and sophistry are natural and perfectly normal behaviors for socially intelligent simians.  It’s theorized that our ability to lie evolved as an important part of how we communicate with each other.  Most of us learn how to lie at a very early age and, according to this article, “lying is actually proof of cognitive development”.

Whatever the reason, I suspect that I have a proclivity to tell the truth more often than I should.  Even so, I think one of the worst pieces of advice I’ve ever heard is the following: don’t ever tell a lie.  If a female has ever asked you if she looks fat in a certain pair of pants you already intuitively know why.  Lying is not only acceptable in some situations, it can even be the right thing to do.  This is partly why I can’t classify myself as a deontologist.  I’m not sure of much, but when a female asks “Do I look fat in these pants?” I’m pretty confident that lying is the right thing to do.

With that being said, I still think it’s a valuable skill to know when you are being lied to.  In fact, sometimes it can be incredibly difficult to sniff out lies.  Fortunately, there is a tool to help improve your ability to do this in some domains.  I was recently reading Discover Your Inner Economist and the author, economist Tyler Cowen, mentioned something called Bayesian truth serum (“Bayesian” may be a stretch here, but it sure sounds cool!).  In essence, this means that to get a more truthful answer from someone it is more useful to ask for their opinion on what they think other people do than it is to ask them what they do.

For example, one is more likely to home in on the truth by asking a person “How many sexual partners do you think the average man/woman has had?” rather than “How many sexual partners have you had?”.  The reason is the availability heuristic.  If a single man were asked the latter question by an attractive woman, he’s likely to quickly concoct a lie to make himself look better to her.  However, if she asked him the first question instead, he would use the mental shortcut of referencing his own number.  Interestingly, he will likely think his own number is representative of the average person (even if it isn’t).

Who needs a lie detector when you have Bayesian truth serum?


5 Comments on “Bayesian Truth Serum: A Simple Trick To Catch Liars”

  1. Cool trick. BTW, I like Tyler’s contributions to the EconTalk podcast. I’ve added that book to my reading list. 

  2. Interesting. I remember reading something similar. During World War 2, the US Army created a statistical test to find out how many people were drug users. The test involved them answering questions, and worked even if people lied.

    But a quick Google search can’t find it. Maybe you know about it Greg?

    • Greg Linster says:

      I’m sorry to say that I don’t.  Perhaps they would have gotten a reasonably accurate answer if they asked the soldiers to tell them how often they thought the average soldier uses drugs.

      •  I read it in a book a long time ago, and I think it went something like this.

        The army asked all the soldiers to answer a question, in a private place, about whether they had ever taken any drugs. They had to tick yes or no.

        Now they knew everyone would lie, even if there was no way to identify the individual soldiers, so they introduced a twist. Each soldier was given a coin, which they had to toss.  If it came up heads, they had to answer the truth. If tails, they had to lie.  The coin toss could be seen by no one but the soldier.

        This means they could safely tick the “yes, I have taken drugs” box, as the coin told them to lie.

        Now, statistically, if no one had been taking drugs, 50% of the final result would be positive for drug use (those who got tails), and 50% would be negative (those who got heads).

        But in practice, it was something like 60-40, which meant about 10% of soldiers had been doing drugs. By using this slightly roundabout method, they got the soldiers to tell the truth, without fear of individual reprisal.
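
        What this comment describes from memory is known as the randomized response technique (originally due to Stanley Warner).  One caveat worth hedging on: with a fair coin, the “tails means lie” version as recalled actually cancels out, since users’ lies and non-users’ lies push the yes-rate back to 50% regardless of how many users there are.  A common workable variant instead forces a “yes” on tails, so P(yes) = 0.5·p + 0.5, and prevalence p can be recovered from the observed yes-rate (under that variant a 60/40 split would imply roughly 20% users).  A minimal simulation sketch, with all names and parameters hypothetical:

        ```python
        import random

        def forced_response_survey(n, true_prevalence, seed=42):
            """Simulate the forced-response variant of randomized response:
            heads -> answer truthfully; tails -> tick "yes" regardless."""
            rng = random.Random(seed)
            yes_count = 0
            for _ in range(n):
                is_user = rng.random() < true_prevalence
                heads = rng.random() < 0.5
                # On tails the coin forces a "yes", giving every respondent
                # plausible deniability for ticking that box.
                answered_yes = is_user if heads else True
                yes_count += answered_yes
            return yes_count / n

        def estimate_prevalence(yes_rate):
            """Invert P(yes) = 0.5 * p + 0.5 to recover the prevalence p."""
            return 2 * (yes_rate - 0.5)

        yes_rate = forced_response_survey(100_000, true_prevalence=0.10)
        print(f"observed yes rate: {yes_rate:.3f}")
        print(f"estimated prevalence: {estimate_prevalence(yes_rate):.3f}")
        ```

        No individual answer reveals anything, yet the aggregate yes-rate pins down the group prevalence — the same privacy-for-honesty trade the comment describes.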

  3. Interesting. Thanks for the serum.
