Are We Amusing Ourselves to Death?

Neil Postman. Amusing Ourselves to Death. Penguin, 1985.

Many great thinkers, including the recently deceased historian Jacques Barzun, have argued that the sun is setting on the West, meaning that the great American empire is in decline. Whether or not this is true is, of course, debatable. Spend enough time in any major American city, though, and I suspect you'll notice the cultural decay, or at least that people are so distracted by their phones that they have a hard time noticing the people around them. Perhaps I'm simply becoming crotchety as I age and this is just another personal rendition of the golden age fallacy. [1] Then again, perhaps not.

The late cultural critic Neil Postman was one of these thinkers who saw the cultural decay in America. In Amusing Ourselves to Death, Postman indicts television as the root cause of this decay. The book was published in 1985, long before the advent of tablets and smartphones, but many of its prescient arguments still apply today. Essentially, Postman argues that television is a medium that transforms our relationship with each other, and with our ability to obtain knowledge about the world, for the worse. His belief is that writing is a medium suitable for rational discourse, whereas television is a medium suited almost purely for entertainment. Given the current state of global media it's hard to disagree, but fortunately for the human race the issue is not quite as black and white as Postman makes it sound.

+++

Near the beginning of the book, Postman reminds us that two prominent writers of the 20th century, George Orwell and Aldous Huxley, prophesied about the state of human affairs in the distant future. Postman also points out that, despite often being associated with envisioning the same thing, the two thinkers' visions couldn't have been more different. He eloquently captures the difference between their visions with the following passage, which is worth quoting at length:

What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one. Orwell feared those who would deprive us of information. Huxley feared those who would give us so much that we would be reduced to passivity and egoism. Orwell feared that the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of irrelevance. Orwell feared we would become a captive culture. Huxley feared we would become a trivial culture, preoccupied with some equivalent of the feelies, the orgy porgy, and the centrifugal bumblepuppy. As Huxley remarked in Brave New World Revisited, the civil libertarians and rationalists who are ever on the alert to oppose tyranny ‘failed to take into account man’s almost infinite appetite for distractions.’ In 1984, Huxley added, people are controlled by inflicting pain. In Brave New World, they are controlled by inflicting pleasure. In short, Orwell feared that what we hate will ruin us. Huxley feared that what we love will ruin us.

Before you accuse Postman of snobbery or elitism, remember that he is not arguing that television is inherently bad, but rather that it's inherently bad for public discourse. The mind-numbing, yet often highly entertaining, shows on television are not a threat to our political process, but news programs like CNN and Fox News are. The reason Postman believes this is that news programming is about creating entertainment and making money, not about exploring the truth. Talking heads, ostensibly discussing some serious matter, will be interrupted with advertisements on virtually any news channel you can find. Is no issue important enough to be spared the commercial break?

If you don’t believe news is about entertainment — and not truth — then consider the following anecdote Postman notes in the book involving a news anchor in Kansas.  This particular female news anchor Postman speaks of was fired “because research indicated that her appearance ‘hampered viewer acceptance'”.  I bet this happens more than most of us realize.  Come to think of it, can you name a single horribly unattractive news anchor?  I can’t, and the reason is because politics has become like show business, and “If politics is like show business,” writes Postman, “then the idea is not to pursue excellence, clarity or honesty but to appear as if you are, which is another matter altogether.”

While I’m largely sympathetic to Postman’s critique of television as a poor medium for public discourse, I don’t believe it applies to computers and the Internet as well as it applies to television. Postman wrote this book before the explosion of the Internet, which is too bad, because I’m sure he’d have plenty of interesting things to say about the Internet as well.  If he were still alive perhaps he’d view, as I do, the rise of the Internet as a positive thing — perhaps his outlook on the future of human intelligence and discourse might not have been so grim.

Notes:

[1] See my essay: “Longing for the Human Golden Age”


Book Review: Too Big to Know

How do you know what you think you know? What counts as knowledge and what doesn't? These questions speak to a great semantic problem, namely, trying to define what 'knowledge' is. Studying the nature of knowledge falls within the domain of a branch of philosophy called epistemology, which is largely the subject matter of David Weinberger's book Too Big to Know.

According to Weinberger, most of us tend to think that there are certain individuals, called experts, who are knowledgeable about a certain topic and actually possess knowledge of it. Their knowledge and expertise are thought to be derived from their ability to correctly interpret facts, often through some theoretical lens. Today, like facts, experts have become ubiquitous. It seems we are drowning in a world with too many experts and too many facts, or at least that we lack the ability to pick out the true experts and the important facts.

Most of us are appalled, for instance, when we hear the facts about how many people are living in poverty in the United States. However, these facts can be misleading, and most people don't have enough time to think critically about the facts that are hurled at them every day. There might in fact be X number of people living in poverty in the United States, but did you know that someone with a net worth north of one million dollars can technically be living in poverty? How the government defines poverty is very different from the connotation many of us attach to that word. Income is the sole factor used to determine whether one is "living in poverty," but this bit of information seldom accompanies the facts about how many people are "living in poverty."
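
To see how much work the definition is doing, consider a minimal sketch of the point in code. The threshold and households below are invented for illustration (the real federal guidelines vary by household size and year); the only thing the sketch demonstrates is that a count keyed to income alone is blind to wealth:

```python
# Toy poverty count: like the official measure, classification here keys on
# income alone, so a wealthy retiree with little income counts as "poor."
# The $15,000 line and the households are invented purely for illustration.

POVERTY_LINE = 15_000  # made-up stand-in for the official annual-income line

households = [
    {"name": "young family",      "income": 14_000, "net_worth": 5_000},
    {"name": "retired homeowner", "income": 12_000, "net_worth": 1_500_000},
    {"name": "mid-career couple", "income": 80_000, "net_worth": 40_000},
]

for h in households:
    if h["income"] < POVERTY_LINE:
        print(f'{h["name"]}: "living in poverty" '
              f'(income ${h["income"]:,}, net worth ${h["net_worth"]:,})')
```

Both the struggling young family and the millionaire retiree land in the count. Change the definition and the headline number changes with it, which is precisely the context the headline rarely carries.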

I recently posed a question on Facebook asking my subscribers whether a fact could be false. To my surprise, there was much disagreement over this seemingly simple question. Weinberger reminds us that facts were once thought to be the antidote to disagreement, but it seems that the more facts are available to us, the more disagreements we have, even if those disagreements are meta-factual.

It’s unquestionable that today’s digitally literate class of people have more facts at their fingertips than they know what to do with. Is this, however, leading us any closer to Truth? Well, not necessarily. This is because not all facts are created equal, and not all facts are necessarily true. Facts are statements about objective reality that we believe are true. However, while a fact can be false, truth is such regardless of our interpretation of it — we can know facts, but we can’t necessarily know Truth.

In the book, Weinberger draws an important distinction between classic facts and networked facts. The late U.S. Senator Daniel Patrick Moynihan famously said: “Everyone is entitled to his own opinions, but not to his own facts.” What he meant by that was that facts (what Weinberger calls classic facts) were thought to give us a way of settling our disagreements. Networked facts, however, open up into a network of disagreement depending on the context in which they are interpreted. “We have more facts than ever before,” writes Weinberger, “so we can see more convincingly than ever before that facts are not doing the job we hired them for.” This seems to be true even amongst people who use a similar framework and methodology for arriving at their beliefs (e.g., scientists).

One of Weinberger’s central arguments is that the Digital Revolution has allowed us to create a new understanding of what knowledge is and where it resides. Essentially, he claims that the age of experts is over, the facts are no longer the facts (in the classical sense), and knowledge actually resides in our networks. While this is an interesting idea, I’m not sure it’s entirely true.

Knowledge is a strange thing since it depends on the human mind in order to exist. I have a stack of books sitting on my desk, but I don’t point to them and say there is a stack of knowledge sitting on my desk. I simply don’t yet know if there is any knowledge to be gleaned from those books. For this reason, I don’t think knowledge can exist on networks either. Knowledge requires human cognition in order to exist, which means that it only exists in experience, thus giving it this strange ephemeral characteristic. I cannot unload my knowledge and store it anywhere, then retrieve it at a later date. It simply ceases to exist outside of my ability to cognize it.

Knowledge, Weinberger argues, exists in the networks we create, free of cultural and theoretical interpretations. It seems that he is expanding on an idea from Marshall McLuhan, who famously said, "The medium is the message." Is it possible, then, that knowledge is the medium? As I interpret his argument, Weinberger is claiming that the medium also shapes what counts as knowledge. Or, as he himself puts it, "transform the medium by which we develop, preserve, and communicate knowledge, and we transform knowledge." This definition of knowledge is, however, problematic if one agrees that knowledge can only exist in the mind of a human (or comparable) being. To imply that a unified body of knowledge exists "out there" in some objective way, and that human cognition isn't necessary for it to exist, undermines any value the term has historically had. Ultimately, I don't agree with Weinberger's McLuhanesque interpretation that knowledge has this protean characteristic.

In a recent essay in The Atlantic, Nicholas Carr posed the question: "Is Google Making Us Stupid?" His inquiry spawned a flurry of questions pertaining to our intelligence and the Net. Although Weinberger has high hopes for what the Net can do for us, he isn't overly optimistic either. In fact, he claims that it's "incontestable that this is a great time to be stupid" too. The debate over whether the Internet makes us smarter or dumber seems silly to me, though. I cannot help but conclude that it makes some people smarter and some people dumber; it all depends on how it is used. Most of us (myself included) naturally like to congregate in our digital echo chambers and rant about things we think we know (I suspect this is why my provocative "Who Wants to Maintain Clocks?" essay stirred up some controversy; most RS readers don't usually hear these things in their echo chambers).

Weinberger also argues that having too much information isn't a problem but actually a good thing. Again, I disagree. In support of this claim, he builds on Clay Shirky's argument that the ills of information overload are simply filtering problems. I, however, don't see filtering as a panacea, because filtering still consumes the valuable commodity of time. At some point, we have to spend more time filtering than we do learning. An aphorism by Nassim Taleb comes to mind: "To bankrupt a fool, give him information."

Overall, Weinberger does a nice job of discussing the nature of knowledge in the Digital Age, even though I disagree with one of his main points that knowledge exists in a new networked milieu. The book is excellent in the sense that it encourages us to think deeply about the messy nature of epistemology — yes, that’s an opinion and not a fact!

(Cross-posted at Rationally Speaking)


Book Review: Technopoly

The late Neil Postman’s book, Technopoly, is a sobering assessment of a technologically obsessed American culture.  The fact that the book was presciently published in 1992, long before the Internet became ubiquitous, is alarming.  Don’t be fooled though, Postman isn’t a pure Luddite and this isn’t a book that is anti-technology.  Perhaps the best way of putting it is that Postman harbors a sense of digital ambivalence.  Like Postman, I don’t necessarily condemn the technologies themselves per se, although I certainly share some of his concerns.  Technology can complement human values or it can desecrate them.  It all depends on its application.  So how did American culture become a Technopoly?

According to Postman, the technological history of a society can be broken into three phases: tool-using, technocracy, and Technopoly. In a tool-using culture, technology serves merely as a physical tool (think utensils), whereas in a technocracy the tools "play a central role in the thought world of the culture". In a Technopoly, the culture can only be understood through its tools. A Technopoly can thus be thought of as a "totalitarian technocracy". At the time the book was published, Postman claimed that the United States was the only Technopoly in existence (I suspect he would revise that statement if he were alive today).

A Technopoly is a society that thinks knowledge can only be had through numbers, and thus a society that obsessively tries to quantify life and puts excessive trust in experts. It's also a society that believes management is a science. I suspect Postman, if he were still alive, would agree with me that it's the soft technologies that are the most insidious. You know, things like IQ tests, SATs, standardized forms, taxonomies, and opinion polls.

The idea of trying to quantify things like mercy, love, hate, beauty, or creativity simply wouldn't make sense to the likes of Galileo, Shakespeare, or Thomas Jefferson, according to Postman. Yet this is exactly what many of our platonified social scientists try to do today. He goes on to say that, "If it makes sense to us, that is because our minds have been conditioned by the technology of numbers so that we see the world differently than they did." Or, as Marshall McLuhan succinctly put it: "The medium is the message."

So where did this obsessive focus on quantification begin? Postman traces its history back to the first instance of quantitatively grading students' papers, which occurred at Cambridge University in 1792 at the suggestion of a tutor named William Farish. Farish's idea of applying a quantitative value to human thought was crucial to those who believed we could construct a mathematical concept of reality.

So what beliefs emerge from the technological onslaught? Here's one passage that resonated with me:

These include the beliefs that the primary, if not the only, goal of human labor and thought is efficiency; that technical calculation is in all respects superior to human judgment; that in fact human judgment cannot be trusted, because it is plagued by laxity, ambiguity, and unnecessary complexity; that subjectivity is an obstacle to clear thinking; that what cannot be measured either does not exist or is of no value; and that the affairs of citizens are best guided and conducted by experts.

Another modern side effect of Technopoly is information overload, and I think it's fair to say that Postman was disgusted by our obsession with information and statistics. There are statistics and studies to support almost any belief, no matter how nonsensical. Personally, I think Nassim Taleb put it well: "To bankrupt a fool, give him information." Postman stretches a popular adage to drive home the point himself: to a man with a hammer, everything looks like a nail, and therefore, "to a man with a computer, everything looks like data."

Postman reminds us, however, that not all information is created equal. He writes: "Information has become a form of garbage, not only incapable of answering the most fundamental human questions but barely useful in providing coherent direction to the solution of even mundane problems." For example, consider the following noise, which I've made up but which could easily be recited on ESPN: 77% of all Super Bowl games have had at least one field goal scored within the last seven minutes and 27 seconds of the third quarter. Even if this were true, does it really tell us anything useful? If people have an opinion they want verified, they can easily go on the Web and find "statistics" to support their belief. Sadly, there seems to be a market on the Web today not only for useless information, but for harmful information too.
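
Statistics of this sort are also trivially easy to manufacture. Here's a toy sketch of the trick (my own invention, not Postman's): generate purely random game data, then fish through arbitrary time windows until one produces an impressive-sounding percentage:

```python
import random

random.seed(42)  # reproducible fake data

# 46 fake "championship games"; each field goal lands at a random minute.
games = [[random.uniform(0, 60) for _ in range(random.randint(2, 6))]
         for _ in range(46)]

def hits(start, width):
    """Number of games with at least one field goal in the window."""
    return sum(any(start <= fg < start + width for fg in g) for g in games)

# Fish through arbitrary windows and keep whichever sounds most impressive.
start, width = max(((s, w) for s in range(0, 55) for w in (5, 7, 10)),
                   key=lambda sw: hits(*sw))
print(f"{100 * hits(start, width) / len(games):.0f}% of games featured a "
      f"field goal between minute {start} and minute {start + width}!")
```

The resulting number is real arithmetic over the (fake) data, and it still tells us nothing, which is rather Postman's point.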

A Technopoly, according to Postman, also promotes the idea that education is a means to an end, instead of being an end in itself.  He laments the fact that education is now meant to merely train people for employment instead of instilling a purpose and human values in them.

Ultimately, reading this book reminded me that those who don’t learn how to use technology will be used by it.



Book Review: The Price of Everything

You’ve likely heard of the pop economics genre, but did you know that there is econ-fiction too?  Perhaps econ-fiction is a merely a branch of pop economics, but either way, it makes for a great way to teach basic economics to inquiring young (and old) minds alike.  And as far as I’m concerned, Russ Roberts (a professor at George Mason University) writes some of the most powerful didactic fiction about economics around.

His book, The Price of Everything, is a parable that engages readers and nudges them to think deeply about the economic concepts that we encounter in our everyday lives.  The two main characters in the story are Ramon Fernandez, a budding tennis prodigy who is studying at Stanford, and an economics professor named Ruth Lieber.  At one point in the story, Professor Lieber poses the following question: “Don’t you think it’s strange that in America, the country where the greatest economic revolution in history has taken place, the average citizen has no idea why we’re richer?”  I would add that it’s not only strange, it’s very strange.  Attempting to answer this question in intelligible and non-dull terms was (I suspect) Roberts’ impetus for writing this book.

At the beginning of the story, we learn that an earthquake has just rocked the Bay Area. In the wake of the disaster, Ramon and his girlfriend are on a quest to purchase some flashlights. Ever the champion of social justice, Ramon is outraged to find that Big Box (a fictional mega-store) is selling flashlights at double the price of Home Depot, which is fresh out of them. While waiting in line to purchase the pricey flashlights at Big Box, Ramon becomes distraught when he sees a poor woman in line realize that she can't afford baby food and diapers because of the store's post-disaster price hike. She has only $20, but needs $35 worth of food and diapers. Ramon asks: "How could she have known that Big Box would gouge her with doubled prices?" We then learn that Ramon collects money from other store patrons and is soon using a megaphone in front of the store to rile people up about this perceived injustice.

Not long after this debacle at Big Box, we learn there is a planned protest against Big Box on the Stanford campus.  This is where Ramon meets Ruth Lieber, a university provost and economics professor.  They chat about the protest and slowly end up developing a relationship that unfolds throughout the rest of the book.  The economic lessons contained within the book largely play out through their many conversations.

In one such lesson, Ruth explains to Ramon how free-market price signals allow an economy to operate more efficiently than a centrally planned one. She goes on to elucidate (in a Hayekian vein) the knowledge problem in economics, which Friedrich Hayek himself put this way: "The curious task of economics is to demonstrate to men how little they really know about what they imagine they can design."
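
As a toy illustration of that lesson (my own sketch with invented numbers, not anything from the book), consider a fixed post-disaster stock of flashlights. Letting the price rise rations the stock to the buyers who value it most; freezing the price at its pre-disaster level leaves more willing buyers than flashlights:

```python
# Invented numbers: what each would-be buyer will pay for one flashlight.
willingness_to_pay = sorted([5, 8, 10, 12, 15, 20, 25, 30, 40, 60],
                            reverse=True)
STOCK = 4  # only four flashlights left on the shelf

def demand_at(price):
    """How many buyers still want a flashlight at this price."""
    return sum(v >= price for v in willingness_to_pay)

# Market-clearing price: just high enough that demand equals the stock.
clearing_price = willingness_to_pay[STOCK - 1]
print(f"clearing price ${clearing_price}: "
      f"demand = {demand_at(clearing_price)}, stock = {STOCK}")

# Price frozen at its pre-disaster level (an anti-gouging rule): shortage.
capped_price = 10
print(f"capped price ${capped_price}: "
      f"demand = {demand_at(capped_price)}, stock = {STOCK}")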

At one point in the parable, Professor Lieber is in the middle of a discussion with Ramon and we get the sense that she’s feeling a bit rattled by a questioning Ramon.  She collects herself and then shares some profound wisdom about economics by saying the following:  “Oscar Wilde said that a cynic is someone who understands the price of everything and the value of nothing.  Clever people like to say the same thing about economists, as if we were soulless calculators in green eyeshades, obsessed with prices and money.  We’re mercenaries, it is said, weighing costs and benefits down to the last penny.  But economics is not about prices and money.  Economics is about how to get the most out of life.”

If there is only one thing to learn from this book (or about economics in general), I believe it is this last point.  The beauty of Roberts’ writing (and his podcasting) is that his ideas get in your bones. Accordingly, I highly recommend this book to anyone interested in learning about the dismal science, I mean economics.



Book Review: The God Delusion

I’m fascinated by religion and the need we humans have for it. And I suspect that many others are as well. However, as any culturally fluent individual understands (at least tacitly), it’s not exactly the choice topic to discuss at cocktail parties with guests whom you don’t know very well. Unless, of course, you want to risk being perceived as a gauche individual. In a similar fashion to politics (yet more intense), religion has a tendency to escalate tensions between people in a fiery way that usually ends in a nonsensical shouting match.

That religion tends to create controversy is not, however, a sufficient reason to avoid criticizing it, even publicly. I think this is because cultural fashions come and go. Consider this: I suspect at one point in American history it was socially taboo to openly criticize slave owners too. Just imagine if no one had had the courage to rock the boat with slave owners (think "master-slave morality"). Other people's religious beliefs, both intentionally and inadvertently, affect all of our personal lives, often in negative ways. In other words, the foolish beliefs of others can cause very real harm to innocent bystanders. My question is this: should rational people politely respect these types of beliefs?

Richard Dawkin’s book, The God Delusion, is a brave, yet much needed, polemical critique of religion (even moderate religion) for modern day atheists. And parts of it certainly come off as offensive. Offending others, however, is not categorically a thing to avoid doing. In fact, sometimes it’s mandatory! As Daniel Dennett wrote:

I listen to all these complaints about rudeness and intemperateness, and the opinion that I come to is that there is no polite way of asking somebody: have you considered the possibility that your entire life has been devoted to a delusion? But that’s a good question to ask. Of course we should ask that question and of course it’s going to offend people. Tough.

Perhaps you’ve heard a line of reasoning similar to the following: you’re an atheist (?), well then, you must be an immoral heathen too! Even an amateur logician would be able to tell you that this argument is absurd, yet I frequently hear variations of this argument muttered from the mouths of people who are otherwise very intelligent.

Dawkins asserts that atheists can be "happy, balanced, moral, and intellectually fulfilled", and he is absolutely correct. It's not within the scope of this essay to present the scientific evidence against the absurd claim that they can't be; however, I do intend to use this anecdote to make a further claim (and one that Dawkins promotes too): the current fashion in America is to view atheism as a dirty word with nasty connotations.

In America, it’s acceptable to say you reject the gods and tenets of certain religions, but heaven forbid you call yourself an “atheist”. The problem, however, is that there is some confusion over what atheism really is, because there are at least two ways to think about atheism.

Here is how Wikipedia defines atheism:

Atheism is, in a broad sense, the rejection of belief in the existence of deities. In a narrower sense, atheism is specifically the position that there are no deities.

So in the first sense, almost every human is an atheist with respect to some gods. In the second sense, atheism is a word for the dogmatic belief in something we cannot yet prove with empirical evidence or logic. I believe that atheists (at least intelligent ones) use the term in the broad sense when discussing religion. I concede, however, that when the term is used dogmatically, in the second sense, it can be as offensive as religion itself. Allow me, then, to reveal my personal bias and religious views, since they certainly influence what I'm writing in this essay: I am committed to the first kind of atheism. There are many fashionable euphemisms to describe the type of atheist I am, but if I must use one, I prefer "possibilian".

Dawkins wittily elucidates the first type of atheism in the following passage: "I have found it an amusing strategy, when asked whether I am an atheist, to point out that the questioner is also an atheist when considering Zeus, Apollo, Amon Ra, Mithras, Baal, Thor, Wotan, the Golden Calf and the Flying Spaghetti Monster. I just go one god further." In other words, if one is willing to reject any of these gods, then that individual is an atheist in at least one sense of the word. And what methodology does one use to reject these types of gods? Shouldn't the same methodology be used to evaluate all gods? I'll leave it to the reader to ruminate over the logical implications of this point.

Along with atheism, "God" itself is a confusing word that can have subtle but distinctly different meanings. To some children (and, sadly, to some adults too), God is a Colonel Sanders-looking figure who sits on a cloud and interferes in our lives (by helping people win football games, for example). But to renowned scientific figures like Albert Einstein and Stephen Hawking, the word God is often used to describe the supreme mystery that science cannot yet (or never will) explain.

As such, it’s entirely disingenuous to equate these two usages as equal.  A theist’s version of God is much different than, say, a deist’s or a pantheist’s. When scientific geniuses, like Einstein, used the word “God”, then, it does not necessarily imply that they meant it in the same context that today’s theists would have us believe.  Although, theists obviously have an incentive to have us believe otherwise in order to support their goofy beliefs.

According to Dawkins, Einstein didn't believe in a personal God, though there is reason to believe that Einstein may have been an agnostic. In any case, when Einstein said or wrote things like "God does not play dice", he was not speaking literally. Religious texts run deep in most people's literary background, and, as such, we often use religious terminology metaphorically and poetically. Even atheists of the dogmatic variety inadvertently do this on occasion.

I think many critics of Dawkins, especially those who haven't read his work, unfairly accuse him of attacking all notions of "God". He's not. In fact, he makes it explicitly clear that he is out to attack the theists' version of God and not Einsteinian versions of God. And as I alluded to earlier, if one is already willing to reject some gods based on lack of evidence or logic, why not follow the same methodological and logical implications and reject theistic notions of God as well?

Strangely, many people are quick to protect the sacred irrationality of religion, again, particularly in America. I have my own theory as to why this is, and it relates closely to Argumentative Theory. My reasoning goes like this: arrogance affronts us, even if we know the arrogant person is correct (actually, I think it affronts us more in people who are correct). And we humans have this fascinating self-preservation mechanism that often lets our ego get in the way of searching for truth. In other words, we'd rather win than be right. As such, I suspect this is why many open- and liberal-minded people come to the defense of religion (even if they don't agree with it) and dislike the militancy they perceive in people like Dawkins. They're simply looking to argue with someone and want to defend the religious, whom they likely feel are ill-equipped to defend themselves in argumentation.

The trouble with this sort of defensive passivity towards religion, though, is that real harm is inflicted on real people because of foolish and irrational beliefs (read some history books for examples). Religion can be (and has been) used to justify a lot of harmful things (again, the history books are littered with examples). If we allow ourselves to be deferential to some absurd faith-based claims, then how can we shun others that are equally absurd but culturally different? For instance, some religions hold that illegal narcotics allow their practitioners to come into contact with their God. Should we make a legal exception for these individuals? How about for prayer in schools? How can the open-minded person deny these individuals their right to worship their God wherever and whenever they choose? For this reason, I think Dawkins makes a compelling case for actively fighting against religion's influence in places where it doesn't belong, namely public places.

In the end, Dawkins reminded me that religion stems from the uniquely human and narcissistic desire to believe that the universe was created for our benefit. News flash: it wasn't. Science tells us that we are incidental and accidental. Not surprisingly, this thought makes some people uncomfortable. The point of science, though, isn't to make people comfortable; it's to discover truth. Personally, I think the great Umberto Eco offers some consoling words to those who embrace their atheism: "When men stop believing in God, it isn't that they then believe in nothing: they believe in everything."



Book Review: On Bullshit

One problem with the world, particularly in public life, is that we encourage people to spew bullshit from their mouths. Now, if you're anything like me, you probably have a notion in your mind of what bullshit is, but you probably haven't ruminated over the word's semantic nuance. Bullshit is one of those things you most likely think you understand until you're asked to define it; at least, that was the case for me. So what exactly is bullshit anyway?

The answer to this question is largely the subject of Harry G. Frankfurt’s philosophical essay titled On Bullshit. The book is fairly short (I read it twice in the same week). Essentially, when someone isn’t concerned with the truth value of what they say, they’re spewing bullshit out of their mouth. Or as Frankfurt puts it: “It is just this lack of connection to a concern with truth — this indifference to how things really are — that I regard as of the essence of bullshit.”

Bullshitting, then, is different from lying because lying, according to Frankfurt, requires the intention to deceive, which means the person telling the lie knows that what they are saying is false. In other words, a liar unequivocally believes they know the truth and tries to deceive based on that knowledge. It is this intention to deceive that distinguishes lying from bullshitting. It's important to note, though, that bullshit can be either true or false. That is, a bullshitter may inadvertently spew truth out of their mouth too.

The following passage nicely elaborates on this distinction between lying and bullshitting:

It is impossible for someone to lie unless he thinks he knows the truth. Producing bullshit requires no such conviction. A person who lies is thereby responding to the truth, and he is to that extent respectful of it. When an honest man speaks, he says only what he believes to be true; and for the liar, it is correspondingly indispensable that he considers his statements to be false. For the bullshitter, however, all these bets are off: he is neither on the side of the true nor on the side of the false. His eye is not on the facts at all, as the eyes of the honest man and of the liar are, except insofar as they may be pertinent to his interest in getting away with what he says. He does not care whether the things he says describe reality correctly. He just picks them out, or makes them up, to suit his purpose.

Like me, you’ve probably noticed that bullshit is ubiquitous. And one of the most interesting things about bullshit is that every one of us has bullshitted at some point in time. I suspect, as does Frankfurt, that this is because most of us have been pressured to state an opinion about something in which we are perhaps somewhat knowledgeable, but also to some degree ignorant. Bullshit, then, flows from our mouths when we step beyond the realm of our knowledge and into that murky area where our thoughts are littered with contradictions we have not been able to reconcile. As such, we generally cling to hopeful thoughts and envision a world that we want to see, false as that vision may be.

In my opinion, you’re virtually always going to hear bullshit when people discuss politics, economics, or religion. I suspect this is because most people are more interested in pleasantries than truth. In instances where you suspect bullshitting, it’s important to think deeply about the motivations of the person you are conversing with. Is this person a liar or are they just talking out of their ass? I usually suspect the latter first. As such, when it comes to bad public policy, I think most harm is caused by foolish thinking from people who want to do good, but inadvertently cause harm. In other words, I suspect that there are far more bullshitters than liars in the world.

Sadly, in public life, it seems that the three aforementioned subjects have been hijacked by people who aren't really interested in truth, even if they aren't liars per se. Personally, I think this can have disastrous consequences for our collective well-being. You've probably noticed this in conversations you've had as well. For some strange reason, most people feel compelled to share their opinions about these subjects. A lack of knowledge certainly doesn't deter most people from professing strong opinions, and neither does a lack of deep and critical thinking about the topics. If only it were more acceptable to simply say "I don't know" when confronted with the limits of one's knowledge!

The ability to bullshit, I suspect largely for evolutionary reasons, comes naturally to all humans. This book, however, reminded me that I need to do my part in recognizing and limiting the amount of bullshit spewed out of my own mouth. I plan to revisit this essay in the future, when my tolerance for bullshit has once again waned.



Book Review: Why Read?

Why do we go to school? Is it for the credentials? Is it to learn some technical skills? Is it to learn how to live better? Is it for some other reason? Or is it for some combination of reasons? In a wonderful and emotionally charged essay in the Oxford American titled "Who Are You and What Are You Doing Here?", Mark Edmundson challenges readers to think about what an education really ought to provide us. Education is not immune to economic analysis, but before running a cost-benefit analysis, it's of utmost importance that we tally up all the costs and benefits, both financial and non-financial. My views on the topic of education have slowly evolved over time, which is one of the beauties of writing a blog! Anyway, Edmundson's essay was my first introduction to his work, and it prompted me to read his book Why Read?

In essence, Why Read? is a book about the importance of literature and, more generally, a liberal-style education. It seems that in the modern era many people find a liberal education increasingly irrelevant compared with learning technical skills. I suspect this is often because the financial return on an education rooted in the liberal arts pales in comparison to the financial return that can come from studying, say, finance. What are those poor literary types and pedantic souls who enjoy reading literature, history, and philosophy really good for anyway? Edmundson asserts, and I agree, that professors in the humanities in particular fulfill a very important societal role because, if they do their job well, they help teach people how to live. And what is more important than learning how to live?

For many people, learning how to live has traditionally been the responsibility of religious teachers. That, however, needn't be true. Books, the ones studied in literature courses anyway, can be thought of as secular bibles, according to Edmundson. "I think that the purpose of a liberal arts education", writes Edmundson, "is to give people an enhanced opportunity to decide how they should live their lives." Admittedly, religion often does this, but so too can a secular education (and often without many of the dogmatic ills). I'll acknowledge that Jesus can teach us lessons about how to live, but let's not forget that Socrates can too! And some of the similarities between these two characters are uncanny. Why not treat both of them as the historically significant philosophers they were, without deifying either of them? Edmundson ultimately suggests that literature can be a substitute for religion, although he very carefully refrains from attacking religion.

While I largely enjoyed the book, Why Read? was not without its flaws. On several occasions, Edmundson dances around the topics of truth and Truth. First, consider the following passage:

What am I asking when I ask of a major work (for only major works will sustain this question) whether it is true is quite simply this: Can you live it? Can you put it into action? Can you speak–or adapt–the language of this work, use it to talk to both yourself and others so as to live better? Is this work desirable as a source of belief? Or at the very least, can it influence your existing beliefs in consequential ways? Can it make a difference?

I think it’s fair to concede that there are many different routes that may help us converge on Truth. Edmundson, however, seemed weary of denying that Truth exists at all in the book. While it may be fun to play around with semantics so that you can avoid offending anyone, I think it’s intellectually dishonest to avoid calling verifiably false claims in religious texts myths. At several times throughout the book, I got the impression that Edmundson thought truth was relative or that Truth doesn’t exist. Do works like the Bible and the Koran belong with works of historical non-fiction or do they belong in the same category as Homer’s Illiad? I think the answer is obvious, but many academics are too scared of offending people who perhaps desperately need to be offended. Many of us, myself included, would likely stand to learn a great deal about the world and ourselves, if only we aren’t so afraid of being offended.

Arguably, one of the most important benefits that comes from studying philosophy in particular is that it helps us discern what is and isn’t important in this one life that we are given. Without some philosophizing, one runs the great danger of wasting a lifetime on things that aren’t really valuable. For example, one may be really good at their job and make a lot of money. However, that doesn’t really matter if the job isn’t important and if money ultimately fails to satisfy, right?

As most people who get involved in education realize, souls can be won or lost within the confines of the academy. Edmundson thinks a liberal education is required in order to promote democratic humanism, and I agree with him. Plato, however, scorned democracy, and whether or not democracy is the best form of government is something I think needs to be continually debated in philosophy classes. The debate over pedagogy shows no signs of going away anytime soon either, and Why Read? is an important book for those interested in that discussion.



Book Reviews: Rush and The Darwin Economy

In this essay, I experiment with weaving together the ideas from two different but related books in one review.

***

A key insight from the work of Adam Smith is that specialization in an economy makes us all better off in absolute terms. While this is undoubtedly true in material terms, it ignores the psychological costs that can come from hyper-specialization. And Smith himself was well aware of these psychological costs. In fact, he believed that the division of labor taken to an extreme can turn people into savage creatures “as stupid and ignorant as it is possible for a human being to be”.

In absolute terms, it's an amazing time to be alive. Virtually every measure of progress we humans have concocted supports this conclusion. Many poor Americans today live a more materially abundant life than the richest of the rich in other societies just a few centuries ago. Yet, as Gregg Easterbrook pointed out in The Progress Paradox back in 2004, even as things get objectively better on almost every level, people feel worse and more unhappy. And, in 2011, Matt Ridley reminded us in The Rational Optimist that there is damn good reason to be optimistic about the future too. So where does all the unhappiness and pessimism come from?

The answer, I think, stems from an insight that can be gleaned from Charles Darwin. In many domains, absolute position doesn't matter as much as relative position. Consider the following example: would you rather live in a 2,500 sq. ft. house in a neighborhood where the other houses are all less than 2,000 sq. ft., or in a 3,000 sq. ft. house in a neighborhood where the other houses are all 4,000+ sq. ft.? Any rational agent should choose the latter scenario, because a 3,000 sq. ft. house is obviously better than a 2,500 sq. ft. house (assuming, of course, the houses are equal in all other respects). Yet, in study after study, most people show a preference for the former option.
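
One way to make sense of that result is a utility function that weights relative standing alongside absolute size. The sketch below is purely illustrative (the functional form and weights are mine, not anything from the studies or the books), but it shows how even a modest taste for relative status flips the ranking:

```python
# Toy utility: absolute comfort plus standing relative to the neighbors.
# The weights are invented; the point is only the direction of the flip.

def utility(own_sqft, neighbors_sqft, w_abs=1.0, w_rel=1.5):
    absolute = own_sqft / 1000                     # comfort from size itself
    relative = (own_sqft - neighbors_sqft) / 1000  # standing vs. the Joneses
    return w_abs * absolute + w_rel * relative

print("big fish, small pond:", utility(2500, 2000))   # 3.25
print("small fish, big pond:", utility(3000, 4000))   # 1.5
```

With w_rel set to zero the bigger house wins, as the "rational agent" story predicts; give relative standing enough weight and the smaller house in the humbler neighborhood comes out ahead.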

Based on this hypothetical example and similar studies, it can be argued that relative status matters more to our overall well-being than absolute status (once absolute status reaches a certain level, anyway). I think much of our collective unhappiness can be explained by this point. Since it's simply not possible for everyone to be better off than their peers, no matter how good things get in absolute terms, some people will still be miserable.

Part of what it means to be human is to compete with other humans. Whether it's sports, school, work, or social life, humans have a desire to be better at things than their fellow humans. It's not always the most pleasant part of human nature, but to deny that it exists would be extremely naive. In order to combat the ugly side effects that can stem from competition, many people on the Left seem to suggest that we should try to make outcomes more equal through social engineering.

Enter the self-esteem movement. In a competition with only 10 people, some want to give out special rewards to all the participants in order to avoid hurting any one participant's feelings for not doing as well as the others. As such, we often give out 10 ribbons to all 10 participants. Obviously, the ribbons lose any sense of being special if everyone gets one. Although I think people on the Left do this with good intentions, they seem to think that everyone can be above average (as the old saying goes, "the road to hell is paved with good intentions"). Put another way, many people on the Left would rather live in Lake Wobegon than in reality.

That we evolved to compete with each other for relative status, and that this competition (to a degree) is good for our psyche, is the essential theme of Todd Buchholz's book Rush. To willfully ignore this side of our nature, Buchholz asserts, is a mistake: trying to stifle our desire to compete with each other has the potential to cause more unhappiness than it alleviates.

On many levels, I do think that the rat race makes people miserable and I am no exception.  On a personal level, I recall working many a long week in Chicago (60+ hours).  I’d wake up early, ride a crowded train to work, spend a long stressful day in the office doing an unfulfilling job, ride a crowded train home, eat dinner, and then go to bed in order to do it all over again the next day.  What’s the point?  One doesn’t have to be a psychologist to realize that this routine will eventually make you a miserable human being, even if just on the inside.

Many self-help authors prey on people who become depressed from too much competing in life. They claim to know the secret to escaping the rat race, or the secret to transcending the desire to compete. This, however, is nonsense. Rat races exist wherever humans compete for scarce resources. There are even mini rat races within bigger rat races. Even activities like yoga that are supposedly immune to competition blossom with rat races. That's right, people compete to be more Zen than thou (check out this video). Even those who have created enough wealth to supposedly escape the rat race still end up stressed out competing against their elite peers (the competition is stiffer). There's no escape!

So what if you actually need the rat race in order to be happy? That sounds like an audacious claim, but on some strange level, I think there is some truth to it. Vacations are wonderful, but I think I would be equally miserable spending my days drinking Corona on the beaches of Mexico for the rest of my life as I would working in an office in downtown Chicago. Overall, I think Buchholz makes a very valid point in Rush about the need for stress and competition in our lives. Without it, we may be just as miserable as we would be with too much of it.

I think a reasonable person will agree with Robert Frank when he suggests in The Darwin Economy that we need some competition and stress in our lives, but not too much. If we can find ways to preserve relative status while removing destructive behaviors, then this would be a net benefit to society.

Suppose two employees are competing for a promotion, and assume that in America the norm is to work 40 hours a week. Suppose further that employee A is more productive and the better employee, however we choose to measure that. Employee B then has an incentive to work more than 40 hours a week to make up for his deficiencies in talent and skill, so he decides to work 45 hours a week in order to vie for the promotion. Employee A, however, isn't just going to stand by and let her colleague appear better than her, so she too works 45 hours a week. Between them, the two employees now work an extra ten hours a week (for free, if they are salaried), yet their relative status is unchanged. In his book Luxury Fever, Frank dubbed this problem the "smart for one, dumb for all" principle. It is in employee B's interest to work more, but if all other agents respond rationally, he ends up making everyone worse off (including himself) without changing his relative position in the slightest. This, in essence, is what creates the rat race.
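
The structure of this trap is a classic prisoner's dilemma, and a few lines of code make the "smart for one, dumb for all" logic explicit. The payoff numbers below are entirely made up for illustration (this is my sketch, not anything from Frank's book): working the extra hours is each employee's best response no matter what the other does, yet both end up worse off than if neither had escalated:

```python
# Each employee picks 40 or 45 hours. Payoff = promotion prospects (driven
# by relative hours) minus the cost of overtime. All numbers are invented
# purely to exhibit the incentive structure.

HOURS = (40, 45)

def payoff(mine, theirs):
    status = 10 if mine > theirs else (5 if mine == theirs else 0)
    overtime_cost = (mine - 40) * 0.8  # fatigue, lost leisure, etc.
    return status - overtime_cost

# Working 45 is the best response to either choice by the other employee...
for theirs in HOURS:
    best = max(HOURS, key=lambda mine: payoff(mine, theirs))
    print(f"if the other works {theirs} hours, my best response is {best}")

# ...so both land on 45/45, which pays each of them less than 40/40 would.
print("payoff when both work 40:", payoff(40, 40))  # 5.0
print("payoff when both work 45:", payoff(45, 45))  # 1.0
```

Because the 45-hour week dominates for each individual, the pair slides into the 45/45 equilibrium even though both would prefer an enforced 40/40, which is exactly the kind of case Frank uses to argue that collectively agreed limits can leave everyone better off.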

The key to improving happiness and making sound public policy is to find solutions that benefit all parties without curtailing relative status. The solution to many problems that stem from relentless competition is to realize that relative status often matters more than absolute status, and then to curb the behaviors, destructive to the species, that arise out of competing for it. Frank argues that Darwin will go down in history as the most important economist, since he was the first to realize that what appears beneficial to the individual of a species is often detrimental to the species as a whole. We humans are unique in that we can recognize this flaw in our nature.

I certainly agree with Buchholz that we need some stress and competition in our lives, just as we need some rest and relaxation. It's simply a matter of degree. However, I didn't think he dealt with the nuance of this point in the depth it needed in Rush.

Darwin’s key insight was that relative position often matters more than absolute position and that what’s good for the individual is not always good for the group. Our economic activity is no exception. Libertarians who think government has no right to restrict their freedom to any degree, often fail to understand this point. Using an insight from the libertarian hero, Ronald Coase, Frank argues that government intervention is often beneficial for all parties in certain markets.

In the end, I enjoyed both books, but I think Frank makes a very compelling case that no rational libertarian will be able to ignore.



Book Review: The Management Myth

Is there anything more absurd than trying to measure something that can’t be measured? In my opinion, most performance reviews are an utter waste of time because they try to measure things that can’t or shouldn’t be measured. The most important aspects of many jobs can’t be measured, but managers (usually armed with an M.B.A.) delude themselves into thinking that scientific performance reviews measure an employee’s worth. How dare a manager acknowledge that someone is doing a good job without any data to support such a claim.

Early in my finance career, I vividly remember being the victim of one particularly ludicrous performance review in which I was lambasted for failing to meet a certain quantitative metric. At this firm, come review time, the only things that mattered were the things that could be measured. After this review, I was in a spiteful mood. I vowed to act like homo economicus for the rest of my time at the firm, i.e., to do the economically rational thing and neglect all of my job responsibilities (no matter how important) that wouldn't be quantitatively evaluated in my next performance review. Luckily, I quit before I got promoted.

***

Not too long ago, I stumbled across an article published in the Atlantic (back in 2006) that caught my attention. The author was Matthew Stewart and the subtitle of the article read “If you want to succeed in business, don’t get an M.B.A. Study philosophy instead.” Needless to say, I was intrigued. After reading the article, I decided I needed to pick up Stewart’s book, The Management Myth.

Broadly speaking, this book is a polemic against scientific management and business schools, although it has an autobiographical element as well. The essence of Stewart's argument is that management is not a science, and that it is silly to worship science when confronting business problems that are fundamentally philosophical in nature. Those who understand management's human nuances make better managers than most spreadsheet jockeys.

In other words, the secret to being successful in business, while remaining an ethical person, is not to study scientific management, but rather to be a well-educated person. This, of course, includes an extensive study of the humanities.

Stewart has a chapter titled "The Accidental Consultant", and I can't help but think that this is a play on Montaigne, who called himself "an accidental philosopher". Stewart, after his doctoral studies, somehow accidentally became a management consultant. As an insider, he saw the inner workings of the corporate monster from a consultant's perspective.

When you think about it, consulting is really an improbable business. Bruce Henderson, the founder of the Boston Consulting Group, is quoted by Stewart in the book, and he sums up the improbability of management consulting by asking: "Can you think of anything less improbable [sic] than taking the world's most successful firms, leaders in their businesses, and hiring people just fresh out of school and telling them how to run their businesses, and they are willing to pay millions of dollars for their advice?"

In the book, we learn that the idea of scientific management, which ultimately gave birth to consulting as a business, has an interesting history that traces its roots back to the early 20th century. And Frederick Taylor is the man to blame for it all. It seems that when the Taylorist mindset became commonplace, ethics and most corporations simply decided to part ways.

It’s worth noting that the father of modern economics, Adam Smith, warned us what the division of labor (and for that matter, an obsessive focus on scientific management) does to people when it is not tempered with humanity, i.e., it makes people “as stupid and ignorant as it is possible for a human being to become.”

Taylor, borrowing from the classical economists, scientifically studied the way pig iron was handled by laborers at Bethlehem Steel. His research ultimately gave rise to the belief in the efficacy of scientific management. However, Stewart points out that Taylor fudged both his research and his results, but that's a secret better left unspoken within the walls of Harvard Business School.

Taylor, however, isn’t the only person that Stewart picks on. He has a bone to pick with other management theorists and many modern day gurus too, e.g. Michael E. Porter and Peter Drucker.  Stewart writes: “The gurus are often accorded respect as great authorities. But what exactly are they authorities in? And why is it that millions of people are eager to pay for instruction on what they should do in the fantastically improbable event that they become CEO of a major corporation? How did management theory become so personal, so spiritual, so impractical?” I must say, I often wonder the same thing.

This book resonated with me deeply, and I absolutely agree with Stewart's critique of business schools and modern corporations. As he puts it: "If any political party funded political science departments in the way corporations fund the business schools, we would naturally consider their research to be little more than propaganda." Stewart is a very clear writer with keen prose and an often humorous tone. He's a witty philosopher to boot.

I’ve written about the problems that come with trying to quantify everything before (here). I don’t know the secret to being an excellent manager, but I do know that trying to quantify everything is a sure-fire way to be a terrible one.  Undoubtedly, Stewart would agree.



Book Review: The Machine Stops

E.M. Forster’s short story, The Machine Stops, was published in 1909.  Considering that fact, it is dripping with technological prescience that is downright spooky.  Is there a danger of becoming too reliant on technology?  What happens when the machine stops?

I’m well aware that it’s mildly ironic that I’m reviewing this book online.  However, I’m not a complete technophobe, although I’m not necessarily a techno-optimist either. This is because technology is neither categorically good or bad.  In my opinion, technology becomes evil when we use it in ways that are anti-human, but it also has many positive applications as well.  There is, however, a very real danger of becoming too reliant on technology.

Essentially, this novella is about a dystopian future in which humanity has lost the ability to live on the surface of the earth. Humans instead live below ground in 'cells', breathing artificial air and eating techno-food. In short, the physical needs of humans are met by what is called 'the Machine'. Humans interact with the world and each other through "cinematophotes" (think televisions) and through Skype-like videoconferencing. When I look around and see people glued to their iPhones, failing to recognize the other human beings in their immediate presence, I can't help but think that Forster had a DeLorean time machine.

Forster ultimately envisions a future world in which technology is used to shape a depressing human experience.  Some of these visions, as I’ve mentioned, sound strikingly similar to aspects of our modern world.  This is absolutely incredible if you think about what the world was like when this book was written.

In an increasingly global economy, we live in a culture that values hyper-specialization, but that comes with plenty of costs. The rise of hyper-specialization means that most of us have no clue how most of the things we use work, or how to repair them. This reinforces Forster's point that there is an element of danger in becoming increasingly reliant on things we don't really understand.

In the novella, Forster wrote: "You talk as if a god had made the Machine," cried the other. "I believe that you pray to it when you are unhappy. Men made it, do not forget that." Sadly, many who suffer from neophilia have forgotten this fact. In my opinion, there is an abundance of techno-optimism in the world today; techno-realism, however, is in short supply, and we need more of it. I think Forster would agree.
