It’s impossible to pigeonhole Chuck Klosterman. In a career that’s spanned over 20 years, and taken him from North Dakota to New York, the writer has become a kind of pop-culture public intellectual known for his incisive and very funny essays about music, sports, television, film, and much more. He first gained attention for his 2001 memoir Fargo Rock City: A Heavy Metal Odyssey in Rural Nörth Daköta and his 2003 essay collection Sex, Drugs, and Cocoa Puffs: A Low Culture Manifesto.
Klosterman’s new book, But What If We’re Wrong?: Thinking About the Present as If It Were the Past, is a fascinating look at how conventional wisdom might not be so wise after all. He suggests that what seems obvious to us now might seem absurd centuries from now, turning his critical eye to politics, football, literature, and more, and establishing a principle he calls “Klosterman’s razor.” Chuck Klosterman spoke to Men’s Journal by telephone from his home in New York.
Could you explain the concept of what you’re calling “Klosterman’s razor”?
I was thinking about the history of ideas — which sounds like a pretentious thing to say, but that’s what I was thinking about — the way the understanding of reality changes over time, and how we construct the way we understand both the present and the past. When we’re in the present tense, a lot of people, when trying to figure out a complex problem, talk about Occam’s razor, the idea that whatever solution has the fewest flaws, the simplest conclusion, is usually the best conclusion. You would definitely hear this if you’re talking to a lawyer or a journalist or a cop.
But the history of ideas pretty much proves that we are continually wrong about how we perceive things, and the way we perceive the past is never the same as the way we perceive the present, or very rarely. So I tried working through the premise where I thought about the present tense as if it were the past, building in this idea that whatever seems like the logical, obvious answer is, in all likelihood, incorrect, because that’s always been the case. It’s a real kind of inverted way of thinking, where you think of a problem, and you think of what the most obvious conclusion is, and then you almost immediately discard it. It’s a way of thinking that’s pretty hard to use in day-to-day life, but works great in a book of thought experiments.
Why do you think we’re so addicted to trying to predict things? Is it just kind of a basic human instinct that comforts us?
The goal with predicting the future is to [imagine] what the world will be like, and to better condition yourself [to] that world. So you look at the conditions of life, and think to yourself, “What will reasonably happen next?” and prepare for that. In many ways, it’s a pragmatic way of thinking. It’s the whole idea of buying insurance. You buy insurance because you assume that bad things happen to some people, and it might happen to you. But this has crossed over into something else, where there is a kind of person who feels that in order to reflect their intelligence, they need to be absolute about how certain they are about things that have never happened. And that’s kind of the problem.
I feel like you see that on the Internet sometimes, where it’s not good enough to say “I think that this candidate is going to win the election” or whatever. You have to overstate it in these really certain terms.
It is part of human nature, but the Internet has made it worse. Because there are now so many voices talking at the same time about the same topics, that the only way to get attention is to be both more extreme and more certain. So in some ways, this concept of almost enforced certitude is just a reflection of the attention economy, that in order to succeed in an economy based around attention, you need to seem more certain about your view. And to seem certain, you have to take on the identity of someone who knows everything.
I can see that when it’s about art, or something subjective, but people will make these crazy assertions about things that will be proven right or wrong in the next week, which is weird.
I can understand why a scientist might express absolute certainty in their belief system, because science doesn’t really work as an abstraction in a lot of ways. If you don’t seem certain, it almost appears as though you lack confidence in your own work, in your own system, in the scientific method. Whereas in art, I don’t see why anybody would feel certain.
Right. And people tend to make guesses about which musicians and books and TV shows are going to endure, and say things like “destined to be a classic,” or “this will never be forgotten.” What do you think gives people that kind of confidence to make those assertions? Is it sort of an extension of their fandom, or something else?
Part of it has to do with the popularity of talking that way, which sort of lowers the stakes of everything. In this book, I’m saying, let’s use the literal definition of “timeless.” Let’s talk about this idea that something’s greatness is not necessarily enough, because it’s not as though our understanding of the arts is just a collection of the greatest things. They become great because you remember them.
You make that point with David Foster Wallace’s Infinite Jest, the novel, and 9/11, and this idea that it doesn’t really matter that the novel predated 9/11, that it will still be associated with it.
Say someone perceives Infinite Jest as the best novel from this era, and they perceive 9/11 as the most important event of this era. As we get further away from both, there will be a real temptation to combine them, [even though] we know the book predates the event by five years. When we’re talking about, say, a novel from the 1830s, the Civil War hadn’t happened yet, but it would be very tempting to discuss that book in the context of the Civil War that was about to come, because it seemed like such a meaningful event, and you want to show how the art is meaningful, so you’ve got to tie it to life. And it seems very possible this could happen with this book and that event.
In the introduction to the book — and I sympathize with this so much — you mention some times that you made some predictions that ended up being wrong, like Obama never being elected president.
I was at a bar with my friend Chris Ryan, who’s now an editor at The Ringer, and this was probably in January or February, before [Obama’s] nomination. He was just totally confident Obama was going to be president. And at the time, it just seemed like there was no way he could beat Hillary, so I bet him 100 dollars to one, and he won.
Have you kept track, just out of curiosity, of how many political pundits predicted at the end of last year that Donald Trump would never win the nomination?
It was universal! It would be almost impossible to find any credible political commentator from 2015 who said, “I think Trump will probably get the nomination.” I don’t remember that from anyone. It didn’t seem possible that he could even remain high in the polls. I did a podcast with Bill Simmons and we talked about this at one point last year, and I said, “Well, what will be the first date when Trump does not lead most national polls on the Republican side?” And I think I said [it would be] around the time of the Super Bowl. A few other people said January. But everybody was off.
That reminds me of your chapter on sports, where you say that some people believe football is doomed, and some people believe that football is going to survive, [but] not the way it is now. And one of the things you say is that both predictions share a “faith in reason.” Do you think people were hesitant to predict the rise of Trump because of a similar faith in reason?
Absolutely. Because even when people look at insane scenarios, when they explain that insanity to other people, they reframe it in a rational context. The only place that’s actually rational is the inside of people’s skulls. That’s the only place where things happen rationally. In any context outside of that, in the world at large, things happen somewhat rationally, somewhat irrationally, and mostly arbitrarily.
Do you think people tend to put too much faith in their own predictive abilities? Like they ignore the misses and count the hits, and only remember the predictions they’ve made that turned out right?
That’s definitely true among people who make lots of predictions. There’s a kind of person who is hesitant to make predictions, and [for them], probably not. They probably remember the handful of times that they were very wrong, and that causes them to make fewer predictions later. But people who make predictions constantly, like the pundits on ESPN or whatever, these are people who essentially make predictions under the assumption that you will forget them if they are wrong. It’s almost like a kind of fishing, where they’re trying to be accurate 5 percent of the time, but if they’re accurate 10 percent of the time, they’re geniuses.
I read some sportswriters who predicted the Spurs were going to take it all the way this year, and I thought, “Well, that’s definitely going to happen,” because I’m a huge Spurs fan. Do you think people tend to believe the predictions whose outcomes are favorable to them, or the opposite, or does it vary?
I think that in the short term, people gravitate toward predictions that match their biases, and you used the perfect [example]. For somebody who wants the Spurs to win the title, the [idea] that the Spurs actually have a better team than Golden State is very desirable. But long term, I think it kind of becomes human nature to assume that dire predictions are more plausible. America’s doomed, the climate is going to destroy us, it’s only a matter of time before the economy collapses. Everybody is an optimist in the short term and a pessimist in the long term.