Calling Bullshit: The Art of Skepticism in a Data-Driven World
The world is awash in bullshit. Mostly the figurative kind. Which is why it is critical that we can distinguish fact from bullshit, especially as the bullshit becomes increasingly dressed up in data.
Which is why this month we’re reviewing the book Calling Bullshit: The Art of Skepticism in a Data-Driven World by Carl T. Bergstrom and Jevin D. West. The book aims to equip us with the tools to critically evaluate the data-driven claims and arguments we encounter in every aspect of our lives, from our work to the news to the posts we see on social media.
The book discusses the concept of "bullshit" - statements that are presented as factual but are actually misleading, incorrect, or incomplete. The authors provide a framework for detecting and calling out bullshit, emphasizing the importance of skepticism and critical thinking.
The book starts with understanding data and the problems we see with it—namely, how easily it can be manipulated. It then moves to how we can spot the bullshit, and it ends with tips on how to call out bullshit when we see it.
As the book states, it is infinitely easier to create bullshit than it is to clean it up. It is easier to create misinformation or to misrepresent data than it is to go back and help everyone understand the truth. Which is why learning to detect bullshit and call it out is such a valuable skill, and one we’ll explore more now.
There are several great takeaways from this book, but we’ll focus on a few key concepts.
Abundance of Data
The abundance of data does not guarantee its accuracy: The book highlights how data can be manipulated, cherry-picked, or misinterpreted to support a particular argument or agenda. The authors provide multiple examples of misleading statistics and graphs, and explain how to identify such manipulations.
In Chapter 2, the authors discuss the concept of “data dredging” or “data mining,” which is the practice of sifting through large amounts of data until a statistically significant result is found, without a clear hypothesis or prior knowledge. They explain how this can lead to false positives and illustrate the problem with several examples.
We can find spurious correlations in just about anything if we look hard enough. The chart below shows the close correlation between breeding storks and newborn babies: as stork populations have declined, there has been a correlated decline in births—“confirming” the absurd hypothesis that storks actually deliver babies.
As the authors state many times, more data isn’t necessarily better, it is simply more. What we do with it is important. More data may give us more insights, but it can also increase our ability to find chance correlations. This will only increase as we incorporate more data into our work and lives, so understanding that and being able to spot the bullshit is critical.
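The dredging problem the authors describe can be sketched with a small simulation. Everything below is invented for illustration: one random “target” series and a thousand equally random candidate series, none of which have any real relationship to the target—yet searching for the best correlation will reliably turn up an impressive-looking one by chance alone.

```python
import random

# A minimal sketch of "data dredging": none of these series are related,
# but scanning enough of them will surface a strong correlation by chance.
random.seed(42)

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# One target series and 1,000 unrelated candidate series, all pure noise.
target = [random.gauss(0, 1) for _ in range(20)]
candidates = {f"var_{i}": [random.gauss(0, 1) for _ in range(20)]
              for i in range(1000)}

# Dredge: keep whichever candidate correlates best with the target.
best_name, best_r = max(
    ((name, pearson(target, series)) for name, series in candidates.items()),
    key=lambda pair: abs(pair[1]),
)
print(best_name, round(best_r, 2))  # a chance "discovery" from pure noise
```

A single honest test on one pre-registered variable would rarely show a correlation this strong; it is the act of searching a thousand variables that manufactures it.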
Questioning Assumptions and Biases
Questioning assumptions and biases is a favorite topic of this newsletter and podcast, so it should be no surprise that I love this theme in the book as well.
The authors stress the need to be aware of one's own biases and assumptions, as well as those of others, in order to avoid being swayed by misleading arguments. They provide strategies for questioning assumptions and biases, such as considering alternative explanations and seeking out diverse perspectives.
In one example I love (since it involves skiing here in Utah), one of the authors (in their younger days) talked to people at Solitude ski resort about the best ski resorts in the country. Most of the people answered that Solitude was one of the best for a variety of reasons. That caused the author to reassess their perspective on the ski resorts in Utah.
For those unfamiliar with resorts in Utah: Alta and Snowbird consistently rank among the best ski resorts in the country. If you’re coming from outside of Utah, chances are high that you’ll want to hit one of those resorts. For us locals, Solitude and Brighton are often the resorts of choice.
But the subtle problem here is that asking people about the best resort in Utah while you’re skiing at Solitude gives you biased data: people who choose to ski at Solitude are far more likely to rank it highly. The author hadn’t considered this selection bias until their father pointed it out.
It is key to question biases and assumptions.
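The Solitude survey problem can be sketched with made-up numbers (all preference shares below are invented for illustration): surveying skiers at a resort over-samples people who already favor that resort, so the same population produces two very different answers depending on where you stand when you ask.

```python
import random

# Assumed (invented) population: 100 skiers, each with one favorite resort.
random.seed(7)
population = (["Solitude"] * 15 + ["Brighton"] * 15 +
              ["Alta"] * 35 + ["Snowbird"] * 35)

def solitude_share(sample):
    """Fraction of a sample naming Solitude as their favorite."""
    return sample.count("Solitude") / len(sample)

# Unbiased sample: ask 40 randomly chosen skiers statewide.
statewide = random.sample(population, 40)

# Biased sample: the lift line at Solitude is mostly Solitude fans,
# plus a few visitors who prefer other resorts.
at_solitude = ([s for s in population if s == "Solitude"] +
               random.sample([s for s in population if s != "Solitude"], 5))

print(f"statewide: {solitude_share(statewide):.0%}, "
      f"at Solitude: {solitude_share(at_solitude):.0%}")
```

The statewide sample lands near the true 15%, while the on-mountain sample says most skiers love Solitude—same underlying population, different sampling frame.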
In another example, the authors discuss how we can manipulate data based on our assumptions or biases about certain facts.
In the example below, it is easy to see that different bracketing of wealth paints a different picture depending on how we want to present the data. If I believe taxes are unfair to the wealthy, I could easily show that most of the wealth is concentrated in lower income brackets and we should focus on slightly increasing their taxes. But if we adjust the brackets slightly, that picture changes.
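The bracketing trick can be sketched with invented income data (all figures below are made up for illustration): the same incomes, grouped under two different bracket schemes, support two opposite headlines.

```python
# Invented incomes in $k; the data never changes, only the brackets do.
incomes = [25, 40, 55, 70, 90, 120, 160, 180, 250, 400]

def total_by_bracket(brackets):
    """Sum income falling in each (label, low, high) bracket, in $k."""
    return {label: sum(x for x in incomes if low <= x < high)
            for label, low, high in brackets}

# Scheme A: one wide lower bracket -> "most income is earned under $200k."
scheme_a = [("under 200k", 0, 200), ("200k+", 200, float("inf"))]

# Scheme B: split the lower range -> "the 200k+ bracket is the largest."
scheme_b = [("under 50k", 0, 50), ("50k-100k", 50, 100),
            ("100k-200k", 100, 200), ("200k+", 200, float("inf"))]

print(total_by_bracket(scheme_a))
# {'under 200k': 740, '200k+': 650}
print(total_by_bracket(scheme_b))
# {'under 50k': 65, '50k-100k': 215, '100k-200k': 460, '200k+': 650}
```

Under scheme A the sub-$200k group out-earns the top bracket; under scheme B the top bracket is the single largest. Neither view is false, which is exactly what makes bracket choice such an effective way to steer a conclusion.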
We have to understand our own underlying assumptions and biases and then be able to question or challenge them when we’re looking at data or news. It is easy to agree with information that confirms our beliefs, but our first reaction to anything should be a healthy dose of skepticism.
So what can we do?
In the final chapters of "Calling Bullshit," the authors summarize their framework for spotting and calling out bullshit. They emphasize the importance of being skeptical, asking questions, and being willing to challenge assumptions and biases. They also provide a list of "red flags" to watch out for, such as cherry-picked data, exaggerated claims, and unsupported conclusions. They stress the need to approach claims with an open mind, but also to demand evidence and transparency from those making the claims.
Here are some of the best pieces of advice from the book for spotting and calling bullshit:
Be skeptical: The authors emphasize the importance of being skeptical of claims that seem too good to be true or that lack sufficient evidence. If anything seems too good or too bad to be true, it probably is.
Ask questions: The authors suggest asking questions to clarify claims and to probe for weaknesses or inconsistencies. They also suggest asking for evidence to support claims.
Be aware of biases: The authors stress the need to be aware of one's own biases and assumptions, as well as those of others. They suggest seeking out diverse perspectives and considering alternative explanations. Just because we can offer an explanation, doesn’t mean it is the correct explanation.
Check the data: The authors suggest being critical of data and statistics that are presented to support a claim. They advise checking for potential biases, errors, or inconsistencies in the data.
Demand transparency: The authors stress the need for transparency in data-driven fields, such as science, politics, and business. They suggest demanding access to data and methods, as well as clear explanations of how claims were arrived at.
Overall, the book provides a valuable guide to spotting and calling out bullshit in a data-driven world. It emphasizes the importance of critical thinking, skepticism, and ethical responsibility in evaluating claims and making decisions.
We’re only going to see more data, which means we’re only going to see more bullshit. We need to be ready to spot it and call it out so we can help others avoid falling victim to misrepresented data and misleading interpretations of it.