

Understanding Critical Thinking

Doug Rose

Abstract

Questions are powerful and essential to your data science team. In this chapter, you’ll find out how to harness the power of questions. Then you’ll learn that those interesting questions are part of critical thinking. You’ll also find out about critical reasoning and how you can pan for gold to find the right questions.

Harnessing the Power of Questions

Imagine that you’re giving a presentation to a group of coworkers. You’ve come up with a way to increase your company’s sales—a strategy that took you weeks to prepare. In the middle of your presentation, someone interrupts you to ask a question about your assumptions: “How did you come up with your results?” How would you react to this question? In some organizations, it would be seen as confrontational and combative. Usually, these types of questions come from a skeptical supervisor or someone who disagrees with you. Either way, it’s outside the normal rhythm of a presentation.

In his book Why Smart Executives Fail: And What You Can Learn from Their Mistakes,1 Sydney Finkelstein points out that many executives accept good news without question. They save their questions for bad news or for positions they disagree with, which means that most organizations see questions as a form of disagreement. When there aren’t any questions, people usually repeat the same mistakes. They’re prone to groupthink and have blind spots. A lot of public failures can be traced back to crucial questions that were never asked.

As mentioned throughout this book, most organizations still focus on getting things done. They have mission statements, encourage teams to drive and deliver, and work with clearly defined goals and aggressive timelines. It’s difficult to imagine an organization or meeting where everyone asks interesting questions. In many organizations, there simply isn’t any time to encourage this type of questioning. However, it’s important for your data science team to exist outside of this reality. Your team needs to create an environment that’s open to interesting questions. The rest of your organization may live in a world of statements, but your team needs to be comfortable in a world of uncertainty, arguments, questions, and reasoning.

When you think about it, data science already gives you many of the answers. You’ll have reports that show buying trends as well as terabytes of data that show product ratings. Your team needs to use these answers to ask interesting questions. It’s up to you to create a setting where everyone feels comfortable questioning each other’s ideas.

There are a couple of things to remember to help your data science team stay on track.

First, if you have a newly formed data science team, it’s unlikely that the team is good at asking the right questions. That’s because they haven’t had much practice. Most teams don’t ask questions because good questions challenge your thinking and are not easily dismissed or ignored. They force the team to unravel what is already neatly understood, which requires a lot more work than just passive listening.

When you were in school, your teachers probably moved quickly through the material because they expected you to memorize facts and read through expert advice. When you raised your hand, it was probably for something mundane, like “Will this be on the test?” No one asked bolder questions like, “Why are we learning this subject?” or even, “Can we learn something different?”

At work, you probably haven’t had many opportunities to ask interesting questions. Most companies still promote people based on their ability to follow through on a corporate vision. You need to work well with your coworkers, and constantly asking questions isn’t always the best way to get along. You need to change that view for your data science team.

The second thing to remember is that asking questions is really hard work. Most people still prefer to make simple statements. It’s pretty easy to tell the world what you think. It’s not so easy to defend what you think to someone who can ask good questions. For example, think about something that you do for yourself that’s healthy. Maybe you eat certain foods or do certain exercises. Now ask yourself, how do you know it’s healthy? Is it because someone told you or because of how you feel? If it’s because someone told you, how do you know that person is right? Many experts disagree on what’s healthy. Which experts are right?

It doesn’t take long to realize that questioning can be exhausting. It takes a lot of work to deconstruct what you already believe to be true. Now imagine doing that in a group setting.

Try to remember that asking good questions is difficult to do and not always well received. Still, it’s essential to your data science team. The best questions will give you new insights into your data that will help you build your organizational knowledge.

Panning for Gold

Asking interesting questions is a key part of critical thinking. So let’s ask an interesting question: what is critical thinking? Most people think of critical thinking as a form of criticism. You’re judging something and deciding whether it’s good or bad, right or wrong. So does that mean that if you don’t agree with someone, you’re applying critical thinking? Most people would say no.

Critical thinking is not your ability to judge something. The “critical” in critical thinking is about finding the critical questions that might chip away at the foundation of the idea. It’s about your ability to pick apart the conclusions that make up an accepted belief. It’s not about your judgment—it’s about your ability to find something that’s essential.

Many organizations complain that they don’t have anyone who applies critical thinking. Trying to find the critical questions isn’t something you can do all the time. It’s a little like running. Most people can do a little, and then with some exercise they can do a little more. Even the best athletes can’t run every day.

Think about our running shoe web site. Imagine that the company gave out customer coupons and had a one-day sales event at the end of the year. At the end of the month, the data analyst ran a report that showed a 10% increase in sales, as shown in Figure 15-1. It’s very easy to say that the lower prices encouraged more people to buy shoes. The higher shoe sales made up for the discounted prices and the promotions worked. More people bought shoes and the company earned greater revenue. Many teams would leave it at that.
Figure 15-1. Average sales quantity

Pivot the average sales quantity by item SKU and by coupon code (including purchases with no coupon code). Subtracting the average sales quantity without a coupon from the average sales quantity for each coupon code shows how many more units were sold, on average, when each coupon code was used. For the coupon code with the highest discount (60%), there were on average only 0.1 more unit sales than with no coupon at all. See how to create this chart at http://ds.tips/6acuV.
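
To make that computation concrete, here is a minimal pandas sketch of the same pivot. The file name and the column names (sku, coupon_code, quantity) are assumptions for illustration, not the book’s actual dataset:

import pandas as pd

# Hypothetical order data: one row per order line, with columns
# sku, coupon_code (blank when no coupon was used), and quantity.
sales = pd.read_csv("sales.csv")
sales["coupon_code"] = sales["coupon_code"].fillna("NO_COUPON")

# Average sales quantity for each SKU under each coupon code.
pivot = sales.pivot_table(index="sku", columns="coupon_code",
                          values="quantity", aggfunc="mean")

# Subtract the no-coupon baseline: how many more units, on average,
# were sold with each coupon code than with no coupon at all?
lift = pivot.drop(columns="NO_COUPON").sub(pivot["NO_COUPON"], axis=0)
print(lift.mean())  # average lift per coupon code, across SKUs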

Here’s where your data science team would want to apply their critical thinking. Remember that it’s not about good or bad; it’s about finding critical questions, such as:
  • How do we know that the jump in revenue was related to the promotion? Maybe the same number of people would’ve bought shoes regardless of the promotion.

  • What data would show a strong connection between the promotion and sales?

  • Do the promotions work?

Everyone assumes that promotions work. That’s why many companies have them. Does that mean that they work for your web site? These questions open up a whole new area for the research lead. When you just accept that promotions work, everything is easy—they worked, so let’s do more promotions.

Now that the team has asked their questions, it’s time for the research lead to go in a different direction and ask even more critical questions, such as:
  • How do we show that these promotions work?

  • Should we look at the revenue from the one-day event?

  • Did customers buy things that were on sale?

  • Was it simply a matter of bringing more people to the web site?

This technique is often called panning for gold. It’s a reference to an early mining technique in which miners sifted through sand looking for gold. The sand is all of the questions that your team asks. The research lead works with the team to find the gold-nugget questions that are worth exploring. It’s not easy, because deciding which questions are gold nuggets is a value judgment. It’s up to the research lead to determine whether the questions are interesting.

The point of panning for gold is that even though you will have a lot of throwaway questions, the few gold nuggets can change the way your organization operates. There will be a lot of sand for every nugget of gold. It takes a lot of patience to sift through that much material.

If you are the research lead for the team, try to focus on actively listening to everyone’s questions. Often, their questions will be an earlier version of your question. Don’t be afraid to ask big “whys.” It might seem obvious to everyone that promotions work. That doesn’t mean that you should ignore the question. If you’re not satisfied with the answer, you may want to work with the data analysts to create reports.

Also, be sure to watch out for your own conclusions. Remember that critical thinking is about breaking down these conclusions. Make sure that you evaluate what the rest of the team is saying.

This can be really tiring work. You don’t want to be in a position where you’re being forced to accept a conclusion because you didn’t take time to ask questions. If you’re not getting to these critical questions, feel free to reschedule the meeting. Get back together when everyone feels more energized.

Focusing on Reasoning

Many of us have strong beliefs that guide us and help us understand new things. When you’re working on a data science team, beliefs might strongly influence how you and other people look at the same data. That’s why a key part of critical thinking is understanding the reasoning behind these beliefs. You should not just be able to describe your beliefs—you need to describe your reasoning behind those beliefs.

Reasoning is the evidence, experience, and values that support conclusions about the data. When you’re working on a data science team, it’s important to understand each other’s reasoning. This will help the team come up with interesting questions.

Let’s look at a simple statement as an example. “You should drink a lot of green tea because it’s good for your health.” The idea here is that you should drink a lot of green tea. The reasoning is that it’s good for your health. When you apply critical thinking, you want to ask questions about the reasoning. Why is it good for your health? How do you know it’s good for your health? Is it good for everyone’s health? If you don’t apply critical thinking, you’re left with just the idea. You just accept the fact that you should drink a lot of green tea.

Now, let’s go back to our running shoe web site. Imagine that the design team is exploring some feedback it received from customers. Many of the pictures on the site depict runners in peak physical condition. Your data science team is trying to determine whether changing these pictures might affect sales.

Your team works with the web designers to run a few experiments. They randomly replace the images of fit runners with images of runners who are less fit and older. The team then works with a data analyst to create reports comparing the data from before and after the pictures were changed. The reports show a drop in overall sales, as shown in Figure 15-2.
Figure 15-2. Drop in overall sales

Look at the time series and you will see that the “less fit and older” version of the page had slightly lower total sales by day. If you look at the five-day moving average, the “less fit and older” version is lower for the entire month. See how to create this chart at http://ds.tips/X3xex.
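
Here is a minimal sketch of how you might compute that comparison in pandas, assuming a hypothetical experiment log with one row per day per page version (the file name and columns date, version, total_sales are made up for illustration):

import pandas as pd

# Hypothetical experiment log: date, version ("fit" or "less_fit_older"),
# and total_sales for that version on that day.
daily = pd.read_csv("experiment_sales.csv", parse_dates=["date"])

# One column of daily total sales per page version.
by_version = daily.pivot_table(index="date", columns="version",
                               values="total_sales", aggfunc="sum").sort_index()

# A five-day moving average smooths the day-to-day noise, so the gap
# between the two versions is easier to see.
smoothed = by_version.rolling(window=5).mean()
print(smoothed.tail())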

Now the team needs to talk about the results. Your project manager suggests that the drop in sales happened because runners are motivated by the images. They don’t want pictures that show what they look like now; they want pictures of who they’d like to become. The new images made the shoes look less effective and blurred the message that if you buy the shoes, you will become more fit.

The data analyst disagrees and suggests that the drop in sales happened because customers thought the new pictures represented the ideal customer. As a result, customers assumed that these shoes were designed for people who just started running.

To apply critical thinking, you have to look at the reasoning behind each of these statements. In these two examples, the keywords are “because” and “as a result.” These words suggest that the reasoning will follow.

For the project manager, the reasoning was that customers are “motivated not by who they are but who they’d like to become.” For the data analyst, the reasoning was that “customers assumed that the product was designed for people who just started running.”

Now that you have the reasoning, you can start to look for critical questions. Are customers motivated to look young and fit? Did customers really believe that pictures of less fit people meant the shoes were for new runners? Who do you think has a stronger argument? More importantly, what are the weaknesses of each argument? Why would a less fit runner be considered one who just started running? You would think it would be the opposite: an older runner has usually been running for years.

There are also weaknesses in the project manager’s argument. Would customers really believe that buying a pair of running shoes would make them look younger? Does that mean that images of even younger and more fit runners would increase sales?

Now that you have the reasoning and some critical questions, you can work with the research lead to look for data and determine the most interesting questions. What’s the median age of the customer who buys certain shoes? What strategies can be used to determine if they’re new runners? These questions will help you gain new insights about what motivates your customer.

Reasoning can be a first step toward finding critical questions. Remember that critical thinking helps your team gain more value from their reports. You can help the research leads decide what’s interesting. These interesting questions will help your team have the best insights.

Testing Your Reasoning

Think about the last time you heard someone say he or she was wrong. Not wrong about a restaurant or movie, but rather wrong about something he or she passionately believed. Can you think of any? If you can’t, that’s okay. It’s pretty rare to see someone change his or her mind. In some organizations, it’s seen as wavering or bad leadership, and it’s just something you don’t see very often.

A University of California, Berkeley, physicist named Richard Muller spent years arguing against the evidence for global climate change. He helped found the group Berkeley Earth, and much of his work was funded by the gas and oil industry. Later, his own research found very strong evidence of global temperature increases. He concluded that he was wrong: humans were to blame for climate change. Muller saw that the facts against his position were too strong to ignore, so he changed his mind. And he didn’t do it in a quiet way. He wrote a long op-ed piece in the New York Times2 that outlined his original arguments and why the counterarguments were stronger.

Remember that it’s easy to be skeptical of someone else’s ideas. What’s hard is to be skeptical of your own. Think of critical thinking in two ways:
  • Strong-sense critical thinking: When you think of critical questions about your own beliefs.

  • Weak-sense critical thinking: When you only find critical questions to pick apart someone else’s beliefs.

You probably know many more people who apply weak-sense critical thinking. They have well-thought-out arguments for what they believe and won’t ever question their own beliefs. If you come up with questions, they’ll do their best to defend their positions. What they won’t do is build on your questions or help create new questions of their own. On your data science team, you want to apply strong-sense critical thinking: everyone on the team should question his or her own ideas, come up with interesting questions, and explore the weak points in his or her own arguments.

Try to imagine what this would look like on your data science team. Let’s say the running shoe web site runs a promotion and sends out a coupon to everyone who buys a product. The data science team looks at the number of people who used the coupon to make a purchase. The data shows that 8% of your customers looked at the coupon. Of that 8%, about 5% of the customers used the coupon before it expired. The data also shows that there was a jump in revenue on the day that the coupon was sent to your customers. See Figure 15-3.
Figure 15-3. Number of people who used the coupon to make a purchase

The graph on the left shows that around 50% of the customers received coupons. The second bar shows that, among those, only 8% of the customers actually clicked on the coupon, and only 5% of those people used it. The graph on the right shows a spike in sales on the day the coupon was sent to customers. The coupon did affect the “no-coupon” sales as well, but if you compare the actual numbers, coupon sales account for only 10% of total sales, because not many people actually clicked on and used the coupon. See how to create these charts at http://ds.tips/pre6E.
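
A quick sketch of how you might reproduce those funnel percentages, assuming a hypothetical table with one row per customer and boolean columns received_coupon, clicked_coupon, and used_coupon (all made up for illustration):

import pandas as pd

customers = pd.read_csv("customers.csv")

# The mean of a boolean column is the share of True values.
received = customers["received_coupon"].mean()  # share of all customers
clicked = customers.loc[customers["received_coupon"], "clicked_coupon"].mean()
used = customers.loc[customers["clicked_coupon"], "used_coupon"].mean()

print(f"received: {received:.0%}, clicked: {clicked:.0%}, used: {used:.0%}")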

Your data science team wants to see how much revenue was generated by this promotion. So let’s apply some strong-sense critical thinking. You could argue that all of the new revenue that came into the site was a direct result of the promotion.

Now what are the weak parts of this argument? Maybe some of the customers who received the coupon ended up purchasing a product that was outside of the promotion. Should they be counted? Maybe you should only count the people who actually used the coupon. The problem with that is that you’re not looking at the full effect of the promotion. Maybe it would be just as effective to send an e-mail asking customers why they haven’t shopped in a while. This could be an interesting experiment.

Your data science team should be able to question all of these ideas. Someone on your team might feel strongly that any new revenue is the result of the promotion. That same person should also understand the weaknesses of that approach and be able to ask interesting questions, such as, “Are we fully understanding the customer if we look at the data this way?” Perhaps the customer just needed to be reminded of your web site. If you compare the customers who actually used the coupon with those who purchased without it, it’s easier to separate them into two groups: those who were motivated by the savings and those who just needed a reminder.
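
As a rough sketch of that two-group split, assuming a hypothetical orders table with an order_date and a used_coupon flag, plus a made-up send date for the promotion e-mail:

import pandas as pd

orders = pd.read_csv("orders.csv", parse_dates=["order_date"])
promo_date = pd.Timestamp("2016-12-01")  # hypothetical e-mail send date

# Purchases after the e-mail went out, split by coupon use.
after_promo = orders[orders["order_date"] >= promo_date]
savings_motivated = after_promo[after_promo["used_coupon"]]
just_reminded = after_promo[~after_promo["used_coupon"]]

print(len(savings_motivated), "purchases used the coupon;",
      len(just_reminded), "purchases needed only the reminder")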

When your team applies strong-sense critical thinking, it should feel more like an open discussion. No one should feel like they’re defending themselves. This approach is a great way for your team to ask interesting questions and, in the end, gain greater insights.

Summary

In this chapter, you learned how to harness the power of questions and that those interesting questions are part of critical thinking. You also found out what critical thinking is and how you can pan for gold to get to the great questions. Finally, you explored using reasoning while asking questions and testing your reasoning. In Chapter 16, you will learn how to encourage people to ask questions.

Footnotes

  1. Sydney Finkelstein, Why Smart Executives Fail: And What You Can Learn from Their Mistakes. Penguin, 2004.

  2. Richard A. Muller, “The Conversion of a Climate-Change Skeptic,” The Opinion Pages, The New York Times, July 30, 2012, http://www.nytimes.com/2012/07/30/opinion/the-conversion-of-a-climate-change-skeptic.html

Copyright information

© Doug Rose 2016
