Hoekelman Center Newsletter September 2022

A Note From The Director September 2022

Detecting Nothingburgers

I recently read a new book called The Voltage Effect: How to Make Good Ideas Great and Great Ideas Scale. As regular readers of our newsletter know, how to scale up good projects is of great interest to us here at the Hoekelman Center. We even have a little acronym, SCALE: Situate, Collaborate, Advocate, Launch, Evaluate. In the last edition, I wrote about Situate. This time, I’ll take on Evaluate. In general, nobody wants to talk about evaluation, but it’s essential, and it can be easy. The key concepts don’t involve any math or statistics.

One of the main examples of scaling discussed in The Voltage Effect is the DARE program. DARE (Drug Abuse Resistance Education) was remarkable for its enormous reach and longevity: millions of children across the US participated in it over several decades. It’s well established now that the original DARE had pretty much zero impact on drug use. Hence the question for people who want to help kids is how to keep things like that from scaling up. To answer it, the author of The Voltage Effect describes DARE as a “false positive.”

He tells the story of how, in the 80s, someone evaluated the DARE program in Honolulu. The results were so good that DARE wound up getting copied all over the country. It was a “false positive” because even though it seemed to work in Hawaii, that success could not be replicated on a larger scale. “False positives” for scaling are possible. You can have a program that’s super successful because of a charismatic leader, and that’s great, but you can’t put that secret sauce of charisma in a bottle, so you can’t replicate the program in other places. The moral of the story is to watch out for these one-hit-wonder “false positives.”

The problem is that this story is wrong. For the Hawaii 5-0 version of DARE to count as a false positive, it would have to have been super successful. But was it? Let’s evaluate that claim.

The first question to ask is “What’s the outcome?” Well, that’s not very hard to figure out. DARE materials tell us the objective was “to keep kids off drugs,” and “drug abuse resistance” is right there in the name. Therefore, the relevant outcome is drug use. If DARE works, a smaller percentage of kids who go through DARE will use drugs than in a control group. So how did the Honolulu study measure drug use? It didn’t. The researchers didn’t even try. Not even by self-report.

You can stop there.

Whatever else is in the Honolulu report doesn’t really matter. It’s not strong evidence that DARE works. It’s not a “false positive” because it doesn’t even measure the relevant outcome. It can’t be positive or negative. It’s not a basis for deciding one way or the other about launching a nationwide program. See, that was easy. No math.

Well, you say, there must have been something to the Honolulu report. Okay, let’s look it up online and read it. Oops! You can’t. It was never published. That’s another big red flag. I’m not saying that appearing in an academic journal is a guarantee of truth. But I can tell you from personal experience that peer review can be grueling, and the point of it is to improve the quality of what gets published. So, relying on reports that skip that whole process makes your decision-making iffier than necessary.

If you look for the Honolulu report, what you can find on Google Scholar is the one-paragraph abstract. I quote its conclusions here:

“While results were not definitive, there was support favoring the program's preventive potential. The program was educational, providing students with skills and information they could and did use in various situations. Participants found the program enjoyable and actively engaged with police officers in a positive, constructive learning environment. Overall, results suggest that DARE provides children with information and skills that maximize their potential for adopting healthful, drug-free habits.”

You might have noticed right away that the authors are not claiming they prevented drug use. They claim the program provided knowledge and skills. That may be true. But without strong prior evidence that knowledge about drugs keeps kids off drugs, that’s not a valid intermediate outcome. Kids “found the program enjoyable.” That’s nice, but so what? Kids find many activities enjoyable. If this report was the basis for expanding DARE all over the country, it wasn’t because the evaluation was a strong positive, false or otherwise.

The review paper from the American Journal of Public Health referenced below lists more ways in which the Honolulu report is weak evidence, but we don’t even need to get into those. Just based on the very simple analysis above, we can tell that Honolulu’s DARE experience should never have been the basis for a gigantic national scale-up.

So the lesson is not about avoiding “false positives.” It’s about avoiding nothingburgers. Often, all you need to do is ask “What’s the outcome?” The outcome is the stuff in the middle of the burger. It could be anything you want: maybe it’s beef, maybe it’s a mushroom. Whatever it is, it shouldn’t be nothing. You’re not buying the burger for the bun.


Disclaimer: The views expressed on this page do not reflect those of the University of Rochester Medical Center.