Nudges already proven to work in the real world increased uptake of COVID-19 boosters; nudges based on lab findings and expert insights, not so much
The COVID-19 pandemic presented the field of behavioral science with an enormously consequential real-world testing ground. Once vaccinations became available, the challenge was to encourage as many people as possible to get vaccinated.
UCLA Anderson’s Hengchen Dai and Carnegie Mellon’s Silvia Saccardo, along with UCLA Health’s Maria Han, Daniel Croymans and other co-authors, contributed to this field-testing bonanza with a large-scale real-world research project in 2021 that studied the effectiveness of a series of specific text nudges sent to nearly 100,000 patients of the UCLA Health System to remind them they were eligible for a jab.
That research, published in Nature, found that a text nudge promoting a sense of ownership, a personalized note urging patients to “Claim your dose by making a vaccination appointment,” was more effective than a text that simply provided a link to the online vaccination scheduling tool.
Theory vs. Practice
The same research also unearthed an interesting schism between theory and practice for one particular type of nudge. In online experiments conducted via Amazon Mechanical Turk and Prolific Academic, participants who were shown a video about the value of COVID-19 vaccinations reported a higher probability of getting vaccinated. That finding was in line with other research suggesting that informational videos might increase vaccination intentions. But when the video was put to a real-world test in the 2021 field experiment, adding a link to it in a text message didn’t increase uptake.
Dai and Saccardo, along with UCLA Geffen School of Medicine’s Han, Sitaram Vangala, Juyea Hoo and Jeffrey Fujimoto, are now back with fresh research that drills into this potential disconnect between what people say they will do in a hypothetical scenario and what they actually do in the real world.
COVID-19 vaccines were once again the focal point in their new field experiment. This time they tested various nudges gleaned from different forms of research — prior field tests, online surveys, expert predictions — for their effectiveness in getting people to schedule the booster vaccine.
In an article published in Nature Human Behaviour, the authors report that nudges that had proven effective in prior real-world field tests were also effective in encouraging people to get a booster vaccination, including the “ownership” framing from their earlier COVID-19 research.
But nudges based on hypothetical findings or expert predictions got lost in real-world translation, failing to increase the likelihood of receiving the booster shot.
“While hypothetical surveys and self-reports are undoubtedly valuable for providing foundational evidence on the mechanisms of human behavior, our findings suggest that they may not always translate to complex real-world situations where various factors can affect behavior,” they write.
Taking a Shot at Increasing Booster Uptake
The researchers texted more than 300,000 patients in the UCLA Health System one of 14 messages that prior field tests, lab research or expert surveys suggested might encourage them to get the booster shot; a control group did not receive any text message.
The previously field-tested nudges, a simple reminder and a note playing up a psychological sense of ownership (“Claim your dose”), compelled more patients to get a booster than simply being told the booster was available.
All other nudges, gleaned from research dependent on hypothetical scenarios or expert predictions, fell flat.
A half-dozen text nudges aimed to pull psychological strings that prior lab research has shown can shift behavior. One message pushed the notion of consistency (hey, given you completed the first round of vaccinations…); another appealed to “uniqueness,” briefly explaining that the booster was indeed different because it was designed to attack the most prevalent COVID-19 strains at that stage of the pandemic. Another played up “severity,” or the fear factor, texting that “the chances that a healthy adult will develop severe or long-lasting COVID-19 symptoms are higher than many people realize.”
In the field test, these nudges weren’t effective. That runs counter to what the researchers found when they tested the same six nudges in an online Mechanical Turk survey of more than 1,700 participants: Five of the six nudges in that hypothetical setting significantly increased a participant’s reported likelihood of scheduling a booster shot.
There was a similar disconnect when the researchers field-tested nudges that bundled messaging about the COVID-19 booster with a seasonal flu vaccine. A small survey of behavioral experts and a separate survey of laypeople suggested that a nudge combining messaging about both protective vaccinations would be more effective than a simple text reminder about the bivalent booster. But when tested in the real world, the bundling nudges didn’t move the needle.
Though the researchers take care to point out that their findings are limited to COVID-19 booster vaccinations, they also situate their work within the intensifying conversation about the real-world efficacy of research built on hypotheticals or theoretical assumptions.
“Growing concerns about the replicability and reliability of scientific findings have sparked a much-needed conversation about the importance of scientific rigor,” the authors write.
At a minimum, this seems to suggest that before resources are devoted to implementing any type of nudge at a policy or institutional level, it should first be taken for a spin in a real-world road test.
Featured Faculty
- Hengchen Dai, Associate Professor of Management and Organizations and Behavioral Decision Making
- Maria Han, Chief Quality Officer for the UCLA Health Department of Medicine and Assistant Clinical Professor
- Sitaram Vangala, David Geffen School of Medicine at UCLA
About the Research
Saccardo, S., Dai, H., Han, M.A., Vangala, S., Hoo, J., & Fujimoto, J. (2024). Field-Testing the Transferability of Behavioural Science Knowledge on Promoting Vaccinations. Nature Human Behaviour.