Research Brief

Trying Out Bonus-Pay Theory on Unsupervised, Low-Skill Tasks

Incentives boost output, but benefits level off at a fairly low point

Executives have good reason to be uncertain about the value of financial incentives:

  • “Rewards typically undermine the very processes they are intended to enhance,” the Harvard Business Review has advised.
  • McKinsey reached the opposite conclusion: “Generous and specific financial incentives are one of the most effective tools available for executives to motivate employees.”

The value of an incentive ultimately comes down to whether it works: Does it encourage employees to put in extra effort? And that, most agree, depends largely on how the incentive is designed. A paper in Management Science proposes a methodology for devising an incentive contract that leads to greater productivity.


Cornell’s Nur Kaynar and UCLA Anderson’s Auyon Siddiq provide a theoretical model for quantifying how the size of a bonus influences worker output. In a test of the approach with workers on Amazon’s Mechanical Turk “crowdwork” platform, the authors find that a financial bonus can in fact lead to higher-quality work, but that its effect levels off at a relatively low bonus level.

What Bosses Don’t Know

The authors examine the question of financial incentives in the growing area of on-demand jobs: ride-hailing companies such as Lyft and Uber, food-delivery services like DoorDash, and freelance task work for such outfits as TaskRabbit and Amazon’s Mechanical Turk. Incentive payments here are relatively common. Lyft, for instance, pays a bonus to drivers who complete a certain number of rides within a set time period.

One characteristic of these jobs is that supervisors have limited direct oversight, so it is difficult to tell whether incentives actually spur employees to greater effort. This lack of supervision presents a classic “moral hazard” problem: the results of workers’ actions can be seen, but the effort they put into doing a job well is invisible to managers.

The lack of visibility into worker effort presents a challenge for devising an effective incentive program. Incentives are designed to spur greater effort, which should lead to better results. But because effort is unknown, the relationship between a bonus and an increase in productivity is difficult to study in a mathematical model.

Why reward effort rather than outcome? Managers like to understand their own operation, including what an individual worker does to achieve an outcome and how that might apply to other workers. Hence the desire for more information and specifically for an understanding of how effort translates into productivity. “When the employer can’t observe effort, then it is more challenging for them to design the optimal incentive contract to shape employee performance,” Siddiq explains in an email exchange.
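The logic can be stated in the textbook principal-agent form. This is a sketch under standard simplifying assumptions (risk-neutral worker, bonus paid only on a good outcome), not necessarily the exact model in the paper: the employer chooses a base payment w and a bonus b, and the worker then chooses an unobserved effort level e, which raises the probability p(e) of a good outcome at a personal cost c(e):

\[
\max_{w,\,b}\; p(e^{*})\,V - \bigl(w + p(e^{*})\,b\bigr)
\quad \text{subject to} \quad
e^{*} \in \arg\max_{e}\; w + p(e)\,b - c(e),
\]

where V is the value of a good outcome to the employer. Because only the outcome is observed, pay can be conditioned on results but never directly on effort, which is what makes the optimal w and b hard to pin down.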

Finding Typos for Pay

To address this difficulty, the authors devised an algorithm that takes data from past incentive contracts and their results and reverse-engineers a behavioral model that closely matches that data.
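The paper’s estimation procedure is considerably more involved, but the basic idea, fit a behavioral model to observed contract terms and outcomes and then use it to evaluate candidate contracts, can be sketched in a few lines. The snippet below is purely illustrative: the data are simulated, and the simple logistic model is a stand-in, not the authors’ method.

```python
# Illustrative sketch only: fit a behavioral model of bonus attainment to
# (simulated) past contract data, then use it to compare candidate contracts.
# The paper estimates a structural model; this logistic fit is a stand-in.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical historical data: each row is one worker-contract observation.
n = 1000
base_pay = rng.uniform(0.10, 1.00, size=n)    # dollars
bonus_pay = rng.uniform(0.10, 1.00, size=n)   # dollars

# Simulated outcomes with diminishing returns to the bonus (an assumption).
p_qualify = 1 / (1 + np.exp(-(-1.0 + 1.5 * np.sqrt(bonus_pay) + 0.3 * base_pay)))
qualified = rng.binomial(1, p_qualify)        # 1 = worker met the bonus threshold

# "Reverse-engineer" a behavioral model: predict qualification from the pay terms.
X = np.column_stack([base_pay, bonus_pay])
model = LogisticRegression().fit(X, qualified)

# Use the fitted model to compare candidate (base, bonus) contracts before offering them.
candidates = np.array([[0.10, 0.25], [0.10, 0.75], [1.00, 0.75]])
print(model.predict_proba(candidates)[:, 1])  # predicted share earning the bonus
```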

To test its effectiveness, the authors applied the model to freelancers using Amazon’s Mechanical Turk task platform, where groups of workers are paid modest fees to perform tasks that are simple for humans but difficult to automate — identifying pictures of cats, for example.

For the study, they recruited 500 Mechanical Turk workers to identify typographical errors in a one-page, 500-word excerpt from a newspaper article. The workers were promised a base amount if they found at least 25% of the typos, along with a bonus for finding at least 75% of the typos. Each worker was randomly assigned a unique pay scale, with combined base and bonus payments starting at 10 cents and rising at 1-cent intervals to $1.

Remarkably, many of the participants found none of the 10 typos in the excerpt. This, the authors say, is fairly common with Mechanical Turk tasks, in which workers often put in little or no effort in the hopes of getting paid anyway.

When the base payment was low, the experiment found, an increase in the incentive payment produced a moderate increase in the percentage of workers who qualified for the bonus (from 21% at a 10-cent bonus to 36% at a $1 bonus in the largest group studied). When the base payment was $1, however, raising the bonus payment resulted in only a small increase in the share of workers who earned the incentive payment, from 34% to 37%.
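A quick back-of-the-envelope calculation makes the diminishing-returns pattern concrete. It assumes the bonus rises over the same 10-cent-to-$1 range in both conditions; the article states that range explicitly only for the low-base group.

```python
# Back-of-the-envelope check of the diminishing-returns pattern reported above.
# Assumption: in both conditions the bonus rises from $0.10 to $1.00.
bonus_increase_cents = 100 - 10   # a 90-cent increase in the bonus

low_base_gain_pp = 36 - 21        # percentage-point gain in bonus qualification, low base
high_base_gain_pp = 37 - 34       # percentage-point gain, $1 base

print(low_base_gain_pp / bonus_increase_cents)   # ~0.17 pp per extra cent of bonus
print(high_base_gain_pp / bonus_increase_cents)  # ~0.03 pp per extra cent of bonus
```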

“These results suggest that increasing the bonus payment can indeed increase quality,” the authors write, “but the effect is significantly diminished when the base payment is already high.”

Featured Faculty

  • Auyon Siddiq

    Assistant Professor of Decisions, Operations and Technology Management

About the Research

Kaynar, N., & Siddiq, A. (2022). Estimating Effects of Incentive Contracts in Online Labor Platforms. Management Science. https://doi.org/10.1287/mnsc.2022.4450
