In early 2020, gig workers for the app-based delivery company Shipt noticed something strange about their paychecks. The company, which had been acquired by Target in 2017 for US $550 million, offered same-day delivery from local stores. Those deliveries were made by Shipt workers, who shopped for the items and drove them to customers’ doorsteps. Business was booming at the start of the pandemic, as the COVID-19 lockdowns kept people in their homes, and yet workers found that their paychecks had become…unpredictable. They were doing the same work they’d always done, yet their paychecks were often less than they expected. And they didn’t know why.
On Facebook and Reddit, workers compared notes. Previously, they’d known what to expect from their pay because Shipt had a formula: It gave workers a base pay of $5 per delivery plus 7.5 percent of the total amount of the customer’s order through the app. That formula allowed workers to look at order amounts and choose jobs that were worth their time. But Shipt had changed the payment rules without alerting workers. When the company finally issued a press release about the change, it revealed only that the new pay algorithm paid workers based on “effort,” which included factors like the order amount, the estimated amount of time required for shopping, and the mileage driven.
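That old formula was transparent enough to fit in a few lines of code. Here's a minimal sketch (the function name is mine; the $5 base and 7.5 percent commission are the figures Shipt had published):

```python
def old_shipt_pay(order_total: float) -> float:
    """Shipt's original formula: a $5 base plus 7.5 percent of the order total."""
    return 5.00 + 0.075 * order_total

# A worker eyeing a $120 order knew to expect $5 + $9 = $14, before any tip.
print(old_shipt_pay(120.00))  # 14.0
```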
The Shopper Transparency Calculator used optical character recognition to parse workers’ screenshots and find the relevant information (A). The data from each worker was stored and analyzed (B), and workers could interact with the tool by sending various commands to learn more about their pay (C). Illustration: Dana Calacci
The company claimed this new approach was fairer to workers and that it better matched the pay to the labor required for an order. Many workers, however, just saw their paychecks dwindling. And since Shipt didn’t release detailed information about the algorithm, it was essentially a black box that the workers couldn’t see inside.
The workers could have quietly accepted their fate, or sought employment elsewhere. Instead, they banded together, gathering data and forming partnerships with researchers and organizations to help them make sense of their pay data. I’m a data scientist; I was drawn into the campaign in the summer of 2020, and I proceeded to build an SMS-based tool—the Shopper Transparency Calculator—to collect and analyze the data. With the help of that tool, the organized workers and their supporters essentially audited the algorithm and found that it had given 40 percent of workers substantial pay cuts. The workers showed that it’s possible to fight back against the opaque authority of algorithms, creating transparency despite a corporation’s wishes.
How We Built a Tool to Audit Shipt
It started with a Shipt worker named Willy Solis, who noticed that many of his fellow workers were posting in the online forums about their unpredictable pay. He wanted to understand how the pay algorithm had changed, and he figured that the first step was documentation. At that time, every worker hired by Shipt was added to a Facebook group called the Shipt List, which was administered by the company. Solis posted messages there inviting people to join a different, worker-run Facebook group. Through that second group, he asked workers to send him screenshots showing their pay receipts from different months. He manually entered all the information into a spreadsheet, hoping that he’d see patterns and thinking that maybe he’d go to the media with the story. But he was getting thousands of screenshots, and it was taking a huge amount of time just to update the spreadsheet.
That’s when Solis contacted Coworker, a nonprofit organization that supports worker advocacy by helping with petitions, data analysis, and campaigns. Drew Ambrogi, then Coworker’s director of digital campaigns, introduced Solis to me. I was working on my Ph.D. at the MIT Media Lab, but feeling somewhat disillusioned about it, because my research had focused on gathering data from communities for analysis without any community involvement. I’d been reading about delivery gig workers’ experiences during the pandemic: They were suddenly deemed essential workers, yet their working conditions had only gotten worse. I saw the Shipt case as a way to work with a community and help its members control and leverage their own data. When Ambrogi told me that Solis had been collecting data about Shipt workers’ pay but didn’t know what to do with it, I saw a way to be useful.
Throughout the worker protests, Shipt said only that it had updated its pay algorithm to better match payments to the labor required for jobs; it wouldn’t provide detailed information about the new algorithm. Its corporate photographs present idealized versions of happy Shipt shoppers. Photo: Shipt
Companies whose business models rely on gig workers have an interest in keeping their algorithms opaque. This “information asymmetry” helps companies better control their workforces—they set the terms without divulging details, and workers’ only choice is whether or not to accept those terms. The companies can, for example, vary pay structures from week to week, experimenting to find out, essentially, how little they can pay and still have workers accept the jobs. There’s no technical reason why these algorithms need to be black boxes; the real reason is to maintain the power structure.
For Shipt workers, gathering data was a way to gain leverage. Solis had started a community-driven research project that was collecting good data, but in an inefficient way. I wanted to automate his data collection so he could do it faster and at a larger scale. At first, I thought we’d create a website where workers could upload their data. But Solis explained that we needed to build a system that workers could easily access with just their phones, and he argued that a system based on text messages would be the most reliable way to engage workers.
Based on that input, I created a textbot: Any Shipt worker could send screenshots of their pay receipts to the textbot and get automated responses with information about their situation. I coded the textbot as a simple Python script and ran it on my home server; we used a service called Twilio to send and receive the texts. The system used optical character recognition—the same technology that lets you search for a word in a PDF file—to parse the image of the screenshot and pull out the relevant information. It collected details about the worker’s pay from Shipt, any tip from the customer, and the time, date, and location of the job, and it put everything in a Google spreadsheet. The character-recognition system was fragile, because I’d coded it to look for specific pieces of information in certain places on the screenshot. A few months into the project, when Shipt did an update and the workers’ pay receipts suddenly looked different, we had to scramble to update our system.
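The snippet below is a stripped-down sketch of that pipeline, not the production code. It assumes Twilio’s standard webhook for incoming MMS messages and the open-source pytesseract OCR library; the regular expressions and receipt fields are illustrative assumptions, not Shipt’s actual format.

```python
# Sketch of the textbot pipeline: receive an MMS screenshot via a Twilio
# webhook, OCR it with pytesseract, and pull out pay details with regexes.
import csv
import io
import re

import pytesseract
import requests
from flask import Flask, request
from PIL import Image
from twilio.twiml.messaging_response import MessagingResponse

app = Flask(__name__)

# Fragile by design: these patterns expect text in a fixed layout, which is
# why an app update that changed the receipts broke our parser.
PAY_RE = re.compile(r"Shipt pay:?\s*\$([\d.]+)")
TIP_RE = re.compile(r"Tip:?\s*\$([\d.]+)")

def save_row(sender: str, pay: str, tip: str) -> None:
    """Append one parsed receipt to local storage (a CSV file stands in
    for the Google spreadsheet the real tool used)."""
    with open("receipts.csv", "a", newline="") as f:
        csv.writer(f).writerow([sender, pay, tip])

@app.route("/sms", methods=["POST"])
def incoming_sms():
    """Twilio calls this endpoint for every text the bot receives."""
    resp = MessagingResponse()
    if int(request.form.get("NumMedia", 0)) == 0:
        resp.message("Send a screenshot of a pay receipt to get started.")
        return str(resp)

    # Download the screenshot Twilio stored for us and run OCR on it.
    image_bytes = requests.get(request.form["MediaUrl0"]).content
    text = pytesseract.image_to_string(Image.open(io.BytesIO(image_bytes)))

    pay = PAY_RE.search(text)
    tip = TIP_RE.search(text)
    if pay:
        save_row(request.form["From"], pay.group(1), tip.group(1) if tip else "0")
        resp.message("Got it! Screenshot recorded.")
    else:
        resp.message("Sorry, I couldn't read that one. Try a clearer screenshot.")
    return str(resp)
```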
Each person who sent in screenshots had a unique ID tied to their phone number, but the only demographic information we collected was the worker’s metro area. From a research perspective, it would have been interesting to see if pay rates had any connection to other demographics, like age, race, or gender, but we wanted to assure workers of their anonymity, so they wouldn’t worry about Shipt firing them just because they had participated in the project. Sharing data about their work was technically against the company’s terms of service; astoundingly, workers—including gig workers who are classified as “independent contractors”—often don’t have rights to their own data.
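One standard way to build that kind of pseudonymous ID is to hash each phone number with a secret salt the moment a message arrives. The sketch below shows the general technique, not our exact implementation:

```python
import hashlib
import os

# A secret salt keeps anyone from re-deriving IDs by hashing phone numbers.
SALT = os.environ["TEXTBOT_SALT"].encode()

def anonymous_id(phone_number: str) -> str:
    """Derive a stable, pseudonymous worker ID from a phone number."""
    return hashlib.sha256(SALT + phone_number.encode()).hexdigest()[:12]

print(anonymous_id("+15551234567"))  # the same input always yields the same ID
```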
Once the system was ready, Solis and his allies spread the word via a mailing list and workers’ groups on Facebook and WhatsApp. They called the tool the Shopper Transparency Calculator and urged people to send in screenshots. After an individual had sent in 10 screenshots, they would get a message with an initial analysis of their particular situation: The tool determined whether the person was being paid under the new algorithm, and if so, how much more or less money they’d have earned if Shipt hadn’t changed its pay system. A worker could also request information about how much of their income came from tips and how much other shoppers in their metro area were earning.
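The per-worker analysis was essentially a counterfactual comparison against the old formula. Here’s a minimal sketch, assuming each parsed receipt records the order total and the actual payout (the field names are mine):

```python
def old_formula(order_total: float) -> float:
    """The original formula: a $5 base plus 7.5% of the order total."""
    return 5.00 + 0.075 * order_total

def summarize(receipts: list[dict]) -> str:
    """Compare a worker's actual pay with what the old formula would have paid."""
    actual = sum(r["pay"] for r in receipts)
    counterfactual = sum(old_formula(r["order_total"]) for r in receipts)
    delta = actual - counterfactual
    more_or_less = "more" if delta >= 0 else "less"
    return (f"Across {len(receipts)} orders, you earned ${abs(delta):.2f} "
            f"{more_or_less} than the old $5 + 7.5% formula would have paid.")

print(summarize([
    {"order_total": 120.00, "pay": 11.50},  # old formula would pay $14.00
    {"order_total": 60.00, "pay": 10.00},   # old formula would pay $9.50
]))
```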
How the Shipt Pay Algorithm Shortchanged Workers
By October of 2020, we had received more than 5,600 screenshots from more than 200 workers, and we paused our data collection to crunch the numbers. Among the shoppers being paid under the new algorithm, 40 percent were earning more than 10 percent less than they would have under the old formula. What’s more, looking at data from all geographic regions, we found that about one-third of workers were earning less than their state’s minimum wage.
It wasn’t a clear-cut case of wage theft, because 60 percent of workers were making about the same or slightly more under the new scheme. But we felt it was important to shine a light on the 40 percent who had received an unannounced pay cut through a black-box transition.
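In aggregate, the audit boiled down to two counts over a per-worker table: who took a cut of more than 10 percent, and whose effective hourly rate fell below their state’s minimum wage. A condensed sketch with made-up numbers:

```python
def audit(workers: list[dict]) -> None:
    """Count workers with >10% pay cuts and workers earning below their
    state's minimum wage. The records here are illustrative, not our data."""
    n = len(workers)
    pay_cut = sum(1 for w in workers
                  if w["actual_pay"] < 0.90 * w["old_formula_pay"])
    below_min = sum(1 for w in workers
                    if w["actual_pay"] / w["hours_worked"] < w["minimum_wage"])
    print(f"{pay_cut / n:.0%} of workers took a pay cut of more than 10%")
    print(f"{below_min / n:.0%} earned below their state's minimum wage")

audit([
    {"actual_pay": 450.0, "old_formula_pay": 520.0,
     "hours_worked": 40, "minimum_wage": 7.25},
    {"actual_pay": 610.0, "old_formula_pay": 600.0,
     "hours_worked": 35, "minimum_wage": 12.00},
])
```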
In addition to fair pay, workers also want transparency and agency. This project highlighted how much effort and infrastructure it took for Shipt workers to get that transparency: It took a motivated worker, a research project, a data scientist, and custom software to reveal basic information about these workers’ conditions. In a fairer world where workers have basic data rights and regulations require companies to disclose information about the AI systems they use in the workplace, this transparency would be available to workers by default.
Our research didn’t determine how the new algorithm arrived at its payment amounts. But a July 2020 blog post from Shipt’s technical team described the data the company had about the size of the stores it worked with, and its calculations of how long it would take a shopper to walk through each one. Our best guess was that Shipt’s new pay algorithm estimated the time it would take a worker to complete an order (both finding items in the store and driving) and then tried to pay about $15 per hour. It seemed likely that the workers who received a pay cut took more time than the algorithm predicted.
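That guess translates into a very simple model. To be clear, the sketch below is speculation reconstructed from the blog post and our data; every number in it is an assumption, not Shipt’s code:

```python
def guessed_new_pay(shop_minutes: float, drive_minutes: float,
                    target_hourly: float = 15.0) -> float:
    """Our hypothesis: estimate the total time for the job, then pay a
    target hourly rate on that estimate. Speculative, not Shipt's code."""
    estimated_hours = (shop_minutes + drive_minutes) / 60
    return target_hourly * estimated_hours

# If the algorithm predicts 40 minutes of shopping and 20 of driving, the
# job pays $15. A worker who actually needs 90 minutes earns $10 per hour.
print(guessed_new_pay(40, 20))  # 15.0
```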
Shipt workers protested in front of the headquarters of Target (which owns Shipt) in October 2020. They demanded that the company return to a pay algorithm based on a simple and transparent formula. Photo: The SHIpT List
Solis and his allies used the results to get media attention as they organized strikes, boycotts, and a protest at Shipt headquarters in Birmingham, Ala., and Target’s headquarters in Minneapolis. They asked for a meeting with Shipt executives, but they never got a direct response from the company. Its statements to the media were maddeningly vague, saying only that the new payment algorithm compensated workers based on the effort required for a job, and implying that workers had the upper hand because they could “choose whether or not they want to accept an order.”
Did the protests and news coverage have an effect on worker conditions? We don’t know, and that’s disheartening. But our experiment served as an example for other gig workers who want to use data to organize, and it raised awareness about the downsides of algorithmic management. What’s needed, though, are wholesale changes to platforms’ business models.
An Algorithmically Managed Future?
Since 2020, there have been a few hopeful steps forward. The European Union recently came to an agreement about a rule aimed at improving the conditions of gig workers. The so-called Platform Workers Directive is considerably watered down from the original proposal, but it does ban platforms from collecting certain types of data about workers, such as biometric data and data about their emotional state. It also gives workers the right to information about how the platform algorithms make decisions and to have automated decisions reviewed and explained, with the platforms paying for the independent reviews. While many worker-rights advocates wish the rule went further, it’s still a good example of regulation that reins in the platforms’ opacity and gives workers back some dignity and agency.
Some debates over gig workers’ data rights have even made their way to courtrooms. For example, the Worker Info Exchange, in the United Kingdom, won a case against Uber in 2023 about its automated decisions to fire two drivers. The court ruled that the drivers had to be given information about the reasons for their dismissal so they could meaningfully challenge the robo-firings.
In the United States, New York City passed the country’s first minimum-wage law for gig workers, and last year the law survived a legal challenge from DoorDash, Uber, and Grubhub. Before the new law, the city had determined that its 60,000 delivery workers were earning about $7 per hour on average; the law raised the rate to about $20 per hour. But the law does nothing about the power imbalance in gig work—it doesn’t improve workers’ ability to determine their working conditions, gain access to information, reject surveillance, or dispute decisions.
Willy Solis spearheaded the effort to determine how Shipt had changed its pay algorithm by organizing his fellow Shipt workers to send in data about their pay—first directly to him, and later using a textbot. Photo: Willy Solis
Elsewhere in the world, gig workers are coming together to imagine alternatives. Some delivery workers have started worker-owned services and have joined together in an international federation called CoopCycle. When workers own the platforms, they can decide what data they want to collect and how they want to use it. In Indonesia, couriers have created “base camps” where they can recharge their phones, exchange information, and wait for their next order; some have even set up informal emergency response services and insurance-like systems that help couriers who have road accidents.
While the story of the Shipt workers’ revolt and audit doesn’t have a fairy-tale ending, I hope it still inspires other gig workers, as well as shift workers whose hours are increasingly controlled by algorithms. Even when these workers want to understand how the algorithms make decisions about them, they often lack the data access and technical skills to find out. But if they consider the questions they have about their working conditions, they may realize that they can collect useful data to answer those questions. And there are researchers and technologists who are interested in applying their technical skills to such projects.
Gig workers aren’t the only people who should be paying attention to algorithmic management. As artificial intelligence creeps into more sectors of our economy, white-collar workers find themselves subject to automated tools that define their workdays and judge their performance.
During the COVID-19 pandemic, when millions of professionals suddenly began working from home, some employers rolled out software that captured screenshots of their employees’ computers and algorithmically scored their productivity. It’s easy to imagine how the current boom in generative AI could build on these foundations: For example, large language models could digest every email and Slack message written by employees to provide managers with summaries of workers’ productivity, work habits, and emotions. These technologies not only harm people’s dignity, autonomy, and job satisfaction; they also create an information asymmetry that limits workers’ ability to challenge or negotiate the terms of their work.
We can’t let it come to that. The battles that gig workers are fighting are the leading front in the larger war for workplace rights, which will affect all of us. The time to define the terms of our relationship with algorithms is right now.