Tens of thousands of people have been paid by Scale AI, a company part-owned by Meta, to train artificial intelligence by collecting data from Instagram accounts, harvesting copyrighted work, and transcribing pornographic soundtracks.
AI Training and Data Collection
Scale AI, 49%-controlled by Mark Zuckerberg’s social media empire, has recruited experts from fields such as medicine, physics, and economics through a platform called Outlier. The platform advertises flexible work for individuals with strong credentials, promising opportunities to “become the expert that AI learns from.”
However, workers for the platform reported becoming involved in scraping personal data and completing tasks they described as morally uncomfortable, diverging from the stated goal of refining high-level AI systems. Outlier is managed by Scale AI, which has contracts with the Pentagon and US defense companies.
Alexandr Wang, CEO of Scale AI and Meta’s chief AI officer, was described by Forbes as the “world’s youngest self-made billionaire.” Michael Kratsios, the company’s former managing director, currently serves as the science adviser to US President Donald Trump.
Concerns Over Data Privacy
One Outlier contractor in the US said users of Meta platforms, including Facebook and Instagram, would be surprised at how data from their accounts was collected, including pictures of users and their friends. “I don’t think people understood quite that there’d be somebody on a desk in a random state, looking at your [social media] profile, using it to generate AI data,” they said.
The Guardian spoke to 10 people who have worked for Outlier to train AI systems, some for over a year. Many held other jobs – as journalists, graduate students, teachers, and librarians – and sought the extra income in a challenging economy. “A lot of us were really desperate,” said one worker. “Many people really needed this job, myself included, and really tried to make the best of a bad situation.”
Some workers expressed feelings of “internalised shame and guilt” for “contributing directly to the automation of my hopes and dreams.” Glenn Danas, a partner at Clarkson, a law firm representing AI gig workers in lawsuits against Scale AI and similar platforms, estimates that hundreds of thousands of people worldwide now work for platforms such as Outlier.
Working Conditions and Tasks
Taskers described constant monitoring and unstable employment. Scale AI has been accused of using “bait-and-switch” tactics, initially promising high salaries before offering significantly less. While Scale AI declined to comment on ongoing litigation, a source stated pay rates may change if workers opt into different projects.
Workers were asked to complete unpaid AI-conducted interviews to qualify for assignments, with some believing these interviews were themselves recycled to train AI. All reported constant monitoring through Hubstaff, a tool that takes screenshots of the websites workers visit during tasks. Scale AI stated Hubstaff is used to ensure accurate payment, not for active monitoring.
Several taskers described being asked to transcribe pornographic soundtracks, label photos of dead animals or dog faeces, and label a diagram of baby genitalia. One doctoral student reported receiving audio transcripts for pornography and random clips of people vomiting, despite being told the work would contain no nudity or gore.
The Guardian reviewed videos and screenshots of tasks, including photos of dog faeces and prompts such as “What would you do if an inmate refused to follow orders in a correctional facility?” Scale AI stated it shuts down tasks with inappropriate content and allows workers to decline uncomfortable assignments, adding that it does not accept projects involving child sexual abuse material or pornography.
Social Media Scraping and Copyright Concerns
Taskers said social media scraping was an expected part of the work: seven reported scouring Instagram and Facebook accounts and tagging individuals, their locations, and their friends. Some assignments involved training the AI on the accounts of people under 18, requiring new data not yet uploaded by other workers. One task required workers to order photos from Facebook accounts sequentially by the age of the user.
Several taskers found these assignments unsettling, with one attempting to complete them using only photos of celebrities and public figures. “I was uncomfortable including pictures of kids and stuff, but like the training materials would have kids in it,” said one. Another stated, “I didn’t use any friends or family to submit [tasks] to the AI,” adding, “I do understand that I don’t like it ethically.”
Scale AI stated taskers do not review private social media accounts and were unaware of tasks involving labeling ages or relationships. They added that while children’s public social media data is used, workers do not log into personal Facebook or Instagram accounts.
Taskers also harvested images of copyrighted artwork, apparently to train an AI to produce its own images. Documentation reviewed by The Guardian included AI-generated paintings of “a Native American caregiver” with the prompt, “DO NOT use AI-generated images. Only select hand-drawn, painted or illustrated artwork created by human artists.” Scale AI stated it does not ask contributors to use copyrighted artwork and declines work that violates this standard.
Uncertainty and Future Outlook
Taskers expressed uncertainty about the purpose of their work. “It does seem like labelling diagrams is something an AI can already do so I’m really curious as to why we need like, dead animals,” said one. Scale AI has counted Google, Meta, OpenAI, the US department of defense, and the government of Qatar among its clients.
Meta and Anthropic did not respond to requests for comment. OpenAI stated it stopped working with Scale AI in June 2025 and its “supplier code of conduct sets out clear expectations for the ethical and fair treatment of all workers.”
Most taskers interviewed continue to accept assignments on the Outlier platform, despite unsteady pay and occasional layoffs. “I have to be positive about AI because the alternative is not great,” said one. “So I think eventually things will get figured out.”
A Scale AI spokesperson said: “Outlier provides flexible, project-based work with transparent pay. Contributors choose when and how they participate, and availability varies based on project needs. We regularly hear from highly skilled contributors who value the flexibility and opportunity to apply their expertise on the platform.”