Meta terminates its contract with Sama following reports from Kenyan workers who encountered intimate videos recorded by users of Ray-Ban smart glasses.
**TL;DR** Meta terminated its contract with Sama after Kenyan data annotation workers revealed to Swedish journalists that they had seen explicit footage, including sexual activity, captured by Meta’s Ray-Ban smart glasses. The 1,108 workers received just six days’ notice of their layoffs. The controversy triggered a class action lawsuit, regulatory investigations in the UK and Kenya, and an advisory from the Electronic Frontier Foundation (EFF). This situation highlights the human labor behind AI: those who train models view everything but own nothing, and they may lose their jobs for speaking out.
In February 2026, workers at Sama, a Nairobi outsourcing firm contracted by Meta, disclosed to the Swedish newspapers Svenska Dagbladet and Göteborgs-Posten that they had reviewed footage recorded by users of Meta’s Ray-Ban smart glasses, including sensitive content such as sexual activity, bathroom use, and undressing. Their job was to label this content so Meta’s AI systems could learn from it. Less than two months after the investigation was published, Meta ended its agreement with Sama, and on April 16 Sama issued formal redundancy notices to 1,108 employees. Meta said Sama "doesn't meet our standards"; Sama contested that claim, stating it had never been alerted to any shortcomings. Naftali Wambalo, co-founder of the Africa Tech Workers Movement, argued that the real motive was retaliation against the workers who spoke out, a claim Meta has yet to address. The people who trained the AI saw what the glasses captured, and then lost their jobs.
**The glasses**
In 2025, Meta sold over seven million pairs of Ray-Ban smart glasses, a significant increase over the previous year. The product line has since added prescription models aimed at the billions of people who already wear corrective eyewear, shifting the glasses from a novelty toward a standard item. They can record video, take photos, stream audio, and answer queries through Meta AI, which analyzes images and voice commands either on the device itself or in the cloud. An LED on the frame lights up when the camera is active, which Meta presents as a privacy feature for bystanders: it tells strangers they might be recorded. It does not tell them that the recording may later be reviewed by a worker in Nairobi who labels the footage so an algorithm can distinguish a kitchen from a bedroom, a handshake from an embrace, or a document from a face.
Meta’s privacy policy tells users who consent to AI training that their footage will be processed by the company’s AI systems. What the policy does not mention is the human review step. Before a model can learn to interpret scenes, people must watch and describe them. The Swedish investigation showed that step in practice: workers in Kenya, employed by a contractor, examined some of the most private moments of strangers’ lives and catalogued footage that had not been anonymized first. They could see faces and personal details, with no way to contact the people filmed, no mechanism to flag footage they suspected was recorded without consent, and no ability to refuse the work without risking their jobs.
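To make the human review step concrete, here is a minimal sketch of what a data-annotation record might look like. The names and schema are entirely hypothetical, not Meta’s or Sama’s actual tooling; the point is only to show the general shape of human-in-the-loop labeling, where a reviewer watches raw footage and records structured labels that a model later trains on.

```python
# Illustrative sketch only: clip IDs, label vocabulary, and class names are
# invented for this example. It models the workflow described above: a human
# reviewer views a frame of raw, un-anonymized footage and records labels.
from dataclasses import dataclass, field


@dataclass
class AnnotationTask:
    clip_id: str                              # identifier of the uploaded clip
    frame_index: int                          # which frame the reviewer is viewing
    labels: list[str] = field(default_factory=list)

    def add_label(self, label: str) -> None:
        """Record what the human reviewer saw in this frame."""
        self.labels.append(label)


# A reviewer tags a single frame. Nothing in this pipeline anonymizes the
# footage automatically; a person sees the raw recording first.
task = AnnotationTask(clip_id="clip-0001", frame_index=42)
task.add_label("indoor/kitchen")
task.add_label("person/face-visible")
print(task.labels)  # the structured output a model would be trained on
```

The structured labels, not the footage itself, are what the learning algorithm ultimately consumes; the footage exists so a human can produce them.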
**The workers**
Sama was founded in 2008 as a social enterprise aimed at providing dignified digital employment in low-income communities. Operating in Kenya, Uganda, and India, the company has supplied data annotation services to major tech firms, including Google, Microsoft, and Meta. The contract with Meta for annotating smart glasses data was one of several agreements between the two entities. Workers were responsible for labeling images and videos captured by the glasses, which required them to view and categorize whatever the cameras recorded.
The Swedish investigation reported what workers described seeing: users engaged in intimate acts, using the toilet, undressing, and handling financial details. None of it was exotic; it was simply what a camera worn throughout the day records when it captures whatever the wearer looks at. The workers said the job was distressing, but their options were limited: it paid better than most alternatives, and non-disclosure agreements kept them from discussing openly what they reviewed. The publication of the investigation gave these workers the voice they had previously been denied.
On April 16, less than seven weeks after the investigation was unveiled, Sama informed 1,108 employees of their impending layoffs with only six days' notice. Meta attributed the contract termination to a failure on Sama's part to meet certain unspecified standards, while Sama expressed surprise and disappointment, noting they had not been alerted to any issues prior to the announcement. Labor advocates, regulators, and the workers themselves pointed out the timing of the layoffs. Wambalo claimed that Meta was enforcing secrecy rather than actual quality standards.
**The pattern**
This isn’t the first controversy in Sama’s dealings with Meta. From 2019 to 2023, Sama employed content moderators in Nairobi to review posts flagged for possible violations of Facebook’s community standards. Moderators were exposed to graphic violence, sexual abuse, hate speech, and other disturbing content for hours each day, work that later became the subject of litigation in Kenyan courts.
