Flock Uses Overseas Gig Workers to Build Its Surveillance AI
Summary
An accidental exposure of internal materials shows Flock — the company behind a widespread network of AI-enabled cameras and automatic licence-plate readers in the US — has been using gig workers overseas (notably people on Upwork in the Philippines) to review and annotate footage used to train its machine-learning systems. The leaked panel and training slides indicate tasks such as transcribing licence plates, classifying vehicle make, colour and type, labelling people and clothing, and even reviewing audio for events like “screaming” or “gunshots.”
Those revelations raise fresh concerns about who can access sensitive footage gathered from thousands of US communities, how well that data is controlled, and the ethics and privacy implications of outsourcing annotation labour for mass-surveillance systems. After queries from journalists, the exposed panel was removed and Flock declined to comment.
Key Points
- Leaked internal material shows Flock used freelance annotators (via Upwork) — some located in the Philippines — to label footage from US cameras.
- Annotation tasks included transcribing licence plates, classifying vehicles, labelling people and clothing, and audio classification (e.g. gunshots, screams).
- Flock’s network is widely deployed; law enforcement can search vehicle movements nationwide and has reportedly conducted lookups for agencies including ICE.
- Training guides and a Flock patent indicate the system aims to detect detailed attributes, with mentions of race detection in filings.
- The exposed dashboard displayed large volumes of annotations and lists of annotators; it was taken down after journalists made enquiries.
- Outsourcing sensitive surveillance labelling overseas highlights data-handling, privacy and legal risks, especially where police access is routine and searches are often run without a warrant.
Context and relevance
Outsourcing annotation is common in AI development because it’s cheaper, but this story sits at the intersection of three fast-moving trends: mass deployment of video surveillance, increasing reliance on human-labelled data to train predictive systems, and the gig economy that supplies that labour. For communities under camera networks, it matters whether footage of everyday movements is analysed abroad, how strictly access is controlled, and what oversight exists when law enforcement queries the system.
Author note — punchy: This isn’t just a data-labelling story. It’s about who gets to watch public life, where those viewers are located, and how little daylight there sometimes is on the process. If you care about privacy, policing or the ethics of AI, the details matter.
Why should I read this?
Short version: because it explains how a major US surveillance network is built on cheap, outsourced human labour — and why that matters. If you want to understand the risks (privacy leaks, cross-border access to footage, and weak oversight) without wading through full investigative pieces, this summary saves you time and points to the bits worth digging into.
Source
Source: https://www.wired.com/story/flock-uses-overseas-gig-workers-to-build-its-surveillance-ai/
