Meta Is Now Tracking Its Employees. Funny How That Works.

Part of the Big Tech's War on Users series

Tracking people without meaningful consent is wrong. It was wrong when Meta did it to users. It's wrong now. The only difference is who's upset about it.

Meta recently rolled out a tool called the Model Capability Initiative (MCI), part of a broader internal program branded the Agent Transformation Accelerator. Across hundreds of applications on employees' work computers, it logs mouse movements, clicks, and keystrokes and captures periodic screenshots. The stated goal is to train AI agents to perform the everyday computer tasks that humans currently do: working dropdown menus, using keyboard shortcuts, navigating interfaces. There is no opt-out.

And here's the kicker: that behavioral data feeds directly into Meta Superintelligence Labs — the same unit that just shipped Muse Spark, Meta's new flagship AI, which is completely proprietary. No open weights. No downloading it. No fine-tuning it. Not for you, not for developers. Meta spent years building goodwill in the open-source community through its Llama model family, then quietly walked away when the stakes got high enough. Now employees' daily work habits are being harvested to train a closed model that Meta alone controls. The community that made Llama what it was gets a vague promise that maybe, someday, future versions will be open-sourced.

It's a neat trick: mandatory surveillance of the workforce to build a closed model nobody outside Meta can touch.

Employees responded by plastering protest flyers across multiple U.S. offices — in meeting rooms, on vending machines, and on top of toilet paper dispensers. The flyers asked: "Don't want to work at the Employee Data Extraction Factory?" They pointed to a petition citing US labor law. UK employees went further and launched a unionization campaign with United Tech and Allied Workers. All of this is happening right as Meta prepares to cut 10% of its workforce by May 20. Employees are being asked to train the AI that may replace them, with no choice in the matter.

And this is far from the first time. The coming round of layoffs isn't a one-off correction — it's the latest in a years-long pattern. Meta cut 11,000 jobs in late 2022. Then 10,000 more in 2023, the year Zuckerberg cheerfully dubbed the Year of Efficiency. More cuts followed in 2024, hitting WhatsApp, Instagram, and Reality Labs teams. Then another round in early 2025. Now this. Over 20,000 jobs gone since 2022, with headcount swelling back between rounds only to be cut again. Wash, rinse, repeat — except now the remaining employees are also generating training data for the machine while they wait to see if they make the next cut.

The story blew up on Lemmy, and the top comment pretty much said everything: "We will monitor all our users and sell their metadata to advertisers, but we draw the line at Meta tracking our mouse clicks."

Yeah.

Here's the thing — Meta tracking its own employees is genuinely bad. Mandatory keystroke logging with no opt-out, on workers simultaneously watching layoff countdown clocks, is dystopian. I'm not going to pretend otherwise.

But I also can't ignore what Meta has spent twenty years doing to everyone else. And since each of these deserves its own deep dive, here's the short version:
  • The Meta Pixel — an invisible tracking script embedded on millions of third-party websites. It captures what you view, what you search, what you put in forms — and reports back to Meta whether you have a Facebook account or not. It was found on hospital systems and telehealth sites, hoovering up protected health data.
  • Shadow Profiles — Meta builds dossiers on people who never signed up, assembled from other users' contact lists, public records, and cookies. Zuckerberg was asked about this under oath in 2018 and more or less confirmed it exists while calling it a "security" measure.
  • The In-App Browser hijack — click a link inside the Facebook app and it doesn't open your browser with your privacy settings. It opens Meta's browser, where they inject JavaScript and watch everything you do on that page.
  • Cambridge Analytica — tens of millions of users' data harvested without consent. The FTC fined Meta $5 billion. Meta kept operating.
  • Facial recognition biometrics — collected without informed consent. Settled for $650 million.
  • The 2012 mood manipulation experiment — Meta secretly altered 700,000 users' feeds to study whether emotional content was contagious. Then published a paper about it.
  • WhatsApp metadata — sold to you as end-to-end encrypted and private. The metadata — who you talk to, how often, from where — still feeds Meta's systems.
  • Tracking minors — the Pixel wasn't blocked for Facebook users under 16.
Each of those is its own post. And none of them generated flyers on bathroom walls, because the people being tracked were just users. They got a cookie banner. They got terms of service written by lawyers to be unreadable. They got a congressional hearing where Zuckerberg explained to senators that Facebook runs ads.

The employees at Meta have labor law, union organizers, and Reuters on speed dial. The three billion people Meta tracks across the open web had essentially nothing.

They were never users. They were always the product.