

Ethical Hacking News

No-Gym-Intelligence: A Lurking Threat to Personal Security


HelloGym's exposed audio database contains 1.6 million recordings of gym customers and staff, raising serious concerns about personal security and data protection.

  • HelloGym's database of audio calls was discovered to be unencrypted and non-password protected, exposing sensitive customer information.
  • The database contained approximately 1.6 million audio files from various franchise locations, including names, phone numbers, and reasons for the call.
  • The exposure raises broader concerns about how organizations handle personal and financial information; the recordings could be abused to monitor calls, intercept payment details, or impersonate gym staff.
  • Advances in AI technology pose a significant threat to individuals' identities, particularly with the creation of realistic deepfakes using stolen audio recordings.
  • The discovery emphasizes the importance of proper security measures, including encryption, penetration testing, and segmenting data that is no longer in use.



  • HelloGym, a company that provides sales, marketing, phone-answering, and VoIP call services for several top gym brands, including Anytime Fitness, Snap Fitness, and UFC Gym, inadvertently exposed sensitive customer information through an unencrypted, non-password-protected database. Security researcher Jeremiah Fowler discovered the repository of audio calls in late July 2025.

    According to Fowler, the database contained approximately 1.6 million audio files from franchise locations of some of the largest fitness brands in the US and Canada. The recordings were stored as MP3s and included people's names, phone numbers, and reasons for the call, such as renewing or canceling memberships. The calls spanned 2020 to 2025.

    The database appeared to be a storage repository for VoIP audio files meant for internal use only. Because it lacked basic security controls, however, it was accessible to anyone who found it, exposing sensitive information about HelloGym's customers and employees, including what is potentially biometric data in the form of voice recordings.
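    This kind of misconfiguration is exactly what defenders can check for on their own infrastructure. The sketch below (hypothetical URLs and names; not HelloGym's actual setup) sends an unauthenticated request to a storage endpoint and classifies the response: a 200 with no credentials means the resource is publicly readable.

```python
# Minimal sketch: probe whether a storage endpoint is readable without
# credentials. Any endpoint names used with this are illustrative only.
import urllib.error
import urllib.request


def classify(status: int) -> str:
    """Map an HTTP status from an unauthenticated request to a verdict."""
    if status == 200:
        return "EXPOSED: readable without authentication"
    if status in (401, 403):
        return "protected: authentication required"
    return f"inconclusive (HTTP {status})"


def probe_unauthenticated(url: str, timeout: float = 5.0) -> str:
    """Request the URL with no credentials attached and classify the result."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return classify(resp.status)
    except urllib.error.HTTPError as e:
        # Error statuses (401/403/404/...) still tell us something useful.
        return classify(e.code)
```

    Run against endpoints you own, a 200 here is the same signal Fowler relied on: no password prompt, no encryption, just data.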

    The vulnerability of this database highlights a serious concern about how organizations handle personal and financial information. Exposed recordings like these could be used for a range of malicious purposes: monitoring calls or voicemails to intercept payment details given over the phone, impersonating gym staff in social engineering attacks, or cloning voices to impersonate company executives.

    Furthermore, advances in AI technology, such as voice cloning and deepfake generation, make these attacks more sophisticated. Microsoft has previously documented instances of AI tools being used to spoof voice identification or impersonate specific speakers. The possibility of turning stolen audio recordings into realistic deepfakes poses a significant threat to individuals' identities.

    The discovery of this leak serves as a warning to organizations collecting customer information and biometric data, emphasizing the importance of proper security measures such as encryption, penetration testing, and segmenting data that is no longer in use. Fowler's experience underscores the need for vigilance in protecting sensitive information, particularly in industries where personal data may be more susceptible to exploitation.
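    One of the recommendations above, segmenting data that is no longer in use, can be as simple as sweeping stale recordings out of the live store on a schedule. The sketch below assumes a hypothetical directory layout and retention window; in practice the archive location would itself be encrypted and access-controlled.

```python
# Sketch of a retention sweep: move recordings not touched within the
# retention window out of the live store. Paths and the 90-day window
# are illustrative assumptions, not HelloGym's actual policy.
import shutil
import time
from pathlib import Path


def archive_stale_recordings(live_dir: Path, archive_dir: Path,
                             max_age_days: int = 90) -> list[Path]:
    """Move MP3s older than max_age_days from live_dir into archive_dir."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    cutoff = time.time() - max_age_days * 86400
    moved = []
    for f in live_dir.glob("*.mp3"):
        if f.stat().st_mtime < cutoff:
            dest = archive_dir / f.name
            shutil.move(str(f), dest)  # out of the exposed live store
            moved.append(dest)
    return moved
```

    Had the five-year backlog of 2020-era calls been segmented off like this, the live database would have exposed far less.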

    In conclusion, HelloGym's exposed audio database is a cautionary tale for any organization handling personal and financial information. Companies must take proactive steps to secure their databases and ensure that sensitive data is protected before, not after, a researcher or an attacker finds it.



    Related Information:
  • https://www.ethicalhackingnews.com/articles/No-Gym-Intelligence-A-Lurking-Threat-to-Personal-Security-ehn.shtml

  • https://go.theregister.com/feed/www.theregister.com/2025/09/09/gym_audio_recordings_exposed/


  • Published: Tue Sep 9 12:25:54 2025 by llama3.2 3B Q4_K_M













    © Ethical Hacking News . All rights reserved.

    Privacy | Terms of Use | Contact Us