Cybersecurity researcher Jeremiah Fowler has disclosed a massive data exposure involving over 1.6 million audio recordings belonging to U.S. and Canadian gyms. The files, left unencrypted and without password protection, contained personal details from internal phone calls, voicemails, and even sensitive staff communications.
Fowler reported the discovery to Website Planet, explaining: “The publicly exposed database was not password-protected or encrypted. It contained 1,605,345 audio recordings in .mp3 format.” The recordings spanned from 2020 to 2025 and included names, phone numbers, billing issues, and membership details.
Many of the recordings referenced major fitness brands, though corporate representatives clarified they do not record calls directly. Instead, the exposure was traced back to a third-party contractor: “Although the records appeared to contain calls and voicemails belonging to a number of franchise locations of several well-known fitness brands, the database was managed by a third-party contractor known as Hello Gym.”
The risks of exposing voice data are profound. Fowler noted: “The voicemails that I heard should not have been publicly accessible, as they often included personal details such as names, phone numbers, and the reasons for calling.” He warned that criminals could impersonate gym staff to extract payment details, execute fraudulent cancellations, or even bypass physical security systems if alarm credentials were exposed.

One recording captured staff members providing internal passwords for account changes, while another revealed a manager requesting that a security service disable an alarm system. Such data could enable cybercriminals or intruders to blend cyberattacks with physical breaches.
Perhaps most concerning is the potential for misuse in the age of generative AI. Fowler wrote: “Cloning voices used to seem like something out of a science fiction movie, but now technology has provided average people (and criminals) with the ability to create text, audio, and videos that are often hard to identify as fake.”
He cited real-world cases: in 2019, criminals cloned a CEO’s voice to steal $243,000; in 2024, deepfake executives tricked an employee into transferring $25 million; and in 2025, voice cloning scams extorted $15,000 from a Florida woman by imitating her daughter’s cry for help.
With millions of gym-member recordings exposed, the threat of AI-enabled fraud and impersonation grows considerably.
The Hello Gym breach shows how overlooked third-party platforms can create serious vulnerabilities for well-known brands and unsuspecting consumers. Exposed audio recordings are not just a privacy concern, but a potential goldmine for scammers, fraudsters, and AI-powered attackers.
While the database was quickly secured after disclosure, the incident is a stark reminder that in the digital age, even a voicemail can become a weapon.