App Store apps are leaking personal data from millions of users

  • A cybersecurity project detects nearly 200 iOS apps that expose sensitive user data.
  • The leaks particularly affect artificial intelligence applications with millions of accessible records.
  • The failures are mainly due to misconfigured databases and cloud storage.
  • The case once again raises questions about the protection of privacy on the internet, even within the App Store.


Apple has spent years building an image of a company that puts privacy ahead of almost everything else, especially compared to other major tech companies. However, even the App Store's strict controls don't prevent apps with serious security flaws from slipping through, ultimately exposing users' data.

A recent cybersecurity investigation has revealed a problem affecting dozens of apps available for iPhone and iPad. These tools, many of them very popular, are allegedly leaking sensitive user information without users' knowledge: from names and emails to complete histories of private conversations.

The work is led by the security research laboratory CovertLabs, which has launched a dedicated project to track these leaks. The initiative goes by the name Firehound, a repository that scans, locates, and indexes apps that expose personal data through cloud services or poorly protected databases.

According to the first published results, Firehound has identified nearly 200 iOS apps with flaws of this kind. At the time the information was made public, 196 of the 198 applications analyzed were leaking user data in one way or another, which gives an idea of the scale of the problem.

The project has spread primarily through social media, where various security researchers have shared examples and figures. One of the most active participants, known on X (formerly Twitter) as @Harris0n, claims to have found security breaches that leave huge databases accessible to anyone with enough technical knowledge to search for them.

Firehound: how the App Store leak tracker works

Firehound presents itself as a kind of database of vulnerable applications. Its goal is to identify apps that mishandle user information, either because they use cloud storage systems without proper security measures or because they expose internal files that should remain completely private.

The repository offers free access to some of its analysis results but withholds the most sensitive details. Seeing more specific information about the leaks requires registering on the platform, a restriction its maintainers justify as a way to prevent anyone from exploiting those flaws while they review and mask the most sensitive data.

According to them, some of the reports contain direct paths to exposed databases, message sets, or email lists that, in the wrong hands, could be used for phishing campaigns, extortion, or identity theft. Firehound therefore only publicly displays partial, general information about each affected application.

@Harris0n himself indicates that, in some cases, all messages sent by users through certain apps are stored without protection. In other words, anyone who knows where to look could read entire conversations that users considered private, associated with data such as their email address or even their phone number.

This situation highlights a weakness that has been known for years: although Apple reviews apps before publishing them, it cannot always detect insecure configurations on the external servers those apps use to store information. And that is where most of these leaks occur.


Nearly 200 iOS apps with exposed personal data

Firehound's maintainers explain that, so far, they have located around 200 iOS apps with security issues. These are not small, unknown projects, but tools with considerable download counts, many of them dedicated to artificial intelligence and cloud services.

On social media, one of the researchers involved pointed out that, as might be expected, the most affected apps are those related to AI services. These applications typically handle large volumes of data, storing conversations, requests, corrections, and other content that users share expecting it to remain private.

One of the most striking examples cited in the project is the app Chat & Ask AI. According to data collected by Firehound, this application had over 406 million records exposed, belonging to more than 18 million users: messages, queries, and other data stored without proper security measures.

The problem, the researchers emphasize, is not only the volume of information but also its sensitivity. These are not mere technical identifiers, but full conversations linked to email accounts and phone numbers, which makes it relatively easy to tie each chat history to a specific person.

In this scenario, an attacker could not only read private messages but also cross-reference that data with other information available online to profile victims, craft personalized scams, or even attempt blackmail if the leaked conversations contain particularly sensitive content.

Particularly sensitive data: mental health, emotions, and medical consultations

One of the aspects that most worries researchers is the type of content being leaked. Many people use AI-based applications to talk about emotional issues, mental health, relationship problems, or even medical questions: matters that, by their nature, should be treated with the utmost confidentiality.

If those exchanges are stored on misconfigured servers and, furthermore, linked to contact details such as an email address or mobile number, user privacy is seriously compromised. The most personal details of people's lives could end up in the hands of third parties, with unpredictable consequences.

Researcher @Harris0n gave precisely this example: people who rely on an AI app to vent about anxiety, depression, family problems, or medical symptoms. All that material sits stored without sufficient protection, waiting for someone to come across the exposed database and decide to take advantage of it.

This type of leak poses not only a cybersecurity risk but also a social and psychological problem. Knowing that your most intimate conversations may be circulating online can generate distrust of technology and fear of asking for help or using digital support services at delicate moments.

From a European perspective, this situation also falls under the umbrella of regulatory compliance. The General Data Protection Regulation (GDPR) requires companies to implement security measures appropriate to the level of risk, something that is clearly not met when an app makes its databases publicly accessible over the internet.

What types of apps are leaking information and why is this happening?

Firehound's list shows that the problem isn't limited to a single sector. The affected apps include entertainment tools, educational platforms, health-related services, and professional design or productivity apps. In other words, it's a cross-cutting phenomenon that touches several niches within the App Store ecosystem.

What most of these failures have in common is that the apps use misconfigured databases or cloud storage systems. In many cases, developers rely on external services to store information (such as chat histories, user profiles, or shared files) and do not properly enable authentication or encryption options.

The practical result is that those databases remain accessible without any real credentials; sometimes the resources are entirely unprotected. Anyone who knows the address, or performs certain technical searches, can locate, download, or explore them with relative ease.
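To illustrate what "accessible without credentials" means in practice, the sketch below checks whether a URL serves parseable JSON to a completely anonymous request, the way an exposed Firebase-style realtime database answers at its `/.json` endpoint when read rules are left open. This is a simplified illustration under that assumption, not a description of Firehound's actual scanner.

```python
import json
import urllib.request


def is_publicly_readable(url, timeout=5):
    """Return True if `url` serves parseable JSON to an anonymous request.

    An exposed Firebase-style realtime database, for instance, answers
    unauthenticated GETs at <db-url>/.json when its read rules are open.
    Simplified illustration only; not Firehound's real method.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read()
    except Exception:
        # Auth challenge, 403/404, or network error: not publicly readable.
        return False
    try:
        json.loads(body)
    except ValueError:
        return False
    return True
```

Run against infrastructure you own, a `True` result means the endpoint hands its contents to anyone on the internet, which is exactly the failure mode the researchers describe.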

Another common problem is access keys embedded in the app's code, something Firehound also helps detect. If an attacker analyzes the code and finds these credentials, they can use them to connect directly to the cloud service the application uses and extract all the data stored there.
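Hardcoded keys are mechanically detectable because common credential types have well-known public formats (AWS access key IDs begin with `AKIA`, Google API keys with `AIza`). A minimal sketch of that idea, assuming nothing about Firehound's internals and using fabricated sample data:

```python
import re

# Documented public shapes of two common credential types.
# All sample data used with this scanner is fabricated.
KEY_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "google_api_key": re.compile(r"\bAIza[0-9A-Za-z_\-]{35}\b"),
}


def find_embedded_keys(strings):
    """Scan strings extracted from an app binary for credential-shaped tokens."""
    hits = []
    for s in strings:
        for label, pattern in KEY_PATTERNS.items():
            for match in pattern.findall(s):
                hits.append((label, match))
    return hits
```

In practice the input would come from dumping printable strings out of the app binary. Any hit is a credential that ships to every user of the app, which is why an attacker who finds one can talk to the backend directly.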

This kind of error usually stems from poor development practices and a lack of security audits rather than malicious intent on the developers' part. For the user, however, the consequence is the same: their private information ends up exposed without clear notification or explicit consent.

It's not happening only in the App Store, but it directly affects Apple's image

Firehound's maintainers insist that these types of failures are not exclusive to Apple's store. Very similar problems have been detected for years in the Google Play Store and other software catalogs: whenever an app relies on poorly secured cloud services, there is a risk of data breaches.

Even so, the fact that these vulnerabilities appear in the App Store has a special impact, because Apple has long maintained that its closed, monitored ecosystem is safer than its competitors'. Every time a case like this arises, the debate reopens over whether that prior review is enough to truly protect users' personal information.

In Europe, moreover, scrutiny of these kinds of incidents is greater due to regulatory pressure. Data protection authorities in various countries, including Spain, can demand explanations both from developers and, in some cases, from the major platforms that distribute the affected applications to millions of people.

Apple, for its part, usually reacts by removing from the store apps that are proven to violate its privacy policies or that pose a clear risk. However, the cycle between an application's publication, the detection of the problem, and its eventual removal can be long enough that the data has already been copied by third parties.

In the end, Firehound's research reminds us that even if large technology companies strengthen their review processes, leaks can appear in virtually any app catalog. The weakest link is often the way developers manage the servers where the information is stored, an area over which Apple and other giants have less direct control.

This whole case serves as a wake-up call for both users and developers. On the one hand, it's advisable to think twice about what kind of data we share with third-party applications, especially those that use artificial intelligence and store large volumes of personal content. On the other, it's clear that security doesn't end with passing an App Store filter; it requires responsible, ongoing management of the infrastructure where the data lives. The combination of independent projects like Firehound, strict regulation in Europe, and a stronger culture of privacy can make all the difference in preventing millions of records from once again becoming accessible to anyone.
