The importance of getting privacy by design right – and the damage that getting it wrong can do – is illustrated by a new study which looked at how 120 of the most used Android apps in Belgium handle the personal data of their users.
Breaking the personal data rules
The study – which Collibra conducted with partners at UC Berkeley and AppCensus – showed that more than one-third of the apps were transmitting personal identifiers either without users realizing they were doing so, without asking for permission, or without informing users transparently about which companies receive this data and what they intend to do with it. This is potentially in violation of the EU’s General Data Protection Regulation (GDPR).
For example, one game app in the study shared user personal information with 23 external companies in just 10 minutes. The game asks users if they are willing to share their information, but it is unclear how the data will be used, and declining is made difficult. The personal identifiers could include geolocation information, a user’s Android ad ID, or information the user provided to register for the app, such as age or email address. An Android ad ID is used by advertisers to assemble profiles of user activity, and can be reset manually by users through their Android devices.
In addition, more than one-fifth of the apps tested appear to be breaking Google’s policy and best practices for app developers, potentially in violation of GDPR as well. These apps were sending out persistent identifiers – pieces of personal information that are very hard to erase – alongside the Android ad ID. One example is the identifier associated with a mobile phone, called an International Mobile Equipment Identity (IMEI) number. Unlike the Android ad ID, persistent identifiers cannot be reset by the user, so any behavioral information tied to them remains associated with the device indefinitely.
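To see why pairing a persistent identifier with the resettable ad ID matters, consider a minimal sketch (the event data below is hypothetical, invented for illustration). Profiles keyed only by the ad ID are split when the user resets it; a transmitted IMEI lets a tracker re-link the old and new profiles:

```python
# Hypothetical tracking events: an ad ID the user later resets,
# plus a persistent IMEI that never changes.
events = [
    {"ad_id": "aaaa-1111", "imei": "490154203237518", "page": "news"},
    {"ad_id": "aaaa-1111", "imei": "490154203237518", "page": "sports"},
    # The user resets their ad ID here; the IMEI stays the same.
    {"ad_id": "bbbb-2222", "imei": "490154203237518", "page": "finance"},
]

def profile_by(events, key):
    """Group browsing activity into profiles keyed by the given identifier."""
    profiles = {}
    for e in events:
        profiles.setdefault(e[key], []).append(e["page"])
    return profiles

# Keyed by ad ID, the history splits into two apparently separate users...
print(len(profile_by(events, "ad_id")))   # → 2
# ...but the persistent IMEI re-links all activity to one device.
print(len(profile_by(events, "imei")))    # → 1
```

This is the crux of the policy: once a persistent identifier travels with the ad ID even once, resetting the ad ID no longer severs the behavioral profile.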
This study was conducted in Belgium in partnership with De Tijd, a top Belgian business newspaper, and Serge Egelman, research director of the Usable Security & Privacy Group at the International Computer Science Institute (ICSI). He also holds an appointment in the Department of Electrical Engineering and Computer Sciences (EECS) at the University of California, Berkeley. Egelman’s company, AppCensus, provided the analysis of the apps in the study.
Understanding the impact
The apps caught using personal information incorrectly included a bank, a train ticket website, and the newspaper De Tijd itself. For De Tijd, the cause was an old piece of code that was still operating within its app – the newspaper promised to remedy the problem. These kinds of personal data challenges can happen within a variety of types of organizations across a range of industries.
It’s important for all organizations to be sure they are getting data privacy right. Today, organizations need to comply with a growing number of data privacy rules. Beyond that, however, it’s also vital to maintain the trust of the individuals who use their products and services. For the internet to thrive, society needs to be able to trust the organizations it engages with online.
Personal information about an individual’s behavior is one of the most valuable resources on the global market today. In the process of developing services that make consumers’ lives easier, organizations also gather data about those individuals. That data is assembled into behavioral profiles that can predict, with some certainty, what individuals want now, tomorrow, and further into the future.
While in many circumstances consumers are happy to trade information about themselves in exchange for a more personal experience, they remain sensitive to the possibility of their personal data being misused, particularly within technologies like AI.
If consumers perceive a risk that their personal or behavioral data might not be safe or could be misused, they may not want to engage with individual organizations online. There is already evidence that this cause-and-effect can happen. Recognizing these risks, software giant Microsoft wrote in its 2018 annual report that, “If we enable or offer AI solutions that are controversial because of their impact on human rights, privacy, employment or other social issues, we may experience brand or reputational harm.”
To survive and succeed, organizations need to build and maintain the trust of their users by having strong data privacy policies supported with robust processes. As well, products and services should be created using privacy by design approaches to greatly reduce the risk of personal data being misused.
Adopting a privacy by design approach
To implement privacy by design, organizations need to integrate or “bake in” data protection into their processing activities and business practices from the design stage and throughout the full lifecycle. Proactive ways in which organizations can embed privacy by design are:
- Include the data team at the beginning of any new project, because their expertise can help the organization better recognize potential data privacy issues up-front
- Understand what personal data is being captured by the organization’s products, services and processes
- Monitor data lineage to understand how personal data is being used within the organization
- Support data privacy policies with automated processes, reporting and security arrangements
- Test new products and services to be sure that they are engaging with personal data in a way that aligns with the organization’s policies and compliance obligations
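One way to operationalize the testing step above is an automated scan of outgoing payloads for identifiers that should never leave the device. The sketch below is a hypothetical illustration, not part of the study’s methodology: it flags 15-digit values that pass the Luhn checksum, which IMEIs use for their final check digit (the sample IMEI is the standard documentation example).

```python
import re

def luhn_valid(digits: str) -> bool:
    """Luhn checksum, used by IMEIs for their final check digit."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:        # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_imei_like(payload: str) -> list:
    """Flag 15-digit runs in a payload that pass the Luhn check."""
    return [m for m in re.findall(r"\b\d{15}\b", payload)
            if luhn_valid(m)]

# Hypothetical outbound payload captured during a test run.
payload = '{"adid":"aaaa-1111","device":"490154203237518","age":34}'
print(find_imei_like(payload))   # → ['490154203237518']
```

A check like this can run in a test harness or an egress proxy, turning the policy “no persistent identifiers leave the app” into an automated, repeatable assertion rather than a manual review.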
It’s vital that organizations do everything they can to avoid the negative financial and reputational consequences of poor data privacy practices. By creating a robust privacy by design approach, organizations can greatly reduce this risk while supporting innovation and positive customer relationships.
Learn more about how Collibra Privacy & Risk supports privacy by design programs.
Pieter De Leenheer, Vicky Froyen and Serge Egelman contributed to this article.
The Collibra, UC Berkeley and AppCensus study was conducted based on data collected on August 19, 2019. It does not take into account changes or updates to the apps analyzed since that date.
Additional resources on app personal data use: