Guest Articles

Monday, December 16, 2019

David Medine / Gayatri Murthy

Nobody Reads Privacy Policies: Why We Need to Go Beyond Consent to Ensure Data Privacy

In the age of the internet and Big Data, when much of our economic and social activity takes place online (and leaves a digital trail), it’s hard to maintain even a semblance of data privacy. That’s why some analysts, like prominent privacy lawyer Rahul Matthan, have largely lost faith in the concept of providing consent for outside entities to use our personal data. As Matthan put it in a recent article, “Consent is cumbersome to obtain, and so privacy policies are drafted in the widest possible language to give companies considerable leeway in third-party data transfers – so much so that there is no need for them to ever seek our consent again.” For this and other reasons, he says, consent “no longer effectively protects personal privacy in our present data-rich world.”

However, just as digital tech makes obtaining and monetizing user data easier, it can also make it easier for companies to obtain user consent for these practices. For instance, Matthan points to better-designed privacy policies and “just-in-time” consent requests, made at the moment data is collected, as areas of potential progress.

We agree that consent can play a valuable role in protecting consumers’ data privacy, and we’re encouraged by some of the innovations that are emerging to advance that goal. These innovations are particularly encouraging in light of the fact that burgeoning mobile access has brought the benefits and risks of Big Data to developing countries – home to many of the world’s newest and most vulnerable digital consumers. But these efforts are notably limited in providing customers with meaningful protection, for three primary reasons:

 

People simply don’t want to read privacy notices

This reality has been amply demonstrated across cultures. In a world with over 2 billion smartphone users, each with an average of 60-90 apps, almost no one reads app privacy notices, let alone website privacy policies. We all claim to have read them when we click to agree to terms and conditions, but several studies have shown this is not the case.

A recent Deloitte survey of 2,000 consumers found that 91% of people consent to legal terms and conditions of service without reading them. Among younger people aged 18 to 34, the rate is even higher: 97% agree to conditions without reading them. Fewer than 2% of Microsoft customers have used its Privacy Dashboard, and of the 2.5 billion visits to Google’s accounts page, only about 20 million people have even viewed their ad settings. Cookie notices, driven by the EU’s General Data Protection Regulation, are now ubiquitous on websites, but people tend to just keep clicking until they reach the page they want to see.

To be fair, this is not simply a story of user apathy: Even if someone wanted to be diligent, research shows it would take them 76 work days to read all the privacy policies they encounter in a year of internet usage.

 

Even if we read the privacy notices, we don’t understand them

In a study published last year, the National Institute of Public Finance and Policy asked college and law school students in India to review the privacy policies of five popular websites (Uber, WhatsApp, Google, Flipkart and PayTM) and then tested them on their comprehension. Despite being well-educated, most students could not correctly answer difficult questions about the policies – even though they could go back and consult those policies before answering – and 20-40% of the “easy” questions were answered incorrectly.

  

‘Better notices’ do not necessarily work

Responding to people’s failure to read or understand privacy notices, some have pushed for more and better ways of providing notice and obtaining consent – for example, layered notices with the key points up front, simplified language on consent forms, and “just in time” consent requested at the point of information collection. But though they’re well-intentioned, these approaches are generally ineffective at attracting customer attention, because they cut against people’s desire to get information or entertainment, conduct transactions, or play games with as little interference as possible.

 

Solutions for Going Beyond Consent

For these reasons, CGAP has concluded that we need to explore solutions for going beyond consent. We need to shift some responsibility for data protection onto the entities that collect and process data, and we need to add new, improved tools for data protection in today’s digital economy. While CGAP focuses on financial inclusion for the poor, these solutions apply across the board.

Fortunately, there are some promising efforts that might signal a path toward these goals. In the financial services space, for instance, Matthan flags an innovative data protection tool gaining traction in India: an account aggregator framework in which third-party consent brokers act as intermediaries, managing the flow of information between financial information providers and the financial information users that request it. This approach could expand access to financial services for the underserved by making it easier for them to supply documents and information when applying for loans and other services.
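
To make the consent-broker idea concrete, here is a minimal sketch in Python, assuming a simplified model in which the customer grants a time-limited consent “artefact” naming the data type and the recipient, and the broker releases data only when a request matches a live artefact. All names here (ConsentArtefact, ConsentBroker, DemoProvider and their fields) are illustrative assumptions, not the actual account aggregator specification.

# Illustrative sketch of a consent-broker flow (hypothetical names, not the real
# account aggregator API). The broker only releases data covered by a consent
# artefact the customer has explicitly approved.

from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class ConsentArtefact:
    """What the customer agreed to share, with whom, and until when."""
    customer_id: str
    data_types: set        # e.g. {"bank_statement"}
    recipient: str         # the financial information user, e.g. a lender
    expires_at: datetime

    def permits(self, data_type: str, recipient: str) -> bool:
        return (
            data_type in self.data_types
            and recipient == self.recipient
            and datetime.utcnow() < self.expires_at
        )


class ConsentBroker:
    """Intermediary between information providers and information users."""

    def __init__(self, provider):
        self.provider = provider    # holds the customer's records
        self.artefacts = []         # consents the customer has granted

    def register_consent(self, artefact: ConsentArtefact) -> None:
        self.artefacts.append(artefact)

    def fetch(self, customer_id: str, data_type: str, recipient: str):
        """Release data only if a live consent artefact covers the request."""
        for artefact in self.artefacts:
            if artefact.customer_id == customer_id and artefact.permits(data_type, recipient):
                return self.provider.get(customer_id, data_type)
        raise PermissionError("No valid consent covers this request.")


# Example: a customer consents to share bank statements with one lender for 30 days.
class DemoProvider:
    def get(self, customer_id, data_type):
        return {"customer": customer_id, "type": data_type, "records": ["..."]}


broker = ConsentBroker(DemoProvider())
broker.register_consent(ConsentArtefact(
    customer_id="cust-001",
    data_types={"bank_statement"},
    recipient="lender-A",
    expires_at=datetime.utcnow() + timedelta(days=30),
))
print(broker.fetch("cust-001", "bank_statement", "lender-A"))    # data flows
# broker.fetch("cust-001", "bank_statement", "insurer-B")        # raises PermissionError

The point of such a design is that the broker can enforce what is shared and with whom; as noted below, however, it has no visibility into what the recipient does with the data afterward.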

However, a key limitation of this new tool is that it is not well-suited to handle consent that would govern the subsequent use, disclosure and retention of information. Consumers are generally in a good position to consent to what information of theirs is provided, but they are not in a position to police how their data is being used. Furthermore, while consent to this information flow may now be limited to what financial firms collect, over time this could come to include a wide range of non-financial information, such as SMS messages and social media, potentially leading to consent fatigue among consumers who have to approve numerous access requests.

Instead of leaving this in the hands of consumers or consent brokers, downstream flows of data are best handled by laws that impose the responsibility for protecting consumer data upon companies – including financial service providers. Two approaches are worth considering. One would be to place a fiduciary duty on providers to only use data in ways that benefit the consumer. So for instance, using this data to target advertisements for support services to customers would be acceptable under this model. But sharing customer data with other companies, such as retailers or insurers, so they know which customers are more likely to pay higher prices based on past purchases would not be acceptable, as it doesn’t advance those customers’ interests. Another option would be to limit providers’ use of data to legitimate purposes – namely uses closely related to the purposes for which the information was collected in the first place. Under this approach, a legitimate use of data would not include sharing commercial sales data with political campaigns, but it would include sharing information needed to deliver a product or service. These changes are all premised on the belief that consumers should be entitled to assume that their data will be used properly.
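
As a rough sketch of how the second approach might be operationalized, the short Python snippet below checks a proposed data use against the purpose for which the data was originally collected. The purpose names and the mapping of “closely related” uses are invented for illustration; a real regime would define these categories in law or regulation.

# Hypothetical purpose-limitation check: a use is allowed only if it is closely
# related to the purpose for which the data was originally collected.
# The purpose names and mapping below are invented for illustration.

RELATED_USES = {
    "deliver_credit_product": {
        "share_with_credit_bureau",    # needed to deliver the service itself
        "send_repayment_reminders",
    },
}

def use_is_legitimate(collection_purpose: str, proposed_use: str) -> bool:
    """Return True only when the proposed use is closely tied to collection."""
    return proposed_use in RELATED_USES.get(collection_purpose, set())

print(use_is_legitimate("deliver_credit_product", "send_repayment_reminders"))       # True
print(use_is_legitimate("deliver_credit_product", "share_with_political_campaign"))  # False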

Regardless of the approaches used to acquire consent, it’s increasingly clear that consumer consent alone is not enough to protect individual autonomy in the 21st-century digital economy. In addition to improving the way consent is obtained, some responsibility for data protection should be shifted to the entities that collect and process data in the first place.

 

David Medine and Gayatri Murthy are financial sector specialists at the Consultative Group to Assist the Poor (CGAP).

 

Photo courtesy of BiljaST.

Categories: Finance, Technology
Tags: data, digital finance, digital inclusion, financial inclusion, fintech