Developers know that data and privacy are important - our most recent developer survey (soon to be released) proves that. More to the point, they want to do the right thing - and are looking for guidance on how to interpret what users and policy makers are telling them. While the answers are not yet crisp, there are concrete steps that developers can take today to build applications that better reflect what users and governments are asking for, and a handful of principles that can guide future development.
What follows is an introduction to the data and trust project that the Developers Alliance began in late 2017. Throughout 2018 the project will deliver developer insights, best practices for handling data, educational material to demystify data for users and policy makers, and a dialogue series to bring in experts and outside voices on the topic of data and privacy.
Before we jump to the tactical advice, it helps to have some context. It starts with principles, and we believe that Transparency, Security, and Stewardship are the keys to building user trust.
The most powerful step developers can take in promoting trust in data collection practices is to be up-front and honest: tell users clearly and specifically what data is being collected or accessed, why it is being collected, what it will be used for, and, more importantly, what it will NOT be used for. Transparency is what allows users to make informed decisions about the services they interact with, and forms the basis for user consent. Transparency is more than a dictionary or legal definition; it is a principle that should inform the entire user-developer relationship. It means behaving honestly and candidly, and focusing on informing the user - not hiding the ball or distracting users from what’s really going on.
From a practical point of view, transparency means informing users up front, in clear and plain language (as the GDPR puts it), of what a program does with data before seeking user consent. This is easier said than done, and so in the months ahead we’ll explore use cases such as a browser interface, or a mobile app, or a voice interface to tease out just how this might be done. Even more challenging, we need to develop guidance on how best to communicate the complexity of the many data types and the nuances of how data might be accessed; for instance, how do we explain the difference between an app that instantaneously queries a user’s location, versus one that collects, or locally stores, or transmits, or remotely stores, or shares that same information? We’ll need to build a taxonomy - an agreed set of definitions and categories - that allows developers and users to communicate clearly, but simply, what’s going on.
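As a thought experiment, such a taxonomy could start as a small set of combinable categories. The sketch below is purely illustrative - the category names and the `DataHandling` type are our own invention, not an agreed standard:

```python
from enum import Flag, auto

class DataHandling(Flag):
    """Hypothetical taxonomy of what an app may do with one data type.

    Flags can be combined, e.g. COLLECT | TRANSMIT | STORE_REMOTE,
    so a single declaration captures the full handling of that data.
    """
    QUERY_INSTANT = auto()  # read once, used immediately, then discarded
    COLLECT = auto()        # retained beyond the immediate query
    STORE_LOCAL = auto()    # persisted on the device
    TRANSMIT = auto()       # sent off the device
    STORE_REMOTE = auto()   # persisted on a server
    SHARE = auto()          # passed along to a third party

# Example: a map app that only checks location on demand
location_handling = DataHandling.QUERY_INSTANT

# Example: an analytics feature that collects, uploads, and stores data
analytics_handling = (DataHandling.COLLECT
                      | DataHandling.TRANSMIT
                      | DataHandling.STORE_REMOTE)
```

A shared vocabulary like this would let a disclosure say "location: queried instantly, never stored or shared" in a form both users and tools can check.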
Security, possibly the best defined of the three principles, is nonetheless worth a little explanation. The challenge here is that the concept means many things depending on the context, and the developer's task is to put themselves into the user's frame of reference and define security from the user viewpoint.
For a user, data is secure when they are confident that it cannot be accessed by someone without their permission, and will remain intact and available to them if that’s part of the agreement. From the user perspective, a hacker, a spy, a summer intern, a reporter, an employer, a friend, or even a misunderstanding can present a security risk. For a developer, this will imply best practices in end-to-end encryption, data access controls in the operating environment, cybersecurity, data anonymization and secured data sets, and risk minimization techniques so that breaches are small, contained, and costly to execute.
The final principle is probably the easiest to understand, but the hardest to put into practice. Stewardship means putting yourself into the consumer's shoes, and managing user data in a way that the consumer would expect and condone were they looking over your shoulder and making decisions on their own. In many cases, getting explicit consent is the best way to ensure good stewardship. In some cases, however, it will be unrealistic or impossible to anticipate what consent is required, and so the developer will need to make wise and careful decisions on the user’s behalf - ideally with some knowledge about the user’s priorities and beliefs.
Take, for example, the emergency call feature on a mobile phone. Good stewardship tells us that users would agree to loan their phone to someone to call for help in an emergency, and so we bypass the password and lock screen to enable this specific feature for everyone - without asking permission each time. Rather, we agree as a society that this is how all phones will work, for the benefit of everyone. Alternatively, where a user gives us permission to track the location of their keys in the event they get lost, good stewardship would say that they have NOT entrusted us with storing a complete history of all their travels over time, despite the explicit permission. Good stewardship can mean deleting data that it is no longer appropriate to keep.
Implementing these three principles, Transparency, Security, and Stewardship, is the focus of the tactical advice that the Developers Alliance will be collecting and sharing in the weeks and months ahead.
But what should developers be doing NOW to improve data privacy and user trust?
First: Identify and categorize the external data that your applications rely on. What data does your app NEED to function? What services or features are impacted if data access of a specific type is limited? What is the impact of limiting the data that you use - in time, or quantity, or by finding alternative ways to accomplish your goal that use less data? The insights you’ll gain will help inform the next steps as you implement data strategies. Figure out who the third parties are that share data with you, and prepare to task them with better controls as well. Most importantly, figure out where your data is and who’s in control!
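One way to make this inventory concrete is a simple manifest that records, for each piece of external data, where it comes from, whether the app truly needs it, and which feature consumes it. Everything below - the `DataItem` shape, the data names, and "AdNetworkCo" - is a hypothetical sketch, not a prescribed format:

```python
from dataclasses import dataclass

@dataclass
class DataItem:
    """One entry in a hypothetical data inventory."""
    name: str       # e.g. "device_location"
    source: str     # "user", "device", or a named third party
    required: bool  # does the app break without it?
    feature: str    # which feature consumes it

inventory = [
    DataItem("device_location", "device", True, "store finder"),
    DataItem("contact_list", "user", False, "friend invites"),
    DataItem("ad_profile", "AdNetworkCo", False, "advertising"),
]

# Which features merely degrade (rather than break) if data is withheld?
optional_features = [d.feature for d in inventory if not d.required]

# Which third parties share data with us, and so need controls too?
third_party_items = [d for d in inventory
                     if d.source not in ("user", "device")]
```

Even a list this small answers the key questions: what do we need, what could we drop, and who else holds our users' data.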
Next: Identify where and how you could inform users about how your app uses and stores data, in a way that allows them to actually make informed decisions. Perhaps this means a new install process, or a one-time detour. How can this be done without overwhelming them? The goal is not to have your lawyers do the first cut, but to put yourself (or your mom) in the user's shoes, and to present them with something that would actually be useful in making a choice.
Third: The day is coming when you will be asked to extract a single user’s data from your dataset, package it up in a useful and generic way, and deliver it into the user’s safe keeping for them to take to a competitor. You need an import and export capability. You need a database structure that separates the data you’ve built, from the data you’ve been given or collected, from the data you’ve bought.
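A minimal export might package one user's data as self-describing JSON, with a matching import so data from elsewhere can be ingested. This is a sketch under stated assumptions: the in-memory `USER_DATA` store and the record contents stand in for whatever database and schema you actually run:

```python
import json
from datetime import datetime, timezone

# Placeholder for a real database, keyed by user id. In practice the
# query should touch only user-provided or user-generated data, kept
# separate from data you built or bought.
USER_DATA = {
    "user-123": {"profile": {"name": "Alice"}, "saved_items": ["a", "b"]},
}

def export_user_data(user_id: str) -> str:
    """Package one user's data as portable, self-describing JSON."""
    record = USER_DATA.get(user_id, {})
    return json.dumps({
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "user_id": user_id,
        "data": record,
    }, indent=2)

def import_user_data(payload: str) -> dict:
    """The reverse operation, so an export from elsewhere can be loaded."""
    return json.loads(payload)["data"]
```

The point of the round trip is that the format is generic: any competitor (or the user themselves) can read it without your code.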
Fourth: If your app, or your users, fall into a special or sensitive category, you need to tighten up your controls and access limits even further. Data about children, data related to health or finance, and data about vulnerable or protected groups will each require a unique set of safeguards and processes. Identify and segregate this information in anticipation of those special controls. If you rely on anonymized datasets, make sure they cannot be combined with external data to re-identify the contents.
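One common technique here is pseudonymization with a keyed hash: direct identifiers are replaced with values that cannot be reversed or matched against external data without a secret key that is stored apart from the dataset. The sketch below uses Python's standard `hmac` module; the key value and record fields are placeholders:

```python
import hashlib
import hmac

# Placeholder secret. In a real system this key lives in a key store,
# far away from the dataset it protects - if key and data are stored
# together, the pseudonyms can be regenerated and matched.
SECRET_KEY = b"keep-me-out-of-the-dataset"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed, non-reversible token."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# Example: a health-related record with the email stripped out
record = {"email": "user@example.com", "heart_rate": 72}
safe_record = {
    "subject": pseudonymize(record["email"]),
    "heart_rate": record["heart_rate"],
}
```

Note that pseudonymization alone is not full anonymization - rare combinations of the remaining fields can still re-identify someone, which is why segregation and access limits matter too.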
Fifth: Raise your security game, fast. End-to-end encryption, breaking databases into smaller indexed and distributed sets, and encrypting everything separately might be wise. Increase the cost of unauthorized access and reduce the return from breaching a single file or network. Limit who can access your code, and secure the development environment. Put some audits in place so you can keep improving and detect a breach quickly.
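On the audit point, one lightweight pattern is a tamper-evident log: each entry embeds a hash of the previous entry, so deleting or editing any record breaks the chain and a breach that touches the log reveals itself. This is a minimal stdlib sketch, not a production audit system (which would also need durable, access-controlled storage):

```python
import hashlib
import json

class AuditLog:
    """Hash-chained log: editing or removing any entry is detectable."""

    GENESIS = "0" * 64  # sentinel "previous hash" for the first entry

    def __init__(self):
        self.entries = []
        self._last_hash = self.GENESIS

    def record(self, actor: str, action: str) -> None:
        entry = {"actor": actor, "action": action, "prev": self._last_hash}
        self._last_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.entries.append(entry)

    def verify(self) -> bool:
        """Recompute the chain and compare against the stored head."""
        prev = self.GENESIS
        for entry in self.entries:
            if entry["prev"] != prev:
                return False
            prev = hashlib.sha256(
                json.dumps(entry, sort_keys=True).encode()).hexdigest()
        return prev == self._last_hash
```

Logging every access to user data this way supports both goals in the paragraph above: you can review the log to keep improving, and a failed `verify()` is an early breach signal.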
And Finally: While the team is working through the discovery and architecting above, nominate someone to engage with the developer community and start collecting ideas and solutions to the many challenges ahead - we’re ALL going through this together.
We at the Developers Alliance are happy to help with that process, and we look forward to hearing from you and sharing what we learn.