
Responsible Digital ID: Priority Data Governance Policies and Practices to Support Human Rights

Brandie Nonnecke, Director of the CITRIS Policy Lab, University of California at Berkeley, introduces a new report that explores how to protect civil and political rights when implementing national digital ID systems

An estimated 1 billion people worldwide lack formal identification, significantly restricting their ability to participate meaningfully in the economy and society. National digital ID systems hold great promise to provide individuals with an ID that enables access to social, economic, and civic opportunities. However, the absence of sound data governance policies and practices presents significant risks to civil and political rights, such as legal recognition, individual liberty, security of person, procedural fairness, privacy, and non-discrimination.

To identify strategies to mitigate the negative effects of digital ID systems on civil and political rights, the CITRIS Policy Lab, headquartered at UC Berkeley, has published a report providing case study evaluations of national digital ID systems in Argentina, Estonia, Kenya, and China. These case studies inform priority recommendations for data governance policies and practices in the areas of data protection, political participation, and inclusion of diverse identities.

Data Protection

Data protection is paramount to ensuring civil and political rights, including individual liberty and security of person, procedural fairness, and privacy. To better ensure these rights are upheld, digital ID systems should be built upon the Fair Information Practice Principles (FIPPs) to protect personally identifiable data, including implementation of data minimization, purpose specification for all data collected and shared, and cybersecurity safeguards that are continuously monitored and updated in response to evolving threats.

While the ability of digital ID systems to collect and share data among institutions, such as humanitarian agencies, the private sector, and the public sector, provides substantial value, it also creates security and privacy risks. Governments wishing to share data must put in place appropriate procedures for third-party access and use of data, including: legally enforceable compliance and accountability standards; legally enforceable privacy protections for digital ID data, including enabling individuals to access, correct, and object to the use and management of their data; and oversight to ensure robust cybersecurity strategies and safeguards are implemented across institutions.

Political Participation

Digital ID systems should enable public participation at every stage — from public consultation in the formation of the original design to establishing adequate procedures for individuals to object to the collection and use of their data. Public consultation and audit strategies are critical to ensure individuals can exercise their rights to procedural fairness, legal recognition, and privacy throughout the buildout and implementation of a digital ID system.

For example, tracking civic behavior through national digital ID systems poses a significant risk of stifling freedom of speech, expression, and liberty. If a government knows whether, and for whom, an individual voted, or which political activities they engage in, government actors could use this information to target political dissidents and restrict their access to government services. Digital ID systems should neither be used as a form of repression nor as a way to pressure individuals, forcibly or otherwise, to vote for particular political parties. Governments should establish appropriate public consultation mechanisms and safeguards to restrict the exploitation of civic data, such as encrypting voting records whenever they are tied to personally identifiable information.

Inclusion of Diverse Identities

By failing to recognize marginalized and vulnerable populations, or to put appropriate safeguards in place for their inclusion, digital ID systems make these populations more susceptible to exploitation, suppression, and discrimination. Legal, procedural, and social barriers to enrolling in and effectively using digital ID systems should be evaluated, and remedies should be implemented to mitigate negative effects on the rights of these individuals, such as legal recognition, non-discrimination, and protection against unlawful attacks.

Digital ID systems contain highly sensitive personal information. For marginalized and vulnerable populations, the benefits of collecting and collating their data should be weighed against the additional risks this poses to their civil and political rights. Individuals should be free to opt out of the system entirely, or to opt in selectively to aspects of it, without risking loss of access to government services or the ability to participate actively in civic and economic life.

Navigating the Promise of Digital ID

These priority recommendations are a starting point for ensuring national digital ID system data governance policies and practices support civil and political rights. Institutions deploying these systems should put in place an iterative and multistakeholder review process informed by these recommendations. This process will equip stakeholders to consider the human rights impacts of data governance policies and practices as the system expands and changes over time. National digital ID systems hold great promise to support an equitable and thriving society, but only if these systems are built upon human rights-driven data governance policies and practices.

To read the full report, please visit https://citrispolicylab.org/wp-content/uploads/2019/09/Responsible-Digital-ID_CITRIS-Policy-Lab_September-2019.pdf


The CITRIS Policy Lab, headquartered at UC Berkeley, supports interdisciplinary research, education, and thought leadership to address core questions regarding the role of formal and informal regulation in promoting innovation and amplifying its positive effects on society.
