
Photo: Shutterstock / franz12

Trump's Surveillance State: Technology & Targeting Migrants

In this opinion piece, Holly Barrow condemns the use of digital identification technology on migrant communities in the United States

In the US, not even the COVID-19 pandemic could deter the Trump administration from proposing new, invasive immigration policies. At a time when global solidarity and a collective sense of humanity are desperately needed, Trump remains dedicated to targeting migrants.

In September, BuzzFeed News reported that the Trump administration had drafted a proposal to expand the number of migrants required to provide personal information in their immigration applications - including significantly ramping up the requirement for biometric data.

Currently in America, migrants over the age of 14 are required to provide their fingerprints, signatures, and photographs. However, this proposal would require even those below this age to provide substantially more personal data - including eye scans, voice prints, facial recognition scans, and DNA samples. This immediately sounds alarm bells for many who recognize what such intrusive immigration policies could mean for thousands.

The proposed policy would allow the Department of Homeland Security to demand biometrics from a vast number of migrants, including those who have acquired a green card or work permit, and permanently monitor them until the point that they become US citizens.

While many American citizens vehemently refuse to wear facial coverings during the pandemic - claiming that this infringes on their civil rights and liberty - migrants are facing a real threat to their freedom, privacy, and dignity.

It is well established that the use of technology to constantly surveil migrants and asylum seekers comes at the expense of these already marginalized groups

A 2019 report by Privacy International outlines how governments exploit technology to rapidly increase the burden of proof placed on migrants. It examines how biometric schemes and data-driven immigration policies are stripping away migrants’ agency throughout the immigration process, instead placing their fate into the hands of algorithmic systems - many of which are subject to both racial and gender bias.

Life-changing decisions are being made as a result of this data, despite there being limited safeguards in place to regulate its use throughout these processes.

There is a common misconception that tech and algorithms are free of human bias, providing a fairer, more just way of evaluating a migrant's identity, authenticity, and eligibility for their desired immigration status. However, this is not the case. The data entered into these systems is often incomplete and perpetuates pre-existing social inequalities, leading to discriminatory practices such as racial profiling. Facial and voice recognition systems are particularly prone to this.

Photo: Shutterstock / NAUFAL ZAQUAN

A landmark 2019 study analyzed 189 software algorithms from 99 developers and found higher rates of inaccuracy for Asian and African-American faces relative to images of white people, often by a factor of 10 to 100. Similarly, in 2018, researchers from the Massachusetts Institute of Technology (MIT) found that facial-analysis software showed an error rate of just 0.8 percent for light-skinned men, compared with 34.7 percent for dark-skinned women.

This high rate of misidentification of people of color can have devastating repercussions, since this very software can be used to determine an individual’s immigration status and legality. There is no room for mistakes in what can be a matter of life and death for some of the most vulnerable, particularly asylum seekers, who may be removed back to the countries they have so desperately fled as a result of such errors.

Andrea Flores, deputy director of immigration policy for the American Civil Liberties Union, explained that these proposals would deal a catastrophic blow to the US's immigration system: “Collecting a massive database of genetic blueprints won’t make us safer - it will simply make it easier for the government to surveil and target our communities and to bring us closer to a dystopian nightmare.”

With migrants and refugees unable to access the same legal protections as citizens, intrusive surveillance - and the use of biometric data as a form of evidence - raises grave human rights concerns, especially given that fingerprint and facial recognition systems achieve high accuracy only with middle-aged, lighter-skinned men.

Equally concerning is the Department of Homeland Security’s further suggestion that, in expanding these biometric requirements to include those under the age of 14, it would be able to rely on DNA samples to ‘verify claims of genetic relationships’ between adults and minors in the department’s custody. This move could see non-biological guardians interrogated and facing increased barriers, potentially leading to the separation of family members or to allegations of fraud.

These renewed efforts to expand the surveillance of migrants and refugees are not for the good of society, contrary to DHS’ claims. In fact, they likely have more to do with Trump’s determination to make US immigration as difficult as possible, and with the profits generated by outsourcing such surveillance.

The rise in tech corporations working with government agencies to share sensitive data is well documented. Data extraction companies such as Cellebrite are known to bypass passwords and track location and movement through digital devices. Migrants are at real risk of identity theft through the sharing of such data - whether intentional or via hacking.

The chilling effect of Trump’s proposals on migrants and refugees cannot be overstated. With little regulation governing how such data is used, inferred, or observed, biometric schemes are open to exploitation, misinterpretation, and corruption. As Dragana Kaurin writes for the Digital Freedom Fund:

The overreach of intrusive technologies always starts with the most marginalized, invisible communities before it is normalized and eventually used widely on others