Private technology firms involved in developing and maintaining a range of digital surveillance tools for the UK’s immigration authorities are rarely scrutinised or held accountable for their involvement in the border regime, according to civil liberties campaign group Privacy International (PI).
The UK’s privatised migration surveillance regime report published by PI, which analysed the role dozens of private technology firms play in the UK’s border regime, stated that the close-knit relationship between immigration authorities and the technology sector means “UK authorities are able to call on intrusive surveillance powers matching those of anyone else in the world”.
On the front end, this includes tools such as mobile phone extraction devices, which are used to analyse migrants’ metadata and access their GPS location history; aerial surveillance drones to patrol the Channel; and mobile biometric scanning devices that are able to rapidly identify people and check their immigration status – all of which are provided by private companies.
The capabilities of these front-end tools and many others are supported by a number of back-end systems, which are used by various agencies across the UK’s immigration control regime “to process immigration data, track people through the borders…or which are relevant because they enable forms of surveillance”.
This includes the Home Office Biometrics (HOB) database currently in development and the existing Case Information Database (CID) used to record personal details of all foreign nationals who pass through the immigration system.
These are again supported by a number of private technology companies, and hold a diverse range of data including biometrics such as DNA and fingerprints, travel history information, and various metadata from phones or Wi-Fi networks.
The UK’s immigration authorities, the report noted, also buy information from data brokers such as GB Group or Experian, which “trade on the information of millions of people and build intricate profiles about our lives”.
However, the report, which is based entirely on open source information, noted that “many of the key actors involved are resistant to transparency”, and that the general secrecy surrounding the Home Office’s technology ecosystem means the companies involved “enjoy minimal scrutiny and are seldom held accountable”.
Some 39 technology firms are named in the report, including IBM, Accenture, BAE Systems, Elbit Systems, Palantir, Deloitte Digital, Fujitsu, Northrop Grumman, Thales, Tekever, Cognizant and Leidos.
Private interests take precedence at the border
Speaking at a virtual launch event for the report on 10 February 2021, its author and PI advocacy director Edin Omanovic said the narrative around immigration in the UK largely centres on claims that “there are too many people” or “the system is broken”, meaning it “is very susceptible to being securitised” because everyone coming into the country is seen as a threat that needs to be monitored.
“That’s almost become ingrained in the national conversation, and the issue fundamentally comes down to this lack of transparency and the secrecy surrounding this entire ecosystem,” he said, adding that technology firms and contractors use this “guard of secrecy” to hide “what the actual problems are, how they sold their systems, what kind of meetings they had in the background, so we can’t as a democratic populace assess what went wrong”.
Mary Atkinson, a campaign officer at Joint Council for the Welfare of Immigrants (JCWI), said the “massive reach” these companies have into the Home Office stands out in the report, and that the relationships and processes described are “absolutely key to [understanding] the hostile environment”, which has “always been an agenda based on data sharing…between, for example, the NHS and Home Office, and vice versa.”
She added that “the report shows how data is used in ways that many people don’t know about to increase surveillance and track people in many aspects of their everyday lives”, all with the aim of pursuing the hostile environment.
Atkinson also said that the Home Office’s data-driven approach to borders is “far from intangible”, as mistakes in the data held on people could have serious consequences.
“[It] could lead to an immigration raid being ordered on your house, it could mean you being detained and taken away from your family, and there are cases of people being hounded by the Home Office for years because they have the same name as somebody who has a different immigration history,” she said.
“As the immigration system becomes increasingly digital only, things like that will haunt people in more and more aspects of their daily lives.”
Petra Molnar, Mozilla fellow and associate director of the Refugee Law Lab, said that the increasing reliance on data and automation in the immigration process is accompanied by very little governance and regulation. This then gets mapped onto pre-existing “lines of power in society”, she said, meaning their effects are felt most sharply by already vulnerable communities such as migrants, asylum seekers and other people on the move.
“We are seeing this play out time and time again in this ‘data-fication’, and this increasing reliance on migration management technologies at and around the border,” she said.
“Communities that we work with already have little power to exercise their rights, let alone access mechanisms of redress, or even sometimes knowledge that this is happening. This is done deliberately on the part of the state to obfuscate decision-making, make it more difficult to follow a line of reasoning, and then perhaps even to mount a legal defence.”
Molnar, who found that already vulnerable migrants are being used as “testing grounds” for a host of migration “management” and surveillance technologies in a November 2020 report, added that the “increasingly inappropriate over-reliance on the private sector” in immigration control shows just how firmly embedded the priorities of the private sector and big tech are in the conversation.
“Why are we ‘innovating’ and creating these new so-called solutions that are, once again, putting certain communities at the sharp edges of this technological development?” she said.
“Why aren’t we using all the sexy technology to root out racist border guards, for example? Well, because very clear priorities are at play when it comes to what technology is funded, what we’re even allowed to imagine is possible in this space.”