Police forces across the UK stand accused of unlawfully processing people’s personal data within the Microsoft 365 cloud productivity platform, after failing to carry out the required data protection checks before deploying the technology, a Computer Weekly investigation has found.
Thirty police forces are involved in a national roll-out of Microsoft 365 (M365) as part of the National Enabling Programme (NEP), which was jointly created in 2017 by the National Police Chiefs’ Council (NPCC) and the Association of Police and Crime Commissioners (APCC) to spearhead the delivery of new cloud-enabled ways of working for the police.
According to figures obtained from the NPCC, the Microsoft cloud-based platform will store and process the personal data of more than one million UK citizens, many of them vulnerable or at risk.
Approved by the Home Office and supported through its Police Transformation Fund, the programme is a core element of fulfilling the Policing Vision 2025 strategy, and involves the NEP team working with the Police ICT Company on developing national approaches to technology investment and data standards.
The NEP’s M365 roll-out has been under way since September 2017, but Freedom of Information (FOI) requests submitted by Computer Weekly show that every force involved in the national roll-out, with the exception of Kent, is yet to conduct the data protection impact assessment (DPIA) that is required by law before deployment takes place.
Also, 29 of the 30 police forces involved in the initiative confirmed that they hold “no information” on either the contract or terms and conditions in place with Microsoft.
This is despite the fact that, according to an NEP blogpost, nearly one-third of UK police forces have achieved the requirements to fully roll out M365, and all forces in England and Wales have now signed up to the programme.
The NEP told Computer Weekly that all forces are expected to complete a DPIA once M365 has reached the roll-out stage, but legislation is clear that an assessment must be conducted before implementation.
Failure to complete a DPIA puts police forces at risk of multimillion-pound fines, triggered by complaints from anyone who finds that their personal data has been handled unlawfully. Data protection experts even suggest that the lack of a DPIA could be used to undermine evidence presented in court.
Changes in data protection law
Before the introduction of the Data Protection Act (DPA) of 2018, police forces’ data handling was governed by the previous 1998 version of the DPA (DPA 98), which provided much more limited personal data rights for citizens and allowed law enforcement data, despite its sensitivity, to be treated and processed in essentially the same way as other forms of data.
Under the terms of the DPA 2018, however, police forces are obliged to conduct data protection impact assessments and to ensure that processors seek permission before transferring data to a third country.
They must also ensure the service provider maintains logs of its processing activities, as well as consultation logs to establish why a user has looked at a particular piece of data.
The DPA further dictates that each police force, as an individual data controller, must conduct this due diligence itself to ensure its personal data processing complies with the Act, and that it must also be able to demonstrate this compliance.
To do this, each force must ensure that a contract in writing exists between itself and Microsoft (the data processor) which sets out details of the processing, including its duration and nature, and the type and categories of personal data involved. To be valid, the contract or terms of service must be explicit in how they meet the DPA requirements.
By not properly conducting their due diligence before adopting these public cloud services, police forces implementing M365 could have opened these data subjects up to a number of risks.
These include the transfer of their data to jurisdictions with weaker data protections, which, for example, risks exposing their personal information to Microsoft staff, the US government or other third parties, as well as an inability to exercise their rights to rectification and erasure, and the right not to be subject to automated decision-making.
After reviewing the FOI responses on behalf of Computer Weekly, data protection specialists and lawyers familiar with policing said they had cause to question the lawfulness of police forces running people’s personal data through the cloud-based Microsoft business productivity suite.
Specifically, they questioned whether the police forces involved are ensuring ahead of time that their deployments comply with Part Three of DPA 2018 – which sets out, for the first time, specific statutory rules for processing personal data by law enforcement entities.
Independent privacy consultant Owen Sayers, who has over 20 years’ experience in the delivery of national policing systems, said that although this policing-specific piece of data protection legislation came into force over two years ago, its complexity means there are still relatively few lawyers and specialists currently working with it.
“At first glance, it is similar to the more commonly discussed General Data Protection Regulation [GDPR], but in practice has significantly different obligations to be met,” he said.
“The technical credentials, commitment and operational experience of many of those working on the programme are not really the question, nor is the intrinsic capability of the Microsoft technology and what it is perceived to bring to policing.
“Rather, it appears that there has been a near total failure of those running the programme to fully understand and consider the legal implications of using those technologies in the way they have been contracted, or to conduct the diligence that is legally required.”
Sayers added that “doing nothing” would mean forces will continue to contravene the law, and that members of the public may not accept that “to uphold some laws, the police are choosing to flout those that protect their personal data”.
The Information Commissioner’s Office (ICO), which will initially seek to consult non-compliant organisations about the problem and advise them on how to fix it, can issue two tiers of monetary penalty for failing to comply with Part Three of the DPA 2018 – a “standard maximum” of roughly £9m or 2% of annual turnover, or a “higher maximum” of £18m or 4% of annual turnover. In both cases, the offending organisation will be fined whichever amount is higher.
But to get to this stage, the ICO would need to receive a complaint from someone who thinks their data has been handled unlawfully. This is something they would only be able to know if harm had already been caused, or they had otherwise submitted a subject access request (SAR) to find out what data the police force holds on them.
If a person believes their data rights have been breached, the other avenue available is the courts, which, unlike the ICO, are able to award compensation if the claim is successful.
In response to questions from Computer Weekly, an NEP spokesperson said: “UK policing needs to get smarter in the way we embrace new technology and – in common with the wider public sector and in line with the government’s strategy – we are adopting the ‘cloud first’ approach.
“Quick, safe and proportionate data sharing between forces and partners is vital to investigating crime and keeping people safe from harm. Protecting the public is our priority and cloud technology offers the most effective solution for policing in an age of ever complex threats.
“Having assessed all potential providers, Microsoft was chosen as our preferred supplier.
“We accept there are risks associated with cloud storage, just as there are in holding data on servers at individual police premises. Any potential risks are balanced against opportunities to better protect the public. And, like all risks, they are constantly assessed and managed by putting appropriate and adaptable safeguards in place at national and local levels.
“We have been taking legal advice and consulting with the ICO throughout the life of the programme to ensure we are working within all existing legislation.
“We have developed a national DPIA template that forces will adapt locally once they roll out Microsoft 365 products across their organisations. The roll-out began in October and will run over the course of next year.
“As this technology is constantly evolving, the NEP is facilitating continuous dialogue and learning between all forces and Microsoft through a number of working groups.”
Data protection impact assessments
Unlike privacy impact assessments (PIAs), which were optional under the previous legislation, DPIAs are now mandatory and entail a much broader assessment of the potential for adverse impacts on the rights and interests of citizens, involving a number of extra steps to identify and mitigate risk.
Because of the change from PIA to DPIA, all previously assessed systems must be reassessed under the new process if the context or nature of that activity has changed significantly.
Police, as data controllers, are also obliged to carry out a DPIA before the start of any new personal data processing, where a type of processing is likely to result in a high risk to the rights and freedoms of individuals. This includes where the system is not yet live, but real personal data is still being used.
On this point, many forces have already begun using M365’s online collaboration service Microsoft Teams to share personal data “for law enforcement purposes”, both internally and externally with other forces, despite not having conducted a DPIA.
North Yorkshire Police, for example, has confirmed it is copying data out of the National Firearms Licensing system into Teams along with information drawn from NICHE, its own incident management system, which contains data related to domestic violence and other worrying behaviour.
This means that details about criminal and alleged criminal offences, as well as other intelligence related to an individual’s suitability to hold a firearm, and potentially what firearms they hold and where, are being shared on M365.
South Wales and Gwent Police have also confirmed they have used Teams to conduct a joint “serious organised crime partnership meeting”, in which slides were shared and audio recorded – all sensitive data that could be held in the cloud indefinitely.
Cumbria, too, is using the M365 app for its “safeguarding hubs”, which means that highly sensitive information about vulnerable people – including children and high-risk offenders – is being shared with partner agencies over Teams.
However, despite the highly sensitive nature of the data processed by law enforcement entities, none of the forces listed above have completed their own DPIAs for M365. Instead, these forces – and most of the others responding – said they were relying solely on a national DPIA conducted by the NEP itself.
A total of nine forces responded by saying they would be adding “local modifications” to the NEP DPIA, at least five of which are confirmed to already be using M365 anyway; four said they were in the process of completing their own; three said they had not conducted one of their own and made no reference to the national version; and only one said a force-specific DPIA had been completed.
The remaining 12 respondents said they had accepted the national DPIA. Only Durham is yet to respond to the original FOI request sent in July 2020.
When asked in a separate FOI for a copy of the national DPIA, which would quickly prove the lawfulness of the national M365 roll-out, the NPCC initially refused to disclose it on law enforcement and national security grounds.
However, after requesting an internal review of the decision, Computer Weekly was granted access to a heavily redacted version of the final document. This version has since been uploaded to the NEP website.
The DPIA was also reviewed by data protection specialists and lawyers familiar with policing, who identified a number of critical issues that further bring into question the lawfulness of police forces processing people’s personal data in M365.
It is worth noting that UK police forces have struggled with creating sufficient DPIAs before. For example, South Wales Police attracted criticism from both the ICO and judges in an appeal case for its use of automated facial recognition, in which the court ruled: “The DPIA failed properly to assess the risks to the rights and freedoms of data subjects and failed to address the measures envisaged to address the risks arising from the deficiencies we have found.”
Critical issues with the national DPIA
A major problem the experts identified with the national DPIA is the substantial inconsistency with which the data protection risks have been assessed.
For example, despite identifying that “there is an increased risk that sharing and combination of data leads to the creation of additional personal data that is then not easily corrected”, which it says represents a “probable” and “significant” harm to individuals’ right to rectification, the DPIA concludes the overall risk is “medium”.
In the next assessed risk, relating to the right of data erasure, the DPIA similarly finds a “probable” and “significant” harm to individuals, but concludes that the overall risk is “low”.
This pattern is repeated throughout this section of the DPIA, in that the overall risk level does not seem to match the likelihood and severity of harm indicated.
When asked about these inconsistencies – as well as how these conclusions were reached and what risk matrices were used in the process – an NEP spokesperson said the programme has undertaken “a robust and detailed security risk management assessment” of the blueprint design.
The spokesperson added: “The blueprint design provides detail of the configurations to be applied, as well as a residual risk position defined within the security model. This included significant consultation with the ICO and leading and highly respected data protection experts, and the National Police Risk Management Team which reports to the Police Information Assurance Board.”
For Shaun Beresford, a senior data protection officer at ClearComm with decades of information management and data protection experience in a policing context, the identification of risks and their mitigation within the DPIA was generally “glossed over” and “could have been done with a crayon”.
“They’ve stripped it back to the basics without really exploring risks and coming up with proper mitigation,” he said, adding that the only solution proposed for all of the 12 risks presented is staff training and strict access controls, which has got “nothing to do with the risk that’s been presented”.
Beresford said the DPIA does not go far enough in distinguishing between the police’s separate GDPR and DPA obligations, given that it only mentions Part Three processing – the law enforcement-specific data protection legislation – in one paragraph.
“It appears to be merging criminal investigation information with employee data – one of those sits on Part Two of the DPA, which is the GDPR [transposed into English law], and the other sits in Part Three,” he said, adding that although the GDPR does apply to police forces for certain data-processing activities – such as human relations management or business support – only the DPA applies to operational law enforcement data and the related processing activities.
Therefore, ensuring compliance with the GDPR does not mean a police force is absolved from abiding by the rules set out in the DPA, and vice versa.
Beresford added that each risk should have been assessed and reconciled specifically against either the GDPR or DPA Part Three, as appropriate.
“They’ve just lumped it all together, but the risks are different depending on which side of the business you’re dealing with,” he said, pointing out that the document also does not identify the legal bases used for the processing against either piece of legislation.
Although the DPIA does say “the lawful bases for processing information via the NEP solutions are in fact no different to the lawful bases for the processing which forces currently undertake”, it makes no effort to explain these bases and how they apply to different processing activities.
Beresford said: “In terms of the Part Three processing, you need to qualify your lawful basis with a Schedule 8 condition, but Schedule 8 is not even mentioned in here.”
Schedule 8 of the DPA 2018 lists nine conditions, at least one of which must be met before the processing of sensitive data can occur.
He added that he would also have expected to see a whole section about data subjects’ rights that referred to how each was affected by different processing activities under the different pieces of legislation.
When asked to identify the Schedule 8 condition being relied on for the processing, the NEP spokesperson said: “This detail will be included in the force-specific DPIAs, as the condition used will depend on the purpose of the processing in each case.”
For Chris Pounder, director of data protection training firm Amberhawk, the conflation of the two data protection regimes in the DPIA is not necessarily “fatal”, but could lead to the statutory code of practice on the Management of Police Information (MoPI) not being followed.
The MoPI rules are designed to help law enforcement entities properly manage and share operational information for policing purposes, and although it is not fully aligned to the definition of law enforcement processing in Part Three of the DPA, it is very close.
“There could be an issue if forces have to ensure MoPI rules are followed for some personal data and not for others,” said Pounder, adding it could open up “unofficial backdoors” for the sharing of law enforcement-related information between police officers and staff.
“The problem is if the personal data in this new, easier-to-use system gets out of step with the personal data on established police systems – with the risk being that the MoPI rules are not followed when there is law enforcement processing with the new system,” he said.
Pounder further commented that the DPIA sidesteps the issue of maintaining activity and consultation logs (a requirement under the DPA) for the system’s use, “which are there to basically underpin the veracity of the evidence” used by police in prosecutions, because it shows who has accessed the data, for what purposes, and when.
“If information is used in a court, there is a risk that the barrister for the defence would start using the argument that, to put it crudely, if it sidesteps the logging arrangements, how do we know they are met?” he said.
Experts suggest that, in attempting to consider the implications of the GDPR and the DPA side by side, the DPIA has conflated the two regimes, in turn making it unclear whether the data protection risks have actually been mitigated.
An NEP spokesperson confirmed that risks had been assessed differently under the GDPR and the DPA 2018 “where appropriate or applicable”, but provided no further detail.
While the NEP spokesperson claimed the programme has refined its risk and mitigation approach “as further information has come to light and further feedback has been received from various stakeholders”, including the ICO, there is no evidence in any of the disclosures that this has taken place.
For example, the spokesperson claimed that in the most recent version of the DPIA (which will be “published on the NEP website shortly”), the above approach to describing risk levels has been amended “to align with the approach taken on other national policing projects, and to reflect further developments in case law, regulatory guidance and privacy custom and practice”.
The NEP spokesperson added: “The ICO was provided with a full copy of the DPIA and provided detailed comments and feedback on the document, which have been addressed in later iterations of the DPIA.
“As with any DPIA, it is a living document, and so there have been various updates to the DPIA since that date, to reflect changes in law and regulatory guidance.”
However, the version now published on the NEP website is identical to the version disclosed to Computer Weekly, both of which say the document has not been updated since being marked “final” on 2 May 2019. Therefore, although the NEP claims that ICO guidance was incorporated into later versions of the DPIA, these documents were not disclosed to Computer Weekly, which was also not advised about the potential existence of newer versions during the FOI disclosure.
On top of this, when asked by Computer Weekly if it had indeed been consulted on the national DPIA, an ICO spokesperson directly contradicted the NEP’s claims, saying: “We provided informal data protection advice on the National Enabling Programme, but a data protection impact assessment was not formally submitted for consultation with the commissioner.”
When further asked if it would commit to publishing an unredacted version of the DPIA, the NEP spokesperson said: “The latest version of the DPIA will also be published on the same site in the New Year (subject to redaction under similar principles to protect national infrastructure from crime or terrorist acts).”
But Sayers said: “If a service cannot comply with the law in key areas, then creating another DPIA will not fix that. If the new DPIA says it is in fact compliant, or can be made to comply, then that DPIA is suspect and probably not worth the pixels wasted to display it. If the DPIA is honest and factual, then the police will need to stop using the service.
“It is the M365 service and how it operates plus the Microsoft terms of service that need to be fixed, not the DPIA.”
There is also no mention in the DPIA of Microsoft being a data processor, despite the document listing 12 M365 applications that will make up the police productivity suite.
Instead, although it is clear that the bulk of processing activities will take place on Microsoft cloud services, it is BT and Deloitte that are identified as “key data processors in the NEP solution ecosystem”.
However, when asked why the DPIA does not mention Microsoft as a processor, the NEP spokesperson said: “The national DPIA explains in some detail which Microsoft components and tools are included within the NEP product set, together with a description of what each item is and what it will be used for.”
But Sayers said the way in which the NEP has assessed the risks is inconsistent.
“They haven’t considered the Data Protection Act Part Three bits at all, nor have they complied with the Part Three requirement to demonstrate how you’re meeting all the obligations. That in itself means it’s not a valid DPIA, even if everything else was correct,” he said.
“And then to omit Microsoft as a processor – that just makes no sense to me whatsoever. So I can say without any reservation at all that this is not a valid DPIA, and anyone that relies on that DPIA is putting themselves into a really dangerous legal position.”
Police forces ignore contents of DPIA
Despite 12 forces responding to Computer Weekly’s FOIs to say that they were relying solely on the national DPIA, the document itself clearly says this is not an option.
“Each police force will be its own data controller for the personal data which it collects and processes using the NEP solution,” it says. “Each police force is therefore its own data controller for the purposes of its use of the NEP solution. This DPIA does not replace the specific risk assessments which each individual force must undertake.”
The document adds that only the forces themselves are able to assess their own specific data protection risks based on their own circumstances.
This is further corroborated in the data protection officer’s summary section of the DPIA.
“While this document will be shared with forces, each force is required to consider the privacy impact, risks and mitigations for its own account,” it says, adding: “As forces and other tenants are transitioned into using the NEP solution in a live environment, they (acting as data controllers) will need to undertake their own impact assessments.”
The NEP also identified that it “is not itself a data controller” and “has no ability to enter into contracts”.
Pounder said that although it might reasonably be argued that each individual controller does not have to conduct its own separate DPIA for similar processing activities – there are 43 police forces in England and Wales – each force does need to review the national document to confirm it is valid and that they agree with its contents.
“Obviously they can coordinate their efforts, but at the end of the day, they have to be satisfied that what has been done in their name is actually what they want to happen,” he said, adding that each force has its own senior information risk owner (SIRO) who is ultimately responsible for signing off the risks associated with their force’s data processing activities.
This means that even if the national SIRO – which in this case is City of London Police commissioner Ian Dyson – accepted risks on their behalf in the national DPIA, “the controller would not have a leg to stand on” if any damages were later caused to the rights and interests of individuals.
“If I was a SIRO, I don’t think I’d take the word of another SIRO without actually just having a look through the documentation,” said Pounder.
Therefore, although the national DPIA clearly indicates that it has been shared with police forces, it is unclear whether the 12 forces relying on it actually reviewed the document before accepting it as valid.
Sayers said: “It seems clear from these findings that forces already consuming M365 services on the basis of this assessment have left themselves open to very serious legal and financial risks, which of course Microsoft as the data processor also shares. Since the affected forces carry individual responsibility for this in law, they should quickly and directly engage the resources needed to address the issues.”
When asked for the justification for forces relying on a single national DPIA rather than conducting one of their own, an NEP spokesperson said “forces will conduct their own”, adding that the central DPIA will be “amended and added to where necessary by forces locally when they come to access and utilise the NEP solutions for the processing of personal data”.
The spokesperson added: “Forces will each have the opportunity to consider any additional, different or local privacy risks when reviewing this DPIA and completing their own validation checks. Again, to assist forces across the country, the NEP has sought to standardise the format for the DPIA and has included, at Appendix A, space for individual forces to reflect on any additional or different local risks and mitigation strategies.”
‘Local amendments’ to DPIA
The NEP spokesperson said it is only as forces achieve the “full roll-out stage” that they will develop a local version of the document, adding: “The NEP has only recently reached the full roll-out stage with the first tranche of 12 forces, who rolled out at the end of October or start of November.
“The full roll-out stage means forces have achieved the technical standards which allow them to progress the implementation from a pilot of around 250 users to making M365 tools available for wider use, including use of sensitive data, in a staged process, across the whole workforce.
“Each force then decides which services within the M365 suite are used, by who, and how they are implemented.”
However, conducting a DPIA only upon “full roll-out” does not reflect the legal requirement, which is to conduct one before commencing any data processing activities.
The forces in the first tranche include Cumbria Police, Durham Constabulary, Derbyshire Police, Kent & Essex Police, Norfolk & Suffolk Constabulary, North Yorkshire Police, West Yorkshire Police and BCH (Bedfordshire, Cambridgeshire and Hertfordshire).
However, of these, only Kent confirmed it has completed a DPIA, and only BCH – which responded jointly to the FOI – said it was in the process of adding “local modifications”.
While the DPIA does contain an appendix for forces to add their local risk considerations, only nine out of 30 confirmed in their FOI responses that they were doing so.
Both Sayers and Beresford expressed further concerns that the appendix, which is a single page divided into five headings, was too lightweight to give forces the space to consider key aspects of M365 and its implementation from their local perspectives.
“I think it’s been drafted to try and encourage people, to give them the feeling, that if they just covered off these five areas then send it back, that’s it, and life goes on,” said Beresford.
“If I was a force data protection officer (DPO), I would look at the one that’s been provided and go, ‘That nowhere near covers everything’. It makes a reference to Part Three processing [once], but I don’t think people really understand the difference and the nuances [between Part Three and GDPR]. There is a requirement under Part Three to identify categories of people whose data is being processed for law enforcement purposes, there is a requirement to include logs – these are all things that, at the higher level, there should be reassurances given.
“When you start to review the content of the DPIA, actually they’ve not looked in any depth at anything.”
The experts who reviewed the DPIA on behalf of Computer Weekly also noted a number of smaller discrepancies that further undermine its validity.
For example, despite being marked “Final” and “Final for sharing with forces”, there is no evidence in the document that it has been signed off by the appropriate police officers or staff, as the section on “sign off and record outcomes” has been left completely blank.
Elsewhere in the document, places where redactions related to personal information and law enforcement exemptions would apply have been clearly marked but left empty, indicating that, rather than being redacted, the document was simply not finished.
The scope of the DPIA consultation was also called into question, because it was limited to just NEP post holders, senior Police ICT Company staff and local force representatives from Kent, Essex and Sussex. In response to this, Sayers said, “This is principally an echo chamber, rather than a consultation exercise,” noting the need for “critical friends” to be involved from further afield.
The DPIA also claims that the NPCC “has set a UK Policing Vision 2025 to have all 48 police forces in the UK digitally enabled and cloud ready,” but that document does not mention the word “cloud” once in 11 pages.
‘No information held’
As previously mentioned, each force must also ensure that a contract in writing exists between itself and Microsoft setting out details of the data processing, including its duration and nature, and the type and categories of personal data involved. To be valid, the contract or terms of service must be explicit in how they meet the DPA requirements.
Despite these requirements, 29 of the police forces Computer Weekly contacted were unable to provide the requested information about their contracts with Microsoft, claiming that their obligation “is to provide information that we hold, which does not extend to undertaking analysis or interpretation of contractual documentation”.
However, 15 of these forces provided an additional link “in the interests of transparency” to the data protection addendum the NEP holds with Microsoft, as well as to the online terms of service, which they claimed “contains all the mandatory clauses required under Section 59 of the Data Protection Act 2018”.
This section principally lays down a requirement that a controller must only use a processor who gives certain guarantees in a written contract but, as the vast majority of forces do not hold this information, Sayers said, “It is impossible to see how the force, as a controller, could fulfil its legal obligations”.
The section cited also does not cover the activity and consultation logging requirements, or the international transfer restrictions, that Computer Weekly specifically asked about, and which are found in completely different sections of the DPA.
When asked whether the NEP could point to the specific text in the Microsoft-NEP data protection addendum that covers the DPA 2018 clauses asked about, an NEP spokesperson said: “We cannot be certain which document you are referring to as you have not provided a link to the data protection addendum in question.”
The spokesperson added: “We are therefore not able to provide a comprehensive line-by-line mapping of all of the Section 59 obligations to the data protection addendum. However, we confirm that this has been addressed within the DPIA, and the attendant risks (and corresponding mitigation strategies) also described within the DPIA.”
There is only one data protection addendum on the NEP website, and 15 forces (half of the respondents) gave near-identical responses linking Computer Weekly to the document.
Sayers said the actual terms linked to by the police forces in their FOI responses make only “simple generic statements” about key aspects of the processing that must be included in a written contract – such as the type and categories of personal data involved – and that the documents themselves make no reference to Part Three of the DPA.
They do, however, make frequent reference to the GDPR, which came into force at the end of May 2018 alongside the DPA.
“I think there’s a fundamental problem that the DPOs themselves don’t understand the legislation, or what the division is between GDPR and the DPA Part Three,” said Sayers.
This was corroborated by Beresford, who said the consensus when he delivered training sessions on Part Three processing to police forces in 2019 was that “the training was the first they’d received on Part Three” since it came into effect in May 2018.
He added that the perception of police officers and staff he had spoken to during these sessions – delivered when working for data protection training and management firm IT Governance – was that, “It’s all GDPR”.
In response to questions about the lack of contracts and force-specific DPIAs, an NEP spokesperson said: “We are working to advise and enable every local force to make appropriate information available to them in order to complete their local DPIAs. This will include providing information which they might not hold at this stage regarding their contract and terms of service from Microsoft.”
International data transfers
Two of Computer Weekly’s FOI questions related to international transfers of data, which in the context of law enforcement must meet strict thresholds, as set out in the DPA, to be considered legal.
One of these questions was whether the contract or T&Cs with Microsoft specifically required the processor to seek and receive permission before transferring data to a third country, for each particular transfer made. Forces generally claimed this was covered in the NEP’s data protection addendum with Microsoft.
While Section 59 of the Act does set out the obligation that processors may only transfer data to a third country if instructed by the data controller, Sayers said: “The Microsoft terms of service make no such guarantee, nor does the linked terms of service include the clauses as suggested. In fact, quite the contrary – the standard terms empower Microsoft to transfer data at its discretion as part of its day-to-day services under a blanket approval given by the controller when accepting the terms of service.”
The DPA also contains four further criteria that need to be met for a law enforcement entity to be able to transfer data internationally to a “non-relevant authority”, in this instance Microsoft, that does not carry out statutory law enforcement functions in their jurisdiction.
These conditions include a stipulation that the transfer be considered “strictly necessary”, which is a significant point of law, said Sayers. “It means that you cannot do your business in any other way,” he said. “Say you had to process a digital photograph for forensic evidence and the only software that existed in the world that could do it was in the US, then you could demonstrate that it was strictly necessary.
“However, if you could process that material in Huddersfield, it’s not strictly necessary to send it overseas – it might be your preference, or it might be orders of magnitude cheaper to process it in the US, but it’s no longer strictly necessary.”
He added: “Since services already exist within the UK and EEA that can fulfil the requirements of a police force to process this data – and they have been using them for many years – it follows that forces are sending this data offshore as a matter of expediency and are choosing to do so. They do not need to do so.”
When asked whether the NEP had considered these kinds of services, specifically sovereign cloud providers with the ability to carry out all of their storage and processing activities within the UK, a spokesperson responded: “All options were considered and discussed with leads from across policing.”
The rationale for choosing Microsoft, they said, was based on a number of factors: local forces were already progressing with their own moves to Office 365; the Crown Commercial Services framework discounts for on-premise licensing were being removed and applied to M365 instead; Microsoft was reducing, and in some cases deprecating, development of solely on-premise products, requiring a different approach to maintaining operability and a transition to the cloud; and national policing had already invested significantly in Microsoft products.
The spokesperson said Google was considered, but had no UK-based services to consume at the time when forces were making their decisions, adding: “We are continually looking at opening up options for forces to select other cloud solution providers whose services align with the technical requirements (blueprint) approach.”
UK data storage and processing
When asked in Computer Weekly’s FOIs whether the contracts or T&Cs specifically require data to be stored and processed in the UK, a number of police forces answered: “The region selected for storage of data is the UK.”
Although there is no formal legal requirement to ensure data remains in the UK, the question reflects an operational reality – unless the four transfer conditions are met, the data should not be transferred to another jurisdiction.
However, although forces claim the data is being stored in Microsoft’s UK region, the company says on its website that there are exceptions for cloud services, “which back up web- and worker-role software deployment packages to the United States regardless of the deployment region”.
An NEP spokesperson confirmed that the group “have always been aware of this and it was considered in detail”, again reiterating the claim that the NEP has undertaken “a robust and detailed security risk management assessment”.
The spokesperson added that the NEP “continues to follow the UK government’s cloud-first strategy while balancing the wide range of risks against the opportunities provided by modern approaches to technology – this risk is also addressed in the DPIA, together with the attendant mitigation strategy”.
Computer Weekly and the experts it consulted have been unable to identify the areas of the DPIA where this risk is addressed and mitigated.
Sayers said: “Use of the M365 service requires the [data] controller to accept that data can, and shall, be internationally transferred to fulfil the services at Microsoft’s own discretion and without further authority being sought. A force therefore cannot evidence they have met the requirement on this basis.”
He added that since the data protection addendum between Microsoft and the NEP references only the GDPR, its transfer commitments “would carry very little weight for a DPA Part Three test in any foreseeable circumstances”.
Guidance on data sovereignty issued by the UK Ministry of Justice states that organisations should consider where data is located, whether it can be accessed (as well as modified, copied or deleted) remotely from a third country, who is managing the service, and where those entities are legally instantiated or located.
“For example, Microsoft Azure’s datacentre is in the UK, but the system administrators can be located in Brazil, New Zealand, the US, etc,” it said, also giving the example of Amazon Web Services, which has UK datacentres but is nevertheless a US company with global support staff.
“The ‘where’ data is processed is the combination of the answers to the questions above and is much more than just where the servers and hard drives are physically located (data hosting),” it added.
The omission of Microsoft from the national DPIA therefore means the risks associated with it being a US company subject to a wide range of US government surveillance powers have not been clearly addressed.
When asked how this risk has been mitigated, an NEP spokesperson said: “This issue has become more heightened since the publication of the Schrems II judgment in July 2020. It has therefore been addressed in detail in the latest version of the DPIA.”
Again, the DPIA obtained by Computer Weekly, and which was subsequently uploaded to the NEP website, says it was last amended on 2 May 2019, although another version is due for publication in the new year, according to an NEP spokesperson.
Beresford said: “I’ve been involved in the Blackbaud Raiser’s Edge data breach, which was a ransomware attack, and it is emerging now that information [on its cloud platform] was being sent back to America and some of the clients didn’t even know about it.” He added that the DPIA goes into no detail on how police approached this issue.
“When the DPIA talks about BT and Deloitte, it does make reference to the fact that there would be no international data transfers, but is that just something they’ve put in there and we’re meant to believe it?” said Beresford. “How was it explored, how was it agreed?”
When asked in FOIs to provide other documents that could potentially demonstrate further compliance – including two official risk assessment documents from the Home Office and the memorandum of understanding agreed between the NEP and Microsoft – Computer Weekly received non-disclosure responses.
While the national DPIA did not mention Microsoft as a processor, let alone in the part of the text relating to international transfers, it did make the assertion that no data would be transferred outside the EU.
When asked whether it could guarantee that police forces will always be certain of where their data is stored and processed on M365, a hyper-scale global cloud service, an NEP spokesperson said: “We have assurances from Microsoft that the majority of the products within M365, including those holding sensitive data, will be held within the UK. The exceptions to this relate to disaster recovery and some products have been specifically risk assessed for this and balanced against the potential for loss of data in these circumstances.”
The spokesperson added that Microsoft is a “core police technology provider” and, as such, has given the NEP access to detailed engineering and technical information on the structure of its M365 software as a service. “This information formed part of the security risk management process undertaken by NEP. The blueprint therefore takes account of the administration and demarcation of support services provided by a global supplier.”
The NEP spokesperson also maintained that the risk of not knowing exactly where data is stored and processed has been considered in the DPIA, along with steps that can be taken to mitigate the risk. However, Computer Weekly has found nothing in the DPIA that considers this or presents mitigation steps.
When asked separately by Computer Weekly whether its terms of service cover data processing under Part Three of the DPA 2018, and for clarification on whether data is in fact stored and processed, and the systems supported, from within the UK, Microsoft said it does not currently have any information to share.
Speaking to Computer Weekly about the implications for enterprise data sharing following the Schrems II decision – which invalidated Privacy Shield as a mechanism for sharing data between Europe and the US – Phil Lee, a partner in law firm Fieldfisher’s privacy, security and information group, said that simply committing to storing or hosting data in the UK is not enough, in and of itself, to mitigate the risks associated with the service provider’s remote access.
“Often a lot of the argument around this is driven by positions that are based on psychological perspectives or emotional positions, rather than strict legal positions,” said Lee. “Commercially, in my experience, it’s absolutely true to say that a lot of customers will just feel better if their data is hosted in the EEA or the UK.
“Even if there is still remote access [from overseas] and even if legal problems still exist, because there still could be a technical export there, the fact that it’s hosted here just makes people feel better.”
Any additional safeguards put in place to ensure data remained in the UK would not be able to eliminate the risk completely when using US-based companies, said Lee.
“Vendors can agree that they won’t disclose data on a purely voluntary basis to governments, but if they are forced to hand over data because they receive a court order, they will still have to do so,” he said. “Vendors can agree to host data in the EEA or the UK but, if you look at legislation like the US Cloud Act, governments can reach into overseas datacentres anyway.
“None of the measures we take are going to be safe – the only one that will really solve this issue is a political solution.”
Despite the obvious risks of sending sensitive personal data to a US-based company subject to the US government’s surveillance regime, the omission of Microsoft from the DPIA suggests this risk has not been properly considered.
Movers and shakers
According to publicly available information, the introduction of Microsoft technologies to UK policing under the NEP has been driven by a small group of senior police officers, principally drawn from former force chief information and technology officers.
As the national senior information risk owner (NSIRO) and NPCC lead for information management, Commissioner Ian Dyson of the City of London Police is responsible for managing the information risks associated with any national policing capabilities, which includes the implementation of M365 through the NEP.
Alongside Dyson, the key movers include NEP director Wayne Parkes, a former head of ICT at West Mercia and Warwickshire, and Ian Bell, CEO of the Police ICT Company and a former head of ICT at Cambridgeshire, who was made the first director of the NEP in November 2016.
Both have featured regularly in NEP and Microsoft articles extolling the value of the programme and the suitability of the M365 technology for police use, but neither responded to Computer Weekly’s request for comment on the legal issues identified. Instead, responses to the questions put to them, and to Dyson, were provided by an NEP spokesperson, and directly mirrored those given above.
The NEP, which is supported through the Police Transformation Fund, operates under the combined business governance of the APCC and the NPCC’s Information Management and Operational Requirements Coordination Committee, which is chaired by NSIRO Dyson and established to help chief officers interpret data protection in the police environment, overseeing areas such as freedom of information, information assurance and records management on behalf of the police service.
Dyson was also the senior responsible owner (SRO) for the programme from its inception, before handing the role over in early 2020 to Essex deputy chief constable Pippa Mills, whose force is one of the early adopters of M365.
In the DPO advice section of the national DPIA obtained by Computer Weekly, one of the suggested actions is to share a draft with the ICO so it can review and comment on the document.
However, as the sign-off section has been left blank, it is unclear whether this DPO advice was accepted or overruled. Under the DPA, it is mandatory to send a DPIA to the ICO when the processing of personal data presents a high risk that cannot be mitigated.
But when contacted by Computer Weekly, the ICO initially refused to confirm whether or not it had received a copy of the DPIA.
In a separate response, the NEP press office said: “We have taken independent advice through specialists such as the ICO.”
The DPIA itself also shows that the NEP was planning to send a version by “no later than 30 April 2019”, but when this information was put to the ICO, a spokesperson responded: “We provided informal data protection advice on the National Enabling Programme, but a data protection impact assessment was not formally submitted for consultation with the Commissioner.”
Where do police go from here?
Three years into the programme, it is not immediately clear what the NEP and forces can do to recover from the legal deficiencies identified by Computer Weekly and others.
Sayers said: “Forces are certainly in a catch-22, with a service in place that does not, and almost certainly cannot, enable them to meet their legal obligations, but with significant public money already spent and more committed. The temptation to do nothing is probably high.”
In response to questions about how much money had been spent on the project so far, an NEP spokesperson responded: “The total programme budget is £57.7m over the four years.”
Sayers added: “Doing nothing is not, however, a valid option, since forces would continue to act in contravention of the law, public confidence would be permanently damaged and risks of fines or civil actions will only grow over time.”
However, he does believe options exist, some of which might enable forces to leverage the internal work they have already done towards M365 adoption, and still retain the new capabilities they feel necessary to meet modern policing challenges.
“There is a healthy domestic cloud and secure hosting landscape within the UK that could legally accommodate these services, along with a number of specialist providers who have both the skills and a desire to innovate in this sector,” said Sayers. “If forces act positively and properly respect the legislation, they could probably pivot to those services fairly easily. If they do so, most of the digital policing strategies may yet still be ultimately achieved.”
Computer Weekly contacted a number of UK-based cloud and hosting providers with experience of delivering police and criminal justice sector services, which said they would be broadly positive and receptive to the idea of working with police to develop a UK sovereign cloud capability, should forces decide to explore such opportunities.