The UK’s Data Protection and Digital Information (No 2) Bill passed its second reading in the House of Commons on 17 April 2023. Completion of that formal stage in Parliamentary proceedings confirms approval of the Bill in principle. From there, the Bill moves into its committee stage for more detailed scrutiny. The second reading debate highlighted the issues most likely to dominate discussions during that committee stage:
EU-UK Adequacy
Introducing the Bill, Julia Lopez (Minister for Data and Digital Infrastructure) confirmed that the intention was to deliver a business-friendly regime without creating regulatory disruption for businesses, particularly those that trade with Europe and want to maintain a free flow of data. The Minister offered reassurance that the UK government has been in constant contact with the European Commission and believes that the UK will maintain adequacy following enactment of the Bill.
Despite that reassurance, opposition MPs highlighted areas of concern already identified by the EU. They include:
- a significant expansion of the Secretary of State’s powers to make regulations to amend the legislation (including the list of “recognised legitimate interests”) with limited Parliamentary scrutiny and to direct the content of codes of practice;
- replacement of the Information Commissioner’s Office (ICO) with a newly constituted Information Commission, subject to more extensive Ministerial influence over appointments, the setting of strategic priorities and the approval of codes of practice, together with the removal of the requirement to consult on impact assessments;
- increased scope for surveillance, including the use of facial recognition technologies, by law enforcement bodies and intelligence services.
It was also clear from the debate that for some MPs, maintaining EU adequacy is not a priority. Former Minister John Whittingdale dismissed that concern, suggesting that the availability of alternative transfer mechanisms such as standard contractual clauses would suffice. Although such views are probably in the minority, interventions of that kind are likely to ensure that the EU remains alert to signs of significant divergence, and ready to accelerate its review of the UK’s adequacy should its concerns remain following the committee stage.
Reduced burden?
The Minister’s introductory statement heralded a reduction in “red tape” and administrative burden. While the principle of streamlining was broadly welcomed, opposition MPs pointed out that the actual effect on businesses operating across jurisdictions might be less beneficial: the need to navigate multiple data protection regimes would itself increase costs and create bureaucratic headaches, since businesses that deal with the EU will in any event have to adhere to the EU GDPR.
In our view, this assessment is likely to be correct: for many of the UK’s most successful businesses, the burden of managing operations under diverging UK and EU regimes is likely to outweigh the perceived benefits of a modified UK approach. In addition, international companies may be deterred from developing business in the UK if the benefits of operating in the UK domestic market are seen to be offset by additional regulatory burden and exposure.
Automated decision making and AI
The Bill includes provisions intended to support the “pro-innovation approach” set out in the UK government’s AI White Paper. Citing concerns over bias and opacity in algorithms, opposition MPs drew attention to risks such as the SyRI case in the Netherlands, in which seemingly innocuous data, such as household water usage, was used by an automated system to accuse individuals of benefit fraud, and to trust-eroding episodes such as the 2020 “debacle of the exam algorithms”. Those concerns are likely, during the committee stage, to focus on clause 11 of the Bill, which would permit solely automated decision making in a wider range of contexts.
Setting out those concerns, Labour MP Rebecca Long Bailey said: “AlgorithmWatch explains that automated decision making is ‘never neutral’. Outputs are determined by the quality of the data that is put into the system, whether that data is fair or biased. Machine learning will propagate and enhance those differences, and unfortunately it already has.” She also expressed concern that the Bill removes important safeguards that protect the public from algorithmic bias and discrimination, and that it provides Ministers with “Henry VIII powers” (allowing the Executive to alter primary legislation through statutory instruments with limited Parliamentary scrutiny) that would allow the Secretary of State to make sweeping regulations on whether meaningful human intervention is required at all in such systems.
Responding to those concerns, Paul Scully (Under-Secretary of State for Science, Innovation and Technology) confirmed that the government intends to introduce a statutory instrument to provide for the monitoring and correction of bias in AI systems “by allowing the processing of sensitive personal data for this purpose with appropriate safeguards”. He also asserted that the absence of meaningful human intervention in solely automated decisions, and the opacity of how those decisions are reached, will be mitigated by giving data subjects the opportunity to make representations about, and ultimately challenge, decisions of this nature that are unexpected or seem unwarranted. For example, if a person is denied a loan or access to a product or service because a solely automated decision-making process has identified a high risk of fraud or irregularities in their finances, that individual should be able to contest the decision and seek human review; if the decision is found on review to be unwarranted, the controller must re-evaluate the case and issue an appropriate decision.
The views expressed by Paul Scully did not convince Darren Jones MP (a solicitor and data protection law expert). He observed: “Under current law, fully autonomous decision making is prohibited where it relates to a significant decision, but the Bill relaxes those requirements and ultimately puts the burden on a consumer to successfully bring a complaint against a company taking a decision about them in a wholly automated way. Will an individual really do that when it could take up to 20 months? In the world we live in today, the likes of ChatGPT and other large language models will revolutionise customer service processes. The approach in the Bill seems to fail in regulating the future and, unfortunately, deals with the past. I ask again, which stakeholder group asked the Government to draft the law in this complex and convoluted way? It was certainly not consumers.”
Next steps
Having cleared its second reading, the Bill now moves into committee. Committee proceedings are due to conclude by Tuesday 13 June 2023. From there, the Bill will be scheduled for its third reading before moving to the House of Lords. The programme motion agreed at the end of the second reading debate includes provision for the Bill to be carried over into the next Parliamentary session if any stages remain incomplete. On that timetable, the target date for the Bill coming into force is likely to fall in 2024.