Today, we have no insight into how algorithms make key decisions that impact businesses and individuals… and the UK Government is beginning to act.
In the last 18 months the UK Government has released four reports examining key areas of harm online.
Authored by the Department for Digital, Culture, Media and Sport (DCMS) and The Department for Business, Energy and Industrial Strategy (BEIS), over 750 pages of analysis dig into:
- The sustainability of public interest news (Cairncross Review);
- Systemic privacy breaches in digital advertising (ICO’s Report into Adtech and RTB);
- The dangers of online targeting (CDEI’s report Review of Online Targeting); and
- Harms caused by the dominance of Google and Facebook (CMA Online Platforms and Digital Advertising Report).
Read together, a number of themes emerge around the harms of data centralisation. Across all four reviews, centralisation is identified as a root cause for many problems that each study independently investigates.
In this post, we take a look at opaque algorithms used in online targeting - an issue flagged by both the Centre for Data Ethics and Innovation (CDEI) and The Competition and Markets Authority (CMA).
Large, centralised data sets are core to the provision of targeting systems that drive the personalisation of services online. Increasingly sophisticated, these systems can be harmful, discriminatory and exploitative when making decisions on behalf of consumers whose data they hold.
But when data is stored centrally, access is often restricted, and this limits the ability of individuals, businesses and regulators to scrutinise how it is being processed. This is problematic across multiple areas of the digital landscape. And, unsurprisingly, it features prominently in both CDEI and CMA’s reports.
Both studies highlight ‘black-box’ decision-making as damaging for businesses and individuals. Neither group is able to fully understand or challenge how decisions are made in products and services they use daily. Worse, with data siloed and hidden, there is no independent verification. Online platforms both set the rules and mark their own homework - leaving consumers and businesses unable to hold them to account.
At present, opaque algorithms are not effectively scrutinised. Users, however sceptical, have little choice but to accept the outcomes and trust that platforms like Google are operating in their best interests.
Recommendations and Action
After taking time to highlight the potential harms of unaccountable targeting in their reports, both CDEI and CMA set out steps to fix the issue.
Notably, CDEI tasks the soon-to-be-created Online Harms Regulator to provide oversight of online targeting. And, in the same vein, the CMA also turns to government enforcement as a required solution – recommending that a Digital Markets Unit be established to enforce a code of conduct that will govern online platforms.
In our opinion, this shows that both DCMS and BEIS believe online platforms like Google and Facebook won’t change their behaviour or act in the best interests of consumers without intervention. Enforced access to previously inaccessible and siloed data will be key to enabling an effective audit of the algorithms and models that are increasingly central to our lives.
While government intervention is welcome, consumers and businesses don’t have to wait for it to end black-box decision-making.
At Glimpse, we are breaking data out of centralised silos and bringing it into the control of individuals by default. We built Glimpse to enable businesses to deliver personalised services to consumers at scale while ensuring the user retains absolute privacy and control.
To find out more …