Campaign group calls for public register of algorithms used by councils, files complaint with Information Commissioner

Campaign group Big Brother Watch has hit out at councils’ use of algorithms, claiming that most of those it had uncovered were “secretive, unevidenced, incredibly invasive and likely discriminatory”.

In a report, Poverty Panopticon: the hidden algorithms shaping Britain’s welfare state, BBW claimed:

  • 540,000 benefits applicants were secretly assigned fraud risk scores by councils’ algorithms before they could access housing benefit or council tax support.
  • Personal data from 1.6 million people living in social housing was processed by commercial algorithms to predict rent non-payers.
  • 250,000+ people’s data was processed “by a range of secretive automated tools to predict the likelihood they’ll be abused, become homeless or out of work”.

BBW said one algorithm, used by two London councils, claimed to predict residents’ risks of negative impacts arising from the coronavirus pandemic “and even whether they were likely to break self-isolation rules”.

The report also criticised the London Borough of Hillingdon’s ‘Project AXIS’, aimed at assessing children’s risk of future criminality, which “gathers data from police, schools, social care, missing persons, care homes, and even social media, without residents’ knowledge”.

BBW called for information on algorithms to be made publicly available, saying “secretive systems of digital suspicion should not be used behind closed doors”.

It added: “With private companies contracted to supply many public sector algorithms there is still little detail known about how most of these so-called ‘black box’ algorithms work. Commercial confidentiality can also mean that individuals rarely know how automated systems could be influencing decisions about their lives.”

BBW said it wanted to see a public register of algorithms that inform decision-making in the public sector, and for authorities to conduct privacy and equality assessments before using predictive tools to mitigate the risks of discrimination. It claimed that such assessments were rarely conducted.

The group has also lodged a complaint with the Information Commissioner, calling for an “urgent inquiry to uncover and regulate the Wild West of algorithms impacting our country’s poorest people.”

Jake Hurfurt, Head of Research and Investigations at Big Brother Watch, said: “Our welfare data investigation has uncovered councils using hidden algorithms that are secretive, unevidenced, incredibly invasive and likely discriminatory.

“The scale of the profiling, mass data gathering and digital surveillance that millions of people are unwittingly subjected to is truly shocking. We are deeply concerned that these risk scoring algorithms could be disadvantaging and discriminating against Britain’s poor.”