UK’s Home Office to face legal challenge over ‘digital hostile environment’

London, June 19: Immigrants’ rights campaigners are to bring the first court case of its kind in British legal history, in an attempt to shut down a decision-making algorithm that they claim creates a “hostile environment” for people applying for UK visas online.

The Joint Council for the Welfare of Immigrants (JCWI) has been granted permission for a judicial review challenging the Home Office’s artificial intelligence system that filters UK visa applications. The group claims the AI programme is designed to discriminate against applicants from certain nations, The Guardian reported.

In its submission to the high court, the JCWI said the algorithm created three channels for applicants, including a “fast lane” that would amount to “speedy boarding for white people” entering the country.

The rights group said applications from people holding “suspect” nationalities received a higher risk rating. These applications were subjected to far more intensive scrutiny by Home Office officials, took longer to reach a decision and were much more likely to be refused, it said.

It claims this amounts to racial discrimination and therefore breaches the Equality Act 2010.
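
Neither the tool’s code nor its nationality lists are public, but the behaviour the JCWI alleges can be sketched in outline. The snippet below is purely illustrative: every name, list and rule in it is an assumption made for the example, not the Home Office’s actual implementation.

```python
# Hypothetical sketch of the traffic-light "streaming" the JCWI describes.
# The real tool, its inputs and its nationality lists are secret; every
# name and rule below is an illustrative assumption.

HIGH_RISK_NATIONALITIES = {"CountryA"}      # the real list is redacted
MEDIUM_RISK_NATIONALITIES = {"CountryB"}    # likewise hypothetical

def stream_application(nationality: str) -> str:
    """Route an application into one of three channels by nationality alone."""
    if nationality in HIGH_RISK_NATIONALITIES:
        return "red"    # intensive scrutiny, slower decisions, more refusals
    if nationality in MEDIUM_RISK_NATIONALITIES:
        return "amber"
    return "green"      # the alleged "fast lane"
```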

The JCWI will also argue in the judicial review that the AI streaming tool is too opaque and secretive, pointing to the existence of a secret list of “suspect” nationalities. So far, the Home Office has refused to provide the JCWI with meaningful information about the algorithm in pre-action correspondence.

In this action, the JCWI and Foxglove, a new group campaigning for justice in the technology sector, will ask the court to declare the streaming algorithm unlawful. They will urge the court to halt its use pending a substantive review of the AI decision-making system.

Chai Patel, the JCWI legal policy director, said: “The Home Office’s ‘streaming tool’ has for years had a major effect on who has the right to come here to work, study or see loved ones. And it has been run in a way that, by the Home Office’s admission, discriminates, singling out some people as ‘suspect’ and others as somehow more trustworthy, just because of where they come from. This is the digital hostile environment.”

Martha Dark, a director of Foxglove, said: “Algorithms aren’t neutral – they reflect the preferences of the people who build and use them. This visa algorithm didn’t suddenly create bias in the Home Office, but because of its feedback loop, it does accelerate and reinforce them. Now, when systemic racism is high on the public agenda, is the perfect time for government to reassess this algorithm and all similar systems. The Home Office should scrap the streaming tool and set up a scheme that’s fair for everyone, regardless of colour or creed.”
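
The feedback loop Dark refers to can likewise be sketched only in the abstract. The fragment below is an illustrative assumption, not a description of the actual system: it shows how a rating fed by past refusals could reinforce itself.

```python
# Illustrative-only sketch of the alleged feedback loop: if refusal history
# feeds the risk rating, extra scrutiny of a nationality yields more
# refusals, which raises that nationality's rating further. The threshold
# and counters are invented for this example.

from collections import defaultdict

refusal_counts: dict[str, int] = defaultdict(int)   # hypothetical history

def risk_rating(nationality: str) -> str:
    # More past refusals -> higher rating -> more scrutiny -> more refusals.
    return "high" if refusal_counts[nationality] > 10 else "low"

def record_decision(nationality: str, refused: bool) -> None:
    if refused:
        refusal_counts[nationality] += 1    # the loop closes here
```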

A spokesperson for the Home Office said it would be “inappropriate to comment whilst legal proceedings are ongoing”.

It is understood that the algorithm is primarily used to establish whether a visa application requires closer scrutiny from Home Office officials. The final say on whether an applicant may enter the UK rests with human decision-makers.

Earlier this year the Home Office refused to release details of which countries were deemed a “risk” by the visa decision-making algorithm.

In response to their legal challenge over the artificial intelligence programme, campaigners were sent the list of nations in different categories of “risk” with every name blacked out.

https://www.theguardian.com/uk-news/2020/jun/18/home-office-legal-challenge-digital-hostile-environment
