Marrying a partner in the UK – the sham marriage algorithm


Public Law Project (PLP) – more on the secret algorithmic criteria used to decide whether to investigate a proposed marriage as a “sham”

Sham marriage investigations can be invasive and distressing, and they appear to target some nationalities more than others.

PLP is concerned about the lack of transparency and possible discrimination in an automated triage system, and we would like to reach out to the people who might be affected as well as the organizations that support them.

The sham marriage algorithm

Documents previously obtained by PLP under the Freedom of Information Act 2000 indicate that the triage system comes into play if one or both of the people giving notice of marriage to the registrar come from outside the European Economic Area, are not resident in the UK, or do not have a valid UK visa.

If one of these conditions is met, the couple enters the triage system.

The algorithm processes the couple’s data, applies the secret criteria, and gives the couple a green or red light.
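To make the reported workflow concrete, here is a minimal illustrative sketch in Python. It is not the Home Office’s actual system: the entry conditions follow the FOI documents summarised above, the scoring step is left as a placeholder because the criteria remain secret, and all names, types, and the green/red encoding are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class Partner:
    eea_national: bool      # national of a European Economic Area country?
    resident_in_uk: bool    # resident in the UK?
    valid_uk_visa: bool     # holds a valid UK visa?


def enters_triage(a: Partner, b: Partner) -> bool:
    """A couple enters the triage system if one or both partners come from
    outside the EEA, are not resident in the UK, or lack a valid UK visa."""
    def flagged(p: Partner) -> bool:
        return not (p.eea_national and p.resident_in_uk and p.valid_uk_visa)
    return flagged(a) or flagged(b)


def undisclosed_criteria(a: Partner, b: Partner) -> bool:
    """Placeholder for the secret criteria: the Home Office has refused to
    disclose them, so this step cannot be reproduced here."""
    raise NotImplementedError("criteria not published")


def triage(a: Partner, b: Partner) -> str:
    """Return 'green' (no investigation) or 'red' (investigate as a possible
    sham marriage), following the workflow described above."""
    if not enters_triage(a, b):
        return "green"
    return "red" if undisclosed_criteria(a, b) else "green"
```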

A red light means that an investigation is required to establish or rule out a sham marriage. The couple is asked to provide additional information and, often, to attend interviews and cooperate with visits to their home.

This can be a very intrusive process that many couples may be reluctant to agree to.

If they refuse, the couple cannot register their marriage.

If they do comply, immigration officials will use the new information to determine if the marriage is a sham.

If the decision goes against the couple, they can still marry, but their immigration status will be in jeopardy and one or other party could face deportation from the UK.

Despite repeated PLP requests, the Home Office refused to disclose the criteria used by the algorithm.

What is there to worry about?

As we now understand it, the system raises three main concerns.

First, because the criteria used by the algorithm remain secret, there are concerns about procedural fairness.

The Home Office insists that publication of the criteria could harm its ability to investigate possible sham marriages and would not be in the public interest.

We believe that public law standards require disclosure of how the system works, and that the refusal to publish is unlikely to be justified on public interest grounds. When decisions are made by an algorithm, there must be transparency and accountability.

Second, there are concerns that the algorithm may be discriminatory, because some nationalities, including Bulgarians, Greeks, Romanians, and Albanians, are more likely to be investigated than others.

If nationality is included in the algorithm’s criteria, the algorithm may be directly discriminatory, contrary to sections 13 and 29 of the Equality Act 2010.

Even if the criteria do not include nationality, the algorithm may nevertheless be indirectly discriminatory if it has a disproportionate adverse effect on people of a particular nationality, contrary to sections 19 and 29 of the Equality Act 2010.
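To illustrate the indirect-discrimination concern, the sketch below shows the kind of comparison involved: whether couples of some nationalities are red-lighted, and therefore investigated, at markedly higher rates than others. The data and rates here are entirely invented for the example.

```python
from collections import defaultdict

# Entirely hypothetical records: (nationality of referred partner, red-lighted?)
referrals = [
    ("Bulgarian", True), ("Bulgarian", True), ("Bulgarian", False),
    ("Romanian", True), ("Romanian", False),
    ("Other", False), ("Other", False), ("Other", True),
]


def red_light_rates(records):
    """Share of referred couples given a red light, grouped by nationality."""
    totals, reds = defaultdict(int), defaultdict(int)
    for nationality, red in records:
        totals[nationality] += 1
        reds[nationality] += int(red)
    return {n: reds[n] / totals[n] for n in totals}


for nationality, rate in red_light_rates(referrals).items():
    print(f"{nationality}: {rate:.0%} red-lighted")
```

A large and persistent gap of this kind would not by itself prove discrimination, but it is the pattern of disproportionate effect that sections 19 and 29 of the Equality Act 2010 are concerned with.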

Third, there is a concern that decision-making at the investigation stage can be flawed because of “automation bias,” that is, over-reliance on automated decision-support systems.

If the officer investigating a suspected sham marriage knows that the algorithm has given the couple a red light, he or she may be predisposed to conclude that the relationship is a sham.

In other words, the system may encourage reliance on irrelevant considerations.
