Victoria Police have tried to distance themselves from controversial facial recognition company Clearview AI. (Photo: Victoria Police, Facebook)

Face Off: our police and the secret spy tech

If a Victoria Police officer wants to find out what George Clooney looks like, they’ve got Clearview AI. The rest of us have an opaque system of police secrecy to contend with.

Hatch has received documents through a Freedom of Information (FOI) request relating to Victoria Police’s use of the controversial facial recognition app, Clearview AI.

The FOI request was filed in March, when BuzzFeed News revealed four Australian police departments – the Australian Federal Police, and the South Australian, Queensland and Victorian police forces – had used Clearview.

These departments were among more than 2200 public and private institutions across 27 countries using the app, including some in countries with poor human rights records.

While the departments initially denied any involvement, it was later confirmed that more than 1000 searches had been made across the four.

Hatch can report seven members of the Victorian Joint Anti-Child Exploitation Team (JACET) were trialling the technology from November until as recently as March this year.

Hatch has obtained 32 emails from Clearview to JACET, showing a sloppy, dated sales pitch that was not tailored to the Australian market.

Clearview AI’s sales team has clearly not been through cultural awareness training.

Officers were encouraged to try searches on themselves, or on ‘well-known’ celebrities like George Clooney or Joe Montana.

Clooney and Montana were likely thought to be recognisable in a male-dominated police department, but here in Australia ex-NFL star Joe Montana doesn’t carry the name recognition Clearview might have hoped for.

The officers were told to ‘run wild’ with searches, which would not be saved to the Clearview database – unlike the 3 billion photos the searches are checked against, which were compiled without the subjects’ consent.

This is why Clearview is so controversial: its immense database is scraped from publicly available images on social media platforms and other sources, such as mug shots.

Social media platforms such as Facebook have requested Clearview remove images obtained from their platforms, but Clearview has refused, as it is not obliged to do so.

Clearview AI’s database is built on publicly available images from social media and mugshots, among other sources.

While pushing clients to make as many searches as possible, the tech company makes the unsubstantiated claim that “investigators who make 100+ Clearview searches have the best chances of successfully solving crimes with Clearview in our experience”.

This is despite a disclaimer in an earlier email advising that Clearview search results are indicative only and are not to be used as evidence in a court of law.

Despite its promises about the app’s searching power, Clearview includes this disclaimer about its limitations.

Exactly how accurate Clearview’s results are is difficult to ascertain. While facial recognition technology has become increasingly accurate in recent years, concerns remain over gender and racial bias in its effectiveness.

A 2019 US study found Asian and African American people were up to 100 times more likely to be misidentified by facial recognition than white men. Women, children and the elderly were also more likely to be incorrectly identified.

As with many technologies, this is largely because facial recognition systems are trained on data drawn from people similar to those developing them – mainly middle-aged white men.

For this reason, the city of San Francisco banned the use of facial recognition by its police department last year, despite being an epicentre of the tech industry.

In response to the recent Black Lives Matter protests over the policing of minority communities, tech giants IBM, Amazon and Microsoft all announced they would stop selling facial recognition to police, or at least pause its use, until the matter is legislated federally.

In a statement, Victoria Police said: “Victoria Police was only engaged in a trial of the product and ultimately decided not to pursue it. We do not have any evidence to suggest it assisted with any investigations.”

The force continues to use its own facial recognition system, ‘iFace’, however.

The Australian Federal Police (AFP) initially rejected lawful FOI requests pertaining to its relationship with Clearview, until a question on notice from Shadow Attorney-General Mark Dreyfus forced the agency to respond.

“The use by AFP officers of private services to conduct official AFP investigations in the absence of any formal agreement or assessment as to the system’s integrity or security is concerning,” he said in a statement.

In October last year, the Parliamentary Joint Committee on Intelligence and Security reviewed two related bills: the Identity-matching Services Bill 2019 and the Australian Passports Amendment (Identity-matching Services) Bill 2019.

These bills provide for a National Biometric Facial Recognition Capability, known in shorthand as ‘the Capability’.

The Capability is a nationwide, interjurisdictional program for matching people’s facial biometrics against a central database to identify them swiftly.

It was first suggested in 2014, and in 2015 the then Minister for Justice, Michael Keenan, announced $18.5 million would be allocated to develop the Capability.

The Department of Home Affairs has repeatedly stated that the Capability in its current form will not be able to identify people in real time through public CCTV cameras, but will mainly be used to fast-track identification work.

Despite these assurances, the Parliamentary Joint Committee on Intelligence and Security, along with many public advocacy groups, expressed privacy concerns over the bills.

The legislation has been withdrawn for further review.