Australia’s two most populous states are trialling facial recognition software that lets police check whether people are at home during COVID-19 quarantine, extending trials that have stirred controversy to the vast majority of the country’s population.

The little-known technology company Genvis said on its software website that New South Wales (NSW) and Victoria, home to Sydney, Melbourne and more than half of Australia’s 25 million people, are trialling its facial recognition products. Genvis said the trials were being conducted on a voluntary basis.

The Perth, Western Australia-based start-up developed the software in 2020 in cooperation with the Western Australian Police to help enforce pandemic movement restrictions, and has expressed a desire to sell its services abroad.

South Australia began testing a similar, non-Genvis technology last month, prompting warnings from privacy advocates around the world about potential over-surveillance. New South Wales and Victoria have not disclosed that they are trialling facial recognition technology, and their involvement may heighten those concerns.

New South Wales Premier Gladys Berejiklian said in an email that the state “is close to piloting some home quarantine options for returning Australians” but did not directly answer questions about Genvis’ facial recognition software. The New South Wales Police referred questions to the premier.

Victoria Police referred the question to the Victorian Health Department, which did not respond to a request for comment.

Under the system being tested, people respond to random check-in requests by taking a “selfie” at their designated home quarantine address. If the software, which also collects location data, cannot verify the image against a “face signature”, police may follow up with a visit to the location to confirm the person’s whereabouts.


Although the technology has been in use in Western Australia since November last year, it has recently been promoted as a tool to let the country reopen its borders, ending the system, in place since the start of the pandemic, that requires international arrivals to spend two weeks isolating in a hotel under police guard.

Beyond the pandemic, police forces have also expressed interest in using facial recognition software, which has triggered strong opposition from rights groups over the potential for targeting minorities.

Although facial recognition technology has been used in countries such as China, no other democracy is known to be considering its use for coronavirus containment procedures.

‘Keep the community safe’

Beyond the disclosures on the product website, Genvis CEO Kirstin Butcher declined to comment on the trials.

She said in a telephone interview: “If you want to keep the community safe, you can’t have home quarantine without compliance checks.”

“You can’t do physical compliance checks at the scale needed to support the (social and economic) reopening plan, so you have to use technology.”

But human rights advocates warned that the technology may be inaccurate and could open the door for law enforcement agencies to use people’s data for other purposes, with no specific laws in place to stop them.

Toby Walsh, a professor of artificial intelligence at the University of New South Wales, said: “I am troubled not only by the use here, but also because it is an example of the creeping use of this technology in our lives.”

Walsh questioned the overall reliability of facial recognition technology, saying it could be hacked to provide false location reports.


“Even if it works here…then it validates the idea that facial recognition is a good thing,” he said. “Where does it end?”

The Western Australian Government said it prohibits the police from using data collected by COVID-related software for non-COVID matters. Western Australia Police said they have used the facial recognition technology to oversee the home quarantine of 97,000 people without incident.

“There should be a law preventing quarantine surveillance systems from being used for other purposes,” said former Australian Human Rights Commissioner Edward Santow, who now leads an artificial intelligence ethics project at the University of Technology Sydney.

“Facial recognition technology may seem like a convenient way to monitor people in quarantine, but… if something goes wrong with this technology, the risk of harm is high.”

© Thomson Reuters 2021