As guests arrive at eastern Australia’s Warilla Hotel, a small camera equipped with facial recognition software scans their faces as part of a scheme to tackle problem gambling.
The tech – which uses artificial intelligence (AI) to identify addicts who have asked to be barred from gambling venues – is set to be rolled out across gambling venues in the state of New South Wales next year.
Supporters say it will help curb problem gambling in a country where the addiction affects about 1 percent of the population and annual losses run to billions of dollars.
But the technology is “invasive, dangerous and undermines our most basic and fundamental rights”, said Samantha Floreani, programme lead at the non-profit group Digital Rights Watch.
“We should be exceptionally wary of introducing it into more areas of our lives and it should not be seen as a simple quick-fix solution to complex social issues,” she said.
The Warilla Hotel did not respond to requests for comment. Its website states it supports “responsible” gambling.
The AI scheme’s organisers, industry bodies ClubsNSW and the Australian Hotels Association NSW (AHA NSW), said “strict privacy protections” were in place.
Facial recognition systems use AI to match live images of a person against a database of images – in this case, a gallery of people who have voluntarily signed up to a “self-exclusion” scheme for problem gamblers.
If the camera identifies someone on the state-wide database, a member of staff is alerted so the person can be denied entry to casinos or escorted away from slot machines in hotels and bars.
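The matching step described above – comparing a live image against a gallery of enrolled faces – can be sketched roughly as follows. This is an illustrative assumption, not a description of the ClubsNSW system: the embedding model, gallery format, and similarity threshold are all hypothetical, and random vectors stand in for the output of a real face-recognition model.

```python
import numpy as np

rng = np.random.default_rng(0)

def cosine_similarity(a, b):
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_gallery(live_embedding, gallery, threshold=0.9):
    """Return the ID of the best gallery match above threshold, else None.

    In a real system the gallery would hold embeddings of people who
    voluntarily enrolled in the self-exclusion scheme; here it is toy data.
    """
    best_id, best_score = None, threshold
    for person_id, enrolled in gallery.items():
        score = cosine_similarity(live_embedding, enrolled)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id

# Toy gallery of "self-excluded" embeddings (placeholders, not real data).
gallery = {f"person_{i}": rng.normal(size=128) for i in range(3)}

# A live scan close to an enrolled face matches; an unrelated face does not.
live = gallery["person_1"] + rng.normal(scale=0.01, size=128)
print(match_against_gallery(live, gallery))                   # person_1
print(match_against_gallery(rng.normal(size=128), gallery))   # None
```

The threshold is the operational lever: set too low, patrons who never enrolled are misidentified and stopped; set too high, enrolled gamblers slip through – the bias and misidentification risks critics raise later in this piece.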
“We think this is the best opportunity we’ve got in preventing people who have self-excluded from entering the venues,” said John Green, director of AHA NSW.
The data collected will be secured and encrypted and will not be accessible to any third parties, including the police and even the gambling venues themselves, said Green.
However, digital rights groups said the tech was ineffective at stopping problem gambling and could go on to be used for wider surveillance, adding that such projects underline the need for tougher privacy and data rights laws to protect citizens.
“People who opt into self-exclusion programmes deserve meaningful support, rather than having punitive surveillance technology imposed upon them,” said Floreani of Digital Rights Watch.
“And those who have not opted into these programmes ought to be able to go to the pub without having their faces scanned and their privacy undermined.”
Digital rights campaigners want Australia’s 1988 Privacy Act to be reformed to better address the use of facial recognition technology, and clarify when and how it can be used.
Facial recognition technology is increasingly used across the globe for everything from unlocking mobile phones to checking in for flights. It has also been adopted by some police forces.
Advocates say it helps keep public order, solve crime and even find missing people.
Critics say there is little evidence it reduces crime and that it carries an inherent risk of bias and misidentification, especially for darker-skinned people and women.
Gambling industry bodies have said the facial recognition cameras would only be used to enforce the self-exclusion scheme.
But a draft law introduced in New South Wales’s parliament last month, which would formally legalise the tech in clubs and pubs, includes language that would allow other uses, such as identifying people banned for being too drunk.
“There’s a capacity for scope creep, the capacity for this to facilitate further uses,” said Jake Goldenfein, a senior lecturer at Melbourne Law School, who studies technology.
He called for more regulation on facial recognition due to the sensitivity of the data captured and the heightened risks from data breaches.
“Facial templates are … not something we can change. If we lose control over our biometric information, it becomes particularly dangerous,” he said.
Advocates for reform have pushed for measures such as reduced opening hours of gambling venues and limits on the value of bets.
The use of facial recognition technology is the industry’s way of delaying such reforms and is unlikely to have a “practical effect” on problem gambling, said Tim Costello, chief advocate at the Alliance for Gambling Reform, a pressure group.
“The clubs are trying to look proactive … it’s complete window dressing to stop real reform,” he said.
Green at AHA NSW said a survey of self-excluded gamblers found that more than eight in 10 respondents felt using facial recognition would be effective.
There is growing pushback against facial recognition in Europe, the United States and elsewhere, with companies including Microsoft and Amazon ending or curbing sales of the technology to the police.
In Australia, retail giants Bunnings and Kmart halted the use of facial recognition technology to monitor customers in their stores earlier this year after the country’s privacy regulator opened an investigation into whether they had broken the law.
Consumer rights group CHOICE, which referred the brands to the regulator, said the tech was “unreasonably intrusive” and “customers’ silence cannot be taken as consent” to its use.
The Australian Human Rights Commission last year called for a ban on facial recognition technology until it is better regulated with “stronger, clearer and more targeted” human rights protections.
“There are questions that existing law doesn’t have very good answers to,” said law lecturer Goldenfein.
“There’s so many ways to help problem gamblers that the idea that facial recognition technology is the solution is, frankly, preposterous.”