
The trial of facial recognition technology (FRT) by supermarket chain Foodstuffs North Island (FSNI) complied with the Privacy Act, and the technology can be used with appropriate guardrails to protect individuals’ privacy.
That is the overall finding by Privacy Commissioner Michael Webster, who this week released his review of the FRT trial.
FSNI trialled facial recognition in 25 supermarkets between February 8 and September 7 last year. The Privacy Commissioner’s report said nearly 226,000 faces were scanned during that time, a figure that includes multiple scans of the same person.
Of those scans, 99.999 per cent were deleted within one minute, the inquiry found.
During the trial, FRT produced 1742 alerts, of which 1208 were confirmed matches. When a potential match was flagged, an alert was sent to a handheld FRT Zebra device carried by authorised staff.
The Zebra device is about the size of a smartphone, operates on the in-store network only, and cannot upload individual matches to the supermarket’s watchlist used to verify persons of interest.
Staff can also edit alerts and add notes to the FRT watchlist with the device.
The Zebra devices had limitations: the images they display are extreme close-ups of the face only, with no hair or clothing shown. This made it harder for staff to accurately locate and identify individuals, and increased the risk of misidentification, the Commissioner noted.
Among the Privacy Commissioner’s recommendations were bigger image sizes for the devices, which should also give staff real-time notifications if a misidentification occurred or a process step wasn’t followed.
The report also noted that FRT’s technical accuracy can be affected by real-world conditions such as lighting, movement and facial occlusions.
Bias and misidentification risks with technology
Although the FSNI trial was generally compliant with the Privacy Act, the Commissioner noted the potential for misidentification and bias. Beyond the Zebra devices needing improvement, the Commissioner pointed to documented evidence that demographic characteristics such as skin tone can lead to higher error rates.
For New Zealand, this is a particular concern as there is no specific training data for the country’s population. This makes it difficult to be confident about accuracy for Māori and Pasifika people in particular.
The Commissioner noted that even with low error rates, scanning millions of people’s faces can result in a substantial number of misidentifications. Of the 1742 alerts, nine were mistakes; however, due to data limitations, the actual number could be higher, and the misidentifications caused harm to the individuals in question.
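The Commissioner’s point about scale can be illustrated with a simple back-of-the-envelope calculation. The error rate and scan count below are hypothetical examples, not figures from the report:

```python
# Illustrative only: hypothetical numbers, not figures from the report.
# Even a tiny per-scan error rate produces many misidentifications at scale.

def expected_misidentifications(scans: int, error_rate: float) -> float:
    """Expected number of erroneous matches, given a per-scan error rate."""
    return scans * error_rate

# A 0.01 per cent error rate across one million scans still means,
# on average, 100 people wrongly flagged:
print(expected_misidentifications(1_000_000, 0.0001))  # 100.0
```

This is why the report treats low headline error rates with caution: the harm scales with the number of faces scanned, not with the error rate alone.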
The Commissioner recommended a minimum trigger threshold of 92.5 per cent before staff intervene, with a potentially higher threshold of 94 per cent once FRT is fully operational.
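In practice, such a threshold acts as a gate on the system’s match score before any alert reaches staff. Here is a minimal sketch, assuming the system reports a similarity score between 0 and 1; the function name and score format are illustrative, and only the 92.5 and 94 per cent figures come from the report:

```python
# Sketch of threshold-gated alerting. Only the 92.5% / 94% figures
# come from the Commissioner's report; everything else is assumed.

TRIAL_THRESHOLD = 0.925        # recommended minimum trigger during the trial
OPERATIONAL_THRESHOLD = 0.94   # potentially higher once FRT is operational

def should_alert(similarity: float, operational: bool = False) -> bool:
    """Raise an alert for staff only when the match score clears the threshold."""
    threshold = OPERATIONAL_THRESHOLD if operational else TRIAL_THRESHOLD
    return similarity >= threshold

print(should_alert(0.93))                    # True  (clears the 92.5% trial bar)
print(should_alert(0.93, operational=True))  # False (below the 94% bar)
```

Raising the threshold trades fewer false alerts against the risk of missing genuine matches, which is why the report pairs it with human verification of every alert.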
Staff need robust training and refreshers to appreciate the seriousness of misidentification and to exercise human judgement when confirming match accuracy. They also need to be aware of unconscious bias, particularly given the misidentification risks FRT itself poses for Māori and Pasifika customers.
Chilling effect from increased surveillance
FRT is not a technology most people are comfortable with, the Commissioner’s report noted.
Many complained that FRT breached their privacy, saying they felt under surveillance and that their right to free movement was being violated.
Furthermore, supermarket shoppers weren’t universally aware that FRT was being trialled, despite signage: the report found just 67 per cent knew the technology was being used.
Since live FRT can be used for surreptitious real-time monitoring of large groups of people, it is considered highly intrusive and contributes to a culture of surveillance, the report found. FRT can also be used to target people based on their past behaviour, which is known as profiling.
This raises privacy and ethical concerns, particularly since earlier behaviour isn’t always indicative of future actions. The Commissioner also noted the sheer scale of biometric information gathering from all customers entering and moving through the supermarkets is a major concern.
Reduction in violence seen during trial
Justice Minister Paul Goldsmith welcomed the report, saying FRT “is an effective way of combating retail crime”.
“It found the technology is effective at reducing harmful behaviour towards retailers, especially serious violent incidents,” Goldsmith said.
The number of violent incidents saw a steep drop during the trial, the Commissioner’s report said, suggesting FRT has a strong deterrent effect.
“... putting all the data together with our own observations, we accept that the evaluation demonstrates that introducing FRT made a clear difference in FSNI trial stores, especially as it relates to the most serious incidents involving violence,” the report stated.
“Twelve out of 85 incidents included an element of physical violence in the first half of the trial, which decreased to one out of 48 incidents including an element of violence in the second half,” the report said.
Carolyn Young, the chief executive of Retail New Zealand, also pointed to the safety aspects FRT is said to bring.
“Retailers are crying out for proactive solutions that prevent crime and enhance the safety of their staff and customers. Our members continue to face high rates of violence and crime, putting both their employees and the public at risk, as well as threatening the financial sustainability of retail businesses,” Young said.
The association said retail crime affects nearly all retailers and costs over $2.6 billion a year.
Overall, the Privacy Commissioner said FRT has potential safety benefits, but the technology also raises significant privacy concerns, such as over-collection of personal information and unfair application.
“Organisations should consider the seriousness of the problem they’re trying to solve and consider what other options are available. If FRT is considered necessary, its use should be done in a way that minimises the associated privacy risks,” the Commissioner said.
FRT will be covered by the Office of the Privacy Commissioner’s Biometric Privacy Code, expected to be published in the middle of the year.
2 Comments
There is some sort of camera / IT phobia going on here.
Thus the commissioner looks at the new system like it's a big deal.
As for a staff member close to the door seeing and identifying troublemakers. And sometimes getting it wrong. No worries.
Not sure that's the right take though. FRT is able, as the story says, to monitor large numbers of people and their movements across stores, tirelessly, unlike humans. That is something the Privacy Commissioner would have to consider, as "mission creep" with the new tech is really tempting for FRT operators.