
Say 'cheese': A temperature scanner took your photo. But wait. Why?

Source: Business Times
Article Date: 07 Jan 2021
Author: Olivia Poh

Some of the information that is captured may be compromised, which is a huge cause for concern. There is also the issue of consent.

Seemingly innocuous, temperature-taking cameras with facial-recognition capabilities have been installed across the country. Thousands of people walk up for a temperature check, but are presented with a photo opportunity as well.

A quick snap, and a headshot appears with the user's temperature recorded - but wait, what was the photo taken for and why would operators need these photos?

In view of recent discussions over the use of TraceTogether data, it may be timely to look into how and why private operators are collecting photo and temperature records of the public at entry points of buildings and businesses.

According to device specifications seen by The Business Times, temperature scanners that have front-facing cameras installed are able to keep a log of facial records.

These machines can also recognise faces despite different expressions and headgear being worn. One even claims to accurately identify faces with masks on.

Temperature scanners with camera functions can supposedly detect a user's forehead from farther away, read temperatures faster, and tag temperature and timestamps to users to help in contact-tracing efforts.

BT has also seen one that can alert visitors if they are not wearing their masks properly.

With video-logging functions and facial recognition, contact tracing can also be done via video analytics, according to the specifications of certain makes. Some devices can also provide an additional layer of monitoring that logs facial images for general security surveillance.

These extra capabilities can cost up to a few thousand dollars more per device.

But in this case, more might not be better.

The darker side may not be apparent at first glance: The ways user data from these facial-recognition devices are stored, managed and safeguarded are often unclear.

Vendors for these camera-based temperature scanners told BT that it is up to operators to activate the device's photo-taking function, which enables them to keep a log of up to thousands of visitors.

But there is no way for visitors to know whether operators have chosen to use this function.

Most of these photo-taking temperature scanners are deployed privately for staff in hotels, industrial sites or office buildings, but many are also in shopping malls and retail establishments.

Because visitors are not told that their photos are being taken, they cannot give explicit consent to the collection of their data.

Individuals will also be unaware whether their photos have been stored, or whether any safeguards have been put in place to protect this data.

Furthermore, operators can choose to keep these photos for up to 60 days, in line with personal data protection regulations. This is well beyond the Covid-19 incubation period of 14 days. Whether protocols are in place to manage the collected data responsibly is also at the operator's discretion.

Should such sensitive data be so loosely handled? Security experts told BT that most of these devices have dismal security protection at best. In other words, they can be easily accessed by bad actors, especially if operators connect these devices to the local WiFi.

Ori Sasson, chief executive of cybersecurity intelligence firm Blackscore, said that in certain scenarios, these images can be used to fool facial recognition or other image-recognition identification systems.

Hackers can use software to animate stolen facial records, bypassing some in-app facial recognition systems. This might thus allow them to access bank accounts or carry out scams, especially if they also have access to the user's locked phone.

Sanjay Aurora, senior vice-president and managing director (Asia-Pacific and Japan) of cybersecurity firm Darktrace, said that unprotected facial-recognition data are a cause for "extreme concern".

If a person's facial records are compromised, it may already be too late: it is much easier to change a password than a face.

Some may argue that little or no sensitive information is given up, since faces are covered by masks; others may point out that certain devices - such as those with facial-recognition technologies embedded - have more robust security protection.

But that would be skirting the issue at hand. At the core of data privacy and protection is the principle that operators should not be collecting information from users without consent, or unnecessarily.

Without a good reason, operators are left with additional data to protect for little or no added value, entailing unnecessary risks and liability.

David Roi Hardoon, former chief data officer for the Monetary Authority of Singapore and currently a senior adviser for data science and artificial intelligence at several organisations, said that any data that is collected but not used poses a risk - no matter how small - for organisations.

"It is thus critical to minimise risks upfront by not collecting what is not needed and that which is not planned to be used."

Mr Aurora of Darktrace called for "carefully crafted regulation" on how organisations use facial-recognition technology, including how they collect and handle the data.

Dr Hardoon agreed, saying: "If we truly want to drive adoption in new technologies, we need to have accountability and a responsibility to make sure data is used properly… It is not only about coming up with principles and guidelines, but operationalising them as well as educating the wider population."

Source: Business Times © Singapore Press Holdings Ltd. Permission required for reproduction.
