“Without trust, the benefits the technology may offer are lost”

By Dr Sue Chadwick, Strategic Planning Advisor, Pinsent Masons LLP, and Research Fellow, Open Data Institute

“Without trust, the benefits the technology may offer are lost” is a key phrase from a recent blog by the Information Commissioner on the use of Live Facial Recognition Technology (LFR) in public places, supported by a formal Opinion.


This is an important topic for the property industry as we explore the use of surveillance technologies in spaces such as leisure venues, shopping centres and residential developments. These technologies have enormous potential to make those spaces safer and healthier, and to generate useful data to improve the wider environment. But without the proper safeguards, they represent a risk – not just of breaching data protection law, but also of reputational damage. As the Commissioner noted, none of the six organisations investigated was fully compliant, and “All of the organisations chose to stop, or not proceed with, the use of LFR”.


There are three main points to highlight from the Opinion:


Scope of technology


We tend to think of LFR as a kind of CCTV, but unlike CCTV, once an image is captured it can be shared with other parties and merged with data on pooled watchlists. The Commissioner describes LFR-enabled billboards where embedded cameras categorise individuals by demographic characteristics including “age, sex, gender, ethnicity, race, and even clothing styles or brands”, enabling a visitor in one location to be “identified at other locations or on a return visit and served with targeted advertising”. The Opinion points to the potential for LFR to be used for age identification at point of sale and for queue time monitoring and management in airports, and for LFR data to be incorporated into larger ecosystems, compared against images from social media and analysed with machine learning technologies including sentiment analysis.


It’s not just GDPR


The Opinion looks only at the GDPR implications of using LFR, but it recognises that there is a “high bar for its use to be lawful”. It references a number of studies showing that LFR can operate in a discriminatory way, which raises wider equalities issues, and a recent case in which LFR was found to engage and breach Article 8 of the European Convention on Human Rights, one of the Convention rights incorporated into UK law by the Human Rights Act 1998. Reputational risk is also highlighted: the Opinion notes that LFR has been banned in San Francisco, while IBM, Amazon and Microsoft have all suspended the sale of their facial recognition products to police forces.



Key recommendations


LFR and other biometric-based technologies offer important benefits to the property industry, but we need to take more care when thinking of embedding them into buildings or spaces, whether public or private. Here are a few recommendations:

  • Careful procurement. As the Opinion recognises, this sort of technology is often supplied by third parties, and there is an opportunity to manage and mitigate risks up front through procurement that anticipates the potential issues.

  • DPIA. Data protection impact assessments are an essential part of ensuring that the use of LFR is compliant with data protection law and the Opinion has a useful Annex exploring how they should be used.

  • EQIA. A good DPIA will include an assessment of equalities impacts, but if you are preparing a separate assessment as part of a planning application, you could – and perhaps should – include a section on digital impacts and benefits.


Finally, and as noted in a recent RICS Modus article, you may not need a separate person looking after data ethics, but you should be building data ethics into your property practice, and the RED Foundation Data Ethics Principles are a good place to start.

