pjansta

Thermal Screening: is everyone a medical expert now?


Last year I published a short article with a similar title, in which I expressed concerns about the rapid emergence of cameras that detect skin surface temperature. This second edition provides a more detailed analysis of the growing problems in this sector. Before we dive into the technical part, I will lay out the legalities surrounding the topic.

General Data Protection Regulation (GDPR)

The GDPR regulates the processing of personal data in the EU, and the UK GDPR sets out the general data protection regime for most UK businesses and organizations. The key terms are:

· Data Controller

The end-user: for example, a shopping mall equipped with a security camera system is the controller of the video surveillance data it collects. (Companies that keep personal data on their employees are also data controllers.)

· Data Processors

Parties that handle personal data on behalf of the end-user: for example, cloud providers that store recordings, or integrators and manufacturers that directly handle video recording data on the end-user's behalf.

· Data Subjects

The people being recorded on camera.

· Biometric Data

Data produced by any technique that uniquely identifies a person: for example, video analytics such as facial or age recognition.


Impact and Considerations

Consultants will advise the end-user to put up signs indicating that video surveillance is taking place. The GDPR goes further and specifies the details to be provided, such as:

  • The identity and contact details of the data controller

  • The purposes of the processing for which the personal data are intended as well as the legal basis for the processing

  • The period for which the personal data will be stored, or if that is not possible, the criteria used to determine that period

  • Informing data subjects of their right to lodge a complaint with a supervisory authority

  • The existence of the right to request access, rectification, and removal of the data

  • The contact details of the Data Protection Officer (DPO), if one has been appointed

  • If the data will be transferred to another country, the relevant safeguards in place

  • Recipients of the personal data (if other than the end-user)

  • The existence of automated decision-making, including profiling and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject (facial recognition and biometric techniques)


Furthermore, the GDPR stipulates that a Data Protection Impact Assessment (DPIA) be conducted before any system is installed. The assessment must contain at least the following elements (a simple way to record them is sketched after the list):

  • A systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller

  • An assessment of the necessity and proportionality of the processing operations in relation to the purposes

  • An assessment of the risks to the rights and freedoms of data subjects

  • The measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with the Regulation, taking into account the rights and legitimate interests of data subjects and other persons concerned
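
To make these elements concrete for integrators documenting a deployment, here is a minimal sketch of how they could be captured as a simple record. This is not a legal template; the field names and the example values are illustrative assumptions only.

    # Minimal sketch only: field names and example values are illustrative
    # assumptions, not a legal template for a DPIA.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class DPIARecord:
        # Systematic description of the processing and its purposes
        processing_description: str
        # Legitimate interest pursued by the controller, where applicable
        legitimate_interest: str
        # Necessity and proportionality of the processing relative to the purposes
        necessity_and_proportionality: str
        # Risks to the rights and freedoms of data subjects
        risks_to_data_subjects: List[str] = field(default_factory=list)
        # Safeguards, security measures and compliance mechanisms
        mitigation_measures: List[str] = field(default_factory=list)

    dpia = DPIARecord(
        processing_description="Thermal cameras at entrances flagging elevated skin temperature",
        legitimate_interest="Reducing infection risk for staff and visitors",
        necessity_and_proportionality="Compared against less intrusive handheld thermometers",
        risks_to_data_subjects=["Health data processing under Article 9",
                                "False readings leading to denied entry"],
        mitigation_measures=["No retention of thermal images",
                             "Human review before any decision is acted upon"],
    )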


The GDPR allows longer storage periods for public interest, scientific, or historical research purposes. An end-user that stores its video recordings indefinitely is likely to violate the GDPR unless it can demonstrate that it is acting for one of these reasons.

Under Article 9, the processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health, or data concerning a natural person's sex life or sexual orientation is prohibited.

However, the GDPR recognizes several exceptions to this prohibition. For video surveillance, the relevant exception is the vaguely termed "reasons of substantial public interest." Individual EU member states are currently defining these public interest reasons, mostly related to law enforcement and crime prevention.


Temperature Screening

The GDPR doesn't mention video surveillance explicitly, but fever screening with cameras is directly affected by it. Article 9 states that the processing of personal data "concerning health" shall be prohibited, while Article 22 gives people the right not to be subject to a decision based solely on automated processing. However, both Articles carry significant exemptions, such as explicit consent or substantial public interest.

In 2020, the UK Medicines and Healthcare products Regulatory Agency (MHRA) stated that most of these products were initially designed for a non-medical purpose and that there is no evidence they support an accurate medical diagnosis.

The agency also advised that anyone selling the hardware and claiming a direct link to virus diagnosis would be prosecuted. Note: there is no scientific evidence to support temperature screening as a reliable method of detecting infection.


Temperature Monitoring Algorithm

The elevated body temperature detection camera market has exploded, with some manufacturers offering dubious solutions, such as self-adjusting algorithms that, in effect, systematically manipulate temperature readings.

For example, Dahua's algorithm takes the ambient temperature and the subject's temperature and uses these values to select an adjustment value, the "compensation temperature." This offset is applied to the surface temperature reading, and the adjusted result is reported as the actual temperature. In practice, readings are systematically manipulated, with fever-level temperatures regularly reported as normal.
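
To illustrate the mechanism, the following is a minimal sketch of how such a compensation scheme can behave. The offset table, thresholds and function name are illustrative assumptions, not Dahua's actual implementation; the point is only that an offset keyed to the raw reading can pull a fever-level value back below a typical alarm threshold.

    # Illustrative sketch only: the offsets and thresholds below are assumptions,
    # not any vendor's actual values. It shows how a "compensation temperature"
    # keyed to the raw reading converges every result toward the normal range.

    # Hypothetical table: (minimum raw reading in deg C, offset applied in deg C)
    COMPENSATION_TABLE = [
        (37.5, -1.0),   # fever-range readings get a large negative offset
        (36.0, -0.3),   # slightly warm readings get a small negative offset
        (0.0,  +0.4),   # cool readings are nudged upward toward "normal"
    ]

    def compensated_reading(raw_surface_temp: float, ambient_temp: float) -> float:
        """Return the reported temperature after applying the compensation offset.

        ambient_temp is accepted to mirror the inputs described above, but this
        sketch keys the offset on the raw reading alone.
        """
        for lower_bound, offset in COMPENSATION_TABLE:
            if raw_surface_temp >= lower_bound:
                return round(raw_surface_temp + offset, 1)
        return raw_surface_temp

    # A raw 38.1 deg C (fever level) is reported as 37.1 deg C,
    # below a typical 37.5 deg C alarm threshold.
    print(compensated_reading(38.1, ambient_temp=24.0))  # -> 37.1

Whatever the exact values a vendor chooses, any scheme of this shape narrows the spread of reported temperatures toward "normal", which is how fever-level readings end up reported as unremarkable.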

Numerous studies have concluded that these devices, now employed globally, are being used to detect possible infectious disease. The implication of inaccurate detection is a widespread false sense of security.

1 comment


Mike Yarrow
17 January 2022

Thermal Screening.

Largely uncontrolled, unenforceable, data is fluid, editable, distributable.

Can be deleted, sold, given, copied and its use subject to abuse for, how many reasons can you think of? A lot.

By whom? Many.

Its collection/recording, storage by whom?

Its efficient, reliable, safe and secure oversight in its entirety by, no one completely.

But is the collection of your image in any way more an infringement upon your person than the details you may give in a questionnaire, job application or governmental document?
