Thermal cameras: the aim does not justify the means!

21 April 2022 | GDPR, Privacy

Automation. With technology, it seems so obvious. And we are all better off for it, aren’t we? And yet …

When technology plays with our personal data, we still do not seem to have developed the reflexes to protect it. And protecting it often comes at the expense of convenience, of added value and usefulness, even of fighting a pandemic.

Digital fever scanners? They may have disappeared from the scene for a while, but let the penalties for the airports of Zaventem and Charleroi be a wake-up call to respect GDPR principles.

Temperature measurement as processing of personal data

The gradual resumption of social and economic life during the pandemic was not without its difficulties. It is, for example, not unreasonable that persons with a fever were denied access to premises in order to prevent further infections.

It is also to be expected that technological solutions will be sought to make this as efficient as possible.

The airports of Zaventem and Charleroi have chosen to implement this for a certain period of time via a so-called advanced thermal camera system.

Temperature readings only

The mere measurement of a person’s temperature using a conventional or digital thermometer does not in itself constitute the processing of personal data. At least, insofar as these results are not individually registered afterwards.

A first danger lurks just around the corner: if, also in the context of an access control, this “scan” is filmed, or is accompanied by the simultaneous reading of an access badge (and the refusal of access), then the individual can be identified. And the GDPR will therefore apply in full.

Temperature measurement with registration

It becomes more concrete when a manual temperature measurement is explicitly linked to an additional registration – the temperature itself does not even have to be recorded. It may be, for instance, that the refusal of access is documented (e.g. within the framework of a contractual relationship).

The GDPR then applies without question. The starting point is that the processing of such medical data is fundamentally prohibited.

Advanced electronic measurement

However, when a digital advanced fever scanner (a kind of thermal camera) is used, this falls under the GDPR in any case – even if nothing is stored or recorded afterwards.

This is because the term ‘processing’ does not only refer to the mere storage of data, but has a much broader scope. In other words, even collection without any further storage or recording counts as automated processing: the data are, after all, first processed electronically.

The same misconceptions keep surfacing around automated processing, or processing where the purpose does not lie in the storage itself: “But it is not being stored”, or “But I am not interested in that data”, or even better, “But it is already deleted after 15 minutes”. Unfortunately, that is not the point.

Fundamental prohibition of health data processing

At the peak of the pandemic, theJurists regularly had to advise on the permissibility of such systems in companies. The main obstacle each time was the fundamental prohibition on processing medical personal data, unless one could invoke a specific exception. But those exceptions are few and narrowly construed.

The temperature checks in Brussels and Charleroi

There was certainly chaos and improvisation. And urgency. Difficult circumstances and good intentions, too. And yet, the Data Protection Authority ruled that the airports of Brussels and Charleroi had made a serious mistake, and immediately imposed fines of EUR 200,000 and EUR 100,000 respectively.

Where did it all go wrong?

No one disputed that the contested temperature checks involved processing of personal data and were therefore subject to the GDPR. Thermal cameras were used during the first-line checks, and these automated systems are in any case subject to the GDPR. However, the passengers were also filmed, and these images were made available to the operator for a short period of time so that the operator could remove people with a high temperature from the queue and have them do a second walk-through.

The images were given a frame: red for ≥ 38° Celsius and green for < 38° Celsius. This immediately involved medical data, which enjoy special protection.
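The classification itself is trivial, which underlines the DPA’s point: even a threshold check as simple as the sketch below turns a temperature reading into health data about an identifiable person. (This is an illustrative sketch only; the actual camera software used at the airports is not public, and the function and constant names are our own.)

```python
# Illustrative sketch of the red/green framing logic described in the
# DPA decision. Not the airports' actual software.
THRESHOLD_CELSIUS = 38.0  # threshold reported in the decision

def frame_colour(temperature_celsius: float) -> str:
    """Return the frame colour shown around a passenger's image."""
    return "red" if temperature_celsius >= THRESHOLD_CELSIUS else "green"
```

The simplicity is exactly the trap: because the output (“red”) directly reveals something about a person’s health, special-category data are processed the moment the frame is drawn, regardless of whether anything is stored.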

The DPA has taken this opportunity to make a number of things very clear. Covid-19 may be off the table for the time being, but the principles elaborated in the decision are guiding many other discussions on the application of the GDPR.

No valid legal ground

Processing can only take place if it is lawful. This is not the same as simply deciding that you are OK with the processing: lawfulness must be based on one of the legal grounds listed in the GDPR.

Furthermore, when it comes to medical data, the processing is in principle prohibited, unless one can invoke one of the specific grounds for exemption that are also listed in the GDPR.

The airports invoked their task in the public interest (legal ground), as well as the Protocol for Commercial Aviation and the overarching Ministerial Decree (legal basis as grounds for exception).

Both arguments were dismissed: the protocol is not a precise legal norm whose application is foreseeable for those concerned. Nor was it demonstrated that the processing was necessary for the public interest.

Failure to comply with the obligation to provide information

The absence of a valid legal ground may be a very technical discussion, but the DPA’s finding that the airports also failed in their information obligation is significant.

Here we enter the realm of privacy statements. We too have noticed that these are often very ambiguous and vague, and thus do not meet the transparency requirements of the GDPR. Yet this is one of the cornerstones of this regulation.

It seems very likely that the airports, like many companies, embellished a ‘model declaration’ with some generalities, without providing clarity that can be understood by those concerned. This leads on the one hand to marketing talk (“your health and safety are important to us”), and on the other to vagueness such as a retention period that is “no longer than necessary”.

The DPA is once again putting the spotlight on the importance of real transparency.

A defective data protection impact assessment

Equally significant is the finding that the data protection impact assessment (DPIA) was not carried out properly.

A DPIA is a necessary condition that must be met before certain processing activities are started.

Even though the GDPR is vague about what this DPIA should look like, you had better make sure it is done seriously and thoroughly.

What if your DPIA shows that the intended processing of personal data entails a high privacy risk for those involved, and you cannot find sufficient measures to limit this risk? You will then have to consult with the DPA before starting the processing. This is called a prior consultation.

Conclusion

Automated processing of personal data, especially when it concerns medical data, should not be taken lightly. After all, a valid legal ground cannot simply be conjured up. Necessity may break the law, but it does not break the GDPR.

It also makes it clear once again that transparency is essential: say what you do, and do what you say. A good privacy statement is therefore certainly not a “one size fits all”!

Moreover, if you find yourself in a situation where a DPIA might be necessary, it is better to be safe than sorry: do serious work on it before starting the processing. And take the results into account!

Do you also feel the need to be compliant, but can’t see how? Talk to our experts about it at hallo@dejuristen.be. We will identify the needs of your company, and work out ready-made solutions together with you!

Written by Kris Seyen, Partner theJurists
