How would you feel if big data analytics were to decide on your freedom and your actions? And if these analyses were to determine the course of your trial? Are we sufficiently aware of the impact this has on an individual’s life?

Screening

Invoking national security, governments are monitoring their citizens ever more closely: database screening and digital watchlisting systems have become the trend of recent years. The fight against the Covid-19 pandemic seems to reinforce this massive data collection, with health as the trump card.

Since Snowden revealed in 2013 that governments have access to the data of numerous (not only American) citizens through all kinds of technical and legal channels, our awareness of and concern about this have only grown. Even though governments have legally regulated their access to datasets containing both public and private data, the problem is not only governmental. Private companies (especially the technology giants) gain ever more access to massive amounts of data, which can lead to an unprecedented imbalance of power between citizen/consumer and government/corporation. The more that is known about you, the more precarious your position as an individual becomes.

Blacklisting

Big data technologies make it possible to detect and isolate data that stands out from digitally generated data, including associative and correlative data, data patterns and algorithmically matched data. It is important to keep in mind that both association (a pattern that is too strong to be purely coincidental) and correlation (a form of association that measures how closely a relationship follows a linear trend) can be either strong or weak.
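
To make this distinction concrete, here is a minimal sketch in Python (with purely hypothetical toy data, not drawn from any real screening system) of how the strength of a correlation can be quantified with the Pearson coefficient, where values close to +1 or -1 indicate a strong linear association and values close to 0 a weak one.

```python
# Minimal sketch with hypothetical toy data: quantifying a strong versus a weak
# correlation using the Pearson coefficient (Python 3.10+ for statistics.correlation).
from statistics import correlation

hours_online = [1, 2, 3, 4, 5, 6, 7, 8]          # hypothetical tracked behaviour
purchases    = [2, 4, 5, 8, 10, 12, 13, 16]      # follows hours_online closely
shoe_size    = [43, 39, 45, 38, 44, 40, 42, 41]  # unrelated to hours_online

print(f"strong: {correlation(hours_online, purchases):+.2f}")  # close to +1.00
print(f"weak:   {correlation(hours_online, shoe_size):+.2f}")  # close to  0.00
```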

We also refer to these data that stand out from the analysis as ‘suspicious data’.

Linking these suspicious data to a series of consequences for the individual, whether by a government or a private company, obviously has an enormous impact. After all, big data technologies can create a suspicion of guilt when a person is linked to suspicious digital data or results from the screening of databases. This individual then ends up on the ‘blacklist’.

Big data blacklisting is the process by which individuals are categorized as ‘guilty’ on the basis of suspicious digital data and the results of database screening, until their innocence has been proven, thereby excluding them from certain rights or benefits.

Black Box Society

The mechanism that leads to blacklisting implicitly shifts the burden of proof to those who are digitally blacklisted.

A potential risk here is the ‘Black Box’ problem: in a ‘Black Box Society’, there is often no access to the evidence of how an individual ended up on the blacklist, because the underlying algorithms and databases cannot be interrogated.

Moreover, it is a mistake to think that big data analysis uncovers causal relationships; in reality it only reveals correlations, which again can be strong or weak. The problem with correlations is that they are not always interpreted correctly: they are often generalized across a population group and therefore do not necessarily apply to the individual, with the result that individuals are wrongly placed on the blacklist.
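
To illustrate why this matters, here is a minimal sketch (a hypothetical simulation, not any real screening system) of how a risk score that genuinely correlates with suspicion at the level of the group can still blacklist mostly innocent individuals once it is turned into a flagging rule.

```python
# Hypothetical simulation: a score that correlates with "suspicion" across the
# population is used as a blacklisting threshold, yet most flagged people are innocent.
import random

random.seed(0)
N = 100_000
people = []
for _ in range(N):
    actually_suspicious = random.random() < 0.001              # 0.1% of the group
    # A behavioural score that is merely correlated with suspicion, not proof of it.
    score = random.gauss(0.7 if actually_suspicious else 0.5, 0.1)
    people.append((actually_suspicious, score))

# Blacklisting rule derived from the group-level correlation.
flagged = [(s, sc) for s, sc in people if sc > 0.65]
innocent_flagged = sum(1 for s, _ in flagged if not s)

print(f"blacklisted: {len(flagged)}")
print(f"of whom innocent: {innocent_flagged} ({innocent_flagged / len(flagged):.0%})")
```

With these made-up numbers, roughly 99% of the people on the resulting blacklist have done nothing wrong: the correlation describes the group, not the individual.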

Conclusion

A decision that is made via big data technologies must not shy away from transparency. If the trap of the Black Box is not avoided, such a decision will often not be in accordance with the presumption of innocence, which is essential to the principle of freedom.

Want to know more about big data blacklisting? Don’t hesitate to contact us at hallo@dejuristen.be or read the next blog in this series!

Written by Emiel Koonen, Legal Adviser at theJurists, and Kris Seyen, Partner at theJurists
