Reuse of Twitter data: Belgian DPA fines NGO for ‘fake news’ study
On January 27, 2022, the Belgian Data Protection Authority (“DPA”) fined the NGO EU DisinfoLab and one of its volunteer researchers (together, the “defendants”) for violating the General Data Protection Regulation (“GDPR”) in connection with a study of tweets posted on Twitter concerning the “Benalla case,” an incident that caused a stir in the French and international media.
The Benalla case and research on fake news
EU DisinfoLab is a Belgian NGO focused on fighting disinformation campaigns and “fake news”. In 2018, the French media revealed a series of incidents concerning Mr. Alexandre Benalla, a security officer of French President Emmanuel Macron. Noting unusually high activity on social media following the Benalla affair, EU DisinfoLab analyzed Twitter posts on the subject in a study titled (in free translation) “Benalla Case: The Workings of Hyperactivism on Twitter”. The study examined how and why the case became such a major topic on Twitter and whether disinformation played a significant role in it. As part of this research, the defendants analyzed the political profile of the authors of tweets relating to the case and found that some could be linked to Russian media outlets such as Russia Today and Sputnik. Faced with criticism after the study was published, the defendants released the raw data, including the Twitter profiles of a large number of individuals.
Following this, a number of affected individuals filed complaints with the DPA and its French counterpart, the Commission Nationale de l’Informatique et des Libertés (“CNIL”), regarding: (i) the reuse of personal data from 55,000 Twitter accounts to carry out the study (in which more than 3,300 accounts were politically classified); and (ii) the online posting of files containing the study’s raw data.
Since EU DisinfoLab is headquartered in Belgium, the DPA acted as lead supervisory authority.
Exceptions for journalism and scientific research?
In its assessment of the case, the DPA distinguished between two processing activities: (i) the EU DisinfoLab study itself, which was based on personal data made public on Twitter; and (ii) the online publication of Excel files containing the raw data underlying the study, which included large amounts of personal data.
In response to the complaints that the processing was not transparent and that data subjects had not received clear information about the processing activities, the defendants invoked the exemptions for processing for journalistic and/or scientific purposes.
The DPA assessed these exemptions and found that the defendants could not rely on the exemption for scientific research, as this requires additional safeguards under Article 89 of the GDPR, such as pseudonymization. These safeguards, and indeed any significant form of internal or external data protection compliance documentation, were lacking.
As regards the exemption for journalistic purposes, the DPA noted that this exemption had not been transposed into Belgian law at the material time. While subsequent Belgian law limits this exemption to data controllers subject to a journalistic code of ethics, no such limitation applied when the facts occurred. Accordingly, the DPA held that the defendants could avail themselves of this exemption, provided the additional criteria were met, and were therefore not obligated to notify data subjects. The DPA went on to state that EU DisinfoLab was exempt from this notification requirement because notifying data subjects could have jeopardized the study and its subsequent publication.
But is the data publicly available on Twitter?
With regard to the legal basis for the processing, the defendants pointed out that the research was based on data which was publicly available on Twitter and which had been published there by the data subjects themselves. Aware that publication is often wrongly treated as consent to further use, the DPA took the opportunity to clarify that personal data published on social networks remains protected by the GDPR. This means that further use of public data must still comply with the principle of purpose limitation, unless a derogation applies.
Derogations from the purpose limitation principle apply where the new purpose is compatible with the original one. If it is not, the processing must be justified by the consent of the data subject or by another legal basis, such as the legitimate interests of the controller.
The DPA acknowledged that the study pursued a legitimate interest of the defendants. However, to rely on Article 6.1.f of the GDPR (i.e., the “legitimate interests” legal basis): (i) the processing must be limited to what is strictly necessary for that purpose; and (ii) the legitimate interests must be balanced against the rights and freedoms of data subjects. The DPA found that the necessity test was not met when the defendants released the raw data. Moreover, given the absence of pseudonymization and the potentially significant impact of this publication on the persons concerned (the DPA mentions risks of discrimination and reputational damage), the DPA concluded that the defendants could not invoke Article 6.1.f of the GDPR as a legal basis for publishing the raw data in support of their study: it found that the risks to data subjects outweighed the legitimate interests of the controllers, and that the controllers had put in place insufficient safeguards (such as pseudonymization) to counter those risks.
In addition, to benefit from the “journalism exception”, the controller must carry out a case-by-case assessment balancing the right to freedom of journalistic expression (the contribution to a debate of general interest) against the right to data protection (the impact of publication). The DPA found that such case-by-case balancing was not possible from the outset, given the large number of Twitter accounts involved (55,000).
The importance of GDPR documentation
As mentioned above, the DPA found that the defendants lacked adequate data protection compliance documentation. For example, the parties had no clear data protection notice, no record of processing activities, and no contracts with their processors.
In addition, the DPA held that the defendants should have carried out a data protection impact assessment, as the research clearly triggered the “high risk” criteria under Article 35 of the GDPR: it covered a large number of data subjects and included sensitive categories of data, such as political affiliation. The DPA explained that the obligation to carry out a data protection impact assessment applies even if the personal data is processed for journalistic purposes.
Interestingly, the DPA felt that a warning would suffice, but the CNIL insisted that a fine would be appropriate. In its decision, the DPA weighed mitigating factors, such as the defendants’ improvement of their GDPR compliance in response to the investigation, their public apologies, and the fact that the exceptions for journalistic and scientific purposes had not been implemented under Belgian law at the time, against aggravating factors, in particular the seriousness of the infringement (which concerned basic principles of the GDPR), the large number of data subjects, and the sensitivity of the data (including political affiliation or opinion). On this basis, the DPA imposed a fine of EUR 2,700 on EU DisinfoLab and a fine of EUR 1,200 on its volunteer researcher.
This decision is a reminder that publicly available information does not fall outside the scope of the GDPR, even when the data is used with the best intentions and for journalistic or scientific purposes.
The full decision is available here (in French).