Ireland’s Data Protection Commission (DPC) is reported to have handed out €1.64 billion in GDPR fines in the year to January 2023. Although a large proportion of these fines were issued to social media companies, the increase in penalties has nevertheless been widely interpreted as a message to organisations in all sectors and industries about how they handle personal data and how closely they adhere to the GDPR.
Since May 2018, European regulators have reportedly issued €2.92 billion in GDPR fines, with a 50% year-on-year increase in the value of fines Europe-wide during this time.
Ireland issued 66% of all EU data breach fines from January 2022 to January 2023, with social media companies bearing the brunt of these penalties. Most notable was the €390 million fine issued to Meta Ireland for breaching GDPR regulations: €210 million in relation to its handling of users’ data on Facebook, and €180 million for similar breaches on Instagram.
The enforcement of these swingeing penalties poses serious questions for companies, both within and outside the social media sector, about how consumer and user data is handled, as well as how online services can be funded in the future.
Transferring personal data between countries
In particular, the transfer of personal data to third countries has been thrown into focus. Under GDPR regulations, such transfers are prohibited if there is any possibility, however remote, of foreign governments or other agencies being able to access the data.
As a consequence, experts such as John Magee, Head of Data Protection at global law firm DLA Piper, argue that a risk-based approach to cyber security and data protection, i.e., identifying and prioritising the highest compliance risks before moving on to lower ones, is no longer adequate in this regard.
This is important because, in an increasingly global economy with a growing worldwide remote workforce, UK and EU companies that hire workers outside their own national or EEA borders need to be especially vigilant about protecting personal data.
Given the sensitive data associated with employment records, it is crucial that companies understand how to manage cross-border payroll risks, not only to protect valuable data but also to avoid falling foul of GDPR sanctions.
In particular, UK and EU organisations of all sizes need to be aware that further protocols could potentially be applied to some international transfers of personal data. These could include the supplementary measures required following the so-called ‘Schrems II’ ruling, applied alongside the transfer safeguards set out in Chapter V of the GDPR.
In broad terms, this means that if an individual’s data is protected under GDPR, those protections (or an equivalent standard of appropriate safeguards) should continue to be applied even when that data is transferred to a country outside the UK or beyond the jurisdiction of the EU.
Potential challenges posed to data protection by AI
A further area in which regulators are focusing their attention is Artificial Intelligence (AI). AI models are trained in part on personal data, including images used in facial recognition technology, and several fines have already been issued following complaints by organisations tasked with protecting individuals’ digital rights.
Both providers and users of AI are likely to face growing regulatory scrutiny, especially as AI and other forms of machine learning become more widely implemented and adoption rates increase across all sectors.