Adrian Weckler (AW): The German federal data commissioner, Ulrich Kelber, has repeatedly criticised this office in relation to big tech investigations, saying it is moving too slowly. How do you respond to such criticisms?
Helen Dixon (HD): The particular individual commenting there has a role specifically in public-sector enforcement and has no experience of the type of supervision being referenced.
I think anyone who understands the complexity of the new law - the novelty of this one-stop shop that's been created with the cooperation and consistency mechanisms, the fact that there are still independent national DPAs, that the law in the GDPR has to be reconciled with those national laws, and that it also involves EU administrative law in terms of Article 60 [co-operation between national data protection authorities] - would understand that, certainly the first time round, there are a lot of issues that need to be worked out.
AW: Yet he recently compared Ireland's data protection set-up with Germany's 'go-slow' automotive regulator on the diesel emissions issue. Clearly, he is voicing substantial frustration.
HD: We produce an annual report that details all of our activities. You can see that there's a huge span of activities, from individual complaint handling to approving binding corporate rules, to conducting larger-scale investigations, to enforcing against the public services card.
We also receive the annual reports of other data protection authorities. I wonder, if anyone were to assess those annual reports, how such a comment ['go-slow'] would stand up to any kind of scrutiny as regards the effectiveness of data protection regulation.
Up to the end of 2019, just three cross-border cases involving fines had gone through Article 60. None of those were from Germany. Two were from Malta. One was from Lithuania, where the highest fine of €62,000 occurred.
The highest number of concerned data protection authorities involved in decision-making on any of them was three. So there's no particular Ireland story here.
As for that quote you brought up, it was made in the context of the suggestion that the EU should go down the route of a single digital regulator that would effectively handle the large cross-border cases involving big tech companies - just as the EU Commission does in competition law.
It's a matter for the Commission to comment on that, but the EU already considered it when it was creating the one-stop shop under the GDPR, so I think there's very little prospect of that happening in the near term. Neither would it be a silver bullet, because if you look at the EU competition cases that are handled by the EU Commission under Margrethe Vestager, the decisions in these big cases don't happen any quicker anyway. They take several years from start to finish.
AW: So Ireland is not dealing with these big cases too slowly?
HD: There's no particular Irish angle on this unless you want to generate a story or get a few headlines. It's true to say that there is a particular focus from the Germans on Facebook. This is apparent in frequent commentary that is reported from them. So that may well be a reason as to why we've been particularly picked out.
If you look at Germany, there have been none [investigations] of a cross-border nature that have resulted in fines.
If you look at what was the biggest DPA in Europe last year, the ICO, with 700 staff and a large number of cross-border organisations that they supervise, they have yet to bring one through Article 60. France is the same. And the Netherlands, with organisations like Uber and Netflix.
The Dutch commissioner says that they're now running at a backlog of six months, and up to five years, in terms of the volumes coming in. Prior to the application of the GDPR, the Dutch did not have a policy of handling every individual complaint. They applied a policy of 'selective to be effective'. But the GDPR has now necessitated that they handle every individual complaint. So again, there's nothing unique with regards to Ireland.
AW: The Germans also say that Ireland continues to under-resource its data protection authority.
HD: This office needs to continue to grow. But if you look at the growth that it has had and the effort that was involved in winning the budget to recruit the experts in technology and data protection law that we now have, it couldn't have happened any faster than it has happened. But yes, there has to be continued investment.
AW: Your office has flagged before that decisions on WhatsApp and Twitter are likely to be the first of the big ones regarding multinational tech companies. But what is the actual timeline for those now?
HD: I can't quote timelines any longer, because we simply have to take what's coming at us as it comes at us, and deal as expeditiously as we can, step by step.
My quoting a timeline has become a little bit like a red rag to a bull; it's counterproductive. What I can say to you is that we have resolved and bottomed out a lot of procedural questions that were raised with us - questions we had to answer to reassure data controllers that their interests were protected in the proposed Article 60 process, and which we also had to resolve for ourselves.
Step by step, we will circulate draft decisions now that we've resolved the significant procedural issues.
AW: There is going to be an expectation of substantial fines, especially after the US Federal Trade Commission's $5bn (€4.5bn) fine on Facebook last year. Are we looking at that scale of a fine?
HD: It's only once we've made the decision and found substantive infringements that we can then consider the corrective measures and the quantum of any fine applicable.
But I will say to you that one thing we have been busy doing over the last number of months, in a general sense and not by reference to any particular inquiry, is procuring expert advice from external legal counsel on EU competition law. The fining regime under the GDPR is based on competition law. So we have been looking at how fines have been applied by the EU Commission dating back to the 1960s, and at what the Commission has published about the methodology that applies.
But one of the things that has become very clear to us is the importance of looking at the deterrent effect of a fine.
Under the GDPR, deterrence is a particularly important reason why the fines are included. They could have stopped at the corrective measures. But the fines are there to be punitive and give rise to deterrence. And deterrence is based on what's already in the [fine] landscape.
So when you're looking at that deterrence effect, a very relevant factor in terms of quantum is the level of fines already existing, globally, in the area.
So if you ask whether the [$5bn] FTC fine is relevant, it is. Just as the fines we've already seen from those three existing 2019 decisions are relevant. All of that will have to be brought to bear. So there is a relevance, because deterrence is based on what's already in the landscape and whether you need to move beyond it.
AW: So you're crossing the Ts and dotting the Is to prepare for the announcement of a fine?
HD: Well, a fine is an inevitability at some point. But we've taken the advice not by reference to any particular inquiry. It's just a necessity for us to be armed as a data protection authority.
AW: That sounds like you're lining a fine up.
HD: That's not what I'm saying. I'm saying we procured this legal advice in a general sense in order to make sure I'm armed and equipped, and understand how to exercise my obligations.
AW: On another issue, we're seeing a significant escalation in cameras and facial recognition in public places. In the UK, police are conducting high-profile trials of facial recognition cameras to catch criminals. Do you have any thoughts about this in an Irish context?
HD: It is a particularly invasive technology. From a technological point of view, there are all sorts of issues such as false positives and people of colour faring worse in terms of the matching in facial recognition systems. But it is becoming more and more pervasive. The EU recently gave thought to banning it in public places, although now they seem to have retreated from that position.
But there's no policy or legal position on facial recognition in this jurisdiction. It falls to be regulated largely under the GDPR and by reference to EU case law, in terms of necessity and proportionality.
It would be helpful if policymakers and lawmakers had a strong view on this and gave effect to it in law.
Otherwise, it does fall to be dealt with on a case-by-case assessment by data controllers and subsequently by the Irish DPC.
I think the bar is going to be very high for any data controller who is using facial recognition systems. There are also issues around the databases and the storing of details of individuals alongside their photographs. So it is an area of significant concern. We haven't seen the same types of projects that we're seeing in Wales and London rolling out here in Ireland, but the technology usually catches up in this jurisdiction. So I've no doubt we will be coming across cases of that type to consider.
AW: What about the fast growth of domestic cameras and video cameras such as Amazon's Ring doorbell? In the US, Amazon has brokered agreements between 800 police forces and residents for the sharing of footage from their front-door 'smart' video cameras.
HD: There was an interesting article in the 'New York Times' in the context of the Amazon Ring device and what we're doing to ourselves as societies by installing all of this and inflicting on ourselves a surveillance technology.
The article demonstrated that Amazon's Ring is useful in securing convictions in only a tiny number of cases - a number potentially disproportionate to its downsides. So while people seem keen to invest in this type of home technology, it actually doesn't deliver what they think it does.
AW: I think the 'New York Times' has also reported that burglaries and property crime have fallen in the same period as sales of such smart home video cameras have risen though.
HD: That may well be true and is worth considering. For example, the data from the UK, in terms of police using body-worn cameras in riot situations, is that behaviour improves once people realise that the authorities are using audio and video recording equipment.
In terms of regulation, there's no law that prevents individuals from buying an Amazon Ring and installing it. But I think it's worth reflecting for ourselves on why we're building these fortresses around ourselves, and to what end.
From a data protection point of view, there is case law in the [CJEU] Ryneš case. That concerned CCTV technology, but was very specific to a household implementing CCTV technology.
What that said is that a household can avail of the household exemption under the GDPR, provided the CCTV is trained within the perimeter of the house. Typically, Amazon Ring doesn't fall outside of that.
So it may well not be an issue that's for regulation by the data protection authorities under the GDPR, but that still doesn't mean there aren't things to think about.