
Bytesize Legal Updates | Fieldfisher
Fieldfisher are experts in European digital regulation, guiding businesses through the complexities of the EU's rapidly evolving regulatory environment. Europe is one of the world's largest internal markets. With our focus on digital regulation for online platforms, social media and emerging technologies (AI, automation, AR/VR etc.), we keep you up to date with the EU's digital agenda and the latest European legislation impacting the industry.
Bytesize Legal Update: Automated decision making back in the spotlight
In the recent case C-203/22 (Dun & Bradstreet Austria), the Court of Justice returned to the issue of automated decision-making and provided some useful clarification on what information about this kind of data processing needs to be disclosed to data subjects – in particular, addressing the question of how these obligations interact with trade secrets rules.
In our latest Bytesize Legal Update, Fieldfisher's James Russell and Richard Lawne unpick what the court decided, how this decision builds on previous case law like SCHUFA, and ultimately what it means for companies that use automated decision-making as part of their business.
James: Hi, I am James.
Richard: And I'm Richard.
James: And we are both tech and data specialists in Fieldfisher's Silicon Valley office. In today's Bytesize Legal Update, we're gonna be looking at the recent decision of the Court of Justice of the European Union in Dun & Bradstreet. So in this case, we had an individual who was refused a 10 Euro monthly phone contract on the basis of a creditworthiness assessment carried out by Dun & Bradstreet, or D&B as we might be referring to them. And the question for the court was essentially what information did D&B have to disclose to the data subject, and how does this interact with the business's interest in protecting their trade secrets?
Richard: Yeah, if you recall, the CJEU has issued a judgment on automated decision making before. In December 2023, it issued its ruling in the SCHUFA case, and in that case it said that a credit reference agency engages in automated decision making under Article 22 of the GDPR when it generates a credit score as a result of automated processing, and then makes that available to lenders who rely heavily on the score to establish or terminate contracts with individuals. That was a really important case because it established the principle that a controller might be engaged in automated decision making even if it is not directly making automated decisions, but is instead generating information that others may rely on for those decisions. Naturally, that case had significant consequences for credit rating agencies, but it's potentially also relevant for other types of businesses involved in providing scores, assessments, insights or information to others that might be relied on in an automated processing context.
So if you think about AI and the wave of new tools and applications that are being used, particularly where there's automated processing, you can see that this case might have very important ramifications.
Fast forward to February 2025, and we have this new decision from the CJEU. It's a separate case, but it can be seen as part two on automated decision making.
Richard: And now we're digging into a little bit more detail about some of the specific obligations under the GDPR in an automated decision making context.
So in this case, the individual had submitted an access request under Article 15 of the GDPR to the credit reference agency.
And as part of that request, they had also asked for meaningful information about the logic involved in the automated decision. So there were two questions for the court. Firstly, what information must be provided to the individual? And secondly, to what extent could the controller rely on trade secrets to withhold that information?
So, on that first question, the CJEU ruled that, under Article 15 of the GDPR and this right to obtain meaningful information, the data controller must disclose all relevant information concerning the procedure and principles actually applied in order to use the individual's personal data with a view to obtaining a specific result. Now that's quite wordy, but in other words, the controller is required to disclose any relevant information that enables the data subject to understand the reasoning which led to the automated decision, such as the creation of a credit score or credit profile.
The CJEU expanded on this by saying that, to ensure that the data subject is able to fully understand the information provided, the explanation must also be concise, transparent, intelligible and easily accessible.
The CJEU didn't give any details about this or explain exactly what it could mean in practice, but in theory that could include, for example, explaining what data was used as factors in the decision, information about the specific model or procedure involved, the main criteria or factors applied, potentially also information about the weight or relative importance of those criteria in plain terms, and then a broader explanation of the overall goal or purpose of the automated processing, as well as the specific outcome or result for the data subject involved.
Ultimately, Article 15 of the GDPR must enable the individual to check that their personal data is correct and accurate, and also that it's processed in a lawful manner. This is also important to ensure that they can exercise their right to contest the automated decision and to express their point of view.
James: Okay, because those are all complementary GDPR rights as well, right? So we're not just looking at this in the context of Article 15; you need that information for all these other rights that data subjects have under the GDPR. Is that right?
Richard: Exactly. All of these rights are linked. Interestingly, the CJEU also said that a controller can't satisfy this requirement simply by sharing the mathematical formula, like the algorithm or the model, or by providing a very detailed and technical description of all of the steps involved. In other words, you can't just dump everything you have on the data subject.
You have to be able to explain it so that the average person can basically understand the criteria and the rationale involved in the decision making.
James: That makes sense, but okay. What if I'm standing in D&B's shoes and I'm saying, like you mentioned before, that all this information is trade secrets? I don't wanna be giving out the keys to the kingdom. I don't wanna be handing out all of my secret sauce. What did the Court of Justice say about that?
Richard: Well, as a starting point, Recital 63 of the GDPR states that the right for a data subject to gain access to their personal data shouldn't adversely affect the rights or freedoms of others. And the recital expressly says that this could include somebody's trade secrets or their intellectual property rights, for instance the copyright protecting the software or algorithm. So here, the CJEU clarified firstly that the protection of trade secrets cannot be used as a blanket refusal to avoid providing information to a data subject, and secondly that this right to protection of trade secrets has to be balanced against the individual's rights.
And you have to weigh up these two competing rights to decide ultimately what information should be disclosed. Now, really, the CJEU didn't provide much more detail than that, and essentially it just restated principles that we can already derive from the GDPR. But they did make quite an interesting point.
So they went on to say that if a controller wishes to refuse to disclose information based on the trade secrets argument, then in that context the national court or data protection authority can play a role: effectively, they can decide whether disclosure is likely to result in an infringement of trade secrets, or whether the information should be disclosed. And actually there's a role for the authorities there to apply the law in context, weigh up the rights and interests of the parties involved, and then determine the extent of the disclosure. And that's quite an interesting comment really, because it's suggesting that in these cases all of those competing parties should submit their arguments to the court, the regulator, et cetera.
And ultimately it's for them to determine whether the information should be provided or not.
James: Makes sense. So businesses aren't necessarily always gonna be able to make this balancing decision themselves, but where they're saying no, where they think that they shouldn't be disclosing it, we can actually go to the courts or to the DPA to have them do that balancing act for us.
Richard: Yeah, they really highlighted the role of the courts and the authorities there in assessing the facts of the case and whether, in those particular circumstances, certain information should actually have been disclosed or not.
James: Okay, so trying to pull some of these threads together, then: we've got an idea of what meaningful information means. We've got an idea about how we're supposed to balance trade secrets. What does this mean for businesses? What are the big outcomes that businesses should be taking away from this case?
Richard: Well, there's some good news and bad news, I guess. Firstly, for organizations involved in automated decision making, you're not necessarily expected to share your algorithm, your source code, or your full secret sauce, if you will, with individuals in response to an access request.
Essentially, Article 15 doesn't give data subjects unfettered access to your black box. Instead, you have to be able to provide a meaningful explanation of how these decisions are made. So that's the criteria, the logic, the rationale, et cetera, and not giving this in detailed, technical terms, but instead providing an explanation that individuals can understand. Naturally, that could be challenging in an AI context, for example, where some of these systems operate as black boxes and it might be difficult to explain exactly how AI has generated a certain output, prediction or assessment. But in principle, you need to understand things like what data is used and what the main parameters are, and to provide some insight into the logic behind the decision making.
James: Makes sense. So it's not giving away the secret sauce, but it's giving enough information that people can meaningfully exercise their rights.
Richard: Yeah, and this really comes back to the transparency principle under the GDPR, which is that you have to be transparent in your processing; in the context of making decisions, that means explaining the decision as well.
James: And that's maybe something we should emphasize before we wrap up as well. In this context we've just been looking at the GDPR, but you're absolutely right that there are these other regulations, specifically on AI but also under consumer regulations, which are increasingly doing the same sorts of things, aren't they?
Richard: Absolutely. We have AI standards that are based around explainability.
We also have new regulations like the AI Act, which are gonna provide a certain forcing function for AI businesses to adopt explainability, to ensure that they're able to explain to individuals how their AI operates, how it's using their data, and how it's coming to certain predictions or conclusions.
So although this case was specifically around Article 15 of the GDPR, I think some of those principles could become relevant in other contexts. And interestingly, we have already seen a reference to the CJEU, not on the GDPR this time, but on the interpretation of other EU laws like the AI Act in the context of automated decision making.
So suffice it to say, we're gonna see a lot more to come on this subject.
James: Sounds like we've had chapter two on automated decision making, but chapter three is on the way.
Richard: That's right.
James: Well, thanks very much for joining us, Richard, really appreciate you taking the time to come and speak to us about this a little bit.
Richard: Yeah, absolutely. Lots of fun. Thanks, James.
James: And thank you to the listeners for joining us on this latest episode of Fieldfisher's Bytesize Legal podcast, your source for concise updates on the key legal developments in technology and data protection law. If you have any questions about today's update, don't hesitate to reach out to us.
And if you found it useful, do make sure to give us a like or review on your podcast app of choice. Thanks again and we'll see you next time.