The problem with new methods and sources of knowledge is that they can bring forth facts that do not please everyone involved. But prohibiting the pursuit of knowledge and insight is rather absurd in a transparent state, at least since the Age of Enlightenment.
In France, this seems to be seen differently. France has recently adopted a law reforming its judicial system, and Article 33 of that law has caused international controversy. It reads, in the original French:
“Les données d’identité des magistrats et des membres du greffe ne peuvent faire l’objet d’une réutilisation ayant pour objet ou pour effet d’évaluer, d’analyser, de comparer ou de prédire leurs pratiques professionnelles réelles ou supposées.”
Translated into English, that’s something like:
“The identity data of magistrates and members of the judiciary cannot be reused with the purpose or effect of evaluating, analysing, comparing or predicting their actual or alleged professional practices.”
This is followed by a reference to several provisions of the French Criminal Code, which provide for a penalty of up to five years' imprisonment. Five. Years. In prison.
The French state thus threatens its citizens with a multi-year prison sentence if they evaluate publicly available information in a certain way in order to gain knowledge from it. This applies regardless of whether the evaluation is done for commercial or scientific reasons, or simply for fun, to see whether anything can be found at all.
Fear of predictive analytics?
The method now forbidden under French criminal law for judicial decisions has been discussed for some time under the term “Predictive Analytics”. Put simply, it is about recognising patterns.
The aim of the method is to identify recurring patterns in the data through statistical evaluation. Based on these patterns, previously hidden irregularities or other issues can be detected, or the patterns can be used to make probabilistic predictions about the future. Such a probability describes a correlation between two or more events, and correlation does not mean causality: the evaluation only shows whether certain correlations exist, not why. It is therefore a purely statistical method, not a legal one.
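To make the idea concrete, here is a minimal sketch in Python of what such a purely statistical evaluation might look like. All data, column names and figures are invented for illustration; the snippet is not taken from any of the tools discussed below.

```python
# A purely illustrative sketch of "predictive analytics" on court decisions:
# find a statistical pattern (per-judge grant rates) and test whether judge
# and outcome are correlated. All data below is made up.
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical decisions: which judge decided, and whether the claim was granted.
decisions = pd.DataFrame({
    "judge":   ["A", "A", "A", "A", "B", "B", "B", "B", "C", "C", "C", "C"],
    "granted": [ 1,   1,   1,   0,   0,   0,   0,   1,   1,   0,   1,   1 ],
})

# Pattern: how often does each judge grant a claim?
grant_rates = decisions.groupby("judge")["granted"].mean()
print(grant_rates)

# Correlation, not causation: a chi-squared test only tells us *whether*
# judge and outcome are statistically associated, never *why*.
table = pd.crosstab(decisions["judge"], decisions["granted"])
chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p_value:.3f}")
```

Nothing in such an evaluation interprets the law; it merely counts outcomes and measures statistical association.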
In their international bestseller “Big Data – A Revolution That Will Transform How We Live, Work and Think”, the two authors Professor Viktor Mayer-Schönberger (University of Oxford) and Kenneth Cukier (Data Editor at the Economist) portray a society in which many decisions in almost all areas of life are based on such a predictive analysis of data.
In the legal world, this method is often applied to decisions by certain judges in an attempt to predict future decision-making practice. Commercial providers are already active in various jurisdictions, mainly in the USA, but also in France. Such a service is not yet available in Germany.
Academic research, in contrast, has mostly focused on international courts such as the European Court of Human Rights. In addition, at hackathons, countless lawyers, programmers, statisticians and other innovators try to extract something useful from the available data.
What France might want to achieve with the ban
There are various presumptions as to why France has imposed this criminal prohibition. On the one hand, France has for some time been making efforts to improve transparency by making all of its case law publicly accessible with as little redaction as possible. On the other hand, the new ban could be a political compromise to accommodate the judges, who are not entirely in favour of this development, and to balance the supposedly conflicting interests.
Another explanation is given by the French lawyer Michaël Benesty. In his spare time, together with a friend who is an expert in machine learning, he developed a tool that can evaluate French case law. Both were struck by enormous imbalances when it came to the rights of foreigners and, above all, asylum seekers: at one and the same court, some judges had rejected almost all asylum applications while their colleagues had granted almost all of them.
According to the lawyer, he could not find a reasonable explanation for these discrepancies – except that the judges did not base their decisions on the law but on their personal world view.
They did not keep their experiment to themselves; instead, they put the tool on the internet for everyone to test and verify for free. They also published the results of their evaluation in an article. A few hours after the article appeared, the public outcry started. The French judiciary in particular was pas amusé. According to Michaël Benesty, Art. 33 of the French Justice Reform Act is the answer to these events.
Ban not limited to evaluation by computers
The law expressly does not prohibit publishing the names of the judges; only the names of the parties must be redacted. The judgments are therefore deliberately published by the state with the judges' names, but those names may not subsequently be used to analyse the judgments and attribute the resulting evaluations to individual judges. The law effectively bans all forms of analysis of individual judges, not just data-driven social-scientific inquiries but also doctrinal legal analysis.
However, it would probably still be permissible to evaluate judgments by judicial body rather than by individual judge. This can still produce relevant findings for cases in which all members of a panel regularly decide together. If, however, decisions within a panel are always delegated to a single judge and those judges decide differently from one another, the results of a statistical evaluation at panel level are watered down to the point of uselessness, as the small numerical sketch below illustrates.
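A tiny numerical sketch, again with entirely made-up grant rates, shows how panel-level aggregation can erase exactly the variation between individual judges that such an analysis would look for:

```python
# Made-up figures: a three-judge panel whose cases are always assigned to a
# single judge. Individually the judges differ sharply, but the panel-level
# average hides this completely.
grant_rates = {"judge_1": 0.95, "judge_2": 0.50, "judge_3": 0.05}

panel_rate = sum(grant_rates.values()) / len(grant_rates)
print(f"panel-level grant rate: {panel_rate:.2f}")  # 0.50 -- looks moderate
for judge, rate in grant_rates.items():
    print(f"{judge}: {rate:.2f}")                   # 0.95 vs 0.05 in reality
```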
Interestingly, the French law does not tie the ban to automated processing. If a lawyer in France prints out several judgments on a certain topic, reads them himself and then sends his client an evaluation of individual judges, he now risks committing a criminal offence.
Mixed reactions to France’s ban
The reactions to this law, both in France and internationally, ranged from astonishment and disbelief to complete bewilderment. Artificial Lawyer has compiled some of these reactions.
In France, for example, Louis Larret-Chahine, co-founder of a company affected by the law, called it a “disgrace to democracy”. Judgments are rendered in the name of the people, and attempting to conceal their findings from lawyers or citizens can never be right, in his view.
Canadian lawyer Fernando Garcia believes that a legal system must be open, transparent and verifiable at all times. It becomes dangerous when judges use criminal law to shield themselves from criticism and scrutiny.
Professor Malcolm Langford (University of Oslo) and Professor Mikael Rask Madsen (University of Copenhagen) called the French law “a flagrant violation of the freedom of expression” on the well-known Verfassungsblog. In their view, the new law “represents an affront to basic values of academic freedom, and disregards basic principles of the rule of law.”
The US judge Emily Miskel from Texas weighed in on Twitter and sees mainly the advantages of the method now forbidden under French criminal law:
In general, I love data:
1) I would love to see a statistical analysis of my own decisions to better monitor my own performance
2) Attorneys always have rumors about how judges lean, and I would love to see how much is actually backed up by data
In Germany, Dirk Hartung, Executive Director Legal Technology at Bucerius Law School, has already carried out extensive statistical evaluations of German court rulings together with students. In his opinion, it makes sense to use such models to examine what influence the involvement of a particular judge has. For the still young scientific discipline of computational legal studies, it is of utmost importance that legal data of all kinds be as easily accessible as possible; the French approach – to put it in a friendly way – is rather backward-looking.
First law of its kind
The French prohibition is – as far as can be seen – the first of its kind in the world. Until it is amended by the legislator or struck down by a court, Art. 33 of the French Justice Reform Act is in force. French companies and law firms that offer such analyses must limit their scope accordingly, and international providers must carefully examine whether and how they are affected. Scientists and hackathon participants, too, now have to be careful not to expose themselves to prosecution when working with French judgments.
Of course, the digital transformation of the legal system needs appropriate rules. For data-driven approaches, this includes, for example, verifiably balanced data that is free of human biases. In addition, it should be possible for algorithms and their results to be verified by third parties.
But a blunt prohibition that simply forbids evaluating existing data certainly does not belong in this category of future-oriented rules. In particular, a ban on analysing judicial decisions is unworthy of a transparent and enlightened state. No one is above the law – including judges.
This post is based on a German op-ed in Legal Tribune Online.