Episode 13: Transforming our courts using technology
Yves Faguy: Hi I'm Yves Faguy. In this episode of Modern Law we discuss the transformation of our courts using technology.
You're listening to Modern Law, presented by the Canadian Bar Association's national magazine.
It's October 12, 2022. Now COVID-19 isn't gone by any means, but a semblance of normality has returned, and yet much about our justice system has changed, permanently perhaps, forced as it was to quickly adopt technology during the pandemic. A return to in-person hearings is certainly underway, and this is especially true for the most complex cases, but the flexibility of virtual hearings is also likely to remain, to some extent, a permanent fixture of our court proceedings.
And there's more. Plans are afoot in several courts to improve electronic filing. Courts in several jurisdictions around the world are looking at how they can put data analytics and AI to work to maximize the use of data and modernize and automate processes. But, as a report by the CBA Task Force on Justice Issues Arising from COVID-19 cautioned last year, enthusiasm for technology must be balanced with sober thought about its implications for fair and accessible justice. The report, known as the No Turning Back Report, highlighted some 18 recommendations to guide the courts and justice stakeholders in realigning the justice system with the digital reality of our century.
So we thought it would be a good time, some 18 months after the release of the report, to revisit some of what we discussed in it and take stock: what has changed, and what are the remaining challenges ahead? So we invited Karen Eltis, the contributing editor of the CBA's No Turning Back Report, released back in February 2021. Eltis is a law professor at the University of Ottawa, and she specializes in artificial intelligence, innovation law and policy, and cybersecurity issues. She's also the author of Courts, Litigants, and the Digital Age.
Thank you, Karen Eltis, for joining us today.
Karen Eltis: My pleasure, thank you so much for having me.
Yves Faguy: So, I mean, this is the second time we've had you on the show. The show was known as After the Pandemic two years ago, but now it's Modern Law – but no matter. We spoke a couple of years ago, and I think it was before the release of the report –
Karen Eltis: Yes.
Yves Faguy: - we spoke a couple of years ago, just prior to the release, about the pressing need to modernize our courts, the risks involved, and the precautions we should be taking. And I think there were a few takeaways from that discussion, and we saw those fleshed out in the report a little further. One of them was that you were issuing a word of caution: we've got to be careful about not swinging from one extreme to the next – where, you know, we have this justice system that has been very reluctant to embrace change and technological solutions, even in the face of mounting backlogs and delay, and then the other extreme, where we embrace technologies in a manner that is a little bit careless, I suppose, and maybe sloppy, without properly thinking through all of the unintended consequences of implementing tech.
So just taking that general idea, how do you see that issue today? Has it evolved any or are the concerns still the same?
Karen Eltis: I think that's a really important question, and I'd like to preface it just by saying that even though I think we were in what many, myself included, have referred to as the Fourth Industrial Revolution prior to COVID, the crisis of the pandemic and the related measures have moved us way into the future. They have created a migration of services online, overnight, as we've said in the past, and have swept away this psychological resistance to digital transformation, which is wonderful.
But in regards to the caution: crisis as a catalyst for change is welcome, but I think when it comes to this abrupt transition, we really haven't had a chance to think it through. It's funny that you mentioned the podcast being titled After the Pandemic, which clearly was premature. And some may quibble with this now, but in any event –
Yves Faguy: That's why we changed the name, by the way.
Karen Eltis: Yeah, but even at this stage, at the very least we've had, I think, now a chance to take stock. And if we haven't, it is time to do that. Because we're out of the immediate crisis phase. And this, I think, requires us – us as a society and certainly as institutions – to take stock, to really reflect on building on the change that the crisis occasioned, and look at whether in fact, as you've said and I've said in the past, we've swung from one extreme to the other, and if we have, what sort of tweaks we need to build in, in order to ensure that we're not driven solely by efficacy, but that we're building trust, certainly with something as sensitive as the justice system.
Yves Faguy: Is there an issue with trust, now, two years after that last conversation? Is there a different issue with trust?
Karen Eltis: I don't know if the issue is different. I think it's a question of scale, and I think this is also something that we only see with – and I'm struggling for the [unintelligible 00:05:32] – with [unintelligible 00:05:34] with looking back, and you might have the proper word for me.
Yves Faguy: Hindsight.
Karen Eltis: Hindsight. That is exactly it. So I think in terms of trust issues, that will only be able to be ascertained in a few years. Why? Similar to – I wrote Courts, Litigants, and the Digital Age; the first edition was published in 2012, so I must have written it, you know, somewhere around 2009, 2010, given the publication process. And only a decade later do many of us see the pertinence, right? Of the digital age on courts. So it really takes a bit of hindsight. With regards to the specific issue of trust, I don't think we're in any type of crisis. But I think as the issues build up, and as there is greater social interest – and we've seen this with Cambridge Analytica, we've seen this now with the rapid migration in the pandemic – people are taking notice of this significant shift. I think we have to be proactive about nurturing and maintaining trust, rather than going forward in a tactical rather than a strategic way, until a point where we will inevitably encounter some sort of issues that might hinder trust. And as justice needs to draw on what Easton has called the reservoir of goodwill, there is not much leeway in terms of sacrificing trust for efficiency.
So this is a time to take stock and reflect on how we can ensure that this revolution is sustainable.
Yves Faguy: There's a lot of discussion today about the erosion of trust in our institutions in Canada. And when we're looking at the legal sector, the big issue is often access to justice. A lot of that has to do with the complexity of law and procedures and delays. Some of it has to do with the whole weight of the system. But there are other issues as well in terms of trust that we're seeing play out in our economy, such as sensitivity to the use of our data – where it goes, who gets access to it. All of these things play into, I think, issues that our justice system is struggling with in terms of embracing technology. How do you see that?
Karen Eltis: We need to be cautious. This is a time when resources are limited and technology provides a seemingly simple, inexpensive, and very seductive solution. And while many aspects of technology are welcome – I clearly, judging by the title of my book, have been pushing for technology to be adopted in many respects, and I'm glad that the psychological barriers have fallen – technology is not a silver bullet for access to justice. And we have to be really careful, as the [Talon? 00:08:26] declaration has said, that as we – I use the word embrace – you know, as we cautiously embrace technology, we do so in a way that is open, efficient and inclusive. And there are a few points here that I can elaborate on later.
The first point is one of unmitigated or unquestioned dependence on private platforms, often platforms that come from outside of Canada, often platforms that clearly set themselves out as being commercial platforms, which is completely fine. But that doesn't naturally create a fit for government, and more specifically for justice, right? If the business model is one of private platforms, do we want our interactions with the justice system to be entirely mediated by these commercial actors? And that's a question that I think we really need to ask ourselves in light of judicial independence very specifically, and in light of trust more generally.
And the second point: I recently presented a paper at the International Conference for Pattern Recognition, which was an AI conference – I was the only lawyer amongst many, many computer scientists, conveniently in Montreal. And we talked about due process. And I spoke of research conducted decades ago by a professor of psychology and of law at Yale, Tom Tyler. And seemingly counterintuitively, his conclusions, I think, are brilliant for the AI context. He said, you know, a system that's perceived as offering due process – so procedural justice – and avenues for fair contestation – so fairness – is better heeded and creates more trust than one that is merely efficient. And in fact he's shown that people prefer fair procedures: even though they might be waiting for results, even though they might lose in court, they want to understand that the procedures are fair.
So this is where the concern with the ultimate fallout that comes from either obscurity or abdication to non-justice actors comes to the forefront. I'll stop here, but I'm happy to elaborate on that.
Yves Faguy: I mean, that's really interesting, because really what you're saying is that people are prepared to hang in there and wait it out for a judicial solution on the premise that they will get due process – that the procedures will be fair, that everyone will be treated fairly. The problem, though, is when they fall through the cracks because of questions of cost, and they can't afford to hang in that long because of the complexities of the justice system.
So I guess the question there is where can technology or algorithms or artificial intelligence assist us in making the system more efficient where it should be, without sacrificing those concerns about due process?
Karen Eltis: You've just put your finger on what is the most important problem for the justice system: that the majority of people are often left behind because of the prohibitive cost of justice. And this is a tremendous obstacle that I think we must acknowledge and address. It was a problem long before the pandemic, and it's one exacerbated by the pandemic for many reasons.
The question is whether technology is the silver bullet or whether – and I pose this as a question in terms of leaving people behind – we are in fact creating a two-tier system, where those who have access to justice continue to have access to justice that is characterized by fairness, and those you mentioned – who are increasingly, you know, at large, a majority of everyday people – don't have access to what we know as justice, but rather a poor substitute, a poor man's justice, which is a technological substitute. Are we, in so doing, steering everyday people away from the justice system that we've known, and instead saying, well, we have technological solutions in place for you?
Now what I'm concerned about is that we do that in accordance with people's capacity to pay, as opposed to the nature of the dispute. I think AI and technology can absolutely help us – and we've seen this in BC – with solving certain issues, very often, you know, small claims. I've written a piece using Nobel laureate Daniel Kahneman's paradigm, Thinking, Fast and Slow: where there is a simple rule to be applied in a straightforward manner that does not require a lot of judicial discretion, a technological system, an AI system, can certainly step in to provide people with rapid resolution. But that would be according to the nature of the dispute rather than the character of the litigant.
Whereas when you have a tremendous deal of discretion at stake that the judge must supply, the technological solutions may not be as well suited. When you have self-represented litigants who need guidance, the technological solution may not be either. So my concern – and this is just a concern that I want to flag – is resisting, or at least confronting or acknowledging, the temptation to say that for many disputes we're going to privatize and steer people away from the justice system that we've built, towards quick fixes, without having properly sat down and reflected on the ramifications of that privatization, for all intents and purposes, and on the quality of justice, ensuring that it's not two-tier.
That's different from saying that we shouldn't steer cases that are simple to technology systems, and that may be a good way of looking at it, because we're not going according to the litigants' ability to pay, but rather we are looking at the nature of the dispute – and some disputes may very well be handled by technology, again ensuring that no one is left behind. There are tremendous accessibility issues that we're sometimes not alive to, of course.
Yves Faguy: Now, obviously the pandemic forced the judicial world into virtual hearings. I'm wondering, from your discussions with people, and those who have presided over some of these hearings, have they been able to appreciate what's worked and what's not – what kinds of hearings virtual proceedings work for, or what kinds of situations technology can assist in? Are they bringing a critical eye to the debate?
Karen Eltis: Absolutely. I think we've conducted – and the report bears this out – a triage: for instance, the care that must be taken with criminal law, and the care that must be taken with older, more vulnerable populations, people with disabilities, and so on. I've mentioned self-represented litigants. So I think there's been tremendous alertness to recognizing that technology is a wonderful tool, not a crutch and not a panacea, and there's been great openness and reflection on that.
And to seeing what areas of the law technology might lend itself best to – facilitating understanding of the law for self-represented litigants and so on – as opposed to simply being imposed. I think there's been tremendous awareness of ensuring that technological standards are aligned with democratic values.
Another issue of concern – and I can come back to this – has been not to standardize justice too much, not to nudge it towards conformity; there's been great awareness of that. I think where we still need to do work is the courtroom of the future, because the court is seen as a service now, not just a place. So when will we need physical courtrooms? How should these physical courtrooms be equipped? What should they look like? There's a recognition – and this is important as we look towards ODR and ADR, and there's been great sensitivity towards this – that the adversarial model is not always the best model. There's been a lot of writing recently in the Globe and Mail about the shadow pandemic and domestic violence and other such issues, where the adversarial model is not necessarily the best, and where certain forms of technology may be helpful in creating community justice and other forms of justice in various contexts.
I think there's been tremendous sensitivity and understanding that the family law context is a thorny one, and there's still a lot of work to be done in coming to a balance: we want access to justice and we want it to be efficient, but we also want to ensure that the particularities of sensitive contexts are not neglected. So definitely there's been tremendous sensitivity and work done in that arena.
Yves Faguy: Is there an implication in there somehow that being present in court, or going to court as a place is more inclined to produce adversarial situations?
Karen Eltis: I don't know if it's more inclined per se. I think one thing that has been recognized is that there is a ceremonial value to going to court – to physically, you know, getting out of one's pyjamas and showing up in front of a judge. There's a solemnity that goes along with that. Now the question is, as with everything else, one of equilibrium and trade-offs. On the one hand, that solemnity is necessary in certain contexts – and law is not just about dispute resolution, especially in some areas, and one can think of criminal law – it is society coming together for a particular statement of sorts.
In terms of adversarial, I don't know that the physical courtroom creates a more adversarial climate, but I think the physical courtroom has certain attributes that are traditional and classic, if you will, whereas the digital form – and this is a lovely part of it that we should harness – can create alternative modes, right? So the flip side of wanting this ceremonial justice, because that has an important social value, is: can we see justice in a different light? Are there other beneficial ways, more collaborative, less adversarial and less daunting, that we can apply in other contexts, that would allow people to have greater access – not just from a financial perspective, although that's probably the number one concern for people, but from a comfort level?
So in answer to your question, I don't know that the physical courtroom makes things more adversarial, but it carries traditional trappings, for better or worse, and the digital has allowed us to explore alternatives to that, which may work better in some areas and not as well in others. And that comes back to the question of triage, which is still relevant.
Yves Faguy: Another way to explore alternatives is by looking elsewhere, and the No Turning Back Report calls for judicial bodies across the country to collaborate and coordinate their efforts at adopting technology and sharing best practices.
How are they doing?
Karen Eltis: There's an important balance to be struck. I think courts are doing very well in terms of judicial education. I'm happy to see the IOJT, the International Organization for Judicial Training, which meets every two years somewhere around the world – this year, in a few weeks, it will be here in Ottawa, Canada. So there's a lot of discussion, there's a lot of collaboration through various institutions. It's a balance between, on the one hand, having some level of harmonization so one doesn't have a patchwork – and having conversations and seeing what's being done elsewhere is very important – and on the other hand, as I said, being cautious about overly streamlining, in order to ensure that we comport not only with judicial independence, but with the imperatives of federalism, of local particularities, and of individual courtrooms and their specific needs. So it's about striking the correct balance between what best practice is, and I think that's very important, and the individual character of various courts.
And I drew attention to this even in my first book, you know, over a decade ago: very often for case management or court management there's standardization software being used, and we have to be careful of that in terms of judicial independence and the over-tracking of judicial productivity. So streamlining is good to a certain extent, but maintaining the particularities is helpful as well. And there's a comparative study that looks at, you know, how the federal jurisdictions have done better than [unintelligible 00:21:40] in some ways in striking that balance. But more generally – and we can come back to this – I think we can learn from foreign jurisdictions on what I identify as the major issue, which is dependence, as I've said, on mediation by private commercial actors – not because of the actors themselves, but because this is a very public forum. Looking at who has developed platforms in-house – and there are certain examples for videoconferencing – and at what can be helpful there. Looking at protecting very fragile cyber infrastructure – we've seen it recently with telecoms, we've seen it in Australia, the cybersecurity issues. That's where harmonization is very important, as opposed to just individual court practices and the like.
Yves Faguy: Are you saying that the courts should be building some of these platforms in-house, or should they be outsourcing them to private –
Karen Eltis: Yeah, I think – and again, you know, this is a gargantuan task, and very often we say, well, what's appealing about platforms is that we can get them for free. But, as I've always said, there's no free lunch, right? What's appealing is that it's already there for us. But we have to think about the trade-offs of convenience, and we have to think about the particularities of justice and its sensitivity – we were talking about trust. And we have to think not only about the short term; we have to think about sustainability. What the pandemic and the cyber revolution have done is collapse the boundaries between the brick-and-mortar world and the digital world, but also between private and public. We're at work, but we're at home. We're in court, but we're at home. And all of our interactions are mediated by commercial platforms.
And we have to think about the implications of that for justice. So yes, I think that it's important, and many will shy away from this because it's expensive, and this is what we're trying to get away from. But this is why I say technology is helpful, and we should be moving with technology – but pre-packaged solutions are, of course, more helpful in the crisis situation than they are, to use the previous name of the podcast, after the pandemic, when we take stock and say, you know, there are costs to ensuring that a system – again, especially when it comes to sensitive information – is properly vested with trust. Because of course, as we now look at cybersecurity, justice is a treasure trove of information.
And so this is something that we may not want to address right now, but that I think will very much need to be addressed.
Yves Faguy: The counter to that would be – I don't know if it's a counter, but a concern, especially in a place like Ottawa, where I think public servants are still haunted by the Phoenix nightmare: is it really feasible to consider that the justice system and the courts could build platforms in-house? Presumably they would have to collaborate somehow with outsiders. But if platforms are [inefficient? 00:24:51], making access to justice even more difficult than it was back in the analogue times, that's not going to do much to help the cause of improving access to justice and restoring people's trust, or earning people's trust in the justice system either.
Karen Eltis: Yeah, I think that's a fair point, and I think there's a two-pronged approach as we reflect on it. The first is the one that you correctly mentioned previously, which is international cooperation. And we talk about reinventing law for the digital age more generally, right? Because we have private law and public law and data protection law, and none of these work in a straightforward manner post digital revolution. We have to recognize that. And that's a tough fight to fight – it took a hundred years after what we commonly refer to as the Industrial Revolution, and here's how it ties in. Before the pandemic, if I had spoken of international treaties or initiatives, even more generally, it would have been laughable.
Whereas now, with cybersecurity – and again, for [unintelligible 00:25:49] we can look [in this? 00:25:50] more generally, [unintelligible 00:25:50] – various like-minded democratic countries are putting our heads together, thinking: this is borderless, this is a global problem, can we better address it together? And now, coming back to the courts, we can think of various jurisdictions that are exploring these issues.
So yes, going in-house alone is extremely difficult and can be counterintuitive, but looking at like-minded democratic countries exploring types of solutions together is another route – and perhaps that is something that can be looked at in the long term. Something more modest in the shorter term – and this is already beginning to be looked at – is procurement solutions: setting certain requirements [unintelligible 00:26:42] framing the type of technology that we procure for such sensitive purposes. And there's tremendous research on this, which is fascinating, on the American side. I think the time is right, if not for the former, then certainly for the latter. And I [want to? 00:26:51] slip in the EU proposal for AI liability, which now really concretely looks at entities responsible for procuring AI that results in very biased or other such [damages? 00:27:06].
So I think the time is right, at least for certain conditions. If we do want to go the route of pre-packaged solutions, then certainly it might be worth turning our minds to what types of frameworks and precautions we would like, rather than just wholesale dependence – what kind of precautions or framing we might want if we do go for a pre-packaged solution, at least in the short and medium term.
Yves Faguy: The need to safeguard sensitive data was highlighted as a recommendation in the report, the No Turning Back Report. Are you satisfied that the powers that be are taking that seriously enough as a question to be thinking about?
Karen Eltis: It's definitely a question that people are thinking about, that's clear. But it's not a box to be checked, meaning this is a long-term struggle that we must grapple with, and that's what I want to see continued. So there certainly is an awareness. But this is something that we must continue to look at and explore. We're far from having finished, understandably, our process of reflection on these issues, and we can see from the various initiatives in Canada – federally, provincially – and internationally that there's a reckoning with, and an understanding of, the flip side of the technology revolution: that technology isn't just something that happens to us, and that we must really mindfully understand the ramifications of these pre-packaged solutions that are implemented, these really tempting and wonderful solutions. Just as law is meant to regulate almost every aspect of human behaviour, we're at a point in history now, as we take stock after the pandemic, where we're using a framework developed for the brick-and-mortar world, when most of our interactions – including but not limited to justice, and our sensitive interactions, to use a term that comes up repeatedly in the new legal paradigm embodied by the new [unintelligible 00:29:21] initiatives – are not in the brick-and-mortar world. You've probably had the same experiences as I have: for any type of mundane interaction you get a popup that says your data is being collected in accordance with the law and legal requirements.
And I always kind of can't help but laugh to myself, because if you think of the federal protections, PIPEDA, we all agree that framework is outdated – and this is why there are changes in the works, why we have Bill C-27, obviously, and we have AIDA – because we recognize that the law in place needs to be updated. So when we are told, in a very generic context and in good faith, that our data is being collected in accordance with the law, what does that mean? Right? I think few would disagree that the law needs updating – there may be disagreements on how to update it, but I think everybody says [unintelligible 00:30:17] data.
Yves Faguy: They just keep on asking me to update my opt-out options. Over and over again. I mean I've answered this question so many times on the same sites. It's crazy.
Karen Eltis: Yeah, so that's actually a by-product and I can do a whole other podcast on that –
Yves Faguy: We will.
Karen Eltis: - [unintelligible 00:30:37] because really – and I'm kind of – I tend to digress, or monologue, as my son says – we're at a point of [inflection? 00:30:44] where the legal frameworks that we know – and this is, you know, a global issue – are being recrafted, and refreshingly so. But in the interim, to say that sensitive matters are being conducted in accordance with the law, when – and I put the emphasis on the law – the law itself is in flux, right? That can be really problematic. So: don't worry, we're conforming to the law – meanwhile, everybody says we're in the process of changing the law.
So we have to understand that we're in this period of transition, this period of change, and therefore act with the requisite caution, but also awareness. Because at the end of the day – and I'll end on this – more generally, but certainly with courts, it's all about legitimacy and trust. So what you do in very sensitive contexts, in a situation where these norms are in flux, can have repercussions later on – back to the fallout point.
Yves Faguy: Another thing I want to get to – and I think one of the interesting things about the pandemic and the sudden shift to virtual hearings, beyond the technology itself – is that we also saw the courts just drop a lot of old practices, probably coming to the conclusion that perhaps these were vestiges of a different time, and that perhaps their utility was up for question at this point. And I wonder, now that our courts are again inundated with case backlogs, how are they looking at this? Are they drawing inspiration from other countries? Are they looking at data analytics at all? Are they thinking of using algorithms and AI to work through some of these issues and these processes that could really be streamlined? And maybe it's not streamlining them technologically; maybe it's just eliminating steps in the process that are no longer necessary to deliver good justice to people.
Is there any appetite for that in the justice system now?
Karen Eltis: I think that one of the good things, you know, that came out of this very difficult period was that it forced us to be [unintelligible 00:33:02] more nimble and to abandon practices that seemed irreplaceable, right? So that's a wonderful thing, and there certainly is appetite. In terms of algorithms, I think they permeate every aspect of life and they do have their place – we have to recognize they're already here, right? And they do have their place for eliminating redundancy. And that's where I revert to Thinking, Fast and Slow: we have to go through this triage and recognize there are areas where a tremendous amount of discretion is not necessary – we don't have to have somebody mulling it over, and the algorithm can certainly assist us. In other areas we have to be careful, and this is where – and I don't remember if we talked about this the previous time – but if you think about something like the Persons Case, you know, are women persons, or even Brown v. Board of Education: if you have too much streamlining, then these outlier cases will never be decided, right? And the law wouldn't advance.
This is somewhere where you do need a lot of discretion, where you need the court to say: look, this is how we decided things in the past, but we're going to go a different route because this would be unjust. If you had an algorithm, the algorithm would be correct in saying that women are not persons – this is not, you know, how it's been interpreted in the past – and the algorithm would be perfectly correct. But you can't have that, because then, not to speak of social justice, but to put it in very mild terms, there's stagnation and marginalization of whatever doesn't fit a certain mould, and that's an issue more generally with limiting chances for social advancement. Very often justice is a tool – my interest was, and still is, the impact of technology on constitutional rights and democratic institutions. And when the Charter, the Canadian Charter of Rights and Freedoms, was entrenched, one of the main, wonderful things that came out of it is that people felt they could go to the courts to advance their rights. It was known as dialogue – the idea that if you're a group that has been marginalized, and you can't win political victories because you're in the minority, you can go to court and challenge that.
If things are over-standardized, that becomes a problem. But again, you know, can things be over-standardized for small claims? Can things be over-standardized for traffic court? The dangers of that, with the right amount of oversight, are minimal. Do you want to standardize constitutional case law? That's a different story.
Yves Faguy: But is it necessarily a question of standardizing the substantive case law? I mean, there has to be some sort of distinction between the substantive elements of the law – and I understand where bias, ingrained bias, and bias further ingrained by means of artificial intelligence or algorithms, is a danger to be avoided. At the same time there's a procedural side to things. I suppose that sometimes they can overlap, but we are thinking here about developing an approach to a more user-friendly system of justice.
I guess the question becomes, then, should the courts, should our tribunals, be playing a bigger role in testing solutions to make the whole process a little bit speedier, a little bit more efficient? And can they use data? Can they use analytics? Can they use algorithms for that purpose? Or should they be partnering with someone else to help them in that endeavour?
Karen Eltis: Yeah, so in terms of partnering, that's a really important component that you raise. Partnering with civil society, partnering with community groups, certainly members of traditionally marginalized groups, poverty advocates – that's extremely important. [Balkin? 00:36:31] talks about the triangle that I've mentioned in other contexts, between corporate actors, community groups, and government. And that's kind of [unintelligible 00:36:41] – and I keep mentioning the Industrial Revolution, but in the Industrial Revolution that is what happened. If you look at the Civil Code of Quebec there's constant reference to [unintelligible 00:36:48] and to the traditional practices of the corporate actors of the time, if you will – less so to community groups, although that's very important at this, you know, far more advanced democratic stage, one would hope. So partnerships are essential.
In terms of making justice more user-friendly, absolutely. I think that has a lot to do with being nimble, and a lot of the pomp and circumstance, if you will, deters people from accessing justice in certain cases, so this is a wonderful contribution that technology can make. But the concern that I've been referring to – when you look at comparative [unintelligible 00:37:22], you look at certain jurisdictions that have embraced technology and artificial intelligence wholesale in terms of decision-making – and this is again, you know, we started the podcast with going from one extreme to the other – those courts have seen a side effect, and Professor Ben Liebman of Columbia's research points to this, of potentially deskilling human judges, where the human judges are given the task of oversight in order to ensure that the rule is properly applied. Because the technology will take the rule, right? And will apply it firmly. But the rule could be applied – and this is where my examples came from – in a manner that is overly rigid.
So we're trying to make things more user-friendly, but we have to be careful in so doing not to get the opposite effect, which is extreme rigidity. Right? Technology can be rigid. Anyone who's ever done any coding knows – you know, I'd be terrible at it because I'm full of clerical errors – you have one clerical error and the whole thing falls over, right? So then you say, OK, don't worry, we have human oversight – and we always talk about human oversight as the [unintelligible 00:38:22], the safety valve. But then the human kind of loses out, and that's what we've seen in systems that use a lot of artificial intelligence, in justice and elsewhere: the human is kind of afraid, he or she loses agency – as I said, deskilling – because [you're? 00:38:38] saying that if the algorithm said this is how the rule is applied, then, deep in our hearts, we all have this lack of self-confidence: who am I to know better? Who am I to say? And – this is the administrative side of implementing technology – if the judge is being monitored for efficiency, and that's a consideration that I raised over a decade ago in terms of separation of powers, if the judge is being monitored for productivity, does he or she really want to take the time to question? We see this even now: it's like, on appeal, what'll happen?
So that's just something that I want to flag. And flagging it doesn't mean, you know, that we're rejecting technology. I think it comes back to we're not here to say we adopt it or reject it. Just like for the Industrial Revolution nobody said, you know, we're going to keep making things by hand, although we've come to learn after decades of fast food that some slow food is also good, we've kind of learned to pick and choose. And some of it that is really going to be this continuing flow and struggle with reaching an equilibrium between these different considerations, to ensure that justice matches our democratic value system.
Yves Faguy: The thought of having judges undergo productivity monitoring is quite a scary one, I've got to say. I don't think anybody really wishes that upon the justice system or those who use it. Do we also have to be thinking in terms of money and the funding that goes into the justice system? Because there is obviously pressure on courts to get through their backlog of cases. And there is pressure on them to deliver justice in a more efficient way, and then there are obviously these concerns you raise about overly rigid efficiencies in the justice system.
How should the justice system be guided in applying resources to these problems?
Karen Eltis: That's an excellent question. And no one can abstract away financial considerations – they are a consideration, and when it comes to democratic institutions an important one. But I think we should be mindful of what we said at the beginning: technology is not a silver bullet. It depends how we frame technology. If we say, look, we can take a pre-packaged commercial platform whose mission, understandably, is to make a profit out of data and to better itself – to train its systems using data – and we take that, and we take the algorithms that are proposed, and we encourage the class of people that cannot afford lawyers to go online, regardless of their disputes, and get them settled, and then we pat ourselves on the back and say we saved a whole lot of money because this application was free, and because all these disputes [unintelligible 00:41:33] are solved and people all got their quote, unquote, day in court and were happy – but there was no community consulting, and then you end up with allegations of bias, and you end up with erosion of trust, and you end up with data scraping, and you end up with essential cyber infrastructure being menaced.
At the end, the bill will come in. So I think it's a short-term, long-term question. It would be naïve to say the finances are not an issue – they're a tremendous issue. But then you have to look at the fix, and what kind of savings the fix is engendering. In some cases you can certainly say, you know, this was wonderful – I mentioned BC, and there are certain ODRs where we can really congratulate ourselves and say, this is working wonderfully, people are satisfied, this is nimble, it cuts through – they don't have to miss work, and people who are disabled don't have to –
And there are so many wonderful things where we really can say, this is great, and this is not only money-saving but fostering and nurturing trust. So this is a wonderful thing. We've, you know, killed two birds with one stone.
But other matters we just have to soberly reflect upon, and that's what introducing this friction of reflection is all about: saying, you know, a few years from now, will we have saved both money and the legitimacy of the system, or will we have – in an unintended way, and it's certainly unintentional – just gotten carried away and shot ourselves in the foot through some unintended consequences? And this is not a question that has one answer; it has many answers. It's a process of reflection that we've thankfully begun, in an overdue manner, but that we are certainly attending to and need to continue thinking about. This is not an easy fix. There are a lot of appealing opportunities.
This is an exciting time, and there are a lot of wonderful opportunities that we can avail ourselves of. But we must avail ourselves of them in a really thoughtful and sustainable manner.
Yves Faguy: And one thing I'm curious about: in the No Turning Back Report, there was a recommendation that we should be looking to the Hague Institute for Innovation of Law for innovative solutions to improve the justice system. Are there any examples that come to your mind of interesting solutions that have emerged from the institute?
Karen Eltis: I think they've been very concerned with communities and with the everyday problems that are left behind – very basic disputes – an acknowledgement that the courts are handling very sophisticated disputes while many people are left behind, and that there are community solutions in deploying technology and ensuring that it does reach individuals. So I think that's been very important work that's been done and explored.
And I think, you know, if we look at the UK, if we look at Australia, there's been tremendous reflection on having this vision. I think in Australia they use Kotter's model for change, and they look at, you know, the various steps that we can go through in ensuring that the digital transformation is done in a way that is inclusive.
One of the trends that I've seen – and we were talking about budgeting – is hiring more technologists, getting technologists to come into courts. And certainly in terms of technological support that's a good thing. But I think, again – since my role seems to be one of caution – we have to be careful not to replace the legal minds, right? Because the law isn't just a technical process. And not to replace the community voices. So if we look at the funding: having technology experts in there as support is tremendous. But also having community advocates, as I've said, poverty advocates, and having judges and jurists feel empowered in terms of cybersecurity, right? Because cybersecurity is people, at the end of the day. Having judges be empowered to use these new technological resources – I think that's a tremendous matter to be invested in.
We don't want to be in a situation where judges kind of defer and say, well, I don't understand this stuff, let the tech guy look at it. Because there's an overlap now with the technology itself – we do use algorithms, there's no question, and we'll only continue to do more of it. We don't want judges, in terms of their independence, to be in a situation where they're saying, you know, I can't do that, let the tech people do it. Instead, you want them to be trained – not to understand a particular form of technology. I really like this quote about explainability with [unintelligible 00:46:31] technology: you don't need to know how the technology works, but you do need to be able to explain to the people you're using it for how you're using it for them, right? So, make sure that the judges understand how the technology is being used to support them, rather than the intricacies of the technology itself.
And it is really important for democracy to have this empowered and educated society – not necessarily technically literate, but understanding what role technology plays in the justice process.
Yves Faguy: You hear more and more that, you know, in the medical sector and the health sector they're hiring technologists onto research teams for, you know, curing cancer, or finding [cross talking 00:47:19] for multiple sclerosis, for example. And so they're participating in that research effort because it can make the research go faster. It's interesting.
I've got a final question for you. And I'm going to put you on the spot, because you are a law professor. Since you've written this report, I'm wondering if you can grade somehow, or give me on a scale of 1 to 10, how far Canada's courts have come since the release of the report. And how far have they got to go? I'm asking you to hand out a grade.
Karen Eltis: Yeah, so, I'll give that to my grading assistant. But – so I'll abdicate responsibility for an actual letter grade, given the discrepancy between institutions and their grading scales. And, you know, is it an advanced course, is it an introductory course? But just [unintelligible 00:48:04] I think the courts – and plus, we didn't set it out in the syllabus, so where's the procedural justice, you know, not even telling [unintelligible 00:48:15] that they're being graded.
But in all seriousness, I think justice and the courts have come such a long way, in times of unbelievable stress and pressure and crisis, to ensure that access continued – it's extremely laudable. So that is tremendous. And you can look at the judicial education, and just the events that I've participated in – and I'm sure there are many others that I have not – that have dealt with the topics we're broaching today: just in one month I had a panel on AI and justice and another – actually on the same day – on just videoconferencing tools and the like.
So there's tremendous work being done, tremendous work aimed at creating a vision, communicating the vision, and most importantly making it stick. But because of this incredible period – and I always use this line from a Princeton colleague – it's like meditating in Times Square. The ground is shifting under our feet. There's more work, right? So we've come such a long way – words cannot do it justice. But there's so much more to be done, because of this period of incredible, potentially unprecedented change that humanity generally is living through. And democratic institutions have a greater challenge than most of us mere mortals, because of the tremendous trust involved and the sensitivity – sensitivity is a word that recurs in most of the new normative [frameworks? 00:49:51] coming out – the sensitivity of the data, and their very unique role in fostering trust in democracy. And, you know, as Justice Barak said, democracy must fight with one hand tied behind its back.
So courts have that additional challenge where they could otherwise say, let's make our life easy and put these questions aside. But they don't. And they do try to ensure that the balance is met on a daily basis, all the while dealing with all the regular day jobs, so to speak.
So, tremendous, tremendous work being done all around.
Yves Faguy: Many thanks, Karen Eltis, for your time today. I appreciate the conversation.
Karen Eltis: It's been a wonderful pleasure, thank you so much.
Yves Faguy: You can hear this podcast and others on our CBA podcast channel on Spotify, Apple Podcasts, Google Podcasts, and Stitcher. Please rate and review us if you can, and subscribe to receive notifications for new episodes. And to hear some French, listen to our [unintelligible 00:50:56] podcast. If you enjoyed this episode, please share it with your friends and colleagues. And if you have any comments, feedback or suggestions of topics you'd like to hear us discuss, feel free to reach out to us on Twitter @CBAnatmag and on Facebook.
Also, check out our coverage of legal affairs at NationalMagazine.ca. And thank you all for listening to this episode of Modern Law. Catch you next time.