In the latest episode of Impetus Digital’s Fireside Chat, I sit down with Anthony de Fazekas, Head of Technology & Innovation at the international law firm Norton Rose Fulbright. Among many other things, we discuss the role of IT law in healthcare and Anthony’s journey towards his current role. We also dive into weighty topics such as the legal and ethical implications of implementing artificial intelligence in the healthcare industry, as well as Anthony’s predictions for the “new normal” post-COVID.
Here is a sneak peek of our conversation:
Q: From a legal perspective, what are some of the ethical concerns that companies should be aware of before they start exploring and delving into different AI applications?
A: It is sometimes important to step back, at least from a company perspective, because what we’re seeing, especially across larger enterprises, is increasing adoption of advanced information technology, including AI and machine learning. Part of what we’re seeing is that a lot of investment is being made and value is being created by having, for example, a more robust approach around analytics. Digitization of processes is another area. Sometimes, AI is being used extensively within a very specific application; other times, it is being layered across the entire digital systems of the enterprise. So one of our philosophies is to take a holistic approach and look at how the company is approaching innovation, including information innovation, and identify the key areas where we need to examine the risks and help mitigate them from a legal and ethical point of view.
Certainly, as you point out, AI is being used extensively within healthcare companies. Whether digital healthcare or pharmaceutical, there’s extensive adoption of AI. In terms of the key risks that people should be looking at, there are many, but I like categorizing them into three buckets. One, we have the risks that relate to the impact on the finances of our society and the loss of work. The second category that I bucket AI-related legal and ethical risks into is really errors in decision-making that are contributed to by AI, such as bias, but also breaches of bioethics, for example. A lot of the time, people focus on bias alone, but there can be other problems in decision-making that impact not just the business but its stakeholders in a negative fashion. The third category has to do more with losses, errors, and liability caused by the AI decision-making.
Those are the three buckets that I like to look at, because otherwise you end up with a very, very long list. The specific use cases of AI, the risks associated with them, and the ways to mitigate those risks from a legal and ethical point of view are going to be very specific to the circumstances. So rather than having a laundry list of risks, I think it’s better to have those three categories and then look in a rather granular way at what the solution is, what the role of AI is, and which different entities are contributing in a way that’s going to impact the result. It could be the data, it could be the technology. Some of it could be internal, some of it could be external. It’s very important to map it out and then look at those risks…
For more of our discussion, you can watch the whole Fireside Chat with Anthony de Fazekas, or listen to the podcast version, below.
To check out previous Fireside Chats and to make sure that you don’t miss any future updates, subscribe to our newsletter or follow us on YouTube, LinkedIn, Twitter, Facebook or our podcast. If you enjoyed this episode, kindly leave a review on iTunes.
About Impetus Digital
Impetus Digital is the spark behind sustained healthcare stakeholder communication, collaboration, education, and insight synthesis. Our best-in-class technology and professional services ensure that life science organizations around the world can easily and cost-effectively grow and prosper—from brand or idea discovery to development, commercialization, execution, and beyond—in collaboration with colleagues, customers, healthcare providers, payers, and patients.