Safety and Regulation of Digital Technologies

The number of apps, medical devices and healthcare software products grows by the day, but how do we keep up with them all, and how can we keep them safe?

Please note that the views expressed in this post are the personal views of the author and do not represent the views of the Health Informatics Unit or the RCP.

With the ever-increasing number of apps, medical devices and healthcare software, how exactly do we keep up with them all and regulate them to ensure safety and security? What about all of the web tools that provide medical advice, or even seemingly simple calculations, that are then used by healthcare professionals or by patients themselves? What if these tools are insecure or flawed and give incorrect information that contradicts one's professional training and knowledge? Regrettably, information given by machines is often trusted more than professional judgement.
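To make this concrete, consider how easily a seemingly simple calculation can go wrong. The sketch below is purely hypothetical (in Python, with made-up numbers and function names, referring to no real product): a dose calculator that silently assumes weight in kilograms produces a dangerously wrong answer if the same patient's weight is entered in pounds, while a safer version makes the unit explicit and rejects implausible inputs.

```python
# Hypothetical illustration only: a 'simple' dose calculator whose
# implicit unit assumption makes it unsafe. No real product is implied.

def dose_mg(weight, mg_per_kg=5):
    # Assumes weight is in kilograms, but nothing checks or says so.
    return weight * mg_per_kg

print(dose_mg(70))   # 350 mg: correct if 70 means kilograms
print(dose_mg(154))  # 770 mg: badly wrong if 154 is the same patient in pounds


def dose_mg_safe(weight_kg, mg_per_kg=5):
    # Safer: the unit is in the parameter name, and implausible values
    # (likely unit mix-ups) are rejected rather than silently accepted.
    if not 0.5 <= weight_kg <= 300:
        raise ValueError("weight_kg outside plausible range - check units")
    return weight_kg * mg_per_kg
```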

On 18 July, a thought-provoking meeting was held at the Royal College of Physicians on "Medicine and Machines: Big Data, Digital Technologies & Healthcare Regulation", exploring the question ‘Is the digital agenda safe and effective?’ The meeting was the third in a series of such workshops and included presentations and an open panel session with experts in law, medicine and health technology.

The morning kicked off with a presentation from Duncan McPherson, a Senior Clinical Adviser with the Medicines and Healthcare products Regulatory Agency (MHRA). The MHRA is responsible for ensuring that medicines and medical devices work and are acceptably safe, in line with EU legislation in this area. It has developed a roadmap to tighten the assessment routes for digital technologies. Seems pretty straightforward thus far. However, MHRA classifications are based on conformity self-assessment: organisations themselves can add the CE mark to their product to declare that it conforms to the legal requirements. The resources the MHRA would need to check and license every product and app itself would be prohibitive. To put this into perspective, around 60 new healthcare apps are created every day! One must ask – is the CE mark fit for purpose for digital technology, a relatively new field?

Many app developers like to include disclaimers stating that they shall not be held liable for any harm caused by the use of their product, which is provided 'as is'. Such a disclaimer is not legally binding, but as Harold Thimbleby pointed out on the day, it reveals the developers' attitude. If a company can be held accountable for a serious flaw in a physical medical device, why shouldn't the developers of a health app – which is essentially a medical device – be held accountable just the same?

The consensus in the workshop was that they should be held accountable. The market is over-saturated with low-quality apps created by small suppliers without any clinical knowledge: apps that are poorly tested, if tested at all, whose code is never reviewed, and which contain errors that can cause, and demonstrably have caused, serious harm. But the question remains: how should quality and safety be assured, and by whom?

Perhaps a 'quality mark' form of assurance would be feasible, meaning less scrutiny for organisations with a proven track record of developing digital technologies in healthcare. Realistically, however, it is more practical to regulate the product than the developer, given the numerous difficulties involved; the developer may, for example, be an outsourced team based overseas. One thing that is clear is that technical standards are not the answer, as these would become outdated quite quickly – in some cases within weeks!

A good question raised at the meeting was: why does it take so many years of education and training to become a qualified clinician, yet any software developer can come up with an app that will be used in healthcare, with unforeseen consequences? The panel members suggested that qualified computer scientists should work in teams and co-design software in close collaboration with clinicians. Though this may seem like common sense, it is not currently common practice, and even then it does not guarantee that the product will be completely safe and free of flaws.

A comment made by barrister Stephen Mason is fitting here: IT systems are not reliable. They do return usable results, but we should not presume they are perfect. They are created by people writing code, and people are not perfect. Regrettably, the legal profession continues to work on the presumption that computer systems are reliable, and Stephen would like to see this presumption abolished. And rightly so. Stephen and Harold Thimbleby provided numerous examples, from historical legal cases and from clinical incidents, showing that this presumption is incorrect.

Harold suggested that regulators should hold healthcare software developers to a high standard, aligned with the requirements of the aviation industry. This was countered by the view that the NHS cannot be compared to a commercial sector such as aviation, where every bolt is patented and can be traced to its source of manufacture. And it's true: though we need solutions and robust processes for regulating digital health, it is important to recognise that we can't simply transplant a robust process from a commercial field into the public sector. Marty Chamberlain talked about the social ecology of regulatory activity, and it is critical to note here that health providers are motivated by patient care, not patents or profit. But one thing we definitely can take from the aviation industry, where iterative agile development based on trial and error is not an option, is simulation. Simulated environments allow testing without risk to patients, and there is potential for health informatics / user experience / digital health innovation labs to fill this gap and provide a very welcome solution.

Additionally, whenever we talk about innovation and change we must not forget the crucial aspect of cultural shift. Wherever change is introduced, resistance will surely be there, along with its friends: opposition, denial, anger, frustration, hostility. So we also need a transparent system for recording and learning from incidents, as well as near misses, in medical devices, apps and software. Post-it notes and workarounds won't do anymore. But we need to incentivise such culture change with a feedback loop that benefits those who enter the data: such reporting could form part of a feedback mechanism that propagates general learning as well as supporting those directly involved. And what if we further supplement this by adding a requirement for 'black box' recording within the product itself? Another takeaway from the aviation industry, and one that is much needed.
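To sketch what 'black box' recording inside a product might look like, here is a minimal, hypothetical example (the class and event names are my own, not something proposed at the meeting): an append-only event log in which each entry is hash-chained to the previous one, so any retrospective tampering shows up when the chain is verified – closer in spirit to a flight recorder than to a Post-it note.

```python
import hashlib
import json
import time

class BlackBoxLog:
    """Append-only, tamper-evident event log: each entry carries a hash
    that chains it to the previous entry, so retrospective edits are
    detectable when the chain is verified."""

    def __init__(self):
        self._entries = []
        self._last_hash = "0" * 64  # genesis hash for the first entry

    def record(self, event_type, details):
        """Append an event (e.g. a user input, a calculation, an override)."""
        entry = {
            "timestamp": time.time(),
            "event_type": event_type,
            "details": details,
            "prev_hash": self._last_hash,
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self._entries.append(entry)
        self._last_hash = entry["hash"]

    def verify(self):
        """Recompute the hash chain; returns False if any entry was altered."""
        prev = "0" * 64
        for entry in self._entries:
            if entry["prev_hash"] != prev:
                return False
            body = {k: v for k, v in entry.items() if k != "hash"}
            payload = json.dumps(body, sort_keys=True).encode()
            if hashlib.sha256(payload).hexdigest() != entry["hash"]:
                return False
            prev = entry["hash"]
        return True

# Example: log a calculation and a clinician override, then verify the chain.
log = BlackBoxLog()
log.record("input", {"weight_kg": 70, "drug": "exampledrug"})
log.record("calculation", {"dose_mg": 350})
log.record("override", {"user": "clinician", "dose_mg": 300})
assert log.verify()
```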

And what if we also tackle regulation from the procurement side? The panel and the audience agreed that thoughtful central procurement can lead to a step-change in the quality of products. Advertisements for medical devices, including software, must be accurate and regulated. In addition, Jeremy Wyatt presented an overview of the current regulatory bodies that could have a role to play, and suggested that we need to agree appropriate quality metrics, evaluate tools against a gold standard using test cases, and then assign safety ratings to products.
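Evaluating tools against a gold standard using test cases lends itself to a simple illustration. The sketch below is an assumption-laden toy, not an agreed methodology: a hypothetical BMI calculator under assessment is run against reference cases with clinically verified answers, and the pass rate within a tolerance becomes the raw material for a safety rating.

```python
# A minimal sketch of gold-standard evaluation: run a tool under test
# against reference cases with known correct answers and report a
# pass rate. The calculator and the test cases here are hypothetical.

def bmi_under_test(weight_kg, height_m):
    """The tool being assessed: body mass index, weight / height^2."""
    return weight_kg / (height_m ** 2)

# Gold-standard cases: inputs plus the clinically verified answer.
GOLD_STANDARD = [
    {"weight_kg": 70.0, "height_m": 1.75, "expected_bmi": 22.9},
    {"weight_kg": 50.0, "height_m": 1.60, "expected_bmi": 19.5},
    {"weight_kg": 95.0, "height_m": 1.80, "expected_bmi": 29.3},
]

def evaluate(tool, cases, tolerance=0.1):
    """Fraction of gold-standard cases the tool answers within tolerance."""
    passed = 0
    for case in cases:
        result = tool(case["weight_kg"], case["height_m"])
        if abs(result - case["expected_bmi"]) <= tolerance:
            passed += 1
    return passed / len(cases)

print(f"Pass rate: {evaluate(bmi_under_test, GOLD_STANDARD):.0%}")
```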

Bod Goddard, RCP President-elect, talked about patient safety and the main new challenges we are facing, which include genomics, AI, new drugs and personalised medicine. Bod said AI is very high on his list of priorities – “This is big, and a key thing for the College”. It was reassuring to hear that.

Bod also outlined the key questions for the digital agenda:

  1. What are the potential benefits – for patients and staff?
  2. What are the likely adverse events from AI?
  3. How can we most effectively train doctors to get the best from digital technology?

We looked at a number of aspects of the safety and regulation of digital technologies: development, assurance, monitoring, training and distribution. Whilst there are many issues to consider, the landscape is shifting, regulatory thinking is changing and potential solutions are emerging. It was agreed that an interdisciplinary approach to digital safety risks is needed, involving a collaborative partnership between regulators, professional bodies and the technology industry to facilitate understanding, accelerate change and identify potential solutions to take forward.

What a productive day and a great conclusion to a series of three workshops. A report detailing a ‘future-visions’ blueprint for the policy and practice development of the role of Big Data, text mining, machine learning and AI in the professional regulatory sphere will be published in February 2019.

If you are interested in getting involved in tackling this or other pressing issues in digital healthcare, do get in touch with the RCP Health Informatics Unit at informatics@rcplondon.ac.uk, sign up to our newsletter and follow us on Twitter. The hashtag used for this event was #healthdatareg.