An extract taken from Chris Vincent's knowledge piece that discusses the growing use of digital platforms such as social media to understand how people really use medical products.
The problem: medical device usability
Medical device manufacturers are required to demonstrate compliance with agreed criteria relating to the safety, quality and performance of medical equipment. As part of these requirements, recent focus has shifted towards minimising use-related risk – for example, errors that arise as a result of “confusing or unclear on-screen user instructions” (AAMI, 2010). For medical devices, there are standards and guidance that can be applied in terms of design requirements that are proven to mitigate error; testing that shows there are no use-related concerns (usability engineering); or post-market vigilance, which allows monitoring of performance in terms of safety and usability.
For usability engineering, one approach revolves around simulated use testing – testing a device out of context but with real-world users. Simulation is used because testing a prototype in a real-world context (for example, a hospital) may be unacceptable when the technology has yet to be proven. The limitation of simulated use testing is that an artificial environment may not capture the subtleties of real-world use (for example, workarounds and adaptations) – important factors to take into account when seeking improvement opportunities. Although there is benefit in simulated use testing, there is also benefit in understanding how users really work with equipment. Current practice recognises this need and recommends that those involved in the manufacture of medical devices consult incident reporting systems. For example, the FDA MAUDE (Manufacturer and User Facility Device Experience) database contains records relating to the safety and performance of medical devices, submitted to the FDA by manufacturers, health care professionals, health care facilities and members of the public. There were over 2 million submissions in 2016. There is potential to use this information to identify known use-related problems and factor them into the design of medical equipment. By understanding issues with existing technology, the design of new technology can be improved.
Figure 1: Infusion pumps can be prone to number entry errors.
How does it work at the moment?
In terms of currently recognised practice, IEC 62366-1 and IEC 62366-2 – commonly recognised usability engineering standards that apply across the product lifecycle – identify a series of external resources that can be used to identify known problems with a user interface. As part of this process, the manufacturer of a medical device is expected to review use-related issues during and after the design process. For example, if a manufacturer is producing an infusion pump that contains a number entry feature, and a database such as MAUDE contains examples of issues relating to this feature, the manufacturer would be expected to identify and analyse those incidents, use them as input into the risk management file and validate the design accordingly. Similarly, the FDA final guidance “Applying human factors and usability engineering to medical devices” (FDA, 2016) states that identification of known use-related problems can come from a variety of sources, including:
- Customer complaints
- Sales teams
- Previous human factors and usability engineering studies
- Journal articles, conference proceedings
- Relevant internet sites such as MAUDE
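In practice, a resource such as MAUDE can also be consulted programmatically: openFDA exposes MAUDE reports through a public device adverse event endpoint. The following is a minimal sketch, not an endorsed method – the product name and narrative term are hypothetical examples, and the field names should be checked against the current openFDA device/event documentation:

```python
# openFDA indexes MAUDE device adverse event reports (a real FDA
# service); this query builder is an illustrative sketch only.
OPENFDA_DEVICE_EVENT = "https://api.fda.gov/device/event.json"

def build_maude_query(generic_name, narrative_term, limit=10):
    """Build an openFDA query URL joining a device name and a free-text
    narrative term with AND (spaces inside phrases become '+')."""
    phrase = lambda s: '"%s"' % s.replace(" ", "+")
    search = "device.generic_name:%s+AND+mdr_text.text:%s" % (
        phrase(generic_name), phrase(narrative_term))
    return "%s?search=%s&limit=%d" % (OPENFDA_DEVICE_EVENT, search, limit)

# Hypothetical query: infusion pump reports mentioning number entry.
url = build_maude_query("infusion pump", "number entry")
# The URL can then be fetched (e.g. with urllib.request.urlopen) and the
# JSON 'results' list reviewed as input to the risk management file.
```

The point of the sketch is that a single scripted query can retrieve candidate records at scale, leaving the analyst to do the interpretive work the standards describe.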
In principle, reviewing these sources can provide input for the design process; in practice, there are concerns that such information can be hard to analyse. 62366-2 refers to this issue:
“In some cases, use-related problem reports do not explicitly cite or describe USER INTERFACE problems. Rather, they describe an event without providing substantial details that would suggest there is a USER INTERFACE design issue. Moreover, searching databases using terms such as human factors or USABILITY ENGINEERING cannot result in findings. Consequently, analysts can need to conduct a broader search for problems and analyse each case to determine if they suggest a pertinent problem to be avoided.”
Learning from incident data is not straightforward. For example, mapping between the “free-text” narratives describing an incident, the root causes of the incident and the potential for improvement can be challenging. Sometimes the necessary information is simply not contained in the record. Take Figure 2, for example – there is no way of knowing what occurred or how the design of the equipment could have prevented it. It follows that those tasked with reviewing this data are often challenged when it comes to finding the right information; they may not be able to review all records and may have trouble interpreting content. Incident reports may focus on the network surrounding the device, but not the device itself. They may allude to a Root Cause Analysis but not describe it in full detail. Different readers may interpret a report differently, and information may be unclear, ambiguous or conflicting.
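The “broader search” that 62366-2 calls for can be approximated by expanding each design concern into the everyday wording that narratives actually use, since reports rarely say “usability”. A minimal sketch – the term groups and the sample narrative are invented for illustration, not real MAUDE content:

```python
# Hypothetical term groups: each design concern is expanded into the
# plain language a free-text incident narrative is likely to use.
USE_RELATED_TERMS = {
    "number entry": ["decimal", "keyed in", "wrong dose", "digit"],
    "alarm": ["alarm", "beeping", "silenced", "muted"],
    "display": ["screen", "display", "unreadable", "confusing"],
}

def flag_use_related(narrative):
    """Return the design concerns whose expanded terms appear in a
    free-text narrative (case-insensitive substring match)."""
    text = narrative.lower()
    return sorted(
        concern
        for concern, terms in USE_RELATED_TERMS.items()
        if any(term in text for term in terms)
    )

# Invented narrative in the style of a MAUDE record:
report = "NURSE STATES THE PUMP ALARM WAS SILENCED AND THE SCREEN WAS CONFUSING."
flag_use_related(report)  # → ['alarm', 'display']
```

A crude filter like this cannot interpret a record, but it can narrow thousands of reports down to a set small enough for the per-case analysis the standard expects.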
Figure 2: Example MAUDE records relating to an infusion pump.
Taking the analogy of a spotlight illuminating real-world practice: the focus of the beam determines the potential to “see” design improvements. If our spotlight is weak, unfocused or pointing in the wrong direction, we might miss opportunities for improvement. This paper proposes ways that the spotlight can be strengthened – for example, how the analysis of systems like MAUDE can be supplemented by review of other public domain sources such as blogs, discussion forums and social media.
The use of social media has gained recognition in other industries where there is a need to understand the safety implications of unconstrained or uncontrolled use, i.e. the real-world behaviours that occur when a product is released onto the market. Analysis for the pharmaceutical industry has shown that healthcare-related discussions posted on Yahoo! or Google groups contain descriptions that meet the criteria for an adverse event in approximately 1 in every 500 posts (Nielsen, 2012). The analysis of social media offers advantages over existing techniques, as many more records are produced, at greater frequency, across a more diverse population. This data can be used to inform positive as well as negative claims about a product. For example, in an article exploring the use of social media to collect safety and efficacy information in the pharmaceutical industry, Martin Goldman outlines the potential to leverage the 38 million social media users in the UK to understand more about how people really use a product (Goldman, 2016). In this case the data provided evidence of a positive user experience, while also allowing refuting evidence to surface. In terms of using this type of data to inform the design of new products, there are various approaches, including:
- Analysis of on-line content (e.g. use of resources such as blogs)
- Online surveys utilising social media / feedback via apps
- Analysis of content that can be used to track user behaviour in real time
These techniques integrate in different ways across various parts of the usability engineering process. For example, designers may check a discussion forum to understand variations in workflow associated with the product; they may sensitise themselves to issues that users experience; they may review online content to inform risk analysis.
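As a sketch of the first technique – analysis of online content – posts could be screened for adverse-event-style language before manual review, in the spirit of the 1-in-500 finding above. The trigger phrases and posts below are invented for illustration:

```python
# Invented trigger phrases suggesting a post may describe an adverse
# event; a real screen would be tuned to the product and its users.
TRIGGERS = ("stopped working", "wrong dose", "error message", "had to restart")

def screen_posts(posts):
    """Split posts into (flagged for manual review, remainder) by a
    case-insensitive check for any trigger phrase."""
    flagged = [p for p in posts if any(t in p.lower() for t in TRIGGERS)]
    remainder = [p for p in posts if p not in flagged]
    return flagged, remainder

# Illustrative posts, not real social media content:
posts = [
    "Loving the new pump, battery lasts for ages",
    "Pump showed an error message mid-infusion and I had to restart it",
]
flagged, remainder = screen_posts(posts)  # flagged holds the second post
```

Both outputs are useful: flagged posts feed risk analysis, while the remainder can still evidence a positive user experience, as in the Goldman example.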
This extract has been taken from Chris Vincent's knowledge piece 'Learning from real world use – complementing medical device experience databases through analysis of social media'. To read and download the full version please click here.