As a follow-on from our previous blog: with more technology being used to treat patients, user interface (UI) design is key to making devices safe and effective. Patients aren't the only ones who need good UIs, however. Healthcare professionals (HCPs) are more reliant than ever on technology to do their job, and this reliance is only set to rise. They, more than most, need informative, error-proof UIs, especially considering that errors kill 12,000 patients a year in the UK and no doubt cause complications for many more. Here are a few design recommendations tailored to devices intended for healthcare professionals.
Know your audience
It is tempting to think that because you are developing a UI for professionals, background knowledge of the device is a given. This is somewhat true; however, the HCP using it may often be an inexperienced nurse rather than a doctor. Much depends on the environment in which the device will be used. If the interface is for an MRI scanner, it is safe to assume that those using it are trained on the device. If the device is an infusion pump likely to be used by a wide variety of staff, it is worth refining the UI so it is easy enough for inexperienced users to understand too.
The Siemens MAGNETOM Prisma comes with a suite of compatible software allowing advanced clinicians to use the MRI scanner to its full potential. Image Credit: Gizmodo
Balance efficiency with the need for safety
As mentioned above, some of your users are more likely to be experienced with the device. These are people for whom efficiency and functionality are hugely important. As hospitals are often time-sensitive environments, advanced features that empower users, such as keyboard shortcuts and interface customization, should be embraced. Do not let task completion times become the top priority, however. The human body doesn't have an undo button; safety and error prevention should always come first in medical UIs. Make sure users review settings before beginning a procedure, and prompt them to confirm any entered value that falls outside the expected parameters.
Be careful with error alerts
This is a tricky one. On the one hand, if something is wrong with a device, an HCP should be informed immediately in case it poses a risk to the patient. On the other hand, overloading HCPs with alarms can be just as dangerous. Alarm fatigue is one of the biggest hazards facing the modern hospital, and it's easy to see why. Last year a single hospital recorded over 2.5 million alarms in a single month (and 88% of them were false positives). With this volume of alarms, HCPs can become desensitised quickly, especially when most of the alarms are 'crying wolf'. When designing an interface, alerts and alarms should be thought about intelligently. Consider using different tones depending on the severity of the problem, or sending a notification to a nearby HCP's smart device instead of sounding an alarm.
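One way to think about tiered alerting is as a simple mapping from alert severity to notification channel, so that only genuinely critical events trigger an audible alarm. The sketch below is purely illustrative; the severity levels and channel names are hypothetical and not drawn from any real alarm standard or device:

```python
from enum import Enum

class Severity(Enum):
    INFO = 1      # routine event, no interruption needed
    CAUTION = 2   # needs attention soon, but not a ward-wide alarm
    CRITICAL = 3  # immediate risk to the patient

def route_alert(severity: Severity) -> str:
    """Return the notification channel for a given alert severity."""
    if severity is Severity.CRITICAL:
        return "audible_alarm"      # loud, distinct, must be acknowledged
    if severity is Severity.CAUTION:
        return "smart_device_push"  # notify nearby staff quietly
    return "event_log"              # record silently for later review

print(route_alert(Severity.CAUTION))  # smart_device_push
```

Routing low-severity events away from the audible channel is one way to keep the alarm count (and the resulting desensitisation) down.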
Consistency of information between devices
Ever noticed how most websites generally follow the same template these days? This is because we are more likely to navigate and understand a website if we already have an idea of how it works. The same is true for medical devices. If HCPs approach a device they haven't used before, they will use their knowledge of existing devices to help them understand the new one. If there are conventions for a particular type of interface, it is best to embrace them when designing your own.
Consider the context of use
This is a more general recommendation. It is unlikely that your device will be used in isolation. Often it will be part of a larger hospital ecosystem containing other medical devices. An HCP may be able to operate an infusion pump perfectly well in isolation, but with a ventilator, a heart rate monitor and many other devices demanding attention, they may rush or overlook vital parts of the process. What happens if a user gets distracted mid-way through using a device and doesn't complete the setup process? What happens if the user finishing a process is not the same one who started it? Observing users in their work environment will help you understand how devices fit into the broader system and workflow.
An intensive care unit for newborn babies. Try to imagine paying attention to all the devices in the above picture. Image Credit: Children’s hospital.org
As with patient medical devices, high costs, long development times and intense regulation mean most healthcare companies can't innovate at the same speed as the latest start-up, and hospitals can't upgrade devices every two years like consumers (if the iPhone cost over $10,000,000, I doubt you'd be in a rush to upgrade to a 6s either). Change needs to happen, however. In an industry that increasingly relies on technology, good user interfaces are vital for the future of healthcare.
Economic driver for home healthcare
It’s no secret that our healthcare bill is increasing. Worldwide, healthcare spending is at a record high, and an expanding and aging population means this shows no sign of slowing down any time soon.
One way of coping with this demand is to get patients who need specialist equipment to treat themselves at home. Healthcare professionals (HCPs) have been prescribing patients small medical devices such as blood pressure monitors and blood glucose monitors for years; however, we are now starting to see larger, more complex devices such as nebulisers, infusion pumps and even dialysis machines given to patients for self-treatment.
This trend has clear benefits: The more patients can care for themselves, the less time they need to spend in a hospital (or other healthcare facility), thus freeing up precious capacity in the healthcare system (both HCP time and hospital beds).
A patient carries out a dialysis procedure in his own home. Image credit: redding.com.
Risk of use errors
Unfortunately, as more patients start using more complex devices at home, the chance of use errors increases, and the results can be deadly. An error whilst using Facebook could result in you uploading the wrong photo; an error whilst using an infusion pump could result in a lethal overdose. Understandably, preventing such use errors is taken very seriously. The Food and Drug Administration (FDA) in the US, for example, requires usability testing for healthcare devices prior to granting regulatory approval and won't approve any device that could cause a fatal use error.
Key role of the user interface
User interfaces (UIs) have a key role to play in reducing the risk of use errors in a patient-used device, especially now that more patients use increasingly complex equipment. In fact, the benefits of a good UI go beyond reducing errors. People generally like using products that feel easy to use, and creating a device that patients feel good about has the potential to improve their well-being, increase treatment adherence and lead to better overall health.
Unfortunately, UIs of patient-used devices available today are often lacking. This is due to a range of factors, including a lack of competition as well as the fact that a device is often selected by the HCP, not the patient who is the ultimate user. To complicate matters more, the payer for the device is typically another party such as a private insurance company or national healthcare provider (e.g. NHS). As a result, parameters such as functionality, reliability and value are still often seen as more important.
However, change is on the horizon, and improving interfaces doesn't have to be hard. Often, following simple design principles can improve the user experience. A few examples include:
Speak in a language users will understand
Patients generally won’t have the same level of understanding as medical professionals. All text should be in plain language and should explain terms and phrases that patients may not be familiar with. This is especially important if something goes wrong. If an error occurs, the device should display what happened clearly and suggest a solution for the user.
Reduce reliance on instructions
Of course, patients should always be familiar with the instructions before using any medical device. Unfortunately, though, instructions can get damaged or lost, and patients often do not think they need to read them. To overcome this problem, always design interfaces that provide guidance and help for first-time users.
Reduce cognitive load
Every extra piece of information a patient has to remember when using the device increases the chance of a use error. Make vital information visible at all times and try to cut down on the amount of information a user has to enter. If a patient is likely to use only one dosage setting, store it in internal memory instead of making them enter it every time.
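The idea of remembering a patient's usual setting can be sketched as a simple save-and-recall routine. This is a minimal illustration, not a real device implementation; the file name, units and default value are all hypothetical:

```python
import json
import os
import tempfile

# Hypothetical location for the device's stored settings.
SETTINGS_FILE = os.path.join(tempfile.gettempdir(), "device_settings.json")

def save_last_dose(dose_ml: float) -> None:
    """Persist the most recent dose so the patient need not re-enter it."""
    with open(SETTINGS_FILE, "w") as f:
        json.dump({"last_dose_ml": dose_ml}, f)

def load_last_dose(default_ml: float = 0.5) -> float:
    """Offer the stored dose as the pre-filled value on the next use."""
    try:
        with open(SETTINGS_FILE) as f:
            return json.load(f)["last_dose_ml"]
    except (OSError, KeyError, ValueError):
        return default_ml  # fall back to a safe default on first use

save_last_dose(1.2)
print(load_last_dose())  # 1.2
```

On each subsequent use, the interface would pre-fill the remembered dose and ask the patient only to confirm it, cutting one more opportunity for a mis-keyed entry.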
Reduce capacity for errors
Even the most experienced patients are still capable of making the occasional error. Anticipate where errors are likely to occur and design the device in a way that prevents them from happening. If a patient enters a value far higher than usual, display a warning before they proceed. Control placement plays a large role as well; keep important controls such as "activate" or "power" in locations where they will not be pressed by accident.
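A range check like the one described above can be expressed as a small guard function. This is only a sketch under assumed numbers; the tolerance threshold and parameter names are illustrative, not taken from any real infusion device:

```python
def check_dose(entered_ml: float, usual_ml: float, tolerance: float = 0.5) -> bool:
    """Return True if the entered dose deviates from the patient's usual
    dose by more than the tolerance fraction (0.5 = 50%), meaning the
    interface should ask the user to confirm before proceeding."""
    deviation = abs(entered_ml - usual_ml) / usual_ml
    return deviation > tolerance

print(check_dose(entered_ml=5.0, usual_ml=1.0))  # True: far above usual, warn
print(check_dose(entered_ml=1.1, usual_ml=1.0))  # False: within normal range
```

The point is not the arithmetic but the interaction: the device never silently accepts an outlier value; it surfaces the anomaly and makes confirmation an explicit step.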
The easypod™ auto injector separates the main interface controls on the front of the device from the injection activation button on the top. Separating the controls means patients are less likely to trigger an injection accidentally or before they are absolutely ready. Image credit: Merck Serono.
Designing a winning patient experience
A final word… User interfaces should always be tackled in terms of the broader patient experience rather than designed in isolation. Even the best UI in the world will not help if a medical device has poor physical ergonomics or is too impractical for a patient with limited mobility to use. Everything from how a patient communicates with their doctor to how they receive their medication should be taken into account when designing a device, and using a Human-Centred Design process can help you achieve this.
FMEAs (Failure Modes and Effects Analysis) are a common tool used by device manufacturers to help members of R&D identify risk mitigation strategies to embed within their process during the product development stages. FMEAs traditionally focus on system/component failures that can affect the operation of a device, whilst UFMEAs (Use Failure Modes and Effects Analysis) are intended to help members of R&D focus on use-related errors. The term 'Use Error' has recently been introduced to replace the commonly used terms 'Human Error' and 'User Error'; the change was prompted by a high number of manufacturers attributing errors to users rather than investing in fixing error-prone device design.
Over the last few months we have seen several presentations and articles still attributing poor device usability to 'Human Error'. However, the terms 'Human Error' and 'User Error' insinuate that the fault lies with the user for using the device incorrectly, rather than with the design of the device causing the user to use it incorrectly. 'Use Error' refers to using a device in a way that is not intended by the device manufacturer. It describes a mismatch between user input (which includes cognition and environmental factors) and the device being used; in other words, a mismatch between how users think, perceive, rationalise, learn and understand, and how this affects the way they interact with a device. When conducting use-related risk analyses, people often struggle to pin down what a 'Use Error' actually is.
Arguably, in some instances errors may actually be caused by the user, due to a momentary lapse in memory or a simple mistake caused by an exceptional circumstance. But how do you handle this type of error? This is a common question which often pops up when discussing use-related errors. How can you possibly cover off every mistake a user might make, no matter how weird and wonderful it may be? The answer is you can't (or maybe you could, with great difficulty and a year of brainstorming use scenarios!). Trying to enumerate the multitude of possible mistakes users can make will not only drive you mad; most of them will also rate very low in terms of likelihood of occurrence. When thinking about 'Use Errors' you need to know where to draw the line. There are a number of things you can do to help decide. Firstly, look at the level of severity: think about all of the tasks users need to perform in order to use a device and select those tasks which result in higher severity if the device is used incorrectly. Secondly, do not think of all the weird and wonderful things a user might do (abnormal use); think about the errors you can foresee them making (reasonably foreseeable misuse) with the device you are evaluating. These lessons are often hard to embed in our thinking because, as humans, we have a natural tendency to dwell on the worst possible outcomes. By switching from the term 'Human Error' to 'Use Error' you start to remove some of this unwanted thinking and get people to consider the important issues users have with their devices that could potentially be designed out. This is the key change in thinking needed to conduct use-related risk analyses effectively.
Users interact with devices in many different ways, and their actions can be influenced by numerous factors such as their experience with similar devices, the intuitiveness of the device, the quality of training given to them, the clarity of instructions, and their professional expertise and knowledge. By switching from 'Human Error' to 'Use Error', and by getting stakeholders within R&D (management, engineers, designers and testers) to understand and empathise with their target end users' challenges and needs whilst effectively considering use-related errors, manufacturers will be better equipped to design products and services which not only benefit the user but can also provide a competitive advantage over similar devices in the market.
In Human Factors, the art of asking a good question, one that is non-leading yet to the point, simple yet scenario-driven, open yet bounded to stop people going off on a tangent, all whilst trying to get the user to answer as honestly as possible, sounds like a breeze, doesn't it? Think again!
As a Senior Human Factors Consultant I moderate formative and summative evaluations worldwide and it never ceases to amaze me how differently I have to ask questions depending on where I am or who I speak to. Wherever you are there are a number of influences which affect the way you communicate with others. Whether this is age, culture, social status, language or outside influences, the key is being able to learn and adapt your approach to the individual user(s) you are evaluating to find the best way to communicate. I am reminded constantly of this wherever I go!
Instead of asking a question and getting feedback or opinions directly, I had to think of a way to get feedback and opinions without people realising! How can you do this?
Projective techniques are a great way to tap into an individual's subconscious. By getting a user to role-play or imagine a scenario and then asking them to finish it, you allow the user to project their feelings and thoughts onto someone or something else. For example, a question in this scenario could be: "If your laptop was a fully interactive futuristic robot, what would this robot's function be?" Once the user thinks about their response, you can probe them to explain their answer. For example, if the user said "The robot would have the ability to instantly find someone in the building", you could start to analyse this as a possible need: the user may want their laptop to have instant messaging capabilities, or in-built GPS synced to other people's smartphones or smartwatches in the building. Similarly, a response like "The robot would be able to time travel" could mean that the user wants a better way to search their history or pull up information more easily without trawling through hundreds of tedious, long-winded menus. Once these types of findings are elicited, you can probe and check your understanding.
Sometimes it's not the question but the method by which you ask it, or the adaptation of an existing question to suit the needs of the user. A lot can be said for using scenarios to get the user in the frame of mind of the question you are about to ask. Done correctly, a scenario can be really powerful, not only in retrieving an answer but also in making a user feel at ease and more comfortable in an evaluation so that they can be more open and honest. Done incorrectly, with a scenario that is too long or too detailed, you can leave a user feeling confused, frustrated, annoyed and unable to answer you. The key here is to remember that people tend to listen a lot at the start, a little in the middle and a lot at the end of a question or scenario.
So, suppose you want feedback on how intuitive a new shower tap is and you give a scenario like: "I want you to imagine that you want to take a shower. You have recently cleaned the bathroom and all of your toiletries are laid out as you like them. Please can you tell me what you would do next?" This may sound like a strange scenario, but the point is that it is often tempting to fill a task scenario with so much detail, in an attempt to frame the user's mindset and embed them in the situation, that we add information that only distracts the user and is not actually relevant. For example, by mentioning that the bathroom has been cleaned, the user might think they have to talk about cleaning; by mentioning toiletries, the user might think the next step is to use those toiletries. Instead, try: "I want you to imagine you are about to take a shower. You are standing in the shower cubicle; please can you show me how you would interact with the shower?" The key here is to self-evaluate your use scenario. Try to answer it in as many ways as possible to see how a user could misinterpret it. Doing this allows you to generate clearer, more effective task scenarios, and the same applies to questions.
During my internship with PDD, I participated in the LUMA Institute + PDD’s Human-Centred Design (HCD) for Innovation workshop that the PDD HCD team runs several times a year in London.
I initially had doubts about how a two-day workshop could help me tackle the issues I face in clinical settings. From reducing the anxiety of patients as they wait for care to improving medication adherence when patients are on their own, the challenges that clinicians face are varied, complex and multidimensional. Even trying to make a change is a challenge in itself, as many clinicians are pessimistic about what can be done after seeing so many unsuccessful attempts to improve the situation.
Although hospitals have adopted new methods before, most notably Lean and Six Sigma approaches to help with quality improvement initiatives, such methods are often difficult to apply because they require extensive training.
What surprised me in the HCD workshop was how quickly the methods could be applied, and how they stimulated creative engagement through a combination of individual thinking and collaboration. A common problem I've seen with traditional idea generation methods is the lack of true collaboration: the group defers to the leader's idea quite early, resulting in fewer ideas and limiting the creative input of individuals. A key strength of the HCD methods is that they allow critical insights to come from all participants. Instead of an exercise in groupthink, participants build on each other's contributions to collectively develop better ideas. By breaking down innovation into manageable chunks, HCD tools empower different stakeholders.
The tools can also be used to tackle difficult scenarios. For instance, several of the ethnographic techniques, such as contextual inquiry and walk-a-mile-immersion, encourage members of the design team to identify critical issues by immersing themselves in the environment of the people they’re designing for, enabling them to build empathy for the complex ecosystems that people live, work and play in. The HCD approach also emphasises communicating and capturing ideas in a visual way to foster collaboration and visual thinking, and to identify priorities so that effort is directed toward the most important areas.
As part of the workshop, I got the LUMA handbook of HCD methods and a set of HCD Planning Cards. I used them to map out the sequence of methods I’m going to use in a workshop to generate ideas for improving the hospital service for patients from the inner city community at an Edmonton hospital. Hopefully I’ll be able to give an update on the outcome in this blog.
Pharmapack 2014, Europe's main exhibition for pharma and healthcare packaging and drug delivery, has been another successful event for the PDD medical team. Our well-connected Head of Medical, Alun Wilcox, and I, Sergio Malorni, attended, and we had many meet-ups with past and current clients to discuss their latest developments and initiatives. Likewise, with our simple and clear message, our stand attracted pharma representatives new to us wanting to know how we could respond to their many product development and usability engineering challenges. Let's see what develops for us in the coming year. As for the exhibition itself, the appropriate size and quality of suppliers and conference talks made this event well-liked; the show was focused, which made it attractive for decision-makers to attend.
Interestingly, while we were setting up the stand, we couldn't help noticing that the design consultancies exhibiting at the show were almost all British. English was the predominant language, but the show was in Paris and the majority of suppliers and attendees were non-British, so there was no reason for continental European consultancies to stay away. Or was their absence, and I say this without a sense of nationalism, a testament to the focus and quality of services provided by British consultancies in this sector?
As for trends, the desire of pharma companies to develop new devices is still strong, driven by the many outstanding challenges that exist in drug delivery: improving efficacy, compliance and adherence, convenience, cost and user experience. Thus, we saw new products for pain reduction such as Terumo's new Nanopass thin needles, better-looking packaging and electronics-based delivery systems.
We also saw many companies selling customisable platform technologies, and veterinary medicine made its presence felt in many ways, including a conference talk and the award-winning FlexiBag & FarmPack developed by Virbac, aimed at improving ergonomics and safety for large-volume drug delivery.
Image credit: FlexiBag & FarmPack developed by Virbac
One of the highlights for us was the consistently positive reaction to our conceptualisation and development work on easypod™ for Merck Serono, even after seven years on the market. The device sample in our display case stopped many engineers, pharma reps and academics in their tracks; they wanted to learn more about it and how we went about gaining insights and creatively translating them into a commercially successful product.
Image credit: easypod™ delivery device, designed by PDD for Merck Serono
Referenced several times during conference talks as a first-of-its-kind and effective product, the easypod™ also prompted one passer-by exhibiting at the show to tell us how he uses it every day on his daughter and how great it is. It's hard to get a better accolade than a personal story in which our work positively and meaningfully improves the life of a loved one.
We’d like to thank all who came to our stand and hope to see you again soon.
Surgery is a fascinating branch of medicine; rooted in science, yet still very much a craft and (whether we like it or not as patients) frequently dependent on the improvisation skills of the surgeon and their team. Surgery has come a long way since its early days, but the pace of innovation has been anything but slow. On the contrary, new technologies are creating new possibilities to improve the safety and efficacy of procedures, and cost pressures are creating an imperative to achieve more with less.
As one of PDD’s experts in the field of surgery and interventional medicine, I will be looking at some of the trends occurring in this exciting field through a series of blog posts over the coming months. Today though, I will start by reviewing some of the pioneering work that has laid the foundation for where we are today.
Image credit: Modern high tech hybrid operating room from Maquet Getinge Group
Decades later, it would be South African surgeon Dr Christiaan Barnard who, in 1967, performed the first successful heart transplant, another major milestone in the history of medicine.
Orthopaedic surgery is another branch of surgery that has very high visibility. One of its pioneers is Sir John Charnley, a British orthopaedic surgeon. He is recognized as the founder of modern hip replacement, creating the procedure known as total hip replacement, in which both the ball and the socket of the hip are replaced.
The first pacemaker was implanted by Swedish cardiologist and engineer Dr Rune Elmqvist, together with Professor Åke Senning, in 1958. Their patient, Arne Larsson, suffered from a cardiac arrhythmia that had led to a drastically reduced heartbeat, causing frequent fainting episodes from which he had to be revived. For the first implant, the inventors encased two electrodes and the necessary electronics in an epoxy cup to protect the components from the 'adverse environment' of the body. The first pacemaker lasted only three hours, but it was quickly replaced, and the prototypes that followed lasted longer and longer. Mr Larsson lived another 44 years and used a total of 26 pacemakers.
Image credit: The first fully implantable pacemaker, image source: Wikipedia
Laparoscopy, also known as 'keyhole surgery' or minimally invasive surgery (MIS), has been around for over 100 years, with the first laparoscopic procedure in humans reported by Dr Hans Christian Jacobaeus of Sweden in 1910. The benefits to the patient of less invasive surgery are a reduced risk of infection and faster recovery times; however, laparoscopic surgery requires the surgeon to master an advanced level of skill. With experience and improved technology, such as better mechanisms, better materials and better cameras for seeing through the keyhole, MIS has gained ground and established itself as an integral part of the modern surgical landscape.
Robotic surgery has come onto the scene gradually over the past 20 years, pushed by technical possibilities and driven by the desire to address some of the shortcomings of both open and laparoscopic surgery. The pioneer of robotic surgery has arguably been the company Intuitive Surgical. Having acquired a number of surgical robotic technologies invented at NASA and MIT, amongst others, the company went on to successfully bring robotic surgery to the masses with its da Vinci surgical system. While the clinical benefits of robotic surgery remain the subject of heated debate, no one can dispute the fact that the use of robots has expanded what is possible clinically and paved the way for future innovations.
Image credit: Da Vinci surgical robot from Intuitive Surgical
When I graduated with a BSc in Ergonomics (Human Factors Design) in 2011, it was clear that the subject was neither widely known nor greatly recognised; in fact, I am constantly "corrected" by people who assume I studied Economics at university. However, it was also apparent that certain industries placed great value on the area I had studied for three years: mainly energy, rail, aviation and defence, all big industries with a lot to lose if something were to go catastrophically wrong. It is therefore baffling that the same attention to human factors isn't apparent in healthcare.
A scientific field that has benefited aviation for 50 years is only just beginning to be recognised and valued in a medical context, where its application could prevent a vast amount of injuries and save countless lives. It’s incredible to think that statistically you’d have to fly on a plane for 36,000 years before you’d incur a serious injury due to a preventable error. But, if that serious injury then landed you in hospital, you’d have around a 1 in 300 chance of death due to a preventable error. Why?
In March 2013 the BBC aired an episode of Horizon entitled ‘How to avoid mistakes in surgery‘ where Dr Kevin Fong, an Anaesthetic Lead at UCLH, highlighted a surgical case study which showcased how small and simple errors align to result in a preventable death. He then investigated systems and processes used in other safety critical industries which could be applied in the operating theatre to prevent such situations.
Throughout the programme Dr Fong touches on the training in the fire service, the teamwork of an F1 pit crew and the checklists used in aviation; all are safety critical industries and all use human factors within and beyond the examples he provides. He then demonstrates how lessons learned in these cases can, and have, been applied in and around operating theatres, with astounding results.
It still raises the question: with such remarkable results, why the lack of human factors in healthcare?
Dr Ken Catchpole (co-author of the study into Formula 1 pit-stop and aviation models for use in patient handover) points out a few of the contributors to the lack of clinical human factors in his TEDx talk in Santa Monica last year; it is definitely worth a watch here. He talks through resistance from doctors and nurses, easily solved drug packaging design problems, the surgical hierarchy, the failure to discuss 'near misses' and other basic examples where applying human factors would make a huge difference.
Figures from Patient handover from surgery to intensive care: Using Formula 1 pit-stop and aviation models to improve safety and quality (Catchpole et al. 2007) – Image credit: clinicalhumanfactorsgroup.com
Despite proving its worth repeatedly in other industries, despite significant recommendations from the chair of an NHS trust (Sir Stephen Moss; his report can be found here), and despite impressive results when applied within a clinical setting, human factors still does not have its rightful place on the clinical map. Healthcare as a whole is staring into the face of huge challenges; however, IEC 62366 and current FDA direction ensure that consultants like me can at least promise a human-centred process that will make medical devices and equipment safe and easy to use, a satisfying step in the right direction. Meanwhile, I look forward to the day the healthcare industry recognises clinical human factors for what it's worth. Lives.
A global medical device company recently approached PDD with a request to identify product innovation opportunities*. Our human-centred approach for such projects typically involves full immersion in the environment of the user and this project was no different. In this post, I will share some of my experiences of working in the hospital environment, the techniques used, and just generally what I enjoyed from my time in the field.
*(Note: to protect the identity of the client, the clinical space and product are kept purposely vague in this post).
In my role at PDD, I get involved in a lot of medical projects, mainly because of my experience in designing a variety of medical devices, but also because I am simply passionate about improving healthcare and medical products. In partnership with the client company, we designed a research protocol that would take me to hospitals across Europe and the United States. This was great because it allowed me to work in a variety of languages I am fluent in and thus better connect with users and stakeholders. Interestingly, I didn’t expect that my expertise on the British Royal family and in particular “The Baby” would be tested so many times, but I’ll get back to that later.
While I am an engineer by training, I have always relished project phases in which I can immerse myself in a user’s environment. This project gave me extended exposure to both the user and the product: I spent weeks in hospitals talking to doctors, nurses, technicians and administrative staff, as well as observing procedures, which gave me a much better feel for how this particular product is used and what motivates this particular class of doctors. There are certainly similarities across medical specialities in the hospital, many of them stemming from the particularities of the hospital as a working environment. However, each clinical speciality also has its own culture: the personalities of orthopaedic surgeons and neurosurgeons, for example, are typically quite different! Of course, each product also has specific features that lead to different use patterns, and there is no better way to identify these patterns than by spending time with users and seeing them in action.
Image credit: PDD
No shortage of coffee in this nurses’ common room – note the scale of the box compared to the jam jar!
As I mentioned previously, communication with stakeholders is a key component of opportunity discovery programmes. While some insights can be gained from observation alone, conversations – formal, or more frequently informal, while grabbing a coffee or a sandwich – are an important means of identifying opportunities. I am French-German but also speak English and Spanish, so this project was a perfect fit: with research sites in France, Germany, Spain and the US, I was able to ask questions in the local language and thus had access to all staff, not just the anglophones. Speaking the local language, unsurprisingly, really helped to put interviewees at ease, and it led to a lot of softer insights relating to the product and its packaging, as well as the touchpoints with the client company’s sales and customer support teams.
In preparing for the research, our team had identified all the key stakeholders in the hospital relevant to this product and procedure. We also reached out to key opinion leaders for insights and opinions on current trends in the field (key opinion leaders, or KOLs, are high-visibility doctors who tend to influence clinical practice and are respected in the clinical community). To guide interviews, we put together questions that we felt might be particularly relevant to potential areas of opportunity. Of course, before the research itself, this can only be a best guess, and questions and focus areas were refined as we learned more about the product and its use.
Image credit: Johns Hopkins Hospital
The stunning exterior of the recently completed Sheikh Zayed Tower at Johns Hopkins Hospital in Baltimore, one of our research sites.
As long as participants agreed to it, we used as much photography and video as possible. This is really useful for two reasons: 1) it allows for revisiting a moment that seemed interesting (maybe in order to focus on a member of staff’s body posture, or hand movement, if ergonomics are an area for innovation), and 2) it is a great way to share an insight with the client company. It’s very satisfying to show a picture or video in which a user struggles with a certain aspect of the client’s product and see the faces of the client team change as they realise this is something that can be improved. Indeed, seeing problems as opportunities is the right frame of mind for such a research programme, as the findings provide the information needed to make informed decisions on future developments.
So what about the baby? Quite possibly I was the last one to find this out, but Americans are seriously obsessed with the British Royal family. As a result, in every hospital I visited, staff would ask me about the happy event and were somewhat disappointed when I couldn’t impress them with the nuggets of knowledge they assumed I was privy to as a resident of this island. Fortunately, our client didn’t care much about our qualifications in this domain. Rather, they wanted to know what we had found out, and after weeks spent immersed in the hospital, we could comfortably point them to a range of innovation opportunities we had identified, which will soon form the basis for a completely new development programme. Happy times!
I recently watched a TED talk by Joseph Pine entitled ‘What Consumers Want’ (2004), where he talks about an “important change to the very fabric of a modern economy” and takes us through the evolution of ‘economic value’. The journey begins with a commodity-based economy, advances to a goods-based economy, and then progresses to a service-orientated economy. He points out that as a population we are moving past pairing economic value with service delivery, towards regarding experiences as the “predominant economic offering”.
The term ‘Patient Experience’ is sometimes used in association with a patient’s satisfaction with their journey through a healthcare service, such as the NHS. This is indeed part of it, but what about the rest of the picture? And why is Patient Experience important to us?
Patients are increasingly becoming empowered users: medical consumers with complex personal tastes and needs. These users aren’t first and foremost patients; they are consumers, and as a result they expect the same quality of experience from a medical device as they do from a mobile phone.
We can therefore apply the same principles, tools and knowledge we have of designing great User Experiences to satisfy the preferences of this increasingly sophisticated medical consumer. Fundamentally, it’s about the interaction between a patient and all aspects of a medical device, service or manufacturer. Just like User Experience, it is individual, and as increased competition brings patients greater product choice, it provides an enormously powerful competitive edge.
An example of a product that gives more: the iBGStar by Sanofi Aventis is a blood glucose meter that connects to the iPhone. Not only can you take blood glucose readings, but the accompanying app from Sanofi enables you to track your readings over time (along with other useful functions). CC image courtesy Pearlsa on Flickr
From a frequent user (or patient) perspective, research has identified that people don’t want their condition to take over; they certainly don’t want it advertised, and they don’t want to think about it. Instead they want their treatment to be non-intrusive and fit seamlessly into their lives, whether at home, at work, or on the go. They are looking for solutions that help minimise the impact of their condition on daily life. If we can create products and systems that are discreet, simple and convenient, then we’ll answer these needs and enrich the patient experience.
Think about the stark difference between a pharmaceutical or medical manufacturer that doles out a product or service to meet your basic medical needs, and one that provides a product or service tailored with you in mind, that goes above and beyond ‘great service’ to make sure you have the best experience possible when you’re feeling most vulnerable. Which would you prefer?