Yes, sometimes a bad user experience is the goal!

Do you realize how deeply your life is intertwined with technology? You live in a world where you can check the news, weather and your email from the comfort of your zero-gravity bed each morning. Once you get up, your smart thermostat can cool your room to just the right temperature while you get ready for work. When it’s time to leave, your self-driving car can take you there. Our experiences with technology are so convenient, aren’t they?

 

Now imagine this. Your 3-year-old daughter tells you that there’s a monster in her room. She points to a green light on the smart camera that you installed on the wall near her bed. You don’t take her seriously at first. Then one day, you walk into her room and hear pornography, along with the voices of hackers, playing through the camera’s intercom.

 

This is exactly what happened to Tara Thomas, a California mom who had been using a Nest Cam to monitor her daughter. What the hackers accomplished was nothing special. To break into the device, they used simple software that most novices could figure out. When the story went public on the national news, people wondered why Nest couldn’t protect the Thomas family from this security threat and total violation of privacy.

 

The thing is, they could have. Very easily. Nest is a subsidiary of Alphabet, one of the largest tech companies in the world. They are security experts. But like most companies, Nest knows that protection from hackers and data breaches comes at a cost – and that cost is bad user experience.

 

Think of it this way: If your company sells a product or service, the last thing you want is for customers to feel that it’s too hard to use. If it’s too complicated, no one will want to buy it. There’s a term for this: “friction,” meaning something that gets in the way of customers achieving their goals. In the case of Nest, the friction would come from inconvenient steps, designed to prevent hacks, that users would have to take to log in to their devices.
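To make the trade-off concrete, here is a minimal sketch, in TypeScript, of a device login flow with and without the extra, friction-causing step. All the names are invented for illustration; this is not Nest’s actual API, and a tiny in-memory “backend” keeps the example self-contained.

```typescript
// Minimal sketch of "friction" in a device login flow. All names are invented;
// this is not Nest's actual API.

type Credentials = { email: string; password: string };

const accounts = new Map<string, string>([["parent@example.com", "hunter2"]]);
const pendingCodes = new Map<string, string>();

function checkPassword({ email, password }: Credentials): boolean {
  return accounts.get(email) === password;
}

// Frictionless: one step, but a leaked or reused password is all an attacker needs.
function loginFrictionless(creds: Credentials): boolean {
  return checkPassword(creds);
}

// Friction: a verification code is sent out of band (to a phone or inbox in real life).
function sendVerificationCode(email: string): string {
  const code = Math.floor(100000 + Math.random() * 900000).toString();
  pendingCodes.set(email, code);
  return code;
}

// The extra step users must take (the "friction") is entering that code.
function loginWithSecondFactor(creds: Credentials, codeFromUser: string): boolean {
  if (!checkPassword(creds)) return false;
  return pendingCodes.get(creds.email) === codeFromUser;
}

const code = sendVerificationCode("parent@example.com");
console.log(loginFrictionless({ email: "parent@example.com", password: "hunter2" }));           // true
console.log(loginWithSecondFactor({ email: "parent@example.com", password: "hunter2" }, code)); // true
```

The second version costs a legitimate user a few seconds. It costs an attacker who has nothing but a stolen password everything.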

 

Which is worse: A bad user experience or a bad reputation?

The problem with companies trying to create “frictionless” experiences is that these often carry dire consequences of their own. When a high-profile incident such as the Thomas family’s reaches the news, the brand suffers. Even worse, people stop trusting the brand and the technology. This has happened time and again with all types of products and services.

 

Take, for example, the two recent Boeing 737 Max 8 plane crashes. Both crashes are suspected to have resulted from the behavior of Boeing’s new Maneuvering Characteristics Augmentation System (MCAS) on its new product, the Max 8. It has also been reported that Boeing downplayed MCAS in pilot manuals and training because it wanted the introduction of the new plane to feel as simple and effortless as possible. If that is true, the attempt to avoid friction cost 346 lives. As a result of these accidents and the subsequent Max 8 grounding by the FAA, Boeing’s reputation has suffered tremendously.

 

In instances with consequences this severe, friction should not be considered a negative thing. Sometimes, a bad user experience is a good thing. Having to take extra steps, as inconvenient as they may initially seem, can protect users’ security and even save their lives. If a complex technology can truly improve lives and people have a real need for it, then we must design it in a way people can adapt to easily. There is, after all, almost always a way to reduce friction while maintaining security. All that’s needed is a deliberate focus on Human-Centered Design.

 

Human-Centered Design can help achieve an easy-to-use interface for users and a complicated backend that causes friction for unintended users.

How, you ask? Well, Human-Centered Design must be considered bimodally from the very start of the design process. The project team needs to ask itself: “Where do we need to ensure easy-to-use interfaces, and where do we need to introduce difficulty to protect the integrity and safety of the product?”

 

Let’s first look at how Human-Centered Design can help ensure easy-to-use interfaces, especially when it comes to complicated products. And speaking of complicated products, no industry stands out quite like the medical technology (“med tech”) industry, which creates complex software and devices. Med tech has been plagued over the last several decades by product recalls, data breaches and wrongful patient deaths. Recalls alone cost the industry an estimated $2.5 billion to $5 billion annually.

 

Electronic Health Records (EHRs), however, are among med tech’s biggest and deadliest design offenders. Poor and overly complex EHR interface design has led to incorrect medical histories, personal information, lab and test results, and prescription information being linked to patient records, resulting in an unacceptable number of patient injuries and deaths. According to Kaiser Health News, 3,769 safety-related incidents in 2018 were linked to EHRs and other healthcare IT issues.

 

It’s interesting to note that studies have shown 80% of software users use only 20% of a product’s features! This is an important lesson for designers: good design doesn’t mean a product must do everything for its users. It needs to do one thing: solve one problem better than anything else on the market. If it can do that, it will be a success, even if there’s a learning curve.

 

To achieve this with EHRs or any other complex product, designers must first weed out features that don’t add value. If a majority of intended users wouldn’t use a feature, as evidenced through thorough user testing and validation, then it needs to go, no matter how cool or groundbreaking the function or technology may seem to the designers! Many times, these extra bells and whistles are what cause complexity and confusion in the first place. If complex features can’t be eliminated, designers can also hide them: if people won’t use a feature very often, it can be kept somewhere that doesn’t distract from the primary task.
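As a rough illustration of what “hiding” can look like in practice, here is a small progressive-disclosure sketch in TypeScript. The feature names are invented and not taken from any real EHR.

```typescript
// Sketch of progressive disclosure: rarely used features stay available but hidden
// behind an explicit extra step. Feature names are invented, not from any real EHR.

type Feature = { id: string; label: string; advanced: boolean };

const features: Feature[] = [
  { id: "chart",      label: "Open patient chart",   advanced: false },
  { id: "prescribe",  label: "Write prescription",   advanced: false },
  { id: "bulkExport", label: "Bulk data export",     advanced: true },
  { id: "auditLog",   label: "Raw audit log viewer", advanced: true },
];

// The default menu shows only what most users need for the primary task;
// power users opt in to see the rest.
function visibleFeatures(showAdvanced: boolean): Feature[] {
  return features.filter(f => !f.advanced || showAdvanced);
}

console.log(visibleFeatures(false).map(f => f.label)); // uncluttered default view
console.log(visibleFeatures(true).map(f => f.label));  // full view, one click away
```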

 

Designers can also find ways to reuse components of an interface for different purposes, which creates a more consistent experience for users. If users learn how to perform one task, they can apply that knowledge to a second task built from the same pieces. Over time, they become familiar with the product and know what to expect from it. This is not only true for EHRs… it’s true for all tech products, including Nest, MCAS, and more.
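To illustrate, here is a small sketch of one generic confirmation component reused for two different tasks. The component shape and the task names are hypothetical.

```typescript
// Sketch of one reusable confirmation component serving two different tasks.
// The component shape and task names are hypothetical.

type ConfirmStep<T> = {
  title: string;
  summary: (item: T) => string;     // same layout every time, different content
  onConfirm: (item: T) => void;
};

function runConfirmStep<T>(step: ConfirmStep<T>, item: T): void {
  // A real UI would render a dialog; logging stands in for it here.
  console.log(`[${step.title}] ${step.summary(item)} (Confirm? y/n)`);
  step.onConfirm(item);
}

// Task 1: sending a prescription uses the component...
runConfirmStep(
  { title: "Send prescription", summary: (rx: string) => `Send "${rx}" to pharmacy`, onConfirm: () => {} },
  "Amoxicillin 500 mg",
);

// ...and Task 2: ordering a lab test reuses it. Users who learned the first
// dialog already know exactly how this one behaves.
runConfirmStep(
  { title: "Order lab test", summary: (lab: string) => `Order "${lab}" for patient`, onConfirm: () => {} },
  "CBC panel",
);
```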

 

But designers must also realize that easy-to-use interfaces are just one part of avoiding bad user experience. The second part is integrity and safety. Human-Centered Design teaches that design requires bimodal, yin-and-yang thinking. Designers can’t make products so easy, or so hard, to use that they introduce risk. There must be balance. There must be an understanding of the goals and needs of users and, in certain cases, the motivations of unintended users.

 

In the case of the Nest Cam, designers understood that users wanted a device that was simple to use, but they failed to account for the risk posed by unintended users. Remember: designers are responsible not only for foreseeable use but also for foreseeable misuse. If disaster occurs through a misuse that could have been predicted and the company ends up in a product liability suit, it will be held liable for the incident and its impact on end users.
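One common way to design for foreseeable misuse is to add friction only when behavior looks suspicious. The sketch below shows a generic account-lockout pattern in TypeScript; it illustrates the idea and is not Nest’s actual implementation.

```typescript
// Sketch of designing for foreseeable misuse: a generic lockout after repeated
// failed logins. Legitimate users rarely notice it; an attacker guessing stolen
// passwords hits a wall. This is an illustration, not Nest's implementation.

const MAX_ATTEMPTS = 5;
const LOCKOUT_MS = 15 * 60 * 1000; // 15 minutes

type AttemptState = { failures: number; lockedUntil: number };
const attempts = new Map<string, AttemptState>();

function isLockedOut(account: string, now = Date.now()): boolean {
  const state = attempts.get(account);
  return state !== undefined && state.lockedUntil > now;
}

function recordFailure(account: string, now = Date.now()): void {
  const state = attempts.get(account) ?? { failures: 0, lockedUntil: 0 };
  state.failures += 1;
  if (state.failures >= MAX_ATTEMPTS) {
    state.lockedUntil = now + LOCKOUT_MS; // friction aimed only at the misuser
    state.failures = 0;
  }
  attempts.set(account, state);
}

function recordSuccess(account: string): void {
  attempts.delete(account); // well-behaved users never feel the mechanism
}
```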

 

Once in a while, a faulty design does give way to a positive outcome.

 

This probably happens more than you think… and a prime example can be found in a discontinued Medtronic insulin pump. Each day, type 1 diabetics must count, down to the gram, the amount of carbohydrates in everything they eat. Those who wear insulin pumps must program how much insulin to dispense based on that intake. That was the routine until 2014, when a group of hackers discovered a security flaw in certain (now discontinued) Medtronic insulin pumps.

 

The pump hardware and continuous glucose sensors were both available to diabetics, but there was no way to connect the two. So hackers used a flaw in the system to override the pumps with their own algorithm, one that could calculate insulin doses from real-time glucose data. Today, thousands of diabetics are hunting down these discontinued pumps on Craigslist and Facebook so they can experience much easier insulin control. Think about that for a minute: users are literally taking it upon themselves to improve the usability of the pump based on their needs!
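To give a sense of what such an algorithm does, here is a drastically simplified sketch of a correction-dose calculation in TypeScript. Real systems, whether DIY or commercial, model insulin activity over time, glucose trends and many more safety limits; the parameter values below are placeholders, not dosing guidance.

```typescript
// Drastically simplified sketch of a correction-dose calculation.
// Illustrative only; the numbers are placeholders, not dosing guidance.

type GlucoseReading = { mgPerDl: number; minutesAgo: number };

interface PatientSettings {
  targetMgPerDl: number;     // desired glucose level
  correctionFactor: number;  // mg/dL dropped per unit of insulin
  maxBolusUnits: number;     // hard safety cap
}

function suggestCorrectionBolus(
  reading: GlucoseReading,
  insulinOnBoardUnits: number, // insulin still active from earlier doses
  settings: PatientSettings,
): number {
  // Stale sensor data: no automatic dose.
  if (reading.minutesAgo > 10) return 0;

  const excess = reading.mgPerDl - settings.targetMgPerDl;
  if (excess <= 0) return 0;   // at or below target: never dose

  // Units needed to bring glucose to target, minus what is already working.
  const rawDose = excess / settings.correctionFactor - insulinOnBoardUnits;

  return Math.min(Math.max(rawDose, 0), settings.maxBolusUnits);
}

// Example: a reading of 220 mg/dL against a 120 mg/dL target, a correction factor
// of 50, and 0.5 units still on board suggests roughly 1.5 units, capped by maxBolusUnits.
console.log(suggestCorrectionBolus({ mgPerDl: 220, minutesAgo: 3 }, 0.5,
  { targetMgPerDl: 120, correctionFactor: 50, maxBolusUnits: 5 }));
```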

 

While doing this is dangerous, it highlights the importance of post-market user evaluation, even for “bad designs” that have been discontinued. Evaluation is always needed to understand how users are actually using products and where they may be working around a design that doesn’t quite fit their needs. In this case, bad user experience can be a good thing. Smart companies can use post-market evaluations as opportunities not only to ensure and improve safety, but also to improve their products or create new ones that meet previously unidentified user needs.
