Why the Future of Healthcare Interfaces Is Fluid

The tools we use in our daily work have evolved rapidly, following deep human-computer interaction (HCI) research. In healthcare, however, that research has largely not found its way into practice. Most healthcare professionals still rely on memory, experience, pen and paper, and difficult-to-use healthcare software. The future of medical tools needs to be a catalyst rather than a hindrance. In this piece, we attempt to untangle the set of tools used in healthcare today and trace their natural evolution.

Evolution of Doctors’ Tools

Today, the most impactful tools in a medical professional’s hands are the pen and the scalpel. The pen began to give way to the keyboard about 40 years ago as EMRs (Electronic Medical Records) started to emerge. EMRs now serve as the primary system of record for every healthcare institution; the first EMR was developed back in 1972 by the Regenstrief Institute. Fast forward to today, and hospitals are rapidly evolving their infrastructure to attain the coveted Stage 7 on the Electronic Medical Record Adoption Model to elevate the quality of care. Yet most EMR systems still run as desktop applications. It is this landscape that Professor Atul Gawande describes in his article “Why Doctors Hate Their Computers.”

The computers were brought in for better visibility and collaboration amongst healthcare professionals. Today, however, they act as a hindrance rather than a catalyst. Professor Gawande observes that the systems which promised to increase doctors’ mastery over their work have, instead, increased their work’s mastery over them. Doctors actively, viscerally, volubly hate their computers.

What More Do Doctors Need?

Other industries have moved rapidly through their own technology transformations, adopting tablet computers and other new human-computer interfaces. But healthcare professionals keep tapping away at complex interfaces and opening yet another tab.

We started asking: what if doctors could assign follow-ups, pull up references, and annotate pathology and radiology results right on the same interface where they collect patient symptoms?

If we look at the future of work, our everyday tools are changing: we are going from Bristol boards hanging in our offices to Notion pages that update in real time. The decisions made by our incredibly smart doctors affect lives. In that vein, clinicians today deserve best-in-class tools that help them deliver care and navigate a world that switches seamlessly between the virtual and the physical. A clinician in a hospital today collaborates with many people: patients, peers and more. Collaboration needs to be at the core of healthcare delivery to address the changing landscape of patient care.

Why Can’t Doctors Truly Collaborate?

Our survey of the healthcare software landscape suggests that much of it was designed on assumptions about the user’s mental model. In a GUI (Graphical User Interface) or WIMP (Windows, Icons, Menus, Pointer) world, software tends to be rigid in structure, and most healthcare professionals are steered by the GUI of their healthcare information systems. A typical clinic enters patient records through a long series of data-entry forms in its EMR, and when clinicians later look through those records for patterns, the GUI forms enforce a particular UI/UX design pattern, and with it a particular mental model.

Consider a doctor using a GUI-based EMR. To find relevant information about a patient, the doctor opens a search form, enters keywords, scans a list of results, sorts or filters them further, and copies data from one window to another to piece together a diagnosis. While seeing the patient, they also pull up any radiology and pathology reports, so a typical diagnostic session can mean juggling search results and image annotations in two dialog windows side by side. The rigidity enforced by dialogs and windows (and WIMP patterns in general) simply does not match the fluidity with which humans think and infer.
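To make that rigidity concrete, here is a minimal sketch of what a form-driven record search looks like behind the scenes. The field names, types, and endpoint are purely illustrative and not drawn from any particular EMR; the point is that every question the clinician can ask has to fit a fixed schema, and anything outside that schema means another form, another window, another copy-paste.

```typescript
// Hypothetical sketch of a form-driven EMR search: the query shape is fixed
// up front, so the interface, not the clinician, decides which questions
// can be asked in one place.
interface PatientSearchForm {
  patientId?: string;
  keyword?: string;        // free-text keyword box on the form
  dateFrom?: string;       // ISO date, one filter widget per field
  dateTo?: string;
  recordType?: "lab" | "radiology" | "pathology" | "note";
}

interface RecordSummary {
  recordId: string;
  recordType: string;
  date: string;
  snippet: string;
}

// Each call answers exactly one form's worth of a question; correlating a
// lab trend with an imaging annotation means running two searches in two
// windows and copying data between them by hand.
async function searchRecords(form: PatientSearchForm): Promise<RecordSummary[]> {
  const params = new URLSearchParams(
    Object.entries(form).filter(([, v]) => v !== undefined) as [string, string][]
  );
  const response = await fetch(`/api/records/search?${params}`); // illustrative endpoint
  return response.json();
}
```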

Every doctor operates with their own mental model, one that stems from their early days as medical students, when they go through a process of thinking, investigating, and pondering. As the popular adage goes, “the medium is the message,” so it is worth looking at the most widely used medium among medical practitioners: their notebooks. Sketching, scribbling, note-taking, the spatial organization of notes and images: all of these are activities that come naturally to medical students as they learn and compare notes to find patterns. Yet this medium gets neglected in conventional healthcare software design.

Following the GUI revolution, a new user interface paradigm has begun to emerge: the NUI (Natural User Interface). The term was coined by Steve Mann to describe interfaces that build on the everyday actions and gestures we naturally perform, rather than the usual GUI and WIMP paradigm. Examples of NUI include digital pen-touch interfaces and gesture-based input systems. Yet the practical applications of NUI have barely reached doctors in their daily work. Many medical students take notes on their iPads, but those apps lack the sophisticated search, note organization, and summarization features that would give them a real edge over the manual pen-and-paper notebook.
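As a sketch of what “pen-touch” means at the input layer, the standard web Pointer Events API already distinguishes a pen from a finger or mouse and reports per-point pressure. The snippet below is a minimal illustration, assuming an HTML canvas with id "notes"; it is not tied to any particular note-taking product.

```typescript
// Minimal pen-aware input capture using the standard Pointer Events API.
// Assumes an HTML canvas with id="notes"; everything else is illustrative.
const canvas = document.getElementById("notes") as HTMLCanvasElement;
const ctx = canvas.getContext("2d")!;

canvas.addEventListener("pointerdown", (e: PointerEvent) => {
  // Start a new stroke wherever the pen or finger lands.
  ctx.beginPath();
  ctx.moveTo(e.offsetX, e.offsetY);
});

canvas.addEventListener("pointermove", (e: PointerEvent) => {
  if (e.buttons === 0) return; // only draw while the pen or finger is down
  if (e.pointerType === "pen") {
    // Pressure is reported in [0, 1]; map it to stroke width so the ink
    // responds to how hard the writer presses.
    ctx.lineWidth = 1 + e.pressure * 4;
  } else {
    // Touch or mouse input: keep a fixed width (or reserve it for panning).
    ctx.lineWidth = 2;
  }
  ctx.lineTo(e.offsetX, e.offsetY);
  ctx.stroke();
});
```

Distinguishing the pointer type is what lets a single surface treat the stylus as ink and fingers as navigation, one of the basic moves in NUI design.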

This disconnect between the current GUI paradigm and a widely used medium, the notebook, can be bridged by NUI paired with features such as visual search. Note-taking, and the analysis of those notes afterwards, can be transformative in many areas of healthcare: medical record-keeping, audio/image/video annotation, surgical planning, complex case planning and more. Future interfaces for doctors will let them think, investigate, and collaborate intuitively, using the mental models that come most naturally to them.

For example, one medium that we are particularly excited about is sketching. Sketching is often described as a universal activity, one that crosses the language boundaries that otherwise make explanation and exploration harder. In modeling, collaboration and instructional discussions, sketching has a valuable part to play. NUI research defines design principles for sketching on digital tablets, and we believe leveraging this extensive body of HCI work can give doctors and medical practitioners a far better experience.
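One way to see how digital sketching can go beyond paper is to treat each stroke as data rather than pixels. The model below is our own illustrative shape, not a standard or any existing product’s schema; once strokes carry positions, pressure, and optional tags, the spatial organization and search discussed above reduce to simple queries.

```typescript
// Illustrative ink model: strokes as structured data instead of flat pixels.
// Field names and the tagging scheme are assumptions for this sketch.
interface InkPoint {
  x: number;
  y: number;
  pressure: number;      // 0..1, as reported by the pen
  t: number;             // milliseconds since the stroke began
}

interface Stroke {
  points: InkPoint[];
  createdAt: string;     // ISO timestamp
  tags: string[];        // e.g. ["left-lobe", "follow-up"], attached later
}

// Because strokes are data, "find every annotation I tagged for follow-up
// inside this region of the scan" becomes a simple filter rather than a
// manual hunt through static images.
function strokesInRegion(
  strokes: Stroke[],
  region: { x: number; y: number; width: number; height: number },
  tag?: string
): Stroke[] {
  return strokes.filter((s) =>
    (tag === undefined || s.tags.includes(tag)) &&
    s.points.some(
      (p) =>
        p.x >= region.x &&
        p.x <= region.x + region.width &&
        p.y >= region.y &&
        p.y <= region.y + region.height
    )
  );
}
```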

Connecting the dots, we believe the future is fluid. Healthcare professionals deserve technology that helps them break out of the rigid structures of the GUI paradigm. Today, doctors have to bend their thought processes to the way the GUI is designed; the natural human thinking process demands more. With NUI, we see a world of fluid record-keeping, sketching, and investigation. We are moving into an era of complex disease patterns and of curing tough diseases such as cancer and neurodegenerative conditions. With fluid interfaces, healthcare professionals will collaborate better and draw sharper inferences, ultimately helping to cure these complex conditions.