The Meeting Point - Human-Machine Interfaces (Part 2)
Digital Information Overload
In the manufacturing and industrial arenas, operations are often overseen or run from a centralized control room. In days gone by, these rooms were dominated by large control panels, or control walls, covered with buttons, switches, and gauges. When these systems were designed, every device on the control panel had to be wired individually, so great care was exercised in selecting which functions and measurements were included.
In modern times, these control panels have been replaced with computer screens, and even the heaviest industrial machinery is commonly connected to a control system network. The practical takeaway is that where each piece of machinery used to have a few key parameters to monitor, it now often has hundreds of available data points. Building management system interfaces can display data from hundreds of devices, crammed into one or two confusing screens.
The result is that operators are overwhelmed with digital data. Compounding the problem, this data is often displayed numerically, sometimes in dozens of places on a single screen. Studies have shown that the human brain doesn't easily process numerical values: an operator must read a number, compare it mentally to a known "good" value, and then make a judgment. With a single value, that's not difficult. Keeping track of dozens of numbers, it becomes problematic.
Another byproduct of the digital and networking revolution is the almost limitless ability of designers to create alerts and alarms. In those old-style control rooms, audible alarms had to be individually wired to annunciators to trigger horns and flashing lights. Since each alarm carried measurable installation costs, great care was exercised to ensure that only critical information was included. In modern systems, creating an alarm costs nothing in terms of additional hardware design, and is often accomplished with a few clicks of a mouse.
Again, we can see how this quickly became overwhelming for system operators. Alarm "floods" have become a major issue, and instituting alarm management programs is now an important element of control system design.
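One common alarm management technique is a deadband: an alarm that has tripped must fall well back below its limit before it clears, so a value hovering near the threshold produces one annunciation instead of a flood. A minimal sketch (the class, point names, and limits are illustrative assumptions, not from any particular standard or product):

```python
from dataclasses import dataclass

@dataclass
class AlarmPoint:
    """One monitored value with a deadband to suppress chattering alarms."""
    name: str
    high_limit: float
    deadband: float   # value must fall this far below the limit to clear
    active: bool = False

    def update(self, value: float):
        """Return an event string on a state change, else None."""
        if not self.active and value >= self.high_limit:
            self.active = True
            return f"ALARM: {self.name} high ({value})"
        if self.active and value <= self.high_limit - self.deadband:
            self.active = False
            return f"CLEARED: {self.name} ({value})"
        return None  # no state change, so no new annunciation

# A reading oscillating around the 90.0 limit raises one alarm, not six:
point = AlarmPoint("pump_temp", high_limit=90.0, deadband=5.0)
events = [point.update(v) for v in (88, 91, 90.5, 89, 86, 84)]
# Only two events: the initial ALARM at 91 and the CLEARED at 84.
```

Real alarm rationalization programs layer priorities, shelving, and time delays on top of this idea, but the deadband alone removes much of the chatter.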
Color, Animation, Graphics, and Layout
Control systems were, in the days of yore, their own world. The hardware and methods used were unique to control applications. As modern systems developed, they began to utilize more and more off-the-shelf technology. This led to the use of modern computers, monitors, touch-screens, and more recently, smartphones and tablets in control systems. This gave designers an almost unlimited palette with which to create these interfaces and at the time, there were almost no standards or best practices to follow.
The results were predictable. Some designers tried to recreate engineering diagrams and schematics on a screen. Others tried to paint a picture of the physical reality, drawing the machinery and equipment on the screen and peppering the displays with numerical information. Wild color schemes, animation, and graphics produced displays that are confusing, distracting, hard on the eyes, and rich in data but poor in useful information. Poor arrangement of information and selectable objects can also create problems.

In January 2018, during an emergency drill at the Hawaii Emergency Management Agency, one employee mistook the drill for a real event and initiated a "push notification," sending an emergency alert to all cell phones in Hawaii warning of an inbound missile attack. It took 38 minutes to cancel the alert and notify Hawaiians of the false alarm. One of the key items identified in the investigation was a poorly designed software interface that allowed a drill to trigger a real alert but provided no means of sending a cancellation or false-alarm notice.
Industry Standards and Best Practices
The good news is that industry groups and professional organizations have closed the gap between the development of interface technologies and the development of associated standards. The International Society of Automation (ISA) and the Electric Power Research Institute (EPRI), among other industry groups, have studied the problems and developed standards and guides for better interface design, such as ISA-101 for human-machine interfaces. The many details and factors are beyond the scope of this article, but here are some of the key ideas:
- HMIs should be intuitive, enhance the situational awareness of the operator, and assist in the detection of, and response to, abnormal situations. There should be no confusion or guesswork on the part of the operator in determining what is taking place or in what mode the system is operating.
- Grayscale color design should be used to reduce eye strain and enhance the use of other colors. Use of color should be judicious and consistent (e.g., if red is used to indicate an alarm, it should be used for no other purpose).
- Key performance data should be displayed in an analog format, or in graphs and trends. Values should be presented in context, not just as raw numbers.
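The last point above, presenting values in context rather than as raw numbers, can be sketched as follows (the function name, readings, and operating bands are illustrative assumptions, not from any standard):

```python
def describe_reading(name: str, value: float, low: float, high: float) -> str:
    """Render a reading with its normal operating band and a status cue,
    instead of a bare number the operator must judge mentally."""
    if value < low:
        status = "LOW"
    elif value > high:
        status = "HIGH"
    else:
        status = "normal"
    # Position within the band, so deviations are obvious at a glance
    span = high - low
    pct = (value - low) / span * 100 if span else 0.0
    return f"{name}: {value} [{low}-{high}] {status} ({pct:.0f}% of band)"

# The operator no longer needs to remember that 80-120 is acceptable:
print(describe_reading("boiler_pressure", 105, 80, 120))
print(describe_reading("boiler_pressure", 125, 80, 120))
```

A real HMI would render this as an analog bar or trend rather than text, but the principle is the same: the display, not the operator's memory, carries the "good" range.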
Human Factors in Design
While progress has been made, many interfaces are still poorly designed. Engineers and system designers tend to design for humans as they would like them to be, rather than as they are. A growing movement in recent decades has emphasized design based on the needs of the user. A central premise of this movement has been to resist the urge to blame the operator anytime something goes wrong. As Earl Wiener, a renowned pilot and safety expert who advocated human-focused design for the airlines and NASA, once famously stated: "…there is no problem so great or so complex that it cannot be blamed on the pilot."
Indeed, when things do go awry, the response from both government and the business world has been to focus on the operator: training, checklists, procedures, and so on. In reality, it's time to take a closer look at the crucial interface between the human and the technology.
Read Part 1 here.