Computer software must be easy to use

[This was written for another audience, but may be useful here. Note particularly the guidelines at the end.]

Friends and colleagues complain to me all the time: Why are so many computer programs hard to use, with misleading or confusing screen prompts, counterintuitive methods to accomplish simple tasks, and lots of options that are rarely used but that clutter the screen and that often slow users down? After all, on what appliance other than a personal computer would you press a ‘start’ button to turn the machine off?

At home, these issues may be simply an annoyance. At work, they represent lost productivity, increased training time, and, often, less accurate data. Programs that are hard to use cost a company every day.

As a user-interface designer and usability consultant, my job is to ensure that programs are easy to use, intuitive, friendly, receptive to good data, and likely to reject data that doesn’t make sense. My not-so-secret technique in this work? I spend lots of time with the users -- first learning about the tasks they need to do, then watching how they use their current systems, and, finally, paying particular attention as they try out our proposed designs in rough prototype form.

Even with my extensive training and long experience as a computer software designer, I must confess that the users know more than I do! They live in their subject world, and regularly use their knowledge of that world as they take orders, review insurance claims, schedule manufacturing or repair orders, or whatever. Not paying attention to their experience is a much too common, and often fatal, mistake.

Here’s the story of one interface design assignment, and how careful listening turned the whole project around. When first presented to me, the project seemed straightforward, if not exciting. My new client had developed a computer system to help collection agencies in their work, and wanted me to redesign some of its screens. At least, that’s what I thought before I arrived at their office.

Actually, their primary agenda was more mundane and problematic. They wanted me to magically stuff a lot more information on the already crowded screens. It wasn’t at all clear how this would be helpful, and I knew that it would make their screens uglier and harder to use. But how could I say this in a polite and helpful way, while steering the client to a more productive agenda?

For me, the first step was quite clear. My business card says, “Listening to Users”, and that is exactly what I planned to do. After a brief review of the system design, I asked if we could visit one of their clients in the same Midwestern town as their offices. They agreed, expecting, I believe, that we’d spend an hour or two. We ended up spending two days, and totally transformed our agenda and their product.

I asked for a “training harness” (an extra telephone headset) so that I could hear the debt collectors conversing with their “clients” as I watched them navigating through the screens of this computer system. Many of these “clients” were repeats, who regularly defaulted on medical payments, furniture store bills, checks written to pizza parlors, credit card bills, etc. The computer system showed all these bills, payments made towards them, and promised payments that were never made. Typically there were screens and screens full of data.

In between calls, I had time to ask the collectors about their work -- how they assessed each situation, when they felt they were getting honest responses, how they decided on a strategy, and what led them to accept the final agreements (or lack thereof). There were many interesting stories, often filled with amazing complexity. Along with these details were the feelings of heartache, as more and more of these “clients” were getting caught up in a web of debt that seemed inescapable. My role was that of the cyber-age anthropologist, noting the debt collectors’ behavior, and beginning to discern some patterns. I could see some common themes -- almost some rituals -- but some aspect of the collectors’ behavior kept eluding me.

Finally, on the second day, I blurted out what might have been obvious from the start: “So you’re in the business of getting these clients to make promises that they will keep . . . and it’s the keeping part that is so important!” The collector I was with almost jumped out of his chair with excitement. “Wow”, he said. “Nobody has ever said it so clearly. Yes -- that’s exactly what we do, and the keeping promises part is what it’s all about.”

I ventured another statement, a bit more tentatively: “And what are you doing with all these screens of data? Are you computing how often clients have kept their promises (or not)?” The collector was even more excited. “Why, that’s exactly it”, he said. “I want to know what promises the client has offered or agreed to, and how well those were kept.” And he eagerly agreed when I suggested, “It sounds like we need to display some indexes of promise-keeping”. When I followed up with the question, “How would you compute these?” we began an active dialog -- now with the user taking the lead. We agreed that the initial computer screens should show the promise-keeping indexes, with all the data details available to the collector on later screens, but often not needed.
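One plausible way to compute such an index is the fraction of promised dollars actually paid. This is only a sketch under assumptions of my own -- the article never specifies the formula or the system’s data model, so the `Promise` record and `promise_keeping_index` function below are illustrative names, not the product’s actual design:

```python
# A minimal sketch of one possible promise-keeping index: the fraction of
# promised dollars a client actually paid. The data model here (a simple
# "Promise" record) is an assumption for illustration only.
from dataclasses import dataclass

@dataclass
class Promise:
    amount_promised: float
    amount_paid: float

def promise_keeping_index(promises):
    """Return the fraction of promised dollars paid, from 0.0 to 1.0."""
    total_promised = sum(p.amount_promised for p in promises)
    if total_promised == 0:
        return None  # no promises yet, so there is nothing to rate
    # Cap each payment at the promised amount so overpayment can't inflate the index.
    total_paid = sum(min(p.amount_paid, p.amount_promised) for p in promises)
    return total_paid / total_promised

history = [Promise(100.0, 100.0), Promise(200.0, 50.0), Promise(100.0, 0.0)]
print(promise_keeping_index(history))  # 150 paid of 400 promised -> 0.375
```

A collector could then scan one number per client on the first screen, drilling into the underlying payment detail only when the index raises a question.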

Since they treated me at first as a kind of priest of technology, the users were hesitant to come forward and play an active role. But once the discussion shifted to how data is managed in what now felt like their system, I became the process consultant; they were the experts, and they could play a major role.

Unfortunately, users are too often in the back seat as systems are being designed, yet they are the ones with the most real subject knowledge. By listening carefully to the users, responding thoughtfully, and really trying to understand their work process, I could get their active engagement in this design process. In this case it led to a radical redesign of the whole screen concept, and, I believe, to a much more powerful system.

This is an especially clear example of a scenario that I see constantly -- users who would not otherwise have been part of the process of designing the very systems they will use, but whose deep knowledge and understanding is critical to the system’s success. Typically the users feel technically inferior, and, in fact, they don’t speak the “systems” language that computer consultants are so fond of using. They absent themselves from the process, and are not invited by anybody else to join in.

When I lecture on the system design process, I explain that my first attempts to design a system are usually reasonably good, but it’s the users who correct me and really take it to a higher level. Watching them work with rough prototypes I can see where they struggle with my designs, and where they can easily find their way. And when the users have to struggle to use the proposed system, the problem is almost always that my design is not clear enough or lacks needed functionality -- not that the users need more training or experience.

When computer systems are hard to use, most of us blame ourselves and our lack of experience or training. We don’t consider that usability should be a prime characteristic, or that as reasonably intelligent people we should be able to find our way through. We also tend to assume that the system is properly designed to give us the benefits we expect.

Often, however, the systems we use today are simply reworkings of earlier systems, themselves reworks, until at some point we can trace back to a much earlier computer system modeled on how people once did a task. With the power of modern technology, it should be possible to help users do their tasks more efficiently and easily.

Years ago I was asked to work on a system for an organic produce distributor -- who wanted me to track exactly how much of each product was on hand in their coolers. This sounds reasonable enough -- until you realize that on-hand quantities were not what this distributor was selling. Trucks full of produce were en route east from California, and the distributor was selling the quantity of those expected arrivals that were still available for promise to customers (after subtracting items already sold). That’s a lot of computation for people to do, but not hard at all for a computer. The system had to be rethought so that it could easily display the necessary available to promise data that would let salespeople responsibly commit product to customers. My job was to step back from the detailed operational data, and focus on how the users were trying to interpret and understand it to make critical business decisions.
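The arithmetic the salespeople needed is simple once stated: what is available to promise is the quantity en route minus what has already been committed. The sketch below is my own illustration of that calculation -- the function and field names are assumptions, not the distributor’s actual schema:

```python
# A hedged sketch of the "available to promise" calculation: quantity on
# inbound trucks minus units already committed to customers. The names
# here are illustrative assumptions, not the distributor's real system.
def available_to_promise(inbound_qty, committed_qty):
    """Return the units a salesperson may still commit, never below zero."""
    return max(inbound_qty - committed_qty, 0)

# For example, a truck carrying 800 cases with 650 already sold:
print(available_to_promise(800, 650))  # 150 cases still available
```

The point is not the subtraction itself but where it happens: the computer carries the running total per product per truck, so the salesperson sees one trustworthy number instead of reconstructing it from screens of raw shipment and order data.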

Information systems will add value when they truly enable users to be more productive, more confident, and more correct in communicating with customers, vendors, and other employees. Simply using the latest technologies and having the fanciest, most beautiful screens is clearly not enough. Buzzwords like “real time” or “bus architecture” mean little by themselves. Here are some guidelines that can help ensure that systems are really workable and usable tools, that exemplify best practices, and that really work for your organization:

1. Include actual system users, at a variety of levels, in the design and review of the system. Their experience and insight are critical.

2. Start with a clear understanding of the business goals of the system. Don’t be guided by just a list of data elements to be tracked, or reports to be produced.

3. Watch the users working with the system -- whether it is in rough prototype form, or is a production system running with lots of real data. Learn how the system is really used, and identify times when it probably should be used but, for whatever reason, is not.

4. Don’t count on your most technical staff to carry these concerns. Just as you need architects, engineers, and builders to design and construct a new facility (and, in fact, you need lots of subspecialties within this list), so you need people with skills in usability, user testing, analysis, system design, modern programming methodologies, and coding techniques to design and build a new system, or to review and revise an existing system.

5. Insist that the overall system design be written in language that is comprehensible to you and to your users.

6. Keep looking at your systems, even after you believe that they are “finished”. Most systems can be improved, but many never are.