
Research: Paper Prototype

Audience

Lecturers in Kellogg classrooms

Team

Project manager, myself, third-party design firm

Challenge

Evaluate and improve new touch-screen interface for classroom control (Summer 2016)

Techniques & Tools

“Wizard of Oz” paper prototype

Outcome

Over the course of the research, the design firm iterated on layout, icons, and labels before any of the screens were coded.

The Project

New touchscreen control surfaces were being prepared for classrooms in a new building. These touchscreens allowed a user at the room’s lectern to control many elements of the room environment, including lighting and window shades, and, crucially, the audio-visual systems used to present lectures.

Kellogg had hired an outside firm to design the screens, but wished to review them independently with the target audience: professors with limited time or interest in learning a new system. That audience had been using the ‘old’ design for some time and had likely developed habits and learned terminology around it.

Research

Typically this kind of ‘research’ is done as a show-and-tell-and-ask-for-feedback exercise: someone familiar with the design shows it to a focus group of target users, tells them what everything does, and asks whether they like it. Feedback gathered this way is often enthusiastic but rarely reveals flaws in the designs or labeling, since the audience has just been told how the system works. Usability problems surface much later, when untrained users must use the system alone for the first time.

When approached for my input on the designs, I encouraged the project manager to let me do actual research with target users. We had images of how the classrooms would look. The third-party design firm had provided images of the screens that could be printed on paper. Target users were available to us with moderate scheduling effort. Working with the project manager, who was familiar with the capabilities of the system, I wrote a test script to

  1. identify the user’s top tasks,
  2. observe the user executing their own most important task,
  3. have the user execute a secondary task we assigned, and
  4. ask the user to tell us what the specialized labels on parts of the interface meant.

I interacted with the participants during the forty-minute sessions while the project manager acted the part of ‘the system’: when I prompted a participant to ‘tell me what you will click on,’ the project manager quickly handed us the paper showing the next screen. I could then follow up, for instance by asking, ‘Is this what you expected to see?’ In this way we could test the screen designs without having to do any programming at all.

still from a paper prototype test session
Pay no attention to the man behind the screen: Wizard-of-Oz testing

To help us review the sessions later, we recorded them with an overhead camera. The image you see on this page is a capture from that recording. (My hands, the participant’s fingertips.)

Paper, really?

Lately the usability community has been questioning the value of paper prototypes. Many design tools can quickly be set up to flow from one screen to another, and even presentation tools like PowerPoint have long been able to jump between screens using buttons triggered by the cursor. And in the pandemic age, a test as physically close as this one would likely be avoided.

But I argue gently that there is probably still a place for this kind of test in the toolbox. It can be quicker to prepare than ‘programming’ an onscreen prototype, and at lower levels of fidelity it need not require a designer. As in this case, the activity of ‘tapping’ a piece of paper is actually more realistic than using a mouse and cursor to simulate a touchscreen device. And it gives a stakeholder an opportunity to participate as more than just an observer, deepening their engagement with the findings of the study.

Learnings

  • Labels and terminology that seemed clear to us were a source of confusion for the target audience, especially around conference calling via the system versus attending a web conference, which required different setup flows.
  • Controls that were ‘always available’ were reorganized to an order and appearance that better reflected their relative importance to users.
  • Big splash screens were eliminated in favor of getting right to business. “I know I’m at Kellogg. You don’t need to show that big logo to me” remarked one marketing professor.
  • While some professors planned to arrive as much as thirty minutes before a lecture to get organized in the room, others expected to walk in as little as ten minutes beforehand and needed to get the system running with their equipment in that time, all while being peppered with questions by arriving students.

Improvements to the design, in both layout and workflow, were made before actual programming of the screens began. This meant we improved the designs at a lower cost, in user frustration and developer time, than we would have paid to build the system, install it, and then revise it later.