A prototype was created to answer a question: if you have a complex system that needs monitoring, such as a soda bottling operation, and there is too much visual information to display on screen at one time, would auditory cues about the system’s behavior be helpful?
The prototype simulates the imaginary “ARKola” bottling plant, where empty glass bottles, cola nuts, and carbonated water are delivered at intervals at different ends of the plant. The nuts and water are heated, the bottles are filled and capped, and the finished bottles are marshalled for shipping. Each of these steps has its own box in the graphical simulation and its own sounds (such as the clinking of newly arrived bottles). The screen can show only 1/4 of the simulation at a time, so the user has to listen for the crash of broken bottles, the spill of boiling syrup, and so on, in order to adjust several controls and keep the plant running smoothly.
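The core idea can be sketched in a few lines of code. This is not the original ARKola implementation, just a minimal illustration of the mapping it describes: each processing stage emits a characteristic sound when its state changes, so an operator can monitor stages that are off screen by ear. The stage names, sound names, and the simple overflow rule are all assumptions made for the example.

```python
# Hypothetical sketch of per-stage auditory cues, in the spirit of ARKola.
# Sound names and the capacity/overflow rule are illustrative, not from
# the original simulation.

SOUNDS = {
    "bottle_arrival": "clink",
    "heater_overflow": "boiling spill",
    "filler_overflow": "breaking glass",
}

class Stage:
    def __init__(self, name, capacity, overflow_sound):
        self.name = name
        self.capacity = capacity
        self.overflow_sound = overflow_sound
        self.queue = 0

    def deliver(self, amount, audio_log):
        """Accept incoming material; emit an overflow sound if capacity is exceeded."""
        self.queue += amount
        if self.queue > self.capacity:
            lost = self.queue - self.capacity
            self.queue = self.capacity
            audio_log.append(self.overflow_sound)  # the operator hears this even off screen
            return lost
        return 0

audio_log = []
filler = Stage("filler", capacity=10, overflow_sound=SOUNDS["filler_overflow"])
filler.deliver(8, audio_log)   # within capacity: silent
filler.deliver(5, audio_log)   # overflows by 3: the "breaking glass" cue fires
print(audio_log)               # -> ['breaking glass']
```

The point of the design is that the sound, not the display, carries the alert: the operator can keep only a quarter of the plant on screen and still react to the crash of breaking bottles elsewhere.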
In the test, users worked in pairs, with each partner in a separate room. In the control group, the partners could speak with each other and monitor the GUI. In the experimental group, the partners also heard the auditory icons (“earcons”).
- In general, the earcons seemed very useful.
- However, highly recognizable sounds, like breaking glass, could draw attention away from less recognizable sounds that signaled more important problems.
- When a sound stopped, users tended not to notice, even when the stopping signaled something important.
- Partners who had earcons tended to divide up the work and talk with each other about each other’s problems. Partners without earcons tended to ignore the other person’s problems and focus on their own.
- The overall performance of the simulated plant could be inferred from the earcons almost as a gestalt, much as people suspect problems with their car from the sound of the engine.