Understanding usability problem lists is challenging

In an ongoing study about creating GUI redesigns based on the results of a usability evaluation, I asked the participants whether they had problems understanding the usability problem list. 44 participants took part, a mix of informatics and information technology students following a design course. Their assignment was to create redesign suggestions for a web shop selling merchandise and tickets. The company developing the web shop had conducted a think-aloud usability evaluation, resulting in a simple list of 36 usability problems. Each problem was described with its location, a short description, and a severity rating. The table below shows how the participants answered.
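A problem list of the kind described above could be represented roughly as follows. This is a hedged sketch only; the field values are hypothetical illustrations, not problems from the actual study:

```python
from dataclasses import dataclass

@dataclass
class UsabilityProblem:
    location: str     # where in the UI the problem occurred
    description: str  # short description of the observed problem
    severity: str     # e.g. "low", "medium", or "high"

# Hypothetical example entry (not from the study's actual list)
problems = [
    UsabilityProblem(
        location="Checkout page",
        description="Users overlooked the 'continue' button below the fold.",
        severity="high",
    ),
]

for p in problems:
    print(f"[{p.severity}] {p.location}: {p.description}")
```

Even a structure this simple makes the three reported attributes explicit, which is useful when recipients later try to interpret or recreate the problems.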

 Were there any usability problems you could not interpret? (n=44)

 Disagree strongly   18%
 Disagree            16%   (disagree total: 41%)
 Slightly disagree    7%
 Neutral             16%   (neutral: 16%)
 Slightly agree      27%
 Agree                7%   (agree total: 43%)
 Agree strongly       9%
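As a quick sanity check, the grouped percentages follow directly from the individual responses: the three disagree categories sum to 41% and the three agree categories to 43%, leaving 16% neutral. A minimal verification:

```python
# Response distribution from the table above (percent of n=44)
responses = {
    "Disagree strongly": 18,
    "Disagree": 16,
    "Slightly disagree": 7,
    "Neutral": 16,
    "Slightly agree": 27,
    "Agree": 7,
    "Agree strongly": 9,
}

disagree = sum(responses[k] for k in
               ("Disagree strongly", "Disagree", "Slightly disagree"))
agree = sum(responses[k] for k in
            ("Slightly agree", "Agree", "Agree strongly"))

print(disagree, agree)  # 41 43
print(disagree + responses["Neutral"] + agree)  # 100
```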

As can be seen, 43% of the participants agreed, to some degree, that at least one usability problem was difficult to interpret. While this aspect is not the focus of the study, it is still an interesting finding that a relatively large share of the participants had trouble understanding all the usability problems on a relatively short list. I suspect that the 16% choosing 'neutral' believed they understood all the problems but were uncertain whether this was actually the case. Unfortunately, I have no quantitative data on how many problems were difficult to interpret, but I do have some qualitative data. One particular problem was mentioned repeatedly among the participants. Not surprisingly, it was a semi-complex problem and one of the more important ones to investigate further. I am sure people who receive and use usability problem lists can recognize similar problems.

Another challenge the participants faced was recreating problems. Some problems only occur under certain conditions, and recreating those conditions from a problem description is not straightforward.

Despite the missing details, the non-scientific presentation, and the modest number of participants, these numbers add to earlier findings and research on the communication of usability problems.

Here are a few papers discussing usability problem reporting:

  • Hornbæk, K., & Frøkjær, E. (2005). Comparing usability problems and redesign proposals as input to practical systems development. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 391–400). ACM. doi:10.1145/1054972.1055027
  • Høegh, R. T., Nielsen, C. M., Overgaard, M., Pedersen, M. B., & Stage, J. (2006). The impact of usability reports and user test observations on developers' understanding of usability data: An exploratory study. International Journal of Human-Computer Interaction, 21(2), 173–196. doi:10.1207/s15327590ijhc2102_4
  • Molich, R., Jeffries, R., & Dumas, J. S. (2007). Making usability recommendations useful and usable. Journal of Usability Studies, 2(4), 162–179.
  • Nørgaard, M., & Hornbæk, K. (2008). Working together to improve usability: Challenges and best practices. University of Copenhagen, Dept. of Computer Science, Technical Report no. 08/03.
  • Nørgaard, M., & Hornbæk, K. (2009). Exploring the value of usability feedback formats. International Journal of Human-Computer Interaction, 25(1), 49–74. doi:10.1080/10447310802546708
  • Redish, J. G., Bias, R. G., Bailey, R., Molich, R., Dumas, J., & Spool, J. M. (2002). Usability in practice: Formative usability evaluations, evolution and revolution. In CHI '02 Extended Abstracts on Human Factors in Computing Systems (pp. 885–890). ACM. doi:10.1145/506443.506647
