Description: Usability evaluation is a key activity in software development whose purpose is to assess "the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use" (ISO 9241-11). In software development, usability evaluations are usually conducted to support re-design in order to enhance the usability of the product.
The aim of this course is to present an overview of the research literature on usability evaluation. The focus will be on empirical research on (1) user-based testing and (2) expert-based inspection. User-based testing is the classical approach where usability experts acting as evaluators observe a group of users working with the software system that is being evaluated. Based on their observations, the evaluators produce a list of usability problems that have been experienced by the users during their use of the system. Expert-based inspection, on the other hand, was originally introduced as an alternative to user-based testing. With this approach, a group of experts inspect the software system that is being evaluated. Based on this, they produce a list of usability problems that they imagine users would experience during the use of the system.
For both user-based testing and expert-based inspection, the course will present and discuss (a) the original descriptions of the approach, (b) new methods within the approach, (c) laboratory studies of the use of the approach, and (d) case studies of the use of the approach in practice. For items (c) and (d), there will also be literature with empirical comparisons across the two approaches.
Format: Lectures with exercises and hands-on practice, combined with reading in advance.
Prerequisites: A Master’s degree in an IT-related area (from science, engineering, the humanities or business). Participants who have taken an undergraduate or graduate course in human-computer interaction will benefit from this additional background.
Learning objectives: After the course, the participants will be able to:
• Provide an overview of empirical research on usability evaluation
• Discuss and assess empirical research on usability evaluation in general
• Describe empirical research focusing specifically on user-based testing and expert-based inspection and assess their strengths and weaknesses
• Plan and conduct user-based testing and expert-based inspection in collaboration with a usability expert
Effie Lai-Chong Law is a Reader at the Department of Computer Science, University of Leicester (UK) and a visiting Senior Research Scientist at ETH Zürich (Switzerland). She obtained her PhD in psychology from the University of Munich, Germany. Her research domains are human-computer interaction (HCI) and technology-enhanced learning (TEL), with a specific focus on usability and user experience methodologies. Effie has chaired two HCI projects, ‘Towards the Maturation of Usability Evaluation’ (MAUSE) and ‘Towards the Integration of Trans-sectorial IT Design and Evaluation’ (TwinTide), involving more than 20 European countries. Effie has also assumed a leading role in several interdisciplinary research projects on various topics such as game-based learning, personal learning environments and online experimentation. She is an editorial board member of Interacting with Computers, and publishes in HCI and TEL journals and conferences.
Anders Bruun is an assistant professor in Human-Computer Interaction at the Department of Computer Science, Aalborg University (Denmark). He obtained his PhD in HCI in January 2013. His research interests include methods for usability evaluation, interaction design and HCI in relation to health informatics. He is an editorial board member of the Journal of Health Technology and Management and a reviewer for journals and several HCI-related conferences, including CHI and NordiCHI. Before his PhD studies, he worked as a User Experience consultant in a Danish software company, where he was responsible for designing and evaluating user interfaces in small- and large-scale web projects ranging from 500 to 15,000 person-hours.
Jan Stage is a professor of HCI at the Department of Computer Science, Aalborg University. He obtained his PhD in Informatics from the University of Oslo, Norway. His research interests are within usability evaluation and interaction design. He has previously worked with software engineering and object-oriented analysis and design. He has been involved in large research projects and international networks on usability evaluation. Many of his research activities focus on usability engineering and software development in practice, and they have often been conducted in collaboration with software organizations. He has published his research in major IS and HCI journals and conferences. He is an associate editor of Behaviour & Information Technology (BIT) and an editorial board member of the International Journal of Mobile Human Computer Interaction (IJMHCI) and the Journal of Database Management (JDM).
Organizer: Jan Stage, professor, firstname.lastname@example.org
Lecturers: Effie Lai-Chong Law, reader, email@example.com, Anders Bruun, assistant professor, firstname.lastname@example.org and Jan Stage, professor, email@example.com
Time: 23-25 June, 2014
Place: Aalborg University, Cassiopeia, Selma Lagerlöfs Vej 300, 9220, Aalborg East
Number of seats:
Deadline: 1 June, 2014