Heuristic Evaluation for Software Visualisation: Usability Evaluation Materials

Heuristic Evaluation: Usability Evaluation Materials

Heuristic Evaluation (Nielsen and Molich, 1990; Nielsen, 1994) is a usability evaluation method in which an analyst finds usability problems by checking the user interface against a set of supplied heuristics or principles. A set of evaluation materials for the heuristics described in (Nielsen, 1994) is presented, in particular:
• A structured description of Heuristic Evaluation to ease learning and use;
• A self-assessment test that allows analysts to gauge their understanding of Heuristic Evaluation;
• A checklist of heuristics for use during the analysis;
• Problem record sheets for recording usability problems during the analysis. Currently, these only record the predicted usability problem; in future, they could be extended to include recommended solutions to these problems (Jeffries, 1994);
• A questionnaire to be filled in after the analysis.
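
As an illustration only, here is a minimal Python sketch of how a problem record keyed to Nielsen's (1994) heuristics might be represented in software; the field names and the 0-4 severity scale are assumptions, not the layout of the original paper record sheets:

```python
from dataclasses import dataclass, field
from typing import List

# Nielsen's (1994) ten heuristics, used here as the checklist for the analysis.
NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

@dataclass
class ProblemRecord:
    """One predicted usability problem, as on a problem record sheet.

    The fields below are illustrative assumptions, not the original form layout.
    """
    heuristic: str        # which heuristic the problem violates
    location: str         # where in the UI the problem occurs
    description: str      # the predicted usability problem
    severity: int = 0     # assumed scale: 0 (not a problem) .. 4 (usability catastrophe)
    # Optional fixes, mirroring the suggested future extension (Jeffries, 1994).
    recommendations: List[str] = field(default_factory=list)

# Example usage:
record = ProblemRecord(
    heuristic=NIELSEN_HEURISTICS[0],
    location="Graph layout screen",
    description="No feedback while the layout algorithm is running.",
    severity=3,
)
```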

Heuristic Evaluation in Information Visualization Using Three Sets of Heuristics: An Exploratory Study

Human-Computer Interaction: Design and Evaluation, 2015

Evaluation in Information Visualization is inherently complex and remains a challenge. While it is possible to adapt evaluation methods from other fields, such as Human-Computer Interaction, this adaptation may not be straightforward since visualization applications are very specific interactive systems. This paper addresses issues in using heuristic evaluation to evaluate visualizations and visualization applications, and presents a two-phase exploratory study involving 25 evaluators aimed at assessing the understandability and effectiveness of three sets of heuristics that have been used in Information Visualization.

A Formal Protocol to Conduct Usability Heuristic Evaluations in the Context of the Software Development Process

International Journal of Engineering & Technology, 2018

Heuristic evaluation is one of the most widely used techniques for assessing the usability of a software product. In this research, we performed a comprehensive analysis of recent studies that report the use of this method in the context of a software development process. The purpose was to identify the specific way in which each author performs this usability evaluation method, in order to propose a formal protocol. After an in-depth examination of these studies, we determined that there are several differences in the way this technique is conducted according to the literature. There is no agreement about the number of inspectors that should participate, the usability principles that should be used, the profile of the specialists who must be part of the assessment team, or the evaluation process that should be followed. This work highlights the available settings and a detailed procedure for performing a heuristic evaluation in the domain of software products.
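
Since the review finds no agreement on the number of inspectors, the heuristic set, the evaluator profiles, or the process, one way to make these choices explicit is to record them as a protocol configuration. The following Python sketch is a hypothetical illustration; all names and example values are assumptions, not the protocol proposed in the paper:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class EvaluatorProfile:
    """Profile of an inspector; the attributes are illustrative."""
    role: str                    # e.g. "usability specialist" or "domain expert"
    years_of_experience: int

@dataclass
class ProtocolSettings:
    """Settings that, per the literature review, vary between studies."""
    num_inspectors: int                      # often 3-5, but not agreed upon
    heuristic_set: List[str]                 # e.g. Nielsen's set or a domain-specific one
    evaluator_profiles: List[EvaluatorProfile]
    process_steps: List[str]                 # ordered phases of the evaluation

# Example configuration (illustrative values only):
settings = ProtocolSettings(
    num_inspectors=4,
    heuristic_set=["Visibility of system status", "Error prevention"],  # abbreviated
    evaluator_profiles=[EvaluatorProfile("usability specialist", 5)],
    process_steps=["planning", "training", "individual inspection",
                   "consolidation", "reporting"],
)
```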

A methodology to establish usability heuristics

2011

Usability evaluation for applications based on emerging information technology brings new challenges. Is the classical concept of usability still valid? What are the dimensions of the (new) usability? How can it be measured? How should we develop for (better) usability? A methodology to develop usability heuristics for emerging applications is proposed. The methodology was fully checked in the case of Grid Computing applications, and partially applied in the cases of Interactive Digital Television and Virtual Worlds.

A Usability Assistant for the Heuristic Evaluation of Interactive Systems

2017

The increasing demand for usable interactive systems in the context of limited project budgets brings to the fore the need for faster and cheaper evaluation methods. Heuristic evaluation is a kind of inspection method that has proved to be cost-effective. Typically, the method involves a small number of evaluators who test the interactive system against a set of usability principles called heuristics. One way to increase the efficiency of usability evaluation methods is to provide evaluators with software tools able to assist in documenting and recording usability problems. This paper presents a software assistant for usability evaluation which provides various facilities for conducting a heuristic evaluation: definition of the task set, specification of the heuristics used, and documenting of usability problems. In order to support the specific requirements of a target application domain, a set of usability guidelines can be specified that detail the heuristic set. These guid...
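
The three facilities named in the abstract (task set, heuristic set, problem documentation) can be pictured as a simple data model. The Python sketch below is a hypothetical illustration, not the assistant's actual implementation; all class and method names are assumptions:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Heuristic:
    name: str
    # Domain-specific guidelines detailing the heuristic, as the abstract describes.
    guidelines: List[str] = field(default_factory=list)

@dataclass
class Task:
    description: str

@dataclass
class UsabilityProblem:
    task: Task
    heuristic: Heuristic
    description: str
    severity: int

@dataclass
class EvaluationSession:
    """Holds the task set, the heuristic set, and the logged problems."""
    tasks: List[Task] = field(default_factory=list)
    heuristics: List[Heuristic] = field(default_factory=list)
    problems: List[UsabilityProblem] = field(default_factory=list)

    def log_problem(self, task: Task, heuristic: Heuristic,
                    description: str, severity: int) -> None:
        self.problems.append(UsabilityProblem(task, heuristic, description, severity))
```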

Enhancing the Effectiveness of Usability Evaluation by Automated Heuristic Evaluation System

2011 Third International Conference on Computational Intelligence, Communication Systems and Networks, 2011

Usability defects test escapee can have a negative impact on the success of software. It is quite common for projects to have a tight timeline. For these projects, it is crucial to ensure there are effective processes in place. One way to ensure project success is to improve the manual processes of the usability inspection via automation. An automated usability tool will enable the evaluator to reduce manual processes and focus on capturing more defects in a shorter period of time. Thus improving the effectiveness of the usability inspection and minimizing defects escapee. There exist many usability testing and inspection methods. The scope of this paper is on the Heuristic Evaluation (HE) procedures automation. The Usability Management System (UMS) was developed to automate as many manual steps as possible throughout the software development life cycle (SDLC). It is important for the various teams within the organization to understand the benefits of automation. The results show that with the help of automation more usability defects can be detected. Hence, enhancing the effectiveness of usability evaluation by an automated Heuristic Evaluation System is feasible.

Validation of a Usability Evaluation Protocol based on the Heuristic Inspection Method: An Experimental Case Study in the Web Domain

Advanced Science and Technology Letters, 2016

Usability is nowadays an aspect that can determine the success of a software product. For this reason, there are methods that allow specialists to verify whether the properties of a system's interfaces contribute to achieving the desired degree of usability. Although heuristic evaluation is one of the most recognized methods, there is no formal process for executing this technique; the original author only proposes general recommendations. Based on the analysis of different studies that report the use of this method, we established a new usability evaluation process that can be used by any professional in Computer Science. This work focuses on the validation of our protocol for conducting heuristic evaluations. For this purpose, a case study was conducted with the participation of twenty postgraduate students. The analysis establishes that the results are more accurate when a formal process is used.
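
The abstract does not detail how the individual evaluators' results are consolidated; a common way to aggregate heuristic evaluation results from several evaluators is to compute, per problem, how many evaluators reported it and the mean severity. The following Python sketch illustrates that idea under those assumptions and is not the protocol's prescribed procedure:

```python
from collections import defaultdict
from statistics import mean
from typing import Dict, List, Tuple

def consolidate(ratings: List[List[Tuple[str, int]]]) -> Dict[str, Dict[str, float]]:
    """Each evaluator submits (problem_id, severity) pairs.

    Returns, per problem, the fraction of evaluators who reported it and
    the mean severity they assigned. Severity scale is assumed to be 0-4.
    """
    by_problem: Dict[str, List[int]] = defaultdict(list)
    for evaluator_ratings in ratings:
        for problem_id, severity in evaluator_ratings:
            by_problem[problem_id].append(severity)
    n_evaluators = len(ratings)
    return {
        pid: {
            "found_by": len(sevs) / n_evaluators,
            "mean_severity": mean(sevs),
        }
        for pid, sevs in by_problem.items()
    }

# Example with three evaluators:
print(consolidate([
    [("P1", 3), ("P2", 2)],
    [("P1", 4)],
    [("P2", 1), ("P3", 3)],
]))
```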

Heuristic Evaluation and Usability Testing: Case Study

The goal of this user-centered design (UCD) study was to identify usability issues in the Boğaziçi University Industrial Engineering (BUIE) department website user interface (UI) and to provide a redesign guideline for the website. In this context, the website was evaluated using heuristic evaluation, remote usability testing, and post-test questionnaire methods. Every single screen of the BUIE website was evaluated, and design problems along with associated severity rankings were determined. Based on the heuristic evaluation findings, various task scenarios were created for the remote usability testing study. Four different user groups were identified for the study: high school students, BUIE undergraduate students, BUIE graduate students/academic staff, and other university students. The users were asked to perform tasks relevant to each group's characteristics and expectations related to the website. Their performance was evaluated in terms of task completion success rate, number of clicks, and time spent until accomplishment, failure, or abandonment, using a remote usability testing tool, Loop11. A post-test questionnaire was also administered online, collecting the users' subjective ratings of ease of use for each task. The results indicated numerous UI design issues, as confirmed by both the heuristic evaluation and the usability testing. A further redesign study is required to implement the results of this study in order to enhance the ease of use of the BUIE department website.
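
As an illustration of the metrics mentioned above (task completion success rate, number of clicks, and time spent), the Python sketch below computes them from per-participant task attempts. The record layout is an assumption for this sketch; it is not the export format of Loop11:

```python
from dataclasses import dataclass
from statistics import mean
from typing import List

@dataclass
class TaskAttempt:
    """One participant's attempt at one task; the fields are an assumed layout."""
    task_id: str
    completed: bool      # True on success, False on failure or abandonment
    clicks: int
    seconds: float

def task_metrics(attempts: List[TaskAttempt], task_id: str) -> dict:
    """Success rate, mean clicks, and mean time for one task across participants."""
    rows = [a for a in attempts if a.task_id == task_id]
    return {
        "success_rate": sum(a.completed for a in rows) / len(rows),
        "mean_clicks": mean(a.clicks for a in rows),
        "mean_seconds": mean(a.seconds for a in rows),
    }

# Example usage with two hypothetical participants:
attempts = [
    TaskAttempt("find-curriculum", True, 7, 42.0),
    TaskAttempt("find-curriculum", False, 15, 118.5),
]
print(task_metrics(attempts, "find-curriculum"))
```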