Reviewing Peer Review, an Eye Tracking Experiment of Review Behaviour

Year: 2015
Editor: Christian Weber, Stephan Husung, Marco Cantamessa, Gaetano Cascini, Dorian Marjanovic, Srinivasan Venkataraman
Authors: Boa, Duncan R; Hicks, Ben
Series: ICED
Institution: University of Bristol, United Kingdom
Section: Design Theory and Research Methodology, Design Processes
Page(s): 369-378
ISBN: 978-1-904670-65-0
ISSN: 2220-4334

Abstract

The quality of peer reviews is a longstanding issue within the Design Society, with concerns over the consistency and transparency of reviews raised frequently. Previous research has sought to qualify these concerns by describing the variability of review scores and correlating it with academics' backgrounds. This paper aims to update and advance the current understanding of peer review within the Design Society by characterising review behaviour with the addition of eye tracking. Seventeen academics attending Design 2014 took part in an experiment. The results are discussed in this paper with the aim of answering two research questions: do different review strategies exist, and what are they? And do the character traits of reviewers affect review strategy? The results confirm findings from previous research, suggesting that little has changed since the topic was last reported and that inconsistency remains a problem. However, some of the inconsistency in reviews may be explained by the review strategies identified in the eye tracking data.

Keywords: Peer Review, Human Behaviour In Design, Eye Tracking
