Technical Communication

Information Design in Healthcare: Critique of a Journal Article (Part 1. Presentation)

Technical Communication, and more specifically its sub-discipline of Information Design, is typically proven good or wanting in use; hence the feedback principle: Technical Communication can be improved by constant revision based on reports from users. Not all applications of Technical Communication, however, can be safely tested in use. Opportunities for revision can be few and, for the unfortunate, too late. Healthcare-related applications are a case in point: if healthcare information is incorrect or unclear, the consequences can be extremely serious. Establishing correctness and clarity ahead of publication is therefore a paramount Technical Communication concern, since there is little to no room for retroactive correction.

The importance of Information Design principles is most obvious when applications are safety-critical. Consider a procedure involving high voltages, or dosage guidance: any error in the instructions could have fatal consequences. Information Design (ideally in conjunction with dummy testing) should reduce user risk by optimizing how effectively the information is conveyed.

This article critiques a journal article that tests the efficacy of a healthcare survey created according to Information Design principles.

Background

Health researchers are increasingly dependent on computer-generated forms and questionnaires for obtaining patient data. The authors of this study believe such forms do not yield information as efficiently as they ought, especially given the importance of the field to which the data pertains. The primary mode of data collection (the questionnaire) is typically not created by Information Design experts, and is characterized by poor design and insensitivity to respondents' needs. Employing a comparison methodology, the authors explored whether a questionnaire designed in accordance with Information Design principles could reduce errors due to improper data entry or incomplete data spaces. Two questionnaires were created separately: one by information designers (the Study form), based on usability guidelines defined by Schriver (1997), and the other using the default settings of a design application (the Control form).

Method

Responses to 143 corresponding fields common to both forms were compared, allowing differences between the forms to be analyzed field by field.

Findings

Distinct patterns were evident: 4% of fields were completed equally well on both forms; 6% were better completed on the Control form; and 90% were better completed on the Study form.

In absolute terms, the Study form outperformed the Control form on 129 of the 143 fields.
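
For a quick arithmetic check of how these figures fit together, the sketch below reproduces the reported percentages. The counts for the "equal" and "Control better" categories are inferred here from the reported 4% and 6% shares and are not stated explicitly in the article.

```python
# Field-level breakdown of the 143 fields common to both forms.
# The 129 "Study form better" fields are reported directly; the other
# two counts (6 and 8) are inferred from the reported 4% / 6% shares.
total_fields = 143
study_better = 129                                    # reported
control_better = 8                                    # inferred from ~6%
equal = total_fields - study_better - control_better  # 6 fields, ~4%

for label, count in [("Equal on both forms", equal),
                     ("Control form better", control_better),
                     ("Study form better", study_better)]:
    print(f"{label:20s}: {count:3d} of {total_fields} ({count / total_fields:.0%})")
```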

Conclusion

Findings strongly supported the authors' hypothesis that a form design based on Information Design principles would reduce data-entry errors and thereby produce superior data. Statistical analysis was emphatically in favour of the Study form: its improvement over the Control form in efficacy of obtaining patient data was determined to lie between 15.2% and 21.1%. The major implication of the study's findings is that applying Information Design principles (incorporating usability testing protocols) is an effective means of improving clinical research questionnaires.

Key Research Issues

The study endeavored to address several issues that compromise effective collection of clinical data:

  • To obtain valuable healthcare data, patients are required to complete health questionnaires that are often badly designed, resulting in potential loss of information and waste of resources.
  • When faced with difficult questionnaires, patients are likely to satisfice (Simon, 1957), rather than engage and contribute maximally (optimize). It is therefore advantageous that patients interpret the document positively – a notion that is seldom given adequate consideration in medical questionnaires.
  • Because Information Design draws on a multitude of disciplines and its literature is comparatively disparate, designers of medical questionnaires are unlikely to have specialist training in its principles.
  • Frustrated respondents identify several core complaints regarding unsatisfactory, albeit not specifically medical, forms and questionnaires (Bagin and Rose, 1991):
    • Too complicated
    • Unclear instructions
    • Too long
    • Type too small
    • Insufficient space for answers
    • Words too difficult
    • Information requested too personal

Based on the above observations, the authors proffer three reasons that might explain why forms are not better designed:

  • Form designers cannot control every aspect of the design or design process.
  • Designers might lack understanding of Information Design principles.
  • Designers might lack understanding of the users of their forms (Schriver, 1997).

Purpose of Study

To establish whether designers, by addressing the foregoing shortcomings through principles derived from survey research and usability testing (incorporating feedback-driven audience analysis), could increase the efficacy of health questionnaires and thereby improve the accuracy and quantity of the data collected, as well as the collection process itself.

Hypothesis

… Health questionnaires that had been designed by information designers would reduce the number of errors caused by incomplete data spaces or improper data entry. (Zimmerman & Schultz, 2000)

Methodology (comparison)

The effectiveness of two separately designed questionnaires was compared.

  • The Control form was created by a systems analyst using the default settings of Teleform Elite Version 5.0, a Windows 95/NT-supported software application. The documentation for this application included instructions on adding data fields and accessing form templates for commonly used data fields, but did not cover the basics of form design per se.
  • The Study form was created by researchers at Brigham Young University and was designed on the basis of usability testing and feedback-driven audience analysis conducted with members of the target audience. The designers analyzed every aspect of the data collection process likely to affect audience response, and incorporated fundamentals drawn from findings in survey research, questionnaire design, and usability testing. Three stages of usability testing (Schriver, 1997) were performed to create the Study form:
    • Evaluation and testing of the original.
    • Creation, evaluation, and testing of the revised version.
    • Evaluation and testing of the final version.

Staff at the H. Lee Moffitt Cancer Center and Research Institute defined the general specifications of the forms: both the Control and Study forms attempted to elicit the same patient information, and both contained the same type and number of fields (143).

Analysis

Three types of analysis were conducted to gauge differences between the two forms:

  1. Features/visual analysis of differences in general design.
  2. Statistical analysis (matched t test) of data retrieved using the forms (see the illustrative sketch after this list).
  3. Field analysis of performance differences between each field on the forms.
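
To illustrate the second type of analysis, the minimal sketch below shows how a matched (paired) t test and a confidence interval for the mean per-field improvement might be computed. The completion rates are invented placeholders (chosen only to fall in the general range of the reported 15.2% to 21.1% improvement), and the use of NumPy/SciPy here is an assumption, not the authors' actual tooling.

```python
import numpy as np
from scipy import stats

# Hypothetical per-field completion rates (fraction of respondents who
# completed each field correctly). Placeholder data, not the study's.
rng = np.random.default_rng(0)
control = rng.uniform(0.60, 0.90, size=143)                         # Control form
study = np.clip(control + rng.normal(0.18, 0.05, size=143), 0, 1)   # Study form

# Matched (paired) t test: each field acts as its own control, so the
# pairs are the two completion rates for the same field on the two forms.
t_stat, p_value = stats.ttest_rel(study, control)

# 95% confidence interval for the mean per-field improvement.
diff = study - control
ci_low, ci_high = stats.t.interval(0.95, df=len(diff) - 1,
                                   loc=diff.mean(), scale=stats.sem(diff))

print(f"t = {t_stat:.2f}, p = {p_value:.3g}")
print(f"mean improvement = {diff.mean():.1%} "
      f"(95% CI {ci_low:.1%} to {ci_high:.1%})")
```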

Discussion

The study examined the value of Information Design for improving data collection in the healthcare field. Its findings supported the authors' hypothesis that Information Design principles, when applied to medical questionnaires, can improve the quality and quantity of the data those questionnaires are created to collect. Usability testing also proved an effective means of identifying and repairing specific difficulties patients encountered in the earlier form.

The findings of this study demonstrate the potential contributions Information Design can make, in this case to the medical world. By logical extension, such smoothing of the interface phase of data gathering should also improve the quality, and ultimately the use, of that data, resulting (ideally) in better practice all round.

Information Design principles, if seriously applied and given requisite support, stand to make major contributions to many, if not most, other fields by facilitating their data acquisition processes. However, the problems the authors faced while conducting the study are equally worthy of attention: they exemplify broader issues symptomatic of negative perceptions held by longer-established disciplines, namely a persistent reluctance to recognize Information Design as a distinct and worthy field whose fundamentals, if adopted, offer major advantages to any industry or discipline reliant on data collection.
