Readability assessment of online patient education materials from academic otolaryngology–head and neck surgery departments




Abstract


Purpose


The aim of this study was to compare the readability of online patient education materials among academic otolaryngology departments in the mid-Atlantic region, with the purpose of determining whether these commonly used online resources were written at a level readily understood by the average American.


Methods


A readability analysis of online patient education materials was performed using several commonly used readability assessments including the Flesch Reading Ease Score, the Flesch-Kincaid Grade Level, Simple Measure of Gobbledygook, Gunning Frequency of Gobbledygook, the New Dale-Chall Test, the Coleman-Liau Index, the New Fog Count, the Raygor Readability Estimate, the FORCAST test, and the Fry Graph.


Results


Most patient education materials from these programs were written at or above an 11th grade reading level, considerably above the difficulty recommended by National Institutes of Health guidelines.


Conclusions


Patient educational materials from academic otolaryngology Web sites are written at too difficult a reading level for a significant portion of patients and can be simplified.



Introduction


The diversity and complexity of otolaryngology conditions make understanding educational materials on even the most basic topics challenging for many patients. Materials written at too high a level for someone unfamiliar with these subjects can result in misunderstanding and may deter patients from seeking essential medical care.


The relationship between “health literacy” and clinical outcomes has been studied extensively in other fields of medicine; deficits in health literacy are associated not only with poorer medical knowledge and comprehension but also with an increase in adverse patient outcomes (ie, additional hospitalizations and emergency care visits). For these reasons, designing educational literature for patients in a clear, appropriate format is important.


Health literacy comprises numerous skills essential to functioning effectively as a health care consumer, including print literacy (reading and comprehending health information), numeracy, and oral literacy. An estimated 80 million adults in the United States have substandard health literacy, with deficits most pronounced among lower socioeconomic groups and the elderly. In addition, numerous sources estimate that the average adult in the United States reads between the seventh and ninth grade levels.


The rapid proliferation of health-oriented online resources over the past decade only reinforces the importance of examining health literacy patterns. Health consumers using the Internet are diverse both in the types of information they seek and in their demographics. Americans have increasingly turned to Internet sources for information about health conditions, with more than 8 million Americans using online resources daily for this purpose. These figures are expected to rise further, as Internet usage has grown rapidly over the past 3 years with the proliferation of mobile-connected devices. One forecast estimates that by the end of 2012, there will be more mobile-connected devices than people on earth.


The reading comprehension required to understand online patient education resources is reflected in the “readability” of the text, which can be measured with several widely used assessments: the Flesch Reading Ease Score (FRE), the Flesch-Kincaid Grade Level (FKGL), the Simple Measure of Gobbledygook (SMOG), the Gunning Frequency of Gobbledygook (Gunning FOG), the New Dale-Chall Test (NDC), the Coleman-Liau Index (CLI), the New Fog Count (NFC), the Raygor Readability Estimate (REG), the FORCAST test, and the Fry Graph.


The FRE score takes into account syllable count and sentence length, producing a score between 0 and 100 (lower scores indicate more difficult text) that can be supplemented with an estimated grade level (FKGL). The SMOG assessment uses sentence length and the number of polysyllabic words to determine a grade level, whereas the Gunning FOG also uses polysyllabic words (defining them as “complex” words) together with the total number of sentences to determine a score.
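The syllable- and sentence-length formulas described above can be sketched in Python. The formula constants below are the published ones; the syllable counter is a crude vowel-run heuristic (commercial packages such as Readability Studio use more careful syllabification), so scores will differ somewhat from those reported in this study.

```python
import math
import re

def count_syllables(word: str) -> int:
    """Approximate syllables as runs of consecutive vowels (crude heuristic)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def _stats(text: str):
    """Sentence count, word list, and per-word syllable counts."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = [count_syllables(w) for w in words]
    return len(sentences), words, syllables

def flesch_reading_ease(text: str) -> float:
    # FRE = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)
    n_sent, words, syl = _stats(text)
    return 206.835 - 1.015 * (len(words) / n_sent) - 84.6 * (sum(syl) / len(words))

def flesch_kincaid_grade(text: str) -> float:
    # FKGL = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
    n_sent, words, syl = _stats(text)
    return 0.39 * (len(words) / n_sent) + 11.8 * (sum(syl) / len(words)) - 15.59

def smog_grade(text: str) -> float:
    # SMOG = 1.0430*sqrt(polysyllables * 30/sentences) + 3.1291
    n_sent, _, syl = _stats(text)
    poly = sum(1 for s in syl if s >= 3)
    return 1.0430 * math.sqrt(poly * (30 / n_sent)) + 3.1291

def gunning_fog(text: str) -> float:
    # FOG = 0.4 * [(words/sentences) + 100*(complex words/words)]
    n_sent, words, syl = _stats(text)
    complex_words = sum(1 for s in syl if s >= 3)  # FOG's "complex" words
    return 0.4 * ((len(words) / n_sent) + 100 * complex_words / len(words))
```

Note that the full Gunning FOG definition also excludes proper nouns and common suffix inflections from the complex-word count; this sketch ignores those refinements.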


To estimate a grade level, the NDC test uses the number of unfamiliar words and sentence length, the CLI uses sentence length and character count, and the New Fog Count uses sentence length and words of more than 3 syllables. The FORCAST score does not use sentence length, counting only the number of single-syllable words in its formula to determine a grade level. The REG uses the average number of sentences and long words to plot grade level on a graph, and the Fry Graph uses the average number of sentences and syllables to create a similar visual representation of grade level.
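The character-based and monosyllable-based measures can be sketched the same way. The CLI and FORCAST constants are the published ones; the monosyllable test is a crude vowel-run heuristic, and scaling the FORCAST count to a 150-word sample for shorter texts is our assumption about how scoring tools normalize it.

```python
import re

def _is_monosyllable(word: str) -> bool:
    # Crude check: at most one run of consecutive vowels.
    return len(re.findall(r"[aeiouy]+", word.lower())) <= 1

def coleman_liau(text: str) -> float:
    # CLI = 0.0588*L - 0.296*S - 15.8, where L = letters per 100 words
    # and S = sentences per 100 words.
    words = re.findall(r"[A-Za-z]+", text)
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    L = 100 * sum(len(w) for w in words) / len(words)
    S = 100 * len(sentences) / len(words)
    return 0.0588 * L - 0.296 * S - 15.8

def forcast(text: str) -> float:
    # FORCAST: grade = 20 - N/10, where N is the number of single-syllable
    # words in a 150-word sample; shorter texts are scaled to 150 words here.
    words = re.findall(r"[A-Za-z]+", text)
    n_mono = 150 * sum(1 for w in words if _is_monosyllable(w)) / len(words)
    return 20 - n_mono / 10
```

Because FORCAST omits sentence length entirely, it is often applied to material without conventional sentences, such as forms and lists; a text of all monosyllables bottoms out at grade 5.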


To the best of our knowledge, no studies have examined the readability of Internet-based patient education materials (PEMs) from academic otolaryngology practices. Our objective was to evaluate the readability of health education materials aimed at patients on the Web sites of these practices.





Materials and methods


Online PEMs describing procedures and the management of otolaryngologic conditions were obtained from the Web sites of academic otolaryngology departments in the mid-Atlantic states (defined as New York, New Jersey, Pennsylvania, Delaware, Maryland, and the District of Columbia). Of the 22 otolaryngology departments in these states, only 10 had their own PEMs. Any resources intended for the public were included in this analysis. Text sections with nonmedical information (eg, copyright notices, author information, citations, and references) were excluded from the assessment.


Each relevant entry was pasted into Microsoft Word, and a readability analysis was performed using the software package Readability Studio Professional Edition Version 2012.1 for Windows (Oleander Software, Ltd, Vandalia, OH). Scores from the 10 readability assessments previously mentioned were calculated.










Results


Ten assessment tools for readability were used to evaluate online PEMs obtained from the Web sites of academic otolaryngology departments. Nine of these analyses calculated a readability figure in terms of grade level ( Table 1 ), with most readability scores for most departments at or above an 11th grade reading level. The FRE score was also tabulated, with all but 1 program scoring in the “difficult” readability range ( Table 2 ). The REG and Fry Graph are depicted visually as the intersection of sentences per 100 words with the number of long words in the former ( Fig. 1 ) and with the average number of syllables in the latter ( Fig. 2 ). The data points in these 2 assessments mostly fall in the graduate-level readability region, with PEMs from the University of Pennsylvania as outliers that appear easier to read than those of all other programs ( Fig. 3 ).
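The “difficult” label applied to the FRE scores above follows the conventional Flesch difficulty bands, which can be expressed as a simple lookup. The band boundaries below are the commonly cited ones and may vary slightly between sources.

```python
def fre_band(score: float) -> str:
    """Map a Flesch Reading Ease score to its conventional difficulty band."""
    bands = [
        (90, "very easy"),
        (80, "easy"),
        (70, "fairly easy"),
        (60, "standard"),
        (50, "fairly difficult"),
        (30, "difficult"),
    ]
    for cutoff, label in bands:
        if score >= cutoff:
            return label
    return "very difficult"
```

On this scale, text in the 60 to 70 “standard” band corresponds roughly to the seventh to ninth grade reading level cited earlier for the average US adult.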



Table 1

Readability grade levels from online patient educational materials from academic otolaryngology department Web sites

Department (no. of articles) FKGL SMOG G-FOG NDC CLI NFC REG FCS Fry
Columbia (72) 13 14.6 13.9 11–12 13.7 10.5 17 11.6 16
Georgetown (25) 11.1 13 11.7 11–12 12.8 7.9 13 11.4 14
Johns Hopkins (55) 12.8 14.4 13.8 11–12 13.5 10 17 11.6 16
NYEE (31) 11.2 13.2 12.7 11–12 12 9.2 12 11.1 13
NYU (121) 11 12.9 10.9 9–10 12.5 7.4 13 11.5 14
Temple (11) 12.6 13.5 13.2 13–15 13.5 9.9 13 11.9 16
TJU (113) 15.2 16.5 16.6 13–15 14.2 14.3 17 11.5 17
U of Penn (69) 8.5 11.2 9.7 7–8 10.8 5.9 10 10.7 10
U of Rochester (21) 11.4 13.7 13.4 11–12 12.6 9.7 13 11.3 14
Weill Cornell (89) 10.8 13.2 12.7 11–12 12.6 8.4 13 11.3 13

Abbreviations: FKGL, Flesch-Kincaid Grade Level; SMOG, Simple Measure of Gobbledygook; G-FOG, Gunning Frequency of Gobbledygook; NDC, New Dale-Chall; CLI, Coleman-Liau Index; NFC, New Fog Count; REG, Raygor Readability Estimate; FCS, FORCAST; Fry, Fry Graph.
