ChatGPT as a tool to improve readability of rheumatology patient education materials: a positive start with significant hurdles

Clin Rheumatol. 2026 Jan;45(1):515-520.

doi: 10.1007/s10067-025-07746-9. Epub 2025 Dec 28.



Arianna S Moss et al. Clin Rheumatol. 2026 Jan.

Abstract

Background: Patient education materials are vital to health education and, according to the American Medical Association (AMA), should not be written above a 6th grade reading level (Weiss 2003). We aim to assess the readability of American College of Rheumatology (ACR) patient education materials and to use ChatGPT 3.5 to adjust them to a 6th grade reading level.

Methods: Four validated readability assessment tests were selected to assess the readability of 96 online ACR patient education materials on rheumatic conditions and treatments (American College of Rheumatology: Diseases and Conditions n.d.; American College of Rheumatology: Treatments n.d.): the Gunning Fog Index (GFI), Flesch-Kincaid Grade Level (FKGL), Coleman-Liau Index (CLI), and Simple Measure of Gobbledygook Index (SMOG). All education materials were entered into ChatGPT with the prompt "Rewrite the following text at a 5th grade reading level:". The converted excerpts were then evaluated for readability using online readability scoring software (Readability Formulas: Readability Scoring System 14).
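The four instruments above are surface-level formulas computed from sentence, word, and syllable counts. As an illustration only (the study used an online scoring system, not this code), here is a minimal Python sketch of two of the published formulas, FKGL and SMOG; the syllable counter is a crude vowel-group heuristic, whereas real tools use pronunciation dictionaries:

```python
import re


def count_syllables(word: str) -> int:
    """Approximate syllable count by counting groups of consecutive vowels.

    This is a rough heuristic, not dictionary-based syllabification.
    """
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    # Drop a typical silent final "e" (e.g. "grade"), keeping at least one syllable.
    if word.endswith("e") and not word.endswith(("le", "ee")) and n > 1:
        n -= 1
    return max(n, 1)


def fkgl(text: str) -> float:
    """Flesch-Kincaid Grade Level (standard published formula)."""
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59


def smog(text: str) -> float:
    """SMOG index (standard published formula, based on 3+ syllable words)."""
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    poly = sum(1 for w in re.findall(r"[A-Za-z']+", text) if count_syllables(w) >= 3)
    return 1.0430 * (poly * 30 / sentences) ** 0.5 + 3.1291
```

Because the formulas reward short sentences and short words, text with no polysyllabic words scores at the SMOG floor of about 3.13, which is why simplified rewrites can still plateau well above 6th grade when clinical terms remain.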

Results: The mean (± SD) readability ratings of GFI, FKGL, CLI, and SMOG from the 96 original ACR patient education materials were 14.24 (1.85), 11.50 (1.74), 13.31 (1.42), and 10.33 (1.29), respectively. The ACR materials had a mean reading level of 12th grade (12.02 ± 1.35). After the ChatGPT prompt, the calculated average reading level for the simplified education materials ranged from grades 7 to 13, with a mean of 9.35 ± 1.12. The mean (± SD) readability ratings of GFI, FKGL, CLI, and SMOG for the simplified versions were 10.70 (1.39), 8.29 (1.21), 10.18 (1.11), and 7.68 (1.05), respectively.

Conclusion: The readability of ACR patient education materials exceeds AMA recommendations, and while ChatGPT significantly lowered the reading level, it failed to reach the 6th-grade level.

Key Points

• The readability of ACR patient education materials exceeds the AMA reading level recommendations.

• AI has the potential to enhance the readability of patient education materials but failed to reach the desired 6th grade reading level.

• The inherent bias of ChatGPT must be considered when evaluating the potential for patient education material optimization.

Keywords: Artificial intelligence; ChatGPT; Health literacy; Patient education.

© 2025. The Author(s), under exclusive licence to International League of Associations for Rheumatology (ILAR).


Conflict of interest statement

Ethics approval: The manuscript does not contain clinical studies or patient data.

Consent for publication: Not applicable.

Competing interests: The authors declare no competing interests.

References

    1. Safeer RS, Keenan J (2005) Health literacy: the gap between physicians and patients. Am Fam Physician 72(3):463–468 - PubMed
    2. Weiss BD (2003) Health literacy: a manual for clinicians. American Medical Association Foundation and American Medical Association, Chicago, IL
    3. Rhee RL, Von Feldt JM, Schumacher HR, Merkel PA (2013) Readability and suitability assessment of patient education materials in rheumatic diseases. Arthritis Care Res (Hoboken) 65(10):1702–1706. https://doi.org/10.1002/acr.22046
    4. Nweke U, Hassan S, Ahmad U, Jolly M (2022) Are the American College of Rheumatology's web-based patient education materials easy for patients to read and comprehend? [abstract]. Arthritis Rheumatol 74(suppl 9). https://acrabstracts.org/abstract/are-the-american-college-of-rheumatolo... Accessed 26 Apr 2025
    5. Baker DW, Gazmararian JA, Williams MV (2002) Functional health literacy and the risk of hospital admission among Medicare managed care enrollees. Am J Public Health 92(8):1278–1283 - DOI - PubMed - PMC
