Assessing the Quality of the A3 Thinking Tool for Problem Solving

Advances in Intelligent Systems and Computing, 2016

Abstract

The objective of this pilot study was to assess the inter-rater reliability of a newly developed A3 Quality Assessment (QA) rubric for evaluating the quality of completed Plan-Do-Study-Act (PDSA) projects that used an A3 Thinking Tool (A3) for problem solving. One completed A3 was independently reviewed by 7 PDSA experts using the rubric's 5 main levels and 22 sublevels. Evaluations were compared, coded for agreement, and used for statistical analysis. Fleiss' kappa statistics were computed to test inter-rater reliability among the experts across the 5 main levels and 22 sublevels. Preliminary results suggest that the A3 QA rubric meets reliability criteria, showing a moderate level of agreement beyond chance alone (κ = 0.44), and that it is applicable for measuring progress in problem-solving abilities developed through PDSA cycles. Additional verification testing is needed across multiple A3 improvement projects completed using multiple A3 Thinking templates.
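To illustrate the kind of inter-rater reliability analysis described above, the following is a minimal sketch of a Fleiss' kappa computation for 7 raters scoring 22 rubric sublevels. The ratings, the two-point agreement scale, and the use of the statsmodels library are assumptions for illustration only; they are not taken from the paper.

```python
# Sketch: Fleiss' kappa for 7 raters scoring 22 rubric sublevels.
# The ratings below are randomly generated placeholders, not study data.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

n_items, n_raters = 22, 7
rng = np.random.default_rng(0)
# Hypothetical codes per sublevel: 0 = criterion not met, 1 = criterion met
ratings = rng.integers(0, 2, size=(n_items, n_raters))

# Convert the items x raters matrix into an items x categories count table
table, _ = aggregate_raters(ratings)
kappa = fleiss_kappa(table, method="fleiss")
print(f"Fleiss' kappa: {kappa:.2f}")
```

A kappa in the 0.41–0.60 range is conventionally interpreted as moderate agreement beyond chance, which is how a value such as κ = 0.44 would be read.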
