Medical College of Wisconsin

Meta-interpretive reliability of computer-based test interpretations: the Karson Clinical Report. J Pers Assess 1992 Dec;59(3):448-67

Date

12/01/1992

Pubmed ID

1487802

DOI

10.1207/s15327752jpa5903_3

Scopus ID

2-s2.0-0027016744   5 Citations

Abstract

Meta-interpretive reliability is a new method to evaluate the accuracy with which personality trait scores are communicated via interpretive statements in a computer-based test interpretation (CBTI). The prototypic experimental design is based on a two-way repeated measures analysis of variance (ANOVA); the two effects are personality traits and randomly chosen CBTI protocols. In this application, 101 psychologists read four examples of the Karson Clinical Report (KCR; Karson & O'Dell, 1975) and estimated the original trait scores from the Sixteen Personality Factor Questionnaire (16PF; Cattell, Eber, & Tatsuoka, 1970) on which the KCR is based. Estimated trait score variance was significantly related to the Trait × Protocol interaction and the main effects for personality trait and differences among protocols (ω² = .55). The total effect size corresponded to a multiple correlation of .74, suggesting that the KCR had acceptable meta-interpretive reliability. The protocol effect denoted a context effect created by the juxtaposition of several interpretive statements. Additional analyses showed that individual differences among raters contributed to less than 1% of the estimated standard ten (sten) score variance. Meta-interpretive reliability is proposed as an index of the upper limit of validity for CBTIs.
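The abstract's two effect-size figures are linked by the standard relation R = √ω²: a total ω² of .55 implies the reported multiple correlation of .74. A minimal sketch of that check (the variable names are illustrative, not from the paper):

```python
import math

# The abstract reports a total effect size omega-squared of .55 and a
# corresponding multiple correlation of .74. Under the usual
# interpretation R = sqrt(omega^2), the two figures agree.
omega_squared = 0.55
multiple_R = math.sqrt(omega_squared)
print(round(multiple_R, 2))  # 0.74
```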

Author List

Endres LS, Guastello SJ, Rieke ML

Author

Stephen Guastello, BA, MA, PhD, Professor in the Psychology department at Marquette University




MeSH terms used to index this publication - Major topics in bold

Adult
Aged
Communication
Female
Humans
Male
Middle Aged
Models, Statistical
Personality
Personality Assessment
Reproducibility of Results
Research Design