Date of Publication

2008

Document Type

Master's Thesis

Degree Name

Master of Science in Psychology

Subject Categories

Psychology

College

College of Liberal Arts

Department/Unit

Psychology

Thesis Adviser

Melissa Lopez Reyes

Defense Panel Members

Alexa P. Abrenica
Rose Marie S. Clemeña
Ma. Angeles G. Lapeña

Abstract/Summary

The main purpose of this research was to evaluate the appropriateness of applying item response theory (IRT) to the College Scholastic Aptitude Test (CSAT) and to determine whether its application would produce better psychometric properties for the test than classical test theory (CTT). The fit of three unidimensional IRT logistic models to the CSAT was explored by confirming model assumptions and predictions. The test data of a random sample of 15,000 CSAT examinees in school year 2004-2005 were used in the different analyses. The procedures were carried out on the four subtests of the CSAT both separately and collectively as a single test. Based on the results, the three-parameter logistic model (3-PLM) was considered the most suitable for the CSAT. This was further verified by demonstrating that the property of invariance was achieved when the model was applied to the test. The 3-PLM was then compared to CTT by investigating whether invariance would also be attained when examinee scores and item indices were generated using CTT procedures. The two models were further compared in terms of the resulting validity of the CSAT. The conclusions reached by this study are the following: (1) CTT-based examinee scores are as invariant as IRT-based ability scores, (2) CTT-based item indices are as invariant as IRT-based item parameters except in Inductive Reasoning and across ability levels and test sites, where only IRT produced invariant item parameters, and (3) CTT-based and IRT-based scores are comparable in terms of test validity. The findings indicate that the IRT model may be applied to the CSAT only after the following improvements are made to the test: (1) revision of test specifications to enhance the unidimensionality of the different content areas in each subtest, (2) assembly of a new test form with items selected based on IRT parameters, and (3) replication of the different analyses for investigating the fit of IRT on the new test form.
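
For readers unfamiliar with the model named in the abstract, the three-parameter logistic model (3-PLM) is conventionally written as below; this is the standard textbook form, not a formula quoted from the thesis itself:

P_i(\theta) = c_i + (1 - c_i)\,\frac{1}{1 + \exp\bigl(-a_i(\theta - b_i)\bigr)}

where \theta is the examinee's ability, a_i is the item discrimination, b_i the item difficulty, and c_i the pseudo-guessing (lower-asymptote) parameter of item i.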

Abstract Format

html

Language

English

Format

Electronic

Accession Number

CDTG004248

Shelf Location

Archives, The Learning Commons, 12F Henry Sy Sr. Hall

Physical Description

ix, 105 leaves ; 28 cm.

Keywords

Item response theory; SAT (Educational test)

