Publication:
Pretest Item Calibration in Computerized Multistage Adaptive Testing

dc.contributor.author: Karatoprak Ersen, Rabia
dc.contributor.author: Lee, Won‐Chan
dc.date.accessioned: 2026-01-04T18:28:36Z
dc.date.issued: 2023-03-10
dc.description.abstract: The purpose of this study was to compare calibration and linking methods for placing pretest item parameter estimates on the item pool scale in a 1‐3 computerized multistage adaptive testing design in terms of item parameter recovery. Two models were used: embedded‐section, in which pretest items were administered within a separate module, and embedded‐items, in which pretest items were distributed across operational modules. The calibration methods were separate calibration with linking (SC) and fixed calibration (FC), with three parallel approaches under each (FC‐1 and SC‐1; FC‐2 and SC‐2; FC‐3 and SC‐3). FC‐1 and SC‐1 used only operational items in the routing module to link pretest items. FC‐2 and SC‐2 also used only operational items in the routing module for linking, but in addition, the operational items in second-stage modules were freely estimated. FC‐3 and SC‐3 used operational items in all modules to link pretest items. The third calibration approach (i.e., FC‐3 and SC‐3) yielded the best results. For all three approaches, SC outperformed FC in all study conditions, which were module length, sample size, and examinee distributions. (A minimal scale-linking sketch follows the metadata fields below.)
dc.description.uri: https://doi.org/10.1111/jedm.12361
dc.identifier.doi: 10.1111/jedm.12361
dc.identifier.eissn: 1745-3984
dc.identifier.endpage: 401
dc.identifier.issn: 0022-0655
dc.identifier.openaire: doi_________::b2906a46e85accf961e75c5c84bfb0cd
dc.identifier.orcid: 0000-0001-8617-1908
dc.identifier.scopus: 2-s2.0-85150647249
dc.identifier.startpage: 379
dc.identifier.uri: https://hdl.handle.net/20.500.12597/40556
dc.identifier.volume: 60
dc.identifier.wos: 000946664300001
dc.language.iso: eng
dc.publisher: Wiley
dc.relation.ispartof: Journal of Educational Measurement
dc.rights: CLOSED
dc.title: Pretest Item Calibration in Computerized Multistage Adaptive Testing
dc.type: Article
dspace.entity.type: Publication
local.import.source: OpenAire
local.indexed.at: WOS
local.indexed.at: Scopus
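
Note on the linking step described in the abstract: under separate calibration with linking (SC), pretest items are calibrated freely and then transformed onto the item pool scale using operational (anchor) items; SC‐1, SC‐2, and SC‐3 differ mainly in which operational items serve as anchors. The abstract does not state the specific linking method, so the sketch below is illustrative only: it assumes a 2PL model and the classical mean/sigma scale transformation, and all item names and parameter values are hypothetical.

import numpy as np

def mean_sigma_constants(b_pool, b_new):
    # Mean/sigma linking constants mapping the new calibration scale onto the
    # item-pool scale, using anchor (operational) items whose difficulty
    # estimates are available on both scales.
    b_pool = np.asarray(b_pool, dtype=float)
    b_new = np.asarray(b_new, dtype=float)
    A = b_pool.std(ddof=1) / b_new.std(ddof=1)
    B = b_pool.mean() - A * b_new.mean()
    return A, B

def link_to_pool_scale(a_new, b_new, A, B):
    # Place 2PL pretest item estimates on the item-pool scale:
    # b* = A*b + B, a* = a / A.
    a_star = np.asarray(a_new, dtype=float) / A
    b_star = A * np.asarray(b_new, dtype=float) + B
    return a_star, b_star

# Hypothetical anchor difficulties: pool-scale values vs. freely re-estimated values.
b_anchor_pool = [-1.2, -0.4, 0.3, 1.1]
b_anchor_new  = [-1.0, -0.2, 0.5, 1.3]
A, B = mean_sigma_constants(b_anchor_pool, b_anchor_new)

# Hypothetical pretest item estimates on the new scale, linked to the pool scale.
a_pretest = [0.9, 1.4]
b_pretest = [-0.6, 0.8]
a_linked, b_linked = link_to_pool_scale(a_pretest, b_pretest, A, B)
print(A, B, a_linked, b_linked)

Choosing anchors from the routing module only versus all modules (the SC‐1 vs. SC‐3 distinction) only changes which values feed mean_sigma_constants. Under fixed calibration (FC), by contrast, the operational item parameters are held at their pool-scale values during estimation, so no post hoc transformation is needed.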

Files

Collections