An exchange algorithm for optimal calibration of items in computerized achievement tests
Stockholm University, Faculty of Social Sciences, Department of Statistics.
Stockholm University, Faculty of Social Sciences, Department of Statistics. ORCID iD: 0000-0003-4161-7851
(English) Manuscript (preprint) (Other academic)
Abstract [en]

The importance of large-scale achievement tests, like national tests in school, eligibility tests for university, or international assessments for the evaluation of students, is increasing. Pretesting of questions for such tests is done by adding them to an ordinary achievement test in order to determine the questions' characteristic properties. If computerized tests are used, it has been shown with optimal experimental design methods that it is efficient to assign pretest questions to examinees based on their abilities. We can take into account the specific distribution of abilities of the available examinees and apply restricted optimal designs. A previously used algorithm optimizes the criterion directly. Here, we develop a new algorithm which builds on an equivalence theorem. It discretizes the design space (with the possibility to change the grid during the run), makes use of an exchange idea, and filters computed designs. We illustrate how the algorithm works in some examples and how convergence can be checked. We show that this new algorithm can be used flexibly even if different models are assumed for different questions.
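The record does not include the algorithm itself. As general background only, the following is a minimal sketch of a classical exchange-type (Fedorov-style) search for a D-optimal calibration design on a discretized ability grid, assuming a single item following a two-parameter logistic (2PL) model. The function names (info_2pl, exchange_d_optimal), the grid, and the exact-design formulation are illustrative assumptions; they do not reproduce the authors' algorithm, which handles restricted designs for many items and filters computed designs.

import numpy as np

def info_2pl(theta, a=1.0, b=0.0):
    # Fisher information matrix for the discrimination/difficulty parameters
    # (a, b) of a 2PL item answered by an examinee of ability theta.
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    grad = np.array([theta - b, -a])          # gradient of the logit w.r.t. (a, b)
    return p * (1.0 - p) * np.outer(grad, grad)

def exchange_d_optimal(grid, n, a=1.0, b=0.0, max_passes=100):
    # Exact n-point D-optimal calibration design found by pairwise exchange
    # over a discretized ability grid (illustrative sketch, not the paper's method).
    rng = np.random.default_rng(0)
    infos = [info_2pl(t, a, b) for t in grid]
    design = list(rng.integers(0, len(grid), size=n))

    def logdet(idx):
        sign, val = np.linalg.slogdet(sum(infos[i] for i in idx))
        return val if sign > 0 else -np.inf

    best = logdet(design)
    for _ in range(max_passes):
        improved = False
        for pos in range(n):                  # try to exchange every design point ...
            for cand in range(len(grid)):     # ... against every candidate grid point
                trial = design[:pos] + [cand] + design[pos + 1:]
                val = logdet(trial)
                if val > best + 1e-10:
                    design, best, improved = trial, val, True
        if not improved:                      # stop when no exchange improves the criterion
            break
    return np.sort(np.asarray(grid)[design]), best

if __name__ == "__main__":
    ability_grid = np.linspace(-3.0, 3.0, 61)     # discretized design space
    points, crit = exchange_d_optimal(ability_grid, n=10)
    print("selected examinee abilities:", points)
    print("log-determinant of information:", crit)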

National Category
Probability Theory and Statistics
Research subject
Statistics
Identifiers
URN: urn:nbn:se:su:diva-174075
OAI: oai:DiVA.org:su-174075
DiVA, id: diva2:1357022
Available from: 2019-10-02 Created: 2019-10-02 Last updated: 2020-01-23 Bibliographically approved
In thesis
1. Achievement tests and optimal design for pretesting of questions
2019 (English) Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Achievement tests are used to measure students' proficiency in a particular knowledge domain. Computerized achievement tests (e.g. the GRE and SAT) usually draw questions from an item bank to measure the proficiency of students. An item bank is a large collection of items with known characteristics (e.g. difficulty). Item banks are continuously updated and revised, with new items replacing obsolete, overexposed or flawed items over time. This thesis is devoted to updating and maintaining the item bank with high-quality questions and better estimation of the item parameters (item calibration).
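As illustrative background (the specific item response models used in each paper are not stated in this record), a standard model for a dichotomously scored item j with discrimination a_j and difficulty b_j is the two-parameter logistic (2PL) model,

P_j(\theta) = \Pr(Y_j = 1 \mid \theta) = \frac{1}{1 + \exp\{-a_j(\theta - b_j)\}},

where \theta denotes the examinee's ability. Item calibration then means estimating a_j and b_j from the responses of examinees with (approximately) known abilities.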

The thesis contains four manuscripts. One paper investigates the impact of student ability dimensionality on the estimated parameters and the other three deal with item calibration.

In the first paper, we investigate how the ability dimensionality influences the estimates of the item parameters. Through a case study and a simulation study, we found that a multidimensional model discriminates better among the students.

The second paper describes a method for optimal item calibration by efficiently selecting the examinees based on their ability levels. We develop an algorithm which selects intervals for the students' ability levels for optimal calibration of the items. We also develop an equivalence theorem for item calibration to verify the optimal design.  
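The record does not state the theorem itself. As hedged background, equivalence theorems for verifying optimal designs typically take the classical Kiefer-Wolfowitz form: a design \xi^* with information matrix M(\xi^*) is locally D-optimal if and only if

\operatorname{tr}\{ M(\xi^*)^{-1} \mu(\theta) \} \le p \quad \text{for all abilities } \theta,

with equality at the support points of \xi^*, where \mu(\theta) is the information contributed by an examinee of ability \theta and p is the number of item parameters. The theorem in Paper II presumably adapts a condition of this kind to the restricted calibration designs considered in the thesis.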

The algorithm developed in Paper II becomes complicated as the number of items to be calibrated increases. Therefore, in Paper III we develop a new exchange algorithm based on the equivalence theorem from Paper II.

Finally, the fourth paper generalizes the exchange algorithm described in Paper III by assuming that the students have multidimensional abilities to answer the questions.

Place, publisher, year, edition, pages
Department of Statistics, Stockholm University, 2019. p. 26
Keywords
Achievement test, Equivalence theorem, Exchange algorithm, Item calibration, Item response theory model, Optimal experimental design
National Category
Probability Theory and Statistics
Research subject
Statistics
Identifiers
urn:nbn:se:su:diva-174079 (URN)
978-91-7797-879-4 (ISBN)
978-91-7797-880-0 (ISBN)
Public defence
2019-11-15, William-Olssonsalen, Geovetenskapens hus, Svante Arrhenius väg 14, floor 1, Stockholm, 10:00 (English)
Note

At the time of the doctoral defense, the following papers were unpublished and had a status as follows: Paper 1: Manuscript. Paper 3: Manuscript. Paper 4: Manuscript.

Available from: 2019-10-23 Created: 2019-10-02 Last updated: 2020-01-13 Bibliographically approved

Open Access in DiVA

No full text in DiVA

Search in DiVA

By author/editor
Ul Hassan, Mahmood; Miller, Frank
By organisation
Department of Statistics
Probability Theory and Statistics