Automated Essay Scoring: Scoring Essays in Swedish
Stockholm University, Faculty of Humanities, Department of Linguistics, Computational Linguistics.
2013 (English). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
Abstract [en]

Good writing skills are essential in the education system at all levels. However, the evaluation of essays is labor intensive and can entail a subjective bias. Automated Essay Scoring (AES) is a tool that may be able to save teacher time and provide more objective evaluations. There are several successful AES systems for essays in English that are used in large scale tests. Supervised machine learning algorithms are the core component in developing these systems.

In this project, four AES systems were developed and evaluated. The systems were built on standard supervised machine learning software: LDAC, SVM with an RBF kernel, SVM with a polynomial kernel, and Extremely Randomized Trees. The training data consisted of 1500 high school essays that had been scored by the students' teachers and by blind raters. To evaluate the AES systems, the agreement between the blind raters' scores and the AES scores was compared to the agreement between the blind raters' and teachers' scores. On average, the agreement between blind raters and the AES systems was better than that between blind raters and teachers. The AES system based on LDAC had the best agreement, with a quadratic weighted kappa of 0.475; in comparison, the teachers and blind raters had a value of 0.391. However, the AES results do not meet the minimum required agreement of a quadratic weighted kappa of 0.7 as defined by the US-based nonprofit Educational Testing Service.
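The quadratic weighted kappa reported above penalizes rater disagreements by the squared distance between the two assigned scores, normalized against the agreement expected by chance. The thesis does not publish its evaluation code; the following is a minimal Python sketch of the metric (the function name and rating-range handling are my own), which should agree with scikit-learn's cohen_kappa_score with weights='quadratic'.

```python
import numpy as np

def quadratic_weighted_kappa(a, b, min_rating=None, max_rating=None):
    """Quadratic weighted kappa between two integer rating vectors."""
    a = np.asarray(a)
    b = np.asarray(b)
    if min_rating is None:
        min_rating = min(a.min(), b.min())
    if max_rating is None:
        max_rating = max(a.max(), b.max())
    n = int(max_rating - min_rating + 1)

    # Observed joint distribution of rating pairs.
    O = np.zeros((n, n))
    for x, y in zip(a - min_rating, b - min_rating):
        O[x, y] += 1
    O /= O.sum()

    # Expected joint distribution under independent marginals.
    E = np.outer(O.sum(axis=1), O.sum(axis=0))

    # Quadratic disagreement weights: 0 on the diagonal, growing
    # with the squared distance between the two ratings.
    i, j = np.indices((n, n))
    W = ((i - j) ** 2) / ((n - 1) ** 2)

    return 1.0 - (W * O).sum() / (W * E).sum()
```

Perfect agreement yields 1.0, chance-level agreement yields 0.0, so the thesis's 0.475 (LDAC vs. blind raters) sits roughly midway between chance and the 0.7 threshold cited from the Educational Testing Service.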

Abstract [sv]

I have developed and evaluated four systems for automated essay scoring (AES). LDAC, SVM with an RBF kernel, SVM with a polynomial kernel, and Extremely Randomized Trees, which are standard classifier software packages, were used as the basis for building the respective AES systems.

Place, publisher, year, edition, pages
2013, p. 51
Keywords [en]
Automated Essay Scoring, Swedish Essays, supervised machine learning
National Category
General Language Studies and Linguistics
Identifiers
URN: urn:nbn:se:su:diva-87266
OAI: oai:DiVA.org:su-87266
DiVA, id: diva2:602025
Presentation: Swedish
Uppsok: Technology
Available from: 2013-01-31. Created: 2013-01-31. Last updated: 2018-01-11. Bibliographically approved.

Open Access in DiVA

Automated Essay Scoring: Scoring Essays in Swedish (1234 kB), 4050 downloads
File name: FULLTEXT01.pdf
File size: 1234 kB
Checksum (SHA-512): 11718a7b9e07a9d1038245ec13c80bd65c54844a2c67e2748a9f95687567858c8d8f142553d32574c4619748d560d5ed41dd05e3e46923151be7df5d0ea0ba33
Type: fulltext
Mimetype: application/pdf
