A Study for the Development of Automated Essay Scoring (AES) in Malaysian English Test Environment
Keywords: Automated Essay Scoring (AES), Innovative Computing, Intelligent System in Education, Natural Language Processing, Artificial Intelligence
Automated Essay Scoring (AES) is the use of specialized computer programs to assign grades to essays written in an educational assessment context. It was developed to overcome the time, cost, and reliability issues of writing assessment. Most contemporary AES systems are "western" proprietary products designed for native English speakers: their source code is not made available to the public, and their assessment criteria tend to be tied to the scoring rubrics of a particular English test context. Such AES systems may therefore not be appropriate for direct adoption in the Malaysian context. To date, no actual software development work has been reported on building an AES for the Malaysian English test environment. This work is therefore carried out as a study to formulate the requirements of a local AES targeted at Malaysia's essay assessment environment. In our work, we assessed a well-known AES called LightSide to determine its suitability in our local context. We used various machine learning techniques provided by LightSide to predict the scores of Malaysian University English Test (MUET) essays, and compared its performance, i.e. the percentage of exact agreement between LightSide's predicted scores and the human scores of the essays. In addition, we review and discuss the theoretical aspects of AES, i.e. its state of the art and its reliability and validity requirements. The findings in this paper will be used as the basis of our future work in developing a local AES, namely the Intelligent Essay Grader (IEG), for the Malaysian English test environment.
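To make the evaluation metric concrete, the percentage of exact agreement mentioned above can be sketched as below. This is a minimal illustration with hypothetical scores on an assumed 1-6 band scale; the values are invented for demonstration and are not actual MUET data or LightSide output.

```python
def exact_agreement(machine_scores, human_scores):
    """Return the percentage of essays for which the machine-predicted
    score exactly matches the human-assigned score."""
    if len(machine_scores) != len(human_scores):
        raise ValueError("score lists must be the same length")
    matches = sum(m == h for m, h in zip(machine_scores, human_scores))
    return 100.0 * matches / len(machine_scores)

# Hypothetical example: 8 essays scored on a 1-6 band scale
machine = [4, 3, 5, 2, 4, 6, 3, 5]
human   = [4, 3, 4, 2, 5, 6, 3, 5]
print(exact_agreement(machine, human))  # 6 of 8 scores match -> 75.0
```

A higher exact-agreement percentage indicates that the automated scorer more closely reproduces the human raters' judgments; related but looser measures, such as adjacent agreement, are also common in AES evaluation.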