OLNA Support has partnered with Rembiont to provide instant written response analysis software

The team at OLNA Support has partnered with another online education provider, Rembiont Pty Ltd, which runs the NAPLearn website. Rembiont has developed proprietary algorithms that use Natural Language Processing (NLP) techniques and machine learning models to analyse writing responses. This technology was developed and refined by one of Rembiont's directors, Dr Robert Williams, who has spent over 33 years in the information systems industry and as a researcher and lecturer across multiple universities in Perth. He developed the company's Automated Essay Grading (AEG) software, which uses natural language processing to assign grades to electronic versions of English-language essays. Robert and his colleagues have received over $900,000 in research and business development funding for their AEG and related projects. In 2018, Ewan Thompson founded Rembiont to develop an essay grading API and SaaS platform to further commercialise Robert's technology. Ewan has extensive experience in the information systems industry, including many years in various software development roles, over 10 years as CTO of the Australian ASX-listed company IDEAS International, and five years in VP roles at Gartner.

AEG systems generally perform as well as human graders when human and computer scores on the same essays are compared. This is documented in research by Robert and one of his students, linked below:

https://espace.curtin.edu.au/handle/20.500.11937/38
https://espace.curtin.edu.au/handle/20.500.11937/1870

Continual refinement of this software over time has resulted in a very high correlation of 0.80–0.92 between human and computer marking scores.

The team at OLNA Support has been impressed with the validity and reliability of this AEG software and has worked with Rembiont to tailor a highly specific AEG system that marks students' writing responses on the OLNA Support website.

What does the Response Analysis Software mark against?

Broadly, the Response Analysis Software marks against Levels 3 and 4 of the Writing section of the Australian Core Skills Framework (developed by the federal Department of Education, Skills and Employment), which sets out the minimum standards of literacy in the Australian education system. Specifically, the Response Analysis Software marks against the Online Literacy and Numeracy Assessment (OLNA), which is derived from the Australian Core Skills Framework.

Students can practise and receive feedback on four types of writing prompts: Persuasive, Procedural, Informative and Creative/Narrative. All four categories are listed in the Australian Core Skills Framework as areas of written literacy in which members of the Australian community should be proficient.

We have developed and aligned this Response Analysis Software specifically with the OLNA Writing guide rubric, which details the areas marked by the School Curriculum and Standards Authority (SCSA) in an OLNA writing assessment. The seven criteria that are graded are:

Audience
Structure and Organisation
Vocabulary
Cohesion
Sentence Structure
Punctuation
Spelling

We have also included two unmarked criteria, "Ideas" and "Character and Setting", to enrich the feedback students receive. Students will receive a mark for each criterion and an overall mark for their response.
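As an illustration of this structure, here is a minimal sketch of how per-criterion marks could be represented in code. The field names, score scale and helper function are illustrative assumptions, not the actual schema used by the Response Analysis Software.

```python
# Illustrative sketch only: field names, scores and structure are
# assumptions, not Rembiont's actual schema.
from dataclasses import dataclass

MARKED_CRITERIA = [
    "Audience", "Structure and Organisation", "Vocabulary",
    "Cohesion", "Sentence Structure", "Punctuation", "Spelling",
]
UNMARKED_CRITERIA = ["Ideas", "Character and Setting"]

@dataclass
class CriterionResult:
    name: str
    score: float           # hypothetical mark awarded for this criterion
    counts_to_total: bool  # False for the two unmarked criteria

def overall_mark(results):
    """Sum the marks of criteria that count toward the overall response mark."""
    return sum(r.score for r in results if r.counts_to_total)

results = [CriterionResult("Spelling", 3.0, True), CriterionResult("Ideas", 2.0, False)]
print(overall_mark(results))  # -> 3.0
```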

How does the Response Analysis Software grade a writing response?

The OLNA Support Response Analysis Software uses proprietary algorithms built on Natural Language Processing (NLP) techniques and machine learning models to analyse a student's response. The software detects events in a response and measures information density across the essay, and this information is then used to assign scores for the OLNA grading criteria listed above.
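Rembiont's exact measures are proprietary, but as a rough illustration, information density is often approximated as the proportion of content words (nouns, verbs, adjectives and adverbs) among all words in a text. The sketch below shows that proxy using the open-source spaCy library; it is an assumption for explanatory purposes, not the actual calculation used.

```python
# A rough proxy for information density: the share of content words
# among all words. Illustrative assumption, not Rembiont's measure.
# Setup: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
CONTENT_POS = {"NOUN", "PROPN", "VERB", "ADJ", "ADV"}

def information_density(text: str) -> float:
    doc = nlp(text)
    words = [t for t in doc if t.is_alpha]
    if not words:
        return 0.0
    return sum(t.pos_ in CONTENT_POS for t in words) / len(words)

print(information_density("The weary traveller finally reached the ancient city at dusk."))
```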

Events consist of:

– Actors
– Actions
– Locations
– Times

NLP software is used to detect these objects in a response.
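As an illustration of what such detection involves, the sketch below uses the open-source spaCy library's part-of-speech tags and named-entity labels to pull these four components out of a sentence. The proprietary algorithms behind the Response Analysis Software are more sophisticated; this is purely an explanatory stand-in.

```python
# Illustrative event-component detection with off-the-shelf NLP tooling.
# Setup: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")

def extract_event_components(text: str) -> dict:
    doc = nlp(text)
    return {
        "actors": [e.text for e in doc.ents if e.label_ == "PERSON"],
        "actions": [t.lemma_ for t in doc if t.pos_ == "VERB"],
        "locations": [e.text for e in doc.ents if e.label_ in ("GPE", "LOC", "FAC")],
        "times": [e.text for e in doc.ents if e.label_ in ("DATE", "TIME")],
    }

print(extract_event_components("On Saturday morning, Mia cycled to Fremantle and met her coach."))
```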

A thesaurus is also used to identify the concepts the response discusses, and these concepts are then used when assigning scores for content.
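The passage above does not name the thesaurus used, so the sketch below uses WordNet (via NLTK) as a stand-in, mapping each word to its broader parent concepts (hypernyms) and counting the most common. Both the choice of resource and the counting approach are assumptions for explanatory purposes.

```python
# Illustrative only: WordNet stands in for the unnamed thesaurus.
# Setup: pip install nltk
from collections import Counter

import nltk
nltk.download("wordnet", quiet=True)
from nltk.corpus import wordnet as wn

def top_concepts(words, n=5):
    """Count the most common parent concepts (hypernyms) across the given words."""
    concepts = Counter()
    for word in words:
        for synset in wn.synsets(word):
            for hypernym in synset.hypernyms():
                concepts[hypernym.lemmas()[0].name()] += 1
    return [name for name, _ in concepts.most_common(n)]

print(top_concepts(["river", "ocean", "lake", "stream"]))
# Expect broad concepts such as "body_of_water" to surface.
```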

What Feedback does the Response Analysis Software give?

The aim of OLNA Support is to enable Western Australian students in Years 10, 11 and 12 to practise OLNA-style writing at school and at home in preparation for their OLNA writing assessment. The comprehensive, instant feedback provided on a response is designed to enable students, together with teachers and parents, to discuss the strengths and weaknesses of the response.

Once a student's response has been finalised and submitted, it will be graded within a few seconds. The feedback gives a mark for each criterion (Audience, Vocabulary, Spelling, etc.), a level within that criterion (Beginner, Intermediate or Expert) and comments tailored to the response. Students will also be able to see where spelling and grammatical errors were made, so they can correct them in the future. The feedback is stored on the student's account and can be retrieved for further revision.
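As an illustration of how a criterion mark might map to one of these levels, see the sketch below. The cut-off thresholds shown are illustrative assumptions, as the actual cut-offs used by the Response Analysis Software are not published.

```python
# Illustrative level banding; the thresholds are assumptions.
def feedback_level(score: float, max_score: float) -> str:
    """Map a criterion mark to Beginner / Intermediate / Expert."""
    ratio = score / max_score
    if ratio < 0.5:    # assumed cut-off
        return "Beginner"
    if ratio < 0.8:    # assumed cut-off
        return "Intermediate"
    return "Expert"

print(feedback_level(7, 10))  # -> "Intermediate"
```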

Will my Response Analysis scores reflect an actual score from an SCSA OLNA assessor?

The scores provided by the Response Analysis Software on the OLNA Support website may not be identical to the scores a response would be awarded in an actual OLNA assessment, though in most cases they should be similar. In various tests involving hundreds of responses, the Response Analysis Grading Engine had a correlation of 0.80 to 0.92 with human markers, which is similar to, or exceeds, the correlation between different human markers grading the same responses. There will always be some natural variation between human markers, just as there is between human and machine marking, so some difference between marks should be expected.
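For readers unfamiliar with the statistic, the correlation quoted above is Pearson's r between human and machine marks on the same set of responses, where values near 1.0 indicate close agreement. The sketch below shows the computation on score pairs that are invented purely to demonstrate the calculation.

```python
# Pearson's r between hypothetical human and machine marks on the
# same responses. The score pairs are made up for illustration only.
from statistics import correlation  # Python 3.10+

human_marks   = [12, 15, 9, 18, 14, 11, 16, 13]
machine_marks = [11, 16, 10, 17, 13, 12, 15, 14]

print(f"Pearson r = {correlation(human_marks, machine_marks):.2f}")  # ~0.93
```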