Give automated marking a chance

Educators should have an open mind about computers assessing creative work

Most educators use smartphones throughout their day. They communicate using email, utilise social media in various forms and drive cars that carry embedded computers. They also rely on the spell checkers and grammar tools offered by Microsoft Word and other programs. It is clear to most educators that computer technology is developing, and that adapting to it is necessary.

For those educators expressing discontent over computer-automated marking, it might be wise to consider its benefits before disparaging the concept or making claims based on fear and anecdote.

Whilst it is true that adaptive technologies cannot yet wholly or accurately mark all forms of creative writing, educators should be open to working with technology in the interests of advancing educational outcomes, productivity and efficiency.

The context

Computers have been marking aspects of student assessment for well over a decade, and automated marking has become increasingly sophisticated. It is likely that every adult in Australia has experienced exams, competitive tests such as the Australian Mathematics Competition, externally set school exams or even a driving test in which computers were integrated into the marking process. No one argues that the computer marking of so-called ‘objective response’ multiple-choice questions is a problem, despite issues arising annually in every Year 12 exam featuring multiple-choice questions, where some questions may be open to interpretation and ambiguity. Nevertheless, educators have not stopped relying on computer marking as a result, even when there is confusion or disagreement over how questions are framed, or over the proposed ‘acceptable’ responses.

In terms of creatively written tasks, the notion of computers marking creative writing seems to have generated significant resistance from some educators. However, this resistance needs to be assessed in the light of what is currently done in external assessment. For written tasks on external exams (such as the Year 12 HSC in NSW), teachers are briefed to mark to a standard from a pre-determined marking guideline and are then allowed to mark from home at night after school, and on weekends. It is possible that a marker may be sipping a glass of red wine while The Bachelorette or another television show plays in the background as they read and mark scanned scripts online.

Educators should be open to working with technology in the interests of advancing educational outcomes, productivity and efficiency

Teachers are required to mark in the order of ten essays per hour, marking out of 20 or 25. Most items are marked once. If items are double-marked, which is occurring less frequently, a margin of disparity between markers is allowed. The margin can be as much as 15% above or below the first-given mark, meaning a variation of up to 30% can occur if three people mark a response. When it comes to creative tasks, it can be very difficult for subjectivity and discretion not to affect how a marker allocates marks, even against a predetermined set of criteria.
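To make that margin arithmetic concrete, here is a minimal sketch. The specific mark and essay total are assumptions chosen for illustration; only the 15% margin comes from the figures above. It shows how two second markers can each sit within tolerance of the first mark yet differ from one another by 30% of the total:

```python
# Illustrative only: assumes an essay marked out of 20 and the
# 15% margin of disparity described above.
MAX_MARK = 20
MARGIN = 0.15                      # 15% of the maximum mark

first_mark = 15
tolerance = MARGIN * MAX_MARK      # 3 marks either way

low = first_mark - tolerance       # a second marker could give 12
high = first_mark + tolerance      # ... or 18, and both stand

# The two extremes differ by 6 marks: 30% of the total.
spread = (high - low) / MAX_MARK
print(low, high, spread)
```

The point is not the particular numbers but that the tolerance compounds: each marker is individually ‘within range’ while the range itself is wide.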
Can algorithms mark creative written work?

Algorithms can be programmed to ‘recognise’ particular forms of writing and the choice, use and order of words. There is academic evidence supporting the accuracy and efficacy of computer-based marking of written work.
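As a hedged sketch of the general idea, and not a description of any real marking engine, an algorithm might begin by extracting simple lexical and syntactic features like these. The feature names and the sample sentence are invented for illustration:

```python
import re

def extract_features(text: str) -> dict:
    """Extract a few simple lexical/syntactic features of the kind
    an automated marker might weigh. Purely illustrative."""
    words = re.findall(r"[A-Za-z']+", text.lower())
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return {
        "word_count": len(words),
        # proportion of distinct words: a crude lexical-richness measure
        "vocab_richness": len(set(words)) / len(words) if words else 0.0,
        "avg_sentence_length": len(words) / len(sentences) if sentences else 0.0,
    }

sample = "The wind rose. It carried the sea, the salt and the night."
features = extract_features(sample)
print(features)
```

Real systems combine many such signals, including semantic ones, into a trained scoring model; this sketch only shows the kind of raw material they start from.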
Managing the transition

It is probable that algorithms will read ‘preferred’ forms of writing and, in this context, it is incumbent on authorities to disclose what forms of writing are acceptable. This should be publicly available to ensure equity and access.

Lexical, syntactic and semantic features are three important aspects of creative writing – but not all of them. A consideration for assessors and educators is that the work has to be typed in order to be marked by computer, so a question arises as to whether the same creativity can be brought to typed tasks as to hand-written tasks. Consider, for example, a creative piece written in a non-linear or non-rational (that is, creative) order as an aspect of the creative element.

Moreover, what if, as an aspect of creativity, a child utilises SMS text and abbreviated speech that scores poorly on account of syntax but is actually valuable in the context of the piece?

The issue here is the ‘outliers.’ However, outliers at present are still penalised through the inaccuracies of the human-marking process. Having marked external, state-run tests, I can say from experience that even amongst humans there are significant disparities in marking, even when marking against a set of ‘objective standards.’ Therefore, a robust hybrid approach should be favoured, with extensive human-marked sampling for validation. This should include having the work of all students who do not achieve requisite standards marked by teachers as well.
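That hybrid arrangement could be expressed as a simple routing rule. The sketch below is a hypothetical policy, with the pass mark and sampling rate invented for illustration: every script is computer-marked, a random sample goes to human markers for validation, and anything below the requisite standard is always human-marked as well:

```python
import random

PASS_MARK = 10      # assumed requisite standard (out of 20)
SAMPLE_RATE = 0.10  # assumed validation sample: 10% of passing scripts

def needs_human_marking(computer_mark: int, rng: random.Random) -> bool:
    """Route a computer-marked script to a human marker if it falls
    below the requisite standard, or if it lands in the random
    validation sample. Thresholds here are illustrative."""
    if computer_mark < PASS_MARK:
        return True                    # below standard: always human-marked
    return rng.random() < SAMPLE_RATE  # otherwise: random validation sample

rng = random.Random(42)
marks = [8, 14, 9, 17, 12]
routed = [needs_human_marking(m, rng) for m in marks]
```

A rule like this keeps human judgement where it matters most while letting the computer carry the bulk of routine marking.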

Educators should be interested in understanding how adaptive technologies can ‘read’ creativity and the elements of persuasion. It is very common for people to be sceptical about the uncertainty that change can bring. However, with metrics derived from clever design and application, the outcomes for students can be extremely beneficial: more targeted support and improved evidence for rich teaching, helping educators to use their resources and energy wisely.