I. Are E-Assessment Tools Helpful in Programming Courses?

Four research questions were formulated to measure the degree to which e-assessment tools have helped students and instructors [25]:


1) Have e-assessment tools proven to be helpful in improving student learning?

2) Do students think that e-assessment tools have improved their performance?

3) After having used the
tools, do instructors think that the tools have improved their teaching
experiences?

4) Is the assessment performed by e-assessment tools accurate enough to be helpful?

 

1: Have e-assessment tools proven to be helpful in improving student learning?

In 2003, Edwards [26] presented striking results from moving between e-assessment tools in a junior-level comparative languages course: when Web-CAT replaced Curator, students began submitting assignments, along with test cases, well before the deadline. In 2003, Woit [27] showed that online testing of students’ practical skills provides a more accurate measure of student ability. This conclusion is supported by data collected over five academic years comparing student performance on online tests with and without e-assessment tools. In 2005, Higgins [28] described an experiment in which Ceilidh was replaced by CourseMarker at the University of Nottingham. The ratio of student passes to failures was found to be very high and improved as CourseMarker evolved. In 2005, Kumar [29] showed learning improvement with an automated tutor aimed at testing static and dynamic scoping concepts in a programming languages course. Also in 2005, Malmi [30] reported results from students using TRAKLA and TRAKLA2, in which final exam grades increased when instructors changed how students were allowed to use the automated tool and permitted them to resubmit their work. In 2011, Wang [31] showed that the final grades of students assessed with AutoLEP were significantly better than grades produced without using any tool.

 

Considering these findings, a positive impact on student learning can be inferred from the introduction of e-assessment tools into a course. End-of-course grades and final exam scores were the primary measures used to assess this.

 

2: Do students think that e-assessment tools have improved their performance?

In 2003, Edwards [26] created a 20-question survey for students using Web-CAT and found that perceptions of the tool were generally positive. In 2005, Higgins [27] distributed a survey to programming students who tested CourseMarker; over 75% of students valued features that only an e-assessment tool could provide, such as multiple submission attempts. Specifically, most students felt that having multiple submissions available encouraged them to work toward a higher grade. In 2009, when Garcia-Mateos [32] introduced Mooshak, he presented students with a survey designed as a series of questions prompting for agreement or disagreement. 77% of the students indicated that “they learn better with the new methodology than with the old one,” while 91% said that “if they could choose, they would follow the continuous evaluation methodology again.” In 2012, Brown [33] surveyed students using the JUG automated assessment tool on their perception of the tool’s impact. Given the question “Did the auto-graded tests match your expectations of the requirements?” the majority of students opted for the middle answer, “Sometimes.” But the question “Did the reports from the auto-grader clarify how your code should behave?” elicited a much more positive response, with the majority of students answering “Often.”
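
To make the idea of auto-grader reports concrete, the following is a minimal sketch (in Python) of a test-based grader that produces per-test feedback of the kind students described. The function name, test format, and messages are invented for illustration; none of the tools surveyed here necessarily work this way.

# A minimal sketch of test-based auto-grading with per-test feedback.
# grade_submission() is a hypothetical helper invented for this example;
# it does not come from Web-CAT, JUG, or any other tool discussed above.
def grade_submission(student_fn, test_cases):
    """Run a student's function on (args, expected) pairs; return score and feedback."""
    passed, feedback = 0, []
    for args, expected in test_cases:
        try:
            result = student_fn(*args)
        except Exception as exc:  # a crash counts as a failed test
            feedback.append(f"input {args}: raised {exc!r}")
            continue
        if result == expected:
            passed += 1
        else:
            feedback.append(f"input {args}: expected {expected!r}, got {result!r}")
    return 100.0 * passed / len(test_cases), feedback

# Example: a buggy submission that sorts in descending instead of ascending order.
tests = [(([3, 1, 2],), [1, 2, 3]), (([],), []), (([5, 5, 1],), [1, 5, 5])]
score, notes = grade_submission(lambda xs: sorted(xs, reverse=True), tests)
print(f"score: {score:.0f}%")  # 33%: one of three tests passed
for note in notes:
    print(note)  # e.g. "input ([3, 1, 2],): expected [1, 2, 3], got [3, 2, 1]"

Feedback of this kind shows the student the exact input, the expected output, and their own output, which is the sense in which such reports “clarify how your code should behave.”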

 

Results concerning student perceptions of e-assessment tools were inconclusive. Students had mixed reactions to this question: some were very positive, but a significant number expressed dissatisfaction with the tools.

     

3: After having used the tools, do instructors think that the tools have improved their teaching experiences?

In 1995, Schorsch [35] reported that 6 of 12 teachers who taught a class that used CAP to grade assignments stated that the tool saved them around ten hours of grading per section of roughly twenty students. In 2003, Venables [35] noted that the feedback provided by Submit, the e-assessment tool she discussed, answered many of the questions students would otherwise need to ask while working on an assignment. This capability freed up class time that would otherwise have been spent answering students’ questions. In 2012, Queirós [36] asserted that automated grading surpasses manual grading in efficiency, accuracy, and objectivity: e-assessment tools remove biases and other subjective factors from the grading process, and submissions are marked in a fraction of the time a human grader would take.

Overall, instructors appreciate e-assessment tools for the benefits they provide, such as time savings. Most instructors report that more time must be invested before a class first uses an e-assessment tool than in subsequent semesters, but the general consensus is that these tools are effective time-savers and capable at the tasks they are designed to perform.

 

4: Is the assessment performed by e-assessment tools accurate enough to be helpful?

In 2005, Higgins [37] reported that CourseMarker graded student submissions in one section of a course to at least the same level of correctness as the teaching assistant responsible for marking in another section of the same course. In 2012, Taherkhani [38] demonstrated that for about 75% of submissions, AARI was able to successfully identify the algorithms students used in a program that required them to sort integers in ascending order (a simplified sketch of this kind of source-level recognition follows this paragraph). In 2014, Gaudencio [39] reported that instructors who manually graded assignments tended to agree more with the results of an e-assessment tool than with results provided by other instructors.
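
As rough intuition for what such algorithm recognition involves, the following toy heuristic (in Python) guesses a sorting algorithm from crude source-level features. The rules are entirely invented for this sketch and are far simpler than the analysis a real recognizer such as AARI performs.

# A toy illustration of source-level algorithm recognition. The keyword
# heuristics below are invented for this sketch; AARI's actual analysis
# is far more sophisticated.
def guess_sort_algorithm(source: str) -> str:
    """Guess which sorting algorithm a code snippet implements."""
    src = source.lower()
    if "pivot" in src or "partition" in src:
        return "quicksort (pivot/partition pattern)"
    if "merge" in src:
        return "merge sort (merge step present)"
    if src.count("for") >= 2 and "swap" in src:
        return "bubble or selection sort (nested loops with swaps)"
    return "unknown"

# Example: nested loops plus a swap suggest a bubble/selection-style sort.
snippet = "for i in range(n):\n    for j in range(n - 1):\n        swap(a, j, j + 1)"
print(guess_sort_algorithm(snippet))  # bubble or selection sort (nested loops with swaps)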

 

E-assessment tools have proven to deliver beneficial results in assisting the assessment process.
