COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, vol. 28, no. 6, pp. 1406-1420, 2020 (SCI-Expanded)
In this study, a computer-based evaluation (CBE) software tool, namely CBE-MCQs, is proposed to help instructors evaluate and analyze traditional classroom-based exams composed of multiple-choice questions (MCQs) reliably, quickly, and automatically. While performing question analysis with the proposed CBE-MCQs software tool, learning outcome-based analyses are carried out by matching the learning outcomes of a course with the exam questions. With question-based evaluation, item analyses such as item difficulty, item discrimination, and item distracters can be performed easily. Also, with question-based evaluation, the instructor of the course can effectively measure the extent to which students achieve the learning outcomes of the related course. Thus, the analyses provided by the CBE-MCQs tool help instructors maintain both accreditation and training processes successfully. In addition, as a case study, the MCQ-based final exam assessment of a course was carried out with the CBE-MCQs software tool. Each MCQ was evaluated based on classical test theory in terms of the difficulty index (P), discrimination index (D), and distracter efficiency (DE). The association between D and P was observed to be statistically significant according to the Pearson correlation (r = -0.600) and the chi-square test (χ² = 15.98, df = 8; α = 0.05). Also, a system usability survey of 10 statements was administered to the instructors using this software to evaluate the usability of the CBE-MCQs tool. As a result of a series of experimental studies and the survey, the developed CBE-MCQs software tool was evaluated as simple, effective, and satisfactory by 87.4% of the instructors.
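For illustration, the classical test theory item-analysis quantities named above (difficulty index P, discrimination index D, and distracter efficiency DE) can be computed as in the following minimal Python sketch. The upper/lower 27% grouping, the 5% functional-distracter threshold, and all names and data are illustrative assumptions, not the CBE-MCQs implementation itself.

```python
from collections import Counter

def item_analysis(responses, key, options="ABCD", group_frac=0.27):
    """responses: list of (total_exam_score, chosen_option) pairs, one per student."""
    n = len(responses)
    ranked = sorted(responses, key=lambda r: r[0], reverse=True)
    g = max(1, round(group_frac * n))                # size of upper/lower group (assumed 27%)
    upper, lower = ranked[:g], ranked[-g:]

    def correct(group):
        return sum(1 for _, choice in group if choice == key)

    p = (correct(upper) + correct(lower)) / (2 * g)  # difficulty index P
    d = (correct(upper) - correct(lower)) / g        # discrimination index D

    # Distracter efficiency: a distracter counts as "functional" if at least
    # 5% of all students selected it; DE is the share of functional distracters.
    counts = Counter(choice for _, choice in responses)
    distracters = [o for o in options if o != key]
    functional = sum(1 for o in distracters if counts.get(o, 0) / n >= 0.05)
    de = functional / len(distracters)
    return p, d, de

# Hypothetical responses for a single question: (student total score, option chosen)
data = [(95, "B"), (90, "B"), (85, "B"), (80, "A"), (75, "B"),
        (70, "C"), (60, "B"), (55, "D"), (45, "A"), (30, "C")]
print(item_analysis(data, key="B"))   # -> (0.5, 1.0, 1.0)
```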