TIP Research Journal Manila, Vol. 4 No. 1 (2007)

Development of an Automated Competency Exam Analyzer with Interactive Feedback Mechanism and its Effectiveness

Jester L. Bensan | Policarpio D. Tena III

Discipline: Education, Information Technology


Abstract:

Competency exams are conducted every semester in different subjects. Questions included in a competency exam need to be verified, corrected, and validated before they are actually used; hence, a repository for these validated questions is needed. Security also becomes an issue, since exams and questions are merely printed on paper and/or stored in unsecured digital files. Post-exam tasks are likewise tedious: they involve checking student answers, recording answers and scores for test-item analysis, and presenting statistical results. This research addresses these problems by designing and developing an automated solution that analyzes the question set before a competency exam, registers student answers, and generates an interactive feedback mechanism that determines student competency for the recently concluded exam.
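To illustrate the kind of test-item analysis such a system automates, the sketch below computes two standard item statistics, the difficulty index and the discrimination index, from recorded answers. The data layout, function names, and the 27% upper/lower grouping are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: common item-analysis statistics that an
# automated competency-exam analyzer could compute from recorded answers.
# Data layout and function names are assumptions, not the authors' code.

def difficulty_index(item_scores):
    """Proportion of examinees who answered the item correctly (0..1)."""
    return sum(item_scores) / len(item_scores)

def discrimination_index(item_scores, total_scores, group_fraction=0.27):
    """Upper-group minus lower-group difficulty, using the top and bottom
    fraction of examinees ranked by total exam score."""
    ranked = sorted(range(len(total_scores)), key=lambda i: total_scores[i])
    k = max(1, int(len(ranked) * group_fraction))
    lower, upper = ranked[:k], ranked[-k:]
    p_upper = sum(item_scores[i] for i in upper) / k
    p_lower = sum(item_scores[i] for i in lower) / k
    return p_upper - p_lower

# Example: 1 = correct, 0 = incorrect for one item across ten examinees,
# alongside each examinee's total exam score.
item = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]
totals = [35, 18, 40, 33, 20, 38, 29, 15, 42, 31]
print(difficulty_index(item))            # 0.7 (moderately easy item)
print(discrimination_index(item, totals))  # 1.0 (item separates groups well)
```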

The descriptive method was used to describe the current problem, and developmental research was used to test the effectiveness of the proposed system against the existing one. Primary data came from the instructors who create the questions and exams and fill out the test-item analysis form. Secondary sources consisted of publications related to the research. Data were gathered through interviews, observation, and survey questionnaires. Analytical tools such as block diagrams, use-case diagrams, data-flow diagrams, entity-relationship diagrams, system flowcharts, and tree views were used in conceptualizing and designing the system. The proposed system was developed following the waterfall model.