Resource Database: Geography, Earth and Environmental Sciences

Title

Computer-marked Tests in Geography and Geology

Originator

Joe Angseesing

Department

School of Environment, Cheltenham & Gloucester College of Higher Education, Francis Close Hall, Swindon Road, Cheltenham, GL50 4AZ, UK

Tel.

+44 (0)1242 532973

Fax

+44 (0)1242 543273

Email

joea@chelt.ac.uk


Outline:

First year students in geography and geology have been partly examined using computer-based assessments since 1985. Our current practice is to use multiple-choice tests which are undertaken at a computer terminal. Marking of test answers is objective, absolutely consistent and extremely rapid, and output includes an analysis of facility and of the distractors chosen.

Context:

Cheltenham & Gloucester College operates a modular degree scheme based on a semester system, with 15 weeks per semester. First year modules have relatively large student numbers (up to 120 in the Geography and Geology Department) because they are taken by most minor, as well as major and joint, subject students. Computer-based tests are used in geology (up to 60 students), physical geography and geography modules, and have been used in modules in earth's resources. They help to ease the assessment bottleneck that occurs twice a year at the end of each semester.

Currently, automated testing is used in five of the Department's eleven first-year modules, an increase over previous years; between 1985 and 1994 only 10-25% of level I modules were assessed in this way. In two modules the tests form part of the formal examination programme at the end of the semester, while in the other three they occur within the modules as part of the continuous assessment (see table below).

Description

All the computer-based tests are undertaken at a computer terminal. The software used, CMAPROG, was developed in house. It is loaded in advance by the invigilator, who also distributes any ancillary material. Students fill in a cover-sheet, noting their student number and terminal number, and begin. The software is self-timing, starting from the point where the question file is loaded.

The software is text-based: it presents each question on-screen together with an array of possible answers. Examinees respond by typing a key (usually a number in the range 1 to 6) for the selected response.

About half the tests in current use refer to ancillary material - mostly specimens in geology exams and printed graphs, diagrams and tables in geography. Because answers are selected (or edited) with a single keystroke, the time limit is not a problem: candidates are set between 60 and 80 questions per hour and usually finish with 10-20 minutes to spare. Large modules may require two sittings, which means that members of the first test-group have to remain supervised until the second group is seated.
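The CMAPROG program itself is not listed in the source, so the following Python sketch is only an illustration of how a text-based, self-timing multiple-choice test of this kind might work: it presents each question with its numbered options, accepts a single keyed response, and writes one answer file per candidate. The question content, time limit, file names and answer-file layout are all assumptions made for the example.

    # Illustrative sketch only - not the in-house CMAPROG code.
    import time

    QUESTIONS = [
        # (question text, list of options) - made-up content
        ("Which of these minerals will scratch glass?",
         ["Calcite", "Quartz", "Gypsum", "Halite"]),
        ("Granite is an example of which rock type?",
         ["Sedimentary", "Metamorphic", "Igneous", "Extrusive"]),
    ]
    TIME_LIMIT_SECONDS = 60 * 60          # e.g. a 60-minute sitting

    student_no = input("Student number: ").strip()
    terminal_no = input("Terminal number: ").strip()

    answers = [0] * len(QUESTIONS)        # 0 = question not answered
    start = time.time()                   # timing begins when the question file is loaded

    for i, (text, options) in enumerate(QUESTIONS, start=1):
        if time.time() - start > TIME_LIMIT_SECONDS:
            break                         # self-timing: stop when the limit is reached
        print(f"\nQ{i}. {text}")
        for k, option in enumerate(options, start=1):
            print(f"  {k}. {option}")
        while True:
            key = input(f"Answer (1-{len(options)}): ").strip()
            if key.isdigit() and 1 <= int(key) <= len(options):
                answers[i - 1] = int(key)
                break

    # One answer file per candidate, identified by student and terminal number,
    # so that a separate program can mark the whole group afterwards.
    with open(f"answers_{student_no}_{terminal_no}.txt", "w") as f:
        f.write(" ".join(str(a) for a in answers) + "\n")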

 

Computer-marked tests in geography and geology modules at CGCHE: assessment timing (within a 15-week semester), weighting and duration

Module title                   | Assessment in week | Computer test weighting | Other test weighting | Test duration (minutes)
Earth Materials                | 6                  | 20%                     | 50%                  | 30
Earth Structure and Evolution  | 14-15              | 25%                     | 25%                  | 45
Environmental Data Handling    | 14-15              | 50%                     | -                    | 60
Global Environmental Issues    | 10                 | 25%                     | 25%                  | 60
Environmental Systems          | i) 6  ii) 10       | i) 35%  ii) 35%         | -                    | i) 60  ii) 60
It takes less than 5 seconds for a separate computer program to mark 120 answer files, although printing out results, including a basic question analysis, takes a few minutes. The question analysis lists

  1. the percentage of candidates who were successful on each question, and
  2. a table showing how often each distractor was chosen.
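The marking program is not reproduced in the source either; the sketch below, which assumes the one-file-per-candidate answer layout used in the earlier sketch and an illustrative answer key, shows how a batch of answer files could be marked and how the two parts of the question analysis listed above (facility and distractor counts) could be produced.

    # Illustrative marking and question-analysis sketch; the answer-file format
    # and the answer key are assumptions, not the department's actual software.
    from collections import Counter
    from pathlib import Path

    KEY = [2, 3]                               # correct option for each question (made up)

    answer_files = sorted(Path(".").glob("answers_*.txt"))
    scores = {}
    choices_per_question = [Counter() for _ in KEY]

    for path in answer_files:
        chosen = [int(x) for x in path.read_text().split()]
        scores[path.stem] = sum(c == k for c, k in zip(chosen, KEY))
        for q, choice in enumerate(chosen):
            choices_per_question[q][choice] += 1   # 0 counts as 'not answered'

    n = len(answer_files)
    for q, (correct, counts) in enumerate(zip(KEY, choices_per_question), start=1):
        facility = 100.0 * counts[correct] / n if n else 0.0
        print(f"Q{q}: facility {facility:.0f}%, responses {dict(counts)}")

    for candidate, score in sorted(scores.items()):
        print(candidate, score)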

Resource Implications

Physical resources: Sufficient terminals and specimen sets are required for every student taking the module, or for half the group if two sittings are planned. Running an exam in two sittings is fairly straightforward, but more than two becomes a logistical problem. Generally there will be no resource problem, as most institutions have computer rooms for class teaching, though sometimes the spacing of the machines means that only alternate places can be used in an exam. The specimens required are no more than would be used for a conventional exam in subjects such as geology. If investment in teaching computers has already occurred, then the only finance that might be required is to acquire a suitable software package (usually less than the cost of one computer).

Staff time: This can be broken down into three areas:

Student performance

Very high marks are possible, as are low scores, so the coefficient of variation is higher than for conventional written exams. In one extreme case the first computer-based test we constructed for 'Environmental Data Handling' produced a bimodal distribution: 40% or so of the candidates were on top of the work and formed a mode at over 60%, while the rest found the exam quite difficult and clumped about a mode at 40%; only 3 of nearly 60 examinees scored between 50% and 59%. This resulted from about 25% of the questions having a high difficulty but fewer than 10% being easy. This is where experienced question-setting or reviewing is vital, to ensure that there is a range of questions with respect to facility.
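For reference, the coefficient of variation is simply the standard deviation of the marks expressed as a fraction of the mean; the short sketch below computes it for a made-up, roughly bimodal set of marks of the kind described.

    # Coefficient of variation of a mark distribution (illustrative data only).
    from statistics import mean, stdev

    marks = [36, 38, 39, 40, 41, 42, 44, 37, 61, 63, 65, 67, 68, 70, 72, 75]
    cv = stdev(marks) / mean(marks)
    print(f"mean {mean(marks):.1f}%, coefficient of variation {cv:.2f}")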

Correlations in student performance between computer-based tests and other forms of assessment are quite high: correlation coefficients of between 0.55 and 0.70 were found in three recent comparisons between computer-based and written answers in geology. The question analysis highlights areas of difficulty both in the course content and in general processes such as the manipulation of units and the sequences of reasoning used in making inferences.
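A comparison of this kind is straightforward to reproduce; the sketch below computes Pearson's correlation coefficient for two made-up mark lists (the figures are illustrative and are not the department's data, which gave coefficients of 0.55 to 0.70).

    # Pearson correlation between computer-test and written-exam marks
    # (requires Python 3.10+ for statistics.correlation; data are made up).
    from statistics import correlation

    computer_marks = [72, 58, 40, 65, 81, 49, 55, 68]
    written_marks  = [68, 62, 45, 60, 75, 52, 58, 61]

    r = correlation(computer_marks, written_marks)
    print(f"Pearson r = {r:.2f}")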

Evaluation

The computer-based tests used at Cheltenham offer similar advantages and disadvantages to paper-based multiple-choice tests including:

Advantages:

Weaknesses:

Overall, the advantages make these techniques worth employing, but the disadvantages limit their application. Examiners have most to gain with large first-year modules, and the use of such tests in this department is restricted to level I. They do assess more than recall, and can be used to test numerical manipulation, practical observation and the application of sequences of reasoning, argument and synthesis. In this department we have experimented with open-ended computer-marked tests (Angseesing, 1989) but have found that this approach requires much more marking time, even after several runs of a question have built up a substantial answer bank.

Key advice

Reference

Angseesing, J. (1989) Open-ended computer-marked tests, Teaching Earth Sciences, 14, 17-19.

Keywords

Computer assisted assessment
Multiple-choice
Objective tests
Question analysis

This is one of the case studies from the Science Education Enhancement and Development (SEED) project's handbook Computer Based Assessment (Volume 2): Case studies in Science and Computing

