A Framework For Developing Online Tests
by Rosario Giráldez and Victoria Dieste
The advent of online and blended programs has brought about
changes for teachers and learners, and it has required adjustments in
curricula. One such adjustment involves the design and development of testing
frameworks that can fulfill new program needs. These frameworks must include
measurements that provide reliable information for assessment and evaluation
purposes.
Bachman and
Palmer (2010) contend that “In the real world of language assessment use, it is
becoming increasingly important, and in many cases mandatory, for test
developers and users to be accountable to stakeholders.” Being accountable
means being able to demonstrate that the intended use of a test is justified,
which means developing tests that reflect what students have learned and been
exposed to. This can be done by comparing the test specifications to the course
syllabus. It is also important to explicitly state that the intended use of the
test will be the one described in the specifications and to share this
information with students.
The update
of programs must be accompanied by an update in testing methods. Coombe and
Hubley (2013) state that “The term validity refers to the extent to which a
test measures what it says it measures. In other words, test what
you teach, how you teach it!” If testing doesn’t match teaching,
then the test isn’t valid—and the scores derived can’t provide reliable
information about meeting program outcomes. If new programs have online
teaching components, then the logical consequence is that testing frameworks
should include online components, and these new tests must be developed with a
careful focus on reliability and validity.
We started
out with the design of a framework to be used in our blended programs. Because
the components of our program were 50% in-person and 50% online, the design
included 50% in-person testing and 50% online testing. We later applied the
same framework to our fully online testing program.
Test Development Procedure
Designing an Online Test
Our programs
include courses at all levels, from beginning to advanced, and therefore, our
test design needed to cover all levels as well. We followed these
steps:
Step 1. Define the Test Construct
State the
knowledge and abilities that the test is to measure.
Step 2. Review the Inventory
Review the
inventory of course contents and materials that will guide the language to be
used in each of the levels.
Step 3. Write Test Specifications
These
provide the general instructions and details for creating the test blueprint.
Step 4. Define the Skills and Number the Tasks
As you
create the test blueprint, define the skills to be tested and suggest the
number of tasks to include to assess each skill.
Step 5. Weight the Tasks
Assign the
relative weight that each skill or area is going to have in the total test
score (a brief sketch of how these weights feed into the total score follows
Step 6).
Step 6. Design the Question Types
Bear in mind
the cornerstones of testing: the question types need to be ones that students
are familiar with, avoiding the inclusion of question types to which students
have not been exposed.
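By way of illustration only, the decisions made in Steps 4 and 5 can be
captured in a small data structure so that the weights are explicit and total
scores are computed consistently. The following Python sketch uses invented
skill names, task counts, and weights; it is not taken from our actual
specifications.

    # Hypothetical blueprint: skills, number of tasks, and relative weights.
    # All names and figures are invented for illustration.
    BLUEPRINT = {
        "reading":   {"tasks": 2, "weight": 0.25},
        "listening": {"tasks": 2, "weight": 0.25},
        "writing":   {"tasks": 1, "weight": 0.25},
        "grammar":   {"tasks": 3, "weight": 0.25},
    }

    def total_score(raw_scores, max_scores, blueprint=BLUEPRINT):
        """Combine per-skill raw scores into one weighted percentage."""
        # The weights must add up to 1.0 (100% of the test score).
        assert abs(sum(s["weight"] for s in blueprint.values()) - 1.0) < 1e-9
        total = 0.0
        for skill, spec in blueprint.items():
            total += (raw_scores[skill] / max_scores[skill]) * spec["weight"]
        return round(total * 100, 1)

    # Example: a learner's raw marks and the maximum marks per skill.
    print(total_score(
        raw_scores={"reading": 8, "listening": 7, "writing": 12, "grammar": 18},
        max_scores={"reading": 10, "listening": 10, "writing": 15, "grammar": 20},
    ))  # prints 80.0

Writing the blueprint down in this form also makes it easy to confirm that the
weights add up to 100% before the specifications are circulated.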
Guidelines for Creating Test Items
At the same
time, we developed the following guidelines for test developers to support the
process of creating objective test items, for example, multiple-choice or
true/false tasks:
-
Item formats are
correctly matched to the purpose and content of each item. When
writing test items, we need to bear in mind the objective of each item and
decide if the ideas included accurately match the format in question.
-
The items are
written at the intended level of students’ proficiency. It is
imperative to review the level of the test input to ensure that it is
appropriate and fair for the intended audience.
-
All parts of an
item are visible at the same time. The task to be completed must be
displayed on a single page or screen so that the test taker can focus on content
and not on scrolling up and down or changing windows.
-
There is only one
correct answer for each problem. Having more than one correct answer
may cause ambiguity and thus affect test fairness (a simple automated check
for this and the following guideline is sketched after this list).
-
Negatives and
double negatives have been avoided. Writing a negative statement as the
correct answer can mislead test takers, so negative wording is best avoided in
both stems and options.
-
Rubrics provide
clear guidance for test correction. They show the relative weight of
each test task and each item within the task. Rubrics are shared with students
so they have a clear idea of how the test is to be graded.
-
Race, gender, and
nationality bias have been avoided. It is of the utmost importance
to review items to detect possible sources of bias that could undermine test
impartiality.
-
At least one
other colleague has proofread the items. It is essential to have
reviewers who check all test items to ensure test validity and fairness.
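Several of these guidelines lend themselves to a simple automated check before
colleagues proofread the items. The following Python sketch assumes a
hypothetical item structure of our own invention (it is not tied to any
particular platform) and flags items that do not have exactly one key or whose
stem contains negative wording.

    # Hypothetical item bank: each item has a stem and its options,
    # with exactly one option expected to be marked as the key (True).
    ITEMS = [
        {"id": 1, "stem": "Choose the correct form of the verb.",
         "options": [("goes", True), ("go", False), ("going", False)]},
        {"id": 2, "stem": "Which sentence is NOT correct?",
         "options": [("A", True), ("B", True), ("C", False)]},
    ]

    NEGATIVE_WORDS = {"not", "never", "except"}  # illustrative list only

    def review_items(items):
        """Return warnings for items that break basic item-writing guidelines."""
        warnings = []
        for item in items:
            keys = [text for text, is_key in item["options"] if is_key]
            if len(keys) != 1:
                warnings.append(f"Item {item['id']}: expected 1 key, found {len(keys)}.")
            stem_words = {w.strip(".,?!").lower() for w in item["stem"].split()}
            if stem_words & NEGATIVE_WORDS:
                warnings.append(f"Item {item['id']}: stem uses a negative; consider rewording.")
        return warnings

    for warning in review_items(ITEMS):
        print(warning)

Checks like these complement, but do not replace, proofreading by colleagues.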
Sample Tools
There are
many tools that can be used to develop online tests, and it should be up to the
institution or the teacher to decide which is best for their context. Learning
management systems could be considered among the most powerful and complete
tools.
Moodle
In the case
of our adult courses, we opted for Moodle because it provided us with an array
of possibilities that proved highly effective for our context. Our students are
accustomed to using Moodle as a learning environment, and using the same
platform for both teaching and testing supports the validity of our tests.
Additionally, Moodle presents several advantages over other learning management
systems: it is open source and offers a comprehensive, responsive interface
that adapts to different devices.
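As one possible workflow, Moodle can import question banks written in its
plain-text GIFT format. The short Python sketch below assembles a
multiple-choice item into a GIFT file ready for import; the question content
is invented for illustration, and special GIFT characters (such as ~, =, and
braces) would need escaping in real items.

    # Minimal sketch: write multiple-choice items in Moodle's GIFT import format.
    # The item content is invented for illustration.
    items = [
        {
            "title": "PresentSimple01",
            "stem": "She ____ to school every day.",
            "key": "goes",
            "distractors": ["go", "going", "gone"],
        },
    ]

    def to_gift(item):
        """Render one multiple-choice item as a GIFT-format block."""
        options = [f"={item['key']}"] + [f"~{d}" for d in item["distractors"]]
        return f"::{item['title']}:: {item['stem']} {{\n  " + "\n  ".join(options) + "\n}"

    with open("quiz_import.gift", "w", encoding="utf-8") as f:
        f.write("\n\n".join(to_gift(i) for i in items))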
Google Forms
For our
teenage courses, we went in a different direction. Our teenagers were familiar
with publishers’ platforms for online work, but they did not use Moodle. Because
these platforms did not offer us a testing solution, we resorted to Google
Forms to provide students with a user-friendly tool that was easily
accessible to all of them. The Google Forms feature that converts forms into
quizzes also offers valuable data analysis for teachers. Our students had
already used a few Google Forms, and we made sure to train them in the tool
before they faced their first online test in this format. Google Forms offered us an
efficient, easy-to-implement solution to test our teenage students.
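Because quiz responses can be exported to a spreadsheet, teachers can go beyond
the built-in summaries. As a rough sketch only, assuming a hypothetical CSV
export in which each item column records 1 for a correct response and 0
otherwise (something easily produced with spreadsheet formulas), the facility
value of each item, that is, the proportion of test takers who answered it
correctly, could be estimated as follows:

    import csv

    # Rough sketch: compute per-item facility values (proportion correct)
    # from a hypothetical CSV in which each "Item ..." column holds 1 or 0.
    # The file name and column names are assumptions for illustration.
    def facility_values(path):
        with open(path, newline="", encoding="utf-8") as f:
            rows = list(csv.DictReader(f))
        if not rows:
            return {}
        item_columns = [c for c in rows[0] if c.lower().startswith("item")]
        return {
            col: round(sum(int(r[col]) for r in rows) / len(rows), 2)
            for col in item_columns
        }

    # Items answered correctly by almost everyone or almost no one are
    # worth revisiting before the next administration of the test.
    print(facility_values("responses.csv"))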
In all
cases, it is advisable to provide students with a mock test to run a system
requirements check. Even if students are familiar with the testing tool to be
used, this check provides an opportunity to familiarize them with the test
mechanics and navigation, thus supporting the validity of the results.
Conclusion
It is
undeniable that online teaching and testing are here to stay and will remain
part of education for years to come. It is also true that online testing presents
advantages for teachers and learners. We must capitalize on these advantages as
they can provide an excellent way to help make a smooth transition from
paper-and-pencil to online testing. Consequently, we need to develop frameworks
for online testing so that teachers can have appropriate evaluation tools,
ensuring that learners are fairly tested.
References
Bachman, L.,
& Palmer, A. (2010). Language assessment in practice. Oxford University Press.
Coombe, C.,
& Hubley, N. (2013). Fundamentals of language assessment. Cultural Affairs Office, U.S. Embassy.
Rosario Giráldez
is the academic director at the Alianza Cultural Uruguay-Estados Unidos, where
she has also coordinated Teacher Education Programs, Alianza Centers, and
English Programs in Schools. She is a frequent presenter at national and
international events. Her main areas of interest are evaluation and curriculum
design. She holds a TEFL degree from the Alianza and has taken courses in her
main areas of interest at Iowa State University, Indiana University, and Hawaii
Pacific University.
Victoria
Dieste is an EFL teacher who has been
working at the Alianza for the past 15 years and is currently the associate
academic director. She has presented at various academic events in Uruguay,
Brazil, Paraguay, and the United States. Her most recent experiences abroad were
as a teacher-in-residence in Minnesota in 2016 and as a presenter at TESOL 2019. She
completed the TESOL ELT Leadership Management Certificate Program in 2020.