Standardising Assessment using the CEFR in Vietnam

In order to help the Vietnamese Ministry of Education meet its ambitious educational targets for its 2020 Strategy, the British Council has recently designed and delivered a number of training programs for educators across the region. One of these programs is a course on Assessing with the Common European Framework of Reference (CEFR). The 25-hour course was delivered at The University of Labour and Social Affairs in Ho Chi Minh City over a two-week period. The purpose of the course was to give test designers a thorough understanding of the CEFR levels and to help them design objective assessments which measure language ability in the four skills (reading, writing, listening and speaking).


The CEFR is rapidly becoming the global standard for English language assessment for a number of reasons. Opinions of language ability are largely subjective, and labels such as “intermediate” or “advanced” are not standardised and do little to demonstrate a person’s ability to use language in a particular situation. Furthermore, a learner’s level is often higher in some skills than in others.

The CEFR allows an objective, standardised indication of language level in each skill. These levels are based on a student’s functional, communicative ability and are used by teachers, learners, governments and employers around the world. By understanding these levels thoroughly through the “Can-Do” statements, educators can develop reliable, valid and practical assessments that accurately and objectively measure a student’s ability.


Course participants worked through a number of theoretical and practical exercises to prepare for the task of designing their own assessments. The first section of the course involved studying the CEFR thoroughly and understanding what a student should be able to do in each of the skills at each of the six levels. The participants then explored assessing the receptive skills (listening and reading) and evaluated a number of questions in terms of level, variety and practicality. From this, they created their own assessment questions for their colleagues to attempt.

They then moved on to the productive skills (speaking and writing) and explored the difficulties of establishing reliable, valid criteria that examiners can apply objectively. Participants developed their own assessment criteria based on the CEFR levels. This proved both challenging and rewarding, as participants had some trouble agreeing on the best way to design a useful rubric that other examiners could interpret easily. Finally, they compared their criteria with those used in other exams such as FCE, CAE and IELTS.


Participants were very satisfied with the course and enjoyed the process of working from their own experience rather than simply emulating other similar exams. They were also given a number of useful take-away items, including examples of questions and a collection of marking criteria sheets for the productive skills. In addition, participants developed a set of points which can be used as a checklist when designing assessments:

  1. Pitch the tasks and test questions at the correct CEFR level.
  2. Have a range of question types (multiple choice, short answer, gap-fill, etc.) for receptive skills.
  3. Have a range of task types (concrete to abstract and easy to challenging) for productive skills.
  4. Grade the questions from easy (early in the test) to difficult (later in the test).
  5. Give clear instructions with examples for each task type.
  6. Check spelling, punctuation and grammar carefully.
  7. Do a test run with colleagues first.
  8. Check internal and external validity.

Lecturers at the University of Labour and Social Affairs will soon be developing their own in-house assessments to measure the progress of their students. We look forward to the results of their hard work and wish them and their students success.