Published: 23rd March, 2015. Last edited: 23rd March, 2015.
Introduction
For the longest time, the idea of testing language has revolved around testing knowledge of the language itself, but the idea of testing for communicative competence is now becoming more and more popular. In testing communicative competence, speaking and listening tasks are commonly used; these include tasks such as the completion of an information gap and role play (Kitao & Kitao, 1996).
As teachers of ESL, it is imperative for us to enhance students' delivery skills, increase their confidence, and develop their methods of organization and their critical thinking skills. In order to do this, a valid and reliable method of assessment is required to determine whether the set goals were met. The oral communication field needs a clear-cut method of evaluation such as can be found in discrete language skill classes like listening comprehension (Nakamura & Valens, 2001). Language teachers and language testers need a method that takes subjective qualitative observations and transforms them into objective quantitative measures.
A critical issue in assessment is the selection of criteria for evaluating performance. Stiggins (as cited in Butler & Stevens, 1997) points out that the selection of these criteria should be one of the first steps in designing performance assessments. Students should understand ahead of time what is expected of them and on what basis their performance will be judged. When students are actively involved in establishing assessment criteria for tasks, they not only gain a better understanding of what is expected of them when they perform the tasks, but are also able to more fully appreciate why the criteria are important (Butler & Stevens, 1997).

The Issue of Assessing Speaking Skills
Speaking is probably one of the most difficult skills to test. It combines skills that may have little or no correlation with each other and that do not lend themselves well to objective testing. Kitao and Kitao (1996) note that there are not yet good answers to questions about the criteria for testing these skills or the weighting of these factors.
It is possible to find people who can produce the different sounds of a foreign language appropriately yet lack the ability to communicate their ideas correctly; this is one of the difficulties testers encounter when testing learners' oral production. The opposite situation occurs as well: some people can express their ideas clearly but cannot pronounce all the sounds correctly.
Another difficulty is the actual implementation of speaking tests: it is hard to test a large number of learners in a relatively short time, which puts the examiner under great pressure (Heaton, 1988).
The next difficulty is that speaking and listening skills are closely related to one another; it is impossible to keep them mutually exclusive. In most cases there is an interchange between listening and speaking, and speaking appropriately depends on comprehending spoken input. This has an impact on testing speaking because testers cannot know whether they are testing speaking alone or speaking and listening together.
Finally, the assessment and scoring of speaking is one of its biggest problems. Where possible, it is better to record examinees' performance and score it later upon listening to the tape. The aspects of speaking considered in assessment include grammar, pronunciation, fluency, content, organization, and vocabulary (Kitao & Kitao, 1996).
Depending on the situation and the purpose of the test, testers need to choose appropriate methods and techniques of testing.

The Solution: Methods of Assessing Speaking Skills

3.1. Monologue, Dialogue and Multilogue Speaking Tests
Nakamura and Valens (2001) conducted a study on Japanese graduate students at Keio University, using three different types of speaking tests as a form of assessment. The first type is the Monologue Speaking Test, also called the presentation. Students were asked to perform tasks such as show and tell, where they talk about anything they choose; this gives the students a chance to make a mini presentation. The second type is the Dialogue Speaking Test, also known as the interview. It is an open-ended test in which students lead a discussion with the teacher and are required to use conversation skills they have learned before. The third type is the Multilogue Speaking Test, also called discussion and debating. Here the discussions are student-generated: students are put into groups and, as a group, decide on a topic they feel would interest the rest of the class.
The evaluation criteria used in that study were as follows:
Ability to explain an idea
Discussing and debating:
Able to be part of the conversation to help it flow naturally
Uses fillers/ additional questions to include others in conversation
Transfers skills used in dialogues to group discussions
The rating scale ranged from poor to good, scored from 1 to 4.
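A simple rubric like this can be sketched as a small scoring routine. The sketch below is illustrative only: the criteria paraphrase the discussion-test descriptors listed above, while the intermediate scale labels and the equal weighting of criteria are assumptions, not details from the study.

```python
# Illustrative 1-4 rubric for a multilogue (discussion) speaking test.
# Criterion wording paraphrases Nakamura & Valens (2001); the labels for
# bands 2 and 3 and the equal weighting are assumed for this sketch.

SCALE = {1: "poor", 2: "fair", 3: "adequate", 4: "good"}

CRITERIA = [
    "takes part so the conversation flows naturally",
    "uses fillers/questions to include others",
    "transfers dialogue skills to group discussion",
]

def score_student(ratings):
    """Average the 1-4 ratings given for each criterion."""
    if len(ratings) != len(CRITERIA):
        raise ValueError("one rating per criterion is required")
    if any(r not in SCALE for r in ratings):
        raise ValueError("ratings must be integers from 1 to 4")
    return sum(ratings) / len(ratings)

print(score_student([3, 4, 2]))  # -> 3.0
```

Averaging (rather than summing) keeps the final score on the same 1-4 scale as the individual ratings, which makes it easier to report back to students in the rubric's own terms.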
The findings of their study reveal that, among the three test types, the discussion test was the most difficult, followed by the interview test and then the presentation test.
In Malaysia, a similar system has been implemented: dialogues are used in school-based assessment, while monologues and multilogues are common in both school-based assessment and the MUET speaking test. Although the system follows this model, it fails to accurately gauge students' speaking ability because the tests are poorly regulated (prevalent in school-based assessment) and too restrictive (MUET).

3.2. Testing Speaking Using Visual Material
It is possible to test speaking using visuals such as pictures, diagrams, and maps, without requiring examinees to comprehend spoken or written material. Through careful selection of material, testers can control the use of vocabulary and grammatical structures as required. Visual materials range in difficulty to suit all levels of learners. One common stimulus is a series of pictures showing a story, which the student must describe by putting together a coherent narrative. Another approach is to give a group of students the pictures of the story in random order; the students decide on the sequence without showing the pictures to each other, then lay them down in the order they have agreed on, with the opportunity to reorder them if they feel it necessary. In the Malaysian context, this system is already in use in the school-based oral assessment for primary school.
Another use of visual stimulus is to give two students similar pictures with slight differences between them; without seeing each other's pictures, they describe their own in order to figure out the differences. There is, however, a problem with using visual stimulus in testing speaking: the materials chosen must be something all students can interpret equally well, since if one student has difficulty understanding the visual information, it will influence the way he or she is evaluated (Kitao & Kitao, 1996).

3.3. The Taped Oral Proficiency Test
In this approach, students' performances are recorded on tape and assessed later by the examiner. The method has both advantages and disadvantages. According to Cartier (1980), one disadvantage of the taped test is that it is less personal: the examinee is talking to a machine, not a person. Another is its low validity. Moreover, the taped test is inflexible; if something goes wrong during the recording, it is virtually impossible to adjust for it. On the other hand, the test can be given to a group of students in a language lab, it is more standardized and more objective since each student receives identical stimuli, and scoring can be performed at the most convenient or economical time and location.
I believe the taped test method is very practical for testing large numbers of students, where the teacher would not have enough time to assess each one individually. The problem, however, is that some schools do not have enough language labs, which in turn creates great difficulty for teachers.

Conclusion
Previous research on classroom testing of ESL speech skills provides several models of both task types and rubrics for rating, and suggestions regarding procedures for testing speaking with large numbers of learners. However, there is no clear, widely disseminated consensus in the profession on the appropriate paradigm to guide the testing and rating of learner performance in a new language, either from second language acquisition research or from the best practices of successful teachers. While there is similarity of descriptors from one rubric to another in professional publications, these statements are at best subjective. Thus, the rating of learners' performance rests heavily on individual instructors' interpretations of those descriptors (Pino, 1998).
In spite of the difficulties inherent in testing speaking, a speaking test can be a source of beneficial backwash: unless speaking is tested at a very low level, such as reading aloud, testing it encourages the teaching of speaking in classes.
In my opinion, testing speaking skills can be a very interesting experience, and it gives teachers an opportunity to be creative in selecting test items and materials. Moreover, it has a great impact on students, who will enjoy taking the test and feel comfortable doing so if the teacher chooses materials that interest them and that are suitable to their age and level of knowledge.

References
Butler, F. A. & Stevens, R. (1997). Oral language assessment in the classroom. Theory Into Practice, 36(4), 214-219.
Cartier, F. A. (1980). Alternative methods of oral proficiency assessment. In J. R. Firth (Ed.), Measuring spoken language proficiency (pp. 7-14). GA: Georgetown University.
Heaton, J. B. (1988). Writing English language tests. Longman.
Kitao, S. K. & Kitao, K. (1996). Testing speaking (Report No.TM025215). (ERIC Document Reproduction Service No. ED398261)
Kitao, S. K. & Kitao, K. (1996). Testing communicative competence (Report No. TM025214). (ERIC Document Reproduction Service No. ED398260)
Nakamura, Y. & Valens, M. (2001). Teaching and testing oral communication skills. Journal of Humanities and Natural Sciences, 3, 43-53.
Pino, B. G. (1998). Prochievement testing of speaking: matching instructor expectations, learner proficiency level, and task types. Texas Papers in Foreign Language Education, 3(3), 119-133.
How to Test ESL Writing
By: Miranda Morley
Regardless of what you teach, assessing an ESL student's paper can be particularly challenging. Should you have the same expectations for ESL students' papers as you do for native speakers' writing? Should you give separate grades for content and grammar? These are just a few of the questions you should ask yourself when grading an ESL paper. The answers to such questions will depend at least partially on individual circumstances, such as the nature of the course. However, following some general guidelines can help you effectively test ESL writing.
Read for content. Unless you are grading an exam that is meant to assess only usage and punctuation, content always trumps grammar. Determine whether or not the student is attempting to answer the prompt or fulfill the parameters of the assignment. Question the paper's depth, development, and unique insight. If allowing for revision, make comments that clearly explain how the student could improve her content.
Note usage errors. Mark grammar, punctuation and sentence structure problems in such a way that the student is required to look up and understand them; don't simply correct the essay. Use a circle, check mark, or highlighting/underlining system. Make a summary note to remind yourself and the student of the largest mechanical problems in his text, and note whether these mechanical problems interfere with the paper's meaning. According to Dana Ferris and John Hedgcock's book on ESL composition, keeping notes on your students' writing, attendance and participation is an important part of grading.
Read through all the papers written by a particular class. Before assigning a grade, read through a stack of papers with a pen and notebook beside you. Make notes of problems that seem to appear often. If you're seeing the same error over and over again, it may be a result of too little time spent on the issue, rather than a student's lack of understanding. Be sure to take these issues into account when you assign the final grade and plan your next lesson.
Use a rubric. Design a rubric that clearly lists important grammatical and content areas. For each paper, assess a student's performance in each content area. Choosing four categories that range from excellent to unacceptable can make it easy for the student to understand skills that she has mastered and must develop. This also allows you to easily justify a grade. Additionally, by presenting the rubric to your ESL students before you grade the paper, you clearly communicate what you are expecting from them.
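As an illustration of the rubric just described, the layout can be thought of as a small scoring table: a handful of content areas, each rated on four bands. The band names, content areas, and conversion to a percentage grade below are assumptions for the sketch, not a published standard.

```python
# Illustrative four-band ESL writing rubric. Band names and content
# areas are assumed examples, not taken from any published scale.
BANDS = ["unacceptable", "developing", "good", "excellent"]  # indexed 0-3

AREAS = ["content and development", "organization",
         "grammar and usage", "vocabulary"]

def grade(scores, max_grade=100):
    """Convert one band index (0-3) per area into a percentage grade."""
    if len(scores) != len(AREAS):
        raise ValueError("one band per area is required")
    if any(s not in range(len(BANDS)) for s in scores):
        raise ValueError("band indices run from 0 (unacceptable) to 3 (excellent)")
    top = (len(BANDS) - 1) * len(AREAS)  # best possible raw score
    return max_grade * sum(scores) / top

print(round(grade([3, 2, 2, 3]), 1))  # -> 83.3
```

Because every area maps to a named band, the same table that produces the grade also tells the student which skills are mastered and which still need work, which is the justification-and-communication benefit described above.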
Manage your time. According to "Teaching ESL Composition," it can take as long as eight hours to grade a stack of 500-word essays. The book recommends that you schedule enough of your time to respond thoroughly, building in time for any possible delays.
If you're not an ESL teacher, but are working with ESL students, take advantage of the ESL resources in your school. Instructors who are practiced at testing this unique kind of writing can give you pointers to help you assess more accurately and in less time.
Make sure you take some time to sit down with the ESL students in your class during the semester. This can make it easier for them to do well on essay exams, and it saves you time when it comes to grading.
At the beginning of the spring/summer semester at my high school I found out that I was going to be allowed to give speaking tests for my 12 ‘second’ grade classes. Last semester I dropped comments in my co-teachers’ ears every once in a while about how student motivation and classroom behavior are heavily influenced by whether or not test points are assigned to the lesson content they are learning, and apparently during a pre-semester meeting it was decided that I could have 10% of the English grade. For my 10 ‘first’ grade classes, though, I was told at first that there weren’t any test points I could get assigned to my classes; then later, about six weeks in, I was told I could write 3 of the 33 questions on the English section of the mid-term and final exams. This is just one of many examples of how hard it is for native English teachers to design a semester syllabus and choose the curriculum, and of how testing points are all too often not assigned to their classes and/or they’re told about the testing points weeks after they have already prepared and designed their lessons. But I digress, and should get back to writing about the process I went through designing my speaking tests.
I have a lot of experience designing speaking tests and administering them to different kinds of EFL language learners (from middle school and high school students to pre-service student teachers and in-service Korean English teachers). But I decided to do some research and re-read materials in my EFL/ESL library (see the list of relevant books at the bottom of this post) because I hadn’t looked at them in a long time. While doing my research and writing up my speaking test design, I thought to myself: what do you do when you search for “EFL/ESL speaking test +Korea +public school +high school” and your own writing is the only relevant thing you find? Has NOBODY who teaches high school in Korea designed speaking tests and then written about it online? Wow.
Actually, there are bloggers who have written about speaking tests in Korean public high schools, but they are a minority. Also, due to the nature of blogging as an informal genre, most of them haven’t really gone into much detail about their test design process, why they chose the test format they did, and other details I would really have liked to read about from other native teachers doing speaking tests in Korean public schools.
One teacher I did find, and posted about, wrote a series on the Supplanter blog which I found pretty interesting, and which reinforced my decision to record all the speaking tests with my mp3 player (something I usually do anyway: Korean university students are notorious for trying to get their test scores raised if they don’t like them, so when they come to ask for an increase I suggest we review their recordings and look at my notes for their test. This usually dissuades most of the complainers, lol).
Finally, I came across something related to my search parameters: Evaluation of The Foreign Language High School Programme in South Korea by Yvette Denise Murdoch, a master’s dissertation submitted in 2002 to the School of Humanities, University of Birmingham, to fulfill the requirements of the Master of Arts in Teaching English as a Foreign or Second Language. Unfortunately, while it’s an interesting read, Murdoch doesn’t really provide much about how she tested or the process she went through while designing her tests. That being said, it’s still a good read.
Anyway, I decided to give myself a research and writing project to kill time when I had no classes at school. I loosely based my writing goals on Chapter Six, “Developing Test Specifications,” of Assessing Speaking by Sari Luoma (Cambridge Language Assessment Series, Cambridge University Press, 2004).
Here is a list of things a teacher should be considering (at least some of them, anyway) when designing a language test:
“the test’s purpose; description of the examinees; test level; definition of construct (theoretical framework for the test); description of suitable language course or textbook; number of sections/papers; time for each section of paper; target language situation; text-types; text length; language skills to be tested; test tasks; test methods; rubrics; criteria for marking; descriptions of typical performances at each level; description of what candidates at each level can do in the real world; sample papers; samples of students’ performances on task” from “Assessing Speaking,” Chapter 6, page 114.
The problem with the logistics (I’m going to use this word a lot) of designing and giving speaking tests in Korean public school native-speaker English classes is that there are so many unforeseeable, unplannable, and unbelievable (from a native teacher’s perspective, anyway) issues and challenges that come up throughout the whole process that doing a truly professional EFL/ESL speaking test is nearly impossible, in my opinion. But I’ll get into that in more detail in part 2 of this post.
I also found Chapter Eight, “Ensuring a reliable and valid speaking assessment,” to be extremely helpful in refreshing what I needed to be thinking about as I designed the speaking tests for the high school boys.
While reading Chapter 6 I came across three examples of how to do test specification write-ups: Example 1, an end-of-course classroom test; Example 2, a language test at university entrance; and Example 3, a general purpose proficiency test. After reading this chapter I decided to do my own test specifications write-up, although I was unable to follow the models exactly due to the realities of planning lessons and tests in Korean public schools.
Alright, that’s enough about why I decided to write this blog post. Time to wade into the nitty-gritty of what I did while going through the process of making speaking tests for a Korean public high school.
Pre-semester language learner assessment. There were no opportunities for me to assess the actual language levels of the students in each class before the semester began. The only thing available was the students’ test scores from the previous semester, which in terms of communicative ability and fluency really had no validity or relevance. The only use I found for the scores I asked my Korean English co-teachers to show me was seeing which classes might have a majority of low level students, or average to higher level students, so that I could alter my teaching methods accordingly (or ‘differentiate’ them).
Test #1 format (of 4 over the course of the school year, 2 in the spring semester, 2 in the fall/winter semester): one-on-one interview, teacher and student.
Test #2: one-on-one interview, teacher and student.
Test #3: Unfortunately I won’t be teaching it, as my contract finishes August 24th, 2010. I am, however, leaving all testing and lesson materials from the book I was using for the next native teacher, and I hope they will continue to teach from the same book. My original plan for the four tests was that in speaking tests 3 and 4 the focus would shift from accuracy with a low degree of fluency to a higher focus on fluency, in balance with the test point values for accuracy. The book I was using focuses on developing fluency and on learning, practicing, and mastering speaking strategies, so it will be interesting to hear from the new native teacher how the students progress through the fall/winter semester.
Test #4: fluency and accuracy have equal values on the rubric.
Class hours before Test #1. two fifty-minute classes.
NOTE. The logistical realities of teaching EFL speaking and conversation in a Korean public high school often necessitate the instructor exhibiting a degree of “flexibility” when it comes to following EFL methodology the way it “should” be practiced versus adapting to the chaotic and extremely unstable school schedule and teaching/learning conditions. I scheduled the first speaking test after only 2 weeks of instruction for several reasons: 1) My classes were not assigned time slots during the school’s official midterm and final exam periods, which forced me to schedule testing during regular classes. 2) The students do not fully understand (perhaps not at all) how they will be tested (mine will be the first speaking test ever done at the school in its entire history), and this diminishes their ability to develop effective learning styles and habits specific to my classes (I made a “How to” study guide for speaking tests handout (see the bottom of this post) and gave tips and strategies during my classes). 3) I fully expect motivation and attention levels to spike dramatically after Test #1, as students will then have a much clearer idea of what to expect based on first-hand experience of a speaking test with a native speaker/teacher in a public school setting.
Test #1 focus. pronunciation, intonation, grammar, and demonstrating/performing cultural rules for speaking and interactions during the test (for example, how to shake hands)
Test Duration. 2 minutes
Type of school. 2nd grade classes at an all-boys trade/sports school transitioning into an academic high school, Seoul. The 2nd grade students were admitted under the trade school’s standards for acceptance, so the overall English abilities are lower.
Class size. 30-40 multi-level high school boys
Language learners. mixed levels, on average each class has 25% false-beginner, 50% low-intermediate to intermediate, and 25% high-intermediate to advanced levels of English language ability.
Test location. The native English teacher’s classroom. No other students are permitted in the room, and no Korean English co-teachers either (their presence would inhibit student speaking performance and test conditions). All other students wait in their homeroom and, in groups of 5, come to the hallway outside, line up, and wait for their turn.
General Conditions: each class of 30-40 boys will be divided into two groups, A and B. Boys draw a lottery that places them in one of the two groups and also determines the order of testing. This is to avoid the ‘not fair’ criticism that is a very big concern for Korean students in testing situations (whether or not what they’re saying has anything to do with ‘fairness’).
Role of Korean English co-teacher during testing. KET will be responsible for organizing the boys into testing order, and keeping them quiet as the tests are in progress.
Absent/sick students. More than likely a make-up test will have to be scheduled during a lunch period as soon as possible.
NOTE. During testing I realized that my co-teachers were not taking attendance on the day of the test (they tend not to during regular classes too, sigh), were not marking down who was absent/sick, and had not checked the testing order lists we had made against the class attendance list to make sure every student was on a testing list. This caused some problems later; hopefully other native teachers can avoid it by explicitly asking their co-teachers to check these rather important details (although, even when explicitly asked, the unfortunate truth is that the task is often not done, or done poorly).
Test Writing/Editing: due to issues of motivation and a lack of interest in co-producing lesson plans, the native English teacher designed and wrote the English test text and rubric. A younger Korean English teacher I wasn’t actually co-teaching with ended up helping me edit and proof the test questions and make sure there were no problems.
Test Planning/Meetings with Co-teachers. Getting the co-operation and willing participation of co-teachers was extremely challenging and often met with failure in the weeks leading up to speaking test #1. For speaking test #2, I decided not to go through the ridiculous stress and passive-aggressive/apathetic participation I was exposed to during prep for test #1, because my co-teachers constantly reassured me that they knew what their testing roles and tasks were (they didn’t; they had forgotten). Test #2 then had several administrative problems due to the lack of a pre-test meeting in which, as I did before test #1, I would have gone over and reviewed what each of us needed to do and how to do it, and tried to make sure my co-teachers understood the very few testing tasks I had asked them to take care of. I will write more about this in part 2.
Test Review Class. A test review class will be scheduled in the week before each testing period so that students can learn and practice test procedures. Since this is the first time they will participate in a public school speaking test, familiarizing themselves with the procedure will help reduce their stress and anxiety. This is also vital for the co-teachers: they have never participated in or overseen speaking tests, and they need the practice time too.
1 week before the test. A list of questions based on the English class lessons will be provided to students. The reason is that the classes are multi-level while the speaking book lessons are intermediate level; to give the lower level students a fighting chance to do well on the test, it is vital to provide them with the general test content. If they study and practice hard, they then have a real possibility of achieving a good test score.
Primary focus of KETs in test design. Producing ranked results for evaluations; in general, they had very little to no interest in accurately and fairly assessing student speaking abilities.
Primary focus of NET. Assessing student speaking abilities accurately and in a fair and professional manner.
Role of KETs in test and curriculum design. 0%.
Role of NET in test and curriculum design. 100%
Reasons for choosing a teacher-student interview test over other types of speaking tests:
1) Large multi-level classes preclude being able to pair up students. Putting a low level student with an advanced student is a recipe for disaster.
2) Low level students need a testing situation where they can be prompted if need be, and a friendly, non-threatening partner for the speaking interview. Pairing up students who are not friends or part of the same social peer group within a class (there are multiple groups) creates the potential for peer-to-peer classroom dynamics to sabotage the testing process and its validity (i.e. ‘I’m gonna kick your ass after school if you screw up my test score’).
3) The partner selection process and ‘fairness.’ If students perceive the process to be “unfair”, regardless of whether or not they’re right, this can dramatically impact teacher-class relationships, etc.
4) Small groups. see #1.
I mentioned that some other instructors I know suggested doing pair or small group speaking tests, but my reaction is the same as what I found online while researching what other instructors have done for speaking tests. The instructor in A Case Study: One Speaking Test Format says, “But a good conversation depends on both students doing their part, so you run the risk of one person’s grade being affected by the other’s performance.”
The relationship between test format choice and language learner level is critical. In the particular situation of large multi-level classes in Korean public schools, the range of choices is severely limited by both consideration for the students’ abilities AND the logistical nightmare of organizing and scheduling the test dates and times.
I came across two articles from the Internet TESL Journal. The first is Using Pair Work Exams for Testing in ESL/EFL Conversation Classes by Ian Moodie, Daegu Hanny University, Gyeongsan, South Korea:
“In utilizing one-on-one interview examinations obviously the instructor can get a sense of the oral communicative competence of students and overcome this weakness of written exams. However, there are other disadvantages to this approach. First of all for the instructor, time management can be an issue. For example, assuming a two hour period for exams, a class of 20 students would mean each student only has six minutes of time for testing. This includes the time needed to enter the room/office and adjust to the setting. With such a time constraint it becomes doubtful that the student and instructor can have any kind of normal real-world conversation. Also, considering the weight of the exam (assuming that it is between 20-40% of the final score), it is not a lot of time to elicit and test for speaking ability or listening comprehension. Six minutes for 30 or 40 percent of the student’s grade puts a lot of pressure on the students to perform in a very limited amount of time. The fact that it is a direct conversation with the instructor, who will dole out the final grade, would also make it more stressful for the students. As for the instructors, it can be taxing to both have a conversation with a student and evaluate it simultaneously” (my bold, my italics).
The logistical realities (or perhaps ‘nightmare’ might be more appropriate) of my English classes and testing pretty much forced me to ignore certain EFL testing standards and methods I would have preferred to follow. The testing time was only 2 minutes per student, the class sizes were 30-40, and because the school was not integrating my testing into the official mid-term and final exam days, I had to schedule TWO WEEKS (one class per week) of class time in order to complete all the 2-minute tests (and that was just for test 1). If I’d reduced the testing to one week, each test would have lasted something like 45 seconds MAYBE…
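The back-of-the-envelope arithmetic here and in Moodie's quote above can be sketched out. This is purely illustrative; the 45-minute period length and the function names are my own assumptions, not anything from the school's actual schedule:

```python
# Rough sketch of the speaking-test scheduling arithmetic for a large class.
# All numbers are illustrative assumptions, not official figures.

def minutes_per_student(period_minutes, class_size, overhead_minutes=0.0):
    """Time available per student once any per-student overhead
    (entering the room, settling in) is subtracted."""
    return period_minutes / class_size - overhead_minutes

def periods_needed(class_size, test_minutes, period_minutes=45):
    """How many class periods it takes to test everyone,
    at a fixed test length per student."""
    students_per_period = period_minutes // test_minutes
    return -(-class_size // students_per_period)  # ceiling division

# The scenario Moodie describes: a 2-hour exam block split among 20 students.
print(minutes_per_student(120, 20))  # 6.0 minutes each

# My situation: 40 students at 2 minutes each, 45-minute periods.
print(periods_needed(40, 2))  # 2 periods, i.e. two weeks at one class per week
```

The ceiling division matters: even a handful of leftover students forces a whole extra class period, which is exactly how a 2-minute test balloons into two weeks of class time.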
Pair Work Conversations
“One way to improve upon one-on-one testing is the utilization of pair work activities as part of or all of the exam itself. This type of activity frees up the cognitive resources of instructors in order to pay closer attention to the production of each student than if they were participants themselves. Students have a longer time to interact, instructors have longer to evaluate and comment on each student’s performance. In the case of the instructor following Communicative Language Teaching methods, where pair work may take up a significant portion of a class, it would be appropriate to incorporate similar activities in the exam. That way the exam itself is much better integrated into the fabric of the course. Students can be tested for performance related to activities done in class. For a conversation course, oral pair work exams are much more relevant than written exams or one-on-one interviews. There may also be benefits in regards to student motivation. If students are aware that they will be tested on activities similar to the ones done in class, they may have more incentive to be attentive and use class time effectively.”
I had originally planned on having my guys do a lot of in-class pair activities so that I could test them in pairs, but learner motivation, maturity, and self-discipline in the classroom quickly made it apparent that I’d spend more time trying to get the low-level/low-motivation language learners to do the paired learning tasks/activities, and keeping them from disrupting the other pairs who were actually doing the work, than anything else. I quickly reassessed my curriculum plan and changed the testing from a paired to a teacher-student interview format.
I also took a look at the different ways one can test speaking. Here are links and descriptions.
CPE Speaking test:

“Duration. 19 minutes (28 minutes for groups of three at centres where there’s an odd number of candidates).
Participants. Candidates are interviewed in pairs. There are two examiners present: one who asks the questions, the other acts as assessor and doesn’t speak during the interview.
Format. The oral test consists of three parts.”
“Tests ability to: use language for social purposes, such as in making introductions, answering questions, giving an opinion.
This first section of the CPE Speaking exam lasts about 3 minutes (4 minutes for groups of three). In this section the examiner will ask you at least three questions to give you the chance to introduce yourself and for you to give an opinion on a general topic to do with your life experiences, interests etc.”
“Tests ability to: use language to discuss and interpret, to agree, disagree or agree to disagree, negotiate and collaborate, to rank or classify, speculate, evaluate, make decisions etc.
There are two sections to Part 2 of the CPE Speaking test, which lasts about 4 minutes (6 minutes for groups of three). The examiner will ask you and your partner to talk about a set of visual prompts together.”
“Tests ability to: speak at length coherently, use language to develop a topic, describe, compare and contrast, hypothesise and comment.
Part 3 of the CPE Speaking test lasts about 12 minutes including 2 minutes for each long turn and 4 minutes for the final discussion. Candidate A is passed a card and has to speak about the topic without interruption, either from the examiner or their partner. When Candidate A has finished the examiner asks Candidate B a brief question about the topic. The roles are then reversed: Candidate B is given a different card and speaks for 2 minutes followed by Candidate A who answers a brief question about the topic. At the end of the long turns both candidates participate in a discussion with the examiner about the theme of the two topics.”
IELTS Speaking test:

“Duration: Between 11 and 14 minutes.
Participants: Candidates interviewed individually. The test is recorded.
Format: The test consists of three parts.”
“Part 1 of the IELTS Speaking test lasts between 4 and 5 minutes. The examiner will ask some simple ‘getting-to-know-you’ questions which will help the examiner find out a little about you and help put you at ease. These will be general questions such as about your family, your studies, where you come from or what your interests are.”
“Part 2 of the IELTS Speaking test lasts between 3 and 4 minutes (including 1 minute preparation time). The examiner gives you a task card and you have to speak about the subject without interruption for between 1 and 2 minutes.”
“In Part 3 of the test, which lasts between 3 to 4 minutes, the examiner will ask you questions linked to the topic in Part 2.”
While doing my research I came across a two-page excerpt from a research article, The paired speaking test format: recent studies by Linda Taylor, in which Taylor discusses the drawbacks of one-on-one testing versus pairing two students for speaking tests. I think the article makes extremely valid points about language learner anxiety: being paired with another student helps relax test-takers, raises their performance levels, and produces a wider range of speaking skills and content. A teacher/evaluator paired with a student, by contrast, has an ‘asymmetrical’ relationship that shapes what a student thinks they can and can’t do, based on the relationship rules of their L1 classroom culture (Korean public school classroom culture is notoriously imbalanced in terms of teacher-student power dynamics). I would argue, though, that I create and foster a sense of informality and friendliness between myself and the students in my conversation classes. To play Devil’s Advocate with myself, I would also say that in a testing situation the normal English conversation class teacher-student dynamic I try to foster gets dampened by testing anxiety and its powerful influence on a student.
Speaking Test Rubrics: I told students I would post copies of the rubric, and explanations of the rules and standards for each point, in the classroom. I also went over the rubric in the test review class, and had my co-teacher translate what I said to reinforce it and make sure the students understood how they would be tested. I think the students were surprised at how transparent and fair I was making the testing process; overall, their reactions were positive.
I’m going to end part 1 of this post on speaking tests with copies of the handouts I gave my classes, the list of books I referred to while preparing my speaking tests, and links to other posts I’ve written about speaking tests in Korean public schools. It’ll probably take a few days to finish writing part 2 of this post; in it I plan to write about some of the challenges and issues that arose during the designing of the tests, during the testing itself, and also more about test 2.
1. Each class will be divided into two groups: A and B.
각 반은 A와 B 두 그룹으로 나누어 질 것입니다.
A lottery will decide the order in which students are tested.
추첨을 통해 각각의 그룹 안의 학생들의 시험 순서가 정해질 것입니다.
2. Group A testing. March 29th to April 2nd
A 그룹 시험일. 3월 29일부터 4월 2일까지
Group B testing. April 5th to 9th
B 그룹 시험일. 4월 5일부터 4월 9일까지
3. Go to the waiting area 5 minutes before class time/test time begins.
수업/시험 시간이 시작되기 5분 전까지 대기실로 가기 바랍니다.
4. Leave all papers and notes outside the test room. If you are caught cheating you will get a ZERO score.
모든 서류와 메모들을 시험실 밖에 두고 들어오기 바랍니다. 만일 부정행위가 발각된다면 0점을 받게 될 것입니다.
5. The Korean teacher will tell you when to go to the English classroom for your test.
한국인 선생님께서 여러분이 시험을 치르기 위해 언제 영어 교실로 가야할 지를 알려주실 것입니다.
6. Five students at a time should wait outside the English classroom in the hallway QUIETLY.
한 번에 5명씩의 학생들이 영어 교실 밖 복도에서 조용히 시험을 기다리게 될 것입니다.
7. Wait quietly outside the classroom door. If you talk loudly and/or laugh you will do 30 minutes of lunch time cleaning.
교실 밖 복도에서 조용히 기다리세요. 만약 큰 소리로 말하거나 웃는다면 점심시간 30분 동안 청소를 하게 될 것입니다.
8. When it is your turn, come into the classroom and be ready to start the test immediately.
당신의 차례가 되었을 때, 교실로 들어와서 바로 시험을 볼 수 있는 준비를 하도록 하십시오.
9. The test is only 2 minutes, so please be ready to speak English.
시험은 오직 2분이 소요되기 때문에 영어로 말할 준비가 되어 있기를 바랍니다.
10. After the test is finished you should return to the waiting area (homeroom).
시험이 끝난 후에는 대기실로 돌아가길 바랍니다.
11. Do not talk about the test with other students. If you do this you only help them get a higher score than you.
다른 학생들과 시험에 대해서 말하지 마십시오. 만약 하게 된다면, 이는 단지 그들이 당신보다 시험에서 더 높은 점수를 받는 것을 도와주는 것이 될 것입니다.
Speaking Test Study Guide
1. Find a quiet place to practice and study.
2. Find a partner to practice asking and answering questions with.
3. Memorize spoken English.
a) Read over the class handouts.
b) Write all of the expressions, questions, and answers 5Xs each.
4. Practice speaking the expressions at NORMAL VOLUME and SPEED.
5. Practice in the same way you normally speak. Do not practice speaking quietly or in a robot voice.
6. Make an mp3 recording of yourself speaking, and listen to it. Try to find errors and then practice the correct pronunciation and intonation.
7. Speak slowly when you begin your practicing, and then slowly speed up to native speaker speed if possible.
8. Do some speaking practice each day over many days. Do not leave all of your practice for the NIGHT BEFORE the test day, or the HOUR before the test.
9. After memorizing the English do not use a script paper when you practice speaking. Practice speaking with NO PAPER because you cannot have a script in the test.
10. If you need help with pronunciation or intonation, or have a question about the language on the speaking test, YOU should ask your Korean teacher, or Jason, for help. Do NOT ask for help in the hour just before your test date and time!
NOTE 1. My co-teachers insisted on the point range for each letter grade, thus the unusual “D” value.
NOTE 2. Use an mp3 player to record each speaking test. If necessary, you can use this later to support your evaluation and the score you give a student if it is challenged. If possible, and necessary, use a video camera (or point and shoot camera with video capability) if you are assessing body language and gestures.
Criteria for Marking: Explanation of Point Values
Eye contact and handshake (lowest to highest score):

Korean style: eyes down, left hand/arm in horizontal position, very soft hand pressure, holding the hand too long (5+ seconds), bow.

Left hand begins in the Korean style position but the student self-corrects; right hand holding time 4 seconds or less; grip pressure too soft or too strong.

Left hand stays at the side the entire time; right hand holding time 3 seconds or less; pressure a little too soft or a little too strong; good eye contact.

Left hand stays at the side the entire time; right hand holding time 3 seconds or less; medium pressure, not too much pressure.
Intonation (lowest to highest score):

No up and down sounds; robot speaking.

Very little variation in up and down sounds, and wrong direction of sounds with words.

Good up and down sounds paired with appropriate words.

Excellent up and down sounds paired with appropriate words.

Target patterns at every level: Yes/No questions: stress the most important word and go up at the end of the question. WH questions: jump to the most important word and step down to the end of the question.
I had to do a similar speaking test, but I had middle school students: a large number of middle school students (800+) to test in a 1-month period. The co-teachers were a little bit more helpful in designing the tests than yours were. Not much more helpful (I still had to cobble questions together myself), but at least they told me to differentiate the tests between advanced and basic students.
I had enough time with them for one practice session, 2 weeks before the test. Groups of 3 students, 3 questions each out of 24 possible (easy, medium, hard). I wanted a simple grading rubric (+/-) but the teachers wanted numbers from 100 to 60, so I went with that.
The students performed as well as I expected they would. There were fewer “frozen” students than I originally assumed there would be. But there were some disappointments:
“What school do you go to?” I can’t believe the number of students that got this wrong. After a while I started telling students who got it wrong, “You’re going back to elementary school.”
“Which is bigger, car or bus?” Again, quite a few got this wrong. I don’t know of any cars in Korea that big. I remember the 80’s pimp Caddies…
“When is the Yeosu Expo?” This is on a sign on EVERY DAMN BLOCK in the city! There’s like no way they can’t know this.
“Where is Dokdo located, the east sea or the west sea?” I didn’t ask this question very often, but I was disappointed with the results. At one point I ranted about it to the class after the test: “Do you know how much money your government and K-POP stars waste on this crap? I’m not even from here, yet I know where the damn thing is!”
*sigh* next semester, more practice…
I hear you on the whole ‘the only thing that matters is producing test scores’ thing.
I’m curious–how did you have the students practice? Did you divide them into groups of 3 and then have them ask each other sample questions?
How much time did you have per test? Did you give them any kind of ‘warm-up questions’? For example, “What’s your name?” to give them time to settle down a bit…?
I’m also curious if the questions you made were from the textbook or your lessons or ….
I only see them once every two weeks (if that), and so I was able to dedicate one period for prep and one period for the test. For the prep, I gave them the answer form (aka scaffold) and asked them to answer and write it down on their test question sheet. So they had two weeks to practice the answers they wrote down.
Easy questions were essentially the warm-up questions. I asked 1 easy question, 1 medium question, and 1 hard question, in that order, to each group, depending on the dynamics of the question (“How are you today?” can be reused, while “Which is smaller, cat or mouse?” only has a shelf life of once per group).
I made up all the questions, though some questions like “What is your father’s job?” came from a previous lesson on job titles earlier in the semester. Some of the questions are local knowledge questions. Nonetheless, there were no questions on the test that were not covered in the prep class.
There was no time for a group practice, unfortunately.
Just wondering what textbook you used in your high school classes. You mentioned that you were leaving a ‘book’ for the next native English teacher and that you hoped he would use it.
I’ve been asked to get a textbook/workbook for my afterschool classes. Haven’t found a very decent one yet. Some are OK, but none stand out.
PS: Interesting post. I’ve done high school interviews myself. In fact, that’s what my WordPress account stems from!