
===== Formative Evaluation =====

The one-to-one evaluation was two-fold: it involved speaking with the learners themselves, followed by a second evaluation with two other instructors.

The Part One evaluation was conducted via class testing assessments and individual learner interviews in a face-to-face setting. The instructor chose seven learners to interview and analyze. After greeting the seven students, we all sat down and I first gave the students the written survey to complete. Next, we looked at a spreadsheet of their grades and the actual quizzes and tests given, and they were given the opportunity to listen to the recordings of their Oral Presentations. As we looked through the different items, the students commented on the instructional materials, the difficulty of the content, and the complexity of the assessment tools. The learners spoke about which parts were difficult and which were easy, which were tricky, and which worked particularly well.

The following are some sample questions posed during the interview:

 * 1) What did you think of the instruction?
 * 2) Specifically, what did you think of:
 * Wiki - collaboration
 * PowerPoint presentation for the visual show
 * Written and oral script writing
 * Oral Presentation
 * Peer review on Ning
 * Final essay (synthesis assessment)
 * 3) Was anything difficult to understand?
 * 4) Could you relate to the examples?
 * 5) How did you feel about the practices and feedback?
 * 6) Are there any changes you would recommend?

Some issues uncovered as part of the evaluation included:

1. Students preferred that the instruction of grammar take place in a face-to-face meeting, such as a classroom lecture, as opposed to individual learning. Students performed poorly on the first written quiz after studying independently in the //Ven// and //Barrera// books with only a brief instructor review at the end of the segment. Students improved their written assessment scores after the teacher did the following: she created an “English” version of the notes; posted the student-teacher written notes in a PowerPoint on the Ning; provided a Web-based practice opportunity from the [[www.studyspanish.com]] website; and added individual and small-group instructional time for review during class and after-school tutoring. The teacher then re-tested.

2. The instructor has also decided to move the written “application” assessment (grammar quiz) so that it takes place after the Recipe Projects (the written and oral alternative “application” assessments), allowing for more student practice and preparation of the Spanish content.

3. Some changes suggested by the test subjects may not be warranted. The students are 16 years old and may not yet have developed good organizational or study skills. They are also in the beginning stages of their rigorous two-year IB Programme and may not yet be astute in their learning practices. The instructor has noticed that students typically study to “recognize, not memorize” the concepts. The current Spanish content is intensive, and students must judiciously choose answers to reflect their knowledge.

In the small-group evaluation, the instructor decided to add the following sample questions:

1. How did you feel about the intensity of the Spanish material and the pacing of the course?
2. How did you feel about using the technology tools to enhance your learning?
3. Did you have enough time to complete the tasks? If not, realizing that this course only takes place during four class meetings per week, what would you suggest?

During Part Two, using questions suggested by the textbook //Designing Effective Instruction//, I self-evaluated during a “debrief” with two colleagues at my school:

1. What is my overall impression of the instruction?
2. Did the learners have any difficulty with the material?
3. Were the in-class practice activities effective?
4. Were the online components effective?
5. Are there any changes that need to be made to the performance checklist?
6. Are there any other changes that should be considered? (Morrison, Ross, & Kemp, 2007)

During Part One and Part Two, I examined the posttest results and prepared a summary chart:

|| **Assessment Instrument** || **Objective #** || **% Correct/Accuracy – Sample group average** ||
|| Vocabulary Test || 1 || 91% ||
|| Grammar Quiz || 2 || 71% ||
|| Oral Quiz || 3, 4 || 94% ||
|| Reading Comprehension Quiz || 6 || 94% ||

Based on the posttest data from the sampling, the teaching methods and assessment for the grammar material still appear to need revision. After modifying the instruction methods, I decided to move the Grammar Quiz so that it takes place after the alternative assessments, as stated above. The test items may also need to be revised to include more matching sections and smaller sections (testing one concept at a time instead of all at once), in order to help students achieve higher scores.

See Design Project Report 3 (//Discussion of small group data// section) for specific responses to questions posed during the Evaluation Interview.