This past week we focused on evaluating CALL software. Three of the frameworks for evaluating CALL mentioned in class were methodological frameworks, checklists, and SLA-based approaches (Hubbard, 2006). In this blog post, I comment on each of these.
Methodological Frameworks Based on Language Teaching
The methodological framework approach is broader than a checklist. An example of this kind of framework can be found in CALICO's software review guidelines, which are organized around four broad areas: teacher fit, learner fit, activities, and technology features. What I like about this framework is that the technological evaluation is paired with a strong orientation toward defining how the software will benefit the learner and how it can be used in teaching. Hubbard (2006) points out that until the 1980s, CALL evaluation was largely left to checklists adapted from general education, so the methodological framework approach is a step in the right direction because it addresses language learning and teaching more directly.
Checklists
Checklists have been criticized for overemphasizing technological aspects at the expense of covering language pedagogy adequately. However, I think that if they are detailed enough, they can give language teachers evaluative criteria that can be applied quickly to a piece of software. Neither the SLA-based nor the methodological approach is as easy to apply when reviewing an application to make a quick decision about whether or not to use it in a language course. I think the checklists Dr. Smart handed out in class demonstrate this ease of use.
SLA-Based Approaches
One approach that integrates SLA and CALL evaluation is the set of six criteria designed by Jamieson, Chapelle, and Preiss (2005):
1. Language learning potential: The degree of opportunity present for beneficial focus on form;
2. Learner fit: The amount of opportunity for engagement with language under appropriate conditions given learner characteristics;
3. Meaning focus: The extent to which learners' attention is directed toward the meaning of the language;
4. Authenticity: The degree of correspondence between the learning activity and target language activities of interest to learners out of the classroom;
5. Positive Impact: The positive effects of the CALL activity on those who participate in it; and
6. Practicality: The adequacy of resources to support the use of the CALL activity.
This approach overlaps considerably with the criteria used to evaluate assessments, which is understandable given the researchers' interest in language assessment. To me, this approach is not particularly intuitive and has some of the same ambiguity I felt was apparent in the evaluative criteria used for reviewing language tests. I feel one would have to be trained in this kind of evaluation, and see a number of examples of it being applied, before it would make sense.
Of the three types of approaches, I prefer the methodological framework. Looking at the description of the review criteria posted on the CALICO website, it seems to me to be the approach that makes the most sense and is the most approachable. Finally, I think it is interesting to look at the reviews of different CALL tools that have been posted on the site to better understand how evaluative criteria can be put to use.
References
Hubbard, P. (2006). Evaluating CALL software. In L. Ducate & N. Arnold (Eds.), Calling on CALL: From theory and research to new directions in foreign language teaching. San Marcos, TX: CALICO.
Jamieson, J., Chapelle, C., & Preiss, S. (2005). CALL evaluation by developers, a teacher, and students. CALICO Journal, 23(1), 93-138.
Great review. I think you've summed up the day's class succinctly. I tend to think the methodological frameworks are more valuable as well, especially from the perspective of an individual educator.