Title: Evaluation
1 Evaluation
- Xiaofei Lu
- APLNG 588
- October 25, 2007
2 Agenda
- Assignments 4 and 5
- Software review
- Chapter 3
3 Introduction
- Value and effectiveness of CALL materials
- Student attitudes and perceptions in LLE
- CALL tasks
- Specific methodologies and strategies
- Evaluation targets and scales
- Software, Web sites, online courses, CMC, LMS
- Small, large, and national level evaluations
4 Introduction (cont'd)
- Evaluation methodologies
- Simple checklists or surveys
- Multifaceted, longitudinal studies involving qualitative and quantitative approaches
- Formative vs. summative evaluation
5 Introduction (cont'd)
- Distinction between evaluation and research
- Assessment of quality or value vs. contribution to knowledge or theory (Johnson 1992)
- Decision driven vs. hypothesis driven (Krathwohl 1993)
- Difference in audience
- Usefulness in evaluation process
- Effect-based vs. theory-based evaluation
6 Introduction (cont'd)
- Evaluation studies
- Aimed at establishing the worth of something
- Are primarily decision driven
- Are designed for a more targeted audience
- Have a practical outcome
- Draw value from the process as well as product
- Focus on "Did it work?" rather than "Why did it work?"
7 Approaches to software evaluation
- CALL focus
- Construction and testing of new artifacts
- Research practice around finished products
- Emerging vs. established technologies
- Has the technology come far enough?
- Language teacher or designer as evaluator
8 Checklists
- Levy and Farrugia's (1988) checklist
- Fourteen categories
- Content
- Program objectives
- Documentation
- Program instructions
- Student use
- Program response to student
- Program design
- Difficulties for ESL students
- Presentation
- Authoring material
- Teacher utility
- Motivational devices
- Technical quality
- Multiple-choice questions
9 Levy and Farrugia (1988)
- Related questions under each section
- Program objectives
- Are program objectives commensurate with those of the college, the teachers, and the students?
- Are program objectives clearly defined in the documentation?
- Are the stated objectives achieved?
- Are the objectives relevant to the student? Are they clear to the student?
10 Limitations of checklists
- No guidance on how evaluators
- Answer questions
- Resolve issues when positive and negative answers co-exist
11 Surveys
- Useful tool to collect student/teacher reactions
- Goals or purposes of evaluation
- Evaluate new technology, functionality, application
- Assess student attitudes and perceptions
- Obtain student/tutor feedback on CALL course
- Artifacts or products included
- CALL programs, CDs, Web sites, courses
12 Soboleva and Tronenko (2002)
- Learning Russian on the Web
- Observations, surveys, interviews
- Revealed strengths and weaknesses of course
- Focused nature of questions
- Detail extracted from surveys
- Designers responded to observations and concerns in further changes
13 Third-party evaluations
- Evaluations conducted by a third party vs. language teacher-designers
- Evaluators have no involvement in the product
- Challenges
- Choosing the appropriate evaluation criteria
- Knowing the software and its use in depth
- Values
14 The CALICO software review
- Assesses critical systematic properties using an intrinsically discursive process (Burston 2003)
- Required properties
- Pedagogical validity
- Curriculum adaptability
- Desired properties
- Efficiency
- Effectiveness
- Pedagogical innovation
15 The CALICO template
- Technical features
- Reliability of operation
- Ease of use
- Activities (procedure)
- Nature and design of activities
- Teacher fit (approach)
- Learner fit (design)
16 Teacher fit
- Most critical and hardest to assess
- Theoretical underpinnings of student activities
- Conformance to theories of cognitive development, SLA, and classroom methodology
- Accordance with teachers' curricular objectives
17 Learner fit
- Linguistic level
- Response handling
- Adaptation to learner differences
- Learning styles and strategies
- Learner control
- Design flexibility by the instructor
18 Examples
- Reviews
- Multifunction sites, e.g., Dave's ESL Cafe
- LL materials/activities for teachers through CMC
- Qualitatively different kinds of interaction
- Human vs. automatic feedback
- Activity-by-activity assessment
- One set of criteria not sufficient
19 Selected points of focus
- The designer-evaluator perspective
- A methodology focus
- An online teaching and technology focus
- A language/language skills focus
- A student/courseware focus
20 Methodology focus
- Evaluating Cultura (Furstenberg et al. 2001)
- Learning about language and culture using CMC
- Questionnaires, discussions, authentic materials
- Evaluation criteria
- Usefulness and interest for cultural understanding
- Quality of materials, activities, and web interface
- Nature and frequency of resources used
- Gains in understanding the target culture
- Focus on user perception of methodology
21 Online teaching and technology focus
- Evaluating Lyceum (Hampel 2003)
- Focus on viability of the tool for online tuition
- Learner/tutor response to it for learning/pedagogy
- Ease of learning to use the tool
- Stage of tool development relevant
- Questionnaires, observations, student logbooks
- Changes made based on feedback
22 Language/language skill focus
- Komori and Zimmerman (2001)
- Evaluated 5 web-based kanji programs
- Criteria from literature on autonomous kanji learning and a review of kanji learning programs
- Facilitated the design of WWKanji
- Blok et al. (2001)
- 6 factors for describing qualities of courseware for word learning, e.g., learning goals
- Evaluative questions for each factor
23 Student/courseware focus
- Effectiveness of a course or courseware
- Airline Talk Project (Gimeno-Sanz 2002)
- LL materials for airline staff
- Designers conducted student needs analysis
- Summative evaluation of first edition
- Results used formatively in Airline Talk 2
24 Larger-scale frameworks
- Hubbard's methodological framework
- Consistent with frameworks for language teaching methodology
- Nondogmatic and flexible
- Link development, evaluation and implementation
- Identify elements of teaching/learning processes and the multiple interrelationships among them
25 Hubbard's evaluation framework
- Table 3.3, page 61
- Teacher fit (approach)
- Learner fit (design)
- Operational description (procedure)
- Basis of the CALICO approach
26 Chapelle's framework
- Theory-based, task-oriented
- SLA theory central to framework
- Cognitive and socio-affective conditions for LLT
- Task-based instruction and focus on form
- Improving evaluation criteria
- Incorporating recent findings and SLA theory
- Guiding usage of criteria
- Ensuring applicability of criteria and theory to software and task
27 Chapelle's framework (cont'd)
- 5 principles for CALL evaluation
- A situation-specific argument
- Judgmental analysis of software and tasks and empirical analysis of learner performance
- Criteria should come from theory and research
- Criteria should be applied in view of task purpose
- LL potential should be central to evaluation
28 Chapelle's framework (cont'd)
- 6 criteria for CALL task appropriateness
- LL potential
- Learner fit
- Meaning focus
- Authenticity
- Positive impact
- Practicality
29 Discussion
- Evaluation focus and approaches
- What is evaluated
- How are they evaluated
- Strengths and limitations of
- Checklists and surveys
- Designer-oriented and third-party evaluations
- Nature of the object of the evaluation
- A closer look at general evaluation frameworks
30 Checklists
- Checklist format
- In defense of checklists against criticisms (Susser 2001)
- Accuracy, compatibility, and transferability issues
- Focus on technology vs. teaching/learning aspects
- Lack of objectivity, reliability and validity
- Bias toward particular approach/method
- General-purpose evaluation tool for any software
- Need for background knowledge and experience for accurate response
31 Surveys
- Hémard and Cushion (2002)
- More composite evaluation approach preferred
- Peer evaluations and discussion in formative stages
- User walkthroughs/workshops in summative phase
- Survey results useful
- Potential for shared experience
- Wise and informed ideas for problem solutions
32 Designer-oriented and third-party evaluations
- Designer-evaluator evaluations
- Intimate knowledge of object of evaluation
- Criteria defined to answer targeted questions
- Weaknesses may be glossed over
- Third-party evaluations
- Less knowledgeable of the object/context of evaluation
- Useful for narrowing the field
- Unbiased assessment
33 Nature of the object of evaluation
- Coherent and clear content and form
- Complex, hybrid systems
- Significant differences between elements
- Different criteria for elements of different categories
- Tutors: quality of input processing and feedback
- Tools: nature and quality of interaction
34 General evaluation frameworks
- Hubbard and Chapelle frameworks
- Give weight to different elements
- Give order to certain processes
- Chapelle (2001)
- Six criteria with an order of priority
- Theory-driven focus on form
- Single-priority criterion: LL potential (form)
- Noncore areas: positive impact
35 Contrasting Hubbard and Chapelle
- Table 3.5, page 82
- Chapelle
- Single priority on LL potential
- Theory-driven
- Focuses on establishing priorities
- Directed at the LL task
36 Contrasting Hubbard and Chapelle (cont'd)
- Hubbard
- Operational description, teacher fit, and learner fit equally weighted
- Hubbard's broad conception of teacher/learner fit
- Describes interrelationships
- Most suitable for courseware or web tutorials
37 Software review recommendations
- LL website
- Dave's ESL Cafe
- Randall's ESL Cyber Listening Lab
- Authoring software
- Adobe Authorware 7
- Online courses
- Learn Chinese online
- Korean through English
- Transitional English for Speakers of Spanish
- More