IT in Education

Transcript and Presenter's Notes
1
IT in Education
  • Evaluation

2
Why Evaluation?
  • In his opening address, U.S. Secretary of
    Education Richard Riley remarked, "The primary
    reason for this conference is to gather
    information from all of the outstanding schools,
    districts, and states represented here, so that we
    can study it, share it, and learn from it. Just
    as important as learning what works, we must
    learn what does not work. We must not assume
    everything that employs technology is going to be
    successful. That is why evaluation is so
    important. And then we must use that evaluation
    to create positive change."

The Secretary's Conference on Educational Technology: Evaluating the Effectiveness of Technology, July 12-13, 1999, Washington, D.C.
3
  • Do we really use evaluation to create positive
    change?

4
Evaluate what?
  • Software/courseware: for choosing suitable teaching/learning software
  • Computer/Internet use: for knowing whether IT is used properly
  • Perception/attitude: for knowing students' feelings about IT
  • Learning effectiveness:
  • Academic achievement: does IT improve learning?
  • Generic skills: does IT help develop higher-order thinking skills that can be transferred to other disciplines?
  • Web-based collaborative learning behaviour
  • Implementation of IT in schools

5
Evaluation Methods
  • Checklist/Questionnaires
  • Achievement Tests
  • Tests on problem solving, creativity, etc.
  • Interview/observation

6
Software Evaluation
  • By using a checklist or similar instrument, completed by experts, teachers, or users
  • By experiments
  • Evaluation of CAI:
  • http://www3.fed.cuhk.edu.hk/ited/AppOfComp2004/Lecture13-Evaluation/Evaluation%20of%20Computer%20based%20labroatory%20simulation.pdf
  • Evaluation of an interactive learning environment:
  • http://www3.fed.cuhk.edu.hk/ited/AppOfComp2004/Lecture13-Evaluation/Evaluation%20of%20the%20Hyper%20Apuntes%20Ingteractive%20Learning%20environment..pdf

7
Software Evaluation
  • Checklist: an example (a scoring sketch follows)
  • http://ited.fed.cuhk.edu.hk/CommonUse/cai.htm
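
A checklist instrument like the one linked above aggregates item-level ratings into an overall judgement. Below is a minimal Python sketch of that aggregation; the criteria and weights are illustrative assumptions, not taken from the linked checklist.

    # Hypothetical software-evaluation checklist: a reviewer rates each
    # criterion 1-5; the weighted mean gives an overall score.
    CRITERIA_WEIGHTS = {
        "content accuracy": 3,     # illustrative weights, not from the
        "ease of use": 2,          # checklist linked above
        "feedback to learner": 2,
        "screen design": 1,
    }

    def overall_score(ratings):
        """Weighted mean of 1-5 ratings over all checklist criteria."""
        total_weight = sum(CRITERIA_WEIGHTS.values())
        weighted = sum(w * ratings[c] for c, w in CRITERIA_WEIGHTS.items())
        return weighted / total_weight

    print(overall_score({"content accuracy": 4, "ease of use": 5,
                         "feedback to learner": 3, "screen design": 4}))  # 4.0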

8
Software Evaluation
  • What is generally evaluated?
  • Read this: http://www3.fed.cuhk.edu.hk/ited/AppOfComp2004/Lecture13-Evaluation/Evaluation%20of%20the%20Hyper%20Apuntes%20Ingteractive%20Learning%20environment..pdf
  • And answer.

9
Can the following questions be answered?
  • Do students learn better with CAI?
  • In what ways?
  • Do they like using CAI?
  • Others?
  • A collective answer?

10
Measure of Learning Effects
  • Meta-analysis: A Meta-Analysis of the Effectiveness of Teaching and Learning with Technology on Student Outcomes

11
Effects of CAI on Academic Achievement: A Meta-Analysis
  • Each study examined in this research (see the screening sketch after this list):
  • Was conducted in a secondary school.
  • Included quantitative results pertaining to
    academic achievement as the dependent variable
    and CAI as the treatment variable.
  • Was of a quasi-experimental, experimental, or
    correlational research design.
  • Had a combined minimum total of 20 students in
    the experimental and control groups.
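
Applying these inclusion criteria is mechanical screening. A minimal Python sketch follows; the field names and example records are assumptions for illustration, not the meta-analysis's actual data.

    # Screen candidate studies against the meta-analysis inclusion criteria.
    ACCEPTED_DESIGNS = {"experimental", "quasi-experimental", "correlational"}

    def meets_criteria(study):
        return (study["setting"] == "secondary school"
                and study["outcome"] == "academic achievement"
                and study["treatment"] == "CAI"
                and study["design"] in ACCEPTED_DESIGNS
                # combined minimum total of 20 students in both groups
                and study["n_experimental"] + study["n_control"] >= 20)

    candidates = [  # hypothetical records for illustration
        {"setting": "secondary school", "outcome": "academic achievement",
         "treatment": "CAI", "design": "quasi-experimental",
         "n_experimental": 25, "n_control": 21},   # included
        {"setting": "primary school", "outcome": "academic achievement",
         "treatment": "CAI", "design": "experimental",
         "n_experimental": 40, "n_control": 40},   # excluded: not secondary
    ]
    included = [s for s in candidates if meets_criteria(s)]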

12
Effects of CAI on Academic Achievement: A Meta-Analysis
  Author(s)              Date     n       ES
  Bailey, T. E.          1991     46    0.775
  Bass, et al.           1986    121    0.414
  Bass, et al.           1986    121    0.066
  Bass, et al.           1986     85    0.626
  Bass, et al.           1986     85    0.421
  Battista, et al.       1987     48    0.189
  Birkenholtz, et al.    1987    312    0.143
  Carnes, et al.         1987    100    0.280
  Christie, et al.       1989    265    0.044
  Christie, et al.       1989    265    0.108
  Davidson, R. L.        1985     54    0.175
  Dunn, S. M.            1985     96    0.413
  Durnin, R.             1985    154    0.928
  Durnin, R.             1985    154    1.360
  Durnin, R.             1985    154    1.000
  Durnin, R.             1985    154    0.587
  Elliot, E. L.          1985    191   -0.042
  Elliot, E. L.          1985    191    0.647
  Elliot, E. L.          1985    191    0.114
  Ferrell, B. G.         1986     91    0.488
  Horton, et al.         1994     72   -0.360
  Horton, et al.         1994     72    0.125
  Hounshell, et al.      1989    202    0.438
  Hunter, et al.         1992     32   -0.366
  King, R. V.            1988    342    0.230
  Klein, et al.          1987     96   -0.061
  Landry, S. A.          1987     29   -1.250
  Lewis, et al.          1993    148    0.156
  Mannuel, S. Q.         1987     28   -0.073
  Marty, J. E.           1985    425    0.293
  Mason, M. M.           1984     49    0.884
  McCaskey, et al.       1989    144    0.193
  Russin, I.             1995     36    0.071
  Russin, I.             1995     38   -0.725
  St. Pierre, K. A.      1992     72   -0.557
  St. Pierre, K. A.      1992     72   -0.520
  Wohlgehagen, K. S.     1992    242   -0.378
  Wood, J. B.            1991    104    0.081
  Wood, J. B.            1991    104    0.077

13
Mean Effect Size by Differing Years
  Year    Number of ES    Mean ES
  1984          1          0.884
  1985          9          0.606
  1986          6          0.385
  1987          6         -0.129
  1988          1          0.230
  1989          4          0.196
  1991          3          0.311
  1992          4         -0.455
  1993          1          0.156
  1994          2         -0.118
  1995          2         -0.327
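
Each per-year mean above is the simple average of the study-level effect sizes from the previous slide. A minimal Python sketch of that computation (only the 1984 and 1991 entries are reproduced here; the remaining pairs are elided):

    # Mean effect size by year, from (year, ES) pairs taken from the
    # study table on the previous slide (abbreviated here).
    from collections import defaultdict

    effect_sizes = [
        (1984, 0.884),                                # Mason, M. M.
        (1991, 0.775), (1991, 0.081), (1991, 0.077),  # Bailey; Wood x2
        # ... remaining (year, ES) pairs from the table ...
    ]

    by_year = defaultdict(list)
    for year, es in effect_sizes:
        by_year[year].append(es)

    for year in sorted(by_year):
        values = by_year[year]
        print(year, len(values), round(sum(values) / len(values), 3))
        # prints 1984 1 0.884 and 1991 3 0.311, matching the table above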

14
Overall average effect size, in SD units
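
An effect size "in SD units" is a standardized mean difference; assuming the usual definition (whether the divisor is the control-group SD, as in Glass's delta, or the pooled SD, as in Cohen's d, is left open here):

    ES = \frac{\bar{X}_{\mathrm{CAI}} - \bar{X}_{\mathrm{control}}}{SD}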
15
Relationship between mean effect sizes and years
16
A More Recent Meta-Analysis
  • http://www.ncrel.org/tech/effects2/waxman.pdf

17
Satisfied with the results?
18
CAI or Integrating IT in T/L
19
Media-comparison studies (studies comparing instruction with and without computers) may be misconceptualized (Clark, 1994).
  • Achievement gains attributed to the computer mode of delivery probably occur instead because of the instructional methods employed in the software used.
  • Papert (1987) argues that controlled experiments
    are useful only in a "conservative context where
    change is small, slow, and superficial.
  • They are based on a concept of changing a
    single factor in a complex situation while
    keeping everything else the same. This is
    incompatible with the enterprise of rebuilding an
    education system in which nothing will be the
    same" (p. 22).

20
Computer-based learning (CBL)
  • defined more broadly to include such traditional
    CBL types as drill-and-practice software,
    simulations, and computer-based labs, as well as
    more recent computer applications such as
    Internet communication and information retrieval,
    collaborative projects using HyperStudio (1996)
    and the Internet, and Jonassen's (1996) mind
    tools, such as spreadsheets and computerized
    concept mapping using Inspiration (1994).
  • The teachers involved were ordinary teachers
    whose computer competence varied from novice to
    advanced.
  • The various computer hardware and applications
    mentioned here are integrated into teaching and
    learning whenever necessary and appropriate.

21
Effects of Integrating IT into Classroom
  • No significant effect of computer integration on achievement.
  • Although positive attitude toward computers was high both before and after the integration period, there was no significant change in student attitude toward computers after the computer integration.
  • Generally, students perceived using computers as having a positive effect on their learning.

22
What are the problems?
  • Does IT enhance learning or not?
  • Should we use CAI or integrate IT?

23
Critical issues in evaluating the effectiveness of technology in education
  • The effectiveness of technology is embedded in
    the effectiveness of other school improvement
    efforts.
  • Current practices for evaluating the impact of
    technology in education need broadening.
  • Standardized test scores offer limited formative
    information with which to drive the development
    of a school's technology program. Most schools
    are looking for additional means for collecting
    useful data for this purpose.
  • Schools must document and report their evaluation
    findings in ways that satisfy diverse
    stakeholders' need to know.
  • In order for evaluation efforts to provide
    stakeholders with answers to their questions
    about the effectiveness of technology in
    education, everyone must agree on a common
    language and standards of practice for measuring
    how schools achieve that end.
  • The role of teachers is crucial in evaluating the
    effectiveness of technology in schools, but the
    burden of proof is not solely theirs.
  • Implementing an innovation in schools can result
    in practice running before policy. Some existing
    policies need to be "transformed" to match the
    new needs of schools using technology.

24
The effectiveness of technology is embedded in
the effectiveness of other school improvement
efforts.
  • The school community members use technology to
    simplify, facilitate, and enhance individualized
    and social learning processes within its
    interdisciplinary curriculum.
  • Teachers are seen as leaders, facilitators, and
    mentors, well grounded in technology
    implementation strategies and well trained in the
    use of the most current computing equipment and
    software applications.
  • Children exposed to interdisciplinary units of
    study use technology as a tool to become
    literate, cooperative, problem-solving,
    self-motivated learners, and that is what Mantua
    is all about.
  • What most distinguishes education at Mantua
    Elementary is that its students are not passive
    recipients of knowledge, but rather, active
    participants in the full educational process.

25
Current practices for evaluating the impact of
technology in education need broadening.
  • It is about learning and the need to find new
    ways to identify and measure the skills and
    knowledge that students gain from using
    technology.
  • It is about stakeholders' needs for information
    beyond self-report analyses and traditional
    standardized testing.
  • It is about building the capacity of teachers to
    evaluate technology resources and to align their
    uses with the learning goals and content
    standards of the curriculum.
  • It is about evaluating technology implementation
    efforts, curriculum integration methods, and
    learning processes in order to make sound
    decisions for continual improvement.
  • Ultimately, the issue is about involving the key
    stakeholders, identifying appropriate measurable
    indicators, and developing reliable instruments
    that will yield insightful and valid information
    about what makes educational technology
    effective.

26
Standardized test scores offer limited formative
information with which to drive the development
of a school's technology program.
  • Formative evaluation:
  • tells what technology applications work, under what conditions, and with which students.
  • tells how technology affects student attitudes toward learning.
  • shows the impact of technology on promoting collaboration among diverse learners.
  • tracks technology literacy skills development and indicates the impact of technology access.
  • tells teachers about their students' progress toward developing the skills to access, explore, and integrate information; think at high levels; and design, experiment, and model complex phenomena.
  • yields information on the effectiveness of
    professional development activities, the adequacy
    of school management systems, and other issues
    having to do with building the school technology
    infrastructure.

27
  • Evidence of technology effectiveness may lie in
  • fewer disciplinary referrals,
  • students' completing more complex homework
    assignments,
  • a new robustness in student performances,
  • students taking more difficult electives or
    requesting particular teachers and courses,
  • increases in requests for equipment and technical
    assistance,
  • declines in special education placements,
  • lower drop-out rates,
  • rises in college applications and acceptances,
  • increases in student job offers,
  • more parent participation.

28
Examples of Alternative Measurements
29
Evaluation of Problem Solving Abilities
  • The Research Cycle (McKenzie, 1995) repeats the following stages as many times as needed before the final reporting stage:
  • questioning
  • planning
  • gathering
  • sorting & sifting
  • synthesizing
  • evaluating
  • reporting

30
Assessment
  • Each student maintains a "research log" which tracks the reasoning used as well as the research actions taken while cycling through the process.
  • The classroom teacher maintains written anecdotal observations of the student's activity.

31
Scoring
  • 1. QUESTIONING
  • A researcher recognizes decisions, issues and
    problems when looking at a topic.
  • 5 - Discovers independently an issue or problem
    which needs a decision or solution after
    exploring a topic
  • 3 - Formulates questions about topics with adult
    help to elevate the question to focus on issues
    and problems
  • 1 - Relies upon adults to state questions and
    topics
  • 2. PLANNING
  • A researcher identifies sources of information
    likely to build understanding.
  • 5 - Selects high quality sources independently
    and efficiently
  • 3 - Selects sources with mixed success
  • 1 - Wanders from source to source without
    questioning which source will be most helpful

32
Scoring
  • 3. GATHERING
  • A researcher collects and stores information for
    later consideration.
  • 5 - Collects and organizes important information
    for retrieval independently
  • 3 - Collects information with some degree of
    organization
  • 1 - Loses track of most important information
  • 4. SORTING
  • A researcher reorganizes information so that the
    most valuable becomes readily available to
    support understanding.
  • 5 - Creates structure which provides a coherent
    and clear focus
  • 3 - Creates partial organization of information
  • 1 - Leaves information as gathered

33
Scoring
  • 5. SYNTHESIZING
  • A researcher recombines information to develop decisions and solutions.
  • 5 - Creates an original decision or solution
  • 3 - Reorganizes and combines strategies of others
  • 1 - Restates the decisions and solutions of
    others
  • 6. EVALUATING
  • A researcher determines whether the information
    gathered is sufficient to support a conclusion.
  • 5 - Tests solutions and decisions to see if
    supporting information is adequate
  • 3 - Looks for missing information
  • 1 - Reaches a hasty conclusion

34
Scoring
  • 7. REPORTING
  • A researcher translates findings into a
    persuasive, instructive, or effective product(s).
  • 5 - Creates and presents an original product which effectively addresses the original problem or issues
  • 3 - Provides a product which offers some insight
    with regard to the original problem or issues
  • 1 - Shares the work or thoughts of others
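
The seven 5/3/1 scales above can be recorded and summarised programmatically. A minimal Python sketch follows; the aggregation into a single mean is an added assumption, not part of McKenzie's rubric.

    # Research Cycle rubric: seven dimensions, each scored 5, 3, or 1.
    DIMENSIONS = ("questioning", "planning", "gathering", "sorting",
                  "synthesizing", "evaluating", "reporting")

    def rubric_mean(scores):
        """Validate a student's rubric scores and return their mean."""
        for dim in DIMENSIONS:
            if scores.get(dim) not in (5, 3, 1):
                raise ValueError(f"{dim}: score must be 5, 3, or 1")
        return sum(scores[d] for d in DIMENSIONS) / len(DIMENSIONS)

    student = {"questioning": 3, "planning": 5, "gathering": 3,
               "sorting": 3, "synthesizing": 1, "evaluating": 3,
               "reporting": 3}
    print(rubric_mean(student))  # 3.0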

35
(No Transcript)
36
(No Transcript)
37
Evaluation indicators for web-based collaborative learning
  • Productivity
  • Reflection
  • Contribution
  • Feedback

38
Productivity indicator

39
Reflection indicator

40
Contribution indicator

41
Feedback indicator
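
Indicators such as these are often operationalised as counts over discussion-forum records. The Python sketch below is purely illustrative: the specific measures chosen for each indicator are assumptions, not the definitions given on the slides.

    # Assumed operationalisation of the four indicators for one member,
    # from forum messages of the form:
    #   {"author": str, "replies_to": author-or-None, "words": int}

    def indicators(messages, member):
        mine = [m for m in messages if m["author"] == member]
        replies_given = [m for m in mine if m["replies_to"] is not None]
        replies_received = [m for m in messages if m["replies_to"] == member]
        return {
            "productivity": len(mine),                    # messages posted
            "reflection": (sum(m["words"] for m in mine)
                           / max(len(mine), 1)),          # mean message length
            "contribution": len(replies_given),           # replies to peers
            "feedback": len(replies_received),            # replies received
        }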

42
Implementation of IT in Education
  • The Hong Kong experience
  • Conceptual Framework for the Overall Study
  • Any missing components?
  • Any incorrectly positioned components?

43
Research Questions
  • Read the 6 sets of questions
  • Are there any questions that should be
    added/deleted?

44
Questions for Discussion
  • What do we expect from evaluation?
  • How does the evaluation affect our way forward?