1
Evaluating the RTI Readiness of School Assessments
Jim Wright
www.interventioncentral.org
2
(No Transcript)
3
Interpreting the Results of This Survey
  • YES to Items 1-3. Background. The measure gives
    valid general information about the student's
    academic skills and performance. While not
    sufficient on its own, the data can be interpreted
    as part of a larger collection of student data.
  • YES to Items 4-5. Baseline. The measure gives
    reliable results when given by different people
    and at different times of the day or week.
    Therefore, the measure can be used to collect a
    current snapshot of the student's academic
    skills prior to starting an intervention.
  • YES to Items 6-7. Goal-Setting. The measure
    includes standards (e.g., benchmarks or
    performance criteria) for typical student
    performance (e.g., at a given grade level) and
    guidelines for estimating rates of student
    progress. Schools can use the measure to assess
    the gap in performance between a student and
    grade-level peers, and also to estimate expected
    rates of student progress during an intervention.
  • YES to Items 8-11. Progress Monitoring. The
    measure has the appropriate qualities to be used
    to track student progress in response to an
    intervention.
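
A minimal sketch of how this item-to-category mapping could be scored
programmatically (the item ranges and category names come from the survey
above; the function name and data format are our own assumptions):

```python
# Map survey item numbers to the RTI-readiness categories described above.
READINESS_CATEGORIES = [
    ("Background", range(1, 4)),            # Items 1-3
    ("Baseline", range(4, 6)),              # Items 4-5
    ("Goal-Setting", range(6, 8)),          # Items 6-7
    ("Progress Monitoring", range(8, 12)),  # Items 8-11
]

def rti_readiness(yes_items: set[int]) -> list[str]:
    """Return the categories for which every survey item was answered YES."""
    return [name for name, items in READINESS_CATEGORIES
            if all(item in yes_items for item in items)]

# Example: YES on items 1-7 supports background, baseline, and goal-setting
# uses of the measure, but not progress monitoring.
print(rti_readiness({1, 2, 3, 4, 5, 6, 7}))
```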

4
Background: Validity
  • Content Validity. Does the measure provide
    meaningful information about the academic skill
    of interest?
  • Convergent Validity. Does the measure yield
    results that are generally consistent with other
    well-regarded tests designed to measure the same
    academic skill?
  • Predictive Validity. Does the measure predict
    student success on an important future test,
    task, or other outcome?

5
Baseline: Reliability
  • Test-Retest/Alternate-Form Reliability. Does the
    measure have more than one version or form? If
    two alternate, functionally equivalent versions
    of the measure are administered to the student,
    does the student perform about the same on both?
  • Interrater Reliability. When two different
    evaluators observe the same student's performance
    and independently use the measure to rate that
    performance, do they come up with similar
    ratings?

6
Benchmarks & Goal-Setting
  • Performance Benchmarks. Does the measure include
    benchmarks or other performance criteria that
    indicate typical or expected student performance
    in the academic skill?
  • Goal-Setting. Does the measure include guidelines
    for setting specific goals for improvement?

7
Progress-Monitoring and Instructional Impact
  • Repeated Assessments. Does the measure have
    sufficient alternative forms to assess the
    student weekly for at least 20 weeks?
  • Equivalent Alternate Forms. Are the measure's
    repeated assessments (alternative forms)
    equivalent in content and level of difficulty?
  • Sensitive to Short-Term Student Gains. Is the
    measure sensitive to short-term improvements in
    student academic performance?
  • Positive Impact on Learning. Does research show
    that the measure gives teachers information that
    helps them to make instructional decisions that
    positively impact student learning?

8
Team Activity: Evaluate the RTI Readiness of
Your School's Academic Measures
  • Directions: Select one important literacy
    measure used by your school. On the form "Evaluate
    the RTI Readiness of Your School's Academic
    Measures" (next page), evaluate the RTI
    readiness of this measure. Be prepared to share
    your results with the group.

9
A Review of RTI Literacy Assessment/Monitoring Tools
Jim Wright
www.interventioncentral.org
10
RTI: Literacy Assessment & Progress-Monitoring
  • The RTI Literacy model collects reading
    assessment information on students on a schedule
    based on their risk profile and intervention
    placement.
  • Reading measures used are valid, reliable,
    brief, and matched to curriculum expectations for
    each grade. Depending on the grade, the battery
    of reading measures used can include assessments
    in phonological awareness, oral reading fluency,
    and basic reading comprehension.

Source: Burns, M. K., & Gibbons, K. A. (2008).
Implementing response-to-intervention in
elementary and secondary schools: Procedures to
assure scientific-based practices. New York:
Routledge.
11
RTI: Literacy Assessment & Progress-Monitoring
(Cont.)
  • To measure student response to
    instruction/intervention effectively, the RTI
    Literacy model measures students' reading
    performance and progress on schedules matched to
    each student's risk profile and intervention Tier
    membership.
  • Benchmarking/Universal Screening. All children in
    a grade level are assessed at least 3 times per
    year on a common collection of literacy
    assessments.
  • Strategic Monitoring. Students placed in Tier 2
    (supplemental) reading groups are assessed 1-2
    times per month to gauge their progress with this
    intervention.
  • Intensive Monitoring. Students who participate in
    an intensive, individualized Tier 3 reading
    intervention are assessed at least once per week.

Source: Burns, M. K., & Gibbons, K. A. (2008).
Implementing response-to-intervention in
elementary and secondary schools: Procedures to
assure scientific-based practices. New York:
Routledge.
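
The tier-based schedule above reduces to a simple lookup. A minimal sketch
(the frequencies come from the slide; the dictionary layout and the assumed
36-week school year are our own):

```python
# Monitoring frequency by intervention tier, per Burns & Gibbons (2008).
MONITORING_SCHEDULE = {
    1: "Benchmarking/universal screening: at least 3 times per year",
    2: "Strategic monitoring: 1-2 times per month",
    3: "Intensive monitoring: at least once per week",
}

def min_assessments_per_year(tier: int) -> int:
    """Rough minimum assessments per year implied by the schedule, assuming
    a 36-week (9-month) school year -- an assumption, not from the slide."""
    return {1: 3, 2: 9, 3: 36}[tier]

for tier in (1, 2, 3):
    print(f"Tier {tier}: {MONITORING_SCHEDULE[tier]} "
          f"(~{min_assessments_per_year(tier)}+ per year)")
```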
12
  • Apply the "80-15-5 Rule" to Determine if the
    Focus of the Intervention Should Be the Core
    Curriculum, Subgroups of Underperforming
    Learners, or Individual Struggling Students (T.
    Christ, 2008)
  • If less than 80% of students are successfully
    meeting academic or behavioral goals, the
    intervention focus is on the core curriculum and
    general student population.
  • If no more than 15% of students are not
    successful in meeting academic or behavioral
    goals, the intervention focus is on small-group
    treatments or interventions.
  • If no more than 5% of students are not successful
    in meeting academic or behavioral goals, the
    intervention focus is on the individual student.

Source: Christ, T. (2008). Best practices in
problem analysis. In A. Thomas & J. Grimes
(Eds.), Best practices in school psychology V
(pp. 159-176). Bethesda, MD: National Association
of School Psychologists.
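
The 80-15-5 rule is a threshold test on schoolwide data. A minimal sketch
(the thresholds are from the slide; the function name and return labels are
hypothetical):

```python
def intervention_focus(pct_meeting_goals: float) -> str:
    """Suggest an intervention focus from the percentage of students (0-100)
    meeting academic or behavioral goals, per the 80-15-5 rule."""
    pct_not_meeting = 100.0 - pct_meeting_goals
    if pct_meeting_goals < 80.0:
        # Fewer than 80% succeeding: address the core curriculum.
        return "core curriculum and general student population"
    if pct_not_meeting > 5.0:
        # Up to 15% struggling: small-group (Tier 2) interventions.
        return "small-group treatments or interventions"
    # 5% or fewer struggling: individual (Tier 3) intervention.
    return "individual student"

print(intervention_focus(72.0))  # -> core curriculum ...
print(intervention_focus(88.0))  # -> small-group ...
print(intervention_focus(97.0))  # -> individual student
```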
13
Curriculum-Based Measurement: Advantages as a Set
of Tools to Monitor RTI/Academic Cases
  • Aligns with curriculum goals and materials
  • Is reliable and valid (has technical adequacy)
  • Is criterion-referenced: sets specific
    performance levels for specific tasks
  • Uses standard procedures to prepare materials,
    administer, and score
  • Samples student performance to give objective,
    observable, low-inference information about
    student performance
  • Has decision rules to help educators interpret
    student data and make appropriate instructional
    decisions
  • Is efficient to implement in schools (e.g.,
    training can be done quickly; the measures are
    brief and feasible for classrooms, etc.)
  • Provides data that can be converted into visual
    displays for ease of communication

Source: Hosp, M. K., Hosp, J. L., & Howell, K. W.
(2007). The ABCs of CBM. New York: Guilford.
14
SOURCE: CAST Website:
http://www.cast.org/publications/ncac/ncac_curriculumbe.html
15
Measuring General vs. Specific Academic Outcomes
  • General Outcome Measures: Track the student's
    increasing proficiency on general curriculum
    goals such as reading fluency. Example: CBM-Oral
    Reading Fluency (Hintz et al., 2006).
  • Specific Sub-Skill Mastery Measures: Track
    short-term student academic progress with clear
    criteria for mastery (Burns & Gibbons, 2008).
    Example: Letter Identification.

Sources: Burns, M. K., & Gibbons, K. A. (2008).
Implementing response-to-intervention in
elementary and secondary schools: Procedures to
assure scientific-based practices. New York:
Routledge. Hintz, J. M., Christ, T. J., & Methe,
S. A. (2006). Curriculum-based assessment.
Psychology in the Schools, 43, 45-56.
16
(No Transcript)
17
CBM Literacy Measures: Sources
  • DIBELS (https://dibels.uoregon.edu/)
  • AimsWeb (http://www.aimsweb.com)
  • Easy CBM (http://www.easycbm.com)
  • iSteep (http://www.isteep.com)
  • EdCheckup (http://www.edcheckup.com)
  • Intervention Central
    (http://www.interventioncentral.org)

18
  • Reading: 5 Big Ideas
  • Phonemic Awareness/Specific Subskill Mastery
  • Alphabetics
  • Fluency with Text
  • Vocabulary
  • Comprehension

19
Initial Sound Fluency (ISF)
  • A standardized, individually administered measure
    of phonological awareness that assesses a child's
    ability to recognize and produce the initial
    sound in an orally presented word. The examiner
    presents four pictures to the child, names each
    picture, and then asks the child to identify
    (i.e., point to or say) the picture that begins
    with the sound produced orally by the examiner.
  • Time: About 3 minutes

SOURCE: Good et al. (2002). DIBELS administration
and scoring guide.
https://dibels.uoregon.edu/measures/files/admin_and_scoring_6th_ed.pdf
20
  • Reading: 5 Big Ideas
  • Phonemic Awareness/Specific Subskill Mastery
  • Alphabetics
  • Fluency with Text
  • Vocabulary
  • Comprehension

21
Phoneme Segmentation Fluency (PSF)
  • Assesses a student's ability to segment three-
    and four-phoneme words into their individual
    phonemes fluently. The PSF task is administered
    by the examiner orally presenting words of three
    to four phonemes. It requires the student to
    produce verbally the individual phonemes for each
    word.
  • Time: 1 minute

SOURCE: Good et al. (2002). DIBELS administration
and scoring guide.
https://dibels.uoregon.edu/measures/files/admin_and_scoring_6th_ed.pdf
22
  • Reading: 5 Big Ideas
  • Phonemic Awareness
  • Alphabetics/Specific Subskill Mastery
  • Fluency with Text
  • Vocabulary
  • Comprehension

23
Letter Naming Fluency (LNF)
  • Students are presented with a page of upper- and
    lower-case letters arranged in a random order and
    are asked to name as many letters as they can.
  • Time: 1 minute

SOURCE: Good et al. (2002). DIBELS administration
and scoring guide.
https://dibels.uoregon.edu/measures/files/admin_and_scoring_6th_ed.pdf
24
  • Reading: 5 Big Ideas
  • Phonemic Awareness
  • Alphabetics/Specific Subskill Mastery
  • Fluency with Text
  • Vocabulary
  • Comprehension

25
  • Reading: 5 Big Ideas
  • Phonemic Awareness
  • Alphabetics/Specific Subskill Mastery
  • Fluency with Text
  • Vocabulary
  • Comprehension

26
Nonsense Word Fluency (NWF)
  • Tests the alphabetic principle, including
    letter-sound correspondence and the ability to
    blend letters into words in which letters
    represent their most common sounds. The student
    is presented a sheet of paper with randomly
    ordered VC and CVC nonsense words (e.g., sig,
    rav, ov) and asked to verbally produce the
    individual sound of each letter, or to verbally
    produce, or read, the whole nonsense word.
  • Time: 1 minute

SOURCE: Good et al. (2002). DIBELS administration
and scoring guide.
https://dibels.uoregon.edu/measures/files/admin_and_scoring_6th_ed.pdf
27
  • Reading: 5 Big Ideas
  • Phonemic Awareness
  • Alphabetics/Specific Subskill Mastery
  • Fluency with Text
  • Vocabulary
  • Comprehension

28
  • Reading: 5 Big Ideas
  • Phonemic Awareness
  • Alphabetics
  • Fluency with Text/General Outcome Measure
  • Vocabulary
  • Comprehension

29
Oral Reading Fluency (ORF)
  • Student performance is measured by having
    students read a passage aloud for one minute.
    Omissions, substitutions, and hesitations of
    more than three seconds are scored as errors.
    Words self-corrected within three seconds are
    scored as accurate. The number of correct words
    per minute from the passage is the oral reading
    fluency rate.
  • Time: 1 minute

SOURCE: Good et al. (2002). DIBELS administration
and scoring guide.
https://dibels.uoregon.edu/measures/files/admin_and_scoring_6th_ed.pdf
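
The ORF score is simple arithmetic on a one-minute timed read. A minimal
sketch (the scoring rules are from the slide; the variable and function
names are our own):

```python
def oral_reading_fluency(words_attempted: int, omissions: int,
                         substitutions: int, hesitations_over_3s: int) -> int:
    """Words read correctly per minute (WCPM) for a one-minute passage.
    Self-corrections within three seconds count as accurate, so they
    never appear among the errors."""
    errors = omissions + substitutions + hesitations_over_3s
    return words_attempted - errors

# Example: 112 words attempted with 3 omissions, 2 substitutions, and
# 1 hesitation over three seconds -> 106 words correct per minute.
print(oral_reading_fluency(112, 3, 2, 1))
```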
30
  • Reading: 5 Big Ideas
  • Phonemic Awareness
  • Alphabetics
  • Fluency with Text
  • Vocabulary
  • Comprehension/General Outcome Measure

31
Comparison of RTI Assessment/Monitoring Systems
  • DIBELS: Dynamic Indicators of Basic Early
    Literacy Skills
  • Initial Sound Fluency: Preschool > Middle K
  • Letter Naming Fluency: Beginning K > Beginning Gr 1
  • Phoneme Segmentation Fluency: Middle K > End Gr 1
  • Nonsense Word Fluency: Middle K > Beginning Gr 2
  • Oral Reading Fluency: Middle Gr 1 > Gr 6

32
Comparison of RTI Assessment/Monitoring Systems
  • Easy CBM
  • Letter Naming Fluency: K > Gr 1
  • Letter Sound Fluency: K > Gr 1
  • Phoneme Segmentation Fluency: K > Gr 1
  • Word Reading Fluency: K > Gr 3
  • Oral Reading Fluency: Gr 1 > Gr 8

33
Comparison of RTI Assessment/Monitoring Systems
  • AimsWeb
  • Letter Naming Fluency: Beginning K > Beginning Gr 1
  • Letter Sound Fluency: Middle K > Beginning Gr 1
  • Phoneme Segmentation Fluency: Middle K > Middle Gr 1
  • Nonsense Word Fluency: Middle K > End Gr 1
  • Oral Reading Fluency: Gr 1 > Gr 8
  • Maze (Reading Comprehension Fluency): Gr 1 > Gr 8

34
Comparison of 2 RTI Assessment/Monitoring Systems
  • DIBELS
  • Initial Sound Fluency: Preschool > Middle K
  • Letter Naming Fluency: Beginning K > Beginning Gr 1
  • Phoneme Segmentation Fluency: Middle K > End Gr 1
  • Nonsense Word Fluency: Middle K > Beginning Gr 2
  • Oral Reading Fluency: Middle Gr 1 > Gr 6
  • AimsWeb
  • Letter Naming Fluency: Beginning K > Beginning Gr 1
  • Letter Sound Fluency: Middle K > Beginning Gr 1
  • Phoneme Segmentation Fluency: Middle K > Middle Gr 1
  • Nonsense Word Fluency: Middle K > End Gr 1
  • Oral Reading Fluency: Gr 1 > Gr 8
  • Maze (Reading Comprehension Fluency): Gr 1 > Gr 8

35
Elbow Group Activity: RTI-Ready Literacy
Measures
  • In your elbow groups:
  • Review the set of CBM literacy assessment tools
    in the handout.
  • Select a starter set of literacy measures by
    grade level that you would like your school to
    adopt. (If your school already has a standard set
    of CBM literacy tools, discuss ways to optimize
    its use.)

36
CBM: Developing a Process to Collect Local Norms
Jim Wright
www.interventioncentral.org
37
RTI: Literacy Assessment & Progress-Monitoring
  • To measure student response to
    instruction/intervention effectively, the RTI
    model measures students' academic performance and
    progress on schedules matched to each student's
    risk profile and intervention Tier membership.
  • Benchmarking/Universal Screening. All children in
    a grade level are assessed at least 3 times per
    year on a common collection of academic
    assessments.
  • Strategic Monitoring. Students placed in Tier 2
    (supplemental) reading groups are assessed 1-2
    times per month to gauge their progress with this
    intervention.
  • Intensive Monitoring. Students who participate in
    an intensive, individualized Tier 3 intervention
    are assessed at least once per week.

Source: Burns, M. K., & Gibbons, K. A. (2008).
Implementing response-to-intervention in
elementary and secondary schools: Procedures to
assure scientific-based practices. New York:
Routledge.
38
Local Norms: Screening All Students (Stewart &
Silberglit, 2008)
  • Local norm data in basic academic skills are
    collected at least 3 times per year (fall,
    winter, spring).
  • Schools should consider using curriculum-linked
    measures such as Curriculum-Based Measurement
    that will show generalized student growth in
    response to learning.
  • If possible, schools should consider avoiding
    curriculum-locked measures that are tied to a
    single commercial instructional program.

Source: Stewart, L. H., & Silberglit, B. (2008).
Best practices in developing academic local
norms. In A. Thomas & J. Grimes (Eds.), Best
practices in school psychology V (pp. 225-242).
Bethesda, MD: National Association of School
Psychologists.
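
Once screening scores are collected, local norms are typically expressed as
percentile ranks within a grade and season. A minimal sketch (the 3x/year
collection schedule is from the slide; the data and function name are
hypothetical):

```python
def percentile_rank(score: float, peer_scores: list[float]) -> float:
    """Percent of grade-level peer scores at or below the given score."""
    at_or_below = sum(1 for s in peer_scores if s <= score)
    return 100.0 * at_or_below / len(peer_scores)

# Example: fall Grade 2 oral reading fluency scores (words correct/minute)
# for one building; a student scoring 42 sits at the 40th local percentile.
grade2_fall_orf = [18, 25, 31, 42, 47, 55, 63, 71, 88, 104]
print(percentile_rank(42, grade2_fall_orf))  # -> 40.0
```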
39
Local Norms: Using a Wide Variety of Data
(Stewart & Silberglit, 2008)
  • Local norms can be compiled using:
  • Fluency measures such as Curriculum-Based
    Measurement.
  • Existing data, such as office disciplinary
    referrals.
  • Computer-delivered assessments, e.g., Measures of
    Academic Progress (MAP) from www.nwea.org.

Source: Stewart, L. H., & Silberglit, B. (2008).
Best practices in developing academic local
norms. In A. Thomas & J. Grimes (Eds.), Best
practices in school psychology V (pp. 225-242).
Bethesda, MD: National Association of School
Psychologists.
40
Measures of Academic Progress (MAP): www.nwea.org
41
Applications of Local Norm Data (Stewart &
Silberglit, 2008)
  • Local norm data can be used to:
  • Evaluate and improve the current core
    instructional program.
  • Allocate resources to classrooms, grades, and
    buildings where student academic needs are
    greatest.
  • Guide the creation of targeted Tier 2
    (supplemental intervention) groups.
  • Set academic goals for improvement for students
    on Tier 2 and Tier 3 interventions.
  • Move students across levels of intervention,
    based on performance relative to that of peers
    (local norms).

Source: Stewart, L. H., & Silberglit, B. (2008).
Best practices in developing academic local
norms. In A. Thomas & J. Grimes (Eds.), Best
practices in school psychology V (pp. 225-242).
Bethesda, MD: National Association of School
Psychologists.
42
Local Norms: Supplement With Additional Academic
Testing as Needed (Stewart & Silberglit, 2008)
  • "At the individual student level, local norm
    data are just the first step toward determining
    why a student may be experiencing academic
    difficulty. Because local norms are collected on
    brief indicators of core academic skills, other
    sources of information and additional testing
    using the local norm measures or other tests are
    needed to validate the problem and determine why
    the student is having difficulty. Percentage
    correct and rate information provide clues
    regarding automaticity and accuracy of skills.
    Error types, error patterns, and qualitative data
    provide clues about how a student approached the
    task. Patterns of strengths and weaknesses on
    subtests of an assessment can provide information
    about the concepts in which a student or group of
    students may need greater instructional support,
    provided these subtests are equated and reliable
    for these purposes." (p. 237)

Source: Stewart, L. H., & Silberglit, B. (2008).
Best practices in developing academic local
norms. In A. Thomas & J. Grimes (Eds.), Best
practices in school psychology V (pp. 225-242).
Bethesda, MD: National Association of School
Psychologists.
43
Steps in Creating a Process for Local Norming
Using CBM Measures
  • Identify personnel to assist in collecting data.
    A range of staff and school stakeholders can
    assist in the school norming, including:
  • Administrators
  • Support staff (e.g., school psychologist, school
    social worker, specials teachers,
    paraprofessionals)
  • Parents and adult volunteers
  • Field placement students from graduate programs

Source: Harn, B. (2000). Approaches and
considerations of collecting schoolwide early
literacy and reading performance data. University
of Oregon. Retrieved from
https://dibels.uoregon.edu/logistics/data_collection.pdf
44
Steps in Creating a Process for Local Norming
Using CBM Measures
  • Determine the method for screening data
    collection. The school can have teachers collect
    data in the classroom or designate a team to
    conduct the screening:
  • In-Class: Teaching staff in the classroom collect
    the data over a calendar week.
  • Schoolwide/Single Day: A trained team of 6-10
    sets up a testing area, cycles students through,
    and collects all data in one school day.
  • Schoolwide/Multiple Days: A trained team of 4-8
    either goes to classrooms or creates a central
    testing location, completing the assessment over
    multiple days.
  • Within-Grade: Data collectors at a grade level
    norm the entire grade, with students kept busy
    with another activity (e.g., a video) when not
    being screened.

Source: Harn, B. (2000). Approaches and
considerations of collecting schoolwide early
literacy and reading performance data. University
of Oregon. Retrieved from
https://dibels.uoregon.edu/logistics/data_collection.pdf
45
Steps in Creating a Process for Local Norming
Using CBM Measures
  • Select dates for screening data collection. Data
    collection should occur at minimum three times
    per year, in fall, winter, and spring. Consider:
  • Avoiding screening dates within two weeks of a
    major student break (e.g., summer or winter
    break).
  • Coordinating the screenings to avoid state
    testing periods and other major scheduling
    conflicts.

Source: Harn, B. (2000). Approaches and
considerations of collecting schoolwide early
literacy and reading performance data. University
of Oregon. Retrieved from
https://dibels.uoregon.edu/logistics/data_collection.pdf
46
Steps in Creating a Process for Local Norming
Using CBM Measures
  • Create a preparation checklist. Important
    preparation steps are carried out, including:
  • Selecting the location of the screening
  • Recruiting screening personnel
  • Ensuring that training occurs for all data
    collectors
  • Lining up data-entry personnel (e.g., for rapid
    computer data entry).

Source: Harn, B. (2000). Approaches and
considerations of collecting schoolwide early
literacy and reading performance data. University
of Oregon. Retrieved from
https://dibels.uoregon.edu/logistics/data_collection.pdf
47
Team Activity: Draft a Plan to Conduct an
Academic Screening in Your School or District
  • Directions:
  • Discuss a process for collecting screening data
    three times per year in your school.
  • What resources in your school can assist
    with these screenings?
  • What challenges do you anticipate, and how can
    you overcome them?

48
Monitoring Student Progress at the Secondary Level
Jim Wright
www.interventioncentral.org
49
Universal Screening at Secondary Schools: Using
Existing Data Proactively to Flag Signs of
Disengagement
  • "Across interventions, a key component to
    promoting school completion is the systematic
    monitoring of all students for signs of
    disengagement, such as attendance and behavior
    problems, failing courses, off track in terms of
    credits earned toward graduation, problematic or
    few close relationships with peers and/or
    teachers, and then following up with those who
    are at risk."

Source: Jimerson, S. R., Reschly, A. L., & Hess,
R. S. (2008). Best practices in increasing the
likelihood of high school graduation. In A.
Thomas & J. Grimes (Eds.), Best practices in
school psychology V (pp. 1085-1097). Bethesda,
MD: National Association of School Psychologists.
p. 1090.
50
Mining Archival Data: What Are the Early Warning
Flags of Student Drop-Out?
  • A sample of 13,000 students in Philadelphia was
    tracked for 8 years. These early warning
    indicators were found to predict student drop-out
    in the sixth-grade year:
  • Failure in English
  • Failure in math
  • Missing at least 20% of school days
  • Receiving an "unsatisfactory" behavior rating
    from at least one teacher

Source: Balfanz, R., Herzog, L., & MacIver, D. J.
(2007). Preventing student disengagement and
keeping students on the graduation path in urban
middle-grades schools: Early identification and
effective interventions. Educational
Psychologist, 42, 223-235.
51
What is the Predictive Power of These Early
Warning Flags?
Number of Early Warning Flags in Student Record | Probability That Student Would Graduate
None                                            | 56%
1                                               | 36%
2                                               | 21%
3                                               | 13%
4                                               | 7%
Source: Balfanz, R., Herzog, L., & MacIver, D. J.
(2007). Preventing student disengagement and
keeping students on the graduation path in urban
middle-grades schools: Early identification and
effective interventions. Educational
Psychologist, 42, 223-235.
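
Counting flags and reading off the published probabilities is mechanical.
A minimal sketch (the four flags and the probabilities come from the slides
above; the record format and function name are hypothetical):

```python
# Graduation probability by number of sixth-grade warning flags
# (Balfanz, Herzog, & MacIver, 2007).
GRADUATION_PROBABILITY = {0: 0.56, 1: 0.36, 2: 0.21, 3: 0.13, 4: 0.07}

def count_warning_flags(failed_english: bool, failed_math: bool,
                        pct_days_missed: float,
                        unsatisfactory_behavior: bool) -> int:
    """Count the four early warning flags present in a student record."""
    return sum([failed_english,
                failed_math,
                pct_days_missed >= 20.0,
                unsatisfactory_behavior])

flags = count_warning_flags(failed_english=True, failed_math=False,
                            pct_days_missed=25.0,
                            unsatisfactory_behavior=False)
print(flags, GRADUATION_PROBABILITY[flags])  # 2 flags -> 0.21
```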
52
Breaking Down Complex Academic Goals into Simpler
Sub-Tasks: Discrete Categorization
53
Identifying and Measuring Complex Academic
Problems at the Middle and High School Level
  • Students at the secondary level can present with
    a range of concerns that interfere with academic
    success.
  • One frequent challenge for these students is the
    need to break complex global academic goals down
    into discrete sub-skills that can be individually
    measured and tracked over time.

54
Discrete Categorization A Strategy for Assessing
Complex, Multi-Step Student Academic Tasks
  • Definition of Discrete Categorization: "Listing
    a number of behaviors and checking off whether
    they were performed" (Kazdin, 1989, p. 59).
  • The approach allows educators to define a larger
    behavioral goal for a student and to break that
    goal down into sub-tasks. (Each sub-task should
    be defined in such a way that it can be scored as
    "successfully accomplished" or "not
    accomplished.")
  • The constituent behaviors that make up the larger
    behavioral goal need not be directly related to
    each other. For example, "completed homework" may
    include as sub-tasks "wrote down homework
    assignment correctly" and "created a work plan
    before starting homework."

Source: Kazdin, A. E. (1989). Behavior
modification in applied settings (4th ed.).
Pacific Grove, CA: Brooks/Cole.
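
Discrete categorization amounts to a named goal plus a checklist of
pass/fail sub-tasks. A minimal sketch (the homework sub-task names echo the
example above; the class design is our own assumption):

```python
from dataclasses import dataclass, field

@dataclass
class DiscreteCategorization:
    """A behavioral goal broken into sub-tasks, each scored as
    accomplished (True) or not accomplished (False)."""
    goal: str
    subtasks: dict[str, bool] = field(default_factory=dict)

    def score(self, subtask: str, accomplished: bool) -> None:
        self.subtasks[subtask] = accomplished

    def percent_accomplished(self) -> float:
        if not self.subtasks:
            return 0.0
        return 100.0 * sum(self.subtasks.values()) / len(self.subtasks)

homework = DiscreteCategorization("completed homework")
homework.score("wrote down homework assignment correctly", True)
homework.score("created a work plan before starting homework", False)
print(homework.percent_accomplished())  # -> 50.0
```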
55
Discrete Categorization Example: Math Study Skills
  • General Academic Goal: Improve Tina's Math Study
    Skills
  • Tina was struggling in her mathematics course
    because of poor study skills. The RTI Team and
    math teacher analyzed Tina's math study skills
    and decided that, to study effectively, she
    needed to:
  • Check her math notes daily for completeness.
  • Review her math notes daily.
  • Start her math homework in a structured school
    setting.
  • Use a highlighter and margin notes to mark
    questions or areas of confusion in her notes or
    on the daily assignment.
  • Spend sufficient "seat time" at home each day
    completing homework.
  • Regularly ask math questions of her teacher.

56
Discrete Categorization Example: Math Study Skills
  • General Academic Goal: Improve Tina's Math Study
    Skills
  • The RTI Team, with teacher and student input,
    created the following intervention plan. The
    student Tina will:
  • Approach the teacher at the end of class for a
    copy of class notes.
  • Check her daily math notes for completeness
    against a set of teacher notes in 5th-period
    study hall.
  • Review her math notes in 5th-period study hall.
  • Start her math homework in 5th-period study hall.
  • Use a highlighter and margin notes to mark
    questions or areas of confusion in her notes or
    on the daily assignment.
  • Enter into her homework log the amount of time
    spent that evening doing homework and note any
    questions or areas of confusion.
  • Stop by the math teacher's classroom during help
    periods (T & Th only) to ask highlighted
    questions (or to verify that Tina understood that
    week's instructional content) and to review the
    homework log.

57
Discrete Categorization Example: Math Study Skills
  • Academic Goal: Improve Tina's Math Study Skills
  • General measures of the success of this
    intervention include (1) rate of homework
    completion and (2) quiz and test grades.
  • To measure treatment fidelity (Tina's
    follow-through with sub-tasks of the checklist),
    the following strategies are used:
  • Approached the teacher for a copy of class notes:
    Teacher observation.
  • Checked her daily math notes for completeness;
    reviewed math notes; started math homework in
    5th-period study hall: Student work products;
    random spot check by study hall supervisor.
  • Used a highlighter and margin notes to mark
    questions or areas of confusion in her notes or
    on the daily assignment: Review of notes by
    teacher during T/Th drop-in period.
  • Entered into her homework log the amount of
    time spent that evening doing homework and noted
    any questions or areas of confusion: Log reviewed
    by teacher during T/Th drop-in period.
  • Stopped by the math teacher's classroom during
    help periods (T & Th only) to ask highlighted
    questions (or to verify that Tina understood that
    week's instructional content): Teacher
    observation; student sign-in.