Project VIABLE: Teacher Preference Assessment of Direct Behavior Ratings (DBRs)
Ajlana Music, T. Chris Riley-Tillman, Sandra M. Chafouleas, & Theodore J. Christ
East Carolina University, University of Connecticut, University of Minnesota
Project VIABLE
This study represents one of the investigations initiated under Project VIABLE, which directs empirical attention toward the development and evaluation of formative measures of social behavior based on direct behavior ratings (DBRs). The goal of Project VIABLE is to examine the DBR through three phases of investigation: (1) foundations of measurement, (2) decision making and validity, and (3) feasibility.
RESULTS

A summary of participant responses regarding preferred DBR features is reported in Table 1. Participants ranked the 100-point scale as the one they were most likely to use (28%), followed by a 10-point scale (24%), a 3-point scale (13%), a 6-point scale (4%), and a 20-point scale (1%). The majority of respondents (51%) ranked the teacher as the most appropriate rater to complete a DBR. Nearly half of the teachers (45%) ranked 30 minutes as the preferred length of observation, followed by a full-day observation (24%). With regard to classes of academic and behavior problems, respondents indicated that severe academic (43%), severe behavior (35%), and minor behavior (35%) problems were most appropriate to be monitored with a DBR. In addition, 43% of participants ranked DBRs as most suited for high-stakes decision making.

Mean ratings of participant DBR preference statements are presented in Table 2. As noted, participants preferred a 10-point rating scale (M = 3.17, SD = 1.28) followed by a 100-point rating scale (M = 2.98, SD = 1.47). More teachers preferred to rate students when the behavior was worded in a positive manner (M = 3.82, SD = 1.05) as opposed to a negative manner (M = 2.45, SD = 1.26). The majority of participants preferred a 30-minute observation (M = 3.38, SD = 1.37) followed by a 60-minute observation (M = 3.17, SD = 1.28), rating one behavior at a time (M = 4.02, SD = 1.09) followed by two behaviors (M = 3.19, SD = 1.00), and rating once a day (M = 3.38, SD = 1.32) followed by once a week (M = 2.99, SD = 1.12). Preferred terms used to describe a DBR included Daily Progress Report (M = 3.20, SD = 1.21), Behavior Report Card (M = 3.13, SD = 1.08), and Direct Behavior Rating (M = 3.06, SD = 1.36).

Mean ratings of participant acceptability regarding the use of DBRs as an assessment tool are presented in Table 3. Participants were most willing to rate two students at once (M = 3.03, SD = 1.16), using a continuous line (M = 3.50, SD = 1.10) with descriptive anchors (M = 3.84, SD = 0.95). In addition, participants indicated that DBRs are more effective for making medium-stakes special education decisions (M = 3.18, SD = 0.75) and for rating state behaviors (M = 3.46, SD = 0.89).
Table 1. Preference of DBR Features. Note. N = 104; participants were asked to rank order the items from (1) most to (5) least or from (1) most to (3) least, and responses ranked as most (1) are presented.

Table 2. Mean and Standard Deviation of DBR Preference Statements. Note. N = 104; participants responded on a 5-point Likert scale ranging from strongly disagree (1) to strongly agree (5).

Table 3. Mean and Standard Deviation of a DBR as an Assessment Tool. Note. N = 104; participants responded on a 5-point Likert scale ranging from strongly disagree (1) to strongly agree (5).
INTRODUCTION
A DBR refers to the rating of a specified behavior at least daily and the sharing of that information with someone other than the rater. In a review by Chafouleas and colleagues (2002), it was suggested that DBRs may be feasible, acceptable, and effective in promoting positive student behavior, and may provide a way to increase parent-teacher communication. Despite these characteristics and frequent reference in the literature, to date only one study has formally examined the acceptability and reported use of DBRs among a national sample of teachers (Chafouleas, Riley-Tillman, & Sassu, 2006). The findings indicated that the format of DBRs varies greatly, suggesting that teachers have found DBRs to be highly adaptable, representing a broad array of possibilities rather than having a single, scripted purpose. In addition, the results supported previous claims that the DBR is both a used and an accepted tool in practice, suggesting that DBRs deserve closer attention in research and practice related to positive behavior supports. Considering that acceptability has been hypothesized to be related to use as well as to fidelity of implementation, acceptability is an important concept to explore. Information about different facets of acceptability and current use among teachers will continue to aid in understanding how to better incorporate DBRs in practice, as well as provide directions for future research. The purpose of the present study was, in part, to replicate the previous study by Chafouleas, Riley-Tillman, and Sassu (2006), as well as to investigate the features of DBRs that are most preferred among teachers following a short training.
DISCUSSION

Consistent with the previous study by Chafouleas, Riley-Tillman, and Sassu (2006), the majority of teachers preferred being responsible for completing the DBR form. More teachers preferred rating students when the behavior was worded in a positive manner, and completing ratings once per day after a specific block of time (as opposed to the entire day). Finally, the teachers in this study preferred the terms Daily Progress Report, Behavior Report Card, and Direct Behavior Rating, whereas in the previous study DBRs were most often referred to by some other name (e.g., Daily Report Card, Home-School Note). Overall, respondents perceived DBRs to be an acceptable tool for assessment purposes, and the findings support directions related to the development and evaluation of single-item DBR scales. Future research might examine the length and content of training sessions (e.g., with and without feedback) in order to explore effects on the accuracy of ratings in natural classroom settings.
MATERIALS & METHODS
Participants were 104 elementary school teachers employed by public schools in the southeastern region. Six workshops were organized within the schools, at which a doctoral student presented a 10-minute PowerPoint overview of DBRs and the standard DBR form prior to distributing the preference assessment. The assessment packet was created to assess participant acceptability of DBR features. The packet asked participants to provide demographic information and to answer questions regarding the acceptability of the DBR as an assessment tool. Acceptability was assessed through a 5-item scale that incorporated a 5-point Likert scale with responses ranging from strongly disagree (1) to strongly agree (5), as well as rank ordering of items from (1) most to (5) least.
CONTACTS
For additional information, please direct all correspondence to Chris Riley-Tillman at rileytillmant@ecu.edu or Ajlana Music at AM0828@ecu.edu. Preparation of this poster was supported by a grant from the Institute of Education Sciences (IES), U.S. Department of Education (R324B060014).
REFERENCES

Chafouleas, S. M., Riley-Tillman, T. C., & McDougal, J. (2002). Good, bad, or in-between: How does the daily behavior report card rate? Psychology in the Schools, 39, 157-169.
Chafouleas, S. M., Riley-Tillman, T. C., & Sassu, K. A. (2006). Acceptability and reported use of daily behavior report cards among teachers. Journal of Positive Behavior Interventions, 8(3), 174-182.