Title: Data, Exhibits and Performance-based Assessment Systems
1. Data, Exhibits and Performance-based Assessment Systems
- David C. Smith, Dean Emeritus
- College of Education
- University of Florida
- Email: smithdc_at_aol.com
2. Accountability and Assessment Are With Us
- It is obvious that we are in an age of accountability, and that inevitably involves assessment. Like the Ice Age, there is little reason to believe that it will pass quickly.
3. We Need Data
- What data do you have to support your convictions and contentions?
- How do you respond to important questions regarding program effectiveness and efficiency when you are asked, "What evidence do you have, and how do you know?"
4. Often the Right Behavior for the Wrong Reasons
- Think about assessment and the development of an assessment system as an opportunity rather than a problem or burden. (Not NCATE.)
5. Your Mind-set
- An assessment system will not tell you what you should do.
- Data-driven decisions.
- Data-informed decisions.
6. Essential Considerations in Design
- Have you thought deeply about the purpose, mission, and vision of the organization and related them to the assessment system?
- Is your conceptual framework reflected in your assessment system?
- There are implications for what you choose to include in your assessment system.
7. Assessment System Issues to Consider
- Will (do) you have a blueprint or framework (design) for your assessment system?
- What criteria will you use for creating it?
- What does (will) it look like?
- How was (or will it be) created?
- Consider the language in Standard 2.
8. Inevitable Tension in Assessment
- The need to accommodate your situation.
- The need to compare with others in similar situations.
- It is necessary to compare within and across institutions. (Internal relative productivity and comparison with counterpart units.)
9. Multiple Measures
- Multiple measures can be valuable. Intentional redundancy can be critical. (Aircraft instruments.)
- Sometimes it is a matter of perspective. It is valuable to look at a problem from more than one angle. (Headcount and FTE faculty and candidates.)
- Sometimes it is a matter of timing. What are the key points at which to assess? (At a minimum: entrance, exit, and follow-up.)
10. Problems That I See Regularly
- People have difficulty in creating an assessment system.
- People think more about collecting data than they do about the structure of their assessment system and the kinds of data that they include in it.
- They often want to do it for the wrong reason: for accreditation, rather than seeing it as a tool to evaluate and improve what they are doing.
11. Other Problems
- They have difficulty in using candidate data three ways. (The aggregation issue.)
- People are not aware of meaningful data that already exist and can be imported into their assessment system. Then they can focus their effort on data that they need to generate.
- People often do not know how to use data well.
12. Other Problems (continued)
- People often do not consider examining relationships among data sets. (FTE and headcount enrollment, enrollment and the cost to generate a SCH; see the sketch below.)
- Time is a problem. It is not realistic to expect that busy people can create and maintain an assessment system on top of everything else. It is very difficult to develop, implement, maintain, and revise an assessment system without additional resources.
- Resources, human and technological, are needed. The allocation of resources is a measure of institutional priority.
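A minimal sketch of the kind of relationship just described, written in Python. The programs, enrollment figures, and budgets are hypothetical, invented purely for illustration; a real system would import them from institutional records.

```python
# Hypothetical figures only: relating headcount to FTE enrollment and
# computing the cost to generate a student credit hour (SCH).
programs = [
    # (program, headcount, FTE, SCH produced, instructional budget in $)
    ("Elementary Education", 240, 180.0, 5400, 1_350_000),
    ("Special Education",     95,  60.5, 1815,   544_500),
]

for name, headcount, fte, sch, budget in programs:
    ratio = headcount / fte        # runs well above 1.0 in part-time-heavy programs
    cost_per_sch = budget / sch    # one crude measure of relative productivity
    print(f"{name}: headcount/FTE = {ratio:.2f}, cost per SCH = ${cost_per_sch:,.2f}")
```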
13. Collecting and Using Data
- It is one thing to collect data.
- It is another thing to be discriminating in collecting data.
- And still another thing to know how to use data.
14. Proactive Data
- We are not good at being proactive in generating data, and we are not good at being creative in generating data.
- Be proactive: give people information that they do not ask for but that informs them more deeply about the effectiveness of your organization.
- Think carefully about what creative and informative data you might want to include in your assessment system.
15. Aggregation and Design Issues - Timing
- Admission.
- Early in the program.
- Mid-program.
- Pre-student teaching.
- Exit.
- Follow-up.
16. Aggregation and Design Issues - Content
- Candidate.
  - Demographic.
  - Qualitative.
  - Performance.
    - Knowledge.
    - Skills.
    - Dispositions.
  - Evidence of a positive effect on student learning.
- Resources and Productivity.
  - People.
  - Budget.
  - Space.
  - Equipment.
17. Aggregation and Design Issues - Levels of Data
- Course / Faculty.
- Program.
- Department / Cost Center.
- Unit.
- Institution.
18. Aggregation and Design Issues - Sets and Sub-sets
- [Diagram: nested sets of data. Course/faculty data roll up into programs; programs into departments and cost centers; departments into units and support centers; and units into the institution.]
19. Candidate Performance Assessment
- Choose a question. (Knowledge, Skills, or Dispositions.)
- How would you measure individual performance?
- How would you aggregate the data to the program and the unit? (A sketch of one approach follows this slide.)
- If appropriate, how would you compare the unit data with parallel institutional data?
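One possible shape for that aggregation, sketched in Python. The candidate names, programs, and rubric scores are hypothetical; a real system would draw them from the unit's assessment database.

```python
# Hypothetical sketch: rolling individual candidate rubric scores up to
# the program and unit levels (the "three ways" aggregation issue).
from statistics import mean

# (candidate, program, score) -- e.g., a 1-4 lesson-planning rubric
scores = [
    ("A. Lee",   "Elementary Education", 3.5),
    ("B. Ortiz", "Elementary Education", 2.8),
    ("C. Patel", "Special Education",    3.9),
    ("D. Brown", "Special Education",    3.1),
]

# Individual level: each candidate's own score stands on its own.
# Program level: mean score per program.
by_program = {}
for _, program, score in scores:
    by_program.setdefault(program, []).append(score)
for program, vals in sorted(by_program.items()):
    print(f"{program}: mean = {mean(vals):.2f} (n = {len(vals)})")

# Unit level: mean across all candidates in the unit.
print(f"Unit: mean = {mean(s for _, _, s in scores):.2f} (n = {len(scores)})")
```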
20. Knowledge
- The candidates are well grounded in the content they teach.
- The candidates possess the professional knowledge to practice competently.
- The candidates possess technological knowledge for professional and instructional purposes.
21. Skills
- The candidates can plan an effective lesson.
- The candidates can give timely and effective feedback to their students.
- The candidates appropriately address the needs of diverse and special-needs students.
- The candidates have a positive effect on student learning.
22. Dispositions
- The candidates have a passion for teaching.
- The candidates genuinely care about their students.
- The candidates believe that all their students can learn.
- The candidates are reflective practitioners.
23. Informing Through Exhibits
- Provide data through exhibits.
  - The conceptual framework.
  - Evidence of candidate performance.
  - Portfolios.
  - Evidence of a positive effect on student learning.
- Pictures are worth thousands of words.
  - Clinical sites.
  - Maps.
  - Posters of events.
24. Exhibits Reflect a Climate
- Exhibits can be user-friendly.
- Access to documents.
- Electronic support.
  - Videotapes.
  - Workstations.
  - CDs.
- Creature comforts.
- Pictures of campus events.
- Faculty publications.
- Location, location, location.
25. Everything is not easily measured.
- It doesn't make sense to think that you have to measure with a micrometer if you are going to mark with a piece of chalk and cut with an axe.
26.
- Do not make high-stakes decisions based on soft data.
- Consider directionality in analyzing data.
27.
- What matters and what matters most? (The need to know and the nice to know.)
- There are major implications for assessment system design and data elements.
28.
- Some of the least valuable data are the most easily gathered.
- Some of the most important things may be the most difficult to measure.
29.
- What you do not measure is a profound statement about what you do not value.
30.
- People in an organization focus on what is measured, not on what is said to be important.
- Consider the impact of single measures of performance in P-12 schools.
31. Assessing Your Assessment System
- What data will you include?
  - How essential is it?
  - How important is it?
- In considering your data:
  - How will you collect it?
  - How will you analyze it?
  - How will you use it?
32. Assessing Your Assessment System
- Is your assessment system too large?
- Is your assessment system too small?
- Does it have the data you need?
- Does it have data you do not use?
33.
- Creating an assessment system is a creative task; it is also tedious and time-consuming.