Title: Engaging Community Colleges: A First Look
Slide 1: CCSSE Workshop
Digging Deeper into CCSSE Results and Learning More about Student Engagement
Slide 2: Welcome!
- Samuel Echevarria, Senior Research Associate
- Erika Glaser, Research Associate
- Beiyi Cai, Research Associate
Slide 3: Important Questions
- How do you use CCSSE data?
- What are the strengths and weaknesses of CCSSE data?
- What other quantitative and/or qualitative data sources are available?
- How do you present CCSSE data?
- How do you create improvement plans or implement successful program changes based on your CCSSE data?
- How do you tie learning outcomes data to CCSSE engagement data?
Slide 4: Where to Begin?
- Know Thyself
- Learners
- Teachers
- Facilitators
- Community
- Resources
Slide 5: Agenda
- Benchmark Analysis
- Item Analysis
- Activity
Slide 6: Benchmarks
- Benchmarks provide context
- Decide which mean you would like to be compared with
- Benchmarks situate your results
- What does it mean to have 80% of your students satisfied?
- A good place to start, but not necessarily the end point
Slide 7: Benchmarking
- Five ways colleges can reach for excellence using CCSSE benchmarks:
- Compare themselves to the national average
- Compare themselves to high-performing colleges
- Measure subgroup differences in engagement
- Measure student engagement over time
- Measure subgroup differences over time
Slide 8: Benchmarking
Comparing Yourself to Others
Slide 9: The CCSSE Benchmarks
- The five CCSSE benchmarks:
- Active and Collaborative Learning (ACL)
- Student-Faculty Interaction (SFI)
- Academic Challenge (ACH)
- Support for Learners (SL)
- Student Effort (SE)
Slide 10: Starting with Report Benchmarks
- City Community College

  Benchmark   2005   2007
  SL          52.6   52.3
  SFI         45.1   51.6
  SE          48.5   51.5
  ACH         44.1   50.6
  ACL         43.2   50.5
Slide 11: Starting with Report Benchmarks
Slide 12: Converting to Raw Benchmarks
- Standardized vs. raw benchmarks
- Consideration of external versus internal comparisons
- Consideration of over-time change
- Consideration of subgroup differences
- CCSSE recommends expanded use of raw scores (i.e., scores that range from 0 to 1); a minimal sketch follows this slide
- Remember to utilize the iweight variable
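A minimal sketch of this raw-score approach in Python/pandas, assuming a respondent-level extract where the Academic Challenge items (named on the later item slides) are columns and iweight is the weight variable referenced above. The file name, the linear 0-1 rescaling, and the column spellings are illustrative assumptions, not the official CCSSE layout.

    import numpy as np
    import pandas as pd

    # Hypothetical respondent-level extract; file name and column spellings are
    # illustrative. Only iweight is taken directly from the slide above.
    df = pd.read_csv("ccsse_2007_respondents.csv")

    # Academic Challenge items as named on the item-analysis slides.
    ach_items = ["exams", "envschol", "analyze", "perform", "synthesz",
                 "applying", "workhard", "evaluate", "readasgn", "writeany"]

    def rescale_0_1(item: pd.Series) -> pd.Series:
        """Linearly rescale an ordinal item to the 0-1 range (assumed rescaling)."""
        return (item - item.min()) / (item.max() - item.min())

    # Respondent-level raw benchmark: mean of the rescaled items.
    df["ach_raw"] = df[ach_items].apply(rescale_0_1).mean(axis=1)

    # College-level raw benchmark: weighted mean using the iweight variable.
    valid = df.dropna(subset=["ach_raw", "iweight"])
    ach_raw = np.average(valid["ach_raw"], weights=valid["iweight"])
    print(f"Academic Challenge (raw, weighted): {ach_raw:.2f}")

Scores computed this way fall on the same 0-to-1 scale as the raw benchmark tables on the following slides.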
Slide 13: Continuing with Raw Benchmarks
- City Community College raw benchmark scores (the effect-size computation is sketched after the table)

  Benchmark   2005   2007   Change   (Effect Size)
  SFI (2)     0.32   0.38   6.4      (0.34)
  ACL (5)     0.31   0.36   4.9      (0.30)
  ACH (4)     0.50   0.56   5.2      (0.28)
  SE (3)      0.44   0.46   2.2      (0.13)
  SL (1)      0.42   0.43   0.8      (0.04)
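The effect sizes in the table are standardized mean differences. Below is a minimal sketch of one common way to compute them (Cohen's d with a pooled standard deviation), under the assumption that this matches the convention behind the report; the data-frame and column names are illustrative.

    import numpy as np
    import pandas as pd

    def cohens_d(x_2005: pd.Series, x_2007: pd.Series) -> float:
        """Standardized mean difference for the 2005-to-2007 change,
        using a pooled standard deviation (assumed convention)."""
        x_2005, x_2007 = x_2005.dropna(), x_2007.dropna()
        n1, n2 = len(x_2005), len(x_2007)
        pooled_var = ((n1 - 1) * x_2005.var(ddof=1) +
                      (n2 - 1) * x_2007.var(ddof=1)) / (n1 + n2 - 2)
        return (x_2007.mean() - x_2005.mean()) / np.sqrt(pooled_var)

    # Example: raw Student-Faculty Interaction scores from two hypothetical
    # respondent-level frames, df_2005 and df_2007.
    # d = cohens_d(df_2005["sfi_raw"], df_2007["sfi_raw"])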
Slide 14: Selecting Raw Items for Analysis
- Digging into a raw benchmark can uncover interesting differences in items.
- Academic Challenge raw item scores:

  Item       2005   2007
  Exams      0.65   0.67
  Envschol   0.64   0.66
  Analyze    0.55   0.61
  Perform    0.54   0.61
  Synthesz   0.50   0.57
  Applying   0.50   0.55
  Workhard   0.49   0.53
  Evaluate   0.46   0.51
  Readasgn   0.39   0.45
  Writeany   0.38   0.40
Slide 15: Referring to the Benchmark Report
- Readasgn, Writeany, Exams
Slide 16: Selecting Raw Items for Over-Time Analysis
- Digging into a raw benchmark can uncover interesting differences in items; a sketch of the item-level calculation follows the table.
- Academic Challenge raw scores and effect sizes:

  Item       2005   2007   Effect Size   (Change)
  Exams      0.65   0.67   0.09          (1.9)
  Envschol   0.64   0.66   0.10          (2.9)
  Analyze    0.55   0.61   0.23          (6.8)
  Perform    0.54   0.61   0.26          (7.7)
  Synthesz   0.50   0.57   0.23          (6.7)
  Applying   0.50   0.55   0.16          (4.8)
  Workhard   0.49   0.53   0.11          (3.1)
  Evaluate   0.46   0.51   0.18          (5.4)
  Readasgn   0.39   0.45   0.20          (5.6)
  Writeany   0.38   0.40   0.07          (2.1)
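A sketch of the item-level version of the same comparison, looping over the Academic Challenge items listed above. The two input files, their layout, and the pooled-SD effect size are assumptions for illustration; iweight weighting is omitted here for brevity.

    import numpy as np
    import pandas as pd

    # Hypothetical files of 0-1 rescaled Academic Challenge items, one per
    # administration; file names and layout are illustrative.
    df05 = pd.read_csv("ccsse_2005_raw_items.csv")
    df07 = pd.read_csv("ccsse_2007_raw_items.csv")

    items = ["exams", "envschol", "analyze", "perform", "synthesz",
             "applying", "workhard", "evaluate", "readasgn", "writeany"]

    rows = []
    for item in items:
        x05, x07 = df05[item].dropna(), df07[item].dropna()
        pooled_sd = np.sqrt(((len(x05) - 1) * x05.var(ddof=1) +
                             (len(x07) - 1) * x07.var(ddof=1))
                            / (len(x05) + len(x07) - 2))
        rows.append({"item": item,
                     "mean_2005": round(x05.mean(), 2),
                     "mean_2007": round(x07.mean(), 2),
                     "effect_size": round((x07.mean() - x05.mean()) / pooled_sd, 2)})

    # Sort so the largest over-time shifts surface first, as in the table above.
    print(pd.DataFrame(rows).sort_values("effect_size", ascending=False))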
Slide 17: Comparing Raw Items and Effect Sizes
Slide 18: Subgroup Approaches
- It is important to identify key subgroups based on knowledge of student demographics, community factors, and other at-risk identifiers available to you.
- CCSSE data include many important student demographic attributes (e.g., age, sex, race/ethnicity, marital status, parents' education).
- CCSSE also includes academic variables (e.g., developmental status, credit hours, credential seeker).
- Both research and institutional knowledge should inform these exploratory analyses (e.g., transfer intention, student loan status, orientation program/course, learning community experience). A sketch of a subgroup comparison follows this list.
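A minimal sketch of a subgroup comparison under the same assumptions as the earlier sketches: a respondent-level frame with a raw benchmark score, the iweight variable, and a demographic column. The grouping column name is hypothetical.

    import numpy as np
    import pandas as pd

    # Hypothetical respondent-level frame; "developmental_status" stands in for
    # whichever demographic or academic variable the college chooses.
    df = pd.read_csv("ccsse_2007_respondents.csv")

    def weighted_mean(group: pd.DataFrame, value_col: str,
                      weight_col: str = "iweight") -> float:
        """iweight-weighted mean of a raw score within one subgroup."""
        group = group.dropna(subset=[value_col, weight_col])
        return np.average(group[value_col], weights=group[weight_col])

    # Raw Academic Challenge score by subgroup.
    by_group = (df.groupby("developmental_status")
                  .apply(weighted_mean, value_col="ach_raw")
                  .rename("ach_raw_weighted_mean"))
    print(by_group)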
Slide 19: Selecting Raw Items for Subgroup Analysis
- Digging into a raw benchmark can uncover interesting differences in items.
- Academic Challenge raw scores and effect sizes:

  Item       2005   2007   Effect Size   (Change)
  Exams      0.65   0.67   0.09          (1.9)
  Envschol   0.64   0.66   0.10          (2.9)
  Analyze    0.55   0.61   0.23          (6.8)
  Perform    0.54   0.61   0.26          (7.7)
  Synthesz   0.50   0.57   0.23          (6.7)
  Applying   0.50   0.55   0.16          (4.8)
  Workhard   0.49   0.53   0.11          (3.1)
  Evaluate   0.46   0.51   0.18          (5.4)
  Readasgn   0.39   0.45   0.20          (5.6)
  Writeany   0.38   0.40   0.07          (2.1)
Slide 20: Selecting Subgroups
- City Community College
- Frequencies
Slide 21: Selecting Subgroups
- City Community College
- Means: PERFORM
Slide 22: Looking at Subgroup Effect Sizes over Time (2005 to 2007)
Slide 23: Looking at Subgroup Effect Sizes in Combination over Time
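A sketch of how subgroup effect sizes over time, and in combination, might be computed: stack the 2005 and 2007 respondent files, group by one or more demographic columns, and compute the standardized mean difference within each cell. The file, column, and grouping names are illustrative.

    import numpy as np
    import pandas as pd

    # Hypothetical stacked file containing both administrations, with a "year"
    # column, a raw Academic Challenge score, and demographic columns.
    df = pd.read_csv("ccsse_2005_2007_stacked.csv")

    def cohens_d(before: pd.Series, after: pd.Series) -> float:
        """Standardized mean difference with a pooled SD (assumed convention)."""
        before, after = before.dropna(), after.dropna()
        pooled_var = ((len(before) - 1) * before.var(ddof=1) +
                      (len(after) - 1) * after.var(ddof=1)) / (len(before) + len(after) - 2)
        return (after.mean() - before.mean()) / np.sqrt(pooled_var)

    # Effect size of the 2005-to-2007 change within each subgroup combination
    # (here sex crossed with developmental status, both illustrative names).
    results = []
    for (sex, dev), grp in df.groupby(["sex", "developmental_status"]):
        d = cohens_d(grp.loc[grp["year"] == 2005, "ach_raw"],
                     grp.loc[grp["year"] == 2007, "ach_raw"])
        results.append({"sex": sex, "developmental_status": dev,
                        "effect_size": round(d, 2)})

    print(pd.DataFrame(results))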
Slide 24: More Group Differences
Comparison of Different Groups
Slide 25: Questions?
Slide 26: Activity
- Groups should address the following questions:
- How does your institution use CCSSE data?
- In your experience working with CCSSE data, what are the strengths and weaknesses of these data?
- What other quantitative and/or qualitative data sources are available at your college? Have you found support for your CCSSE results?
- Explain how your college shares its CCSSE results and any challenges associated with presenting CCSSE data to others in the college community. Based on your experience, what suggestions or tips would you give to others presenting CCSSE data?
- Has anyone created improvement plans or implemented successful program changes based on your CCSSE data? How did you accomplish this? Which data elements did you use to support your decision?
- Has your college successfully tied learning outcomes data to CCSSE engagement data? How did you accomplish this? What did you find?