Title: Experiences with Active and Collaborative Learning
1. Experiences with Active and Collaborative Learning
- PACISE 2005
- Bloomsburg University
- Tom Briggs
- thb_at_ship.edu
2. Introduction
- Active and Collaborative Learning
- Students interact directly with material
- 70% increase in long-term retention
- Reduced drop out rate
3. Active Learning in CS
- Commonly reserved for intro courses
- Used for solving a problem or writing code
- CS is a contact sport
- What about upper division courses?
4. Felder-Silverman Learning Styles
- Four groups of learning styles
- Active and Reflective
- Sensing and Intuitive
- Visual and Verbal
- Sequential and Global
- Identifies preferred learning style
5. Active vs. Reflective
- Active Learners
- Prefer concrete knowledge
- Hands-On
- Interactive
- Reflective Learners
- Prefer abstract concepts and theory
6. Sensing and Intuition
- Sensing Learners
- Learning facts and concepts
- Intuitive Learners
- Prefer possibilities, applications, and relationships
7. Visual and Verbal
- Visual Learners
- Prefer visual representations of material they can see
- Charts, graphs, figures
- Verbal Learners
- Prefer words, either spoken or written
8. Sequential and Global
- Sequential Learners
- Follow material in a step-by-step sequence
- Global Learners
- Need material in the context of its domain
- Need to understand the relationships between new and old material
9. Computer Science Students
10. CS Students' Preferences
- The demographics
- 83% of CS students are visual learners
- 55% were active learners
- Implications for in-class time
11. Simple Active Technique
- Active does not imply difficult
- Ask students to interact with material
- Solve a problem
- Sketch a proof
- Trace a section of code
- Ask students to break into groups
- Set a time limit to complete task
- Call on a few groups to share solutions
- Ask students to judge goodness of solutions
12. What does this do?
- Breaks the sequential flow of a lecture
- Students interact with material and their peers
- Try out their conceptual understanding of the material
- Get immediate feedback
- Students get a break from information assault
- Provides time for students to cognitively process knowledge
13. Active Learning in CS
- Most literature addresses intro courses
- Usually describes using code review / peer programming
- Advanced courses?
- Frequently taught as
- abstract facts / theory courses
- straight lectures
14. Operating Systems
- Background
- Introduction to operating system concepts
- Juniors and Seniors
- Taught two sections (≈30 each)
- Four credit course
15. Worksheets
- Frequently made use of in-class worksheets
- Example
- Computing system utilization with I/O (worked example after this list)
- Described the context of the equations (global)
- Relationship of equations (intuitive)
- Guided exploration, concrete examples (active)
- Worksheet and lecture slides (visual)
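The slides do not reproduce the worksheet's equations, so the following is only a sketch of the kind of calculation it guided, assuming the standard multiprogramming model: with n processes in memory, each waiting on I/O a fraction p of the time, CPU utilization is roughly 1 - p^n. The numbers are illustrative, not taken from the worksheet.

#include <stdio.h>
#include <math.h>

/* Sketch (not the actual worksheet): CPU utilization under the
 * standard multiprogramming model, utilization = 1 - p^n, where
 * p is the fraction of time a process spends waiting on I/O and
 * n is the degree of multiprogramming.  Values are illustrative. */
int main(void)
{
    double p = 0.80;                  /* 80% of time spent waiting on I/O */
    for (int n = 1; n <= 8; n++)      /* number of processes in memory */
        printf("n = %d  utilization = %.2f\n", n, 1.0 - pow(p, n));
    return 0;
}

Compiled with gcc and -lm, the printed table makes the diminishing returns of adding more processes concrete, which is the guided-exploration step the worksheet aimed at.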
16. Hypothesis Testing
- Abstract material difficult for students
- Ad-hoc, instructor-led demonstrations
- Example: mmap() system call
- Student question provoked discussion (open/fopen)
- Lecture / slides put aside
- Led class to develop a hypothesis (global/intuitive)
- Created code to test hypothesis (visual)
- Students helped prof look up system calls
- Students ran program (truss) (active)
17. System Calls
- In the previous example, the truss command:
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/stat.h>

int main(int argc, char *argv[])
{
    int x;
    FILE *fp;
    int in = open("test.c", S_IREAD);   /* S_IREAD (0400) passed as a flag; on Linux it equals O_NOCTTY, visible in the trace */
    read(in, &x, sizeof(in));
    close(in);
    fp = fopen("test.c", "r");
    fread(&x, sizeof(int), 1, fp);
    fclose(fp);
    return 0;
}

gcc test.c -o test
truss ./test

open("test.c", O_RDONLY|O_NOCTTY)                    = 3
read(3, "#inc", 4)                                   = 4
close(3)                                             = 0
brk(0)                                               = 0x86da000
brk(0x86fb000)                                       = 0x86fb000
brk(0)                                               = 0x86fb000
open("test.c", O_RDONLY)                             = 3
fstat64(3, {st_mode=S_IFREG|0600, st_size=292, ...}) = 0
mmap2(NULL, 32768, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0xb75f0000
read(3, "#include \n#include "...                    (truncated on the slide)
munmap(0xb75f0000, 32768)                            = 0
18. Student Perception
- Threading and context switches
- Class Lecture
- Read and discussed theory (sensing)
- Discussed different OS implementations (intuition)
- Students challenged: which is faster?
- Small group discussions (active)
- Led to develop a hypothesis to test (active/intuition)
- Out-of-class assignment
- Implement test, collect results, submit graphs (visual); a sketch follows this list
- Small groups compare (varied) results (active)
- Group presents one set of results
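The assignment's actual test program is not in the slides; the sketch below is one plausible shape for it, under the assumption that "which is faster" was answered by timing N fork/wait cycles against N pthread_create/join cycles. Loop counts and structure are illustrative only, not the instructor's code.

#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <sys/wait.h>
#include <pthread.h>
#include <time.h>

#define N 1000                                   /* iterations per test (arbitrary) */

static void *worker(void *arg) { return arg; }   /* trivial thread body */

static double elapsed(struct timespec a, struct timespec b)
{
    return (b.tv_sec - a.tv_sec) + (b.tv_nsec - a.tv_nsec) / 1e9;
}

int main(void)
{
    struct timespec t0, t1;

    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int i = 0; i < N; i++) {                /* create and reap a process */
        pid_t pid = fork();
        if (pid == 0)
            _exit(0);
        waitpid(pid, NULL, 0);
    }
    clock_gettime(CLOCK_MONOTONIC, &t1);
    printf("fork/wait:           %.3f s\n", elapsed(t0, t1));

    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int i = 0; i < N; i++) {                /* create and join a thread */
        pthread_t tid;
        pthread_create(&tid, NULL, worker, NULL);
        pthread_join(tid, NULL);
    }
    clock_gettime(CLOCK_MONOTONIC, &t1);
    printf("pthread_create/join: %.3f s\n", elapsed(t0, t1));
    return 0;
}

Built with gcc -pthread, repeated runs give slightly different numbers, which is exactly what made the small-group comparison of varied results interesting.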
19. Evaluation & Synthesis
- Active environment
- Peer review of work
- Challenge pre-conceived beliefs
- Evaluate goodness of results
- Hypothesis testing leads to new results
- Evaluation & Synthesis
- Highest levels of Bloom's taxonomy
20. Computer Organization
- Assembly Programming, CPU Architecture, ILP, Memory, I/O
- Sophomores with CS1 and CS2 experience
- Taught two sections (≈20 each)
- Four credit course
21. Differences from OS
- Students lack extensive background
- Most had CS1 and CS2
- Discrete Math, some Prob. & Stat.
- Sophomores
- Relied on more structured / guided activities
- Fewer expectations of independent thinking
- Computer Organization
- Use of simulators and counters
- Focus on architecture
- Closer to familiar hardware
22. Worksheets
- 15 worksheets
- Guided students through various activities
- Deriving and using Amdahl's Law (worked example after this list)
- Observing and computing speed-ups
- Researching processor specifications
- Building assembly programs
- Use simulators and counters to observe machines
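Amdahl's Law itself is standard; the specific worksheet exercises are not reproduced in the slides, so the snippet below simply works one instance of speedup = 1 / ((1 - f) + f / s) with made-up numbers.

#include <stdio.h>

/* Amdahl's Law: overall speedup when a fraction f of execution time
 * is accelerated by a factor s.  The inputs below are illustrative. */
static double amdahl(double f, double s)
{
    return 1.0 / ((1.0 - f) + f / s);
}

int main(void)
{
    /* e.g. 60% of the program benefits from a 4x faster unit */
    printf("f = 0.60, s = 4:    speedup = %.2f\n", amdahl(0.60, 4.0));
    /* even an infinite speedup of that 60% is capped at 1/(1-f) */
    printf("f = 0.60, s -> inf: limit   = %.2f\n", 1.0 / (1.0 - 0.60));
    return 0;
}

The first line prints 1.82 and the limit line prints 2.50, the kind of "observe and compute the speed-up" step the worksheets walked through.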
23. Simulators
- Simulators
- software to simulate a physical system
- SimpleScalar tool chain
- Free MIPS R4000 32-bit simulator
- GNU C compiler and binutils (cross compiler)
- Different execution models
- Simple, no ILP
- Pipeline, no cache
- Pipeline and cache, in-order execution
- Pipeline, cache, and speculative execution (ROB)
24. Counters
- Counters
- Machine status registers
- Pentium (RDMSR/WRMSR)
- UltraSPARC v8, v9 CPU control masks
- Software configures events
- Track execution of a real program on real hardware (cycle-counter sketch below)
- Stochastic element
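The MSR-based event counters above need privileged RDMSR/WRMSR access, normally set up through a driver or library, so the sketch below substitutes the user-readable x86 time-stamp counter to show the same flavor of measurement: counting hardware ticks around a region of code. It assumes gcc on x86/x86-64 and is an illustration, not the course's counter setup.

#include <stdio.h>
#include <stdint.h>

/* Read the x86 time-stamp counter (user-accessible, unlike the MSRs
 * on the slide, which require privileged RDMSR/WRMSR). */
static inline uint64_t rdtsc(void)
{
    uint32_t lo, hi;
    __asm__ __volatile__("rdtsc" : "=a"(lo), "=d"(hi));
    return ((uint64_t)hi << 32) | lo;
}

int main(void)
{
    volatile double x = 0.0;
    uint64_t start = rdtsc();
    for (int i = 0; i < 1000000; i++)   /* the region being measured */
        x += i * 0.5;
    uint64_t end = rdtsc();
    printf("cycles: %llu\n", (unsigned long long)(end - start));
    return 0;
}

Running it several times gives noticeably different counts, which is the stochastic element the slide refers to.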
25. Worksheets
Section from a worksheet (white space / student response fields omitted)
26. Inconsistent Results
- Simulators: little variance
- Single thread of execution
- Not simulating entire system
- Counters: high variance
- Context switching and interrupts
- Process affected by external events
- Inconsistent / surprising results challenge students' expectations (see the timing sketch below)
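A concrete way to see that variance (an assumed setup, not from the course materials) is to time an identical loop repeatedly on real hardware: interrupts and context switches perturb each run, whereas a single-threaded simulator reports the same counts every time.

#include <stdio.h>
#include <time.h>

#define RUNS 10

int main(void)
{
    double best = 1e9, worst = 0.0;
    for (int r = 0; r < RUNS; r++) {
        struct timespec a, b;
        volatile long sum = 0;
        clock_gettime(CLOCK_MONOTONIC, &a);
        for (long i = 0; i < 10000000; i++)   /* identical work each run */
            sum += i;
        clock_gettime(CLOCK_MONOTONIC, &b);
        double t = (b.tv_sec - a.tv_sec) + (b.tv_nsec - a.tv_nsec) / 1e9;
        if (t < best)  best = t;
        if (t > worst) worst = t;
        printf("run %2d: %.4f s\n", r, t);
    }
    printf("spread: best %.4f s, worst %.4f s\n", best, worst);
    return 0;
}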
27. Active Learning
- Hypothesis testing (e.g., best cache) (active)
- Data collection / forecasting (visual/active)
- System comparison (active/global/intuitive)
- Tracing execution on simulator (visual)
- Assembly programming / registers (visual)
28. Conclusions
- Active learning
- Effective in upper division theory courses
- Engaged and challenged students
- Appealed to a range of learning styles
- Did not require significant preparation overhead
29. Hypothesis, Test, Explain
- Instructor guides students to
- Understand problem
- Develop hypothesis
- Identify tests to prove/disprove hypothesis
- Execute tests and collect results
- Explain results
- Support hypothesis
- Develop a new hypothesis to explain inconsistent results
- Students
- Exposed to the science in Computer Science
- Cognitive process challenged and reinforced
30. Results
- First offering of these courses
- Initial exam scores were generally good
- Student feedback on end-of-term surveys
- Overall very positive / higher than department and college averages
- Over 60% listed worksheets and class discussions as the most positive aspects of the course
- Comments
31. Take-Home Points
- Active environments
- Challenge students' knowledge
- Move students higher in Bloom's taxonomy
- Improve student comprehension and retention of material
- Provide another vehicle to assess student comprehension
- Do not require sophisticated or overwhelming class preparation
- Promote faculty role as leader/guide
32. Bibliography
- (see paper for in-text citations)
- [1] Owen Astrachan, Concrete teaching: hooks and props as instructional technology, ITiCSE '98: Proceedings of the 6th Annual Conference on the Teaching of Computing and the 3rd Annual Conference on Integrating Technology into Computer Science Education, ACM Press, 1998, pp. 21-24.
- [2] R. M. Felder and R. Brent, Learning by doing, Chem. Engr. Education 37 (2003), no. 4, 282-283.
- [3] Scott Grissom and Mark J. Van Gorp, A practical approach to integrating active and collaborative learning into the introductory computer science curriculum, Proceedings of the Seventh Annual Consortium on Computing in Small Colleges Midwestern Conference, Consortium for Computing Sciences in Colleges, 2000, pp. 95-100.
- [4] Lewis E. Hitchner, Judith Gersting, Peter B. Henderson, Philip Machanick, and Yale N. Patt, Programming early considered harmful, SIGCSE '01: Proceedings of the Thirty-Second SIGCSE Technical Symposium on Computer Science Education, ACM Press, 2001, pp. 402-403.
- [5] SimpleScalar LLC, SimpleScalar 3.0.
- [6] Jeffrey J. McConnell, Active learning and its use in computer science, ITiCSE '96: Proceedings of the 1st Conference on Integrating Technology into Computer Science Education, ACM Press, 1996, pp. 52-54.
- [7] R. Felder and K. Silverman, Index of learning styles, World Wide Web, February 2005.
- [8] Lynda Thomas, Mark Ratcliffe, John Woodbury, and Emma Jarman, Learning styles and performance in the introductory programming sequence, SIGCSE '02: Proceedings of the 33rd SIGCSE Technical Symposium on Computer Science Education, ACM Press, 2002, pp. 33-37.
- [9] Henry M. Walker, Collaborative learning: a case study for CS1 at Grinnell College and Austin, SIGCSE '97: Proceedings of the Twenty-Eighth SIGCSE Technical Symposium on Computer Science Education, ACM Press, 1997, pp. 209-213.