The NSF Course, Curriculum, and Laboratory Improvement CCLI Program
1
The NSF Course, Curriculum, and Laboratory
Improvement (CCLI) Program
  • CUR Meeting March 2007
  • Duncan McBride
  • Program Director
  • Division of Undergraduate Education
  • National Science Foundation
  • dmcbride@nsf.gov

2
Course, Curriculum, and Laboratory Improvement
(CCLI)
  • Purpose of the Program
  • To improve the quality of STEM education for all
    students by targeting activities affecting
    learning environments, course content, curricula,
    and educational practices
  • Supports projects at all levels of undergraduate
    education.
  • Supports activities in the classroom, laboratory,
    and field settings
  • Current CCLI Program Solicitation (NSF07-543)

3
Increased Emphases in CCLI Program
  • Building on and contributing to the STEM
    education knowledge base
  • Building a community of scholars in STEM
    education
  • Identifying project-specific measurable outcomes
  • Using them in the project management and
    evaluation

4
Three Scales of Projects
  • Phase 1 Exploratory Projects: up to $150,000
    ($200,000 when 4-year and 2-year schools
    collaborate); 1 to 3 years; can occur at a single
    institution with primarily local impact
  • Phase 2 Expansion Projects: up to $500,000; 2 to
    4 years; build on smaller-scale proven ideas;
    diverse users at several institutions
  • Phase 3 Comprehensive Projects: up to
    $2,000,000; 3 to 5 years; combine proven results
    and mature products; involve several diverse
    institutions
5
CCLI Cycle of Innovation
Project Components
Developing Faculty Expertise
Creating New Learning Materials and Teaching
Strategies
Implementing Educational Innovations
Research on Undergraduate STEM Teaching and
Learning
Assessing Learning and Evaluating Innovations
6
CCLI - Creating New Learning Materials and
Teaching Strategies
  • Phase 1 projects can focus on piloting new
    educational materials and instructional
    methodologies; Phase 2 projects on larger-scale
    development, broad testing, and assessment.
  • Similar to the old proof-of-concept and full
    development CCLI-EMD projects, respectively.
  • Phase 1 projects can focus on outcomes at a
    single site, but must include a rigorous
    assessment and community engagement program.
  • Can be combined with other components, especially
    faculty development in phase 2.

7
CCLI - Developing Faculty Expertise
  • Methods that enable faculty to gain expertise
  • May range from short-term workshops to sustained
    activities
  • Foster new communities of scientists in
    undergraduate education
  • Cost-effective professional development
  • Diverse group of faculty
  • Leading to implementation
  • May be combined with other components, especially
    materials development and assessment.

8
CCLI - Implementing Educational Innovations
  • Approximately equivalent to the CCLI-AI track
    projects; generally Phase 1 projects.
  • Projects must result in improved STEM education
    at YOUR institution via implementing exemplary
    materials, laboratory experiences, and/or
    educational practices developed and tested at
    other institutions.
  • CCLI-Implementation projects should stand as
    models for broader adaptation in the community.
  • Proposals may request funds in any budget
    category supported by NSF, including
    instrumentation

9
CCLI - Assessing Learning and Evaluating
Innovations
  • Design and test new assessment and evaluation
    tools and processes.
  • Apply new and existing tools to conduct
    broad-based assessments
  • Must span multiple projects and be of general
    interest

10
CCLI - Conducting Research on STEM Teaching and
Learning
  • Develop new research on teaching and learning
  • Synthesize previous results and theories
  • Practical focus
  • Testable new ideas
  • Impact on STEM educational practices.
  • May be combined with other components

11
CCLI Cycle of Innovation
Project Components
Developing Faculty Expertise
Creating New Learning Materials and Teaching
Strategies
Implementing Educational Innovations
Research on Undergraduate STEM Teaching and
Learning
Assessing Learning and Evaluating Innovations
12
Important Features of Successful CCLI Projects
  • Quality, Relevance, and Impact
  • Student Focus
  • Use of and Contribution to the STEM Education
    Knowledge Base
  • STEM Education Community-Building
  • Expected Measurable Outcomes
  • Project Evaluation

13
Quality, Relevance and Impact
  • Innovative
  • State-of-the-art products, processes, and ideas
  • Latest technology in laboratories and classrooms
  • Have broad implication for STEM education
  • Even projects that involve a local
    implementation
  • Advance knowledge and understanding
  • Within the discipline
  • Within STEM education in general 

14
Student Focus
  • Focus on student learning
  • Project activities linked to STEM learning
  • Consistent with the nature of today's students
  • Reflect the students' perspective
  • Student input in designing the project

15
STEM Education Knowledge Base
  • Reflect high quality science, technology,
    engineering, and mathematics
  • Rationale and methods derived from the existing
    STEM education knowledge base
  • Effective approach for adding the results to
    knowledge base

16
Community-Building
  • Include interactions with
  • Investigators working on similar or related
    approaches in the PI's discipline and others
  • Experts in evaluation, educational psychology or
    other similar fields
  • Benefit from the knowledge and experience of
    others
  • Engage experts in the development and evaluation
    of the educational innovation

17
Expected Measurable Outcomes
  • Goals and objectives translated into expected
    measurable outcomes
  • Project-specific
  • Some expected measurable outcomes on
  • Student learning
  • Contributions to the knowledge base
  • Community building
  • Used to monitor progress, guide the project, and
    evaluate its ultimate impact

18
Project Evaluation
  • Includes strategies for
  • Monitoring the project as it evolves
  • Evaluating the project's effectiveness when
    completed
  • Based on the project-specific expected measurable
    outcomes
  • Appropriate for scope of the project

19
Lessons from the first year (2006)
  • Phase 1 is an open competition, with many new
    players
  • Phase 2 requires substantial demonstrated
    preliminary work
  • Phase 3 is for projects from an experienced team
    with a national scale.
  • Program for 2008 has no substantive changes from
    2006. Changes may be made for 2009.

20
Funding and Deadlines
  • $35 million for FY07 (maybe more)
  • Phase 1 deadlines: May 8 and 9, 2007, depending
    on the first letter of the state name
  • Phase 2 and Phase 3 proposals: January 10, 2008

21
CCLI Funding 2000-2007
22
Merit Review Criteria
  • Intellectual merit of the proposed activity
  • How important is the proposed activity to
    advancing knowledge and understanding within its
    own field or across different fields?
  • How well qualified is the proposer to conduct the
    project?
  • How well conceived and organized is the proposed
    activity?
  • Is there sufficient access to resources?

23
Merit Review Criteria
  • Broader impacts of the proposed activity
  • How well does the proposed activity advance
    discovery and understanding while promoting
    teaching, training, and learning?
  • How well does the proposed activity broaden the
    participation of underrepresented groups?
  • To what extent will it enhance the infrastructure
    for research and education?
  • Will the results be disseminated broadly to
    enhance scientific and technological
    understanding?
  • What may be the benefits of the proposed activity
    to society?

24
Additional Review Criteria
  • Phase 1
  • How likely is it that the project will result in
    a successful implementation, prototype, or pilot
    study?
  • Phase 2
  • Is it based on previously developed and tested
    innovations and implementations?
  • Does it include more than one component and
    multiple institutions, as appropriate?
  • Will it successfully deliver a mature version of
    the work?
25
Additional Review Criteria
  • Phase 3
  • Is it based on proven results and mature
    products?
  • Does it include most of the program components
    defined in the cyclic model?
  • Does it involve a set of diverse institutions?
  • Will its outcomes have a national impact?
  • Is there an appropriate plan for sustainability
    or commercialization?

26
Relation to Tracks in Previous Solicitation
  • AI
  • Implementing Educational Innovations component
  • May include equipment projects
  • Likely Phase 1
  •  EMD
  • Creating Learning Materials and Teaching
    Strategies component, perhaps including some
    other components
  • Phase 1 (Proof of Concept); Phase 2 or 3 (Full
    Development)

27
Relation to Tracks in Previous Solicitation
(Cont.)
  • ASA
  • Assessing Learning and Evaluating Innovations
    component
  • Phase 1, 2, or 3 
  • ND
  • Developing Faculty Expertise component
  • Now should include other components
  • Phase 1, 2, or 3 

28
Getting Started
  • Start EARLY
  • Get acquainted with FastLane
  • Read the Program Solicitation and follow the
    guidelines
  • Learn about the recent DUE awards using PIRS
  • Become an NSF reviewer
  • Contact (e-mail is best) a program officer to
    discuss your idea. This may cause you to refine
    your idea and may prevent you from applying to
    the wrong program

29
Formatting, FastLane, and Grants.gov
  • NSF proposal format requirements
  • 15 single-spaced pages
  • 10-point or larger font (please use 11 or 12)
  • Intellectual Merit and Broader Impact explicit in
    Project Summary
  • FastLane submission
  • Web-based software, accessible from any browser
  • Mature, well-supported system for NSF
  • Accepts many file types, converts to .pdf
  • Grants.gov
  • Stand-alone software downloaded to local computer
  • May eventually be used for any Federal agency
  • Still under development and does not support all
    NSF processes (for example, collaborative
    proposals)
  • Accepts only .pdf files

30
How to Really Know About a Program
  • Become a reviewer for the proposals submitted to
    the program.
  • Give me a business card
  • Send e-mail to dmcbride@nsf.gov
  • Your name will be added to the database of
    potential reviewers.
  • We want to use many new reviewers each year,
    especially for Phase 1.

31
Who's Who in DUE
  • Biology
  • Terry Woodin
  • Dan Udovic
  • Nancy Palaez
  • Chemistry
  • Susan Hixson
  • Pratibha Varma-Nelson
  • Eileen Lewis
  • Computer Science
  • Dianna Burley
  • Mark Burge
  • Geosciences
  • Keith Sverdrup
  • Rotator
  • Engineering
  • Russ Pimmel
  • Bevlee Watford
  • Sheryl Sorby
  • Barbara Anderegg
  • Mathematics
  • Elizabeth Teles
  • Lee Zia
  • Dan Mackie
  • Physics/Astronomy
  • Duncan McBride
  • Dan Litynski
  • Social Sciences
  • Myles Boylan