Title: The NSF Course, Curriculum, and Laboratory Improvement (CCLI) Program
1. The NSF Course, Curriculum, and Laboratory Improvement (CCLI) Program
- CUR Meeting, March 2007
- Duncan McBride
- Program Director
- Division of Undergraduate Education
- National Science Foundation
- dmcbride@nsf.gov
2. Course, Curriculum, and Laboratory Improvement (CCLI)
- Purpose of the Program
- To improve the quality of STEM education for all students by targeting activities affecting learning environments, course content, curricula, and educational practices
- Supports projects at all levels of undergraduate education
- Supports activities in classroom, laboratory, and field settings
- Current CCLI Program Solicitation: NSF 07-543
3. Increased Emphases in the CCLI Program
- Building on and contributing to the STEM education knowledge base
- Building a community of scholars in STEM education
- Identifying project-specific measurable outcomes
- Using them in project management and evaluation
4. Three Scales of Projects
- Phase 1, Exploratory Projects: up to $150,000 ($200,000 when 4-year and 2-year schools collaborate); 1 to 3 years; can occur at a single institution with primarily local impact
- Phase 2, Expansion Projects: up to $500,000; 2 to 4 years; build on smaller-scale proven ideas; serve diverse users at several institutions
- Phase 3, Comprehensive Projects: up to $2,000,000; 3 to 5 years; combine proven results and mature products; involve several diverse institutions
5. CCLI Cycle of Innovation: Project Components
- Developing Faculty Expertise
- Creating New Learning Materials and Teaching Strategies
- Implementing Educational Innovations
- Conducting Research on Undergraduate STEM Teaching and Learning
- Assessing Learning and Evaluating Innovations
6. CCLI: Creating New Learning Materials and Teaching Strategies
- Phase 1 projects can focus on piloting new educational materials and instructional methodologies; Phase 2 projects on larger-scale development, broad testing, and assessment
- Similar to the old proof-of-concept and full-development CCLI-EMD projects, respectively
- Phase 1 projects can focus on outcomes at a single site, but must include a rigorous assessment and community engagement program
- Can be combined with other components, especially faculty development in Phase 2
7. CCLI: Developing Faculty Expertise
- Methods that enable faculty to gain expertise, ranging from short-term workshops to sustained activities
- Foster new communities of scientists in undergraduate education
- Cost-effective professional development for a diverse group of faculty, leading to implementation
- May be combined with other components, especially materials development and assessment
8. CCLI: Implementing Educational Innovations
- Approximately equivalent to the CCLI-AI track projects; generally Phase 1 projects
- Projects must result in improved STEM education at YOUR institution by implementing exemplary materials, laboratory experiences, and/or educational practices developed and tested at other institutions
- CCLI implementation projects should stand as models for broader adaptation in the community
- Proposals may request funds in any budget category supported by NSF, including instrumentation
9. CCLI: Assessing Learning and Evaluating Innovations
- Design and test new assessment and evaluation tools and processes
- Apply new and existing tools to conduct broad-based assessments
- Must span multiple projects and be of general interest
10. CCLI: Conducting Research on STEM Teaching and Learning
- Develop new research on teaching and learning
- Synthesize previous results and theories
- Practical focus
- Testable new ideas
- Impact on STEM educational practices
- May be combined with other components
11. CCLI Cycle of Innovation: Project Components
- Developing Faculty Expertise
- Creating New Learning Materials and Teaching Strategies
- Implementing Educational Innovations
- Conducting Research on Undergraduate STEM Teaching and Learning
- Assessing Learning and Evaluating Innovations
12. Important Features of Successful CCLI Projects
- Quality, Relevance, and Impact
- Student Focus
- Use of and Contribution to the STEM Education Knowledge Base
- STEM Education Community-Building
- Expected Measurable Outcomes
- Project Evaluation
13. Quality, Relevance, and Impact
- Innovative: state-of-the-art products, processes, and ideas; latest technology in laboratories and classrooms
- Have broad implications for STEM education, even for projects that involve a local implementation
- Advance knowledge and understanding, both within the discipline and within STEM education in general
14. Student Focus
- Focus on student learning
- Project activities linked to STEM learning
- Consistent with the nature of today's students
- Reflect the students' perspective
- Student input in designing the project
15. STEM Education Knowledge Base
- Reflect high-quality science, technology, engineering, and mathematics
- Rationale and methods derived from the existing STEM education knowledge base
- Effective approach for adding the results to the knowledge base
16. Community-Building
- Include interactions with investigators working on similar or related approaches in the PI's discipline and others, and with experts in evaluation, educational psychology, or other similar fields
- Benefit from the knowledge and experience of others
- Engage experts in the development and evaluation of the educational innovation
17. Expected Measurable Outcomes
- Goals and objectives translated into expected measurable outcomes
- Project-specific
- Some expected measurable outcomes on student learning, contributions to the knowledge base, and community building
- Used to monitor progress, guide the project, and evaluate its ultimate impact
18. Project Evaluation
- Includes strategies for monitoring the project as it evolves and evaluating the project's effectiveness when completed
- Based on the project-specific expected measurable outcomes
- Appropriate for the scope of the project
19. Lessons from the First Year (2006)
- Phase 1 is an open competition with many new players
- Phase 2 requires substantial demonstrated preliminary work
- Phase 3 is for projects from an experienced team operating at a national scale
- The program for 2008 has no substantive changes from 2006; changes may be made for 2009
20. Funding and Deadlines
- $35 million for FY07 (maybe more)
- Phase 1 proposal deadline: May 8 or 9, 2007, depending on the first letter of the state name
- Phase 2 and Phase 3 proposals: January 10, 2008
21. CCLI Funding, 2000-2007
22. Merit Review Criteria
- Intellectual merit of the proposed activity
- How important is the proposed activity to advancing knowledge and understanding within its own field or across different fields?
- How well qualified is the proposer to conduct the project?
- How well conceived and organized is the proposed activity?
- Is there sufficient access to resources?
23. Merit Review Criteria
- Broader impacts of the proposed activity
- How well does the proposed activity advance discovery and understanding while promoting teaching, training, and learning?
- How well does the proposed activity broaden the participation of underrepresented groups?
- To what extent will it enhance the infrastructure for research and education?
- Will the results be disseminated broadly to enhance scientific and technological understanding?
- What may be the benefits of the proposed activity to society?
24. Additional Review Criteria
- Phase 1
- How likely is it that the project will result in a successful implementation, prototype, or pilot study?
- Phase 2
- Is it based on previously developed and tested innovations and implementations?
- Does it include more than one component and multiple institutions, as appropriate?
- Will it successfully deliver a mature version of the work?
25. Additional Review Criteria
- Phase 3
- Is it based on proven results and mature products?
- Does it include most of the program components defined in the cyclic model?
- Does it involve a set of diverse institutions?
- Will the outcomes have a national impact?
- Is there an appropriate plan for sustainability or commercialization?
26. Relation to Tracks in the Previous Solicitation
- AI
- Implementing Educational Innovations component
- May include equipment projects
- Likely Phase 1
- EMD
- Creating Learning Materials and Teaching Strategies component, perhaps including some other components
- Phase 1 (Proof of Concept); Phase 2 or 3 (Full Development)
27. Relation to Tracks in the Previous Solicitation (cont.)
- ASA
- Assessing Learning and Evaluating Innovations component
- Phase 1, 2, or 3
- ND
- Developing Faculty Expertise component
- Now should include other components
- Phase 1, 2, or 3
28. Getting Started
- Start EARLY
- Get acquainted with FastLane
- Read the Program Solicitation and follow the guidelines
- Learn about recent DUE awards using PIRS
- Become an NSF reviewer
- Contact a program officer (e-mail is best) to discuss your idea; this may help you refine the idea and may prevent you from applying to the wrong program
29. Formatting, FastLane, and Grants.gov
- NSF proposal format requirements
- 15 single-spaced pages
- 10-point or larger font (please use 11 or 12)
- Intellectual Merit and Broader Impacts explicit in the Project Summary
- FastLane submission
- Web-based; accessible from any browser
- Mature, well-supported system for NSF
- Accepts many file types and converts them to PDF
- Grants.gov
- Stand-alone software downloaded to a local computer
- May eventually be used for any Federal agency
- Still under development; does not yet support all NSF processes (for example, collaborative proposals)
- Accepts only PDF files
30. How to Really Know About a Program
- Become a reviewer for proposals submitted to the program
- Give me a business card, or send e-mail to dmcbride@nsf.gov
- Your name will be added to the database of potential reviewers
- We want to use many new reviewers each year, especially for Phase 1
31. Who's Who in DUE
- Biology
- Terry Woodin
- Dan Udovic
- Nancy Palaez
- Chemistry
- Susan Hixson
- Pratibha Varma-Nelson
- Eileen Lewis
- Computer Science
- Dianna Burley
- Mark Burge
- Geosciences
- Keith Sverdrup
- Rotator
- Engineering
- Russ Pimmel
- Bevlee Watford
- Sheryl Sorby
- Barbara Anderegg
- Mathematics
- Elizabeth Teles
- Lee Zia
- Dan Mackie
- Physics/Astronomy
- Duncan McBride
- Dan Litynski
- Social Sciences
- Myles Boylan