CHREC Overview
Transcript and Presenter's Notes
1
CHREC Overview
  • Alan D. George, Ph.D.
  • Director, NSF Center for High-Performance
    Reconfigurable Computing (CHREC)

2
What is CHREC?
  • NSF Center for High-Performance
    Reconfigurable Computing (CHREC)
  • Pronounced "shreck"
  • Under development since Q4 of 2004
  • Lead institution grant by NSF to Florida awarded
    on 09/05/06
  • Partner institution grant by NSF to GWU awarded
    on 12/04/06
  • Partner institution grants tentatively projected
    for VT and BYU in 2007
  • Under auspices of I/UCRC Program at NSF
  • Industry/University Cooperative Research Center
  • CHREC supported by CISE & Engineering
    Directorates @ NSF
  • CHREC is both a Center and a Research Consortium
  • University groups form the research base
    (faculty, students)
  • Industry and government organizations are
    research partners, sponsors, collaborators, and
    technology-transfer recipients

3
Objectives for CHREC
  • Establish first multidisciplinary NSF research
    center in reconfigurable high-performance
    computing
  • Basis for long-term partnership and collaboration
    amongst industry, academe, and government: a
    research consortium
  • RC from supercomputing to high-performance
    embedded systems
  • Directly support research needs of our Center
    members
  • In a highly cost-effective manner, with pooled,
    leveraged resources and maximized synergy
  • Enhance educational experience for a diverse set
    of high-quality graduate and undergraduate
    students
  • Ideal recruits after graduation for our Center
    members
  • Advance knowledge and technologies in this field
  • Commercial relevance ensured with rapid
    technology transfer

4
Academic Team
HPC = High-Performance Computing; HPEC = High-Performance Embedded Computing
  • University of Florida (lead institution)
  • Base of research expertise in HPC, HPEC, RC
  • Emphasis on RC for HPC and HPEC
  • George Washington University (partner
    institution)
  • Base of research expertise in HPC and RC
  • Emphasis on RC for HPC w/ scientific & commercial
    systems
  • Virginia Tech (pending partner institution)
  • Planning grant proposal submitted Sep 06
  • Emphasis on human-computer interaction for RC,
    HPC
  • Brigham Young University (pending partner
    institution)
  • Planning grant proposal submitted Sep 06
  • Emphasis on RC for HPEC, focus on device-level
    issues

5
Center Management Structure
(Organization chart; site directors shown include B. Nelson at BYU and S. Bohner at VT)
6
CHREC Faculty
  • University of Florida
  • Dr. Alan D. George, Professor of ECE; UF Site
    Director
  • Dr. Herman Lam, Associate Professor of ECE
  • Dr. K. Clint Slatton, Assistant Professor of ECE
    and CCE
  • Dr. Dapeng Oliver Wu, Assistant Professor of ECE
  • George Washington University
  • Dr. Tarek El-Ghazawi, Professor of ECE; GWU Site
    Director
  • Dr. Ivan Gonzalez, Research Scientist in ECE
  • Dr. Mohamed Taher, Research Scientist in ECE
  • Virginia Tech
  • Dr. Shawn A. Bohner, Associate Professor of CS;
    VT Site Director
  • Dr. Peter Athanas, Professor of ECE
  • Dr. Wu-Chun Feng, Associate Professor of CS and
    ECE
  • Dr. Francis K.H. Quek, Professor of CS
  • Brigham Young University
  • Dr. Brent E. Nelson, Professor of ECE; BYU Site
    Director
  • Dr. Michael J. Wirthlin, Associate Professor of
    ECE
  • Dr. Brad L. Hutchings, Professor of ECE

7
Membership Fee Structure
  • NSF provides initial funds for CHREC via I/UCRC
    Center grants
  • Grant to each participating university (UF, GWU,
    later VT and BYU)
  • Industry and govt. partners support CHREC through
    memberships
  • NOTE: Each membership is associated with ONE
    university
  • Partners may hold multiple memberships (and thus
    support multiple projects/students) at one or
    multiple participating universities (e.g. NSA)
  • Full Membership fee is $35K in cash per year
  • Why a $35K unit? Approx. cost of a graduate
    student for one year
  • Stipend, tuition, and related expenses (not
    including IDC)
  • Fee represents tiny fraction of size & benefits
    of Center
  • All told, CHREC budget projected at $1.5-2M in
    Y1 (UF & GWU)
  • Equivalent to far higher if it had been founded
    in govt. or industry (~$10M)
  • Each university supports various costs of its
    CHREC operations
  • 25% matching of industry membership contributions
  • Indirect costs waived on membership fees (~1.5x
    multiplier); see the rough arithmetic below
  • Matching on administrative personnel costs

More bang for your buck!
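As a rough back-of-the-envelope illustration of that leverage (an interpretation of the figures above, not an official CHREC calculation, and assuming the 25% match and the ~1.5x IDC waiver compound): a $35K membership plus the 25% university match provides about $35K x 1.25 = $43.75K of direct research support, and because indirect costs are waived, that support buys work that would otherwise cost roughly $43.75K x 1.5 = $65K, nearly twice the fee actually paid.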
8
Benefits of Center Membership
  • Research and collaboration
  • Selection of project topics that your membership
    resources support
  • Direct influence over cutting-edge research of
    prime interest
  • Review of results on a semiannual formal basis &
    a continual informal basis
  • Rapid transfer of results before publication, and
    of any IP from all projects
  • Leveraging and synergy
  • Highly leveraged and synergistic pool
  • Cost-effective R&D in today's budget-tight
    environment
  • Multi-member collaboration
  • Many benefits between members
  • e.g. new industrial partnerships and teaming
    opportunities
  • Personnel
  • Access to strong cadre of faculty, students,
    post-docs
  • Recruitment
  • Strong pool of students with experience on
    industry & govt. issues
  • Facilities
  • Access to university research labs with
    world-class facilities

9
General Policies for CHREC
  • We follow the I/UCRC Standard Agreement
  • As defined by NSF
  • CHREC publication-delay policy
  • Results from funded projects shared with members
    30 days prior to publication
  • Any industry member may delay publication for up
    to 90 days for IP issues
  • Industrial Advisory Board (IAB)
  • One representative for each full member in CHREC
  • Board members elect IAB chair and vice-chair on
    annual basis
  • Y1 Chair: Alan Hunsberger (NSA); Vice-Chair: Nick
    Papageorgis (Smiths Aerospace)
  • Number of votes commensurate with number of
    memberships
  • On Center policies: 1 vote per full membership
  • On Center projects: 3-5 votes per full membership
    for flexibility (support several projects)
  • Focus in Y1 on full memberships, but other
    options possible in future
  • e.g. Associate Member (equipment)
  • Capital equipment donation as option for extended
    rights beyond paid full membership
  • e.g. Associate Member (small business)
  • Reduced membership fee with commensurately
    reduced IP and voting rights
  • All membership options require approval by IAB

10
Founding Members in CHREC
  • AFRL Munitions Dir.
  • ARSC
  • Honeywell
  • HP
  • IBM Research
  • Intel
  • Linux Networx
  • NASA Goddard
  • NASA Langley
  • NASA Marshall
  • National Security Agency
  • NCI/SAIC
  • Oak Ridge National Lab
  • Office of Naval Research
  • Raytheon
  • Rockwell Collins
  • Sandia National Labs
  • SGI
  • Smiths Aerospace

(Color key: ORANGE = member with UF, BLUE = member with GWU, GREEN = member with both)
11
What is a Reconfigurable Computer?
  • System capable of changing hardware structure to
    address application demands
  • Static or dynamic reconfiguration
  • Reconfigurable computing, configurable computing,
    custom computing, adaptive computing, etc.
  • Typically a mix of conventional and
    reconfigurable processing technologies
    (control-flow, data-flow); see the host-side
    sketch below
  • Enabling technology?
  • Field-programmable hardware (e.g. FPGAs)
  • Applications?
  • Broad range: satellites to supercomputers!
  • Faster, smaller, cheaper, less power & heat, more
    versatile
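To make the "mix of conventional and reconfigurable processing" above concrete, here is a minimal host-side sketch in C. It is illustrative only: the fpga_* calls are hypothetical stubs invented for this example (they do not correspond to any real CHREC or vendor API), and the stubs report "no device" so the conventional fallback path runs.

/* Illustrative sketch only: the fpga_* calls are hypothetical stubs, not a
 * real vendor or CHREC API; they stand in for whatever interface a
 * reconfigurable platform exposes to load a bitstream and run a kernel. */
#include <stdio.h>
#include <stddef.h>

static int fpga_present(void) { return 0; }                /* pretend no FPGA */
static int fpga_load_bitstream(const char *p) { (void)p; return -1; }
static int fpga_run_scale(const float *in, float *out, size_t n)
{ (void)in; (void)out; (void)n; return -1; }

/* Conventional (control-flow) implementation of the same computation. */
static void sw_scale(const float *in, float *out, size_t n)
{
    for (size_t i = 0; i < n; i++)
        out[i] = 2.0f * in[i];
}

int main(void)
{
    float in[8] = {1, 2, 3, 4, 5, 6, 7, 8}, out[8];

    /* Static reconfiguration: configure the fabric once, then stream data
     * through it; dynamic reconfiguration would swap bitstreams at run time
     * as the mission or application changes. */
    if (fpga_present() && fpga_load_bitstream("scale.bit") == 0 &&
        fpga_run_scale(in, out, 8) == 0)
        puts("ran on reconfigurable fabric");
    else {
        sw_scale(in, out, 8);                 /* conventional fallback */
        puts("ran on conventional CPU");
    }
    printf("out[7] = %.1f\n", out[7]);        /* 16.0 */
    return 0;
}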

12
When and where do we need RC?
  • When do we need RC?
  • When performance & versatility are critical
  • Hardware gates targeted to application-specific
    requirements
  • System mission or applications change over time
  • When the environment is extremely restrictive
  • Limited power, weight, area, volume, etc.
  • Limited communications bandwidth for work offload
  • When autonomy and adaptivity are paramount
  • Where do we need RC?
  • In conventional HPC systems & clusters where apps
    are amenable
  • Field-programmable hardware fits many demands
    (but certainly not all)
  • High DOP, finer grain, direct dataflow mapping,
    bit manipulation, selectable precision, direct
    control over H/W (e.g. perf. vs. power); a small
    kernel sketch follows this slide
  • In space, air, sea, undersea, and ground systems
  • Embedded & deployable systems can reap many
    advantages w/ RC
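The kernel below is a plain C sketch of the kind of computation those traits describe: a dot product on 12-bit fixed-point operands. The 12-bit width and masking are arbitrary choices for illustration; the point is that on an FPGA each loop iteration can become one stage of a pipeline sized exactly to the chosen precision, instead of occupying a full 32- or 64-bit CPU datapath.

/* Illustrative only: a 12-bit fixed-point dot product in plain C.  The
 * shape of the computation (narrow, selectable-precision operands and a
 * regular dataflow) is what maps directly onto FPGA fabric. */
#include <stdio.h>
#include <stdint.h>

#define PRECISION_BITS 12                    /* selectable operand width  */
#define MASK ((1u << PRECISION_BITS) - 1u)   /* keep only the low 12 bits */

/* Sum of a[i]*b[i] with inputs clipped to PRECISION_BITS; in fabric this
 * loop becomes a pipelined multiply-accumulate chain. */
static uint64_t dot12(const uint16_t *a, const uint16_t *b, int n)
{
    uint64_t acc = 0;
    for (int i = 0; i < n; i++)
        acc += (uint64_t)(a[i] & MASK) * (b[i] & MASK);
    return acc;
}

int main(void)
{
    uint16_t a[4] = {100, 200, 300, 400};
    uint16_t b[4] = {1, 2, 3, 4};
    printf("dot = %llu\n", (unsigned long long)dot12(a, b, 4));  /* 3000 */
    return 0;
}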

13
Research Challenge Stack
(Stack diagram: Performance Prediction, Performance Analysis, Numerical Analysis, Languages & Compilers, System Services, Portable Libraries, System Architectures, Device Architectures)
  • Performance prediction
  • When and where to exploit RC? (first-order
    example after this list)
  • Performance analysis
  • How to optimize complex systems and apps?
  • Numerical analysis
  • Must we throw DP floats at every problem?
  • Programming languages & compilers
  • How to express & achieve multilevel parallelism?
  • System services
  • How to support variety of run-time needs?
  • Portable core libraries
  • Where cometh building blocks?
  • System architectures
  • How to scalably feed hungry FPGAs?
  • Device architectures
  • How will/must FPLD roadmaps track for HPC or HPEC?
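As a first-order illustration of what "performance prediction" can mean, the sketch below applies an Amdahl's-law-style bound: given the fraction of an application's run time that can be moved to reconfigurable hardware and the speedup of that portion alone, it estimates the overall speedup. This is a textbook model offered as an assumption-laden example, not CHREC's actual prediction methodology.

/* First-order "is RC worth it here?" estimate via Amdahl's law.
 * Illustrative only; a real model must also account for reconfiguration
 * time, host<->FPGA transfer bandwidth, memory behavior, etc. */
#include <stdio.h>

/* f = fraction of run time that is accelerable, s = speedup of that part. */
static double overall_speedup(double f, double s)
{
    return 1.0 / ((1.0 - f) + f / s);
}

int main(void)
{
    double f = 0.80, s = 20.0;  /* e.g. 80% of time in a kernel sped up 20x */
    printf("overall speedup = %.2fx\n", overall_speedup(f, s));  /* ~4.17x */
    return 0;
}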

14
Bridging the Gaps
  • Vertical Gap
  • Semantic gap between design levels
  • Application design by scientists & programmers
  • Hardware design by electrical & computer
    engineers
  • We must bridge this gap to achieve full potential
    (see the before/after sketch following this
    slide)
  • Better programming languages to express
    parallelism of multiple types and at multiple
    levels
  • Better design tools, compilers, libraries,
    run-time systems
  • Evolutionary and revolutionary steps
  • Emphasis: integrated SW/HW design for multilevel
    parallelism
  • Horizontal Gap
  • Architectures crossing the processing paradigms
  • Cohesive, optimal collage of CPUs, FPGAs,
    interconnects, memory hierarchies,
    communications, storage, et al.
  • Must we assume simple retrofit to conventional
    architecture?
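To make the vertical gap concrete, the sketch below shows the same reduction twice: as an application scientist would naturally write it, and restructured into independent partial sums of the kind a hardware-oriented tool flow needs in order to build parallel adder pipelines. Both versions are ordinary C and compute the same result; the rewrite is exactly the sort of transformation that better languages, compilers, and libraries are meant to perform automatically.

/* Illustrative only: one sum written at "application level" and in a
 * restructured form that exposes fine-grained parallelism.  Bridging the
 * vertical gap means not asking the scientist to do this rewrite by hand. */
#include <stdio.h>

#define N 16

/* Application-level view: a single sequential dependence chain. */
static double sum_simple(const double *x, int n)
{
    double s = 0.0;
    for (int i = 0; i < n; i++)
        s += x[i];
    return s;
}

/* Restructured view: four independent partial sums, each of which could
 * become its own adder pipeline in fabric (assumes n is a multiple of 4). */
static double sum_partial(const double *x, int n)
{
    double p0 = 0, p1 = 0, p2 = 0, p3 = 0;
    for (int i = 0; i < n; i += 4) {
        p0 += x[i];     p1 += x[i + 1];
        p2 += x[i + 2]; p3 += x[i + 3];
    }
    return (p0 + p1) + (p2 + p3);
}

int main(void)
{
    double x[N];
    for (int i = 0; i < N; i++) x[i] = i + 1;   /* 1..16, sum = 136 */
    printf("simple = %.0f  restructured = %.0f\n",
           sum_simple(x, N), sum_partial(x, N));
    return 0;
}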

15
Y1 Projects at UF & GW
(Slide shows two columns of Y1 project lists, one for UF and one for GW)
16
CHREC & OpenFPGA
Diagram c/o Dr. Eric Stahlberg @ OpenFPGA
17
Conclusions
  • New NSF Center in reconfigurable computing
  • Overarching theme
  • CHREC forms basis for research consortium with
    partners from industry, academia, and government
  • Focus upon basic & applied research in RC for HPC
    and HPEC with major educational component
  • Technical emphasis at outset is towards aerospace
    & defense
  • Building blocks, systems services, design
    automation, applications
  • Opportunities for expansion and synergy in other
    app areas
  • Current emphasis is preparation for kickoff in
    Dec. 06
  • UF and GW are operational on Y1 projects
    beginning on 01/01/07
  • VT and BYU are building their membership in prep
    for Site proposals to NSF
  • We invite government & industry groups to join
    the CHREC consortium
  • Leverage and build upon common interests and
    synergy in RC
  • Pooled resources & matched resources yield
    maximal ROI with minimal member fee