HEPiX/HEPNT Welcome
Presentation transcript: Ian Bird, JLAB, 1 November 2000
1
HEPiX/HEPNT Welcome
  • Ian Bird
  • JLAB
  • 1 November 2000

2
Introduction to JLAB
  • Jefferson Lab
  • Facilities Science
  • Accelerator and HENP experiments
  • FEL
  • Current status
  • Future plans
  • Computing
  • Overview of computing at JLAB
  • Some directions

3
Jefferson Lab Science Programs
  • High Energy Nuclear Physics
  • CEBAF Accelerator
  • 4 (up to 6) GeV continuous-wave electron accelerator
  • Superconducting RF technology
  • Very high beam current
  • High beam polarizations possible
  • Last fall, CEBAF delivered more polarized
    electrons to experiments than all other electron
    machines combined over their lifetimes
  • Free Electron Laser (FEL) facility
  • Spin-off of SRF technology, growing user
    community
  • Spallation Neutron Source (SNS) (Oak Ridge)
  • JLAB is building the superconducting linacs
  • Future
  • JLAB Energy upgrade
  • Construct parts of Rare Isotope Accelerator (RIA)

4
The Laboratory
  • $70M/year operating budget
  • 600 staff
  • 2000 users at 240 institutions in 36 countries
  • SNS means a 10% increase in staff over 18 months
  • FEL user community / technology transfer
  • Consortium of university groups and industry
  • ~100 FEL-related users on site
  • ARC building; ARC II for FEL medical applications

5
Accelerator and experiments
6
Experimental Halls
7
Nuclear Physics
  • 3 experimental halls (A, B, C)
  • A and C are single/two-arm spectrometer
    facilities
  • Accept very high beam currents
  • Data rates from KB/s to 10 MB/s, depending on
    the experiment
  • Experiments last from a few days to a few months
  • B houses the CEBAF Large Acceptance Spectrometer
    (CLAS)
  • Accepts only lower currents
  • Large data rates: up to 20 MB/s (0.5 TB/day)
  • CLAS is the size of a HEP fixed-target experiment
    of 10 years ago, with a collaboration of ~150 people
  • Very much a nuclear physics community
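As a quick sanity check on the CLAS figures above, a short back-of-the-envelope calculation. The 20 MB/s and 0.5 TB/day values are the quoted ones; the implied duty factor is an inference, not a number from the slides:

```python
# Sanity check on the quoted CLAS rate figures (illustrative arithmetic only).
peak_rate_mb_s = 20    # quoted peak data rate, MB/s
daily_volume_tb = 0.5  # quoted daily volume, TB/day

seconds_per_day = 24 * 3600
# Volume if the peak rate were sustained all day (1 TB taken as 1e6 MB)
sustained_tb = peak_rate_mb_s * seconds_per_day / 1e6
# Fraction of the day the DAQ would need to run at peak to reach 0.5 TB
duty_factor = daily_volume_tb / sustained_tb

print(f"{sustained_tb:.2f} TB/day at peak, duty factor ~{duty_factor:.0%}")
# → 1.73 TB/day at peak, duty factor ~29%
```

So the two quoted numbers are consistent with running at peak rate roughly a third of the time.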

8
Computing for NP
  • Data storage
  • STK silo with 8 RedWoods, 10 9840 drives
  • Will add <10 9940 drives this year
  • 12 TB of stage, cache, analysis disk pools
  • Add 10 TB this year (Linux, SCSI/IDE??)
  • Processing
  • Farm of 250 Linux CPUs (plus 8 Solaris)
  • Gigabit network infrastructure

9
Scientific computing
10
Program upgrades
  • Timescale 2005
  • Upgrade CEBAF to 12 GeV
  • Upgrade existing experiments
  • New Hall D tagged photon beam facility
  • Data rates 75-100 MB/s to tape (after L3 trigger)
  • Partial-wave analysis (PWA) requires a
    significant simulation effort
  • Expect 3 PB/year (raw, processed, simulated)
  • I.e., a data volume roughly 2/3 that of CMS, but
    with a collaboration of 200 (not 2000)
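The Hall D figures above can be roughly cross-checked. The 100 MB/s rate is from the slides; the beam-on live time per year is an assumed round number (1e7 seconds, a common accelerator rule of thumb), not a figure from the deck:

```python
# Rough consistency check of the Hall D upgrade numbers (assumptions inline).
rate_mb_s = 100      # upper end of the quoted post-L3 rate, MB/s
live_seconds = 1e7   # ASSUMED beam-on seconds per year (~1/3 of a calendar year)

raw_pb_per_year = rate_mb_s * live_seconds / 1e9  # 1 PB taken as 1e9 MB
print(f"raw data: ~{raw_pb_per_year:.0f} PB/year")
# → raw data: ~1 PB/year
# With processed and simulated data on top of raw, the quoted ~3 PB/year
# total is plausible under this live-time assumption.
```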

11
Theory Computing - LQCD
  • Very strong Lattice QCD group
  • Collaboration is 1/3 of the US LQCD effort
  • FNAL + Brookhaven + JLAB/MIT
  • Present development system
  • 40 Alphas (Linux) with Myrinet
  • Proposed this year
  • 256 node Alpha cluster (300 Gflops)
  • Goal is to build a few TeraFlop system in FY0x
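The proposed cluster figures above imply a per-node rate, sketched below; the extrapolation to a 3 TFlops machine is illustrative only and assumes the same per-node performance as the proposed Alphas:

```python
# Per-node performance implied by the proposed 256-node / 300 Gflops cluster.
nodes = 256
total_gflops = 300

per_node = total_gflops / nodes
print(f"~{per_node:.2f} Gflops per Alpha node")
# → ~1.17 Gflops per Alpha node

# Node count needed for the stated few-TFlops goal at this per-node rate
# (in practice faster processors would shrink this considerably).
nodes_for_3_tflops = 3000 / per_node
print(f"~{nodes_for_3_tflops:.0f} nodes for 3 TFlops")
# → ~2560 nodes for 3 TFlops
```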

12
Free Electron Laser
  • Most powerful IR laser (1.7 kW), by a factor of 100
  • Upgrades planned to 10 kW IR and an extension to UV
  • Will be 1000× more powerful than any other
    tuneable IR laser
  • Uses SRF technology

13
FEL
14
FEL
  • Growing user community, experiments just
    beginning
  • Material science, surface chemistry
  • Other applications
  • Add HELIOS synchrotron (from IBM)
  • 'time-lapse' view of photon-material interactions
  • FEL computing
  • Expect a serious computing requirement
    (simulations) as the program grows
  • Several teraflops; could require more than LQCD

15
Computing growth
  • Accelerator upgrades
  • Hall D + upgrades to existing Halls
  • Large data management and analysis requirement
  • Lattice QCD
  • FEL simulations
  • Corresponding growth in user community
  • New computing facilities buildings
  • Could be 80-100 computing staff (+ accelerator)

16
Computing groups
  • There are 4 computing groups
  • Computer Center (35 people)
  • Accelerator Controls (25 people)
  • Data Acquisition Software (6 people)
  • High Performance Computing (4 people)

17
Computing groups
  • Accelerator controls
  • Mainly focused on controls software (EPICS)
  • Some system administration, but try to leverage
    off CC
  • Data Acquisition
  • CODA software for experiments
  • High Performance Computing
  • LQCD cluster software (PBS, MPI, etc.)
  • Accelerator controls (CDEV)
  • Collaborate on PPDG with CC

18
Computing groups
  • Computer Center
  • Scientific computing and LQCD facilities
  • Networks and infrastructure
  • Desktop computing
  • User support (central point for Acc. too)
  • MIS (business services, HR)
  • Security and reporting to DOE
  • Responsible for ADP strategy and reporting
  • Members of PPDG

19
Summary
  • JLAB is fairly small, but
  • Has an active and diverse science program
  • Several ongoing or upcoming projects promise to
    drive continued expansion
  • Several computing initiatives
  • Computing facilities and scope will expand in the
    next 5 years