1
AMS TIM, CERN Apr 21, 2004
AMS-02 Computing and Ground Centers
Alexei Klimentov (Alexei.Klimentov@cern.ch)
2
Outline
  • AMS-02 Ground Centers
  • AMS Computing at CERN
  • MC Production
  • Manpower

3
AMS-02 Ground Support Systems
  • Payload Operations Control Center (POCC) at CERN (first 2-3 months in Houston)
      • control room, the usual source of commands
      • receives Health Status (HS), monitoring and science data in real time
      • receives NASA video
      • voice communication with NASA flight operations
  • Backup Control Station at JSC (TBD)
  • Monitor Station at MIT
      • backup of the control room
      • receives Health Status (HS) and monitoring data in real time
      • voice communication with NASA flight operations
  • Science Operations Center (SOC) at CERN (first 2-3 months in Houston)
      • receives a complete copy of ALL data
      • data processing and science analysis
      • data archiving and distribution to Universities and Laboratories
  • Ground Support Computers (GSC) at Marshall Space Flight Center
      • receives data from NASA -> buffer -> retransmit to the Science Center
        (see the store-and-forward sketch after this list)
  • Regional Centers
      • Madrid, MIT, Yale, Bologna, Milan, Karlsruhe, Lyon, Taipei, Nanjing,
        Shanghai, ... 19 centers in total
      • analysis facilities to support geographically close Universities
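
The GSC item above amounts to a store-and-forward relay. A minimal sketch of
that pattern, for illustration only (the buffer path and the two transfer
functions are hypothetical stand-ins, not the actual AMS ground software):

```python
import pathlib
import time

BUFFER_DIR = pathlib.Path("gsc_buffer")   # hypothetical local buffer location
BUFFER_DIR.mkdir(exist_ok=True)

def fetch_frame_from_nasa() -> bytes:
    """Stand-in for receiving one telemetry frame from the NASA stream."""
    return b"frame-payload"

def send_to_science_center(payload: bytes) -> bool:
    """Stand-in for the retransmission to the Science Operations Center.
    Returns True only when the receiving end acknowledges the data."""
    return True

def relay_once(seq: int) -> None:
    # Buffer locally first, so nothing is lost if the link to the SOC is down.
    payload = fetch_frame_from_nasa()
    buffered = BUFFER_DIR / f"frame_{seq:08d}.bin"
    buffered.write_bytes(payload)
    # Retransmit; remove the buffered copy only after acknowledgment.
    if send_to_science_center(buffered.read_bytes()):
        buffered.unlink()

if __name__ == "__main__":
    for seq in range(3):      # short demo run
        relay_once(seq)
        time.sleep(0.1)
```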

4
(Diagram slide; recoverable labels: AMS facilities, NASA facilities)
5
(No Transcript)
6
AMS-02 Ground Centers. Science Operations Center. Computing Facilities.
(Diagram slide; recoverable components:)
  • Analysis Facilities (Linux cluster): interactive and batch physics analysis;
    10-20 dual-processor PCs; 5 PC servers
  • Central Data Services:
      • Shared Tape Servers: tape robots, tape drives (LTO, DLT)
      • Shared Disk Servers: 25 TByte of disk, 6 PC-based servers
  • batch data processing; interactive physics analysis
  • AMS regional Centers
  • CERN/AMS Network
7
AMS Computers at CERN
  • Central services (Web, Batch, Database)
  • Offline SW repository
  • SW development (libraries, compilers, SW tools)
  • Computing facilities
  • Data storage and archiving (AMS-01, TestBeam, MC)
  • MC Production

8
AMS Computers at CERN
CPU clock, 1998 -> 2004: 7.5 times (450 MHz -> 3.4 GHz)
Disk capacity, 1998 -> 2004: 14 times (17 GB -> 250 GB)
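
The growth factors quoted above follow directly from the clock speeds and disk
sizes; a quick check (illustrative arithmetic only):

```python
# Figures quoted on the slide: a typical PC bought in 1998 vs. in 2004.
cpu_1998_hz, cpu_2004_hz = 450e6, 3.4e9
disk_1998_gb, disk_2004_gb = 17, 250

print(f"CPU clock growth:     x{cpu_2004_hz / cpu_1998_hz:.1f}")    # ~7.6 (quoted as 7.5)
print(f"Disk capacity growth: x{disk_2004_gb / disk_1998_gb:.1f}")  # ~14.7 (quoted as 14)
```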
  • Buy the bulk of the computing power as late as possible, for a better
    price/performance ratio
  • Build the system gradually

Out of warranty
AMS computers load (weekly graph, 30-minute average): max CPU 98%, average 85%
9
AMS-02 Computing Facilities.
Status: ready / operational; the bulk of the CPU and disk purchasing at
L-9 months (launch minus nine months).
10
AMS Computing (action items, Jan 2004)
  • Action items
      • improve the networking topology
      • a gigabit switch will be installed before the end of Jan
      • Dr. Wu Hua (SEU) started to work on Network Monitoring
      • dedicated batch, web and production server
      • evaluate the market; the order will be placed in Q1 2004
  • Closed
      • switch installed at the beginning of Feb (AE, AK)
      • server was delivered this week and will be installed in April (AE, AK)
      • Network Monitoring is in production (Wu Hua, draft note:
        ams.cern.ch/AMS/Computing/network_monitor.pdf); an illustrative
        monitoring-probe sketch follows this list
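
The draft note itself is not reproduced here. Purely as an illustration of the
kind of probe such a monitor might run (the endpoint list, port and output
format are assumptions, not taken from the actual setup):

```python
import socket
import time

# Hypothetical endpoints between CERN and remote centers; the monitor described
# in the note may use entirely different probes and hosts.
ENDPOINTS = [("pcamsf0.cern.ch", 80), ("www.cern.ch", 80)]

def tcp_connect_time(host: str, port: int, timeout: float = 3.0) -> float | None:
    """Return the TCP connect time in seconds, or None if unreachable."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        return None

if __name__ == "__main__":
    stamp = time.strftime("%Y-%m-%d %H:%M:%S")
    for host, port in ENDPOINTS:
        rtt = tcp_connect_time(host, port)
        status = f"{rtt * 1000:.1f} ms" if rtt is not None else "unreachable"
        print(f"{stamp}  {host}:{port}  {status}")
```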

11
AMS Computing. SOC Prototype.
(Timeline slide; recoverable labels: MC production; Q2 2004; Q4 2004 - Q1 2005;
CIEMAT, ITEP, MIT group)
12
AMS-02 Ground Centers Prototypes
  • Centers architecture and functions are identified
  • The estimate of the computing power needed for AMS-02 data processing, and
    the corresponding benchmarking, is done (together with V.Choutko; it will be
    re-evaluated based on the MC production benchmarks; see the sizing sketch
    after this list)
  • Networking and data transmission issues are studied
  • SW and centers prototypes are used for the AMS-02 MC mass production
  • The data transfer computer is installed and will be in production in May
    (M.Boschini et al.)
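
A schematic version of the sizing arithmetic mentioned above; every number
below is a placeholder for illustration, not an AMS-02 figure from the
benchmarking exercise:

```python
# All inputs are hypothetical placeholders, not AMS-02 figures.
event_rate_hz = 200.0        # average event rate delivered to the SOC
cpu_s_per_event = 0.5        # benchmarked processing time per event on a 1 GHz CPU
reprocessing_passes = 2.0    # margin for reprocessing the same data

cpus_1ghz_needed = event_rate_hz * cpu_s_per_event * reprocessing_passes
print(f"~{cpus_1ghz_needed:.0f} x 1 GHz CPUs for continuous processing")
```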

13
MC Production Y2004A
V.Choutko, A.Eline, A.Klimentov
14
Year 2004 MC Production
  • Started Jan 15, 2004
  • Central MC Database (a bookkeeping sketch follows this list)
  • Distributed MC Production
  • Central MC storage and archiving
  • Distributed access (under test)
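
A minimal sketch of what the central bookkeeping might record per distributed
production job; the table layout, field names and the example row are
assumptions for illustration, not the actual AMS MC database schema:

```python
import sqlite3

# Hypothetical bookkeeping table; the real central MC database may differ.
conn = sqlite3.connect("mc_bookkeeping.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS mc_jobs (
        job_id      INTEGER PRIMARY KEY,
        site        TEXT NOT NULL,   -- producing center, e.g. 'CIEMAT'
        dataset     TEXT NOT NULL,   -- hypothetical dataset tag
        events      INTEGER,
        size_gb     REAL,
        status      TEXT,            -- 'submitted', 'transmitted', 'archived'
        finished_at TEXT
    )
""")
conn.execute(
    "INSERT INTO mc_jobs (site, dataset, events, size_gb, status, finished_at) "
    "VALUES (?, ?, ?, ?, ?, ?)",
    ("CIEMAT", "protons.2004a", 100000, 4.2, "transmitted", "2004-04-20"),
)
conn.commit()
for row in conn.execute("SELECT site, dataset, status FROM mc_jobs"):
    print(row)
conn.close()
```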

15
Y2004 MC production centers
MC production is not the primary responsibility of the people who participate
in it.
16
MC Production Statistics
97 days, 4139 GB
40% of the MC production is done; it will finish by the end of July
URL: pcamsf0.cern.ch/mm.html
17
Y2004 MC Production Highlights
  • Data are generated at remote sites, transmitted to AMS@CERN and made
    available for analysis
  • Transmission, process-communication and book-keeping programs have been
    debugged; the same approach will be used for AMS-02 data handling
  • 97 days of running (95% stability)
  • 15 Universities and Labs
  • 4.1 TBytes of data produced, stored and archived
  • Peak rate 130 GB/day (12 Mbit/sec); see the arithmetic check after this list
  • 547 computers
  • Daily CPU equivalent: 173 x 1 GHz CPUs running for 97 days, 24 h/day
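
The throughput figures above are mutually consistent; a quick re-derivation
from the quoted numbers (illustrative arithmetic only):

```python
total_gb, days = 4139, 97            # quoted on the previous slide
peak_gb_per_day = 130                # quoted above

avg_gb_per_day = total_gb / days
peak_mbit_per_s = peak_gb_per_day * 8e9 / 86400 / 1e6

print(f"average rate: {avg_gb_per_day:.0f} GB/day")   # ~43 GB/day
print(f"peak rate:    {peak_mbit_per_s:.0f} Mbit/s")  # ~12 Mbit/s, as quoted
```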

To support the MC production we need: one more disk server (5 TB) in Q2, and a
production farm prototype (10 CPUs) in Q4.