Title: The experience with LCG GRID in Russia
1. The experience with LCG GRID in Russia
- E. Lyublev, A. Selivanov, B. Zagreev
- ITEP, Moscow
- November 3, 2005
2. ITEP history
- ITEP was founded on December 1, 1945.
- The heavy-water reactor started operation in 1949.
- In 1961 the 7-GeV proton synchrotron started operating; it was the first Russian proton accelerator using the strong-focusing principle.
- Today ITEP is a Russian scientific center devoted to nuclear physics and the physics of elementary particles.
- The Institute occupies the grounds of the old eighteenth-century estate "Cheremushki".
3. ITEP
- Alikhanov Institute for Theoretical and Experimental Physics
- Russian Federation State Scientific Center
4. ITEP in winter
5. Research program
- Particle and nuclear physics
- Theoretical studies
- Experimental research
  - at the ITEP accelerator
  - at CERN, FNAL, DESY, KEK and other international centers
- 2β decay (Ge, Mo, Xe)
6. Research program
- Low energy physics and chemistry
- Accelerator techniques
- Nuclear power facilities
- Medical physics
- Details: www.itep.ru
7. International collaboration
- DESY (Hamburg): ARGUS, H1, HERA-B
- CERN (Geneva): AMS, CHORUS, L3, ATLAS, ALICE, CMS, LHCb
- FNAL (Batavia): D0, E781 (SELEX)
- GSI (Darmstadt): CBM
8. Russian participation in EGEE/LCG
- RDIG: Russian Data Intensive Grid
9. RDIG
- PNPI
- JINR
- KIAM
- ITEP
- SINP
- RRC KI
- IHEP
- IMPB
10. ITEP EGEE/LCG production cluster
11. ITEP EGEE/LCG production cluster
12. ITEP EGEE/LCG hardware
- UI: user interface
- CE: computing element
- SE: storage element
- WNs: worker nodes for the batch system
- Mon: R-GMA server
- VO Box: server of a Virtual Organization
- RDIG user support server
- LFC: LCG File Catalog
13. ITEP LCG parameters
- OS: SLC 3.05
- MW: LCG-2.6.0-9
- Batch system: PBS with Maui (submission sketch below)
- WNs: P4 (HT) 2.4 GHz, 1 GB RAM, 80 GB disk
- SE:
  - Classic SE: 1 TB
  - dCache/SRM: 4 TB
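To illustrate how a job reaches the WNs through the PBS/Maui batch system, here is a minimal sketch in Python; the queue name, resource requests and payload are hypothetical and not the actual ITEP configuration.

#!/usr/bin/env python
# Minimal sketch: wrap a payload in a PBS script and submit it with qsub.
# Queue name ("lcg"), resource requests and payload are illustrative only.
import subprocess
import tempfile

PBS_TEMPLATE = """#!/bin/sh
#PBS -N test_job
#PBS -q lcg
#PBS -l nodes=1:ppn=1
#PBS -l walltime=02:00:00
cd $PBS_O_WORKDIR
%(payload)s
"""

def submit(payload):
    """Write a PBS script to a temporary file and hand it to qsub."""
    with tempfile.NamedTemporaryFile(mode="w", suffix=".pbs", delete=False) as f:
        f.write(PBS_TEMPLATE % {"payload": payload})
        script = f.name
    # qsub prints the batch job id on success
    return subprocess.check_output(["qsub", script]).decode().strip()

if __name__ == "__main__":
    print(submit("echo running on `hostname`"))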
14. Network
- ITEP Network Backbone
  - 1 Gbit/s Ethernet
- ITEP LAN
  - 100 Mbit/s Ethernet
  - Wireless
- WAN
  - 1 Gbit/s channel (RAS)
  - 100 Mbit/s channel (MSU)
15. Application SW
- ALICE: AliEn 2_4 (VO Box), AliRoot, ROOT, xrootd
- ATLAS: VO-atlas-release-10.0.4
- CMS: OSCAR_3_6_5, ORCA_8_7_1, CMKIN_4_4_0_dar
- LHCb: Gaudi-v15r5, DaVinci-v12r11
16. Monitoring statistics
- GOC
- GridICE
- MonALISA
- Farm statistics
- Network statistics
17. GridICE
18. RDIG Monitoring
19. RDIG User Support
20. ALICE DC04 statistics
21. DC04 Summary
- About 7000 jobs were successfully completed at the AliEn Russian sites in 2004, about 4% of the total ALICE statistics; job efficiency was about 75%.
- Quite visible participation in the ALICE and LHCb Data Challenges
- ITEP part: about 70%
- SE: 1.7 TB
22. DC05: to be continued
23. Timeline of PDC05/SC3
[Timeline, Aug-Dec 2005: event production (Phase 1); job submission through the LCG interface; ALICE data push with reserved/shared bandwidth and test of FTS (Phase 2); prototype data analysis (Phase 3); SC3 start of the service phase]
24. Participating in ALICE SC3
- All experiment-specific SW
25. AliEn (ALICE Environment)
- The AliEn framework has been developed as the ALICE user entry point into the Grid world, shielding the users from its underlying complexity and heterogeneity. Through interfaces it can transparently use the resources of different Grids (LCG and INFN Grid); in the future this cross-Grid functionality will be extended to cover other Grid flavours (see the sketch after this list).
- The system is built around Open Source components and uses a Web Services model and standard network protocols. Less than 5% is native AliEn code (Perl).
- No other Grid flavour provides a complete solution for the ALICE computing model; each of these Grids provides a different user interface and a diverse spectrum of functionality.
- Therefore some of the AliEn services will continue to be used as ALICE's single point of entry to the computing resources of other Grids and as a complement to their functionality. The foreign Grids will be accessed via interfaces.
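A minimal sketch of this "single entry point over per-Grid interfaces" idea, in Python; the class and method names are illustrative only and do not correspond to real AliEn (Perl) or LCG code.

# Sketch: one job-submission interface, several Grid-flavour backends.
class GridBackend:
    """Common interface that hides the flavour-specific middleware."""
    def submit(self, job):
        raise NotImplementedError

class AliEnBackend(GridBackend):
    def submit(self, job):
        # would push the job into the central AliEn task queue
        return "alien-job-id"

class LCGBackend(GridBackend):
    def submit(self, job):
        # would hand the job to an LCG Resource Broker via the LCG UI
        return "lcg-job-id"

def submit_job(job, backends):
    """The user sees one entry point; Grid heterogeneity stays behind the interface."""
    return [backend.submit(job) for backend in backends]

print(submit_job({"executable": "/bin/hostname"}, [AliEnBackend(), LCGBackend()]))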
26. AliEn services structure
- Central services
  - Catalogue
  - Task queue
  - Job optimization
  - etc.
[Diagram: job submission and file registration between the central services, several AliEn CE/SEs, an LCG UI and the LCG RB; a submission sketch follows]
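To make the "job submission through the LCG interface" step concrete, here is a minimal sketch of a submission from an LCG-2 UI; the JDL contents and file names are illustrative, and it is assumed only that the standard edg-job-submit / edg-job-status commands of an LCG-2.6 UI are available.

#!/usr/bin/env python
# Minimal sketch: submit a trivial job from an LCG UI to the Resource Broker.
import subprocess

JDL = """
Executable    = "/bin/hostname";
StdOutput     = "std.out";
StdError      = "std.err";
OutputSandbox = {"std.out", "std.err"};
"""

def submit():
    with open("hostname.jdl", "w") as f:
        f.write(JDL)
    # The job id is recorded in jobids.txt and can later be polled with
    #   edg-job-status -i jobids.txt
    subprocess.check_call(["edg-job-submit", "-o", "jobids.txt", "hostname.jdl"])

if __name__ == "__main__":
    submit()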
27. ALICE interface to LCG
- Through a VO-Box provided on the site
- LCG UI: full mapping
- AliEn services (Cluster Monitor, CE, SE, MonALISA, PackMan, xrootd)
- VO-Box requirements published at https://uimon.cern.ch/twiki/pub/LCG/ALICEResourcesAndPlans/alice_vobox_requirements.doc
28. ITEP LCG site as Tier-2 in SC3 (ALICE)
- LCG 2.6
- FTS client
- SE: dCache with SRM (data-movement sketch below)
- LFC
- xrootd protocol
- AliEn 2_4
- Connectivity with Tier-1 centers is an issue!
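As an illustration of how data would move on such a setup, here is a minimal sketch using the standard lcg_util data-management commands against the dCache SRM and the LFC; the VO, LFN, SE host name and paths are hypothetical.

#!/usr/bin/env python
# Minimal sketch: copy a local file to the SE and register it in the LFC,
# then fetch a replica back. VO, LFN, SE host and paths are illustrative only.
import subprocess

VO   = "alice"
SE   = "se.itep.ru"                          # hypothetical SRM/dCache endpoint
LFN  = "lfn:/grid/alice/itep/test/file.root" # hypothetical logical file name
SRC  = "file:/tmp/file.root"
DEST = "file:/tmp/file_copy.root"

def upload():
    # lcg-cr copies the file to the SE and registers the replica in the LFC
    subprocess.check_call(["lcg-cr", "--vo", VO, "-d", SE, "-l", LFN, SRC])

def download():
    # lcg-cp fetches a replica of the registered file back to local disk
    subprocess.check_call(["lcg-cp", "--vo", VO, LFN, DEST])

if __name__ == "__main__":
    upload()
    download()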
29. Non-LHC Grid activity
- The Russian VO PHOTON for SELEX colleagues was organized in 2005
- Regional centre for AMS (VO in preparation)
- Collaboration with the CBM project (GSI, Darmstadt)
- The ITEP theory department is very interested
30. Summary and plans
- Ready for PDC05/SC3
- Further support of the LHC experiments' Data Challenges: this is becoming a trivial task that runs automatically
- Significantly increase the power of the ITEP farm in 2006: the current installation occupies only 5% of the infrastructure
- Concentrate on distributed analysis
- Connectivity with Tier-1 centers is an issue!