1
Development of GRID environment for interactive
applications
J.Marco (CSIC)
DATAGRID WP6 MEETING (PARIS) 5-III-2002
2
The CROSSGRID project
  • EU Vth Framework Programme (IST)
  • 3 year project, starting date 1st March 2002
  • 4.86 M€ contract signed last week!
  • Objectives
  • Final user interactive applications
  • Extend GRID into 11 EU countries
  • in close collaboration with DATAGRID
  • 21 Partners
  • COORD: CYFRONET (PL, coordinator M.Turala)
  • CR: UvA (NL), PSNC (PL), FZK (D), CSIC (E)
  • Companies: Algosystems (GR), DATAMAT (I)
  • ICM, INP, IPJ (PL), II SAS (SK), USTUTT, TUM (D),
    U.Linz (A), TCD (IRL), UAB, USC (E), LIP (P),
    Demo, AuTH (GR), UCY (CY)

3
CROSSGRID Collaboration
Ireland: TCD Dublin
Poland: Cyfronet, INP Cracow; PSNC Poznan; ICM, IPJ Warsaw
Germany: FZK Karlsruhe, TUM Munich, USTUTT Stuttgart
Netherlands: UvA Amsterdam
Slovakia: II SAS Bratislava
Austria: U.Linz
Spain: CSIC Santander/Valencia, RedIris, UAB Barcelona, USC Santiago, CESGA
Greece: Algosystems, Demo Athens, AuTh Thessaloniki
Portugal: LIP Lisbon
Italy: DATAMAT
Cyprus: UCY Nicosia
4
Project Structure
  • Workpackages
  • Applications (P.Sloot, UvA)
  • Programming Environment (H.Marten, FZK)
  • New Services and Tools (N.Meyer, PSNC)
  • Testbed (J.Marco, CSIC)
  • Management (M.Turala)
  • Architecture Team
  • Led by M.Bubak, Cyfronet

5
Applications
  • Final user interactive applications
  • Response loop in the range of seconds to minutes
  • Computing- and data-intensive
  • Architecture based on GLOBUS and DATAGRID
  • Multidisciplinary
  • Simulation and visualisation for surgical
    procedures
  • Air pollution combined with weather forecasting
  • Distributed Data Analysis in HEP
  • Flooding crisis team decision support system

6
Tools and Services
  • GRID Application Programming Environment
  • Verification of MPI use
  • Performance prediction
  • Monitoring
  • Services and Tools
  • User friendly portals
  • Roaming access
  • Efficient distributed data access
  • Specific resource management
  • Open Architecture approach

7
Testbed
  • Testing and validation for
  • Applications
  • Programming environment
  • New services and tools
  • Emphasis on collaboration with DATAGRID and
    extension to DATATAG
  • Extension of GRID across Europe

8
Extension of GRID
9
Application development
  • Common requirements
  • Near real-time performance across the network
  • Efficient access to large distributed databases
  • Large scale parallel simulations

(Diagram: requirements mapped onto WP2 and WP3 — MPI support, data discovery, data mining, visualisation)

Addressed in collaboration between WP1 tasks
  • Deployment on the CROSSGRID testbed will test
    the applications in the final user environment
    and provide feedback
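The large-scale parallel simulations above typically use a master/worker decomposition under MPI. A minimal sketch of that pattern, using Python's multiprocessing as a stand-in for MPI (the task and all names are illustrative, not project code):

```python
# Master/worker task farming: the master scatters work units to a pool of
# workers and gathers partial results, the same decomposition the
# CROSSGRID applications use with MPI.
from multiprocessing import Pool

def simulate_chunk(params):
    """Stand-in for one parallel simulation task (e.g. one event batch)."""
    start, size = params
    # trivial placeholder work: sum of squares over the assigned range
    return sum(i * i for i in range(start, start + size))

def master(n_tasks=8, chunk=1000, n_workers=4):
    """Split the problem into n_tasks chunks, farm them out, reduce."""
    tasks = [(i * chunk, chunk) for i in range(n_tasks)]
    with Pool(n_workers) as pool:
        partials = pool.map(simulate_chunk, tasks)
    return sum(partials)

if __name__ == "__main__":
    total = master()
    # the parallel result matches the serial loop: a correct decomposition
    assert total == sum(i * i for i in range(8 * 1000))
```

In the real MPI codes the master additionally streams intermediate results back for interactive visualisation, which is what makes the response-time requirement bite.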

10
Distributed Physics Analysis in HEP
  • Final user applications for interactive physics
    analysis in a GRID-aware environment for LHC
    experiments
  • Access to large O/R DBMS through a middleware
    server vs. access to a catalog of distributed
    ROOT data files
  • Distributed data-mining techniques (mainly
    Neural Networks, also self-organizing maps, ...)
  • Integration of user-friendly interactive access
    via portals
  • CSIC, UAB, FZK, INP, ICM, IPJ
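As a concrete illustration of the neural-network data mining above, here is a self-contained sketch of a single logistic neuron trained to separate signal from background events. The data and network size are invented for illustration; the actual analyses used larger multilayer networks over ntuple variables:

```python
# One logistic unit trained by online gradient descent: the smallest
# possible "neural network" classifier for signal vs background.
import math

def train(events, labels, epochs=500, lr=0.5):
    """Online gradient descent for one logistic unit: sigmoid(w.x + b)."""
    n = len(events[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in zip(events, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # network output in (0, 1)
            g = p - y                        # cross-entropy gradient factor
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def classify(w, b, x):
    """Network response: near 1 for signal-like, near 0 for background-like."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

if __name__ == "__main__":
    # invented "events": two kinematic variables per event
    signal = [(1.0, 1.0), (0.9, 1.1), (1.1, 0.9)]
    background = [(0.0, 0.0), (0.1, -0.1), (-0.1, 0.1)]
    w, b = train(signal + background, [1, 1, 1, 0, 0, 0])
    assert classify(w, b, (1.0, 1.0)) > 0.9
    assert classify(w, b, (0.0, 0.0)) < 0.1
```

Distributing this is what the GRID work targets: each site trains on its local data subset and only the small weight updates travel over the network.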

11
Evolution in HEP Interactive Physics Analysis
  • From LEP2 era
  • (CERN 1995-2000, the Higgs hunt)
  • Physics analysis on final ntuples, 10-100 GB
    size/channel
  • Processed by teams/individually
  • Preselection cuts (b-tagging, kinematics) fixed
  • Multidimensional techniques not real-time
  • Neural Networks took 10 minutes to several hours
  • CL estimations not simultaneously optimized

12
Evolution in HEP Interactive Physics Analysis
  • ...To LHC era (CERN 2003 (Physics TDR)-...)
  • High rate of complex pp collisions (~10^9
    interactions/second)
  • Optimize High Level Trigger
  • Work on Physics TDR channels to define data
    subsets
  • Interactive physics analysis will run on
    distributed databases much larger than at LEP
  • Need for interactive friendly user applications
    in a GRID environment, supporting powerful
    analysis techniques

13
HEP Interactive Physics Analysis
(Plot: MPI master/worker setup — scalability OK in a local cluster)
14
WP4 International Testbed Organisation
  • Objective
  • Provide a distributed resource facility
  • Support testbed sites across Europe, integrating
    national facilities
  • Assure interoperability with DATAGRID testbed
  • 15 testbed sites across 9 countries
  • 16 FTE working on them
  • Starting points
  • Join DataGRID testbed gradually
  • Use GEANT for network connectivity
  • Extend testbed with incremental releases as
    applications and middleware demand and provide
    feedback
  • Effort coordinated by CSIC; will benefit from
    DATAGRID testbed experience, also present at
    TCD, UAB and LIP

15
Testbed MAP
TCD Dublin
PSNC Poznan
UvA Amsterdam
ICM IPJ Warsaw
FZK Karlsruhe
CYFRONET Cracow
II SAS Bratislava
USC Santiago
CSIC IFCA Santander
LIP Lisbon
UAB Barcelona
AuTh Thessaloniki
CSIC Madrid
CSIC IFIC Valencia
DEMO Athens
UCY Nicosia
16
CrossGrid WP4 - International Testbed Organisation
Network (Geant) setup
17
CrossGrid WP4 - International Testbed Organisation
  • Tasks in WP4
  • 4.0 Coordination and management
  • (task leader J.Marco, CSIC, Santander)
  • Coordination with WP1,2,3
  • Collaborative tools (web, videoconf, repository)
  • Integration Team
  • 4.1 Testbed setup and incremental evolution
  • (task leader R.Marco, CSIC, Santander)
  • Define installation
  • Deploy testbed releases
  • Trace security issues
  • Testbed site contacts
  • CYFRONET (Krakow) A.Ozieblo
  • ICM(Warsaw) W.Wislicki
  • IPJ (Warsaw) K.Nawrocki
  • UvA (Amsterdam) D.van Albada
  • FZK (Karlsruhe) M.Kunze
  • IISAS (Bratislava) J.Astalos
  • PSNC(Poznan) P.Wolniewicz
  • UCY (Cyprus) M.Dikaiakos
  • TCD (Dublin) B.Coghlan
  • CSIC (Santander/Valencia) S.Gonzalez
  • UAB (Barcelona) G.Merino
  • USC (Santiago) A.Gomez
  • Demo (Athens) C.Markou
  • AuTh (Thessaloniki) D.Sampsonidis
  • LIP (Lisbon) J.Martins

18
CrossGrid WP4 - International Testbed Organisation
  • Tasks in WP4
  • 4.2 Integration with DATAGRID (task leader
    M.Kunze, FZK)
  • Coordination of testbed setup
  • Exchange knowledge
  • Participate in WP meetings
  • 4.3 Infrastructure Support (task leader J.Salt,
    CSIC, Valencia)
  • Fabric management
  • HelpDesk
  • Provide Installation Kit
  • Network support
  • 4.4 Verification and quality control (task leader
    J.Gomes, LIP)
  • Feedback
  • Improve stability of the testbed

19
CrossGrid WP4 - International Testbed Organisation
20
CrossGrid WP4 - International Testbed Organisation
Workflow and interfaces
(Diagram: requirements and feedback flow from the DATAGRID testbed and WP 1,2,3 into the Integration Team, through the Setup Plan, First testbed, and Prototype 0/1 evolution and support phases, to the FINAL testbed)

(Gantt chart: task 4.0 Coordination and management runs PM1-36; milestones M4.1-M4.4 and deliverables D4.1-D4.7 fall at months 3, 6, 10, 15, 21, 30, 33 and 36 — report on requirements planning, testbed setup on selected sites, 1st prototype release, internal progress reports, final testbed, final demo report; dissemination levels PU/CO)
21
Questions
  • Interactive use of resources
  • Priorities/Allocation/Booking policies
  • How to allocate 500 machines for 5 minutes of
    interactive use?
  • Connection
  • Concept of Interactive Session?
  • Use of Permanent and Volatile storage
  • Scheduler?
  • Strongly data-location aware
  • Dynamic versioning/installation?
  • Software configuration for different applications
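One concrete way to approach the allocation question above is a reservation policy in which interactive requests take idle nodes first and preempt batch work only as needed. The sketch below is a toy policy with an assumed node-state model, not a project decision:

```python
# Toy answer to "allocate N machines for a few minutes of interactive
# use": grant idle nodes first, preempt batch nodes only to fill the gap.
def allocate_interactive(nodes, needed):
    """nodes: list of dicts {'name': str, 'state': 'idle'|'batch'|'reserved'}.
    Returns the names of granted nodes, or None if the request cannot be
    satisfied; granted nodes are marked 'reserved' for the session."""
    idle = [n for n in nodes if n['state'] == 'idle']
    batch = [n for n in nodes if n['state'] == 'batch']
    if len(idle) + len(batch) < needed:
        return None                      # not enough preemptible capacity
    granted = idle[:needed] + batch[:max(0, needed - len(idle))]
    for n in granted:
        n['state'] = 'reserved'          # hold for the interactive session
    return [n['name'] for n in granted]

if __name__ == "__main__":
    pool = [{'name': f'n{i}', 'state': 'idle' if i < 2 else 'batch'}
            for i in range(5)]
    assert allocate_interactive(pool, 3) == ['n0', 'n1', 'n2']
```

A real scheduler would also bound the reservation in time (the "5 minutes") and checkpoint or requeue the preempted batch jobs.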

22
Testbed Organization
  • Use national certificates as in DataGrid
  • Spain, Portugal, Ireland, Germany, Poland
  • Greece, Cyprus, Slovakia
  • V.O.
  • Define CROSSGRID testbed V.O.
  • Join DataGrid HEP V.O. (Alice,Atlas,CMS,LHCb)
  • Define new V.O. for other applications
  • Med V.O.
  • Flood Prevention V.O.
  • Meteo V.O.
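In the Globus security model these V.O.s build on, a site authorizes users by mapping certificate subject DNs to local accounts in a grid-mapfile; a V.O. then corresponds in practice to a shared set of such entries. A minimal parser for the standard format (the sample DNs and account names are invented):

```python
# Parse a Globus grid-mapfile: each non-comment line is a quoted
# certificate subject DN followed by the local account it maps to.
def parse_gridmap(text):
    """Return {subject DN: local account} from grid-mapfile text."""
    mapping = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#') or '" ' not in line:
            continue                      # skip blanks, comments, junk
        dn, _, account = line.rpartition('" ')
        mapping[dn.lstrip('"')] = account.strip()
    return mapping

if __name__ == "__main__":
    sample = '''
    # CROSSGRID testbed V.O. (illustrative entries)
    "/C=ES/O=CSIC/CN=Jesus Marco" cgrid001
    "/C=PL/O=CYFRONET/CN=Michal Turala" cgrid002
    '''
    users = parse_gridmap(sample)
    assert users["/C=ES/O=CSIC/CN=Jesus Marco"] == "cgrid001"
```

Maintaining one such list per V.O. and merging them per site is exactly the bookkeeping the V.O. definitions above imply.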

23
Testbed Organization
  • Kick-off meeting in Krakow (17th March)
  • First three months
  • Review and set up local infrastructure
  • Gather requirements from applications/middleware
  • Setup HelpDesk, define Integration Team
  • Reinforce connection with Datagrid
  • Extend testbed (Germany,Poland,Portugal,Spain)
  • Find topics for contribution

25
CrossGrid WP4 - International Testbed Organisation
  • Links with other projects
  • Strong link with DATAGRID
  • Extension of testbed, interoperability
  • Participation in DATAGRID meetings
  • GLOBAL: FZK, CYFRONET (Architecture Team)
  • Observers in DataGrid WP1-8 meetings
  • Participation in DATAGRID Testbed
  • Testbed activity at CSIC, UAB, UAM, LIP, TCD

26
CrossGrid WP4 - International Testbed Organisation
  • Testbed setup
  • Basic setup to provide Grid (Globus) services
  • GIS (Grid Information Service)
  • GIIS (Grid Index Information Service) machine
  • GSI (Grid Security Infrastructure)
  • CA (Certification Authority) machine
  • RA (Registration Authority) web server and LDAP
    server
  • GRAM (Globus Resource Allocation Manager)
  • Gatekeeper machine and nodes
  • Network monitoring and routing
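The GIS/GIIS services listed above are LDAP-based (the Globus MDS): a client queries the GIIS and gets back LDIF-formatted records describing resources. A sketch of turning such output into Python dicts; the attribute names in the sample are illustrative rather than the exact MDS schema:

```python
# Parse LDIF-style query output into a list of records, one dict per
# blank-line-separated entry, as a GIIS client would before ranking sites.
def parse_ldif(text):
    """Return a list of {attribute: value} dicts; repeated attributes
    within one record keep the last value (enough for this sketch)."""
    records, current = [], {}
    for line in text.splitlines():
        line = line.strip()
        if not line:                      # blank line ends a record
            if current:
                records.append(current)
                current = {}
            continue
        key, _, value = line.partition(': ')
        current[key] = value
    if current:
        records.append(current)
    return records

if __name__ == "__main__":
    sample = ("dn: Mds-Host-hn=gk01.example.org, o=Grid\n"
              "Mds-Host-hn: gk01.example.org\n"
              "Mds-Cpu-Total-count: 34\n"
              "\n"
              "dn: Mds-Host-hn=gk02.example.org, o=Grid\n"
              "Mds-Host-hn: gk02.example.org\n"
              "Mds-Cpu-Total-count: 16\n")
    hosts = parse_ldif(sample)
    assert len(hosts) == 2 and hosts[0]["Mds-Cpu-Total-count"] == "34"
```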

27
CrossGrid WP4 - International Testbed Organisation
  • Testbed hardware setup
(Diagram: site hardware layout — CA machine, RA web server, GIIS, network monitoring, router, switch on the local network backbone, gatekeeper and worker nodes)

RESOURCES (durable equipment) NEEDED AS SEED FOR SOME PARTNERS
28
CrossGrid WP4 - International Testbed Organisation
  • Preparing to start
  • Benefit from DataGrid experience!
  • First list of resources
  • Contacts for testbed sites
  • Get expertise on Globus
  • Collaborative tools setup
  • Videoconf meetings (first meeting 5th October,
    10 sites)
  • Web Support (www.ifca.unican.es/crossgrid)
  • Documentation

Ready to start at the beginning of 2002!
30
Resources
16 FTE for 3 years
31
Deliverables
  • D4.1 (PM3) Detailed Planning
  • D4.2 (PM6) First testbed on selected sites
  • D4.3 (PM9) Status Report
  • D4.4 (PM10) 1st testbed prototype
  • D4.5-7 (PM15,21,30) Status Report
  • D4.8 (PM33) Final testbed
  • D4.9 (PM36) Final demo report

32
Links with GRID projects
  • Strong link with DATAGRID
  • Complementary middleware
  • Extension of testbed, interoperability
  • Support to prepare project! Thanks!
  • National GRID initiatives
  • Poland, Ireland, Spain...
  • DATATAG
  • Support in applications with possible
    collaboration in USA
  • Presence in GRIDSTART
  • Participation in GGF

33
Current Activities and Plans
  • Exploratory work on
  • Distributed O/R DBMS use
  • Data-mining techniques (NN) parallelization using
    MPI
  • Collaborative tools setup
  • Videoconf meetings
  • Web Support
  • Documentation
  • Preparing Negotiation (Annex I)

Start at the beginning of 2002!
34
Interactive Simulation Visualisation of a
Biomedical System
  • pre-treatment planning in vascular interventional
    and surgical procedures through real-time
    interactive simulation of vascular structure and
    flow

GRID
Distributed real-time simulation environment
  • The user can change the structure of the
    arteries, thus mimicking an interventional or
    surgical procedure
  • Developed by the University of Amsterdam, in
    collaboration with the Leiden University Medical
    Center
  • Distributed Visualisation System GVK (University
    of Linz)

35
Flooding Crisis Team Support
  • Virtual organisation for flood prevention and
    protection
  • Key points
  • Numerical flood modelling
  • Near real-time response
  • Tasks
  • Distributed data collection
  • Distributed simulation and data analysis
  • Distributed access, support for Virtual
    Organisation
  • System integration
  • Leadership II SAS Bratislava

36
Flooding Crisis Team Support Model
37
Weather forecast and air pollution modelling
  • ICM (Warsaw), CSIC (Santander, Valencia), USC
    (Santiago)
  • (strong links to Meteo community)
  • Integration of distributed databases into GRID
  • Operational data
  • Reanalysis DB (numerical weather prediction model
    integrated over decades)
  • Migration of data-mining algorithms
  • Linear and non-linear correlation methods
  • Neural networks
  • Air pollution model coupled to weather
    prediction
  • Based on photochemical grid model (STEM II)
  • Visualisation using GVK
  • Testing and demonstration
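The linear-correlation methods above amount, in the simplest case, to computing Pearson's r between a model variable and an observed series; a self-contained version (the numbers below are invented):

```python
# Pearson correlation between two equal-length series, e.g. a forecast
# variable against a station measurement.
import math

def pearson(x, y):
    """Pearson correlation coefficient r in [-1, 1]."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

if __name__ == "__main__":
    forecast = [10.0, 12.5, 11.0, 15.0, 14.0]   # e.g. predicted temperature
    observed = [10.5, 12.0, 11.5, 14.5, 14.5]   # e.g. station measurement
    assert 0.9 < pearson(forecast, observed) <= 1.0
```

The GRID angle is that each series may live in a different distributed database (operational data vs. the reanalysis DB), so even this trivial statistic requires the data-access services above.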

38
WP2 GRID Application Programming Environment
  • Facilitate development and tuning of parallel
    distributed interactive applications on the GRID
  • MPI profiling and verification (USTUTT, CSIC)
  • Metrics and benchmarks (UCY, TUM)
  • Interactive performance evaluation tools
    (CYFRONET, USC)
  • Integration and coordination: FZK
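A simple flavour of the MPI-use verification above is checking a trace for unmatched point-to-point operations: every send should pair with a receive on the same (source, destination, tag). The trace format below is invented for illustration, not the project's tool:

```python
# Count send/recv imbalances per (src, dst, tag) channel in an event
# trace; any non-zero balance flags a potential mismatch or deadlock.
from collections import Counter

def unmatched(trace):
    """trace: list of ('send'|'recv', src, dst, tag) events.
    Returns a Counter of channels with a non-zero send/recv imbalance."""
    balance = Counter()
    for op, src, dst, tag in trace:
        balance[(src, dst, tag)] += 1 if op == 'send' else -1
    return Counter({k: v for k, v in balance.items() if v})

if __name__ == "__main__":
    good = [('send', 0, 1, 7), ('recv', 0, 1, 7)]
    bad = [('send', 0, 1, 7), ('recv', 0, 1, 8)]   # tag mismatch
    assert not unmatched(good)
    assert unmatched(bad) == Counter({(0, 1, 7): 1, (0, 1, 8): -1})
```

Real verifiers also check datatype and count agreement and collective-call ordering, but the bookkeeping idea is the same.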

39
WP3 new GRID services and tools
  • User friendly GRID (PSNC UCY DATAMAT ALGO)
  • Portals
  • Roaming access
  • GRID resource management (UAB, CSIC)
  • Based on self-adaptive scheduling agents
  • Take into account dynamic information from
    monitoring
  • Monitoring (TCD, CYFRONET)
  • Service managers, local monitors, application
    monitors
  • Optimisation of data access (CYFRONET)
  • Expert system proposing data migration
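The self-adaptive scheduling idea above can be sketched as a score that trades data locality (do the input files already sit at the site?) against the load reported by monitoring. The weighting and the site records are assumptions for illustration:

```python
# Rank candidate sites for a job by a locality-minus-load score, the
# kind of dynamic decision the scheduling agents above would make.
def choose_site(sites, needed_files, locality_weight=2.0):
    """sites: list of dicts {'name', 'files': set, 'load': float in [0,1]}.
    Returns the name of the best-scoring site."""
    def score(site):
        # fraction of the job's input files already present at the site
        local = len(needed_files & site['files']) / max(1, len(needed_files))
        return locality_weight * local - site['load']
    return max(sites, key=score)['name']

if __name__ == "__main__":
    sites = [
        {'name': 'fzk',  'files': {'run1.root'},              'load': 0.2},
        {'name': 'csic', 'files': {'run1.root', 'run2.root'}, 'load': 0.6},
        {'name': 'lip',  'files': set(),                      'load': 0.0},
    ]
    # full data locality at csic outweighs its higher load
    assert choose_site(sites, {'run1.root', 'run2.root'}) == 'csic'
```

An expert system proposing data migration is the dual of this: instead of moving the job to the data, it notices recurring misses and moves the data.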

40
WP3 support for other WPs
41
Portal