What Makes Parallel Programming Hard?
(Presentation transcript)
1
What Makes Parallel Programming Hard?
  • Finding independent tasks
  • Mapping tasks to threads and scheduling them
  • Defining & implementing synchronization protocols
  • Race conditions & deadlock avoidance (see the
    sketch after this list)
  • Dealing with the memory model
  • Locality management
  • Scalability
  • Portable & predictable performance
  • Composing parallel tasks
  • Recovering from errors
  • And, of course, all the single-thread issues
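
To make the race-condition bullet concrete, here is a minimal C++ sketch (my illustration, not from the slides): two threads increment a shared counter with no synchronization, and the non-atomic read-modify-write loses updates.

```cpp
#include <iostream>
#include <thread>

int counter = 0;  // shared and unprotected

void increment_many(int n) {
    for (int i = 0; i < n; ++i)
        ++counter;  // read-modify-write: not atomic, races with the other thread
}

int main() {
    std::thread t1(increment_many, 1000000);
    std::thread t2(increment_many, 1000000);
    t1.join();
    t2.join();
    // Expected 2000000, but lost updates typically yield less,
    // and the result changes from run to run.
    std::cout << counter << '\n';
}
```

Repeated runs typically print different values below 2000000; the atomic-task sketch under slide 5 shows one remedy.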

2
How do We Move Forward with Parallelism?
3
Towards Universal Parallelism
  • Very-high-level programming
  • Domain-specific
  • Easy to write, debug, optimize, and evolve, …
  • Declarative (sketched after this list)
  • Programmer provides algorithmic knowledge about
    parallelism, locality, …
  • Open questions
  • How many and which?
  • Language vs. API?
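
As an illustrative (and deliberately modern) example of the declarative style, C++17's parallel algorithms let the programmer state algorithmic knowledge, element-wise independence and associativity, and leave thread counts and scheduling to the runtime. This is a sketch of the idea, not a language prescribed by the slides.

```cpp
#include <algorithm>
#include <execution>
#include <iostream>
#include <numeric>
#include <vector>

int main() {
    std::vector<double> v(1'000'000, 1.0);

    // Declarative: std::execution::par asserts that elements can be
    // processed independently; the runtime decides how to parallelize.
    std::transform(std::execution::par, v.begin(), v.end(), v.begin(),
                   [](double x) { return 2.0 * x; });

    // Choosing reduce (rather than a sequential loop) declares the
    // operation associative, enabling a parallel reduction tree.
    double sum = std::reduce(std::execution::par, v.begin(), v.end(), 0.0);
    std::cout << sum << '\n';  // 2e+06
}
```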

(Slide diagram: Domain-Specific Languages / Common Language Environment for Parallelism / Efficient Parallel Architectures)
4
Towards Universal Parallelism
  • Compiler & runtime support for common-case
    patterns
  • Manage parallelism & locality
  • Exploit explicit, regular patterns (e.g., the
    OpenMP sketch after this list)
  • Manage and schedule implicit and irregular
    patterns
  • Utilize programmer hints and dynamic monitoring
  • Open questions
  • Dealing with hierarchy and heterogeneity
  • Dealing with irregular computations
  • Low-level languages
  • Automated optimizations and tuning
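
A hypothetical illustration of an explicit, regular pattern that a compiler and runtime can manage: an OpenMP parallel-for over a dense array, where a single pragma hands parallelization and scheduling to the toolchain. My example, not the slides'.

```cpp
#include <cstdio>
#include <vector>

int main() {
    const int n = 1 << 20;
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n);

    // Explicit, regular pattern: independent iterations over dense arrays.
    // The compiler and OpenMP runtime choose the thread count;
    // schedule(static) hints that the work is evenly sized.
    #pragma omp parallel for schedule(static)
    for (int i = 0; i < n; ++i)
        c[i] = a[i] + b[i];

    std::printf("c[0] = %f\n", c[0]);  // 3.000000
}
```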

5
Towards Universal Parallelism
  • Support for critical features
  • 100s of concurrent threads
  • Multi-granular parallelism
  • Low-overhead fork and synchronization
  • Atomic and isolated tasks (see the std::atomic
    sketch after this list)
  • Caching & bulk transfers
  • Accurate monitoring
  • Open questions
  • Homogeneous vs. heterogeneous?
  • Mix of resources?
  • Scalability issues?
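
To ground the "atomic and isolated tasks" feature, here is a minimal sketch using C++11 std::atomic as a stand-in for hardware atomic support (my example); it fixes the lost-update race from the slide-1 sketch without a lock.

```cpp
#include <atomic>
#include <iostream>
#include <thread>

// Hardware-backed atomic read-modify-write: no lock, no lost updates.
std::atomic<int> counter{0};

void increment_many(int n) {
    for (int i = 0; i < n; ++i)
        counter.fetch_add(1, std::memory_order_relaxed);  // atomic increment
}

int main() {
    std::thread t1(increment_many, 1000000);
    std::thread t2(increment_many, 1000000);
    t1.join();
    t2.join();
    std::cout << counter.load() << '\n';  // always 2000000
}
```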

6
The Goal for this Class
  • Develop, improve, or replace this vision
  • Provide the seeds for our research agenda for the
    next few years

7
So, Where do We Start From?
8
1st Assignment: Domain-Specific Languages for
Parallelism
  • The work
  • Pick an important application domain
  • One from our list or one you are passionate about
  • Propose a (parallel) language for this domain OR
  • Study the (parallel) languages available for this
    domain
  • What patterns does it cover?
  • How effective is this programming language?
  • Study the implications on the rest of the system
  • Parallelism/locality patterns, scheduling & HW
    implications, …
  • Bring up advantages, challenges, questions,
    experiments, …
  • The rules
  • Work in groups of two
  • Prepare a 10-minute talk for 4/19
  • Prepare a longer talk to make available online
  • Include important references and links

9
Application Domains & Languages
  • Examples
  • Enterprise data management: SQL, Focus, …
  • Web services: Ajax, Ruby, …
  • Scientific computing: MatLab, BLAS, …
  • Statistical analysis: R, SAS, …
  • Network processing: Click, Shangri-La, FPL, …
  • DSP: Simulink, StreamIt, …
  • Probabilistic computing & machine learning:
    MapReduce, …
  • Graphics & modeling: OpenGL, DirectX, Cg, …
  • Algebraic computations: SETL, Q, …
  • Important notes
  • Feel free to propose your favorite domain(s)
  • Focus on high-level
    languages/models/libraries/APIs
  • Some languages span multiple application domains

10
How to Evaluate a Programming Language
  • Some important dimensions to consider (credit:
    Thomas Green)
  • Closeness of mapping: how well does it map onto
    the problem domain?
  • Abstraction gradient: how much abstraction is
    possible?
  • Hard mental operations: does the notation lead
    you to complex combinations of primitive
    operations?
  • Terseness: how succinct is the language?
  • Premature commitment: does the notation constrain
    the order/way you do things? (a.k.a. imposed
    look-ahead)
  • Consistency: similar semantics implied by similar
    syntax. Can you guess one part of the notation
    given other parts?
  • Error proneness: how easy is it to make mistakes?
  • Progressive evaluation: can you check a program
    while incomplete? Can parallelism be added
    incrementally?
  • Viscosity: how hard is it to introduce small
    changes?
  • Hidden dependencies: does a change in one part of
    a program cause other parts to change in ways not
    overtly apparent in the program text?
  • Additional dimensions for parallel languages
    (credit: Tim Mattson)
  • Hardware visibility: is a useful cost model
    exposed to the programmer? How?
  • Portability: does the notation assume constraints
    on the hardware?

11
How to Map, Manage, and Support a High-Level Parallel Language
  • Some questions to consider
  • Are parallelism and memory/communication patterns
    explicit or implicit?
  • Are parallelism and memory/communication patterns
    regular or irregular?
  • What is the granularity of parallelism and of the
    memory/communication patterns?
  • Is dynamic scheduling or locality management
    needed?
  • What are the granularity & frequency of
    synchronization events?
  • What kind of hardware model does it imply?
  • Processor, memory, interconnect, …
  • What are the critical parameters to monitor for
    optimizations?
  • What are the scaling issues?
  • Note
  • These issues depend on both the language and the
    app
  • Nevertheless, the language often forces some
    model

12
Good Luck and See You Next Week
  • Don't forget the reading assignment for next week