Title: Computational Methods in Physics PHYS 3437
1. Computational Methods in Physics PHYS 3437
- Dr Rob Thacker
- Dept of Astronomy & Physics (MM-301C)
- thacker_at_ap.smu.ca
2. Things to think about
- Only an incredibly small number of problems can be solved analytically
- Sometimes linearising equations (to make them analytically tractable) throws away the non-linear behaviour we want to study
- Dimensional reduction (e.g. assuming spherical symmetry) is usually necessary to make an analytic solution possible
- In many cases (but not all) this throws away important information about the problem
- Despite what you may think in relation to global warming, computers are incredibly efficient at arithmetic
3. Example of a state-of-the-art computational method
4. How long would these simulations take if I did them with pen and paper?
- Let's say I can do one calculation every 5 seconds
- The total number of calculations required was about 10^18
- So I would need 5×10^18 seconds!
- How long is this?
- Age of the Universe: 13.7 billion years
- That is about 4.3×10^17 seconds
- So I would need ten times longer than the age of the Universe to do this calculation by hand!
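A quick check of the arithmetic above, using the slide's assumed rate of one calculation every 5 seconds:

```latex
t_{\rm hand} = 10^{18} \times 5\ {\rm s} = 5\times10^{18}\ {\rm s},
\qquad
\frac{t_{\rm hand}}{t_{\rm Universe}} = \frac{5\times10^{18}\ {\rm s}}{4.3\times10^{17}\ {\rm s}} \approx 12
```

so the hand calculation takes roughly ten (more precisely about a dozen) ages of the Universe.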
5. Some other interesting numbers
- If I can write 20 calculations per sheet of paper...
- I would need 6×10^14 workbooks!
- This would be a stack that reaches from the Sun to the orbit of Pluto!
- But what about the ink I would need?
- Let's say I use 1 pen per workbook
- Each pen contains 1/10000th of a litre
- Works out to 6×10^10 litres, which is 5 times the volume of the outer Halifax Harbour!
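The ink figure follows directly from the slide's two assumptions (one pen per workbook, 10^-4 litres per pen):

```latex
V_{\rm ink} = 6\times10^{14}\ {\rm workbooks} \times 10^{-4}\ {\rm L\ per\ workbook} = 6\times10^{10}\ {\rm L}
```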
6. So computers must be really energy efficient!
- You bet!
- Energy is measured in joules
- One joule of energy is enough to power a 60 W bulb for 1/60th of a second
- Each mathematical operation required only 0.00000002 joules (2×10^-8 J), which is reeeeeeally small!
- The total amount of energy for the entire calculation was only 2×10^10 J
- How much is that?
- Per month, an average university consumes about 4×10^12 J, so this is only 0.5% of an average university's monthly power consumption
But computer companies still need to make computers more energy efficient!
You probably use around 3×10^9 J in your house per month
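Putting the numbers together (the same 10^18 operations as before):

```latex
E_{\rm total} = 10^{18}\ {\rm ops} \times 2\times10^{-8}\ {\rm J/op} = 2\times10^{10}\ {\rm J},
\qquad
\frac{2\times10^{10}\ {\rm J}}{4\times10^{12}\ {\rm J}} = 0.005 = 0.5\%
```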
7. A few questions
- Who has worked with UNIX before?
- Who has worked with Excel, Maple, Mathematica or MathCad before?
- Who has done any programming before?
- What languages?
8. My thoughts on computational techniques
- Programming isn't that difficult once you learn to think in a structured fashion
- If you have a very free-form approach to problem solving you may find programming hard
- My advice: learn to plan things out
- Most numerical methods are comparatively straightforward to learn
- Of course there are some exceptionally complex algorithms, but most of the time we won't need these
- You can get a long way relying on algorithms, but to be an effective researcher/student you must understand what is going on
- Algorithms are a method for understanding the underlying physical problem; don't lose sight of the fact that we are doing physics
9. Programming
- Programming is best learnt by practice rather than watching
- What would you think of someone who said they were going to become a concert pianist by attending as many concerts as they could?
- There is no substitute for doing the work and gaining the experience you need
- It isn't that hard!
10. The course project (and presentation)
- You are required to write a report 5-10 pages in length, and give a 15 minute presentation on a numerical problem
- I'll put possible examples on the website and give out Dr Clarke's handout from previous years
- Reports will be assessed by me, presentations will be assessed by your peers
- Reports should be written in clear and precise English in the paper format using LaTeX; if you do not possess good writing skills you need to develop them now!
- I will provide advice on projects, but only a very limited amount of help
- The project must be your own work
- You must also learn how to manage your time! Don't leave things until the last minute!
11. Presentation
- Half of your project grade will come from your presentation
- You may use overheads, the whiteboard or PowerPoint as you see fit
- You need to outline what you worked on, show that algorithms/codes worked as necessary and provide results
- If you have seen seminars/colloquia given by faculty, that should give some idea of what is expected
- If you haven't, then I strongly advise you to attend them!
12. Course Goals
- To provide all students with a knowledge of:
- (0) Computer design fundamentals
- (1) Both programming and development skills
- (2) Numerical methods and algorithms
- (3) Basic ideas behind parallel programming and visualization
- Ensure students are comfortable with numerical methods, and know where to go to find the appropriate tools
13. Course Website
- www.ap.smu.ca/thacker/teaching/3437/
- Course outline and any news
- Lecture notes will be posted there in ppt format
- Homeworks and relevant source code will also be
posted there
14. Statement on Plagiarism
- ZERO TOLERANCE
- See the university's statement on this issue
- This is particularly relevant to code development, as codes to solve a given problem may be available on the web
- If I want you to use codes as building blocks then I will make it clear in any questions I set
15. Course Outline
- 0. Overview of Computational Science
- - Introduction to UNIX, FORTRAN and programming
- - Fundamentals of computer architecture and operation
- - Binary representations of data, floating point arithmetic
- 1. Numerical Methods
- - Functions and roots
- - Interpolation and approximation
- - Numerical integration
- - Ordinary differential equations
- - Monte-Carlo methods
- - Discrete (Fast) Fourier transforms
- 2. Parallel programming and visualization of numerical data
- 3. Student Presentations
16. Marking Scheme
- 5 assignments: 50%
- Set approximately every two weeks, with two weeks allowed for completion
- Project and presentation: 30%
- Write-up (15%)
- Presentation (15%)
- 2 hour conceptual final exam (no programming): 20%
17. Books
- The following are excellent resources:
- (1) Numerical Recipes (note some algorithms are somewhat dated; choose your favourite language)
- (2) The Art of Computer Programming (Knuth, 4 volumes, very expensive but an excellent resource)
- (3) The Visualization Toolkit
18. Lecture 0: a few notes on computational science
- Computational science relies upon computing as a tool to solve problems that are difficult or intractable by other methods
- Computing as a tool focuses the approach:
- (1) Time to solution is important
- (2) The exact method chosen depends on many factors (including human ones)
- (3) Generating new approaches and creative solutions is important due to (1), but is not a goal in itself
Getting science done is the goal!
19. Why is Unix (Linux) so popular for computational science?
- Cost and flexibility
- It is not because of the availability of software, but rather the open-source approach
- Geek sensibilities are a non-issue
- Windows is a viable platform for computational science if:
- You can afford the software you want
- You can accept there will be pieces of software you can never see the source code to
Ultimately you should choose the best tool for the job
20. Computation as a third pillar of science
[Diagram: the traditional view rests on two pillars, Theory and Observation/Experiment; the modern view adds Computation as a third pillar]
21. CSE as a discipline
22. Today's Lecture
- Fundamentals of computer design
- von Neumann architecture
- Memory
- Process Unit
- Control Unit
- Instruction cycle
- Five generations of computer design
- Moore's Law
23. Definition of a computer
- Definition has evolved out of Turing's (circa 1930s) discussions
- Takes input
- Produces output
- Capable of processing input somehow
- Must have an information store
- Must have some method of controlling its actions
24. Birth of Electronic Computing
- Prior to 1940 a computer was assumed to be a person
- Turing (1936): definition of the Turing machine (formalized notion of algorithm execution)
- Atanasoff-Berry (1938) and Zuse (Z-series, 1941) both developed electronic computing machines with mechanical assistance (e.g. rotating memory drums)
- Colossus (UK): first all-electronic configurable computer (1943)
- ENIAC (1945): arguably the first fully programmable electronic computer
The first generation of computers used vacuum tubes
25. Key events, 1953-1974
- 1953: First transistor-based computer (the Manchester Transistor Computer)
- 1954: FORmula TRANslation language (first high-level language) developed at IBM by Backus
- 1964: IBM System/360, first integrated circuit (IC) computer
- 1965: Moore's Law published by Gordon Moore
2nd Generation (1953-64)
3rd Generation (1964-72)
26. Moore's Law: transistor counts grow exponentially
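A common quantitative reading of the law (my paraphrase; the doubling time is usually quoted as roughly 18-24 months): if N is the transistor count on a chip,

```latex
N(t) \approx N_0 \, 2^{(t - t_0)/T}, \qquad T \approx 2\ {\rm years}
```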
27. Power density: a side effect of Moore's Law
Graph by Dr Avi Mendelson, Intel
Heat is not distributed evenly across the chip (there are hot spots). Power consumption is already a significant market factor.
28. 1972-present
- 1972: 8008 microprocessor released by Intel; uses very large scale integration on an IC (3500 transistors, 8 bits)
- The C programming language was developed by Ritchie the same year
- 1985: Intel 386, first 32-bit Intel processor
- 1992: First commercially available 64-bit processor, the DEC Alpha 21064
4th Generation (1972-now)
29. Fifth Generation?
- All modern computers are similar in principle to the 8008, just scaled up enormously in size
- Some historians claim that the fifth generation of machines is constituted by the appearance of parallel computers in the early 1990s
- Japan invested heavily in such machines in the 1980s
- Others argue that the appearance of AI-based computers will be the next step
30. Humour: famous computer gaffes
- 1943: "I think there is a world market for maybe five computers" - Thomas Watson (IBM Chairman)
- 1950: "We'll have to think up bigger problems if we want to keep them busy" - Howard Aiken
- 1957: "I have travelled the length and breadth of this country and talked with the best people, and I can assure you that data processing is a fad that won't last out the year" - Prentice Hall business books editor
- 1977: "There is no reason anyone would want a computer in their home" - Ken Olsen (DEC Chairman)
- 1981: "640K ought to be enough for anybody" - Bill Gates
31. Latency and Bandwidth
- Latency
- The time taken for a message to get from A to B
- Typically used when discussing the time taken for
a piece of information in memory to get to the
central processing unit (CPU)
- Bandwidth
- The amount of data that can be passed from point
A to B in a fixed time - Typically used when discussing how much
information can be transferred from memory to the
CPU in a second
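A standard back-of-the-envelope way to combine the two (a sketch, not from the slides): the time to move N bytes is roughly the latency plus N divided by the bandwidth B,

```latex
t_{\rm transfer} \approx t_{\rm latency} + \frac{N_{\rm bytes}}{B}
```

For example, moving 64 bytes over a link with 100 ns latency and 1 GB/s bandwidth takes about 100 ns + 64 ns = 164 ns; for small transfers the latency term dominates.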
32. Bits, Bytes and Nybbles
- Just to recap: 1 bit is a zero or one, usually (but not always) represented by a lower-case b
- Byte: 8 bits, usually represented by a capital B
- Nybble: 4 bits
- Things can be slightly confusing when people talk about bandwidth in Mb/s
- Do they mean megabits per second or megabytes per second? Most of the time it is megabytes
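The conversion is just a factor of 8 (my example numbers):

```latex
100\ {\rm MB/s} = 800\ {\rm Mb/s}, \qquad 100\ {\rm Mb/s} = 12.5\ {\rm MB/s}
```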
33. von Neumann architecture
Machine instructions are encoded in binary and stored in memory: the key insight!
- First practical stored-program architecture
- Still in use today
- Speed is limited by the bandwidth of data between memory and processing unit: the von Neumann bottleneck
[Diagram: INPUT and OUTPUT connect to the CPU (control unit plus process unit), which exchanges both program and data with a single MEMORY]
Developed while working on the EDVAC design
34. Instruction encoding
- Machine instructions are represented by binary codes
- Encoding of instructions is extremely important in CPU design
- No two instructions can have the same value
- Each binary value must be decoded
- For k bits in each instruction word we require k lines of incoming circuitry
- However, there are 2^k possible instructions, so one naively requires 2^k output lines
- Gets expensive very quickly!
- Instruction Set Architecture (ISA): a computer's instructions and their formats
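To put a number on it (my example), with an 8-bit instruction word:

```latex
k = 8 \ \Rightarrow\ 2^{k} = 256\ {\rm possible\ instructions}
```

so a naive decoder with one output line per instruction turns 8 incoming lines into 256 outgoing ones.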
35. von Neumann bottleneck
- This has become a real factor in modern computer design
- In a landmark paper in 1994, Wulf and McKee pointed out that next-generation computers would be seriously limited unless more bandwidth was made available
- Witness the push for higher-bandwidth memory in modern computers
36. Memory
- Memory is divided into cells; each cell has an address and contents
- Address: n-bit identifier describing the location
- Addresses are unique
- Contents: m-bit value stored at the given address
- k x m array of stored bits (k is usually 2^n)
- Actions:
- Read a word from memory: LOAD
- Write a word to memory: STORE
[Diagram: a column of cells with 4-bit addresses 0000-1111; e.g. one cell holds 01110010 and another holds 10100010]
Remember: both the program and the data it operates on have to be read from memory
Technically a cell is a single bit of memory, but we'll use the term to mean a single memory unit
37. Memory Registers
- A register is an extremely fast memory location, close to the circuitry of the control and process units
- Small number of special-purpose cells; they have a specific function
- Can store addresses, data or instructions
- Example: registers used in memory access
- Memory Address Register (MAR)
- Holds the address of a cell to access
- Same size in bits as the address length
- Memory Data Register (MDR)
- Holds the content of the cell that was fetched or that is to be stored
- Same size in bits as the cell contents
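A minimal sketch of the idea in Fortran (my own toy illustration, not from the lecture): memory is an array indexed by address, the MAR holds the address, and the MDR receives or supplies the cell contents.

```fortran
! Toy model of memory access via MAR/MDR (illustrative only).
program mar_mdr_demo
  implicit none
  integer, parameter :: n = 4            ! address width in bits
  integer :: memory(0:2**n - 1)          ! 2**n cells
  integer :: mar, mdr

  memory = 0
  memory(3) = 114                        ! cell 0011 holds 01110010 (decimal 114)

  ! LOAD: place the address in the MAR, the contents arrive in the MDR
  mar = 3
  mdr = memory(mar)
  print *, 'cell', mar, 'contains', mdr

  ! STORE: place the address in the MAR and the data in the MDR, then write
  mar = 5
  mdr = 162                              ! 10100010 in binary
  memory(mar) = mdr
end program mar_mdr_demo
```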
38. MAR/MDR Memory (logical diagram)
[Diagram: the MAR feeds an address decoder that switches on one of the 2^n address lines, activating a single row of memory cells; the MDR can then read or write the m bit lines of the active row]
39. Process Unit
- The PU can consist of many specialized units
- Arithmetic Logic Unit (ALU)
- floating point arithmetic or specialized fast square roots
- Needs some kind of small local storage to hold the data it operates on, so uses registers
- Some modern CPUs have hundreds of registers
- Data is stored in words, with a distinct word size
- the number of bits normally processed by the ALU in one instruction
40. Control Unit
- Oversees execution of the program
- Responsible for reading instructions from memory
- Directly interprets the instruction, and informs the execution unit what to do
- Two important parts of the CU:
- Instruction Register (IR): holds the current instruction
- Program Counter (PC): holds the address of the next instruction to be executed
- The PC is a very useful tool for debugging: it tells you exactly what is being done at a given time
41. Execution sequence
Instruction fetch from memory: the PC gives the address, the instruction is stored in the IR
Decode instruction
Evaluate address
Fetch operands from memory
Execute operation on the PU
Store result
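To make the cycle concrete, here is a toy sketch in Fortran (the course language). The machine, its opcode/address encoding, and the LOAD/ADD/STORE/HALT instruction names are my own invention for illustration, not a real ISA.

```fortran
! A toy von Neumann machine: 16 memory cells, a program counter (PC),
! an instruction register (IR) and a single accumulator (acc).
program toy_cpu
  implicit none
  integer :: memory(0:15)          ! unified program + data memory
  integer :: pc, ir, acc
  integer :: opcode, address

  memory = 0
  ! Hypothetical encoding: instruction = opcode*16 + address
  ! opcode 1 = LOAD, 2 = ADD, 3 = STORE, 0 = HALT
  memory(0) = 1*16 + 10            ! LOAD  cell 10
  memory(1) = 2*16 + 11            ! ADD   cell 11
  memory(2) = 3*16 + 12            ! STORE cell 12
  memory(3) = 0                    ! HALT
  memory(10) = 7                   ! data
  memory(11) = 5                   ! data

  pc  = 0
  acc = 0
  do
     ir = memory(pc)               ! 1. instruction fetch (PC gives the address)
     pc = pc + 1
     opcode  = ir / 16             ! 2. decode instruction
     address = mod(ir, 16)         ! 3. evaluate address
     select case (opcode)
     case (1)                      ! 4./5. fetch operand and execute LOAD
        acc = memory(address)
     case (2)
        acc = acc + memory(address)
     case (3)                      ! 6. store result
        memory(address) = acc
     case default
        exit                       ! HALT
     end select
  end do

  print *, 'memory(12) =', memory(12)
end program toy_cpu
```

Running it prints memory(12) = 12, i.e. 7 + 5 computed entirely by the fetch-decode-execute loop.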
42. On-die cache is the reason why Moore's Law is still valid
43. Bus-based architectures
[Diagram: PC design circa 1998 - the CPU and memory connect through a PCI bridge to the PCI bus, which carries the video, SCSI controller and network card; an ISA bridge links the PCI bus to the slower ISA bus, which carries the sound card and modem]
A single bus is insufficient for modern machines: multiple buses are used to manage both I/O and data processing
44. (Symmetric) Multiprocessor machines
[Diagram: four CPUs sharing a single bus to one memory]
Traditional shared-memory design: all processors share a memory bus. This is becoming less popular since it is extremely hard to design buses with sufficient bandwidth, although smaller 2-4 processor Intel systems use this design.
45. Summary of lecture 1
- Turing developed the fundamental basis on which computation is now discussed
- In the von Neumann architecture the fundamental components are input/output, processing unit, and memory
- Key insight: the program can be stored as software
- Modern systems use numerous external buses, connected via bridges
46. Next lecture