Title: Evaluating Design Alternatives.
1. EST 200, Evaluating Design Alternatives
2. Contents
- Decision Making.
- Choosing the Best Alternative.
- Numerical Evaluation Matrix.
- Priority Checkmark Method.
- Best-of-Class Chart.
- Design Evaluation.
3. Decision Making
- A selection process.
- The best or most suitable course of action finalized from among several available alternatives.
- Intellectual and goal oriented.
- Involves imagination, reasoning, evaluation and judgement.
- Involves commitment of resources.
4. Effectiveness of decision making is enhanced by participation.
5. "Our job is to give the client, on time and on cost, not what he wants, but what he never dreamed he wanted; and when he gets it, he recognises it as something he wanted all the time."
- Sir Denys Louis Lasdun, Architect
6. Choosing the Best Alternative
- Resources are limited, e.g. time, money, expertise, etc.
- Rarely are the resources available to fully develop more than one design scheme, never mind all of our alternatives.
- Must choose the best alternative for further elaboration, testing, and evaluation.
7. Choosing the Best Alternative
- Limit the analysis to the client's most important objectives.
- Avoid drowning useful information in a sea of relatively unimportant data.
- Establish metrics with a common sense of scale.
- Do not mistakenly over- or under-emphasize some results.
8. Choosing the Best Alternative
- Information must necessarily reflect a fair amount of subjectivity.
- Many of the metrics reflect qualitative results rather than measurable, quantitative ones.
- Metrics should be thought of more as indicating a clear sense of direction than as an algorithm or numerical solution.
9. Choosing the Best Alternative
- Check that each alternative satisfies all of the applicable constraints.
- Design alternatives that don't meet the constraints are considered infeasible.
- Three selection methods link design alternatives to ordered, unweighted design objectives.
10. Pareto Optimality
- Named after Vilfredo Pareto.
- Pareto improvement: resources can be re-allocated to make at least one person better off without making any other individual worse off.
- Pareto optimal / Pareto efficient: a situation where no individual or preference criterion can be made better off without making at least one other individual or preference criterion worse off, i.e. there is no scope for further Pareto improvement.
11. [Figure: Pareto frontier. Points A and B below the upper limit are Pareto inefficient; a Pareto improvement is possible since total output can increase. Points C and D on the frontier are Pareto efficient; no improvement is possible.]
12. Pareto Efficiency
- Implies resources are allocated in the most economically efficient manner.
- Does not imply equality or fairness.
- Resources cannot be reallocated to make one individual better off without making at least one other individual worse off.
- An economy is in a Pareto optimum state when no economic change can make one individual better off without making at least one other individual worse off (a dominance-check sketch follows this slide).
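The Pareto test described on these slides can be applied mechanically once each alternative has a metric score for every objective. The following Python sketch is illustrative only: the design names, objectives, and scores are hypothetical, and it assumes higher scores are better. An alternative is flagged Pareto optimal when no other alternative is at least as good on every objective and strictly better on at least one.

```python
# Hypothetical scores: higher is better for every objective.
scores = {
    "Design A": {"cost": 7, "durability": 8, "ease of use": 6},
    "Design B": {"cost": 5, "durability": 8, "ease of use": 6},
    "Design C": {"cost": 6, "durability": 9, "ease of use": 7},
}

def dominates(a, b):
    """True if `a` is at least as good as `b` on every objective
    and strictly better on at least one."""
    at_least_as_good = all(a[k] >= b[k] for k in a)
    strictly_better = any(a[k] > b[k] for k in a)
    return at_least_as_good and strictly_better

def pareto_optimal(scores):
    """Return the alternatives that no other alternative dominates."""
    return [
        name for name, s in scores.items()
        if not any(dominates(other, s)
                   for other_name, other in scores.items()
                   if other_name != name)
    ]

print(pareto_optimal(scores))  # Design B is dominated, so only A and C remain
```

In the hypothetical data, Design B is dominated by Design A, so only Designs A and C survive as candidates for further development.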
13. Methods of Choosing an Alternative
- Three selection methods:
  - numerical evaluation matrix
  - priority checkmark method
  - best-of-class chart
- Ordered objectives cannot be scaled on a mathematically meaningful ruler.
- May bring order to judgments and assessments that are subjective at their root.
14. Numerical Evaluation Matrix
- Constraints (upper rows) and objectives (lower rows) are listed in the left-hand column.
- Limit the number of decisive objectives to the top two or three.
- It is difficult to mediate among more than two or three objectives at one time.
- The matrix reflects the application of the metrics to the design alternatives.
- Use it to see if one design is Pareto optimal: superior in one or more dimensions, and at least equal in all the others.
15. Numerical Evaluation Matrix
- The values can be used to work with the client (and perhaps users) to revisit the objectives.
- The client may change their mind about the relative rankings to get a very strong winner.
- Metrics and associated testing procedures should not change with whoever is applying them or making the measurements (a worked scoring sketch follows the juice container example).
16. Numerical Evaluation Matrix - Juice Container Design
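To make the matrix layout concrete, here is a small Python sketch of a numerical evaluation matrix: constraints in the upper rows act as pass/fail screens, and objectives in the lower rows carry metric scores for each alternative. All names and numbers are invented for illustration; they are not the juice container figures.

```python
# Hypothetical numerical evaluation matrix: constraints in the upper
# rows (pass/fail), objectives in the lower rows (metric scores, 0-100).
alternatives = ["Design A", "Design B", "Design C"]

constraints = {                      # True = constraint satisfied
    "meets safety standard":  {"Design A": True, "Design B": True, "Design C": False},
    "unit cost under target": {"Design A": True, "Design B": True, "Design C": True},
}

objectives = {                       # metric scores on a 0-100 scale
    "easy to open":      {"Design A": 80, "Design B": 65, "Design C": 90},
    "preserves flavour": {"Design A": 70, "Design B": 75, "Design C": 85},
    "low material use":  {"Design A": 60, "Design B": 70, "Design C": 55},
}

# Step 1: any alternative that fails a constraint is infeasible.
feasible = [a for a in alternatives
            if all(row[a] for row in constraints.values())]

# Step 2: print the matrix for the feasible alternatives so the team
# (and the client) can compare scores objective by objective.
print("objective".ljust(20) + "".join(a.rjust(12) for a in feasible))
for name, row in objectives.items():
    print(name.ljust(20) + "".join(str(row[a]).rjust(12) for a in feasible))
```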
17. Priority Checkmark Method
- A simpler, qualitative version of the numerical evaluation matrix.
- Easy to use, makes the setting of priorities simple, readily understood by clients and other parties.
- Ranks the objectives as high, medium, or low in priority.
- However, considerable information is lost that may be useful in differentiating between relatively close alternatives.
18. Priority Checkmark Method
- Objectives with high priority are given three checks, those with medium priority two checks, and those with low priority only one check.
- A design alternative that meets an objective in a satisfactory way is marked with one or more checks.
19. Priority Checkmark Method
- Metric results are assigned 1 if they exceed the target value (e.g. 70 points on a 0-100 scale), and 0 if they fall below the target value.
- The choice of a target value (threshold) is very important (a threshold sketch follows the example slide).
- May lead to results that appear to be more disparate than they really are.
20. Priority Checkmark Method - Juice Container Design
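Below is a hedged Python sketch of the priority checkmark bookkeeping: priorities map to one, two, or three checks, each metric result is compared against a target threshold, and an alternative earns an objective's checks only when it clears the threshold. The objectives, priorities, target, and scores are made up; treating a result exactly equal to the target as a pass is a choice the slides leave unstated.

```python
# Hypothetical priority checkmark method. Priorities map to checks:
# high = 3, medium = 2, low = 1. A metric result at or above the target
# scores 1 (the alternative "meets" the objective); below it, 0.
CHECKS = {"high": 3, "medium": 2, "low": 1}
TARGET = 70                        # e.g. 70 points on a 0-100 scale

priorities = {"easy to open": "high",
              "preserves flavour": "medium",
              "low material use": "low"}

results = {                        # raw metric results, 0-100 scale
    "Design A": {"easy to open": 80, "preserves flavour": 68, "low material use": 60},
    "Design B": {"easy to open": 65, "preserves flavour": 75, "low material use": 72},
}

for design, metrics in results.items():
    earned = {obj: CHECKS[priorities[obj]] if metrics[obj] >= TARGET else 0
              for obj in priorities}
    marks = " ".join(f"{obj}: {'✓' * n if n else '-'}" for obj, n in earned.items())
    print(f"{design}: {marks}")
```

Note how Design A's 68 against the 70-point threshold earns no checks even though the result is very close to the target; that is the threshold-sensitivity issue raised on the previous slides.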
21. Best-of-Class Chart
- For each objective, assign scores to each design alternative.
- Score 1 for the alternative that meets that objective best, 2 for the second-best, and so on.
- The alternative that meets the objective worst is given a score equal to the number of alternatives being considered.
22. Best-of-Class Chart
- Two alternatives can be considered "best"; this is handled by splitting the available rankings.
- E.g. two firsts would each get a score of (1 + 2)/2 = 1.5, while a tie for second and third would each get (2 + 3)/2 = 2.5 (a ranking sketch with tie handling follows the example slide).
- The scores help to see if the design is Pareto optimal (best in all categories), or at least best in the most important (i.e., highest-ranked) objectives.
23. Best-of-Class Chart - Juice Container Design
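The tie-splitting rule from the previous slides can be coded directly. In this illustrative Python sketch (the scores are hypothetical, not the juice container data), alternatives are ranked per objective with 1 as best, and tied alternatives share the average of the rank positions they occupy, so two firsts each receive (1 + 2)/2 = 1.5.

```python
# Hypothetical best-of-class chart: rank alternatives per objective,
# 1 = best, with tied alternatives sharing the average of the ranks
# they would have occupied (e.g. two firsts each get (1 + 2)/2 = 1.5).
metric_scores = {                  # higher raw score = better
    "easy to open":      {"Design A": 80, "Design B": 80, "Design C": 60},
    "preserves flavour": {"Design A": 70, "Design B": 85, "Design C": 75},
}

def best_of_class_ranks(scores):
    """Map each alternative to its rank for one objective, splitting ties."""
    ordered = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    ranks, i = {}, 0
    while i < len(ordered):
        j = i
        while j < len(ordered) and ordered[j][1] == ordered[i][1]:
            j += 1                                   # run of tied scores
        shared = sum(range(i + 1, j + 1)) / (j - i)  # average of positions i+1..j
        for name, _ in ordered[i:j]:
            ranks[name] = shared
        i = j
    return ranks

for objective, scores in metric_scores.items():
    print(objective, best_of_class_ranks(scores))
# "easy to open": Design A and Design B tie for first -> 1.5 each, Design C -> 3
```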
24. Best-of-Class Chart - Merits
- Allows us to evaluate alternatives with respect to the results for each metric.
- No binary yes/no decisions.
- Easy to implement and explain.
- Ranking methods allow for qualitative evaluations and judgments.
- Can be done by individual team members or by a design team as a whole.
25. Best-of-Class Chart - Merits
- Helpful if there are many alternatives to choose from.
- Can be used if we want to narrow our consultative and thoughtful process to the top few.
26. Best-of-Class Chart - Drawbacks
- Encourages evaluation based on opinion rather than on testing or actual metrics.
- Shows only the rankings, not the actual scores.
- May lead to a moral hazard akin to that attached to priority checkmarks: the temptation to fudge the results or cook the books.
27. Best-of-Class Chart - Drawbacks
- May not provide information on whether two results are close or not.
- E.g. we do not know if the first and second results are close or not, which could be important information.
28. Design Evaluation
- Design evaluation and selection demand careful, thoughtful judgment.
- Ordinal rankings of the objectives obtained using PCCs (pairwise comparison charts) cannot be meaningfully scaled or weighted.
- We cannot simply sum the results (a worked example follows this slide).
- Use common sense when evaluating results.
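As a small worked illustration of why rank sums mislead, consider two hypothetical alternatives evaluated on two objectives: their rank sums tie even though the underlying metric gaps are very different in size. The numbers are invented purely to make the point.

```python
# Hypothetical metric scores (0-100, higher is better) for two objectives.
metrics = {
    "Design A": {"objective 1": 95, "objective 2": 40},
    "Design B": {"objective 1": 60, "objective 2": 55},
}
# Ordinal ranks: Design A is 1st on objective 1 and 2nd on objective 2;
# Design B is 2nd and 1st. Both rank sums equal 3, so summing ranks
# calls them a tie, yet Design A wins objective 1 by 35 points while
# Design B wins objective 2 by only 15; the ranks discard that gap.
ranks = {
    "Design A": {"objective 1": 1, "objective 2": 2},
    "Design B": {"objective 1": 2, "objective 2": 1},
}
for design in ranks:
    print(design, "rank sum =", sum(ranks[design].values()))
```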
29. Design Evaluation
- Metric results for two alternative designs that are relatively close should be treated as effectively equal, unless there are other unevaluated strengths or weaknesses.
- Results should meet expectations.
- There is no excuse for accepting results blindly and uncritically.
30. Design Evaluation
- If results do meet our expectations, ask whether the evaluation was done fairly.
- The evaluation should not reinforce biases or preconceived ideas.
- Check whether the constraints used to eliminate designs are truly binding.
31. Thank You