Title: Scientific and Technological Advances: Measuring Performance and Value
Slide 1: Scientific and Technological Advances: Measuring Performance and Value
Workshop on Performance Assessment of Public Research, Technology, and Development Programmes
European Commission and the Washington Research Evaluation Network, Brussels, June 17, 2004
parry.norling@comcast.net
Slide 2: Scientific and Technological Advances: Measuring Performance and Value
When and how can we benchmark S&T performance and make meaningful comparisons between countries? Should we seek standardization in the measures and approaches used to make these comparisons?
Slide 3: Scientific and Technological Advances: Measuring Performance and Value
What are the dimensions of performance to be assessed and evaluated? At what levels of performance can measurements be made, nation to nation? What are the different challenges in measuring the performance and value of scientific advances vs. technology development?
Slide 4: Background in Answering These Questions
- Not an expert on evaluation
- A university, then industrial, research chemist; R&D Director; R&D Planning Director at DuPont
- An observer, user, participant, and recipient of evaluations
- A lesson from 12 years ago
Slide 5: Wall St. Journal, 1992
Slide 6: Wall St. Journal Front Page
- Despite spending of more than $13 billion on chemical and related research over the past 10 years, DuPont's 5,000 scientists and engineers were "a technological black hole"
Slide 7: More
- "They sucked in money but, company officials concede, didn't turn out a single all-new blockbuster or even many innovations."
- "The technology is great, but where's the payoff?"
Slide 8: Another Nylon or Teflon?
Timeline: 1935, the slogan "Better things for better living... through chemistry"; 1938, Nylon, Teflon, and Butacite; 1940.
Slide 9: How Do I View Measures from an Industrial R&D Perspective?
- Measures must be linked with an understanding of how S&T works.
- Some approaches in the private sector are similar to those in the public sector.
- Many of the reasons for evaluation are the same.
- Many of the challenges are the same.
Slide 10: S&T/Innovation as a System
[Diagram.] Inputs (people, ideas, equipment, facilities, funds, information) flow into the processing system, the lab, whose activities are research, development, testing, knowledge building, and the like. Its outputs (patents, products, processes, publications, facts, knowledge) pass to a receiving system (marketing, business planning, manufacturing operations), which turns them into outcomes: cost reduction and new businesses, products, and processes, culminating in adoption by society and commercialization. Measurement and feedback occur at three points: in-process, output, and outcome.
(Brown and Svenson, Research-Technology Management, July-Aug. 1988)
Slide 11: S&T/Innovation as a System (same diagram repeated)
Slide 12: S&T/Innovation as a System
The same diagram repeated, now framed by its context: the external and internal environment.
Slide 13: Levels of Performance to Be Evaluated
- Individual research effort
- Studies by a team, network, or community of researchers
- Project
- Group of projects (program) or grouping of research funded by one agency
- Initiative, focused research problem
- Results from an organization or institution
- Advances in a scientific discipline
- National or regional innovation systems
Slide 14: Dimensions of Performance
- Extent to which objectives (expectations, promises, plan) were met or exceeded (efficiency)
- Absolute value or impact of research results; extent to which relevant knowledge has increased (effectiveness, significance, and quality)
- Productivity: output/input (usually over a certain period of time)
- Yield: profits/research result (output)
- Return on investment: profits/financial inputs (a short sketch of these ratios follows)
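Since the last three dimensions are simple ratios, a minimal sketch in Python may help make them concrete. All figures below are hypothetical and purely illustrative; only the formulas (output/input, profits per research result, profits per financial input) come from the slide.

```python
# Illustrative only: hypothetical figures for one R&D portfolio year.
# Formulas follow the slide: productivity = output/input,
# yield = profits per research result, ROI = profits/financial inputs.

rd_spend = 120e6   # financial inputs over the period (hypothetical)
outputs = 45       # research results delivered, e.g. patents plus products
profits = 300e6    # profits attributed to those results (hypothetical)

productivity = outputs / rd_spend      # results per dollar of input
yield_per_result = profits / outputs   # profits per research result
roi = profits / rd_spend               # return on financial inputs

print(f"productivity: {productivity * 1e6:.2f} results per $M")
print(f"yield: ${yield_per_result / 1e6:.1f}M per result")
print(f"ROI: {roi:.1f}x")
```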
Slide 15: Types of Measures/Metrics: What to Measure and How to Measure?
The slide arranges measures in a matrix of innovation stage against time orientation (retrospective, present, prospective):
- Inputs: count and classification of resources used (retrospective); count and classification of resources being used (present); forecasts (prospective)
- In-process: learning histories (retrospective); milestone reviews, panels, economic evaluations, options (present)
- Outputs: peer review, bibliometrics, patents, financial metrics
- Outcomes/impacts: project reviews and histories (retrospective); S&T forecasts, roadmapping, projected sales and income, NPVs (prospective)
Slide 16: Metrics: Inputs/Investments in S&T
- Expenditures for each stage of the innovation process
- Expenditures for each time period
- Distribution by categories of expenditure
- Source of funding
- Comparison of expenditures to competitors and industry averages
- R&D intensity
- Expenditures by discipline, market, or product line
Slide 17: Metrics: Financial
- Cost savings
- Return on investment
- Return on assets
- Price differential based on technology advantages
Slide 18: Commercial/Business Metrics
- New sales ratio
- Projected sales and income
- Profit ratio (profits attributed to S&T results / all sales)
- Market share attributed to S&T results
- Customer satisfaction
- Regulatory compliance
- Quality and reliability
- Response time
- Proprietary sales and revenue ratio
- Value of in-process research
- Value of products in the pipeline
- Net present value
Slide 19: Bibliometrics
- Publications, as ratios to the investments in S&T that generated them
- Citation analysis
- Co-word analysis and database tomography
- Special presentations and honors
- Science Model (citation mining)
Slide 20: Patents
- Count of patents; ratio of count to resources
- Relevant/embodied patents
- Comparative patents
- Patent map (patents by technology)
- Cost of patents
- Value of intellectual property and intangible assets
Slide 21: Peer Review Metrics
- Internal evaluation
- External evaluation: subjective evaluation of the S&T unit, its activities, its outcomes, and its overall quality by a panel of experts
- Targeted reviews (panel evaluation of any S&T outcome, paper, project, program, or individual scientist); could result in a prize or award
- Historical analysis
DuPont's "Better things for better living... through chemistry" has now become "The miracles of science".
Slide 22: Organizational, Strategic, and Managerial Metrics
- Project management: internal or external cycle time
- Measure of projects with teams
- Evaluation of the scientific and technical capabilities of an S&T unit
- Financial measure of the degree to which projects have had technical and commercial success
- Ownership, support, and funding of projects
- Relation of S&T to strategic objectives
- Benchmarking best practices
Slide 23: Stages of Outcomes
- Immediate outputs: direct outputs from S&T/R&D activity, such as bibliometric measures
- Intermediate outputs: outputs of the organizations and entities that have received the immediate outputs, transformed them, and are providing the transformed outputs to other entities in society and the economy
- Pre-ultimate outputs: measures of the products and services generated by those social and economic entities that received and transformed the intermediate outputs
Slide 24: Stages of Outcomes (cont.)
- Ultimate outputs: measures of things of value to the economy and society that were impacted by the pre-ultimate outputs
- Index of leading indicators: core and organization-specific measures combined in a weighted procedure
- Value indices for leading indicators: value added at each stage of the innovation process
Slide 25: Impact of Science on...
- Culture: knowledge; know-how; attitudes; values
- Society: welfare; discourses and actions of groups
- Policy: policy-makers; citizens; public programs; national security
- Organization: planning; work organization; administration; human resources
- Health: public health; health systems
- Environment: management of natural resources and the environment; climate and meteorology
- Training: curricula; teaching tools; qualifications; graduates; insertion into the job market; fitness of training/work; careers; use of acquired knowledge
- Science: knowledge; research activities; training
- Technology: products and processes; services; know-how
- Economy: production; financing; investments; commercialization; budget
Godin, Benoît and Christian Doré (2004), Measuring the Impacts of Science: Beyond the Economic Dimension, working paper. See http://www.csiic.ca/PDF/Godin_Dore_Impacts.pdf
Slide 26: Some Approaches: International S&T Comparisons at Different Levels
The slide pairs each level of evaluation with example approaches:
- Individual research effort: performance evaluations; prizes, awards
- Studies by a team, network, or community of researchers: Science Model; network analysis; bibliometrics
- Project: constraint analysis; learning histories; panel reviews; net present value
- Group of projects (program) or grouping of research funded by one agency: expert panel reviews; best-practices mechanisms
- Initiative, focused research problem: Science Model
- Results from an organization or institution: laboratory evaluations (panels, visiting committees)
- Advances in a scientific discipline: bibliometrics; expert panels
- National or regional innovation systems: indices based on statistics/indicators
Slide 27: Some Approaches: International S&T Comparisons at Different Levels (agenda slide repeated; see slide 26)
Slide 28: Project Performance: Constraint Analysis (Technology)
- Merrifield scoring for commercial success, validated with existing businesses in the US, India, Israel, Chile, France, and Japan
- D. Bruce Merrifield, "Corporate Renewal Through Cooperation and Critical Technologies," Research-Technology Management, July-August 1993
- Business attractiveness factors (each scored out of 10): sales/profit potential; growth potential; competition; distribution risk; restructuring potential; political/social factors
- Business fit factors (each scored out of 10): capital availability; manufacturing; marketing/distribution; technical competence; access to components; management
Slide 29: Project Performance: Constraint Analysis (Technology) (slide repeated)
Slide 30: Merrifield Scoring: 80 Points or Higher Spells Success
A score of 80 or more predicted success in 8 out of 10 cases, nation to nation. [Chart: business attractiveness vs. fit factors, axes from 60 to 120.] An analysis of profit-center initiatives then in process all gave scores below 80.
Are there analogies for scientific advances?
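A minimal sketch of the scoring arithmetic, assuming the simple additive model the slides describe: twelve factors scored 0-10 each, 120 points possible, with 80 as the success threshold. The factor names come from slide 28; the scores below are hypothetical.

```python
# Merrifield-style constraint analysis: sum 12 factor scores (0-10 each).
# Factor names come from the slides; the scores are hypothetical.

business_attractiveness = {
    "sales/profit potential": 8,
    "growth potential": 7,
    "competition": 6,
    "distribution risk": 7,
    "restructuring potential": 5,
    "political/social factors": 8,
}
business_fit = {
    "capital availability": 9,
    "manufacturing": 7,
    "marketing/distribution": 6,
    "technical competence": 9,
    "access to components": 8,
    "management": 8,
}

score = sum(business_attractiveness.values()) + sum(business_fit.values())
print(f"Merrifield score: {score}/120 ->",
      "likely success" if score >= 80 else "below the 80-point threshold")
```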
Slide 31: Process Project Reviews
- Compare with similar projects
- Some rules of thumb: 3x return on a standard cost-reduction effort
- Database of assessments of previous projects ("this project has the characteristics of project X in 1995")
- Studies by Independent Project Analysis Inc.
Project 80, an all-new process: "In retrospect it is clear that this process was not ready for commercialization. The key decision was not to spend the considerable sum required for a fully integrated pilot plant. If it had been built, the project would not have gone forward, because data would have shown the process to be uneconomic." (Comparing with a similar but better process by a Japanese company.)
Slide 32: Learning Histories
- DuPont R&D began to review a number of projects
- 38 were reviewed recently, and the success factors discovered are used by the organization in assessing ongoing global R&D projects
Sandukas, Ted, Jerry K. Okeson, and Rashi Akki (2001), Project Learning Histories: A Corporate Summary, 2001-CRD-64, DuPont, Wilmington, DE
Slide 33: Learning Histories
Slide 34: Learning Histories
Slide 35:
- Knowledge (scientific and technological) has no intrinsic value; it must be put into an application or context to acquire potential value
- There are many ways to calculate the future value of a technological development: discounted cash flow, internal rate of return, net present value (sketched below)
- Requires a model of the development pathway; an influence diagram is a good starting point
- Requires realistic forecasts
Slide 36: Influence Diagram
Slide 37: Some Approaches: International S&T Comparisons at Different Levels (agenda slide repeated; see slide 26)
Slide 38: Program (Portfolio) Evaluation
- Financial analysts (Wall St.) estimate the value of research in the pipeline
- Primarily in pharmaceuticals
- Other sectors are now sharing pipeline information, an important driver today
- Can compare company to company, nation to nation
Slide 39: Program Evaluation
Benchmarking Evaluation of Public Science and Technology Programs in the United States, Canada, Israel, and Finland: Proceedings of a Workshop Sponsored by Tekes, the National Technology Agency of Finland, held at the Embassy of Finland, Washington, DC, USA, September 25, 2002. Prepared by Rosalie Ruegg, TIA Consulting, Inc.
Slide 40: (no transcript)
Slide 41: OMB R&D Scorecard Criteria (Investment Criteria)
Slide 42: Mechanisms for Demonstrating the Criteria
Norling, Parry M., Mihal Gross, Aaron Kofner, Mark Wang, and Helga Rippen (2003), R&D Management Practices: Illustrative Examples, PM-1588-OSTP, RAND, Santa Monica, CA
Slide 43: Benchmarking
- A process for identifying and importing best practices to improve performance
- Comparisons, focus on performance, measuring
Slide 44: Committee of Visitors (COV): NSF Process Map
[Flowchart; measures the adequacy of inputs and of process.] NSF program staff develop COV review guidelines (composition, timing, content), schedule COV reviews, select committee members, prepare the charge to the committee, handle conflict-of-interest forms, meeting logistics, and templates, and provide COV members with documentation, reports, and proposal jackets. COV members review the documentation, decide how to review proposal jackets, participate in review presentations, answer questions, make assessments, develop recommendations, request more information where needed, and develop and submit their report. NSF management (Assistant Director, Advisory Committee, COV monitor) receive and review the report; if it is not accepted it returns to the committee, and once accepted they prepare a response and an adequacy report. Findings from COV reports feed into the evaluation of other aspects of program management, investment strategy, and priority setting, into the evaluation of NSF operations and other reports, and into material provided to the GPRA committee.
Slide 45: Flow Diagram: Biofuels Program
Slide 46: Measuring Technology: Progress Ratios
Learning factor: in a log-log plot of cost (operation or plant investment) vs. cumulative production (an experience curve), the relative reduction in cost for each doubling of cumulative production is called the learning factor. The progress ratio is 1 - learning factor.
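A minimal sketch of fitting an experience curve, assuming the usual power-law form cost = a * Q^b, under which each doubling of cumulative production Q multiplies cost by 2^b (the progress ratio); the data points below are hypothetical.

```python
import numpy as np

# Hypothetical experience-curve data: cumulative production vs. unit cost.
Q = np.array([100, 200, 400, 800, 1600], dtype=float)
cost = np.array([50.0, 41.0, 33.5, 27.0, 22.5])

# Fit cost = a * Q**b by linear regression in log-log space.
b, log_a = np.polyfit(np.log(Q), np.log(cost), 1)

progress_ratio = 2 ** b                # cost multiplier per doubling of Q
learning_factor = 1 - progress_ratio   # fractional cost cut per doubling
print(f"progress ratio: {progress_ratio:.2f}, "
      f"learning factor: {learning_factor:.2f}")
```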
Slide 47: Measuring Technology: Progress Ratios (EU, 1980-1995)
Slide 48: Some Approaches: International S&T Comparisons at Different Levels (agenda slide repeated; see slide 26)
Slide 49: Science Model
- Structure of science:
  - Disciplines: a cluster of journals citing one another
  - Research communities: a cluster of research papers citing earlier work
  - Regions: a contemporary network of research communities
  - Research agendas: co-word analysis of research communities
  - Universe: focus on a research problem
- Performance of science:
  - Stage of work
  - Phase of work
  - Level of performance in a discipline
Slide 50: Science Model
Slide 51: (no transcript)
Slide 52: Assessing Science Performance in a Firm or Country
[Charts compare firm or country 1 with firm or country 2.]
Slide 53: Disciplines, Communities, Regions, Trends (i.e., citation analysis)
[Diagram.] Reference patterns at different scales define the units of the Science Model:
- Reference patterns between journals define disciplines (networks of journals)
- Reference patterns between papers define communities (problem-oriented groups)
- Reference patterns between communities define regions (groups of communities which work on related problems)
- Reference patterns between annual sets of communities define trends (static, fission, or fusion?)
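A minimal sketch of the clustering idea behind these definitions, assuming an undirected citation graph and off-the-shelf modularity clustering from networkx; the paper IDs and links are hypothetical, and the Science Model's actual algorithms may differ.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical citation links between papers; an edge means one paper
# cites (or is co-cited with) the other. Densely linked clusters play
# the role of "research communities" in the Science Model.
edges = [
    ("p1", "p2"), ("p1", "p3"), ("p2", "p3"),   # community A
    ("p4", "p5"), ("p4", "p6"), ("p5", "p6"),   # community B
    ("p3", "p4"),                               # weak bridge between them
]
G = nx.Graph(edges)

for i, community in enumerate(greedy_modularity_communities(G), start=1):
    print(f"community {i}: {sorted(community)}")
```

The same machinery applies one level up: replace papers with journals to get disciplines, or with whole communities to get regions.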
Slide 54: Performance Indicators in the Science Model
- Science-driven research communities: analysis of the age distribution of reference papers (a sketch follows this list)
- Technology-driven research communities: analysis of the size distribution of reference papers
- Hot-topic research communities: a large set of current papers
- Disruptive-science research communities: not linked to research communities from previous years
- High to low performance (by momentum)
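As one concrete reading of the first indicator, the age distribution of a community's references can be summarized and compared across communities. The five-year cutoff and the data below are assumptions for illustration, not the Science Model's actual thresholds.

```python
from statistics import median

# Hypothetical reference ages (years between citing and cited papers)
# for two research communities.
communities = {
    "community A": [1, 1, 2, 2, 3, 3, 4],      # cites mostly recent work
    "community B": [2, 5, 8, 12, 15, 20, 25],  # leans on older literature
}

RECENT_CUTOFF = 5  # assumed threshold in years, purely illustrative

for name, ages in communities.items():
    med = median(ages)
    profile = ("recent-literature profile" if med <= RECENT_CUTOFF
               else "older-literature profile")
    print(f"{name}: median reference age {med} years -> {profile}")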
Slide 55: Research Universe
- Purpose: help clients find out-of-the-box scientific and technical solutions to their problems
- 5 steps
Slide 56: 5 Steps
1. Identify the disciplines of greatest interest (the seed discipline, an iterative process) and the research communities associated with these disciplines
2. Assign research communities to core, boundary, and cross-border sectors of the universe (core: dominated by the seed discipline; boundary: a step removed; cross-border: several steps removed)
3. Identify the research most relevant to the client's needs (number of research communities by discipline)
4. Analyze indicators of performance for each sector of the universe: does the universe have more or less research that is high momentum? Is the core better or worse than the boundary and cross-border research? Are the research communities of interest more or less disruptive than the research communities in the universe?
5. Form a science strategy
Slide 57: Results
- Revamped the R&D portfolio of SmithKline Beecham
- In this case the universes were therapeutic areas
- After generating maps of 7 research-based therapeutic areas, they concluded that the field of gastrointestinal disease research was not generating a significant amount of high-performance research, and closed research in this area
- Looking at technology areas, they identified research communities common to the 7 therapeutic areas
Norling, Parry M., Jan P. Herring, Wayne A. Rosenkrans, Jr., Marcia Stellpflug, and Stephen B. Kaufman (2000), "Putting Competitive Technology Intelligence to Work," Research-Technology Management, September-October, pp. 23-28
Slide 58: Results
- Found a technology universe working in the broad area of genomics (an uncertain field in the early 1990s)
- Through the map, they found several university groups and small companies that were conducting high-momentum research in this area
- Developed an agreement with Human Genome Sciences
- Also located a multi-million-dollar research facility focusing on the central nervous system (CNS); maps showed centers of excellence in CNS research in the US but also in France, where they ultimately built the facility
Slide 59: Some Approaches: International S&T Comparisons at Different Levels (agenda slide repeated; see slide 26)
Slide 60:
- Assessed the US Army Natick Research, Development, and Engineering Center
- The Army wanted to know if the lab was world-class
- Developed metrics and anchored scales for the evaluation
- Interviews and reviews of the work were used in the assessment
Slide 61: (no transcript)
Slide 62: (no transcript)
Slide 63: Lab Evaluations
This book drew upon the National Comparative Research and Development Project (NCRDP), a large multi-year exercise (1984-1999) conducted by a team of more than thirty researchers across seven universities in four countries.
The five phases of NCRDP:
- Phase I: focused on 825 energy R&D labs in the US and Canada, including 32 intensive case studies
- Phase II: expanded the universe to 16,000 R&D labs of all sizes and sectors, surveying a sample of 1,341 labs
- Phase III: resurveyed Phase II labs with a focus on government labs and on technology-transfer issues
- Phase IV: focused on government labs in Japan and Korea
- Phase V: focused on 200 companies that regularly interact with federal labs
Crow, Michael and Barry Bozeman (1998), Limited by Design: R&D Laboratories in the U.S. National Innovation System, Columbia University Press, New York
Slide 64: Some Approaches: International S&T Comparisons at Different Levels (agenda slide repeated; see slide 26)
Slide 65: Panel Evaluation
- Determinants of leadership: national imperatives; the innovation process (pluralism, partnerships, regulations, professional societies); major facilities; centers; human resources; funding
- Assessment of biomaterials, ceramics, composites, magnetic materials, metals, photonic materials, polymers, catalysts
Slide 66: Nanotechnology Expert Group and Eurotech Data
Mapping Excellence in Nanotechnologies: Preparatory Study. Prepared by Martin Meyer, Olle Persson, Yann Power, and the nanotechnology expert group, December 2001
Slide 67: Bibliometric Approach: Ranked Countries
Slide 68: Scale-Independent Indicators and Research Evaluation (J. Sylvan Katz)
- A power law exists between recognition or impact and the publishing size of a research community
- New scale-independent indicators can be used to overcome the inequity produced by some non-linear characteristics commonly measured when evaluating research performance
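A minimal sketch of the power-law idea, assuming impact scales as citations = a * papers^alpha across communities, so that a community can be judged against its size-expected impact rather than by raw counts; the data below are hypothetical.

```python
import numpy as np

# Hypothetical communities: publishing size (papers) and impact (citations).
papers = np.array([50, 120, 300, 800, 2000], dtype=float)
citations = np.array([210, 640, 1900, 6200, 18500], dtype=float)

# Fit citations = a * papers**alpha by regression in log-log space.
alpha, log_a = np.polyfit(np.log(papers), np.log(citations), 1)
expected = np.exp(log_a) * papers ** alpha

# Scale-independent view: actual impact relative to the size-expected value.
print(f"fitted exponent alpha = {alpha:.2f}")
for n, c, e in zip(papers, citations, expected):
    print(f"{int(n):>5} papers: {c / e:.2f}x the size-expected citations")
```

When alpha exceeds 1, bigger communities attract disproportionately more citations, which is exactly the non-linearity that raw per-paper comparisons fail to correct for.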
Slide 69: Changing Rank in Physics
Slide 70: Some Approaches: International S&T Comparisons at Different Levels (agenda slide repeated; see slide 26)
Slide 71: (no transcript)
Slide 72: Science and Technology Indicators for the European Research Area (STI-ERA)
European Commission trend charts (http://trendchart.cordis.lu/): innovation scoreboard and trends, a source of major indicators.
Slide 73: STI-ERA Indicator Themes
THEME 1: Human resources in RTD
- Researchers (FTE) per 1000 workforce
- New S&T PhDs per 1000 population aged 25-34 years
THEME 2: Public and private investment in RTD
- Total R&D expenditure as % of GDP
- Industry-financed R&D as % of industrial output
- Share of government budget allocated to R&D (GBAORD)
- Share of SMEs in publicly funded R&D executed by the business sector (%)
- Venture capital investment per 1000 GDP
THEME 3: Scientific and technological productivity
- Scientific publications per million population
- Highly cited publications per million population
- European patents per million population
- US patents per million population
THEME 4: Impact of RTD on economic competitiveness and employment
- Labour productivity (GDP per hour worked) in PPS
- Labour productivity (GDP per hour worked), annual average growth
- Value added in high-tech industries as % of GDP
- Employment in high-tech industries as % of total employment
- Value added of knowledge-intensive services as % of GDP
- Employment in knowledge-intensive services as % of total employment
- Technology balance of payments receipts as % of GDP
- Technology balance of payments (exports-imports) as % of GDP
- Exports of high-tech products as % of world total
Slide 74: (no transcript)
Slide 75: Innovation Index
- Relate the input variables to the output via regression analysis
- The index is then a measure of the output
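A minimal sketch of the regression idea, assuming a linear model that relates input indicators to an innovation output and then uses the fitted prediction as the index; the country data and the choice of variables are hypothetical.

```python
import numpy as np

# Hypothetical input indicators per country: [R&D intensity (% of GDP),
# researchers per 1000 workforce], and an output (e.g. patents per million).
X = np.array([[1.8, 5.2], [2.6, 7.1], [3.4, 9.0], [1.2, 4.0], [2.1, 6.3]])
y = np.array([110.0, 180.0, 260.0, 70.0, 140.0])

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# The fitted values serve as the innovation index, one score per country.
index = A @ coef
for i, score in enumerate(index, start=1):
    print(f"country {i}: index {score:.0f} (observed output {y[i - 1]:.0f})")
```

The later "Challenges" slides flag exactly the weak points of this construction: the choice and weighting of variables, and the fact that regression does not establish cause and effect.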
Slide 76: (no transcript)
Slide 77: Challenges in Evaluation
- Surveys
- Case studies: descriptive
- Case studies: economic
- Econometric studies
- Network analysis: observation, analysis of social/organizational behavior and related outcomes
- Bibliometrics: counts, citation analysis, content analysis
- Historical tracing
- Expert panels, peer review
Slide 78: Challenges in Evaluation
- Surveys
  - Getting the right respondents and the right number of them
  - Eliminating biases
  - Getting the full story
  - Assuring credibility for the audience
- Case studies: descriptive
  - Assuring understanding by decision makers
  - Less persuasive than statistical information
  - To what extent is the particular case representative?
- Case studies: economic
  - Major benefits may not be economic
  - Time needed to see major outcomes from programs
  - Attribution of benefits
Slide 79: Challenges in Evaluation
- Econometric studies
  - Difficult to capture all variables
  - Regression analysis does not establish cause and effect
  - Many assumptions required
  - Attribution may be difficult
- Network analysis: observation, analysis of social/organizational behavior and related outcomes
  - Will the qualitative measures be meaningful to decision makers?
  - By themselves they do not measure performance
Slide 80: Challenges in Evaluation
- Bibliometrics: counts, citation analysis, content analysis
  - Popularity vs. impact?
  - Bias against newer journals
  - Need to publish vs. not revealing proprietary information
  - Not all publications are of equal merit
  - Ignores other outputs and long-term outcomes
  - Citations may not demonstrate intellectual linkage
- Historical tracing, learning histories
  - Disconnects make some traces difficult
  - Availability of good documentation
  - Revision of history by those available to relate it
  - Is the project or program representative?
Slide 81: Challenges in Evaluation
- Expert panels, peer review
  - Biases, conflicts of interest
  - Getting the right people with the right expertise
  - May tend not to get or hear minority opinions
- S&T/innovation indicators
  - Linking the measures to meaning
  - How to assemble and weight factors
- Innovation index
  - Innovation is a complex, changing process
  - What variables should be included, and how much weight should each be given at any point in time?
  - Demonstration of cause and effect?
  - Ability to manipulate it to favor one policy or another
Slide 82: General Challenges in Evaluation
- Research performance is difficult to measure because of the long time lags from inputs to outcomes
- Assigning value to knowledge itself
- Tracing the creation of knowledge to some benefit
- Assigning value to many contributing actors
- Defining success when there are many objectives (or expectations)
- Knowing which approaches matter most for each stakeholder
- Ability to compare different studies using different approaches
- Inability to have true control studies
Slide 83: Next Steps
- Is standardization the answer?
- It is important to have a set of evaluation tools and to continue developing new approaches
- Possibly a focus on best practices in evaluation methods or mechanisms is the answer?
- Some collaborative work is in order.
Rembrandt, The Feast of Belshazzar (The Writing on the Wall), 1635