Complexity as a Methodology, Point of View, Theory

Transcript and Presenter's Notes
1
Complexity as a Methodology, Point of View,
Theory
  • Bruce Kogut
  • EIASM and Oxford University
  • June 2006

2
What Complexity Seems to Mean In Practice
  • Interdisciplinary sharing of knowledge and
    creating a larger community of scholarship.
  • Appreciation of a non-linear view of the world.
  • The importance of events for triggering change.
  • Analyzing the statistical properties of large
    datasets
  • Understanding local interactions by micro-rules
    whose effects depend on topology (structure) but
    whose interpretations rely upon contextual
    knowledge.
  • More attention to what Elster and Hedstrom call
    mechanisms, as opposed to causes.

3
Research Strategies: 1. Some old ones, 2. Some
opportunistic ones, 3. Some new ones.
  1. Graeco-Latin squares to Charles Ragin's
    comparative methods
  2. Borrowed simulation structures and topologies
  3. Graph dynamics relying on new estimation
    techniques

4
Example of (1): Old Method Rethought
  • In economics and management, we would like to
    determine the complementarities, or interactions,
    that compose best practices to improve
    performance.
  • Economics gave us an elegant analysis of
    complementarities but poor methods.
  • The empirical problem of complementarities is
    saturating an experimental design. (This is
    identical to the theory of monotone comparative
    statics: the power set of combinations has to be
    tested for its effect on performance.)

5
Example of (2): a useful opportunistic strategy
is the NK model applied to complementarities
  • Consider an NK model in which a technological
    landscape is hardwired (the number of nodes is
    given, they are connected, K, the interactions, is
    given, but N and K can be varied; fitness
    values are randomly assigned to nodes and hence
    to their combinations).
  • Random Boolean nets have been useful in
    biochemistry, in which there are rules of AND,
    OR, and NOT. (However, from genes to
    phenotypic expression, there are many things that
    intervene: RNA, proteins. And fitness can be
    endogenous: my fitness can depend on how fit
    you are in a given space.)
  • We know a priori the central results from
    simulations in other fields: there are finitely
    many local optima, which for some K (such as K=2)
    have known expected values. We also know much
    about search time to optima. (This has received a
    lot of attention in science and little in social
    science: is the rate by which we have gotten to
    where we are explainable?)
  • The apparatus sneaks in a language (long jumps,
    landscapes, iterations) that implies firms are
    engaged in search over a technological terrain
    that waits to be discovered. (This is not social
    construction.)
  • Unfortunately, it is hard to feed data to the
    model.
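The hardwired setup described above can be sketched in a few lines of Python. This is a generic illustration of the standard Kauffman NK model (random per-locus fitness tables, cyclic K-neighbourhoods, greedy one-bit search), not the particular implementation referred to in the talk; all parameter values are illustrative.

```python
import itertools
import random

def make_nk_landscape(n, k, rng):
    """Random fitness contribution for each locus, given its own state
    and the states of its k neighbours (here: the next k loci, cyclically)."""
    tables = [{bits: rng.random() for bits in itertools.product((0, 1), repeat=k + 1)}
              for _ in range(n)]
    def fitness(genome):
        total = 0.0
        for i in range(n):
            neighbourhood = tuple(genome[(i + j) % n] for j in range(k + 1))
            total += tables[i][neighbourhood]
        return total / n
    return fitness

def hill_climb(fitness, genome, max_steps=1000):
    """Greedy one-bit search: accept any single-locus flip that improves fitness."""
    genome = list(genome)
    for _ in range(max_steps):
        current = fitness(tuple(genome))
        for i in range(len(genome)):
            genome[i] ^= 1
            if fitness(tuple(genome)) > current:
                break          # keep the improving flip
            genome[i] ^= 1     # undo
        else:
            return tuple(genome), current   # local optimum reached
    return tuple(genome), fitness(tuple(genome))

rng = random.Random(42)
n, k = 10, 2
fit = make_nk_landscape(n, k, rng)
start = tuple(rng.randint(0, 1) for _ in range(n))
optimum, value = hill_climb(fit, start)
```

With K = 0 the landscape has a single peak and the climber always finds it; raising K makes the landscape rugged, so the climber halts on one of many local optima, which is the penalty on experimentation discussed later in the talk.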

6
Example of (3) is the application of graphs to
understanding data
  • We now have a much greater appreciation that
    static representations of networks cannot easily
    isolate endogeneity (do I smoke because I am weak
    or because you smoke?), but more importantly
    cannot easily identify social rules.
  • We do, though, have a better understanding, even
    for static graphs, that some properties are
    consistent or inconsistent with important social
    behaviors.
  • For example, we know that the absence of a power
    law in degrees is inconsistent with preferential
    attachment. Since preferential attachment is a
    reasonable way to represent such concepts as
    prestige and reputation, a non-finding is
    important.
  • We still have a long way to go regarding
    estimation:
  • our models are convenient (even if, like
    exponential random graph models, very hard);
  • it is hard to rule out other explanations not
    specified.
  • Exciting space (for me) is the combination of
    estimation and simulation to arrive at better
    understandings of the possible interpretations.
  • Formal models will also be critical.

7
Return to Example (1): What can old methods say
to complexity?
  • Consider a question
  • What are good corporate and labor institutions
    for generating growth?
  • Some argue that there are two prototypes:
  • Coordinated (e.g. Germany) and market (e.g. US),
    and each is good for growth (?)

8
Here are data from Hall and Gingerich, who want to
show there are two best configurations for setting
policy:
COUNTRY Growth Degree of wage coordination Level of wage coordination Labor turnover Shareholder power Stock market size Dispersion of control
Austria 1 1 1 1 1 1 1
Germany 1 1 1 1 1 1 1
Italy 1 1 0 1 1 1 1
Belgium 1 0 1 1 1 1 1
Norway 1 1 1 1 0 1 1
Finland 1 1 1 0 1 1 1
Portugal 1 0 1 1 1 1 1
Sweden 0 0 1 1 1 0 1
France 0 0 1 1 1 1 1
Denmark 1 1 1 0 1 1 0
Japan 1 1 1 1 0 0 0
Netherlands 0 0 1 1 1 0 1
Switzerland 0 1 1 1 1 0 0
Spain 1 0 0 0 0 1 1
Ireland 1 0 0 0 0 1 0
Australia 0 0 0 0 0 0 0
New Zealand 0 0 0 0 0 0 0
Canada 0 0 0 0 0 0 0
United Kingdom 0 0 0 0 0 0 0
United States 0 0 0 0 0 0 0
The coordination dichotomies are all coded in the
same direction, with a score of 1 signaling
conformity with coordinated market economies
and a score of 0 signaling conformity with
liberal market economies.
9
Observations on the Data
  • N of 20 countries and yet 6 variables, hence 64
    possible combinations (2^6).
  • Of these 64, only 15 are uniquely observed.
  • We are making inferences based on a poorly
    populated space.
  • Sparseness may reflect the operation of a
    maximizing hand that rules out inefficient (?)
    combinations.
  • It may reflect path dependency and hence the
    paths not taken (even if perhaps better).
  • It may reflect cultural preferences that rule out
    certain institutions, such as stock markets
    historically in some countries.
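The sparsity of the design space can be checked directly from the table on the previous slide. The tally below is computed from the rows as transcribed here (outcome column dropped, duplicates collapsed); counted this way it comes to somewhat fewer than the 15 quoted above, but the substantive point, a poorly populated space, holds either way.

```python
# (degreewc, levelwc, turnover, sharehld, stockmkt, dispersn) per country,
# transcribed from the Hall and Gingerich table above (growth column omitted).
profiles = {
    "Austria":        (1, 1, 1, 1, 1, 1),
    "Germany":        (1, 1, 1, 1, 1, 1),
    "Italy":          (1, 0, 1, 1, 1, 1),
    "Belgium":        (0, 1, 1, 1, 1, 1),
    "Norway":         (1, 1, 1, 0, 1, 1),
    "Finland":        (1, 1, 0, 1, 1, 1),
    "Portugal":       (0, 1, 1, 1, 1, 1),
    "Sweden":         (0, 1, 1, 1, 0, 1),
    "France":         (0, 1, 1, 1, 1, 1),
    "Denmark":        (1, 1, 0, 1, 1, 0),
    "Japan":          (1, 1, 1, 0, 0, 0),
    "Netherlands":    (0, 1, 1, 1, 0, 1),
    "Switzerland":    (1, 1, 1, 1, 0, 0),
    "Spain":          (0, 0, 0, 0, 1, 1),
    "Ireland":        (0, 0, 0, 0, 1, 0),
    "Australia":      (0, 0, 0, 0, 0, 0),
    "New Zealand":    (0, 0, 0, 0, 0, 0),
    "Canada":         (0, 0, 0, 0, 0, 0),
    "United Kingdom": (0, 0, 0, 0, 0, 0),
    "United States":  (0, 0, 0, 0, 0, 0),
}
possible = 2 ** 6                      # 64 cells in the full design
observed = set(profiles.values())      # distinct condition combinations
coverage = len(observed) / possible    # well under a quarter of the design
```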

10
Approach One: Crisp Logic. We can try to find
good causal combinations by borrowing from
electrical engineering.
Using three logical gates (join, meet, null),
what are the minimal circuits you need, or what
are the fewest elements you need to cause
performance?
Absorption: A + AB = A. Reduction: AB + Ab =
A(B + b) = A(1) = A
Advantage of this method: it is simple and
intuitive. Problem: with too much sparseness,
we won't get much simplicity.
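The two reduction rules can be verified by brute force: since each expression has finitely many arguments, checking every assignment is the logical analogue of saturating the design. A minimal Python sketch:

```python
from itertools import product

def equivalent(f, g, nvars):
    """Check two Boolean expressions agree on every assignment (truth-table check)."""
    return all(f(*bits) == g(*bits) for bits in product((False, True), repeat=nvars))

# Absorption: A + AB = A
absorption = equivalent(lambda a, b: a or (a and b), lambda a, b: a, 2)

# Reduction: AB + Ab = A(B + b) = A
reduction = equivalent(lambda a, b: (a and b) or (a and not b), lambda a, b: a, 2)
```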
11
Solution for High Growth/Low Initial GDP per
capita, without simplifying assumptions
  1. degreewc levelwc turnover sharehld STOCKMKT
  2. DEGREEWC LEVELWC turnover SHAREHLD STOCKMKT
  3. DEGREEWC LEVELWC TURNOVER STOCKMKT DISPERSN
  4. DEGREEWC TURNOVER SHAREHLD STOCKMKT DISPERSN
  5. LEVELWC TURNOVER SHAREHLD STOCKMKT DISPERSN
  6. DEGREEWC LEVELWC TURNOVER sharehld stockmkt
    dispersn

12
Simplifying Assumptions
  • Consider the case where there are two solutions:
  • ABC + aBc
  • No reduction is possible.
  • If we permit two assumptions, we can achieve a
    simplification:
  • Y = ABC + aBc + ABc + aBC
  •   = (ABC + aBC) + (aBc + ABc)
  •   = (BC) + (Bc)
  •   = B
  • This is a type of simulation, but done by
    intuition: call it theory on unobservables.
  • It is a theory that explicitly reduces the
    complexity. It posits: let's imagine that if we
    had the data, or if nature had been more
    experimental, we would indeed observe two cases
    with positive outcomes. These cases are ABc and
    aBC. Once we do this, we arrive at B.
  • This is very similar to Michael Hannan's recent
    work in propositional logic, in which premises are
    fed to a computer program that derives logical
    propositions. We simply say: let's use nature as
    far as we can to infer propositions, and then
    add in theory to derive simpler expressions.
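The reduction above is easy to verify mechanically: populate all four cells (two observed, two assumed) and check the union against B over every assignment. A brief sketch in Python:

```python
from itertools import product

def Y(a, b, c):
    # observed: ABC, aBc; assumed under the two simplifying assumptions: ABc, aBC
    return ((a and b and c) or (not a and b and not c)
            or (a and b and not c) or (not a and b and c))

# With all four cells populated, Y collapses to B alone:
reduces_to_B = all(Y(a, b, c) == b for a, b, c in product((False, True), repeat=3))
```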

13
Solution for Low Growth/High Initial GDP per
capita
  • C. without simplifying assumptions
  • degreewc levelwc turnover sharehld stockmkt
    dispersn
  • DEGREEWC LEVELWC TURNOVER SHAREHLD stockmkt
    dispersn
  • degreewc LEVELWC TURNOVER SHAREHLD stockmkt
    DISPERSN
  • D. with simplifying assumptions
  • degreewc stockmkt
  • SHAREHLD stockmkt

14
Let's do better by understanding more clearly the
limited diversity in the data
15
Logical exploration of the Not-Observed
  • 1. Reconsider the result for low growth:
  • low_growth = degreewc·stockmkt +
    SHAREHLD·stockmkt
  • 2. We did not, though, combine our knowledge of
    what determines low growth with that of what
    determines high growth. We can do this by
    applying De Morgan's Law: reverse the outcome,
    change all upper-case to lower-case and vice
    versa, and change intersection to union and
    vice versa:
  • high_growth = (DEGREEWC + STOCKMKT)(sharehld +
    STOCKMKT)
  • We have now arrived at the maximal saturation of
    our experimental design, filling in as many of
    the 64 cells as we can.

16
After maximal saturation
3. Finally, simplify the terms using Boolean
algebra: hg = D·sh + D·S + S·sh + S·S (with D =
DEGREEWC, S = STOCKMKT, sh = sharehld). By the
absorption rule, and since S·S = S,
high_growth = STOCKMKT + DEGREEWC·sharehld.
4. And if we are not happy
with two explanations, we can theorize what we
should observe by simplifying assumptions and
reduce further. This is a combination of an
incomplete saturated design methodology that
analyzes complex non-linear interactions by a
combination of logic, theory, and simulation
using DATA.
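Steps 1 to 3 (De Morgan, expansion, absorption) can be checked end to end by truth table. The sketch below uses D, H, S as shorthand for DEGREEWC, SHAREHLD, STOCKMKT, with lower case read as negation, following the slides' convention:

```python
from itertools import product

def low_growth(D, H, S):
    # degreewc·stockmkt + SHAREHLD·stockmkt (lower case = negation)
    return ((not D) and (not S)) or (H and (not S))

def high_growth_demorgan(D, H, S):
    # De Morgan: negate the outcome, swap case (negate literals), swap + and ·
    return (D or S) and ((not H) or S)

def high_growth_simplified(D, H, S):
    # after expansion and absorption: STOCKMKT + DEGREEWC·sharehld
    return S or (D and not H)

assignments = list(product((False, True), repeat=3))
demorgan_ok = all(high_growth_demorgan(*x) == (not low_growth(*x)) for x in assignments)
simplify_ok = all(high_growth_simplified(*x) == high_growth_demorgan(*x) for x in assignments)
```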
17
Example Two Reviewed: NK Model
  • NK models impose a large penalty on
    experimentation
  • Landscapes are rugged and organizations easily
    get trapped.
  • Long-jumps are random.
  • Consider Fontana's idea of Neutrality (and the
    implementation by Lobo, Fontana, and Miller)
  • Fitness is discretized into bands such that
    organizations are inert to small changes in
    fitness caused by experiments in complements.
  • However, for large changes in fitness,
    experiments can lead to adoption of new
    configurations.

18
Simulating Neutrality in the Kauffman/Levinthal
NK Model (Amit Jain Implementation and Simulation)
Four simulations are run: the first panel is the
standard model; the next three vary neutrality with
fitness bands of 10, 25, and 100. That is, an
organization changes only if the change in fitness
crosses a band.
19
Under Neutrality, Organizations Discover Ridges
Between Peaks
  • Comparing results:
  • 1. Fitness value is higher under neutrality (for
    this number of simulations).
  • In other words, local traps are less confining.
  • 2. More organizational forms are viable over
    the short run.
  • 3. We believe, but are still checking, that
    there is more exploration of the possible space:
  • if N = 10, then we have 1024 combinations. But
    with only 100 organizations, how many
    combinations are actually explored in a period of
    time? Here we return to the type of
    question: how long should it take to see a
    possible universe realized?
  • 4. We still don't know how, nor do we think we
    know how, to fit data to this simulation.
  • Neutrality is a reasonable concept by which to
    capture the capability of firms to learn by trial
    and error before engaging in massive retooling
    or reengineering.
  • It also captures the idea of institutions and
    institutional transplants: many institutions
    can cross borders because they are neutral.
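The band rule described above reduces to a small acceptance function: discretize fitness into equal bands and move only when a candidate crosses into a higher band. The band counts follow the simulation on slide 18; the function itself is a generic sketch, not the Jain implementation.

```python
import math

def band(fitness, n_bands):
    """Which neutrality band a fitness value in [0, 1) falls into."""
    return math.floor(fitness * n_bands)

def accepts(current, candidate, n_bands):
    """Under neutrality, an organization is inert to within-band changes:
    it adopts the candidate only if it crosses into a strictly higher band."""
    return band(candidate, n_bands) > band(current, n_bands)

# With coarse bands (10), a small improvement is ignored; with fine bands (100)
# the same improvement crosses a boundary and is adopted.
small_step_coarse = accepts(0.50, 0.55, 10)
small_step_fine = accepts(0.50, 0.55, 100)
```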

20
Example (3): Topologies, Graphs, Data, Inferences
  1. A science of complexity should be an engagement
    of theory, data, estimation, simulation, and
    imagining the possible.
  2. To understand large complex systems, we need a
    lot of data.
  3. A lot of what we do uses small data sets from
    which we try to make claims about asymptotic
    significance.
  4. Alternatively, we can see social action as driven
    by agents who interact by rules.
  5. We would like to liberate them from strong
    topological impositions (e.g. regular graphs, or
    NK landscapes) but still come to understand the
    relation of local and macro structures on
    behavior, and vice versa.

21
Analyses of Large Data Sets: Venture Capital in the US
  • We know little about entrepreneurial activities
    in terms of network dynamics.
  • Many good studies on venture capital, but we have
    no global picture.
  • We have no studies on dynamics.

22
Theories on VCs
  • Two common hypotheses:
  • VCs do deals to signal prestige; this should lead
    to the prestigious getting richer. Graph
    prediction: power law in degree.
  • VCs do deals to find complements in expertise.
    Graph prediction: power law in weighted link
    strength.
  • Implication of (1) for components and clusters
  • Venture capital is clustered in geographies and
    a few prestigious companies come later to bridge
    them.
  • Implication of (2) for components and clusters
  • VC firms will seek new partners when new
    expertise is required and we will thus see
    repeated ties for investments in known areas
    and new ties for investments in new areas.
  • Thus we will have a dynamic between the
    conservative rule of relying on proven
    expertise and the diversity rule of seeking new
    partners.

23
Deal structure
  • Over 150,000 transactions over 40 years.
  • Several thousand VC investors and targets.
  • Let's start by posing a simple question:
  • Do regional markets grow and then become
    integrated?
  • Or is Braudel right: do regions develop in
    relation to global (national) dynamics?

24
Distribution of Deals Among Firms
25
Number of Deals
High number of deals per Firm
Technological breaks create opportunities for new
entrants.
26
National Component Grew Early and Connected
Regions and Sectors. So Much for Clusters
27
We do not find a power law in degrees: VC
syndications don't seem to be the product of
preferential attachment.
Inference by abduction: the dog did not bark, the
graph does not have a power law in degree, hence
the culprit of rich-get-richer is innocent and
released.
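The abductive logic (no power law in degree, therefore no preferential attachment) can be illustrated by growing two toy networks, one with degree-proportional attachment and one with uniform attachment, and comparing their degree tails. This is a sketch of the mechanism with made-up parameters, not the estimation applied to the VC data; a proper test would fit the empirical degree distribution directly.

```python
import random

def grow_network(n, m, preferential, seed):
    """Grow a network one node at a time; each newcomer attaches m links either
    preferentially (proportional to current degree) or uniformly at random."""
    rng = random.Random(seed)
    degree = [m] * m                 # small seed core of m nodes (degree bookkeeping only)
    endpoints = list(range(m)) * m   # node i appears degree[i] times in this list
    for new in range(m, n):
        if preferential:
            # sampling from the endpoint list is sampling proportional to degree
            targets = {rng.choice(endpoints) for _ in range(m)}
        else:
            targets = set(rng.sample(range(new), m))
        degree.append(0)
        for t in targets:
            degree[t] += 1
            degree[new] += 1
            endpoints += [t, new]
    return degree

pa_degrees = grow_network(2000, 2, preferential=True, seed=7)
uni_degrees = grow_network(2000, 2, preferential=False, seed=7)
# Preferential attachment produces a heavy tail (a few hubs);
# uniform attachment keeps the maximum degree near log(n).
```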
28
We do have Power Laws in Strength: Incumbents
like to rely upon trusted partners
  • Most deals are incumbent to incumbent.
  • Hence we find power laws in repeated ties.
  • Trusted expertise based on experience, not
    signalling of prestige, seems to matter.

VC networks have far more repeated ties than
Guimera and Uzzi et al.'s Broadway networks.
29
Percentage of Deals Where the Local Cluster is
Greater than the Global
In other words, clusters are stronger globally
than within a region or sector.
30
New Links are Formed when
  • A VC company goes to a new geography or sector
  • that is, when it needs new expertise.
  • VC firms are drawn to successful targets.

31
Conclusions to Example 3
  • What we showed:
  • Looking at the dynamics of graph properties rules
    out certain micro behaviors.
  • Clustering and giant-component analysis confirms
    the Braudel hypothesis: clusters develop in
    relation to the national graph.
  • What we did not show:
  • A formal model of the choice between new and old
    ties that is equivalent to the Simon/Barabasi
    model of preferential attachment.
  • Agent-based models that test more precisely the
    micro rules employed.
  • Why not shown? I don't think we have a good
    empirical model yet.
  • We are out of time.

32
Caveats
  • Social systems are harder than physical systems
    if we play only by the rules of the latter.
  • We don't ask an electron "where ya been and when
    were ya dere?"
  • We can ask people this question.
  • Physical systems have given topologies:
  • forests are reasonably viewed as 3-dimensional
    lattices;
  • American suburbs are often 2-dimensional
    lattices, but Paris is not, and people move
    around.
  • Geographical space is not always the same as
    social space.
  • Engineers often like to get rid of people because
    the problem is hard enough.
  • Systems are most often, even today,
    socio-technical.
  • Machines and people inhabit the same graph.

33
Interactions in Physical Systems: High-Power
Items (Jet Engine). Adding in People Makes This
Much Harder
  • Function is physical and cannot be represented
    logically and symbolically.
  • Modules display multiple behaviors in multiple
    energy domains.
  • Side effects are high power and can't be isolated.
  • The design can be converted to a picture.
  • High power.
  • Severe back-loading.
  • Modules are independent in design.
  • Module behavior changes when combined into a
    system.
  • The picture is an incomplete abstract
    representation of the design.
  • Modules must be validated physically.
  • Modules must be designed anew specifically for
    their function.
  • Interfaces must be tailored to function.
  • Separate module and system validation steps are
    needed.
  • Main function carriers can't be standardized.
  • A construction process exists that eliminates
    most assembled interfaces.
  • Systems cannot be designed with good confidence
    that they will work.
From Whitney, MIT.
34
A conclusion
  • Complexity is a point of view that the pursuit of
    plausibility is more rewarding than certainty.
  • The social sciences need to move to an open-science
    model, where we spend more time in projects and
    less time collecting data.
  • Simulations and estimation should be seen as part
    of an interpretative methodology to identify
    plausible mechanisms, as opposed to verifying
    causes.
  • Interactions, rules, non-saturated designs,
    simulations, estimations, graph theory: these are
    the words in the new vocabulary.
  • But the going will not be easy. Consider wings
    and engines and... people.

35
Appendix (If Time Permitted): Extend This Method
of Experimental Design and Simulated and Real
Data to Complementarities in Manufacturing
  • Consider activity systems that describe how auto
    companies manufacture efficiently with quality,
    including the work teams and social organization.
  • Can we identify better prototypical strategies
    that are robust across settings?

36
Strategy and Prototypes
  • Consider strategy as the problem of choosing
    capabilities and markets; that is, the sets C and
    M are the givens of the decision: choose C×M such
    that S = argmax(C, M).
  • This can rarely if ever be solved, so people
    think heuristically instead, by prototypes that
    represent the best configuration:
    differentiate, cut cost, have religion.
  • This formulation is close to the theory of
    complementarities a la Milgrom and Roberts.
  • The empirical question is: Can we pick out the
    best configurations from the data?

37
Fuzzy Sets
  • Data are no longer crisp.
  • Important consideration is coding and functional
    transformations.
  • Rules are set-theoretic: a necessary condition
    means that the outcome is a subset of the
    condition; a sufficient one means that the
    condition is a subset of the outcome.
  • Values are calculated for combinations using
    fuzzy-set algebra. These values are compared to
    the value of the outcome. If the outcome value is
    larger, the combination/element is sufficient;
    if smaller, the combination/element is necessary.
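A minimal sketch of these set-theoretic rules, using min for intersection and per-case inclusion for sufficiency and necessity. The condition names and membership scores below are invented for illustration and are not the actual coding of the auto-plant data:

```python
def fuzzy_and(*xs):
    """Fuzzy intersection: minimum membership across conditions."""
    return min(xs)

def sufficient(condition, outcome):
    """Condition is (fuzzy-)sufficient when its membership never exceeds
    the outcome's, i.e. the condition is a subset of the outcome."""
    return all(c <= o for c, o in zip(condition, outcome))

def necessary(condition, outcome):
    """Necessary when the outcome is a subset of the condition."""
    return all(o <= c for c, o in zip(condition, outcome))

# Hypothetical plants: membership in "team-based work", "high automation", and
# the outcome "high quality" (scores are made up for illustration).
teams = [0.8, 0.3, 0.9, 0.6]
automation = [0.7, 0.5, 0.6, 0.9]
quality = [0.9, 0.6, 0.7, 0.8]

combo = [fuzzy_and(t, a) for t, a in zip(teams, automation)]
combo_sufficient = sufficient(combo, quality)
```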

38
Benchmarking the Fuzzy Configuration against Data
(Chart: actual quality vs. predicted quality.)
The data are 70 or so auto plants around the
world and consist of observations on teams,
technologies, work processes, scale, etc. These
practices were analyzed to find the unique
combinations of minimal practices sufficient to
achieve performance.