Machiavellian Intelligence and the Evolution of Cooperation

Transcript and Presenter's Notes
1
Machiavellian Intelligence and the Evolution of
Cooperation
  • Metternich, on hearing that Talleyrand had died:
    "What does he mean by that?"

2
  • Collaborators.
  • Nicholas Allen, Psychologist, University of
    Melbourne
  • James Hanley, Political Scientist, Workshop for
    Political Theory and Policy Analysis, Indiana
    University
  • Jason Hartwig, Political Scientist, University of
    Oregon
  • Tomonori Morikawa, Political Scientist, Center
    for International Education, Waseda University.
  • John Orbell, Political Scientist, University of
    Oregon.

3
  • I will:
  • Show how cooperative dispositions can emerge and
    be sustained as a product of cognitive evolution
    on Machiavellian intelligence, independent of kin
    selection, reciprocity, and group selection
  • Work within the PD paradigm, but with players having
    alternative ways of making a living beyond such
    games.

4
Questions that cognitively well-designed social
animals must be able to answer rapidly and
accurately in an ecology of diverse social
games
  • 1. What game is being played, or offered, in this
    encounter?
  • 2. What are the stakes in the game?
  • 3. What resources and intentions do others bring to
    the game?
  • 4. What resources and intentions do I bring to the
    game?
  • 5. What payoff can I expect from playing this game
    with this partner?
  • 6. What are the alternatives to playing this game
    with this partner?

5
In short, as Schmitt and Grammer (1997) put it:
"Costs and benefits have to be assessed, risks
and chances inferred, both in the case of failure
and success, both for ego and others. The
other's mind has to be read, his/her knowledge,
intentions and behavioral capabilities have to be
considered. Finally, the most promising tactic
has to be selected from a host of available
procedures."
6
This predicts:
  1. (Presumably) modular capacities for Mindreading
    and Manipulation, where

"...the difference is that mind-reading involves
exploiting the victim's behavior as it
spontaneously emerges from the victim, while
manipulation involves actively changing the
victim's behavior." (Krebs
and Dawkins 1999)
2. Mechanisms for calculating what is the most
adaptive choice to make, between and within games
7
Basic structure of the simulation
  • Some number of individuals encounter each other.
  • When they do, they must decide between:
  • Playing a potentially cooperative (but risky) PD
    game, or
  • Choosing some other way of making a living
  • 3. And where each individual chooses between PD
    and NP (No Play), based on an expected value (EV)
    calculation.
  • Accumulated wealth determines whether they
    reproduce at the end of a generation, or suffer
    genotypic death
  • 4. Agents have a Probability of Cooperation
    (PC), the probability with which they cooperate
    in a given PD game (a minimal sketch of this
    structure follows below)
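A minimal Python sketch of this encounter structure, assuming the payoff
values given on slide 15 and an illustrative ALT of 4; the class and
function names, and the rule that both agents must prefer the PD over NP
before it is played, are illustrative assumptions rather than the authors'
code.

import random

# Payoff parameters from slide 15; ALT = 4 is an assumed illustrative value.
FREE_RIDING, MUTUAL_COOP, MUTUAL_DEFECT, SUCKER = 15, 5, -5, -15
ALT = 4.0  # payoff of the No Play (NP) alternative

class Agent:
    def __init__(self, pc):
        self.pc = pc        # Probability of Cooperation in a PD game
        self.wealth = 0.0   # accumulated wealth decides reproduction vs. genotypic death

def expected_value(own_pc, believed_partner_pc):
    """EV of playing the PD against a partner believed to cooperate
    with probability believed_partner_pc."""
    return (own_pc * believed_partner_pc * MUTUAL_COOP
            + own_pc * (1 - believed_partner_pc) * SUCKER
            + (1 - own_pc) * believed_partner_pc * FREE_RIDING
            + (1 - own_pc) * (1 - believed_partner_pc) * MUTUAL_DEFECT)

def encounter(a, b, a_belief_about_b, b_belief_about_a):
    """One encounter: each agent compares the EV of the PD with ALT;
    the PD is played only if both prefer it, otherwise both take NP."""
    if (expected_value(a.pc, a_belief_about_b) > ALT
            and expected_value(b.pc, b_belief_about_a) > ALT):
        a_coop = random.random() < a.pc
        b_coop = random.random() < b.pc
        if a_coop and b_coop:
            pay_a = pay_b = MUTUAL_COOP
        elif a_coop:
            pay_a, pay_b = SUCKER, FREE_RIDING
        elif b_coop:
            pay_a, pay_b = FREE_RIDING, SUCKER
        else:
            pay_a = pay_b = MUTUAL_DEFECT
    else:
        pay_a = pay_b = ALT
    a.wealth += pay_a
    b.wealth += pay_b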

8
BUT information about others' intentions is
needed... and
"One of the most important things to realize
about systems of animal communication is that
they are not systems for the dissemination of
truth. An animal selected to signal to another
animal may be selected to convey correct
information, misinformation, or both."
Robert Trivers (1985)
  • 5. Agents can lie (manipulate) with some
    success, but can also penetrate lies
    (mindread), meaning that they are equipped with
    appropriate cognitive capacities
  • that are transmitted from parent to offspring,
    subject to mutation

9
  • 6. Mutation happens .
  • with values magnitude and frequency on
    interval-level variables from which the
    probability variables are constructed
  • For example
  • Mindreading plus positive capacities to read
    others intentionsan integer above zero
  • Mindread minus negative capacities to read
    others intentionsan integer above zero
  • And the proportion Mindread Mindread plus
    /(Mindread plus Mindread minus)

7. There is a carrying capacity constraint,
with those whose accumulated wealth places them
below that constraint dying without reproducing
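A short sketch of how such mutation and the Mindread proportion might be
coded; the mutation frequency and magnitude values here are assumptions for
illustration, not the authors' settings.

import random

def mutate(count, frequency=0.01, magnitude=1):
    """Mutate an interval-level count (e.g. Mindread-plus or Mindread-minus)
    with some frequency and magnitude; both values here are assumptions."""
    if random.random() < frequency:
        count += random.choice([-magnitude, magnitude])
    return max(1, count)   # keep the count an integer above zero, as the slide requires

def mindread(mindread_plus, mindread_minus):
    """The probability-level Mindreading variable, constructed from the
    two underlying counts."""
    return mindread_plus / (mindread_plus + mindread_minus)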
10
  • 8. Agents each play the role of sender and
    receiver of messages about their cooperative
    dispositions. We assume:
  • Agents always convey the message "I will always
    cooperate", viz., my PC is 1.0
  • 9. Their actual PC will (normally) fall short of
    that. Thus:
  • Agents send 100 bits of "I will always
    cooperate" messages, some of which are (normally)
    false.
  • And
  • Their true and false bits will vary in the
    believability that the agent can muster
  • ...with believability defined as a
    probability between 0.0 and 1.0 (see the sketch
    below)
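A sketch of how these 100 bits might be generated, under two assumptions not
spelled out on the slides: that the share of true bits tracks the sender's
PC, and that believability varies around a Gaussian mean for truths and for
lies.

import random

def send_bits(sender_pc, mean_truth_believability, mean_lie_believability,
              n_bits=100, spread=0.1):
    """Generate the 100 'I will always cooperate' bits. Each bit is true with
    probability equal to the sender's PC (an assumption); its believability
    varies around the sender's mean for truths or for lies (the Gaussian
    spread is also an assumption)."""
    bits = []
    for _ in range(n_bits):
        is_true = random.random() < sender_pc
        mean = mean_truth_believability if is_true else mean_lie_believability
        believability = min(1.0, max(0.0, random.gauss(mean, spread)))
        bits.append((is_true, believability))
    return bits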

11
[Figure: believability scale from 0.0 to 1.0, marking the mean
believability of the sender's truths and of the sender's lies, and
dividing messages into those believed and those not believed by the
receiver]
  • Example:
  • Sender has a modest PC, truths somewhat more
    believable than lies
  • Receiver has a quite high level of Mistrust
  • Before Mindreading, neither true nor false
    messages are believed
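On the reading suggested by slides 13 and 22, Mistrust acts as a threshold
on believability; a one-function sketch of belief before mindreading, using
the bits produced by the send_bits sketch above:

def believed_bits(bits, mistrust):
    """A bit is believed only if its believability clears the receiver's
    Mistrust threshold; with a high threshold, neither truths nor lies get
    through, as in the example above."""
    return [(is_true, b) for is_true, b in bits if b > mistrust]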

12
NOW ADDING MINDREADING OF SENDERS... which makes
senders' false messages less believable for
receivers, and their true messages more
believable. Mindreading is a proportion between
0.0 and 1.0, with 0.0 accepting messages (true and
false) as sent, and 1.0 recognizing and accepting
all true messages, and recognizing and rejecting
all false messages (sketched below).
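A sketch of this adjustment; the 0.0 and 1.0 endpoints are as described on
the slide, while the linear interpolation between them is an assumption.

def apply_mindreading(bits, mindreading):
    """Shift each bit's believability by the receiver's mindreading proportion:
    at 0.0 the bits are accepted as sent; at 1.0 every true bit moves to
    believability 1.0 and every false bit to 0.0. The linear interpolation
    between those endpoints is an assumption."""
    return [(is_true, b + mindreading * ((1.0 if is_true else 0.0) - b))
            for is_true, b in bits]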
13
[Figure: believability scale from 0.0 to 1.0 after mindreading, with the
mean believability of lies shifted down and the mean believability of
truths shifted up]
  • ...after mindreading when (e.g.) the receiver has
    a .7 mindreading capacity
  • Now a slight majority of true bits are above the
    receiver's Mistrust threshold, and thus are believed
    by the receiver
  • Those few accepted bits are now the basis for the
    receiver's EV calculation (see the sketch below)
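One way the accepted bits could feed that EV calculation; the specific rule
below (taking the accepted share of the 100 bits as the partner's believed
PC) is an assumption, since the slides do not spell it out.

def believed_partner_pc(bits, mistrust, mindreading):
    """After mindreading shifts each bit's believability, the share of bits
    that clears the Mistrust threshold is taken as the partner's believed
    probability of cooperation, which then enters the expected_value()
    comparison against ALT sketched earlier. This exact rule is an assumption."""
    if not bits:
        return 0.0
    accepted = 0
    for is_true, believability in bits:
        shifted = believability + mindreading * ((1.0 if is_true else 0.0) - believability)
        if shifted > mistrust:
            accepted += 1
    return accepted / len(bits)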

14
[Figure: outcomes plotted against the value of the NP alternative, on an
axis running from s (sucker's payoff) through d-d (mutual defection), zero,
and c-c (mutual cooperation) to t (free riding). Labeled regions: "Low PC
dominates, society dies"; "Spikes to cooperative equilibrium frequent";
"Higher PC at equilibrium"; "More unstable spikes"; "No Play dominates"]
15
DEVELOPING THIS: We ran 90 simulations, using the
parameters: Free riding = 15, Mutual cooperation = 5,
Mutual defection = -5, Sucker's payoff = -15, and
with ten simulations run, in each case, with ALT
set at .5 intervals between 0 and 5 (the mutual
cooperation payoff value); a sketch of the sweep follows.
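A minimal sketch of that parameter sweep; run_simulation is a hypothetical
stand-in for the full simulation loop, and the 20,000-generation cap comes
from the Table 1 note on the next slide.

PAYOFFS = {"free_riding": 15, "mutual_cooperation": 5,
           "mutual_defection": -5, "suckers_payoff": -15}

def sweep(run_simulation, runs_per_value=10, generations=20_000):
    """Run the simulation ten times at each ALT value, stepping by .5
    between 0 and 5 (the mutual cooperation payoff)."""
    results = {}
    alt = 0.0
    while alt <= 5.0:
        results[alt] = [run_simulation(alt=alt, payoffs=PAYOFFS,
                                       generations=generations)
                        for _ in range(runs_per_value)]
        alt += 0.5
    return results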
16
Table 1: Cooperative transitions within the
parameter range 0 < ALT < c (based on ten
simulations for each value of ALT).
Cooperative transitions across 0 < ALT < Mutual
Coop (Mutual coop = 5; 100 runs of the
simulation, ten at each .5 value, until the 20,000th
generation)
Where EV of a PD exactly = ALT
17
(No Transcript)
18
(No Transcript)
19
(No Transcript)
20
(No Transcript)
21
(No Transcript)
22
  • In this case:
  • One high PC, high mindreading genealogy
    drifted to 1.0, its successive members
    reproducing by rejecting others' offers of PD
    play and living off the solitary NP payoff of 4.
  • For the previous 20 generations, members were
    offered PD play on 77% of encounters, viz., when
    mindreading pushed the 100 true messages sent
    by members above receivers' Mistrust thresholds.
  • In general, offers of PD play increased with an
    agent's PC...
  • BUT

23
TABLE 2: Multiple regression beta weights:
proportion of offered PD games accepted, by
Mistrust, Mindreading and PC (100 generations
prior to the cooperative transition).
    R² = .557, p < .001, N = 4978;
observations are on 50 agents in each of the 100
generations; missing cases had zero offers.
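A hedged sketch of how a regression like Table 2 could be reproduced from
simulation output, assuming a pandas DataFrame with one row per
agent-generation and hypothetical column names; variables are z-scored
first so the fitted coefficients are standardized (beta) weights.

import pandas as pd
import statsmodels.api as sm

def table2_betas(df: pd.DataFrame):
    """Beta weights for: proportion of offered PD games accepted, regressed
    on Mistrust, Mindreading and PC. Column names are hypothetical."""
    cols = ["prop_accepted", "mistrust", "mindreading", "pc"]
    z = (df[cols] - df[cols].mean()) / df[cols].std()   # z-score to get betas
    X = sm.add_constant(z[["mistrust", "mindreading", "pc"]])
    return sm.OLS(z["prop_accepted"], X).fit()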
24
  • ...all of those approaches were rejected by the
    members' own high Mindreading, which pushed the
    many lies (average PC = .25) below their own
    Mistrust thresholds.
  • But these two come into contact with each other,
    recognize each other as a good bet for a PD (EV >
    ALT), and play with each other
  • Within fifteen generations, their descendants had
    eliminated the less cooperative agents
  • ...and were now competing with each other for
    slots in the ecology, producing:
  • Downward pressure on mistrust; by generation
    5350 mean PC was .33, by comparison with .5 for the
    two founding parents.

25
Generally, the higher MUTUAL COOPERATION is
relative to ALTERNATIVE, the lower the
equilibrium level of cooperation after the
TRANSITION.
WHY? The flypaper theorem:
  • OPTIMUM LEVEL OF COOPERATION (given an
    alternative way of making a living) will be
  • HIGH ENOUGH TO ATTRACT COMPETENT MINDREADERS INTO
    PLAYING PD GAMES
  • BUT
  • LOW ENOUGH TO MAXIMIZE GAINS FROM DEFECTION IN
    JOINED PD GAMES

26
Conclusions (1)
  • The availability of an alternative way of making
    a living, within appropriate parameters, makes it
    possible for mutant cooperators to avoid being
    exploited in a nasty world
  • But to stay alive, they must also have a
    fairly high capacity for Mindreading, which
    originated in the dangers posed by low PC agents'
    greater willingness to offer PD games
  • And mindreading protects the cooperative
    equilibrium from invasion by low PC types.

27
Conclusions (2)
In the ancestral past, cooperative dispositions
evolved to their highest levels when the payoff
from mutual cooperation was closest to the
alternative, with the alternative not exceeding it
28
  • Conclusions (3): in particular, for Political
    Scientists who argue about rationality
  • We can distinguish between:
  • Rationality in action, and
  • Rationality in design.
  • This analysis suggests that a "rational design"
    for highly social animals such as ourselves
    involves (1) a well-developed capacity for
    mindreading, (2) modest levels of mistrust, and
    (3) quite high cooperative dispositions
  • NOTE that rationality in design thus resolves
    many of the empirical anomalies of rationality
    in action