Title: Trust in Multi-Agent Systems
1. Trust in Multi-Agent Systems
2. Agenda
- Introduction
- Trust Models
- Trust in E-Commerce
  - Reputation Mechanisms
  - Trust Revelation
3. Introduction
- Trust has been studied in different fields over the years:
  - Game Theory and Economics
  - Sociology
  - Distributed Artificial Intelligence (DAI)
  - Risk Management
  - Biology
- There exist many definitions, points of view, and models of trust.
4. Why Trust?
(Chart omitted. Source: The Internet Fraud Complaint Center.)
5. Why Trust? (cont'd)
(Chart omitted. Source: The Internet Fraud Complaint Center.)
6. Why Trust in MAS?
- E-Commerce: credit history, transactions between buyers and sellers.
- Virtual communities and information sources: newsgroups, forums, etc.
- Cooperation between self-interested agents in open environments, e.g. coalition formation, task delegation.
7. Defining Trust
- "Trust (or, symmetrically, distrust) is a particular level of the subjective probability with which an agent assesses that another agent or group of agents will perform a particular action, both before he can monitor such action (or independently of his capacity ever to be able to monitor it) and in a context in which it affects his own action." (Gambetta, 1990)
8. Problems with the Definition
- Is distrust really symmetric to trust?
- Can trust be reduced to a subjective probability measure?
- The definition makes no reference to the dynamics of trust over time.
9. Agenda
- Introduction
- Trust Models
- Trust in E-Commerce
  - Reputation Mechanisms
  - Trust Revelation
10. Marsh's Trust Model
11. Definitions and Notations
- A: the set of agents (x, y, ... ∈ A).
- α, β, ...: situations.
- For x, y ∈ A we define K_x(y) at time t as a Boolean predicate indicating whether x knows (has met with) y at time t.
- Trust is separated into three aspects:
  - Basic Trust
  - General Trust
  - Situational Trust
12. Definitions and Notations (cont'd)
- Basic Trust: T_x^t for x ∈ A at time t, representative of x's general trust disposition.
- General Trust: given x, y ∈ A, T_x(y)^t denotes the amount of trust x has in y at time t.
- Situational Trust: given x, y ∈ A and a situation α, T_x(y, α)^t denotes the amount of trust x has in y in α at time t.
- Values of trust are taken from [-1, 1).
13. Definitions and Notations (cont'd)
- U_x(α)^t: the utility gained by agent x from situation α at time t. Values from [-1, 1].
- I_x(α)^t: the importance of situation α at time t for agent x. Values from [0, 1].
14. Trust and Cooperation
- Trust can be used by agent x to decide whether or not to cooperate with agent y in a situation α.
- Assumptions:
  - x has a choice whether to cooperate.
  - There is someone to cooperate with (y).
  - K_x(y) is true.
  - x is not in debt to y.
  - x has knowledge of the situation.
- Under these assumptions, cooperation occurs when the Situational Trust of x in y is above a certain threshold.
15. Determining Situational Trust
- To estimate Situational Trust, agent x can use
  T_x(y, α)^t = U_x(α)^t × I_x(α)^t × T̂_x(y)^t,
  where T̂_x(y)^t is the general trust estimation of x in y.
- Problems with this formula:
  - The product of two negative numbers is positive, so a harmful situation combined with a distrusted agent yields positive trust.
  - These numbers will usually be fractions, so the product becomes very small.
16. Estimating General Trust
- Estimation of T̂_x(y)^t in the Situational Trust formula is done after considering all the values T_x(y) has taken in the past.
- Three approaches:
  - Maximum estimate: optimism.
  - Minimum estimate: pessimism.
  - The mean: pragmatism/realism.
- How should one estimate if there are too few experiences?
- Should the entire history of experiences be saved?
- How many experiences should be taken into consideration?
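A small sketch of the three estimation strategies, assuming the history of past trust values is simply kept as a list of numbers:

from statistics import mean

def estimate_general_trust(history: list[float], strategy: str = "realism") -> float:
    """Estimate T_hat_x(y) from past trust values, per Marsh's three approaches."""
    if not history:
        raise ValueError("no experiences yet -- fall back on basic trust disposition")
    if strategy == "optimism":      # maximum estimate
        return max(history)
    if strategy == "pessimism":     # minimum estimate
        return min(history)
    return mean(history)            # pragmatism/realism

past = [0.2, -0.1, 0.6, 0.3]
print(estimate_general_trust(past, "optimism"))   # 0.6
print(estimate_general_trust(past, "pessimism"))  # -0.1
print(estimate_general_trust(past))               # 0.25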
17. The Cooperation Threshold
- As mentioned earlier, x cooperates with y when its Situational Trust exceeds a cooperation threshold.
- The cooperation threshold is computed as follows:
  Cooperation_Threshold_x(α)^t = (Perceived_Risk_x(α)^t / (Perceived_Competence_x(y, α)^t + T̂_x(y)^t)) × I_x(α)^t
- The idea behind the formula is that the more important a situation is, the more we need to trust someone before entering into cooperation.
18. The Cooperation Threshold (cont'd)
- Problems with the formula:
  - When Perceived_Competence_x(y, α)^t + T̂_x(y)^t = 0, the threshold is undefined.
  - Is the addition in the denominator legal? It sums two conceptually different quantities.
  - When the perceived risk is high and the denominator is negative, the threshold turns negative, so cooperation will probably occur. (See the sketch below.)
- There exists another formula in which the Importance element appears in the denominator: the more important a situation is, the faster we should get it done.
19. Estimating the Risk
- Given a situation α, three cases are considered:
  - The agent has no knowledge or experience of α.
  - The agent has incomplete knowledge or experience of α.
  - The agent has considerable knowledge or experience of α.
- In the first case, agent x can only fall back on a default value for the perceived risk.
- In the second case, x could use learning to estimate the expected risk.
- In the third case, the agent simply calculates the expected risk.
20. Estimating the Competence
- Given a situation α and an agent y, there are three cases:
  - The agent is not known at all.
  - The agent is known, but not in this or similar situations.
  - The agent is known and trusted in similar situations.
- In the first case, the competence estimate can only fall back on x's general disposition (basic trust).
21. Estimating the Competence (cont'd)
- In the second case, the competence is estimated over B, where B is the set of all interactions x had with y.
- In the third case, the competence is calculated from y's perceived competence in the similar situations.
22. Other Aspects of Trust
- Reciprocation:
  - Cooperation increases trust.
  - Defection decreases trust.
- Dissemination of trust knowledge through the social network.
- How can the trust values held by other agents be used to estimate trust?
23. Jonker and Treur's Model: The Dynamics of Trust Based on Experiences
24. Overview
- Purpose of the model: to analyze and formalize the dynamics of trust in the light of experience.
- Trust can be developed over time as the outcome of a series of observations.
- The trusting agent performs a continued verification and validation of the subject of trust over time.
25. Overview (cont'd)
- Two types of experiences with the subject of trust:
  - Trust-negative experiences, during which the agent loses some of its trust.
  - Trust-positive experiences, during which the agent gains trust to some degree.
26. Representations of Trust
- Qualitative: using specific qualitative labels. Example: the four labels unconditional distrust, conditional distrust, conditional trust, unconditional trust.
- Quantitative: using numbers as a representation. Example: trust in Marsh's model.
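One natural bridge between the two representations is to discretize a quantitative value into the four labels above; a toy sketch in which the threshold values are arbitrary assumptions, not part of either model:

def qualify(trust: float) -> str:
    """Map a quantitative trust value in [-1, 1] to one of four qualitative labels.

    The threshold choices (-0.5, 0.0, 0.5) are arbitrary, for illustration only.
    """
    if trust < -0.5:
        return "unconditional distrust"
    if trust < 0.0:
        return "conditional distrust"
    if trust < 0.5:
        return "conditional trust"
    return "unconditional trust"

print(qualify(-0.7), qualify(0.3), qualify(0.9))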
27. Formal Framework: Definitions
- E: a partially ordered set of experience classes. Examples: {-, +} with - < +; the interval [-1, 1].
- E may have the following structure:
  - Two sets E_pos and E_neg, such that ev1 ∈ E_neg and ev2 ∈ E_pos implies ev1 < ev2.
  - A neutral element 0_E of E such that ev1 < 0_E < ev2 for all ev1 ∈ E_neg, ev2 ∈ E_pos.
28. Formal Framework: Definitions (cont'd)
- ES: the set E^ℕ of experience sequences e = (e_0, e_1, ...) with every e_i ∈ E. ES is partially ordered pointwise: e ≤ f iff e_i ≤ f_i for all i.
- T: a partially ordered set of trust qualifications. Examples: a set of trust labels as in the qualitative representation; the interval [-1, 1].
29. Formal Framework: Definitions (cont'd)
- T may have the following structure:
  - Two sets T_pos and T_neg indicating the positive and negative elements of T.
  - A neutral element 0_T of T, such that tv1 < 0_T < tv2 for all tv1 ∈ T_neg, tv2 ∈ T_pos.
30. Trust Evolution Functions
- Trust evolution functions relate sequences of experiences to trust representations.
- Definition: a trust evolution function is a function te : ES × ℕ → T.
- Let e ∈ ES and i ∈ ℕ; then te(e, i) denotes the trust after the experiences e_0, ..., e_{i-1}.
- Trust evolution functions are ordered by te1 ≤ te2 iff te1(e, i) ≤ te2(e, i) for all e and i.
31. Important Properties of Trust Evolution Functions
- The following properties (in which e, f ∈ ES and k ∈ ℕ) can be defined:
  - Future independence: te is future independent if its values only depend on the experiences in the past.
  - Indistinguishable past: if the prefix e_0, ..., e_{k-1} is a temporal permutation of f_0, ..., f_{k-1}, then te(e, k) = te(f, k).
32. Properties of Trust Evolution Functions (cont'd)
- Positive trust extension: extending a sequence with a positive experience does not decrease trust.
- Negative trust extension: extending a sequence with a negative experience does not increase trust.
- Degree of memory based on window n: te(e, k) depends only on the last n experiences.
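Such properties can be spot-checked numerically for a concrete trust evolution function. Below is a small falsification harness for the window property; the helper names and the perturbation approach are illustrative, not from the paper:

import random

def has_memory_window(te, n: int, k: int = 15, trials: int = 200) -> bool:
    """Spot check of 'degree of memory based on window n': perturb all
    experiences older than the window and see whether te(e, k) changes."""
    for _ in range(trials):
        e = [random.uniform(-1, 1) for _ in range(k)]
        f = list(e)
        for i in range(k - n):                    # rewrite everything outside the window
            f[i] = random.uniform(-1, 1)
        if te(e, k) != te(f, k):
            return False
    return True

def window_mean_te(n: int):
    """A toy trust evolution function: the mean of the last n experiences (0 if none)."""
    def te(e, k):
        window = e[max(0, k - n):k]
        return sum(window) / len(window) if window else 0.0
    return te

print(has_memory_window(window_mean_te(3), n=3))  # True: only the window matters
print(has_memory_window(window_mean_te(5), n=3))  # False: older experiences leak in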
33. Properties of Trust Evolution Functions (cont'd)
- Degree of trust dropping n: n consecutive negative experiences suffice to drive trust to its minimal value.
- Degree of trust gaining n: n consecutive positive experiences suffice to drive trust to its maximal value.
- Positive/negative trust fixation of degree n: if for some i the trust value te(e, k) is maximal/minimal for all i < k < i + n, then te(e, k) remains maximal/minimal for all k > i.
34. Trust Update Functions
- Do not use an expensive representation of all past experiences.
- Use each experience only to update the current trust value.
- Definition: a trust update function is a function tu : E × T → T.
35. TE and TU Functions
- Definition (trust evolution function generated by a trust update function):
- Let tu be a trust update function and it an initial trust value. The trust evolution function te generated by tu, denoted te_{tu,it}, is inductively defined by
  te(e, 0) = it
  te(e, i+1) = tu(e_i, te(e, i))
36. Quantitative Example
- E = [-1, 1], T = [-1, 1].
- There is a rate of inflation d between 0 and 1 per experience step.
- The trust update function is defined to be
  g_d(ev, tv) = d·tv + (1 - d)·ev.
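The inductive definition of slide 35 and the update function g_d combine into a few lines of Python; a minimal sketch:

from functools import reduce

def g(d: float):
    """The update function g_d(ev, tv) = d*tv + (1 - d)*ev."""
    return lambda ev, tv: d * tv + (1 - d) * ev

def trust_evolution(tu, it: float, experiences) -> float:
    """te(e, 0) = it;  te(e, i+1) = tu(e_i, te(e, i))."""
    return reduce(lambda tv, ev: tu(ev, tv), experiences, it)

# With d = 0.8, old trust decays slowly: three perfect experiences followed
# by one very bad one still leave mildly positive trust.
print(trust_evolution(g(0.8), it=0.0, experiences=[1, 1, 1, -1]))  # 0.1904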
37. Castelfranchi and Falcone's Trust Model: A Cognitive View of Trust
38. Cognitive View of Trust
- According to the model, x trusts y about g/α, where g is a specific world state and α is an action that produces g.
- Thus, trust can be seen as the delegation of an action/goal from x's plan to y.
- Trust should be divided into two main components:
  - The internal characteristics of the trustee (internal trust).
  - An evaluation of the probability and consistence of obstacles, opportunities, etc. (external trust).
39. Internal Trust
- The trustor must have a theory of mind of the trustee: a complex structure of beliefs and goals.
- Such a structure determines a degree of trust and an estimation of risk.
40. The Beliefs
- Competence Belief: a positive evaluation of the trustee; x believes that y is useful for the goal.
- Disposition Belief: x should believe that y will actually do what x needs.
- Dependence Belief: either x depends on y, or at least it is better for x to rely on y than not to.
- Fulfillment Belief: x believes that g will be achieved.
41. The Beliefs (cont'd)
- Willingness Belief: x believes that y has decided and intends to do α.
- Persistence Belief: x should believe that y is stable enough in his intentions.
- Self-Confidence Belief: x believes that y knows that y can do α; y is self-confident.
42. Why Isn't Trust Just a Subjective Probability?
- After evaluating both internal and external trust, we can estimate the probability that our goal will be achieved: a subjective probability.
- So why isn't trust just a subjective probability?
- The trustee's mental state might change even though the situation remains the same.
- The composition of trust produces completely different intervention strategies: manipulating the external variables is different from manipulating the internal ones.
43. Agenda
- Introduction
- Trust Models
- Trust in E-Commerce
  - Reputation Mechanisms
  - Trust Revelation
44. Problems of Trust in E-Commerce
- Potential buyers have no physical access to the product of interest and are therefore susceptible to misrepresentation by the sellers.
- Sellers or buyers may refuse to commit to the transaction, renegotiate the agreed price, or receive the product and not send the money, or vice versa.
- Therefore, each party needs an accurate estimation of the other party's trustworthiness: both over-trusting and under-trusting lead to market inefficiencies.
45. Problems of Trust in E-Commerce (cont'd)
- Possible solutions:
  - Trust learning (when a history of interactions exists).
  - Reputation mechanisms (when a history of interactions is not available to the agent).
  - Trust revelation mechanisms.
46. Agenda
- Introduction
- Trust Models
- Trust in E-Commerce
  - Reputation Mechanisms
  - Trust Revelation
47. Maes and Zacharia's Work: Trust Management Through Reputation Mechanisms
48. Solution by Reputation Mechanisms
- Reputation is a social quantity calculated from the actions of a given agent x and the observations made by others in the embedded social network in which x resides.
- It is conceived as a multidimensional value and affects the amount of trust inspired by x.
- Reputation can also be calculated for web pages, information sources, etc.
49. Overview of Reputation Systems
- Reputation systems can be divided into two major categories: noncomputational and computational reputation systems.
- The Better Business Bureau (BBB) is an example of a noncomputational reputation system.
- It is a centralized repository of consumer and business alerts where records of complaints and consumer warnings are stored. Numerical ratings of business or consumer trustworthiness are not provided.
50. Overview of Reputation Systems (cont'd)
- Computational methods cover a broad domain of applications: rating newsgroup postings, web pages, and people and their expertise in specific areas.
- Fair Isaac provides software to credit history agencies in order to assess the risk involved in giving a loan to an end consumer.
51. Overview of Reputation Systems (cont'd)
- Yenta clusters people with common interests according to recommendations from users who know each other.
- Weaving a Web of Trust is a recommendation system for web pages.
- Both systems require the prior existence of social relationships among their users.
- GroupLens is a system for rating Usenet articles and presenting them to the user in a personalized manner.
52. Overview of Reputation Systems (cont'd)
- Bizrate is an online shopping guide that provides ratings for the 500 largest companies trading online.
- Ratings are collected in two different ways:
  - Through an agreement with the rated company.
  - Through an editorial assessment of the rated company.
- Scores on a scale of 1 to 5 are given in different categories.
53. Overview of Reputation Systems (cont'd)
- Online auction sites like OnSale Exchange, eBay, and Amazon Auctions possess reputation systems as well.
- In OnSale, only sellers are rated, and their reputation value is calculated as the average of all the ratings they have received through their usage of the OnSale system.
- In Amazon, the reputation system is almost the same, except that buyers are rated as well.
- In eBay, sellers receive +1, 0, or -1 as feedback after each transaction. Reputation values are calculated as the sum of these ratings over the last six months.
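As a toy illustration of the eBay scheme just described (a six-month window over +1/0/-1 feedback); the data layout is invented for the example:

from datetime import datetime, timedelta

# (timestamp, feedback) pairs; feedback is +1, 0, or -1 per transaction.
feedback_log = [
    (datetime(2002, 1, 10), +1),
    (datetime(2002, 3, 2), -1),
    (datetime(2002, 5, 20), +1),
    (datetime(2001, 6, 1), +1),   # older than six months -- ignored
]

def ebay_reputation(log, now: datetime, window_days: int = 182) -> int:
    """Sum of feedback values received within the last ~6 months."""
    cutoff = now - timedelta(days=window_days)
    return sum(score for when, score in log if when >= cutoff)

print(ebay_reputation(feedback_log, now=datetime(2002, 6, 1)))  # 1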
54. Problems of Reputation Mechanisms
- Zacharia and Maes try to solve two important problems of reputation systems:
  - It is relatively easy to change one's identity once one's reputation value falls.
  - Existing reputation systems do not handle well fake transactions between friends, performed in order to raise reputation values.
55. SPORAS: A Reputation Mechanism for Loosely Connected Communities
56. SPORAS's Principles
- SPORAS is based on the following principles:
  - New users start with a minimum reputation value and build up reputation during their activity in the system.
  - The reputation value of a user never falls below the reputation of a new user.
  - After each transaction, the reputation values of the involved users are updated according to the feedback provided by the other parties, which reflects their trustworthiness in the latest transaction.
57. SPORAS's Principles (cont'd)
  - Two users may rate each other only once; if they interact more than once, the most recently submitted value is used.
  - Users with very high reputation values experience much smaller rating changes after each update.
  - Ratings are discounted over time, so that the most recent ratings have more weight in the evaluation of a user's reputation.
58. SPORAS's Algorithm
- New users start with a reputation value of 0 and can advance up to the maximum D = 3000.
- Reputation ratings W_i vary from 0.1 (terrible) to 1 (perfect).
- At time i the reputation value is updated by
  R_i = R_{i-1} + (1/θ) · Φ(R_{i-1}) · R_i^other · (W_i - R_{i-1}/D),
  where R_i^other is the reputation of the rating user.
- θ: the effective number of ratings taken into account.
59. SPORAS's Algorithm (cont'd)
- The parameter σ is the acceleration factor of the damping function
  Φ(R) = 1 - 1/(1 + e^{-(R-D)/σ}).
- Its purpose is to slow down the changes for very reputable users.
60. SPORAS's Algorithm (cont'd)
- It can be shown that the values of R_i always remain between 0 and D, so users have no incentive to change their identities.
- Fake transactions have a smaller effect here because each update is proportional to R_i^other, the reputation of the rating user.
- The equation for R_i is actually a machine learning algorithm which guarantees that R_i will asymptotically converge to the actual R. The speed of convergence is determined by the learning factor 1/θ.
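A sketch of a single SPORAS update step following the recursive equation and damping function above; the parameter values are arbitrary choices for the example:

import math

D = 3000.0      # maximum reputation
THETA = 10.0    # effective number of ratings
SIGMA = 100.0   # acceleration factor of the damping function

def damping(r: float) -> float:
    """Phi(R) = 1 - 1/(1 + exp(-(R - D)/sigma)): close to 1 for small R,
    decreasing as R approaches D."""
    return 1.0 - 1.0 / (1.0 + math.exp(-(r - D) / SIGMA))

def sporas_update(r_prev: float, rater_reputation: float, w: float) -> float:
    """One step: R_i = R_{i-1} + (1/theta) * Phi(R_{i-1}) * R_other * (W_i - R_{i-1}/D)."""
    expected = r_prev / D   # the rating "predicted" by the current reputation
    return r_prev + (1.0 / THETA) * damping(r_prev) * rater_reputation * (w - expected)

# A new user (R = 0) rated 0.9 by a well-reputed rater climbs quickly ...
print(sporas_update(0.0, rater_reputation=2000.0, w=0.9))
# ... while a near-maximal user changes far less, thanks to the damping function.
print(sporas_update(2900.0, rater_reputation=2000.0, w=0.9))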
61. Reliability of the Reputation Values
- Reliability is measured by the reputation deviation (RD) of the estimated reputations.
- A high RD means either that the user has not been active enough or that his behavior shows a lot of variation.
- Ignoring the damping function, RD can be calculated recursively using a constant forgetting factor λ.
62. HISTOS: A Reputation Mechanism for Highly Connected Communities
63. HISTOS's Principles
- Based on the assumption that one trusts one's friends more than one trusts strangers.
- HISTOS is therefore more personalized than SPORAS.
- Pairwise ratings are represented as a directed graph: nodes represent users and weighted edges represent the most recent reputation rating given.
- If there exists a path between two nodes A0 and AL, then A0 can compute a more personalized reputation value for AL.
64. HISTOS's Algorithm
- Say that A0 queries the system about the reputation value of AL.
- The system uses BFS to find all directed paths connecting A0 to AL whose length is at most N.
- If no such path exists between these nodes, the reputation value is calculated by the SPORAS algorithm.
65. HISTOS's Algorithm (cont'd)
- For each user A_k(n) at distance n from A0, with m_k(n) connected paths between A0 and A_k, his reputation is calculated as an average of the ratings received from the users at distance n-1, weighted by those users' own personalized reputation values.
- W_{jk}: the rating of user A_k by A_j.
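Since the slide's exact equation did not survive extraction, the sketch below shows only the BFS path enumeration plus a simplified path-weighted aggregation in its place; the graph layout and the weighting scheme are assumptions for illustration:

from collections import deque

# ratings[a][b] = most recent rating a gave b, in [0, 1] (layout assumed)
ratings = {
    "A0": {"A1": 0.9, "A2": 0.4},
    "A1": {"AL": 0.8},
    "A2": {"AL": 0.6},
}

def paths_up_to(graph, src, dst, max_len):
    """BFS enumeration of directed simple paths from src to dst of length <= max_len."""
    found, queue = [], deque([[src]])
    while queue:
        path = queue.popleft()
        if len(path) - 1 > max_len:
            continue
        if path[-1] == dst:
            found.append(path)
            continue
        for nxt in graph.get(path[-1], {}):
            if nxt not in path:                  # keep paths simple
                queue.append(path + [nxt])
    return found

def personalized_reputation(graph, src, dst, max_len=3):
    """Simplified HISTOS-style score: average the last-hop ratings on each path,
    weighted by the product of the ratings along the rest of the path."""
    paths = paths_up_to(graph, src, dst, max_len)
    if not paths:
        return None                              # fall back on SPORAS in the real system
    weights, scores = [], []
    for path in paths:
        hops = list(zip(path, path[1:]))
        w = 1.0
        for a, b in hops[:-1]:
            w *= graph[a][b]
        weights.append(w)
        scores.append(graph[hops[-1][0]][hops[-1][1]])
    return sum(w * s for w, s in zip(weights, scores)) / sum(weights)

print(personalized_reputation(ratings, "A0", "AL"))  # closer to A1's view: A0 trusts A1 more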
66. HISTOS's Algorithm: Example
67. Agenda
- Introduction
- Trust Models
- Trust in E-Commerce
  - Reputation Mechanisms
  - Trust Revelation
68. Motivation for Trust Revelation Mechanisms
- Trust assessment remains a serious practical problem because:
  - Trust assessment requires long-term interaction and is usually costly for the learning agent.
  - Reputation mechanisms have a lot of problems: fake transactions, false identities, interoperability and portability of reputation databases, etc.
  - The process of trust learning seldom produces complete and accurate estimates.
69. What Is a Trust Revelation Mechanism?
- Based on the assumption that each agent can have a more accurate estimate of his own trustworthiness than anyone else.
- Each agent reveals his own trust estimate to the others before the transaction begins.
- We have to design the mechanism so that malicious agents have no interest in declaring higher trust estimates than they really have (an incentive-compatible mechanism).
70. Contracting with an Uncertain Level of Trust
- In their work, Sandholm and Braynov study the impact of trust estimates and beliefs on market efficiency and negotiation.
- They prove that under certain conditions market efficiency is achieved.
- They suggest an incentive-compatible trust revelation mechanism that satisfies these conditions.
71. The Contracting Problem
- Trust is assumed to be a bilateral relation that involves an entity manifesting trust (the trustor) and an entity being trusted (the trustee).
- The trust model in this work follows Gambetta's definition: the trustor depends on the trustee for some favorable event E controlled by the trustee. The trustee behaves favorably with probability α.
- We consider a bilateral negotiation involving a buyer and a seller.
72. The Contracting Problem (cont'd)
- The seller is assumed to be completely trustworthy.
- The buyer's trustworthiness α may vary: E is the event "the buyer pays", with P(E) = α.
- Both buyer and seller are uncertain about α:
  - α_s is the seller's estimate.
  - α_b is the buyer's estimate.
- At this stage α_b is assumed to be declared truthfully, and α_s, α_b are considered to be common knowledge.
73. The Contracting Problem (cont'd)
- Let q be the quantity that is produced and sold.
- Let C(q) be the seller's cost function and V(q) the buyer's value function. Both are assumed to be non-negative.
- The contract price is denoted by P; the agents' expected utilities under their respective trust estimates are then
  U_s = α_s·P - C(q) and U_b = V(q) - α_b·P.
74. The Contracting Problem (cont'd)
- The agents are assumed to negotiate the transaction terms using the Nash bargaining solution, which in our case maximizes the product of the utilities, U_s·U_b.
- Once the contract price has been decided, q is chosen so as to maximize the utility functions of both agents.
- The buyer is assumed to be undertrusted, i.e., α_s < α_b. This is the more common situation.
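A numerical sketch of the bargaining step. The concrete utility forms follow the setup above (U_s = α_s·P - C(q), U_b = V(q) - α_b·P), the value and cost functions are hypothetical, and the grid search stands in for the paper's closed-form solution:

import math

def V(q): return 10 * math.sqrt(q)       # hypothetical buyer value function
def C(q): return q * q                   # hypothetical seller cost function

def nash_contract(alpha_s, alpha_b, q_grid, p_grid):
    """Maximize the Nash product U_s * U_b over a (q, P) grid, requiring U_s, U_b >= 0."""
    best, best_qp = -1.0, None
    for q in q_grid:
        for p in p_grid:
            u_s = alpha_s * p - C(q)
            u_b = V(q) - alpha_b * p
            if u_s >= 0 and u_b >= 0 and u_s * u_b > best:
                best, best_qp = u_s * u_b, (q, p)
    return best_qp

qs = [i / 10 for i in range(1, 51)]
ps = [i / 10 for i in range(1, 301)]
print(nash_contract(0.9, 0.9, qs, ps))   # trust matches trustworthiness
print(nash_contract(0.6, 0.9, qs, ps))   # undertrusting: smaller quantity exchanged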
75. Undertrusting
- Proposition 1 gives the general solution of the Nash bargaining problem in our case, i.e., the contract price P as a function of the trust estimates, V(q), and C(q).
- Proposition 2: For a given contract price P, the seller and the buyer prefer the same quantity.
76. Undertrusting (cont'd)
- Proposition 3: When the trust estimates agree, the quantity exchanged q1 maximizes V(q) - C(q).
- Proposition 4: If the value function V(q) is increasing and concave, and if the cost function is increasing and convex, then q1 is the maximal output.
77. Undertrusting (cont'd)
- Proposition 5: When trust matches trustworthiness (α_s = α_b = α), the seller and the buyer both maximize their individual utility functions.
- Conclusion: undertrusting leads to an inefficiency in the resource allocation. The seller's accuracy in estimating the buyer's trustworthiness results in maximization of social welfare, the quantity produced, and the agents' utility functions.
78. Undertrusting: Example
79. Undertrusting Example (cont'd)
- If trust matches trustworthiness, then q = 3 and P = 13.59.
- If the buyer is undertrusted (α_s < α_b), then q = 2 and P = 11.46, and both utilities are lower.
- Therefore, lack of trust reduces the quantity exchanged and the utility of each agent.
80. Improving Trustworthiness by Advance Payments
- One possible way for a distrusted buyer to convince the seller of his trustworthiness is to pay the seller some amount P0 in advance, before the commodity is delivered.
- Proposition 6: If α_s < α_b and the agents choose an advance payment contract, then the bargaining solution determines the amount of the advance payment P0.
81. Improving Trustworthiness by Advance Payments (cont'd)
- Proposition 6 (cont'd): the quantity exchanged q1 maximizes the function V(q) - C(q), the amount of the contract payment is zero (P = 0), and both agents obtain their maximal utilities.
82. Improving Trustworthiness by Advance Payments (cont'd)
- Proposition 7: The advance payment contract also gives each agent higher utility than the uncertain payment contract. If q1 and q2 are the quantities exchanged under advance and uncertain payments respectively, and if α_s < α_b, then each agent prefers the advance payment contract.
83. Advance Payments vs. Uncertain Payments
- The uncertain payment contract is optimal when trust matches trustworthiness.
- When trustworthiness is underestimated, the advance payment contract is optimal.
- In both, social welfare is maximized and the maximal output is produced.
- The only difference is the distribution of the welfare: in the uncertain payment contract it is divided equally, while in the advance payment contract the division depends on the trust estimates.
84. Incentive-Compatible Negotiation Mechanism under an Uncertain Trust Level
- Now the buyer does not necessarily declare his true estimate α_b, but some declared value, denoted here α_d.
- α_s and α_d are assumed to be common knowledge.
- Proposition 8: It is not beneficial for the buyer to declare an underestimate α_d < α_b.
- What happens if the buyer declares higher values of trustworthiness?
85. Incentive-Compatible Negotiation Mechanism (cont'd)
- If the buyer declares an overestimate α_d > α_b, he could gain from it.
86. Incentive-Compatible Negotiation Mechanism (cont'd)
- In the case of uncertain payment contracts, whether such a manipulation pays off depends on the value of α_s as well as on the value function and the cost function.
- Advance payment contracts are also not incentive-compatible, since the utility received by the buyer is proportional to his declared trustworthiness.
- Conclusion: the symmetric Nash bargaining solution cannot guarantee that the buyer will truthfully reveal the level of his trustworthiness.
87. Incentive-Compatible Negotiation Mechanism (cont'd)
- Let us look at a nonsymmetric Nash function F, a weighted product of the agents' utilities.
- An F-contract is a contract in which the price P maximizes the function F, together with the quantity produced q and the agents' utility functions.
- Every F-contract is either an uncertain payment contract or an advance payment contract.
88. Incentive-Compatible Negotiation Mechanism (cont'd)
- Proposition 9: If the agents choose an advance payment F-contract, then according to the Nash bargaining solution the amount of the advance payment does not depend on the buyer's declared trustworthiness, the quantity exchanged q1 maximizes V(q) - C(q), and the amount of the contract payment is zero (P = 0).
89. Incentive-Compatible Negotiation Mechanism (cont'd)
- Proposition 10: In an advance payment F-contract, the buyer cannot benefit from revealing a false level of trustworthiness.
- Proof: since the advance payment does not depend on the declared value α_d, the buyer has no incentive to declare an underestimated or overestimated value of his trustworthiness.
90. Bibliography
1. G. Zacharia and P. Maes. Trust Management Through Reputation Mechanisms. 2000.
2. S. Marsh. Formalising Trust as a Computational Concept. PhD thesis, 1994.
3. C. M. Jonker and J. Treur. Formal Analysis of Models for the Dynamics of Trust Based on Experiences. 1999.
91. Bibliography (cont'd)
4. C. Castelfranchi and R. Falcone. Trust Is Much More than Subjective Probability: Mental Components and Sources of Trust. 2000.
5. S. Braynov and T. Sandholm. Trust Revelation in Multiagent Interaction. 2002.