When a Good Reputation isn’t Good Enough

Presentation Transcript


  1. When a Good Reputation isn’t Good Enough • Jonathan Traupman and Robert Wilensky, U.C. Berkeley

  2. Introduction • Reputation systems are a key component of many peer-to-peer systems • Lots of application-specific features, but most share a common structure • They aggregate feedback about transactions • They seem to create trust from thin air • Anecdotal evidence suggests reputation systems work pretty well

  3. Evaluation is difficult • E.g., peer-to-peer markets • Many trust signals exist besides reputation • The market and payment provider offer some indemnity • Difficult to separate out the role of the reputation system • Cannot realistically experiment with alternative reputation systems

  4. Some Questions • In the absence of external forces, is a reputation system sufficient for encouraging cooperation? • Under what conditions does stable cooperation arise? • How well or poorly do existing reputation systems meet these requirements? • How can we design systems to better encourage cooperation?

  5. A Game-theoretic model • Model the trading process as a series of simple games • Interaction game: agents decide whether or not to trade • Transaction game: agents decide to cooperate or defect • Reputation game: agents decide how to leave feedback • Observe what strategies are optimal under different conditions
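As a rough, self-contained sketch of how one simulated episode might chain the three games (the Agent fields, probabilities, and decision rules here are illustrative placeholders, not the authors' actual parameters or code):

```python
import random
from dataclasses import dataclass

@dataclass
class Agent:
    interactivity: float   # probability of agreeing to a trade (hypothetical)
    honesty: float         # probability of cooperating in the transaction
    feedback_rate: float   # probability of leaving feedback at all

def play_round(a: Agent, b: Agent) -> None:
    """One episode: interaction game, then transaction game, then reputation game."""
    # Interaction game: no direct payoff, but nothing happens unless both agree.
    if random.random() > a.interactivity or random.random() > b.interactivity:
        return  # no trade (the paper adds a small penalty for failing too often)
    # Transaction game: each side independently cooperates or defects.
    a_coop = random.random() < a.honesty
    b_coop = random.random() < b.honesty
    # Reputation game: each side may (or may not) rate the other.
    if random.random() < a.feedback_rate:
        print("A rates B:", "positive" if b_coop else "negative")
    if random.random() < b.feedback_rate:
        print("B rates A:", "positive" if a_coop else "negative")
```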

  6. Interaction game • Simultaneous, perfect information • Played repeatedly until a pair willing to interact is found • No direct payoffs • Small penalty for failing too often

  7. Transaction Game • Simultaneous, perfect information • Agents choose whether to cooperate or defect • Payoffs based on both agents’ behavior • Instance of the Prisoners’ Dilemma
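For reference, the transaction game's payoff structure follows the standard Prisoners' Dilemma ordering T > R > P > S; the numbers below are common textbook values, since the paper's exact payoffs are not given here:

```python
# (row player's payoff, column player's payoff), keyed by the two moves.
# Illustrative values with T=5 > R=3 > P=1 > S=0: mutual cooperation beats
# mutual defection, but each side is individually tempted to defect.
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),  # reward R for mutual cooperation
    ("cooperate", "defect"):    (0, 5),  # sucker S vs. temptation T
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),  # punishment P for mutual defection
}
```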

  8. Reputation Game • Mixed game • Perfect information • First move simultaneous • Subsequent moves sequential • No direct payoff, but outcome influences reputation • “Tragedy of the Commons” • Honest reputations benefit the community • Individuals benefit from dishonesty or apathy
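A much-simplified sketch of this stage, showing how the rate parameters from slide 9 could enter (the dict field names are hypothetical, and the paper's game handles later sequential moves in more detail):

```python
import random

def reputation_game(a, b, a_coop, b_coop):
    """Simplified feedback exchange with a retaliation option.

    a and b are dicts of the reputation parameters from slide 9
    (hypothetical field names); a_coop/b_coop are the transaction outcomes.
    """
    def first_move(agent, partner_cooperated):
        # Willingness to rate first depends on what the partner did.
        rate = agent["first_pos_rate"] if partner_cooperated else agent["first_neg_rate"]
        if random.random() < rate:
            return "positive" if partner_cooperated else "negative"
        return None  # stay silent for now (apathy)

    # Opening move is simultaneous: each decides blind whether to rate first.
    fa, fb = first_move(a, b_coop), first_move(b, a_coop)
    # Subsequent moves are sequential: an agent hit with a negative may
    # retaliate with a negative of its own, regardless of the truth.
    if fa == "negative" and fb is None and random.random() < b["retaliation_rate"]:
        fb = "negative"
    if fb == "negative" and fa is None and random.random() < a["retaliation_rate"]:
        fa = "negative"
    return fa, fb
```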

  9. Evolutionary Simulator • Repeatedly play the three games • Periodically evaluate agent performance: mean payoff per transaction • Keep successful agents • “Breed” new agents by combining parameters of successful parents • Interaction parameters: new-user interactivity, low-experience interactivity, high-experience interactivity • Transaction parameter: honesty • Reputation parameters: 1st negative rate, 1st positive rate, retaliation rate
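A minimal sketch of the selection-and-breeding step, assuming agents are plain dicts of the parameters above and `evaluate` returns mean payoff per transaction (both illustrative assumptions, not the authors' implementation):

```python
import random

def next_generation(agents, n_keep, evaluate):
    """Select the best-performing agents, then breed children from them."""
    # Rank by mean payoff per transaction and keep the top performers.
    survivors = sorted(agents, key=evaluate, reverse=True)[:n_keep]
    children = []
    while len(survivors) + len(children) < len(agents):
        p1, p2 = random.sample(survivors, 2)
        # Each parameter is inherited from one parent or the other.
        children.append({k: random.choice((p1[k], p2[k])) for k in p1})
    return survivors + children
```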

  10. Experiments • All experiments performed 20 times • Unconstrained evolution: reputation system modeled after Percent Positive Feedback (PPF) on eBay • Retaliation prohibited: retaliation rate parameter forced to zero • Simultaneous feedback: retaliation rendered impossible by forcing agents to leave feedback blind
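For concreteness, PPF is conventionally the share of positive ratings among all positive and negative ratings received; a one-function version under that assumption:

```python
def percent_positive_feedback(positives: int, negatives: int) -> float:
    """PPF as popularized by eBay: positives / (positives + negatives)."""
    total = positives + negatives
    # The convention for zero-feedback users varies; returning 100% here
    # is an assumption made purely for illustration.
    return 100.0 * positives / total if total else 100.0
```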

  11. Unconstrained Evolution • Most similar to current market conditions • None of the markets were able to maintain cooperation • Agents evolved a moderate (~50%) retaliation rate • Retaliation caused all agents to hesitate to leave feedback first • A dysfunctional reputation system permits defection to emerge as the optimal strategy

  12. Unconstrained Evolution (results figure; graphic not reproduced in this transcript)

  13. Disabled Retaliation • Knock out the retaliation parameter • 14 of 20 markets remained cooperative for 10,000 generations • 2 oscillated • 4 remained uncooperative • Much higher participation in the reputation system • The lack of direct incentives for honest feedback allowed agent apathy to prevent cooperation in the remaining markets

  14. Disabled Retaliation • Results improve further if we force the 1st negative rate to 1.0 • Retaliation clearly is an obstacle to cooperation

  15. Simultaneous Feedback • Disabling retaliation outright is not possible in a real marketplace • Simultaneous feedback is a common suggestion • Modify the reputation game to be simultaneous rather than sequential • You can’t retaliate if you don’t know what feedback you’re getting • There are still other ways to game the system, but it’s a good first step
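The paper's model simply makes the feedback moves simultaneous; in a real marketplace, one way to enforce that blindness (an assumption here, not the paper's proposal) is a commit-reveal scheme, where both ratings are committed before either is revealed:

```python
import hashlib, os

def commit(rating: str) -> tuple[bytes, bytes]:
    """Bind to a rating without revealing it: hash of rating plus a random nonce."""
    nonce = os.urandom(16)
    return hashlib.sha256(rating.encode() + nonce).digest(), nonce

def reveal_ok(commitment: bytes, rating: str, nonce: bytes) -> bool:
    """Check that a revealed rating matches its earlier commitment."""
    return hashlib.sha256(rating.encode() + nonce).digest() == commitment

# Both parties commit first, then reveal together; neither can change a
# rating after seeing the other's, so retaliation cannot occur.
c_a, n_a = commit("positive")
c_b, n_b = commit("negative")
assert reveal_ok(c_a, "positive", n_a) and reveal_ok(c_b, "negative", n_b)
```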

  16. Simultaneous Feedback • 19 of 20 markets oscillated • One remained non-cooperative throughout • Agent apathy remains a problem • In highly cooperative markets, agents get lazy about leaving and using feedback • This permits defectors to gain a foothold • Eventually, cooperation is restored

  17. Conclusions • Under the right conditions, a simple reputation system like PPF can maintain cooperation • Users must participate frequently and honestly • As currently implemented, PPF cannot maintain cooperation on its own • Permits retaliation • Does nothing to prevent apathy • Basically confirms intuitions about reputation systems • Provides a better theoretical and experimental foundation for these arguments

  18. Conclusions • Provides some guidelines for designing better reputation systems • Must prohibit retaliation and other means of gaming the reputation system • Should create incentives for honest participation to combat user apathy
