
Human Values in Information System Design

Presentation Transcript


  1. Human Values in Information System Design Batya Friedman Associate Professor The Information School © Batya Friedman 2003

  2. Overview • Values in Information Technology • Value Sensitive Design • Project 1. The Watcher & The Watched: Social Judgments about Privacy in a Public Place • Project 2. Cookies, Informed Consent, and Web Browsers • Propositions for Technology, Values, and the Justice System © Batya Friedman 2003

  3. Current UW Collaborators: Faculty, Staff, and Students Faculty: Alan Borning, Ph.D. (CSE); Sybil Carrère, Ph.D. (Nursing); Peter H. Kahn, Jr., Ph.D. (Psych.); David Notkin, Ph.D. (CSE); Zoran Popović, Ph.D. (CSE); Paul Waddell, Ph.D. (Urban Plan.). Research Staff: AJ Brush, Ph.D.; Brian Gill, Ph.D. Ph.D. Students: Nathan Freier (iSchool); Erika Feldman (Psych.); Janet Davis (CSE); Irene Alexander (Eng.). Masters Students: Nicole Gustine; Rachel Severson. Undergraduates: Brandon Rich (recent grad); Jonathan Sabo; Scott Santens; Robin Sodeman; Anna Stolyar. © Batya Friedman 2003

  4. Collaborators and Consultants from Other Institutions Cornell University: Lynette I. Millett. New York University: Helen Nissenbaum, Ph.D.; Daniel C. Howe. Princeton University: Edward Felten, Ph.D. Purdue University: Alan Beck, Ph.D.; Gail Melson, Ph.D.; Nancy Edwards, Ph.D.; Brian Gilbert; Trace Roberts. Rivendel Con. & Design: Austin Henderson, Ph.D. (consultant). © Batya Friedman 2003

  5. Values in Information Technology • Numerous, strong, and complex interactions between technology and enduring human values • Examples (from among many possibilities): • Privacy • Trust • Accountability • Ownership and property • Freedom from bias • Human welfare (safety, psych. well-being) • Universal usability • Environmental sustainability © Batya Friedman 2003

  6. Goals for Value Sensitive Design • Be proactive: Integrate the consideration of human values with design work (as opposed to providing an outside critique) • Develop a design methodology that is principled and comprehensive • This is a very hard problem, and VSD is certainly not the only possible approach – but we believe it provides a useful methodology © Batya Friedman 2003

  7. Interactional Perspective • Value Sensitive Design is an interactional theory • In general, we don’t view values as inherent in a given technology • However, we also don’t view a technology as value-neutral • Rather, some technologies are more suitable than others for supporting given values • Investigating these value suitabilities (along with what values and whose values) is a key task of VSD © Batya Friedman 2003

  8. VSD’s Tripartite Methodology • Conceptual investigations • Philosophically informed analyses of the values and value conflicts involved in the system • Technical investigations • Identify existing or develop new technical mechanisms; investigate their suitability to support or not support the values we wish to further • Empirical investigations • Using techniques from the social sciences, investigate issues such as: Who are the stakeholders? Which values are important to them? How do they prioritize these values? • These are applied iteratively and integratively © Batya Friedman 2003

  9. Direct and Indirect Stakeholders • Direct stakeholders: Interact with the system being designed and its outputs • Indirect stakeholders: Don’t interact directly with the system, but are affected by it in significant ways © Batya Friedman 2003

  10. Two Projects Investigating Privacy in a Public Place • In a Physical Place: The Watcher and The Watched • In a Virtual Place: Cookies, Informed Consent, and Web Browsers © Batya Friedman 2003

  11. Privacy in Public (Warren and Brandeis, Harvard Law Review, 1890) “[While in earlier times], the state of the photographic art was such that one’s picture could seldom be taken without his consciously ‘sitting’ for the purpose, the law of contract or of trust might afford the prudent man sufficient safeguards against the improper circulation of his portrait; but since the latest advances in photographic art have rendered it possible to take pictures surreptitiously, the doctrines of contract and of trust are inadequate to support the required protection” (p. 179). © Batya Friedman 2003

  12. Some Empirical Findings on Privacy • On the universal side, the empirical evidence points to the existence of and functional need for some form of privacy in all societies studied to date (cf. Harris et al., 1995; Roberts & Gregor, 1971; Westin, 1984) • On the side of variation, the manifestations, regulations, and mechanisms of privacy vary widely across cultures (cf. Biggs, 1970; Moore, 1984; Murphy, 1964) • Recent privacy polls in the United States: [1] Equifax-Harris; [2] Harris; [3] Privacy and American Business © Batya Friedman 2003

  13. An Inside Office © Batya Friedman 2003

  14. Room with a View – The Laboratory Experiment (Peter Kahn, Batya Friedman, Jennifer Hagman, Sybil Carrère) (N = 90; 30 university students in each condition) © Batya Friedman 2003

  15. Room with a View – The Three Conditions: Real Window; Blank Wall; HDTV Real-Time Plasma Display © Batya Friedman 2003

  16. The Watcher and The Watched: Social Judgments about Privacy in a Public Place (Batya Friedman, Peter Kahn, Jennifer Hagman, 2004) (Surveys: N = 750; Interviews: N = 120) The Watcher The Watched The Camera © Batya Friedman 2003

  17. [Data table: percentages of concern responses, broken out by condition (Watcher/Watched), method (interviews, n = 30 per cell; survey, n = 750), and gender (M/F); the table layout is not recoverable from this transcript.] © Batya Friedman 2003

  18. Why Do People Hold These Views? • For “all right” evaluations (on average): • Personal Interest (31%) • Functionality (31%) • Social Expectations (24%) • For “not all right” evaluations (on average): • Functionality (34%) • Social Expectations (30%) • Human Welfare/Safety (25%) • Privacy (29%) • Informed Consent (38%) © Batya Friedman 2003

  19. Why Do People Hold These Views? • For “all right” evaluations (on average): • Personal Interest (31%) • Functionality (31%) • Social Expectations (24%) • For “not all right” evaluations (on average): • Functionality (34%; Watcher: women 53%, men 0%) • Social Expectations (30%) • Human Welfare/Safety (25%) • Privacy (29%; Watched: women 16%, men 37%) • Informed Consent (38%; Watcher: women 61%, men 0%) © Batya Friedman 2003

  20. Summary of Key Findings (1) • More women were concerned than men. Women were more likely than men to be concerned about the HDTV display of real-time images. • Women’s concerns were less context sensitive. Men in the position of power (“The Watchers”) tended to be less concerned than men in the vulnerable position (“The Watched”). Strikingly, nearly identical percentages of women expressed concern, independent of context – “Watcher” or “Watched”. (Cf. Asch, 1952; Milgram, 1963) • Privacy in public is a multi-faceted issue. Participants expressed a range of reasons for their judgments, with a more diverse set used to support “not all right” evaluations. (Cf. Schoeman, 1984) © Batya Friedman 2003

  21. Summary of Key Findings (2) • Research Method: Indirect Stakeholders. The Value Sensitive Design methodology positioned this research to identify the concerns of indirect stakeholders (“The Watched”) and, within those, that of women. • Research Method: Ecological Validity. The results also demonstrate the need to study people’s social judgments in the context of technologies in use (as opposed to “what if” scenarios…). © Batya Friedman 2003

  22. Project 2 – Privacy in Public in a Virtual Place Cookies, Informed Consent, and Web Browsers © Batya Friedman 2003

  23. Cookies and Informed Consentin Web Browsers(Batya Friedman, Edward Felten, Lyn Millett, Daniel Howe, 2000, 2001, 2002) • Conceptual Investigation • What do we mean by informed consent online? • Technical Investigations • A retrospective study (Netscape Navigator and Internet Explorer, 1995 – 1999) • Redesign of the Mozilla browser • Empirical Investigation • Formative evaluation of redesign work • Traditional usability • Value-oriented features © Batya Friedman 2003
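The conceptual investigation above asks what informed consent means online; at minimum it requires that a site's practices be disclosed to the user and that the user affirmatively agree. A minimal Python sketch of that idea as a per-site cookie gate (all names are hypothetical illustrations; this is not the actual Mozilla redesign):

```python
# Illustrative sketch only -- models informed consent for cookies as
# requiring both disclosure and affirmative per-site agreement.
from dataclasses import dataclass, field


@dataclass
class ConsentStore:
    """Per-site record of disclosures shown and agreements given."""
    disclosed: set = field(default_factory=set)  # sites whose practices were shown
    agreed: set = field(default_factory=set)     # sites the user accepted

    def record_disclosure(self, site: str) -> None:
        self.disclosed.add(site)

    def record_agreement(self, site: str) -> None:
        # Agreement without prior disclosure is not *informed* consent.
        if site in self.disclosed:
            self.agreed.add(site)

    def may_set_cookie(self, site: str) -> bool:
        # Both conditions must hold before a cookie is accepted.
        return site in self.disclosed and site in self.agreed


store = ConsentStore()
store.record_agreement("tracker.example")        # ignored: no disclosure yet
print(store.may_set_cookie("tracker.example"))   # False

store.record_disclosure("shop.example")
store.record_agreement("shop.example")
print(store.may_set_cookie("shop.example"))      # True
```

The point of the sketch is the ordering constraint: agreement only counts when disclosure came first, which is exactly the property the browser redesign tried to surface in its interface.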

  24.–28. [Slides 24–28: screenshots from the cookie and informed-consent browser redesign work; the images are not preserved in this transcript.] © Batya Friedman 2003

  29. Summary: Value Sensitive Design’s Constellation of Features (1) • Proactive: seeks to influence the design of technology throughout the design process. • Enlarges the arena in which values arise to include not only the workplace but also education, the home, commerce, online communities, the justice system, and public life. • Integrative methodology that includes conceptual, empirical, and technical investigations. • Enlarges the scope of human values beyond cooperation, participation, and democracy to include all values, especially those with moral import. © Batya Friedman 2003

  30. Summary: Value Sensitive Design’s Constellation of Features (2) • Identifies and takes seriously both direct and indirect stakeholders • Interactional theory: values are viewed neither as inscribed in technology, nor as simply transmitted by social forces • Builds from the psychological position that certain values are universally held, but that how such values play out in particular cultures will vary widely (abstract vs. concrete or act-based conceptualizations) • Distinguishes between usability and human values with ethical import © Batya Friedman 2003

  31. Technology, Values & the Justice System: Proposition I (Friedman & Nissenbaum, 1996; Friedman, Kahn, & Borning, 2002) We can’t anticipate all the value consequences of designing and deploying a particular information technology. • Use “best practices” but don’t demand perfection • Design systems with the expectation that they will need to be adapted over time © Batya Friedman 2003

  32. Technology, Values & the Justice System: Proposition II With respect to privacy, historically the bulk of our protections have come from the difficulty and cost of accessing and manipulating information. • When we introduce a technology that enhances access to information, we can expect it to unbalance privacy checks within the social fabric. • The justice system may need to reintroduce a reasonable balance. The use of technology may assist the justice system in achieving that balance. © Batya Friedman 2003

  33. Technology, Values & the Justice System: Proposition III Informed consent can be a useful tool for creating the conditions in which a balance between privacy and access can flourish. • Consent implies the existence of an “on/off” switch © Batya Friedman 2003

  34. Technology, Values & the Justice System: Proposition IV Defaults matter. Most people don’t change the default settings on their machines. • The justice system has a good deal at stake in ensuring that technologists get the defaults “right” the first time © Batya Friedman 2003

  35. Technology, Values & the Justice System: Proposition V Opt in? Or opt out? (Tied to defaults.) © Batya Friedman 2003
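Propositions IV and V can be made concrete with a small sketch (hypothetical function and settings; illustrative numbers, not study data): if most users never touch the setting, the default decides the outcome for them, so opt-in and opt-out regimes diverge sharply for the same passive majority.

```python
# Illustrative sketch: the same user population under two default regimes.
# Assumption (Proposition IV): most people never change default settings.

def effective_sharing(population_size: int, n_active: int, default: bool) -> int:
    """Count users whose data ends up shared, assuming the n_active users
    who do touch the setting always flip it away from the default."""
    passive = population_size - n_active
    shared_passive = passive if default else 0   # passive users inherit the default
    shared_active = 0 if default else n_active   # active users flip away from it
    return shared_passive + shared_active

# Opt-out regime: sharing on by default -> 95 of 100 users' data is shared.
print(effective_sharing(100, 5, default=True))   # 95
# Opt-in regime: sharing off by default -> only the 5 active users share.
print(effective_sharing(100, 5, default=False))  # 5
```

Same population, same behavior, radically different privacy outcomes: the choice of default, not individual choice, does most of the work.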

  36. Technology, Values & the Justice System: Proposition VI Visible? Invisible? (This is about surreptitious data collection.) © Batya Friedman 2003

  37. Technology, Values & the Justice System: Proposition VII Mobile Data (In the form of ubiquitous computing, location sensing, context-aware computing, RFID tags…) © Batya Friedman 2003

  38. THE END © Batya Friedman 2003
