
Decision Making Manual: A Toolkit for Making Moral Decisions


Presentation Transcript


  1. Decision Making Manual: A Toolkit for Making Moral Decisions William J. Frey (UPRM) José A. Cruz-Cruz (UPRM) Chuck Huff (St. Olaf)

  2. There is an analogy between design problems and ethical problems

  3. Decision-Making in Business • Rational Choice Method: Textbook (Lawrence and Weber) • Issue Management Process (32) • Identify Issue • Analyze Issue • Generate Options • Take Action • Evaluate Results • This method evaluates and ranks options that are already given • Like a multiple-choice test: choose the best answer from the given options

  4. Problem-solving in computing can be modeled on software design • The software development cycle can be presented in terms of four stages: • Problem Specification • Solution Generation • Solution Testing • Solution Implementation • Generate or create options that embody or realize ethical value or worth • We don’t find them, we make them
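  To make the analogy concrete for software-oriented readers, here is a minimal Python sketch of the four-stage cycle applied to an ethics case. The stage names come from the slide; the case details (a simplified bankruptcy-and-customer-data scenario like the Toysmart case discussed later) and the function bodies are illustrative assumptions, not part of the toolkit itself.

```python
# Hypothetical walk-through of the four-stage cycle; bodies are placeholders.

def specify_problem(case):
    # Stage 1: describe the socio-technical system and classify the problem.
    case["problem"] = "value conflict between customer privacy and creditor property claims"

def generate_solutions(case):
    # Stage 2: brainstorm and refine solutions; create options rather than pick from a menu.
    case["candidates"] = ["negotiate with creditors", "destroy the customer list", "sell the data"]

def test_solutions(case):
    # Stage 3: apply the reversibility, harm/benefit, publicity, and code-of-ethics tests.
    case["survivors"] = [s for s in case["candidates"] if s != "sell the data"]

def implement_solution(case):
    # Stage 4: check feasibility constraints and carry out the strongest surviving option.
    case["decision"] = case["survivors"][0]

case = {"description": "Toysmart-style bankruptcy and its customer data"}
for stage in (specify_problem, generate_solutions, test_solutions, implement_solution):
    stage(case)
print(case["decision"])
```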

  5. The Difference between Choice and Problem-Solving • In choice, one selects among existing options by applying frameworks such as the ethical frameworks below (Text 86) • Virtues: An action is ethical when it aligns with good character • Utilitarian: An action is ethical when net benefits exceed net costs • Rights: An action is ethical when basic human rights are respected • Justice: An action is ethical when benefits and costs are fairly distributed

  6. Problem Solving • We do not find a solution but create one • We do not evaluate existing choices in terms of standards • Instead we use the standards to guide the imagination in brainstorming and designing solutions that respond concretely to the situation in question

  7. Problem Solving: Specifying the Problem

  8. Prepare a Socio-Technical System (STS) table • “an intellectual tool to help us recognize patterns in the way technology is used and produced” • Components: Hardware, Software, Physical Surroundings, Stakeholders (people, groups, & roles), Procedures, Laws (Criminal Law, Civil Law, Statutes & Regulations), Information Systems (collecting, storing, transferring) • Other Components: Financial Markets, Rate Structure (Power Systems), Environment, Technological Context, Supply Chain • An STS is a system: its components are related and interact • STSs embody values • Moral: Justice, Respect, Responsibility, Trust, Integrity • Non-Moral: Financial, Efficiency, Sustainability • STSs exhibit trajectories, i.e., coordinated paths of change
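  One hedged way to keep an STS table machine-readable is a simple dictionary keyed by component. The component names follow this slide; the sample entries come from the classroom 236 example used in the slides that follow.

```python
# A minimal sketch of an STS table as a Python dictionary.
sts_room_236 = {
    "Hardware": ["computer", "data display projector", "smart board"],
    "Software": ["PowerPoint", "Word", "smart board software"],
    "Physical Surroundings": ["fixed tables", "separate presenting and listening areas"],
    "Stakeholders": ["teachers", "administrators", "technical staff", "students"],
    "Procedures": ["setting up the class", "lecturing", "discussing", "taking exams"],
    "Laws & Regulations": ["attendance required by the Reglamento", "required office hours"],
    "Information Systems": ["online grade recording and storage"],
}

# Values (moral and non-moral) can then be located under the components
# where they are embodied or at risk.
values_at_stake = {
    "Information Systems": {"privacy": "online grades need encrypted storage"},
    "Physical Surroundings": {"security": "door locks control access to the room"},
}
```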

  9. 1. Identify key components of the STS

  10. Steps to an STS analysis • 1. Describe the software and hardware components of the socio-technical system. • 2. Describe the physical environment(s) in the socio-technical system. • 3. Who are the key stakeholders in the STS? What are their stakes? What are their roles? • 4. Outline important procedures in the STS. Describe them step-by-step. • 5. Describe the legal environment of the STS, including general branches of the law, key laws, important regulations, and statutes. • 6. How is information collected, stored, and disseminated in the STS?

  11. More Steps • Are there any value conflicts in the STS? • What are the values? • Where are they located? (Under which components?) • Moral versus moral? Moral versus non-moral? Non-moral versus non-moral? • Are there any value vulnerabilities in the STS? • What values are under threat? • Where are they located? (Under which components?) • Are there any harms latent or potential in the STS? • What is their magnitude? (Catastrophic or trivial?) • What is their likelihood or probability? • What is the trajectory or path of change in the STS? • Is it positive or negative?
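  The value-vulnerability and harm questions can also be recorded in a small structure so that harms can be compared by magnitude and likelihood. The field names, the 1-5 magnitude scale, and the magnitude-times-probability ranking below are illustrative assumptions; the slides do not prescribe a scoring scheme.

```python
# A hedged sketch of recording a latent harm identified in the "More Steps" questions.
harm = {
    "description": "unencrypted online grade records are exposed",
    "component": "Information Systems",
    "magnitude": 3,      # 1 = trivial ... 5 = catastrophic
    "probability": 0.2,  # estimated likelihood
}

expected_harm = harm["magnitude"] * harm["probability"]  # one rough way to rank harms
print(expected_harm)
```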

  12. Technology: Hardware and Software • Technology includes hardware, software, designs, prototypes, products, or services. • Hardware in classroom 236 includes the computer, data display projector, and smart board. • Software includes PowerPoint and Word (Microsoft features) as well as the software that runs the smart board. • Technologies enable and constrain action • PowerPoint allows for picturing ideas • But it tempts us to read off the slides instead of talking with our audience • 1. Describe the software and hardware components of the socio-technical system

  13. Physical Surroundings • Describe the physical environment in a given STS • Geography, weather, landscape, flora, fauna, cities, forests, etc. • Physical surroundings embed values. • Fixed tables in 236 constrain group work while enabling listening. • 236 clearly separates presenting (active) area from listening (passive) area. • Doors, by their weight, strength, material, size, and attachments (such as locks) can promote values such as security. • Physical surroundings promote, maintain, or diminish other values in that they can permit or deny access, facilitate or hinder speech, promote privacy or transparency, isolate or disseminate property, and promote equality or privilege. • Think about how the mountains in the center of Puerto Rico have shaped the community life of the pueblos located there • 2. Describe the physical environment(s) in the socio-technical system

  14. People, Groups, and Roles • These are often characterized as “constituyentes” (constituents) or stakeholders • Any group or person with an essential interest at risk (at stake) in the situation at hand • Teachers, administrators, technical staff, and, of course, students are stakeholders in room 236 • What are their interests? • What are their roles? • How do these interest and role sets interact? • 3. Who are the key stakeholders in the STS? What are their stakes? What are their roles?

  15. Procedures • Procedures relevant to room 236 • Setting up the class (turning on the equipment, setting the lights, positioning the podium, etc.) • Lecturing, discussing, taking exams, and reading are all pedagogical procedures that take place in room 236 • Procedures can promote values or they can expose value vulnerabilities • Recording grades online may place privacy in jeopardy if the storage location is not protected by encryption • A grievance procedure may undermine due process rights if there are unreasonable time delays between steps in the procedure • 4. Outline important procedures in the STS. Describe them step-by-step.

  16. Laws, Statutes, and Regulations • Class attendance is required by the University’s Reglamento; faculty are required to hold a certain number of office hours per credit hour of teaching • Businesses (such as restaurants) are subject to safety regulations (OSHA), food preparation regulations (FDA), and legal regulations governing employee treatment (overtime requirements, minimum wage, maximum hours of work per week, minimum age requirements) • How these differ from one nation to another creates many problems for multinational corporations • Corporations are considered legal persons and have certain legal rights, such as free commercial and non-commercial speech • Bankruptcy laws and legally mandated procedures • Legal requirement to form a bankruptcy committee composed of representatives of creditors • 5. Describe the legal environment of the STS, including general branches of the law, key laws, important regulations, and statutes.

  17. Information Gathering, Storage, and Dissemination • A key issue in businesses that rely on computer technology • How does a company gather information on market conditions, suppliers, customers, and competitors? • How does it store this information? (In file cabinets behind locked doors?) • How does it transfer this information, to whom, and on what occasions? • Toysmart guaranteed that it would not transfer customer financial information to third parties; this promise aided it in collecting the information. It also promised secure (e.g., encrypted) methods for storing it • 6. How is information collected, stored, and disseminated in the STS?

  18. STS and Problem Specification • Having a good STS description, highlighting its key values, and locating these values in components of the STS sets the stage for problem specification • Identifying value conflicts, potential or latent harms, and negative trajectories in the STS documents and validates the problem specification • In many cases, specifying the problem in reference to the STS also outlines potential solutions • A security vulnerability can be addressed by holistically adjusting elements of the STS: strengthening physical barriers, securing data, refining procedures, and lobbying for new laws could all represent adjustments that strengthen the security of an STS

  19. Classify the problem: • Disagreement on Facts • Did the supervisor sexually harass the employee? (What happened—there are two different versions) • Disagreement on Concepts • Has the supervisor created a hostile environment? (Meaning of hostile environment?) • Conflicts • Conflict between moral values (Toysmart either honors property claims of creditors or privacy rights of customers) • Conflicts between moral and non-moral values (In order to get the chips to clients on time, LaRue has told the quality control team to skip environmental tests and falsify results) • A key value becomes vulnerable • Online activity has magnified the potential harms of cyberslander against companies like Biomatrix • Immediate, Midterm, or Remote Harms • Is it the case that Therac-25 patients are receiving radiation overdoses?

  20. Table summarizing problem classification (With Generic Solutions)

  21. Problem Solving: Solution Generation

  22. Solution Generation • Don’t fall into the dilemma trap • The assumption that all ethical problems in business offer only two solution forms: do the right thing financially or do the right thing ethically • Brainstorm • Do exercises to unlock creative thought • Start with an individual list • Share your list with others while suspending criticism • Once you have a preliminary list (set a quota), refine it • Eliminate solutions that are impractical • Combine solutions (one is part of another; one is plan A, the other plan B) • Test solutions globally and quickly to trim them down to a manageable list

  23. Use more than one frame when generating solutions • How would an engineer specify the problem? • How would a lawyer specify the problem? • How would a manager characterize the problem? • How would a politician specify the problem? • How would a financial expert or economist specify the problem? • Try to integrate these different framings.

  24. Refined Solution List

  25. Generic Solutions (for every occasion) • Gather more information • Nolo contendere • Be diplomatic: negotiate with the different parties and look for a “win-win” solution • Oppose: stand up to authority, organize opposition, document and publicize the wrong • Exit (get a transfer, look for another job, live to fight another day) • Organize these as plans A, B, C, etc. (try one, then another if the first doesn’t work)
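  A minimal sketch of organizing the generic solutions as ordered fallback plans (plan A, then plan B, and so on). The ordering and the helper function are illustrative assumptions, not a recommendation from the slides.

```python
# Generic solutions arranged as an ordered list of fallback plans.
fallback_plans = [
    "gather more information",
    "negotiate a win-win solution with the parties",
    "oppose: document and publicize the wrong",
    "exit: get a transfer or look for another job",
]

def next_plan(plans, failed):
    """Return the first plan that has not already failed (plan A, then B, ...)."""
    for plan in plans:
        if plan not in failed:
            return plan
    return None

print(next_plan(fallback_plans, failed={"gather more information"}))
```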

  26. Solution Testing: Reversibility, Harm/Benefits, Publicity

  27. Test Solutions • Develop a solution evaluation matrix • Test the ethical implications of each solution • Check whether the solution violates your profession’s or company’s code of ethics • Carry out a global feasibility assessment of the solution • What are the situational constraints? • Will these constraints block implementation?

  28. Solution Evaluation Matrix

  29. Reversibility • Does the action still look good when viewed from the standpoint of key stakeholders? • Agent projects into standpoint of those targeted by the action and views it through their eyes • Avoid extremes of too little and too much identification with stakeholder (go beyond your egocentric standpoint but don’t become lost in the perspective of the other)

  30. Harm / Benefits • What are the likely harms and benefits that will follow from the action under consideration? • What is their magnitude and range? • How are they distributed? • Which alternative produces the most benefits coupled with the least harms? • Avoid too much (trying to factor in all consequences) and too little (leaving out significant consequences)

  31. Publicity Test • What are the values embedded in the action you are considering? • Is it responsible or irresponsible? Just or unjust? Respectful or disrespectful? • Would you want to be publicly associated with this action, given the values it embodies? • People would come to view you as responsible, just, and respectful, or as irresponsible, unjust (biased), and disrespectful

  32. Code of Ethics Test • How does the action accord with your profession’s or company’s code of ethics? • How does the action accord with the key values professed by your company or profession?
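  Pulling the four ethics tests and the global feasibility check together, the solution evaluation matrix from slide 28 can be sketched as scores per solution per test. The -2 to +2 scale, the sample solutions (loosely based on the Toysmart case mentioned earlier), and the simple summing rule are assumptions; the slides leave the scoring method open.

```python
# A hedged sketch of a solution evaluation matrix: rows are candidate solutions,
# columns are the tests from the preceding slides plus global feasibility.
matrix = {
    "negotiate with creditors over the customer list": {
        "reversibility": 2, "harm/benefit": 1, "publicity": 2,
        "code of ethics": 2, "feasibility": 1,
    },
    "sell customer data to a third party": {
        "reversibility": -2, "harm/benefit": -1, "publicity": -2,
        "code of ethics": -2, "feasibility": 2,
    },
}

def rank(matrix):
    """Order solutions by total score across all tests (one simple comparison rule)."""
    return sorted(matrix, key=lambda s: sum(matrix[s].values()), reverse=True)

print(rank(matrix))
```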

  33. Solution Implementation: Will it work given the background constraints?

  34. A Feasibility Test—Will it Work? • Restate your global feasibility analysis • Are there resource constraints? • Are these fixed or negotiable? • Are there technical or manufacturing constraints? • Are these fixed or negotiable? • Are there interest constraints? • Are these fixed or negotiable?

  35. Feasibility Matrix
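  A feasibility matrix along the lines of the previous slide can be sketched as a table of constraint types, each marked present or absent and fixed or negotiable. The entries and field names here are illustrative assumptions.

```python
# A minimal sketch of a feasibility matrix: constraint types with their status.
feasibility = {
    "resource constraints": {"present": True, "fixed": False,
                             "note": "budget could be renegotiated next quarter"},
    "technical/manufacturing constraints": {"present": True, "fixed": True,
                                            "note": "current line cannot be retooled in time"},
    "interest constraints": {"present": False, "fixed": None,
                             "note": "key stakeholders support the solution"},
}

# Constraints that are present and non-negotiable may block implementation.
blocking = [name for name, c in feasibility.items() if c["present"] and c["fixed"]]
print(blocking)
```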

  36. What if there are major constraints? • Try out what Weston calls the “intermediate impossible” (Practical Companion, 38) • Take your ethically, financially, and technically ideal solution • Test its feasibility. If it is lacking… • Modify it as little as possible until it becomes feasible. Then implement the “intermediate impossible.”

  37. Final Considerations • Has your problem shifted? • Check over your refined solution list and your final solution. Sometimes the process moves from one problem to another. If so, re-specify your problem given what you have learned. • Have you opened all possible doors to solving your problem? • Use multiple framings. Resist the dilemma trap.

  38. Some Readings • Anthony Weston. (2002). A Practical Companion to Ethics, 2nd ed. Oxford, UK: Oxford University Press. • Weston has several excellent suggestions for brainstorming solutions to ethical problems. He also discusses how to avoid the dilemma trap. • Good Computing (book under development with Jones and Bartlett). Huff, Frey, and Cruz. • The manuscript describes the four-stage software development cycle that is used as a model here for problem-solving. • Caroline Whitbeck. (1998). Ethics in Engineering Practice and Research. Cambridge, UK: Cambridge University Press. • Whitbeck provides an illuminating discussion of the analogy between ethics and design problems.
