Ethics and Technology

Controversies, Questions, and Strategies for Ethical Computing

Herman T. Tavani
Book | Softcover
400 pages
2023 | 5th edition
John Wiley & Sons Inc (publisher)
978-1-119-23975-8 (ISBN)
101.60 incl. VAT
Ethics and Technology, 5th Edition, by Herman Tavani introduces students to the issues and controversies that comprise the relatively new field of cyberethics. The text examines a wide range of cyberethics issues, from specific questions of moral responsibility that directly affect computer and information technology (IT) professionals to broader social and ethical concerns that affect each of us in our day-to-day lives. The fifth edition shows how present-day controversies created by emerging technologies can be analyzed from the perspective of standard ethical concepts and theories.

PREFACE xvii

New to the Fifth Edition xviii

Audience and Scope xix

Organization and Structure of the Book xx

The Web Site for Ethics and Technology xxii

A Note to Students xxiii

Note to Instructors: A Roadmap for Using This Book xxiii

A Note to Computer Science Instructors xxiv

Acknowledgments xxv

FOREWORD xxvii

CHAPTER 1

Introduction to Cyberethics: Concepts, Perspectives, and Methodological Frameworks 1

Scenario 1–1: Hacking into the Mobile Phones of Celebrities 1

1.1 Defining Key Terms: Cyberethics and Cybertechnology 2

1.1.1 What Is Cybertechnology? 3

1.1.2 Why the Term Cyberethics? 3

1.2 The Cyberethics Evolution: Four Developmental Phases in Cybertechnology 4

1.3 Are Cyberethics Issues Unique Ethical Issues? 7

Scenario 1–2: Developing the Code for a Computerized Weapon System 8

Scenario 1–3: Digital Piracy 8

1.3.1 Distinguishing between Unique Technological Features and Unique Ethical Issues 9

1.3.2 An Alternative Strategy for Analyzing the Debate about the Uniqueness of Cyberethics Issues 10

1.3.3 A Policy Vacuum in Duplicating Computer Software 10

1.4 Cyberethics as a Branch of Applied Ethics: Three Distinct Perspectives 12

1.4.1 Perspective #1: Cyberethics as a Field of Professional Ethics 12

1.4.2 Perspective #2: Cyberethics as a Field of Philosophical Ethics 14

1.4.3 Perspective #3: Cyberethics as a Field of Sociological/Descriptive Ethics 16

Scenario 1–4: The Impact of Technology X on the Pleasantville Community 17

1.5 A Comprehensive Cyberethics Methodology 19

1.5.1 A “Disclosive” Method for Cyberethics 19

1.5.2 An Interdisciplinary and Multilevel Method for Analyzing Cyberethics Issues 21

1.6 A Comprehensive Strategy for Approaching Cyberethics Issues 21

1.7 Chapter Summary 22

Review Questions 23

Discussion Questions 23

Scenarios for Analysis 23

Endnotes 24

References 25

Further Readings 26

Online Resources 26

CHAPTER 2

Ethical Concepts and Ethical Theories: Frameworks for Analyzing Moral Issues 27

Scenario 2–1: The Case of the “Runaway Trolley”: A Classic Moral Dilemma 27

2.1 Ethics and Morality 29

2.1.1 What Is Morality? 29

2.1.2 The Study of Morality: Three Distinct Approaches for Evaluating and Justifying the Rules Comprising a Moral System 32

2.2 Discussion Stoppers as Roadblocks to Moral Discourse 35

2.2.1 Discussion Stopper #1: People Disagree on Solutions to Moral Issues 36

2.2.2 Discussion Stopper #2: Who Am I to Judge Others? 37

2.2.3 Discussion Stopper #3: Morality Is Simply a Private Matter 39

2.2.4 Discussion Stopper #4: Morality Is Simply a Matter for Individual Cultures to Decide 40

Scenario 2–2: The Price of Defending Moral Relativism 41

2.3 Why Do We Need Ethical Theories? 43

2.4 Consequence‐Based Ethical Theories 44

2.4.1 Act Utilitarianism 46

Scenario 2–3: A Controversial Policy in Newmerica 46

2.4.2 Rule Utilitarianism 46

2.5 Duty‐Based Ethical Theories 47

2.5.1 Rule Deontology 48

Scenario 2–4: Making an Exception for Oneself 48

2.5.2 Act Deontology 49

Scenario 2–5: A Dilemma Involving Conflicting Duties 50

2.6 Contract‐Based Ethical Theories 51

2.6.1 Some Criticisms of Contract‐Based Theories 52

2.6.2 Rights‐Based Contract Theories 53

2.7 Character‐Based Ethical Theories 54

2.7.1 Being a Moral Person vs. Following Moral Rules 54

2.7.2 Acquiring the “Correct” Habits 55

2.8 Integrating Aspects of Classical Ethical Theories into a Single Comprehensive Theory 56

2.8.1 Moor’s Just‐Consequentialist Theory and Its Application to Cybertechnology 57

2.8.2 Key Elements in Moor’s Just‐Consequentialist Framework 58

2.9 Chapter Summary 59

Review Questions 59

Discussion Questions 60

Scenarios for Analysis 60

Endnotes 61

References 61

Further Readings 62

CHAPTER 3

Critical Reasoning Skills for Evaluating Disputes in Cyberethics 63

Scenario 3–1: Reasoning About Whether to Download Software from “Sharester” 63

3.1 What Is Critical Reasoning? 64

3.1.1 Some Basic Concepts: (Logical) Arguments and Claims 64

3.1.2 The Role of Arguments 65

3.1.3 The Basic Structure of an Argument 65

3.2 Constructing an Argument 67

3.3 Valid Arguments 68

3.4 Sound Arguments 71

3.5 Invalid Arguments 73

3.6 Inductive Arguments 74

3.7 Fallacious Arguments 75

3.8 A Seven‐Step Strategy for Evaluating Arguments 77

3.9 Identifying Some Common Fallacies 79

3.9.1 Ad Hominem Argument 79

3.9.2 Slippery Slope Argument 80

3.9.3 Fallacy of Appeal to Authority 80

3.9.4 False Cause Fallacy 81

3.9.5 Fallacy of Composition/Fallacy of Division 81

3.9.6 Fallacy of Ambiguity/Equivocation 82

3.9.7 The False Dichotomy/Either–Or Fallacy/All‐or‐Nothing Fallacy 82

3.9.8 The Virtuality Fallacy 83

3.10 Chapter Summary 84

Review Questions 84

Discussion Questions 85

Scenarios for Analysis 85

Endnotes 85

References 86

Further Readings 86

CHAPTER 4

Professional Ethics, Codes of Conduct, and Moral Responsibility 87

Scenario 4–1: Fatalities Involving the Oerlikon GDF‐005 Robotic Cannon 87

4.1 What Is Professional Ethics? 88

4.1.1 What Is a Profession? 89

4.1.2 Who Is a Professional? 89

4.1.3 Who Is a Computer/IT Professional? 90

4.2 Do Computer/IT Professionals Have Any Special Moral Responsibilities? 90

4.3 Professional Codes of Ethics and Codes of Conduct 91

4.3.1 The Purpose of Professional Codes 92

4.3.2 Some Criticisms of Professional Codes 93

4.3.3 Defending Professional Codes 94

4.3.4 The IEEE‐CS/ACM Software Engineering Code of Ethics and Professional Practice 95

4.4 Conflicts of Professional Responsibility: Employee Loyalty and Whistle‐Blowing 97

4.4.1 Do Employees Have an Obligation of Loyalty to Employers? 97

4.4.2 Whistle‐Blowing 98

Scenario 4–2: NSA Surveillance and the Case of Edward Snowden 101

4.5 Moral Responsibility, Legal Liability, and Accountability 103

4.5.1 Distinguishing Responsibility from Liability and Accountability 104

4.5.2 Accountability and the Problem of “Many Hands” 105

Scenario 4–3: The Case of the Therac‐25 Machine 105

4.5.3 Legal Liability and Moral Accountability 106

4.6 Do Some Computer Corporations Have Special Moral Obligations? 107

4.7 Chapter Summary 108

Review Questions 109

Discussion Questions 109

Scenarios for Analysis 110

Endnotes 110

References 111

Further Readings 112

CHAPTER 5

Privacy and Cyberspace 113

Scenario 5–1: A New NSA Data Center 113

5.1 Privacy in the Digital Age: Who Is Affected and Why Should We Worry? 114

5.1.1 Whose Privacy Is Threatened by Cybertechnology? 115

5.1.2 Are Any Privacy Concerns Generated by Cybertechnology Unique or Special? 115

5.2 What Is Personal Privacy? 117

5.2.1 Accessibility Privacy: Freedom from Unwarranted Intrusion 118

5.2.2 Decisional Privacy: Freedom from Interference in One’s Personal Affairs 118

5.2.3 Informational Privacy: Control over the Flow of Personal Information 118

5.2.4 A Comprehensive Account of Privacy 119

Scenario 5–2: Descriptive Privacy 119

Scenario 5–3: Normative Privacy 120

5.2.5 Privacy as “Contextual Integrity” 120

Scenario 5–4: Preserving Contextual Integrity in a University Seminar 121

5.3 Why Is Privacy Important? 121

5.3.1 Is Privacy an Intrinsic Value? 122

5.3.2 Privacy as a Social Value 123

5.4 Gathering Personal Data: Surveillance, Recording, and Tracking Techniques 123

5.4.1 “Dataveillance” Techniques 124

5.4.2 Internet Cookies 124

5.4.3 RFID Technology 125

5.4.4 Cybertechnology and Government Surveillance 126

5.5 Analyzing Personal Data: Big Data, Data Mining, and Web Mining 127

5.5.1 Big Data: What, Exactly, Is It, and Why Does It Threaten Privacy? 128

5.5.2 Data Mining and Personal Privacy 128

Scenario 5–5: Data Mining at the XYZ Credit Union 129

5.5.3 Web Mining: Analyzing Personal Data Acquired from Our Interactions Online 132

5.6 Protecting Personal Privacy in Public Space 132

5.6.1 PPI vs. NPI 133

Scenario 5–6: Shopping at SuperMart 133

Scenario 5–7: Shopping at Nile.com 134

5.6.2 Search Engines and the Disclosure of Personal Information 135

5.7 Privacy Legislation and Industry Self‐Regulation 137

5.7.1 Industry Self‐Regulation and Privacy‐Enhancing Tools 137

5.7.2 Privacy Laws and Data Protection Principles 139

5.8 A Right to “Be Forgotten” (or to “Erasure”) in the Digital Age 140

Scenario 5–8: An Arrest for an Underage Drinking Incident 20 Years Ago 141

5.8.1 Arguments Opposing RTBF 142

5.8.2 Arguments Defending RTBF 143

5.8.3 Establishing “Appropriate” Criteria 144

5.9 Chapter Summary 146

Review Questions 146

Discussion Questions 147

Scenarios for Analysis 148

Endnotes 148

References 149

Further Readings 150

CHAPTER 6

Security in Cyberspace 151

Scenario 6–1: The “Olympic Games” Operation and the Stuxnet Worm 151

6.1 Security in the Context of Cybertechnology 152

6.1.1 Cybersecurity as Related to Cybercrime 153

6.1.2 Security and Privacy: Some Similarities and Some Differences 153

6.2 Three Categories of Cybersecurity 154

6.2.1 Data Security: Confidentiality, Integrity, and Availability of Information 155

6.2.2 System Security: Viruses, Worms, and Malware 156

6.2.3 Network Security: Protecting Our Infrastructure 156

Scenario 6–2: The “GhostNet” Controversy 157

6.3 Cloud Computing and Security 158

6.3.1 Deployment and Service/Delivery Models for the Cloud 158

6.3.2 Securing User Data Residing in the Cloud 159

6.3.3 Assessing Risk in the Cloud and in the Context of Cybersecurity 160

6.4 Hacking and “The Hacker Ethic” 160

6.4.1 What Is “The Hacker Ethic”? 161

6.4.2 Are Computer Break‐ins Ever Ethically Justifiable? 163

6.5 Cyberterrorism 164

6.5.1 Cyberterrorism vs. Hacktivism 165

Scenario 6–3: Anonymous and the “Operation Payback” Attack 166

6.5.2 Cybertechnology and Terrorist Organizations 167

6.6 Information Warfare (IW) 167

6.6.1 Information Warfare vs. Conventional Warfare 167

6.6.2 Potential Consequences for Nations that Engage in IW 168

6.7 Chapter Summary 170

Review Questions 170

Discussion Questions 171

Scenarios for Analysis 171

Endnotes 171

References 172

Further Readings 174

CHAPTER 7

Cybercrime and Cyber‐Related Crimes 175

Scenario 7–1: Creating a Fake Facebook Account to Catch Criminals 175

7.1 Cybercrimes and Cybercriminals 177

7.1.1 Background Events: A Brief Sketch 177

7.1.2 A Typical Cybercriminal 178

7.2 Hacking, Cracking, and Counter Hacking 178

7.2.1 Hacking vs. Cracking 179

7.2.2 Active Defense Hacking: Can Acts of “Hacking Back” or Counter Hacking Ever Be Morally Justified? 179

7.3 Defining Cybercrime 180

7.3.1 Determining the Criteria 181

7.3.2 A Preliminary Definition of Cybercrime 181

7.3.3 Framing a Coherent and Comprehensive Definition of Cybercrime 182

7.4 Three Categories of Cybercrime: Piracy, Trespass, and Vandalism in Cyberspace 183

7.5 Cyber‐Related Crimes 184

7.5.1 Some Examples of Cyber‐Exacerbated vs. Cyber‐Assisted Crimes 184

7.5.2 Identity Theft 185

7.6 Technologies and Tools for Combating Cybercrime 187

7.6.1 Biometric Technologies 187

7.6.2 Keystroke‐Monitoring Software and Packet‐Sniffing Programs 188

7.7 Programs and Techniques Designed to Combat Cybercrime in the United States 189

7.7.1 Entrapment and “Sting” Operations to Catch Internet Pedophiles 189

Scenario 7–2: Entrapment on the Internet 189

7.7.2 Enhanced Government Surveillance Techniques and the Patriot Act 189

7.8 National and International Laws to Combat Cybercrime 190

7.8.1 The Problem of Jurisdiction in Cyberspace 190

Scenario 7–3: A Virtual Casino 191

Scenario 7–4: Prosecuting a Computer Corporation in Multiple Countries 192

7.8.2 Some International Laws and Conventions Affecting Cybercrime 192

Scenario 7–5: The Pirate Bay Web Site 193

7.9 Cybercrime and the Free Press: The WikiLeaks Controversy 193

7.9.1 Are WikiLeaks’ Practices Ethical? 194

7.9.2 Are WikiLeaks’ Practices Criminal? 194

7.9.3 WikiLeaks and the Free Press 195

7.10 Chapter Summary 196

Review Questions 197

Discussion Questions 197

Scenarios for Analysis 198

Endnotes 199

References 199

Further Readings 200

CHAPTER 8

Intellectual Property Disputes in Cyberspace 201

Scenario 8–1: Streaming Music Online 201

8.1 What Is Intellectual Property? 202

8.1.1 Intellectual Objects 203

8.1.2 Why Protect Intellectual Objects? 203

8.1.3 Software as Intellectual Property 204

8.1.4 Evaluating a Popular Argument Used by the Software Industry to Show Why It Is Morally Wrong to Copy Proprietary Software 205

8.2 Copyright Law and Digital Media 206

8.2.1 The Evolution of Copyright Law in the United States 206

8.2.2 The Fair‐Use and First‐Sale Provisions of Copyright Law 207

8.2.3 Software Piracy as Copyright Infringement 208

8.2.4 Napster and the Ongoing Battles over Sharing Digital Music 209

8.3 Patents, Trademarks, and Trade Secrets 212

8.3.1 Patent Protections 212

8.3.2 Trademarks 213

8.3.3 Trade Secrets 214

8.4 Jurisdictional Issues Involving Intellectual Property Laws 214

8.5 Philosophical Foundations for Intellectual Property Rights 215

8.5.1 The Labor Theory of Property 215

Scenario 8–2: DEF Corporation vs. XYZ Inc. 216

8.5.2 The Utilitarian Theory of Property 216

Scenario 8–3: Sam’s e‐Book Reader Add‐on Device 217

8.5.3 The Personality Theory of Property 217

Scenario 8–4: Angela’s B++ Programming Tool 218

8.6 The “Free Software” and “Open Source” Movements 219

8.6.1 GNU and the Free Software Foundation 219

8.6.2 The “Open Source Software” Movement: OSS vs. FSF 220

8.7 The “Common Good” Approach: An Alternative Framework for Analyzing the Intellectual Property Debate 221

8.7.1 Information Wants to be Shared vs. Information Wants to be Free 223

8.7.2 Preserving the Information Commons 225

8.7.3 The Fate of the Information Commons: Could the Public Domain of Ideas Eventually Disappear? 226

8.7.4 The Creative Commons 227

8.8 PIPA, SOPA, and RWA Legislation: Current Battlegrounds in the Intellectual Property War 228

8.8.1 The PIPA and SOPA Battles 228

8.8.2 RWA and Public Access to Health‐Related Information 229

Scenario 8–5: Elsevier Press and “The Cost of Knowledge” Boycott 229

8.8.3 Intellectual Property Battles in the Near Future 231

8.9 Chapter Summary 231

Review Questions 231

Discussion Questions 232

Scenarios for Analysis 232

Endnotes 233

References 234

Further Readings 235

CHAPTER 9

Regulating Commerce and Speech in Cyberspace 236

Scenario 9–1: Anonymous and the Ku Klux Klan 236

9.1 Introduction and Background Issues: Some Key Questions and Critical Distinctions Affecting Internet Regulation 237

9.1.1 Is Cyberspace a Medium or a Place? 238

9.1.2 Two Categories of Cyberspace Regulation: Regulating Content and Regulating Process 239

9.1.3 Four Modes of Regulation: The Lessig Model 240

9.2 Digital Rights Management (DRM) 242

9.2.1 Some Implications of DRM for Public Policy Debates Affecting Copyright Law 242

9.2.2 DRM and the Music Industry 243

Scenario 9–2: The Sony Rootkit Controversy 243

9.3 E‐Mail Spam 244

9.3.1 Defining Spam 244

9.3.2 Why Is Spam Morally Objectionable? 245

9.4 Free Speech vs. Censorship and Content Control in Cyberspace 246

9.4.1 Protecting Free Speech 247

9.4.2 Defining Censorship 247

9.5 Pornography in Cyberspace 248

9.5.1 Interpreting “Community Standards” in Cyberspace 248

9.5.2 Internet Pornography Laws and Protecting Children Online 249

9.5.3 Virtual Child Pornography 250

9.5.4 Sexting and Its Implications for Current Child Pornography Laws 252

Scenario 9–3: A Sexting Incident Involving Greensburg Salem High School 252

9.6 Hate Speech and Speech that Can Cause Physical Harm to Others 254

9.6.1 Hate Speech on the Web 254

9.6.2 Online “Speech” that Can Cause Physical Harm to Others 255

9.7 “Network Neutrality” and the Future of Internet Regulation 256

9.7.1 Defining Network Neutrality 256

9.7.2 Some Arguments Advanced by Net Neutrality’s Proponents and Opponents 257

9.7.3 Future Implications for the Net Neutrality Debate 257

9.8 Chapter Summary 258

Review Questions 259

Discussion Questions 259

Scenarios for Analysis 260

Endnotes 260

References 261

Further Readings 262

CHAPTER 10

The Digital Divide, Democracy, and Work 263

Scenario 10–1: Digital Devices, Social Media, Democracy, and the “Arab Spring” 264

10.1 The Digital Divide 265

10.1.1 The Global Digital Divide 265

10.1.2 The Digital Divide within Nations 266

Scenario 10–2: Providing In‐Home Internet Service for Public School Students 267

10.1.3 Is the Digital Divide an Ethical Issue? 268

10.2 Cybertechnology and the Disabled 270

10.3 Cybertechnology and Race 271

10.3.1 Internet Usage Patterns 272

10.3.2 Racism and the Internet 272

10.4 Cybertechnology and Gender 273

10.4.1 Access to High‐Technology Jobs 274

10.4.2 Gender Bias in Software Design and Video Games 275

10.5 Cybertechnology, Democracy, and Democratic Ideals 276

10.5.1 Has Cybertechnology Enhanced or Threatened Democracy? 276

10.5.2 How Has Cybertechnology Affected Political Elections in Democratic Nations? 279

10.6 The Transformation and the Quality of Work 280

10.6.1 Job Displacement and the Transformed Workplace 281

10.6.2 The Quality of Work Life in the Digital Era 283

Scenario 10–3: Employee Monitoring and the Case of Ontario vs. Quon 284

10.7 Chapter Summary 287

Review Questions 287

Discussion Questions 288

Scenarios for Analysis 288

Endnotes 289

References 289

Further Readings 291

CHAPTER 11

Online Communities, Virtual Reality, and Artificial Intelligence 292

Scenario 11–1: Ralph’s Online Friends and Artificial Companions 292

11.1 Online Communities and Social Networking Services 293

11.1.1 Online Communities vs. Traditional Communities 294

11.1.2 Blogs and Some Controversial Aspects of the Blogosphere 295

Scenario 11–2: “The Washingtonienne” Blogger 295

11.1.3 Some Pros and Cons of SNSs (and Other Online Communities) 296

Scenario 11–3: A Suicide Resulting from Deception on MySpace 298

11.2 Virtual Environments and Virtual Reality 299

11.2.1 What Is Virtual Reality (VR)? 300

11.2.2 Ethical Aspects of VR Applications 301

11.3 Artificial Intelligence (AI) 305

11.3.1 What Is AI? A Brief Overview 305

11.3.2 The Turing Test and John Searle’s “Chinese Room” Argument 306

11.3.3 Cyborgs and Human–Machine Relationships 307

11.4 Extending Moral Consideration to AI Entities 310

Scenario 11–4: Artificial Children 310

11.4.1 Determining Which Kinds of Beings/Entities Deserve Moral Consideration 310

11.4.2 Moral Patients vs. Moral Agents 311

11.5 Chapter Summary 312

Review Questions 313

Discussion Questions 313

Scenarios for Analysis 313

Endnotes 314

References 315

Further Readings 316

CHAPTER 12

Ethical Aspects of Emerging and Converging Technologies 317

Scenario 12–1: When “Things” Communicate with One Another 317

12.1 Converging Technologies and Technological Convergence 318

12.2 Ambient Intelligence (AmI) and Ubiquitous Computing 319

12.2.1 Pervasive Computing, Ubiquitous Communication, and Intelligent User Interfaces 320

12.2.2 Ethical and Social Aspects of AmI 321

Scenario 12–2: E. M. Forster’s “(Pre)Cautionary Tale” 322

Scenario 12–3: Jeremy Bentham’s “Panopticon/Inspection House” (Thought Experiment) 323

12.3 Nanotechnology and Nanocomputing 324

12.3.1 Nanotechnology: A Brief Overview 324

12.3.2 Ethical Issues in Nanotechnology and Nanocomputing 326

12.4 Autonomous Machines 329

12.4.1 What Is an AM? 329

12.4.2 Some Ethical and Philosophical Questions Pertaining to AMs 332

12.5 Machine Ethics and Moral Machines 336

12.5.1 What Is Machine Ethics? 336

12.5.2 Designing Moral Machines 337

12.6 A “Dynamic” Ethical Framework for Guiding Research in New and Emerging Technologies 340

12.6.1 Is an ELSI‐Like Model Adequate for New/Emerging Technologies? 340

12.6.2 A “Dynamic Ethics” Model 341

12.7 Chapter Summary 341

Review Questions 342

Discussion Questions 342

Scenarios for Analysis 343

Endnotes 343

References 344

Further Readings 346

GLOSSARY 347

INDEX 353

Publication date: 2023
Place of publication: New York
Language: English
Dimensions: 201 x 252 mm
Weight: 658 g
ISBN-10: 1-119-23975-3 / 1119239753
ISBN-13: 978-1-119-23975-8 / 9781119239758
Condition: New