Computers: Individual and Social Responsibility
Page 1:

Computers: Individual and Social Responsibility

Page 2:

I. Introduction

• Computer ethics raises few, if any, fundamentally different types of ethical issues, but it does raise several issues in a new and urgent form.

– Privacy

– Ownership of computer programs

– Computer crimes

– Fractured responsibility (when many people are involved)

Page 3:

The issues also call on most of the problem-solving methods we have used

• Terminology

– Cookie - small computer file sent to your computer without your knowledge or consent.

– Algorithm - set of steps for accomplishing a task.

– Source Program - algorithm expressed in a language such as BASIC or FORTRAN.

– Object Program - a machine-language program created by the computer from a source program. The object program sets the machine's switches, enabling the computer to perform the underlying algorithm and accomplish the task (illustrated in the sketch below).
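To make the last three terms concrete, here is a minimal sketch in Python (standing in for BASIC or FORTRAN; the function and its values are invented for illustration). The function body expresses the algorithm, its text is the source program, and the bytecode the interpreter compiles it into plays the role of the object program.

```python
# A simple algorithm: compute the average of a list of numbers.
# Written out in Python, this text is the *source program*.

def average(numbers):
    """The algorithm: accumulate a sum, then divide by the count."""
    total = 0
    for n in numbers:            # step 1: accumulate the sum
        total += n
    return total / len(numbers)  # step 2: divide by the count

print(average([2, 4, 6]))  # 4.0

# The interpreter compiles the source into bytecode -- the rough
# analogue of the *object program* the machine actually executes.
import dis
dis.dis(average)
```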

Page 4:

II. Computers and Privacy: A Conflict of Values

• Computer matching can violate our informational privacy (control of information about ourselves) by serving as the means for constructing databases about our income, purchasing habits, religious and political affiliations, maybe even sexual preferences.

• This is often done through “matching,” in which apparently unrelated information from several sources is put in a single data bank. For example, your credit and employment records, criminal or traffic violations, and other records can be combined into a composite picture.

• Defenders of matching say the merged files do not contain new information, but this is a case where the whole is greater than the sum of its parts.
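A minimal sketch of how matching works, with made-up record fragments keyed on a shared identifier (every name, field, and value here is invented): individually innocuous files combine into a composite profile that no single source held.

```python
# Hypothetical record fragments from three apparently unrelated sources,
# each keyed on the same identifier (e.g., a Social Security number).
credit_records = {"123-45-6789": {"income": 52000, "card_debt": 8400}}
dmv_records    = {"123-45-6789": {"violations": ["DUI at age 20"]}}
retail_records = {"123-45-6789": {"purchases": ["political pamphlets"]}}

def match(*sources):
    """Merge every source's fields into one composite profile per key."""
    composite = {}
    for source in sources:
        for key, fields in source.items():
            composite.setdefault(key, {}).update(fields)
    return composite

profiles = match(credit_records, dmv_records, retail_records)
print(profiles["123-45-6789"])
# The merged profile paints a picture no single file contained -- the
# sense in which the whole is greater than the sum of its parts.
```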

Page 5:

Privacy Versus Social Utility

• James Rachels argues that informational privacy is necessary in order to control the intimacy of our relationships with others (a respect-for-persons, or RP, argument)

• Computers can give relative strangers the kind of information we would want only our closest friends to have (ticket for drunk driving when 20, sexual orientation, etc.)

Page 6:

• From a rights standpoint, our right to be presumed innocent until proven guilty should be respected. Some think computer matching violates this right: matching identifies (among other things) people who might be guilty of wrongdoing before there is any evidence against them, and those identified have no chance to defend themselves.

• Example: matching of federal employees with welfare rolls identified some who were taking welfare payments while still being employed by the federal government. Those identified were not informed until much later -- a violation of due process.

Page 7:

• Yet there is much utility to collecting information:

– credit cards

– preventing sale of handguns to convicted felons

– target marketing -- more efficient for business

– identifying physicians who double bill for Medicare and Medicaid services.

Page 8:

A Creative Middle Way?

• How do we combine the utility of computer databases with an adequate protection of privacy?

• Congress passed the Privacy Act in 1974, which prohibited the executive branch from using information gathered in one program area in an entirely different and unrelated area. Enforcement was left to the agencies, which interpreted the Act to suit themselves. The Act has lost much of its force.

Page 9:

• Anita Allen has proposed some guidelines; a sketch of how some of them might be enforced in code follows the list.

– The existence of data systems containing personal information should be public knowledge.

– Personal information should be used only for specific purposes and in ways consistent with the purposes of its collection.

– Personal information should be collected only with the informed consent of the persons about whom the information is collected.

– Personal information should not be shared without the consent of those about whom the information is collected.

– To ensure accuracy, storage of information should be limited in time and individuals should be able to review it.

– The collectors of personal data should ensure its integrity and security.
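As a sketch only, here is one way the purpose-limitation and consent guidelines might be expressed in a data system's access layer. The class, field names, and purposes are all invented for illustration, not drawn from any real system.

```python
# Hypothetical enforcement of two of Allen's guidelines: personal data
# may be used only for purposes consistent with its collection, and
# only with the informed consent of the data subject.

class PersonalRecord:
    def __init__(self, data, collection_purpose, consented_purposes):
        self.data = data
        self.collection_purpose = collection_purpose
        self.consented_purposes = set(consented_purposes)

    def access(self, requested_purpose):
        # Guideline: use must be consistent with the purpose of collection.
        if requested_purpose != self.collection_purpose:
            raise PermissionError("use inconsistent with collection purpose")
        # Guideline: no use without the subject's consent.
        if requested_purpose not in self.consented_purposes:
            raise PermissionError("no consent for this use")
        return self.data

record = PersonalRecord({"income": 52000},
                        collection_purpose="tax assessment",
                        consented_purposes={"tax assessment"})
print(record.access("tax assessment"))   # permitted
# record.access("target marketing")      # would raise PermissionError
```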

Page 10:

III. Ownership of Computer Software

• The VP-Planner spreadsheet had copied the entire menu structure of the Lotus 1-2-3 spreadsheet. Lotus sued, but Paperback Software International (owner of VP-Planner) argued that only the part of a computer program written in some computer language can be copyrighted, not the overall organization, menu, or general presentation on the screen. Lotus countered by arguing that copyright protection extends to all elements of a computer program that embody original expression.

Page 11:

* On June 28, 1990, Judge Keeton ruled that the Lotus spreadsheet was original and non-obvious enough to be copyrightable and that VP-Planner had infringed on the copyright.

COPYRIGHTS vs PATENTS

* Software does not fit the paradigm of either copyright or patent protection.

* In some ways it is like a “work of authorship” and should be copyrighted:

* written in a language

* has a logical sequence like a story

Page 12:

* In some ways it is more like an invention, and should be patented: a program is more like a list of ways to react to certain conditions, much as a machine reacts to conditions.

* Or perhaps it is a “legal hybrid,” so that special laws should be made for software; but these special laws might not be recognized elsewhere in the world.

* Copyrights have been the most popular form of legal protection. But one may copyright only the expression of an idea, not the idea itself (e.g., boy meets girl, they get married and live happily ever after).

* Is Lotus 1-2-3 copyrightable?

* Copyright does not cover algorithms, the most creative part of the program, so there is reason to look at patents.

Page 13:

* In order to be patented, a program must be

* useful

* novel

* non-obvious

* the type of thing that is generally accorded a patent.

* processes (way to cure rubber)

* machines (hearing aid)

* Programs do not fit neatly into any of these categories, but are usually considered processes. But processes change something. What do processes change? The data? The internal structure of the computer?

Page 14:

* Another problem: mathematical algorithms are not patentable, but applications of them are. It is not always easy to distinguish the two (the application issue).

OWNERSHIP: THE MORAL BASIS

* Ownership = the ability to control use and to exclude others from use (e.g., my car).

* Rule utilitarians can argue that granting ownership of software promotes the growth of technology.

* Rule utilitarians can also argue that there was more innovation in the early days of software development, when software was free.

* RP arguments hold that people own their bodies and whatever they mix the labor of their bodies with, provided the other material is unowned or owned by them.

Page 15:

* It is difficult to believe that there would be sufficient creativity in, and distribution of, computer software without legal protection. RP arguments also support strong legal protection.

* Since algorithms are among the most creative parts of computer programs, patent protection should be allowed, because it is the only vehicle for protecting algorithms.

* Perhaps mathematical algorithms should not be protected, however, because this would be too much a restriction on creativity.

Page 16:

IV. Computer Abuse: A Spectrum of Cases

* One of the interesting questions raised by computers is the moral status of certain actions.

* What things seem important in determining the moral status of various types of computer abuse?

* malicious intent

* reckless attitude

* negligent attitude

* harm or good done

Page 17:

* Now consider the following cases -

(1) On March 2, 1988, the anniversary of the advent of Apple Computer’s Macintosh II and SE models, the following message popped up on the monitors of approximately 350,000 Macintosh personal computers in the U.S. and Canada:

Richard Brandow, the publisher of MacMag, and its entire staff would like to take this opportunity to convey their universal message of peace to all Macintosh users around the world.

Brandow admitted that he had authored what came to be called the Aldus Peace Virus. This was the first known contamination of off-the-shelf software. The virus did not destroy files or interfere with computer functions, and it erased itself.

Page 18:

According to this analysis, Brandow’s action does not count as a serious moral offense, although Brandow should not have done what he did. There was no malicious intent and relatively little harm was done.

Feature                      Neg. Paradigm (Yes) -- X marks the Brandow case -- Pos. Paradigm (No)

Malicious Intent             Yes _ _ _ _ _ _ X _ No
Recklessness                 Yes _ _ _ _ X _ _ _ No
Negligence                   Yes _ _ _ _ _ _ X _ No
Foreknowledge of harm done   Yes _ _ _ _ _ _ _ _ No
Harm to others               Yes _ _ X _ _ _ _ _ No
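These charts use a line-drawing technique: each feature is located on a scale between a negative paradigm (clearly culpable, the "Yes" end) and a positive paradigm (clearly acceptable, the "No" end). As a rough sketch of the method, the positions can be treated as numbers; the scores below are approximate readings of the X marks in the Brandow chart, not part of the original analysis.

```python
# Line-drawing sketch: score each feature on a 0-7 scale, where 0 sits
# at the negative paradigm ("Yes") and 7 at the positive paradigm ("No").
# These scores are rough readings of the X marks in the Brandow chart.
brandow = {
    "malicious intent": 6,
    "recklessness": 4,
    "negligence": 6,
    "harm to others": 2,
}

def seriousness(case):
    """Average distance from the positive paradigm; higher = more serious."""
    return sum(7 - score for score in case.values()) / len(case)

print(f"Brandow seriousness: {seriousness(brandow):.2f} out of 7")
```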

Page 19:

(2) In November 1988, Robert T. Morris, a graduate student in Cornell University’s computer science program, unleashed a computer worm on the Internet. By the following morning, over 6,000 computers running a version of the UNIX operating system known as 4.3 BSD had been affected. The worm also affected versions of UNIX that ran on Sun Microsystems computers. The worm operated by uncontrollable replication, which clogged each computer’s memory and resulted in substantial degradation of the system. The worm did not, however, destroy system or user files or data. Estimates are that it took twenty person-days to purge computers of the worm. Morris may not have intended the worm to replicate uncontrollably, although, given his knowledge, he should have been aware that it would. He did, however, design the worm so that it would not be detected, and so that it would continue to exist even if it were detected. The investigation of Morris concluded that he recklessly disregarded the most probable consequences of his actions.

Page 20:

Morris’ intent was not malicious, although it was reckless and in some areas negligent. The harm done was considerable, but not catastrophic. Morris was morally culpable, but not evil.

Feature                      Neg. Paradigm (Yes) -- X marks the Morris case -- Pos. Paradigm (No)

Malicious Intent             Yes _ _ _ _ X _ _ _ No
Recklessness                 Yes _ X _ _ _ _ _ _ No
Negligence                   Yes _ X _ _ _ _ _ _ No
Foreknowledge of harm done   Yes _ _ _ _ X _ _ _ No
Harm to others               Yes _ _ X _ _ _ _ _ No

Page 21:

(3) Kevin David Mitnick received one of the early convictions under the Computer Fraud and Abuse Act of 1986. Mitnick, a twenty-five-year-old, was described by a colleague as one who could not pass a day happily unless he had invaded some computer network or database which he was not authorized to enter. Although many of his intrusions were benign, some were not.

Page 22:

Although Mitnick’s intent appeared to contain an element of malice, his aim was apparently more to prove his computer prowess than to cause harm. He caused some harm, however, and his action was reckless. Morally, his action is probably more serious than that of Brandow or Morris.

Feature                      Neg. Paradigm (Yes) -- X marks the Mitnick case -- Pos. Paradigm (No)

Malicious Intent             Yes _ X _ _ _ _ _ _ No
Recklessness                 Yes _ X _ _ _ _ _ _ No
Negligence                   Yes _ X _ _ _ _ _ _ No
Foreknowledge of harm done   Yes _ X _ _ _ _ _ _ No
Harm to others               Yes _ X _ _ _ _ _ _ No

Page 23:

(4) In 1985, a brokerage house and insurance company in Fort Worth, Texas, discovered that 68,000 of its sales commission records had vanished. By working over a weekend, employees were able to recover most of the records, but the next Monday the entire system crashed and became inoperable. No permanent damage was done, but the culprit was identified as Donald Gene Burleson, who was described as a talented programmer, but also arrogant and rebellious against authority. Burleson was convicted of computer abuse under the Texas Penal Code, which permits a felony charge to be filed if the damage from altering, damaging, or destroying data, from causing a computer to malfunction, or from interrupting normal operations exceeds $2,500. Here, unlike the Morris case, there was malice aforethought. Burleson was fined $11,800 and given a seven-year probation.

Page 24:

From a moral standpoint, this was the most serious offense, because Burleson intended to do harm, and his action was arrogant and reckless. The long-term harm was minimal, but only because of the intervention of others.

Feature                      Neg. Paradigm (Yes) -- X marks the Burleson case -- Pos. Paradigm (No)

Malicious Intent             Yes X _ _ _ _ _ _ _ No
Recklessness                 Yes X _ _ _ _ _ _ _ No
Negligence                   Yes _ _ _ _ _ _ X _ No
Foreknowledge of harm done   Yes X _ _ _ _ _ _ _ No
Harm to others               Yes _ _ _ X _ _ _ _ No

Page 25:

V. Computers and Moral Responsibility

The Therac-25 was a high-energy accelerator that could destroy tumors with minimal damage to surrounding healthy tissue. It was developed by Atomic Energy of Canada Limited (AECL). Eleven machines were installed in the U.S. and Canada. Between 1985 and 1987, six accidents involving massive overdoses occurred. The machine was almost totally software-controlled.

1. (1985) Kennestone Regional Oncology Center. The patient felt a “tremendous force of heat,” and her breast had to be removed.

2. (1985) Ontario Cancer Foundation, Hamilton, Ontario. The patient died of cancer, but a total hip replacement would have been necessary because of the radiation overdose.

3. (1985) Yakima Valley Memorial Hospital, Yakima, Washington. The patient was left in constant pain due to overexposure. The fact that there had been other incidents did not trigger an investigation.

Page 26:

4. (1986) East Texas Cancer Center, Tyler, Texas. The patient died from complications of the overdose.

5. (1986) East Texas Cancer Center. The patient died from the overdose.

6. (1987) Yakima Valley Hospital. The patient died from the overdose.

The Evasion of Responsibility

This tragic story illustrates irresponsible actions on both the corporate and individual level. Yet the investigators of the accidents say they do not wish “to criticize the manufacturer of the equipment or anyone else.” Philosopher Helen Nissenbaum believes this situation is not unusual: “accountability is systematically undermined in our computerized society…”

Page 27:

Responsibility can be evaded in two ways:

• Depersonalizing responsibility - Responsibility can be “pushed up” to the corporate or organizational level. One possible locus of responsibility is the organizations involved, such as AECL, Yakima Valley Memorial Hospital, and the East Texas Cancer Center. The argument is that the actions were actions of the corporation, but corporations are not persons, so they cannot be responsible.

• Fracturing responsibility - Responsibility can be “pushed down” to so many people that it is unfair to pin responsibility on any individual person.

Consider the following issues which the case raises, some of which may involve corporate and some individual responsibility.

Page 28:

1. The manufacturing personnel who installed a faulty microswitch in some of the machines.

2. The AECL programmer who was responsible for the software coding errors and obscure error messages.

3. The AECL personnel who wrote the inadequate user manuals for the Therac-25.

4. The engineers and others at AECL who did inadequate testing and quality assurance.

5. Managers and others at AECL who made exaggerated claims about the safety of the machine.

6. The engineers and others at AECL who were slow in understanding and responding to the problems and informing customers of the problems.

Page 29:

7. The personnel at the hospitals and treatment centers who were insensitive to patient distress and who denied that a patient could have been burned.

8. Administrators at some of the facilities who did not ensure that the monitoring equipment was in working order.

9. The physicians who were slow to diagnose and adequately treat the radiation burns.

The result of these two types of evasion of responsibility is that terrible tragedies can happen, and yet we cannot pin the responsibility on any individuals or groups. How can these evasions of responsibility be avoided?

Page 30:

Corporate/Organizational Responsibility

How can we show that corporations and other organizations can be responsible for harm? (We will focus on corporations.)

A corporation is not a human person. It does not have a body, cannot be sent to jail, and has an indefinite life.

On the other hand, corporations are described as “artificial persons” in the law. According to Black’s Law Dictionary,

“…the law treats the corporation itself as a person which can sue and be sued. The corporation is distinct from the individuals who comprise it (shareholders).”

Corporations can come into being and pass away, and be fined.

But they have three characteristics that make them similar to moral agents.

Page 31:

• Corporations, like people, have decision-making mechanisms: boards of directors and executives make decisions, which subordinates then carry out.

• Policies that guide decision-making, such as corporate codes of ethics and a “corporate culture” (a corporate personality).

• Interests that are not necessarily the same as those of the executives, employees, and others who make up the corporation. Corporate interests may include making a certain purchase, completing a project, maintaining a good public image, and staying out of legal trouble.

Page 32:

• Example of a corporate decision: Suppose an oil corporation is considering beginning a drilling operation in Africa. A mountain of paperwork will be forwarded to the CEO and other top executives and probably the board of directors. When a decision is made, according to the decision-making procedure established by the corporation, it can properly be called a “corporate decision.” It was made for “corporate reasons” and presumably according to “corporate policy,” to satisfy “corporate interests.” Presumably, it was made according to standards of “corporate ethics,” and in line with the “corporate culture.”

• Both organizations and individuals can be held responsible for inaction as well as action, and for negligence as well as positive wrongdoing.

• How does all of this apply to Yakima Valley Memorial Hospital, the East Texas Cancer Center, and the other corporate entities involved in the Therac-25 accidents? Corporate policies (or the absence of corporate policies), corporate decisions, and corporate culture may have been responsible for many of the factors that led to the injuries and deaths - either through corporate action or corporate negligence.

Page 33:

• We can say several things about the responsibility of AECL, relying primarily on items 5-6 and 8 in the earlier list:

• Its employees did not have proper training in systems engineering.

• Its instruction manuals were inexcusably inadequate.

• Its claims for safety were exaggerated.

• Its responses to accidents were slow and inadequate.

• These faults were probably the result of management, decision procedures, and a corporate culture that was excessively driven by motives of profit. Or perhaps AECL was incompetently managed.

Page 34:

Overcoming Fractured Responsibility

• Philosopher Larry May, focusing on responsibility for inaction rather than action, has proposed the following formula: “…if a harm has resulted from collective inaction, the degree of individual responsibility of each member of a putative group for the harm should vary based on the role each member would, counterfactually, have played in preventing the inaction.” (cont.)

Page 35:

The more influence an individual could have had in preventing the inaction, the more responsibility the individual has for the inaction.

• We can use the same formula for responsibility for action: the more influence an individual had in causing the action, the more responsibility that individual has for the action.
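As a sketch of how May’s proportionality idea might be operationalized, assume each agent’s influence over the outcome can be given a rough numeric weight. The roles and weights below are invented for illustration, not drawn from the Therac-25 case.

```python
# May's formula, roughly operationalized: responsibility for a harm is
# apportioned in proportion to each agent's influence over the outcome.
# The roles and weights here are hypothetical.
influence = {"designer": 5.0, "manager": 3.0, "operator": 1.0}

total = sum(influence.values())
responsibility = {who: w / total for who, w in influence.items()}

for who, share in responsibility.items():
    print(f"{who}: {share:.0%} of the responsibility")
# designer: 56%, manager: 33%, operator: 11% -- the more influence,
# the more responsibility, as the formula suggests.
```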

• We can say several things about individual responsibility, relying on items 1-4 and 7-9 of the earlier list.

Page 36:

• The manufacturing personnel who built the faulty microswitch that controlled the position of the turntable on which patients were placed were important causal agents in some of the accidents, especially the one at the Ontario Cancer Foundation. The fault may be attributed to negligence, or even incompetence.

• The programmers who created the program were responsible for the errors in programming and the obscure error messages.

Page 37:

• The user manuals were remarkably poorly written. There was no explanation, for example, of the “error 54” message. Had the operators known how to respond to error messages, they might have been able to avoid some of the accidents.

• The engineers who designed the Therac-25 without hardware backups were major culprits in the accidents. If the machines had been equipped with the hardware backups, as earlier models were, the accidents probably would not have occurred (cont.)

Page 38:

If they lacked proper training in systems engineering, they were not competent to design the machines, and they were responsible.

• In some of the accidents, the technicians were shockingly insensitive to patient distress and complaints, partially due, no doubt, to AECL claims that radiation burns were not possible.

Page 39:

• In several cases, physicians were slow to recognize that overexposure had occurred; physicians in such medical settings should have been alert to the possibility of such injuries. AECL’s claims that overexposure was not possible may again have been a factor.

Page 40:

In the absence of additional information about the individuals involved, it is not possible to say what punishment, if any, should be given to the individuals who held partial responsibility for the deaths and injuries. It is clear, however, that the fact that many people were involved does not mean that they cannot be held responsible.

Page 41:

Proposals for Maintaining Responsibility in a Computerized Society

Page 42:

• Standards of care should be promoted in computer science and computer engineering. Guidelines for producing safer, more reliable computer systems should be promulgated and adhered to by computer professionals. The existence of such standards would also make it easier to identify those who should be held responsible. One such standard might be that software should not bear the sole responsibility for safety.
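To illustrate that last standard, here is a hedged sketch of a control loop in which a software limit check is backed by an independent hardware interlock, so that no single programming error can cause an overdose. Every name and threshold, and the interlock model itself, is invented for illustration; the point is the defense-in-depth pattern, the kind of hardware backup the earlier Therac models had and the Therac-25 lacked.

```python
# Sketch of "software should not bear sole responsibility for safety":
# a software limit check backed by an independent hardware interlock.
# All names, limits, and the interlock model are hypothetical.

MAX_SAFE_DOSE = 200  # invented threshold, in arbitrary units

def software_check(requested_dose):
    """First line of defense: the control program refuses unsafe requests."""
    return requested_dose <= MAX_SAFE_DOSE

class HardwareInterlock:
    """Stand-in for a physical relay that trips independently of the
    control software -- the kind of backup the Therac-25 lacked."""
    def permits(self, dose):
        return dose <= MAX_SAFE_DOSE

def deliver(requested_dose, interlock):
    if not software_check(requested_dose):
        return "blocked by software"
    if not interlock.permits(requested_dose):
        return "blocked by hardware interlock"  # catches software bugs
    return "dose delivered"

print(deliver(150, HardwareInterlock()))  # dose delivered
print(deliver(500, HardwareInterlock()))  # blocked by software
```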

Page 43:

• Strict liability should be imposed for defective customer-oriented software and for software that has a considerable impact on society. Strict liability would help to ensure that victims are properly compensated, and it would send a strong message to software producers that they should be vitally concerned with public safety. Presently, software producers give little if anything in the way of warranties for the safety or reliability of their products.

