NEWS FOCUS

Ethical Issues Underlying Responsible Conduct of Science Explored

Forum concluded ethical behavior needs to be communicated and practiced at all levels of research, urged training in ethics for all science students

Pamela S. Zurer, C&EN Washington

Two days isn't much time to wrestle with the pressing ethical issues facing science today.

But some 400 scientists, engineers, and research administrators attending Sigma Xi's forum on "Ethics, Values, and the Promise of Science" last month worked diligently to do so. They joined philosophers, historians of science, and government officials in assessing scientists' obligations both to themselves and to the society that supports their work.

Sigma Xi, the scientific research society composed of chapters and clubs across the nation, sponsored the meeting in San Francisco in an attempt to involve the scientific community in debate on key ethical issues. One goal of the forum was to recommend actions scientists can take to help restore public confidence in the way research is conducted.

Some speakers viewed the challenges facing science as stemming only from external pressures. In his address to the forum, Nobel Laureate J. Michael Bishop restated a theme that has become familiar within the community during the past decade: Science is underappreciated, underfunded, and under attack from a public unschooled in its basics. Bishop is professor of microbiology, immunology, biochemistry, and biophysics at the University of California, San Francisco.

"Fear, bewilderment, disdain: these are all opponents science must best," Bishop said. "And there is one other which is now current: mistrust The public is ignorant of the formula by which science advances, and hence is easy prey for our critics."

Bishop's lament, echoed by a few other speakers, found many sympathetic listeners at the forum. Yet the presentations that caused the greatest stir were those that challenged the research community to look within and examine its own principles and behaviors.

Gert: scientists react like sports fans

A philosopher confronted researchers with their illogic in holding established scientists to lesser standards than junior members of the community. A medical ethicist countered skepticism that ethics can or should be taught. And a panel of postdoctoral researchers described treatment by supervisors that could be viewed as discouraging at best.

"With 2 million members, it's un­likely that the moral character of scien­tists and engineers is different from that of other groups of people, such as doctors, lawyers, or philosophers," said Bernard Gert, professor for the study of ethics and human values at

Dartmouth College in Hanover, N.H. Gert was the only philosopher to serve on the National Academy of Sciences (NAS) panel that last year produced the report "Responsi­ble Science: Ensuring the Integrity of the Research Process." Everyone agrees, Gert said, that certain acts such as deceiv­ing, cheating, and neglecting one's duties are immoral un­

less one has adequate justification. "Some scientists may claim that . . .

some kinds of misreporting of their experiments, in order to enhance the acceptability of an hypothesis of whose correctness one is very confi­dent, is justified," Gert said. "They may hold that if experienced scientists are very confident of their claims, those claims are usually true, so that such enhancement will result in less time being wasted doing futile re­search. Thus, they may claim that this kind of deception actually results in more truth being discovered than fail­ure to deceive. It may be, in the words of [the NAS report] a questionable practice, but it is not scientific miscon­duct.

"I do not claim that as presently understood [such deception] would count as scientific misconduct; that is, fabrication, falsification, or plagiarism," Gert continued. "But I hold that from the point of view of morality, the only difference between many questionable practices and scientific misconduct is that the former are fairly widespread and the latter fairly rare."

As an example, Gert discussed the attitude of some scientists toward Robert A. Millikan's selective reporting of results of his famous oil-drop experiments. Gert characterizes their view as "consequentialism," the idea that all that counts morally are the consequences of one's actions.

As described in American Scholar [60, 505 (1991)], Millikan—after being challenged by a rival on his unit theory of electric charge—published a paper in which he says he presented all his data from 60 consecutive days. But Millikan's notebooks show that he omitted data from experiments that did not turn out the way he expected.

The author of the American Scholar paper, David Goodstein, vice provost of California Institute of Technology, says Millikan's omissions were motivated by the need to convince skeptics of what Millikan perceived to be the scientific truth. Goodstein says that "inestimable damage to science would have been done" had Millikan not succeeded.

Gert disagrees. "How does [the consequentialist] know that if . . . Millikan had been honest, admitting that [his] results were not perfect, that others might not have acted in such a way as to bring about the correct results more quickly? How does he know that these acts of honesty under great temptation to deceive might not have so influenced the scientific tradition that science would be even more successful than it is now?"

The tendency to hero worship great scientists, together with loyalty among members of the profession, may explain the failure to recognize that great scientists sometimes behave immorally, Gert said. "Some scientists seem to react very much like sports fans. If a scientist or player is great enough, then almost nothing they do counts as seriously wrong. . . . I do not want to inhibit any scientist from exercising his scientific judgment; all I want is that he not deceive anyone about what he is doing."

"Imagine teaching a graduate student to exercise his scientific judgment by keeping secret all unwanted results, publishing only those that confirmed his hypothesis, and claiming that he is publishing everything," Gert continued. "If this is the way in which graduate students in science are being taught, it is amazing that scientific fraud is not far more common than it seems to be. But I suspect that the students are made to understand, though not explicitly told, that this kind of behavior is not acceptable for them—it is only acceptable for someone who has already attained stature in the field."

Government should help promote scientific integrity, committee advises

The federal government has a role to play in promoting the responsible conduct of research, according to the Department of Health & Human Services' Advisory Committee on Research Integrity. The government should provide information, support scholarly research on ethics, and prod universities to take concrete steps to foster integrity in science, the panel concluded at a meeting last month.

Advising the government to get more involved in the affairs of academia is something of a switch for the group. In the two years of its existence, the committee—formally charged with advising the Secretary of Health & Human Services on issues of research integrity—has for the most part focused on limiting government's role in dealing with scientific misconduct.

For example, the 10-member panel's first efforts focused on narrowing the Public Health Service's (PHS) definition of scientific misconduct (C&EN, March 23, 1992, page 14). Its aim was to lessen the anxiety of the scientific community, many of whom fear the current definition is vague and overly broad. PHS intends to formally propose a revised definition based on the committee's recommendations later this year.

Last fall, the committee asked PHS's Office of Research Integrity (ORI) to abandon its application for an exemption from the federal Privacy Act (C&EN, Nov. 2, 1992, page 19). ORI, which investigates allegations of scientific misconduct, wants the exemption so it can protect the identities of whistleblowers and other confidential informants. In this instance, however, PHS has chosen not to accept the panel's advice and is proceeding with its application.

The committee held its sixth meeting in San Francisco on Feb. 27 and 28, immediately following Sigma Xi's forum on ethics. Although the group in previous sessions has concentrated on reviewing ORI's policies and procedures, this time the panel focused on what committee chairman Nicholas H. Steneck described as "the more difficult part of our charge: providing advice on strategies for promoting integrity in science." Steneck is professor of history and director of the Historical Center for the Health Sciences at the University of Michigan, Ann Arbor.

Research integrity is not the same thing as research fraud or misconduct, Steneck told the committee, but encompasses behavior the group did not want included in the definition of scientific misconduct. "Integrity means living a complete professional life," he said. "It brings in issues of sloppy science, inadequate mentoring or laboratory supervision, inadequate citations, and unwillingness to cooperate with colleagues."

Steneck said there is ample room for improvement. "We don't know the frequency of scientific misconduct but we do know a fair amount about problems with the integrity of biomedical research. . . . There is bias in reviewing. Researchers are pretty sloppy in some things, such as use of statistics. It may not be misconduct but it is research lacking integrity."

Education can address issues of scientific integrity, he continued. "We can either foster proper values or turn them off in our training of researchers. . . . The goal is to help researchers be more knowledgeable about proper handling of data, sharing information, and the social implications of their research. I'm a believer that education can make a difference."

But Steneck saw little evidence that universities will provide such education on their own.

Only since the National Institutes of Health (NIH) imposed a requirement that all recipients of research-training grants receive a grounding in ethical issues have some universities set up courses or symposia. "My reluctant conclusion is that if left on their own, [universities] would continue to make general statements about the importance of integrity, but there would be limited concrete action," he said. "There is a role for government in this area."

Surprisingly, Steneck's views met no opposition from the other committee members present. However, Estelle A. Fishbein, vice president and general counsel of Johns Hopkins University, who at earlier meetings argued forcefully against any expansion of government into the university's domain, did not attend the San Francisco meeting. In her absence the committee turned to the question of what the government could reasonably do.

The panel members decided the requirement for education in research ethics should be extended to all PHS-funded research. They advised ORI to publicize successful ethics courses in its new quarterly newsletter and to suggest—but for the time being not require—such training at all institutions receiving PHS funds.

The group also endorsed the idea that principal investigators on grants be required to complete a checklist on matters relating to scientific integrity when applying for grants or submitting reports. Suggested by Eleanor G. Shore, dean for faculty affairs at Harvard Medical School, such a checklist could include assurances that the investigator had formulated an appropriate plan for retaining primary data; all coinvestigators had been consulted in drafting proposals, reports, or publications; trainees had received training in research ethics; and attributions and citations had been verified.

"Faculty time spent on a checklist pales beside the time and effort of an inquiry and investigation" into scientific misconduct, Shore said. "A checklist reaches everyone, from the most junior to the most senior, with the notion there will be accountability if problems arise."

Noting that most current research on ethical issues in science is supported by the National Science Foundation, Steneck posed the question of whether PHS should set aside some funding specifically for research on questions of scientific integrity. He cited as an example NIH's Center for Human Genome Research, which reserves 3% of its $110 million annual budget for studies of ethical issues raised by the human genome project.

The committee backed the idea. "I would hope ORI would develop a research program with requests for proposals saying what sort of questions it would like to address, such as the prevalence of misconduct," said Paul J. Friedman, dean of academic affairs at the school of medicine at the University of California, San Diego.

Currently, ORI has no funds for underwriting such research, noted Lyle W. Bivens, ORI's acting director, but the office could incorporate the idea in its plans for next fiscal year's budget. He pointed out that PHS sets aside 1% of its total budget each year for program evaluation, suggesting these monies might prove to be a source for supporting research on scientific integrity.

The panel recommended that ORI include ethics research programs in its next budget. In the meantime, the committee advised that ORI make known its interest in such work in its newsletter and challenge outside groups to support ongoing research.

Gert's presentation seemed to leave many scientists in the audience uncomfortable—including Goodstein, who the day before had spoken persuasively on scientists' responsibility to educate the public. Not surprisingly, when Goodstein asked for a show of hands from all who considered themselves "hero-worshipping consequentialists," no one responded.

Bernard Lo, associate professor of medicine at UCSF, offered rebuttals to the objections of skeptics who doubt ethics can or should be taught. Lo heads a program in medical ethics that holds seminars for trainees on ethical issues in the conduct of biomedical and clinical research. Many of his colleagues, he told the forum, wonder whether ethical issues are worth the time, preferring to stick to their labs.

Lo: ethical guidelines are insurance

For example, Lo said he often hears: "Ethics is being a good person, not a system of rules or set of guidelines." He responds, he said, by pointing out that being a good person isn't always enough to act properly. "Scientists may be genuinely perplexed in a given situation; for example, how to respond to an allegation of misconduct. Scientists are human; they make mistakes and work under stress. Ethical guidelines are like insurance, protecting against scientists whose character is flawed, who suffer lapses in character, or who are overwhelmed by a difficult situation."

"Ethics is common sense and experience" is another objection to ethics training Lo regularly faces. He replies that many ethical dilemmas are so complicated that sensible and experienced scientists may be perplexed or disagree over what to do. Ethical guidelines can also help scientists make their thinking explicit and help them explain their reasoning to students and the public.

Another objection Lo hears is that "Every case is unique, so guidelines are impossible." He counters that general guidelines obviously cannot be applied mechanically. "We'll always need practical knowledge, judgment, and compassion. But certain situations—such as problems with missing primary data—come up repeatedly and ought to be resolved in consistent ways if we are to be fair."

Students learn best about ethics when their interest is captured by examples from real life and they actively take part in thinking issues through for themselves, Lo said. In seminars at UCSF, research trainees discuss realistic cases and participate in role playing that leads to developing specific suggestions for dealing with the problem at hand. Lo also tries to involve senior scientists, who by sharing their own experiences legitimize discussion about ethical issues.

Lo recruited from his institution two of the four postdoctoral researchers who took part in a panel discussion of their uniquely vulnerable place in the academic research system. Two more—both chemists—were drafted by Stanford University chemistry professor Carl Djerassi. The forum's organizers were puzzled by how difficult it was to get postdocs to come forward and describe their experiences—but the panel members were not surprised.

Jones and Meyer: postdoctoral scientists occupy precarious positions

"It's pretty much thought to be professional suicide to stand up and speak about a bad experience you've had/' said Jan Gurley, a physician participating in the Robert Wood Johnson clinical scholars program at UCSF. She de­scribed postdoctoral scientists' dependence on their bosses as at best apprenticeship and at worst indentured servitude or slavery.

Postdocs are often first to find out when something is wrong with a thesis, reagent, or experiment, Gurley point­ed out. Disappointed research supervisors' reactions all too often fit the "kill-the-messenger syndrome," with the post­doctoral scientists blamed or shunned.

Lisa Backus opted for medical school at UCSF rather than postdoctoral work after getting her Ph.D. degree in neuro­pharmacology in large part because of issues of mentoring and gender, she told the forum. "I can't overemphasize how important the issue of mentoring is," she said.

Mentors must teach young researchers to do good sci­ence, write papers, and get grants, Backus said, but they also must steer beginners through the politics of modern science. "He—and it almost always is he—must look out for your career. He must help you publish in the right place, meet the right people at conferences, and make calls to get you into the old-boys' network."

"It's difficult to find a principal investigator who will do all of this," she continued, "and issues of gender make it even harder. Men feel more comfortable mentoring other men." The lack of mentors has had noticeable negative ef­fects on women's careers, she said.

Tara Meyer, a postdoctoral researcher in chemistry at UC Berkeley, said her own experience has been positive. Nev­ertheless, she noted that postdocs occupy a uniquely precar­ious perch in the typical feudal hierarchy of academic insti­tutions. Having no separate relationship with the universi­ty, they are totally dependent on their research supervisors.

Meyer outlined what she sees as the obligations of research advisers toward their postdoctoral students—obligations she believes are largely understood but rarely discussed. Among those she listed are that postdocs must not face discrimination or sexual harassment. Their advisers need to give them a realistic picture of the nature of the project they will be working on and provide sufficient materials and equipment so they can perform the work.

Postdoctoral duties to the research group as a whole—such as supervising graduate students or ordering supplies—should be kept under control so that the postdocs will have time to carry out their own research. Their work should be properly acknowledged, and they should be given encouragement and help in finding jobs, she continued.

Meyer's husband, Garth Jones, a postdoctoral chemist at Stanford, described the difficulties in finding postdoctoral work. "Both academic and industrial institutions want new employees to have the breadth of experience a postdoc provides," he said, yet there is no systematic way to discover what opportunities exist.

Many institutions, fearing repercussions for failing to comply with equal employment opportunity regulations, never advertise available openings, Jones said. Jobs are found through the old-boy network or through the luck of contacting the right place at the right time. "There are endless possibilities for unfairness in hiring," he said.

After two intense days of thought-provoking speeches and spirited discussion groups, the attendees developed a set of conclusions and recommendations for the scientific community. Addressing issues ranging from teaching ethics to peer review to improving mentoring to the ethics of diversity, many of the recommendations assigned to scientists themselves the responsibility for improving both the practice of science and the public's perceptions.

Among other conclusions, the group decided that appropriate ethical behavior needs to be communicated and practiced at all levels of academic, governmental, industrial, and other research organizations. They also strongly recommended that training in ethics be required for all science students.

Another conclusion held that significant and widespread problems exist in mentoring relationships, while at the same time there is widespread reluctance to address those problems. Sigma Xi was urged to set up a committee to assess the situation and to define an ideal mentor.

And the group agreed that scientific misconduct is often an outgrowth of mismanagement or lack of proper supervision. In responding to allegations, institutions should also seek to ascertain why and how misconduct in research was allowed to occur.

Sigma Xi plans to publish the proceedings of the forum and its conclusions and recommendations by early summer, according to the society's executive director, John F. Ahearne. He described the forum as a first step in a major effort by Sigma Xi in the area of ethics and values. The society will form a steering committee to decide what to take on next.
