Calendar
Source: IRB: Ethics and Human Research, Vol. 17, No. 4 (Jul.-Aug., 1995), pp. 4-6
Published by: The Hastings Center
Stable URL: http://www.jstor.org/stable/3564153
Accessed: 17/06/2014 05:00



CALENDAR

19-20 OCTOBER PRIM&R and Tufts University School of Medicine will cosponsor a conference, IRBs: Encountering Special Problems of this Decade, in Boston. Immediately preceding PRIM&R's conference, on 18 October, Applied Research Ethics National Association (ARENA) will hold its annual human subjects research meeting, IRBs on the Edge. For information: PRIM&R, 132 Boylston St., Boston, MA 02116; (617) 423-4112; (617) 423-1185 fax.

9-10 NOVEMBER PRIM&R and Tufts University School of Medicine will cosponsor a conference, The Future of Clinical AIDS Research, in Boston. For information: PRIM&R, 132 Boylston St., Boston, MA 02116; (617) 423-4112; (617) 423-1185 fax.

10 NOVEMBER The Center for Bioethics at the University of Pennsylvania Medical Center in Philadelphia will host a conference, Informed Consent: Who Really Decides? For information: Helen DiCaprio, The Center for Bioethics, University of Pennsylvania, 3401 Market St., Suite 320, Philadelphia, PA 19104-3308; (215) 573-9378; (215) 573-3036 fax.

JOB LISTING The Clinical Center, National Institutes of Health, invites applications for chief of the Clinical Bioethics Program. The chief serves as the principal adviser to the director of the Clinical Center and Institute clinical directors on matters related to ethical aspects of patient care and research involving human subjects. Application deadline: 13 October. For information: Alice L. Owens, Personnel Office, The Clinical Center, (301) 496-6924.

Experiment Spot-Checks: A Method for Assessing the Educational Value of Undergraduate Participation in Research by R. Eric Landrum and Garvin Chastain

For years, psychology departments around the nation have justified the use of subject pools as having educational value for students.1,2,3 That is, they claim that students can learn firsthand about the research process through participation, and that this out-of-class activity fosters an understanding of, and appreciation for, psychological research. Ideally, there is a dual benefit (to the researcher and the student participant) with minimal risk (typically the time necessary to participate). In the last fifteen years, however, some have begun to question whether participation in a subject pool is a valuable educational experience.4,5

A related issue is that of coercion, or perceived coercion, for students to participate in research. While an interesting issue, it is not the focus of this paper. We attempt to answer the following: (1) Do students feel that participating in research is a learning experience? and (2) Can we have some confidence that the instrument used to measure student perceptions has some validity?

R. Eric Landrum, PhD, is associate professor and Garvin Chastain, PhD, is professor, Department of Psychology, Boise State University, Boise, Idaho.

These questions have been addressed in the past with mixed results. Britton4 developed a questionnaire to assess the ethical and educational aspects of subject pool participation, and found that an experimenter's politeness, student comfort, and the explanation given for performing the experiment rated highly, while the educational value of the experience rated somewhat lower. Britton urged supervisors of subject pools to gather information about the subjects' experience, and suggested improving the debriefing process as a method of enhancing that experience. Debriefings tend to be seen as a critical component. Coulter6 suspected that insufficient debriefings were responsible for students rating research experiences as boring, irrelevant, and a waste of time.

Of course, student opinion of research participation is mixed. Although Coulter6 found students held negative attitudes toward research, Leak7 found that students viewed research participation positively. In exploring student attitudes toward research, Nimmer and Handelsman8 performed a quasi-experiment with groups of students working in different research situations. They found that students felt that research participation does have some educational value. Further, they recommend adequate debriefings and giving a one-page questionnaire to students for their feedback.

In their survey of graduate departments in psychology, Sieber and Saks5 noted that some departments have developed an evaluation form to assess educational value. The present study is an extension of that approach, involving the development of an assessment instrument directed to student-participants, and its use by a group of researchers for one academic year.

Method

Participants. Each student in the General Psychology course at Boise State University is required to complete some sort of outside-of-class activity exposing him or her to psychological research. Most students choose to be research participants. During the 1993-1994 academic year, all researchers in the psychology department cooperated by having 10 percent of their experimental subjects fill out an experiment spot-check form. Questions on the form are presented in Table 1. Two hundred subjects completed these forms during the 1993-1994 academic year.

Materials. Based on a review of the literature and the general concerns about the value of research participation, we formulated six items to which students replied using 5-point Likert-scale (strongly disagree to strongly agree) responses. We also tracked the particular experiment for which the form was completed, the date, the number of other projects in which the students had participated (our students need two research experiences), the semester of participation, and whether or not the student signed the spot-check form (the student's signature was optional).
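As a concrete illustration, a completed spot-check form with the fields described above could be represented as a small record. This is a hypothetical sketch: the field names, example values, and experiment label are ours, not the study's.

```python
# Hypothetical record for one completed spot-check form, mirroring the
# data the text says were tracked. All names and values are illustrative.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class SpotCheckForm:
    experiment: str                  # which experiment the form was completed for
    date: str                        # date of participation
    prior_participations: int        # other projects already completed
    semester: int                    # 1 = fall, 2 = spring (coding used in Table 1)
    signed: bool                     # the signature was optional
    ratings: Tuple[int, ...]         # six 5-point Likert responses (1-5)

form = SpotCheckForm(
    experiment="example-experiment",  # hypothetical label
    date="1993-10-15",
    prior_participations=1,
    semester=1,
    signed=True,
    ratings=(5, 4, 4, 2, 4, 5),
)

# Responses must stay on the 5-point scale described in the Materials section.
assert all(1 <= r <= 5 for r in form.ratings)
print(len(form.ratings))  # → 6, one response per questionnaire item
```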

Design and Procedure. Spot-check forms were distributed to all researchers at the beginning of each semester. They were asked to administer this form to 10 percent of their research subjects. Researchers varied in how they selected their sample; for example, those running single-subject sessions administered the form to every tenth subject, while those running group sessions occasionally asked an entire group of students to complete the form.
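The every-tenth-subject rule for single-subject sessions amounts to simple systematic sampling. The sketch below is an illustration under assumed names (the helper function and session labels are ours, not from the study).

```python
# Systematic sampling sketch: select every nth subject from a session log.
# The helper name and labels are illustrative assumptions.

def select_every_nth(subjects, n=10):
    """Return every nth subject (the 10th, 20th, ... for n=10)."""
    return subjects[n - 1::n]

# Hypothetical log of 100 single-subject sessions.
sessions = [f"subject_{i:03d}" for i in range(1, 101)]

spot_checked = select_every_nth(sessions, 10)
print(len(spot_checked))   # → 10, i.e. 10 percent of the sessions
print(spot_checked[0])     # → subject_010
```

Group sessions, as the text notes, break this pattern: giving the form to an entire group occasionally over-samples, which is one reason the realized rate is only approximately 10 percent.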

Results

Table 1 presents means and standard deviations for the six questionnaire items, as well as information on the number of days into the semester that had elapsed when the experiment was conducted, the semester of participation, the number of times the student had already participated in research, and whether or not the student signed the form.

A factor analysis was performed on the student response data. Factor analysis is a multivariate statistical tool that allows for the examination of multiple relationships between variables simultaneously. The results of a factor analysis identify patterns of response among multiple variables. These interrelationships are often maximized using a variety of mathematical rotations of the resulting matrix of values. In the current study, using a varimax rotation and a minimum eigenvalue of 1.0, the solution converged in 7 iterations. (An eigenvalue is similar to a cutoff value indicating a degree of communality among the clustered variables; iterations refers to the number of rotations necessary to find the optimal statistical solution.) Table 2 shows the factor loadings, scores indicating the degree of expression on any one particular factor (all above 0.5). Factors 1 (educational value) and 3 (professionalism) are the most important. Factor 2, time, seems to be an index of how far into the semester the experiment was completed, and Factor 4, with only one question loading on it, seems to identify disclosure (signature).
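The extraction step described above (retain factors whose eigenvalue exceeds 1.0) can be sketched on synthetic Likert-style data. This is a minimal illustration, not the study's analysis: the data, true loadings, and noise level are invented, and no varimax rotation is applied.

```python
# Sketch of the minimum-eigenvalue (Kaiser) criterion on synthetic data:
# eigenvalues of the inter-item correlation matrix, retaining factors
# with eigenvalue > 1.0. All data here are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_respondents, n_items = 200, 6

# Two latent factors drive two clusters of items, plus noise.
latent = rng.normal(size=(n_respondents, 2))
true_loadings = np.array([[0.8, 0.0], [0.8, 0.0], [0.7, 0.0],
                          [0.0, 0.8], [0.0, 0.7], [0.0, 0.6]])
responses = latent @ true_loadings.T \
    + rng.normal(scale=0.5, size=(n_respondents, n_items))

corr = np.corrcoef(responses, rowvar=False)         # 6 x 6 correlation matrix
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]   # eigenvalues, descending

retained = eigvals[eigvals > 1.0]                   # Kaiser criterion
print(round(float(eigvals.sum()), 6))               # → 6.0 (trace = number of items)
print(len(retained))                                # factors passing the cutoff
```

A full replication would then rotate the retained factors (e.g., varimax) to obtain interpretable loadings like those in Table 2.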

Discussion

Do subject pools have educational value? Our students' answer is yes. Students agreed with statements indicating that participating helped them to learn about psychology and to understand research better, and students strongly agreed that they were treated fairly and with respect. They further indicated that participating in the experiment added variety to the course, and that the purpose of the experiment was adequately explained. Contrary to the findings of Coulter, our students disagreed that the experience was a waste of time. These findings, on the whole, provide us with some confidence that students value the educational experience provided by their participation in research, and that our researchers and student research assistants do a good job of treating students fairly and with respect, and provide an adequate debriefing.

Table 1. Means (M) and Standard Deviations (SD) for Spot-Check Items and Other Data

Item                                                                     M      SD
1. I was treated fairly and with respect during my research session.    4.87    .35
2. I learned about psychology by participating in a research project.   3.52    .94
3. The research experience is a good way to add variety to
   introductory psychology.                                             4.19    .75
4. I think that this research experience was a waste of time.           1.94    .82
5. I think that participating in this project helped me understand
   research better.                                                     3.60    .89
6. The purpose of this experiment was adequately explained to me.       4.41    .80
Days into Semester                                                     70.1   22.65
Semester Tested (1 = fall, 2 = spring)                                  1.30    .46
Number of Prior Research Participations                                 1.45    .83
Student Signature (0 = no, 1 = yes)                                      .80    .44

Note: Responses to items 1-6 were on a 5-point Likert scale from 1 = strongly disagree to 5 = strongly agree. N = 200.

Table 2. Factor Loadings Based on Rotated Factor Matrix

Factor 1 (Educational Value): 2. Learned .818; 5. Understand .810; 3. Variety .651; 4. Waste of Time -.605
Factor 2 (Time): Experiment No. .812; Days into Semester .710
Factor 3 (Professionalism): 6. Explained .744; Semester Tested -.574; 1. Respect .566
Factor 4 (Disclosure): Signature .898

Note: A factor loading represents the degree of relationship (or weight) between a particular item and a factor. Only factor loadings above 0.5 are displayed here. Each item actually has a loading on all four factors, but it is desired that an item load highly (above 0.5) on one factor and low on the others. Fortunately, that is the case in the present study.

Additionally, the spot-check form revealed that most of our students completed research in the second half of the course (70.1 days into the semester), that more students participated in research in the fall semester (69.5% of the total), that the average number of experiments previously completed when filling out the form was 1.45, and that 78.5 percent of the students signed the form.

The results of the factor analysis merely indicate that trends or patterns exist in the students' responses, and that there are sets of items that tend to evoke similar responses. Interpreting the factors is subjective, but we believe that the most meaningful factors to emerge are educational value and professionalism, factors that make good sense considering the original items and the purpose of the spot-check. Finding such underlying patterns and factors is part of the process of establishing the validity of the spot-check questions.

Having current information about the performance and outcomes of the department subject pool is valuable. It provides a snapshot of current performance, a method of confirming the educational value of research participation (accountability), and a vehicle for identifying and solving subject-related problems should they arise. Use of such a spot-check form has other benefits as well. Distributing the form to researchers at the beginning of the semester sensitizes them to these issues, since they know that their research project and personnel will be evaluated. Britton suggests that an experimenter's behavior may be influenced by the use of a questionnaire. This procedure also emphasizes the importance of debriefing in experimental studies, and the spot-check form clearly captures student opinion in that area.

Other researchers concerned with the educational value of research participation are encouraged to pursue those issues empirically. Based on the psychometric qualities of the questions developed here, others can have confidence in measuring the educational value and professionalism of research participation.

References

1. Orne, M: On the social psychology of the psychological experiment. American Psychologist 1962; 17: 776-83.

2. Sieber, JE: Planning Ethically Responsible Research: A Guide for Students and Internal Review Boards. Newbury Park, Calif.: Sage, 1992.

3. Underwood, BJ, Schwenn, E, and Keppel, G: Verbal learning as related to point of time in the school term. Journal of Verbal Learning and Verbal Behavior 1964; 3: 222-25.

4. Britton, BK: Ethical and educational aspects of participating as a subject in psychology experiments. Teaching of Psychology 1979; 6: 195-98.

5. Sieber, JE, and Saks, MJ: A census of subject pool characteristics and policies. American Psychologist 1989; 44: 1053-61.

6. Coulter, X: Academic value of research participation by undergraduates. American Psychologist 1986; 41: 317.

7. Leak, GK: Student perception of coercion and value from participation in psychological research. Teaching of Psychology 1981; 8: 147-49.

8. Nimmer, JG, and Handelsman, MM: Effects of subject pool policy on student attitudes toward psychology and psychological research. Teaching of Psychology 1992; 19: 141-44.

Clinical Trials Committees: How Long Is the Protocol Review and Approval Process in Spain? A Prospective Study by Rafael Ortega and Rafael Dal-Re

Since 1982, regulation of all clinical trial protocols (phases I to IV) in Spain, irrespective of sponsorship, involves mandatory review and approval by (1) the clinical trials committee (CTC) at each participating center, and (2) the Ministry of Health.1,2 The regulation also includes guidelines for the composition of the CTCs (which must be approved by the Ministry of Health), and states that they must evaluate ethical and scientific aspects of the protocol, as do ethics review committees in other western countries. The content of the dossier submitted is quite standardized: protocol (according to a 23-item format), patient consent form (also official format), case report form, and updated investigator's brochure. In recent years, a patient information sheet has also been requested. In addition, health insurance coverage for potential damages for subjects who participate in trials is also required by law.3 All documents should be in Spanish, but in practice this is limited to the protocol and forms related to the patient's consent.

Rafael Ortega, MD, is Clinical Research Manager, Medical Department, and Rafael Dal-Re, MD, PhD, is Medical Director, SmithKline Beecham Pharmaceuticals, Madrid, Spain.

Studying the protocol review and approval process at the CTC level is relevant because of the impact it may have on the timing of clinical research projects, and hence on their proper planning. Delays in initiating research due to this review process have been a source of concern,4-9 and they are likely to become increasingly so in the future, when shorter times for the international clinical development of drugs are to be sought actively.10 In addition, analysis of factors that could potentially influence--either positively or negatively--the time consumed by the review and approval process is also of interest, since the results may point to possible sources of improvement.

Material and Methods

The first 10 drug protocols submitted by our company to the CTCs since 1 July 1992 were evaluated. A database of study and CTC-related features was designed in advance for this prospective study. The following were recorded: type (local or multinational) and phase of protocol, essential features of design (comparative, use of placebo, double-blind, multicenter, etc.), time (days) from submission (by the investigator) to approval and from approval to reception
