Page 1

DATA BASICS

A PUBLICATION SUPPORTED BY AND FOR THE MEMBERS OF THE SOCIETY FOR CLINICAL DATA MANAGEMENT, INC

To advance excellence in the management of clinical data

Volume 21 Number 4 / 2015 Winter

This Issue

2 Letter From the Chair

3 Letter From the Editors

4 Japan PMDA Inspections from a CDM/EDC Perspective

6 The Balancing Act: Edit Check Development in an Environment of Competing Demands

9 Automated Monitoring and Alerts for Interim Reporting in Phase II Clinical Trials

13 Seven Simple Rules to Better Data Management and Improved Pharmacovigilance in Non-Interventional Studies

18 Ask SASsy

20 Submission Requirements

Page 2

Letter From the Chair

Derek Perrin


It’s ALWAYS about the patient
October 2015 – Northbrook, IL

As Chair for the Society, I am so pleased to report that we had a record turnout for our Annual Conference in Washington D.C. A total of 746 people attended this year, representing 22 countries and over 200 companies. I would like to extend my appreciation and thanks to the 2015 Conference Task Force for organizing such a successful event!

On Sunday, September 20th, I attended the pre-conference Leadership Forum, along with over 35 industry colleagues. The topic for discussion was “Patient Centricity” in clinical research. The session consisted of four focused sessions led by subject matter expert panelists: Andy Lee from Merck, Anthony Costello from Mytrus, Jeremy Gilbert from PatientsLikeMe, and Joseph Kim from Eli Lilly and Company. Each of them shared their ideas and potential solutions to support building more patient centricity into the clinical development process.

The fifth panelist, Carly Medosch, truly highlighted the patient view into the work that we do every day. Carly is a patient advocate and previous clinical trial subject who agreed to come and share her powerful story with us. Her journey into the world of clinical trials started at the young age of 13, when she and her family looked for solutions to her chronic illness. I encourage you to check her website, www.chronicarly.com, to learn more about her journey and the work that she is doing to advocate for patients battling chronic diseases.

Carly represents millions of patients on the other end of that Case Report Form/EDC tool. She raised good suggestions on how to better engage patients and patient advocates in the design and conduct of clinical research. Specifically, how can we do each of the following:

• Involve potential clinical trial patients in the trial design and feasibility? That is, how do we design studies in which target patients can actually participate?

• Think about what diagnostics tests (e.g., blood draws) are required versus the “nice to have” sources of data for the trial?

• Share the results of the trial with the patients who participated? They are just as interested in the outcome and want to know.

• Make it as easy as possible for them to participate in the trial? As an example, create ePRO measures that are easy to understand and do not create an unnecessary burden for patient completion.

I’ve thought a lot about Carly’s messages after this Leadership Forum and I admire her commitment and advocacy for the millions of patients participating in clinical trials globally. It’s always encouraging to hear a patient’s success story from a study or drug that you’ve worked on. It was even more powerful to hear Carly’s story of what it means to be a clinical trial participant. Thank you for joining us, Carly, and thank you for sharing your story.

I encourage each of you to look for ways to best serve the patients, who are counting on each of us to make their lives a little better.

As we approach the year-end holiday season, I will end by wishing you all a happy holiday, good health and much happiness!

Derek C. Perrin, CCDM
2015 Chair, SCDM Board of Trustees

2015 SCDM Board of Trustees

Derek Perrin, CCDM, Chair, Astellas Pharma
Demetris Zambas, Vice Chair, Novartis
Debra Jendrasek, Treasurer, Global eClinical Solutions, Chiltern
Jaime Baldner, CCDM, Secretary, Clinical Data Management, Genentech
Jonathan R. Andrus, M.S., CQA, CCDM, Clinical Ink
Harshad Sodha, PAREXEL International
Charlene Dark, CCDM, Quintiles
Melissa Lamb, Advanced Clinical
Shannon Labout, CCDM, CDISC
Emma Banks, DataTrial
Vadim Tantsyura, Target Health, Inc.

Page 3


Letter From the Editors

Editorial Board (also known as Publications Committee)

Lynda L. Hunter, CCDM, PRA Health Sciences, Co-Editor
Michelle Nusser-Meany, CCDM, Novartis Oncology, Co-Editor
Rehana Blunt, M.A., Amgen
Maria Fassano, PRA Health Sciences
Stacie T. Grinnon, MS, CCDM, Quintiles
Kit Howard, Kestrel Consultants, Inc.
Elizabeth Kelchner, Rho, Inc.
Nadia Matchum, St Jude Medical
Claudine Moore, CCDM, Quintiles
Sanet Olivier, CCDM, Quintiles, Publication Committee Co-Chair
Derek Petersen, CCDM, Baxalta, Inc.
Margarita Strand, CCDM, Gilead Sciences, Inc.
Vadim Tantsyura, DrPH, Target Health, Inc.
Janet Welsh, CCDM, Boehringer Ingelheim Pharmaceuticals, Inc.
Rey Wong, MS, CCDM, Eisai Inc., Publication Committee Co-Chair
Chandra Wooten, CCDM, SDC

Dear Readers,

Once again, it has proven to be another memorable year for SCDM. The most recent success was the 2015 Annual Conference which took place in Washington, DC. For those of you unable to attend this year, we are working closely with conference and poster session presenters to encourage them to write articles for future issues of Data Basics.

This quarter’s edition of Data Basics is filled with a variety of thought-provoking articles that we are sure you will find not only informative, but very applicable to your data management work. We are fortunate to have two articles presenting interesting and challenging topics entitled “Automated Monitoring and Alerts for Interim Reporting in Phase II Clinical Trials” and “Seven Simple Rules to Better Data Management and Improved Pharmacovigilance in Non-Interventional Studies”. We look forward to your observations and questions about the concepts presented in these articles.

Also in this issue, two articles present posters from the 2015 Annual Conference. Research posters summarize information or research concisely and attractively to help publicize it and generate discussion. The articles represent the actual poster information. If you have questions or comments about the poster articles, please let SCDM know and we will foster communication between you and the poster authors.

If you have an idea or an article please feel free to contact SCDM. Guidelines for article submission can be found on the SCDM website.

Best Regards,

Lynda Hunter and Michelle Nusser-Meany

Data Basics Co-Editors

In our last issue, we inadvertently omitted Margarita Strand’s name from the Editorial Board list. Margarita was instrumental in putting together our Fall Issue and we would like to apologize to her and to the readers for this mistake.

Sincerely,

Fall 2015 Issue Editors

Save the Date!

Page 4

Japan PMDA Inspections from a CDM/EDC Perspective
By Ryan Lariviere

Objective
Describe the regulatory requirements and the process of preparation for New Drug Applications (NDA) submitted to the Japanese Pharmaceutical and Medical Devices Agency (PMDA), focusing on Clinical Data Management (CDM) and Electronic Data Capture (EDC) Systems.

Methods
Between June 2014 and June 2015, Gilead filed two consecutive, similar, first-in-country J-NDAs with the Japanese Pharmaceutical and Medical Devices Agency (PMDA). Performance metrics, such as number of preparatory meetings, number of individual work hours, number of emails, and number of inspection requests, related to the tasks of preparing for and participating in the approval process were collected and analyzed to demonstrate the change between initial and subsequent J-NDAs [Figures 2 and 3].

In general, there are three overarching sets of deliverables for which CDM was responsible:

PMDA GCP On-site Inspection
• Formal, one-day Good Clinical Practices (GCP) compliance inspection by 2-4 inspectors, typically within three months of J-NDA submission
• Performed on-site at the sponsor facility or facility of sponsor’s In-Country Representative (ICR)
• Requires real-time local and remote support from a large cross-functional team spanning nearly every aspect of Development Operations

• Mandatory topics are defined in the PMDA Inspection Sponsor Checklist

• Requires preparation, organization and cataloging of all study trial master file documentation

Inspection Sponsor Checklist
• Prerequisite documents for the PMDA GCP inspection
• Detailed list of documents inspected (at time of GCP on-site inspection) includes “Record of blank Case Report Form (CRF) development/revision,” “Records related to CRF and Data Management”
• Pre-inspection documents for Document Conformity Inspection (to be submitted within 30 days of notification of inspection date [Figure 1]) includes “Record of Blank CRF versions,” “Explanatory material of QC/QA process,” “Explanatory material of EDC (including Validation data of EDC system and EDC Management Sheet),” “Copies of all completed CRFs for site(s) subject to inspection”

EDC Management Sheet

• Describe policies and procedures ensuring compliance with each Japanese Electronic Record/Signature (ERES) guideline1

• Requires diverse cross-functional knowledge of contracts, systems, policies and organization: Biometrics (CDM, Programming, EDC Vendor), Clinical Operations (Project Management, Contracts and Finance), Regulatory (Compliance, Affairs, Operations), IT (Business, Systems)

Results
Results are presented in Figures 1, 2, 3.

Figure 1.

Figure 2.

Figure 3.

Continued on page 5

Page 5


Conclusion
The most obvious difference observed between Japanese and American/European submissions was a shift in focus from objective data quality to process and system quality. It is noted that at the time of these J-NDAs, electronic data submission was not required or included. This removed the need for common Food and Drug Administration (FDA) deliverables such as Study Data Tabulation Model (SDTM) data sets, define.xml file and sample CRFs. However, as of 01 October 2016, submission of Clinical Data Interchange Standards Consortium (CDISC) compliant electronic data will be required2.

Based on the experience of these two first-in-country NDAs, the following advice and best practices were observed:

PMDA GCP On-site Inspection
• Create visual process flows

• Utilize earphones for real-time interpretation

• Develop a handover process for clear and direct access to remote support

• Create a “Code of Conduct” for those present during the inspection

• In terms of preparation time and efficiency, electronic filing of support documentation is preferable

• Provide Clinical Research Associate (CRA) training and oversight: create “story boards” for CRAs to reference

• Recycle/Re-use work where possible, ensuring document/Standard Operating Procedures (SOP) versions were effective during the course of the clinical trial being inspected

Inspection Sponsor Checklist
• Know your timelines! Be aware that pre-inspection documents are required within 30 days of notification of inspection date (~10 weeks post J-NDA submission)

• Proactively identify and document process deviations, and have a prescribed process for documenting them

• PMDA prefers completed subject casebooks provided on write-only CD-ROM or DVD

• Be prepared to explain QA/QC measures from data entry to data quality review to programmatic statistical analysis to Clinical Study Report publication

EDC Management Sheet
• Ensure there is a clear record of system versioning (contact your vendor) and of vendor contracting

• Recognize the difference between “EDC System” and “Study Configuration”; avoid referencing study specific documents

• Be able to demonstrate user management processes for all user groups, a process for review and maintenance of active system users

• Understand what is considered source data and where it is stored, during conduct and after completion of the clinical trial

• Proactively identify and investigate critical system faults affecting data and audit trail integrity.

References:

1. Japan Pharmaceutical Manufacturers Association Drug Evaluation Committee, European Pharmaceutical Federation (EFPIA), and Pharmaceutical Research and Manufacturers of America (PhRMA). June 30, 2013. Accessed October 15, 2014. <http://www.jpma.or.jp/information/evaluation/allotment/translation_edc.html>.

2. Japan Pharmaceutical Manufacturers Association Drug Evaluation Committee, European Pharmaceutical Federation (EFPIA), and Pharmaceutical Research and Manufacturers of America (PhRMA). June 30, 2013. Accessed October 15, 2014. <http://www.cdisc.org/Japan-PMDA-Technical-Conformance>.

Author Biography:

Ryan Lariviere is a Clinical Data Associate (CDA) at Gilead Sciences, Inc., based in Foster City, California. Ryan’s experience as lead CDA for two Phase 3 Japanese clinical trials, which became the basis for J-NDA submissions and subsequent PMDA inspections, inspired this article. Ryan continues to support the product launch in Japan through his role as lead CDA for Japanese post-marketing studies.

Japan PMDA Inspections from a CDM/EDC Perspective Continued from page 4

Simplify the Path to Approval with Powerful eCOA Solutions

Page 6


The Balancing Act: Edit Check Development in an Environment of Competing Demands
By D.W. Theriaque, D. Leblond, J. Auman

Introduction
The development of edit checks (ECs) serves an important role in the creation of any high-quality data management system, allowing us to identify potentially problematic data during data acquisition1. Within the Clinical Research Informatics core at RTI International, edit check requirements are developed within the larger framework of system design and development efforts that take place for every clinical research study.

EC development also takes place in an environment of competing demands. These demands typically relate to the complexity of project requirements, timetable for study start-up, budgetary limitations and developer workload. Finding the right balance of effort in our development work is an important consideration in order to achieve both project-specific and overarching informatics goals, and for projects developed under US Food and Drug Administration (FDA) regulations, systems “…should be designed to… prevent errors in data creation, modification, maintenance, archiving, retrieval or transmission…”2

Purpose
The purpose of this project was to perform a qualitative analysis, characterizing our in-house edit check development processes for the Electronic Data Capture (EDC) systems we use. We sought to relate our EC development processes to project technical requirements, budget, and developer workload to determine the most appropriate level of effort for edit check development. Specific areas we wanted to evaluate included how study complexity related to edit check complexity and the factors that limit our ability to develop comprehensive ECs.

Methods
We reviewed edit check creation activities for more than twenty ongoing and recently completed studies, specifically evaluating the effort expended for these types of EC programming activities:

• Standard EC programming and system configuration for commonly used, simple ECs;

• Custom coding of reusable ECs, which are modified and applied across multiple studies and

• Custom coding of “one-off” ECs to support unique, project-specific requirements.

We compared the level of effort of these programming activities to:

• The study development budget;
• Project requirements complexity and
• Developer availability and workload.

Results

Edit Check Programming
Through review of our standard in-house EC development practices, we identified three distinct levels of programming activities. As described below, and detailed in Table 1, these included Standard ECs, Custom-Coded Reusable ECs, and Study-Specific Custom-Coded ECs.

Standard ECs were used to identify missing values, evaluate basic skip logic and perform range checks and were developed through simple coding or via system configuration. These ECs are typically considered “dynamic” ECs and depending on the EDC system, are executed either in real time or when the record is saved.
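As a minimal illustration of this first level, the sketch below shows what a missing-value and range check can look like when written in SAS; the VITALS dataset and its SUBJID, VISIT and WEIGHT variables are hypothetical, and in practice such standard checks are usually configured directly in the EDC tool rather than hand-coded.

/* Minimal sketch of a standard edit check (missing value + range)   */
/* against a hypothetical VITALS dataset.                            */
data ec_findings;
  set vitals;
  length check $32 message $120;
  if missing(weight) then do;                   /* required-field check */
    check   = "WEIGHT_MISSING";
    message = "Weight is missing";
    output;
  end;
  else if weight < 30 or weight > 250 then do;  /* simple range check   */
    check   = "WEIGHT_RANGE";
    message = "Weight outside expected range (30-250 kg)";
    output;
  end;
  keep subjid visit check message;
run;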

As their name implies, Custom-Coded Reusable ECs were developed by adapting existing code and repurposing it for new studies that have similar design requirements. This programming required a modest level of effort, but provided necessary higher-order evaluation of relationships between the data. As an example, this type of EC is commonly developed for eligibility forms used for research networks and consortiums. Studies within a given network tend to focus on similar populations and use similar inclusion and exclusion criteria. Thus, ECs written for one study can be easily adapted for subsequent projects.
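A reusable custom-coded check might be sketched as below: an eligibility age check wrapped in a SAS macro so the same logic can be pointed at different studies in a network. The dataset, variable names (ELIGFL) and limits are illustrative, not taken from the authors' code.

/* Minimal sketch of a reusable custom-coded check: subjects flagged  */
/* eligible (ELIGFL="Y") whose age falls outside the study's limits.  */
%macro age_eligibility_check(indata=, agevar=age, minage=18, maxage=75, out=ec_age);
  data &out;
    set &indata;
    length message $120;
    if eligfl = "Y" and not missing(&agevar)
       and (&agevar < &minage or &agevar > &maxage) then do;
      message = catx(" ", "Marked eligible but age",
                     put(&agevar, best.), "is outside &minage-&maxage");
      output;
    end;
  run;
%mend age_eligibility_check;

/* The same macro is reused across studies with different limits.     */
%age_eligibility_check(indata=studya.enroll, minage=18, maxage=65, out=ec_age_a);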

Custom-coded study-specific ECs were developed under special circumstances, only when absolutely necessary. Because of their specificity and complexity, they have limited reusability and are developed as “one-off” EC solutions. Both types of custom-coded ECs typically execute either when the record is saved or as part of a batched process at pre-specified time points.

Project Budget
Our review of budgeting revealed that a sufficient number of hours was typically provided to meet EC programming needs and budget was rarely the rate-limiting step in the development of ECs. In some cases due to

Table 1. Level of programming effort and need for customization in edit check development.

Type of EC: Standard
Level of Customization: None. Standard edit checks include fields required, range checks, simple skip checks.
Level of Effort: Minimal. Involves configuration via existing tools and entering simple values.
Considerations: Even if data issues related to the check are not encountered often, the effort to implement is small and these should be implemented.

Type of EC: Custom-Coded (Reusable)
Level of Customization: Reusable custom code. Functions taken from similar studies and modified for use in developing checks for inclusion/exclusion criteria and simple cross form checks.
Level of Effort: Medium. As reused functions, only a modest level of programming is required to meet study needs.
Considerations: Consider how often data entry staff will encounter this check, as well as if there is an easy way to change it into a standard check.

Type of EC: Custom-Coded (Study-Specific)
Level of Customization: Study-specific custom code. These functions have never been used before, will probably never be used again, and are written just for the study.
Level of Effort: High. Has to be custom programmed and is so closely tied to study requirements that future reuse is unlikely.
Considerations: If not absolutely necessary, these should be avoided. First check for reusable code that can be substituted. Failing that, see if there is a systemic issue that requires the edit check in the first place.

Continued on page 7

Page 7


changing study requirements, the allocation of hours was shifted from one area of programming to another; however, this was not a pervasive issue.

Complexity of Project Requirements
The complexity of project requirements directly related to the level of programming effort needed to meet development objectives. As depicted in Figure 1, we observed that EC development for projects whose requirements related simply to the number of variables and forms in the project was neither complex nor time-consuming.

Development requirements related to evaluating relationships between variables and across multiple case report forms (CRFs) and the development of ECs to examine repeated blocks of data (either within log-type CRFs or across multiple study time points) required more complex ECs and more development time.

Finally, studies with complex eligibility requirements and complex randomization algorithms often required custom-coded ECs that were the most complex and time-consuming to develop. The development and testing of these produced the greatest need for increased effort.

Developer Workload
Developer workload also significantly impacted our ability to meet project development requirements. On average, data managers and developers worked on four to six projects at any one point in time. Because projects are in various stages of completion, developers were often required to quickly shift focus and effort from one project to another. While often unavoidable, we observed that these shifting priorities lengthened the time to project completion and generally decreased work efficiency. In particular, in cases where complex ECs were planned, substantial time for reorientation to programming requirements was necessary.

Discussion
We found that the ideal level of effort to apply in EC development is a function of the type of edit check required, complexity of project requirements and developer workload. The Considerations column in Table 1 provides guidance in determining the utility of various types of ECs. Standard ECs are an indispensable tool for robust system development and should be included in every data management system. Reusable Custom-Coded ECs are also routinely used and necessary for evaluating more subtle characteristics of the data. Developing Study-Specific Custom-Coded ECs should always be done with caution, as they require significant programming effort. In evaluating the need for programming of these one-time solutions, developers (and other analytic staff) should consider the aspects of the study design or implementation that are driving the need for custom coding to determine if there is a more systemic underlying issue with the study or a more parsimonious way to approach development of complex ECs.

The greatest impediment to applying the required effort was workload rather than project deadlines or budgetary issues. We found that project work ebbs and flows, with staff routinely responsible for multiple ongoing projects. These competing demands create unique challenges in being able to carve out time for development work and staying effectively oriented to more complex development tasks. These issues must be carefully managed to develop and maintain a supportive, productive data management ecosystem.

In summary, substantial effort is required to develop focused, useful ECs. Allocating an appropriate amount of time for these activities while ensuring DMs’ availability to complete the work will help ensure project success. Our best-practice recommendation is to apply Standard ECs first, and then develop other ECs in a staged manner according to carefully evaluated study and programming requirements to ensure robust data management systems.

References:

1. Good Clinical Data Management Practices, Society for Clinical Data Management, October 2013 edition.

2. US Food and Drug Administration. Guidance for Industry: Computerized Systems Used in Clinical Investigations. Rockville, MD: US Department of Health and Human Services; 2007.

Author Biographies:

Douglas Theriaque is a research programmer/analyst with more than 18 years of experience in biomedical informatics. He has expertise in Web-based application development, database development, data management, statistics, and programming methodologies. He is currently a research programmer/analyst at RTI International in North Carolina. Douglas can be reached at [email protected].

Dave LeBlond is a research programmer/analyst who has been working as a software engineer for 15 years and in the clinical industry for the last five years. He attended North Carolina State University, where he received a degree in Business Management. His current responsibilities include designing and creating edit specifications, and developing clinical data management systems. He is currently a research programmer/analyst at RTI International in North Carolina. Dave can be reached at [email protected].

Jeanette Auman is a research programmer/analyst with more than 18 years of experience in data management, software design, and development of a variety of computer systems used on multisite clinical research projects. She is currently a research programmer/analyst at RTI International in North Carolina. Jeanette can be reached at [email protected].

The Balancing Act: Edit Check Development in an Environment of Competing Demands Continued from page 6

Figure 1. Project requirements and edit check complexity. (Vertical axis: complexity of edit checks. Horizontal axis, in order of increasing complexity: number of study variables, number of study forms, relationships between forms, number of repeat instances, complex eligibility requirements, multiple randomization requirements.)

Page 8

eClinicalOS is a registered mark of Merge, an IBM Company. 2015 All rights reserved. 866.387.4257 | eClinicalOS.com

Flexible and cost-efficient Our cloud-based, modular platform means you can implement a full turnkey solution or start by integrating the world’s most popular endpoint adjudication module alongside your current platform.

Fast implementation benefits your research eCOS adapts to your existing processes, and your trial will be up and running in weeks, not months. Plus, you pay only for what you use – and nothing more.

Designed with you in mind We work hard to make things easy, because an intuitive system means more effective users and less time spent in training. Learn exactly what we mean with a free trial at www.meetecos.com.

Visit www.meetecos.com for an open-ended test drive

The ultimate

Cost effective | Flexible | Easy-to-use Data management platform for clinical research

Page 9

Automated Monitoring and Alerts for Interim Reporting in Phase II Clinical Trials
By Stacey A. Slone, Emily V. Dressler, Mark Stevens, Rachel W. Miller

Abstract
Background: Interim analyses are integrated into clinical trial designs for safety and feasibility. These designs are advantageous; however, they require real-time ascertainment of enrollment, toxicities and/or outcomes of interest. As clinical trials move to electronic data capture (EDC), the feasibility of automated notifications for outcomes of interest from the data source directly has improved. We present a series of SAS programs that access data directly to generate such notifications.

Methods: A set of SAS programs ascertains real-time clinical trial data and assesses the need for interim analyses. Included are a program specifying trial parameters and reading current data, a dataset of each subject’s study dates and status, and a program comparing observed data with the trial design. These programs collaborate to compare daily enrollment with trial parameters and send an automated e-mail alert when necessitated. For clinical trials which incorporate a futility safety stop, these programs also integrate the safety alert.

Results: The system has been implemented for ten investigator-initiated trials since inception three years ago; three enrollment and no safety triggers have been prompted. Implementation examples are presented to illustrate the mechanics of the system.

Conclusions: Our system has been a welcome resource for trial management. The logic used to develop the system could be adapted for other EDC systems and other valuable notifications.

Keywords: Interim Analyses, Trial Management, Monitoring

Background
Interim analyses are incorporated into clinical trial designs to allow the investigational team an opportunity to assess various monitoring goals, which may be based on enrollment, toxicities/events, or study timelines. Although clinical trial designs incorporating interim monitoring are preferable, they require real-time ascertainment of enrollment, toxicities and/or progressions. Timely notification of planned interim analyses requires direct communication between study staff and biostatisticians, as well as an understanding of the associated trial parameters. Barriers that delay data entry and notification of upcoming interim analyses can result from personnel and resource limitations1. Automating notifications improves timeliness; it helps ensure patient safety and may reduce costs1. Ideally the data will be cleaned and validated prior to the interim analyses being conducted2; therefore, clinical research staff must make prompt recording and source validation of the essential data a priority.

As clinical trials move to EDC, the feasibility of generating automated notifications of enrollments, toxicities and/or responses directly from the data source has improved. We present a series of SAS programs that access data directly from a data management system using customized Oracle data views, assess the trial status, determine if intervention is needed and, when needed, inform the appropriate trial staff. The system is demonstrated with an ongoing single-institution Phase II ovarian cancer trial.

Purpose
Many commercial and private EDC systems are available to coordinate clinical trial management, including protocol approval, regulatory issues and data collection. For many clinical trials, data entry is completed by the clinical research staff through web-based interfaces of electronic case report forms (eCRFs). Standard eCRFs are typically available in the EDC system and, if trial-specific items are needed, custom forms can be developed. However, the reporting and alerting features available with these systems can vary greatly. While our current clinical trial data management system has some functionality for alerts, these center on regulatory aspects of clinical trials and not on trial characteristics. Our objective was to create a supplemental notification system to send alerts based on enrollment status and interim analyses requirements.

Methods

The Electronic Data Capture System
Clinical trials collect a variety of demographic, treatment, adverse event, outcome and safety data, with each collected in its own eCRF. Most EDC systems have standard forms and allow customized eCRFs to be developed. Since each eCRF is stored as a unique database, data elements are consistent across cancer clinical trials at the institution. For custom eCRFs, the variable names can be standardized and used for multiple clinical trials with similar designs and outcomes.

While de-identified clinical trial data can be accessed from the EDC system’s web interface and downloaded, direct retrieval of clinical trial data via Oracle views of the underlying databases is also available. Customized Oracle views can be built to provide the biostatistician live access to the database containing consent, eligibility and randomization data. Our EDC system does not permit access to trial management data elements from the web interface, so customized views are a necessity. For convenience and accessibility, the SAS programs reside on the server with the EDC system’s Oracle databases.
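A read-only connection of this kind might be sketched in SAS as below, assuming the SAS/ACCESS Interface to Oracle is licensed; the libref, view name, column names and connection details are placeholders rather than the authors' actual configuration.

/* Read-only libref pointing at the EDC system's custom Oracle views. */
libname edcview oracle user=report_user password="XXXXXXXX"
                path=EDCPROD schema=EDC_VIEWS access=readonly;

/* Pull today's enrollment records for one protocol into WORK.        */
data work.today;
  set edcview.enrollment_v(keep=patid onstudy_dt protocol_id);
  where protocol_id = "11-GYN-XX";
run;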

Our SAS programs and sophisticated SAS macros3, tested using Unix SAS 9.3, work together to facilitate trial management based on real-time trial data. Each trial in the system has three components that work closely together: a trial specification program, a permanent enrollment dataset, and a design assessment macro. The system currently houses macros to assess a variety of common Phase II designs; however, we focus on the macro for Simon’s 2-Stage design, a commonly utilized single-arm Phase II trial design for response outcomes in cancer trials4. The underlying concept and setup is similar for other designs.

Trial Specification Program

The trial specification SAS programs are particular to individual clinical trials. Each initializes a set of macro variables with the clinical trial’s specifications, including sample size and interim analysis requirements. Figure 1 illustrates the sequential logic of the system using a single arm


Continued on page 10

Page 10: DATA - scdm.org · deliverables such as Study Data Tabulation Model (SDTM) data sets, define.xml file and sample CRFs. However, as of 01 October 2016 submission of Clinical Data Interchange

10 Return to CoverDATA BASICS 2015

Automated Monitoring and Alerts for Interim Reporting in Phase II Clinical TrialsContinued from page 9

ovarian trial with a Simon’s 2-Stage design4. The Simon’s 2-stage design is calculated using a predetermined enrollment number as the trigger for the interim analyses.

Hence, the study design parameters included in the trial specification program are the stage 1 sample size (N1), stage 2 sample size (N2) and requested “lag (L)”. The lag is determined by the study team and denotes when they begin receiving notifications of an approaching accrual limit. For example, if notification is to begin with a single accrual remaining, the L=1. For larger lags, notifications are sent after each accrual until the limit is met.

Other parameters incorporated into the program include the unique trial identifier from the EDC system, the corresponding SAS enrollment dataset and the study staff’s e-mail addresses. E-mail addresses are listed for the primary investigator, the biostatistician, the clinical research nurse and the project manager.
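A hedged sketch of what such a specification program might contain is shown below; the macro variable names and values (protocol, N1, N2, lag, dataset, recipients) are illustrative stand-ins, not the authors' production code.

/* Minimal sketch of a trial specification program.                     */
%let protocol   = 11-GYN-XX;                  /* EDC trial identifier    */
%let n1         = 9;                          /* Stage 1 sample size     */
%let n2         = 28;                         /* total sample size       */
%let lag        = 2;                          /* notify within L of limit*/
%let enrolldata = trials.gyn_xx_enroll;       /* permanent enrollment ds */
%let recipients = "pi@example.edu" "biostat@example.edu" "crn@example.edu";

/* The program would then read the day's data from the EDC Oracle view  */
/* (as in the earlier libname sketch) and hand off to the design macro. */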

Permanent Enrollment Dataset
The trial specification program initially creates a temporary SAS dataset and verifies the existence of the trial’s permanent dataset. When the dataset exists, the program compares the previous and current days’ datasets; if they match exactly, no new relevant data have been entered. Then the program overwrites the permanent dataset with the current day’s data and stops execution. However, when no permanent dataset exists or the previous and current datasets do not match, the trial program then calls the associated design macro to assess the current enrollment with the trial design elements to ascertain if a trigger e-mail is warranted. The variables vary depending on the triggers created, such as enrollment status or safety events, but typically include a patient identifier, the on-study date and the protocol identifier with one observation per enrolled patient.
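The "has anything changed today?" step could be sketched as follows, using PROC COMPARE's SYSINFO return code (0 means the two datasets match exactly); the dataset names, the macro variables and the %simon2stage call (the design assessment macro described next) are illustrative and tie back to the specification sketch above.

/* Compare today's extract (WORK.TODAY) with the permanent dataset.    */
%macro assess_if_changed;
  %local changed;
  %let changed = 1;
  %if %sysfunc(exist(&enrolldata)) %then %do;
    proc compare base=&enrolldata compare=work.today noprint;
    run;
    /* SYSINFO = 0 means no differences were found                      */
    %if &sysinfo = 0 %then %let changed = 0;
  %end;

  %if &changed %then %do;
    /* new or changed enrollment data: run the design assessment macro  */
    %simon2stage(data=work.today, n1=&n1, n2=&n2, lag=&lag);
  %end;

  /* in either case, today's extract becomes the new permanent dataset  */
  data &enrolldata;
    set work.today;
  run;
%mend assess_if_changed;
%assess_if_changed;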

Design Assessment Macro
The design assessment macro is a program that compares the observed data with the trial design. Since the design programs are SAS macros, they can create triggers for multiple trials by changing the trial specifications. In our example, the Simon’s 2-stage design macro compares the current accrual as recorded in the EDC system with the enrollment limits, N1 and N2 (see Figure 1), using the pre-specified L. If the current accrual is between N1-L and N1, then the program generates an automatic e-mail5, 6 to the study team relaying the number of accruals remaining until an interim analysis is required. When the accrual is equal to N1, a similar e-mail is generated stating an interim analysis is necessary. If the current enrollment is outside the aforementioned intervals, then the macro ends execution and the day’s temporary dataset is saved as the updated permanent trial dataset. The macros also assess the rare occasion where a day’s accrual oversteps a limit. Similar logic is followed for N2.
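A simplified, hedged version of such a design assessment macro is sketched below. It counts the current accrual, checks it against the notification windows, and sends an e-mail through SAS's FILENAME EMAIL engine (which requires the session to be configured for e-mail); the thresholds, addresses and wording are illustrative and rely on the macro variables defined in the specification sketch, not on the authors' code.

/* Minimal sketch of a Simon 2-stage design assessment macro.           */
%macro simon2stage(data=, n1=, n2=, lag=);
  /* current accrual = number of enrolled subjects in today's data      */
  proc sql noprint;
    select count(*) into :accrual trimmed from &data;
  quit;

  %if (&accrual >= %eval(&n1 - &lag) and &accrual <= &n1) or
      (&accrual >= %eval(&n2 - &lag) and &accrual <= &n2) %then %do;
    filename alert email to=(&recipients)
      subject="Enrollment Alert for Protocol=&protocol";
    data _null_;
      file alert;
      put "As of %sysfunc(today(), date9.), &accrual subjects are enrolled.";
      %if &accrual = &n1 or &accrual = &n2 %then %do;
        put "An enrollment limit has been reached. Please run the interim analysis.";
      %end;
      %else %do;
        put "An interim analysis is approaching (limits: &n1 and &n2).";
      %end;
    run;
    filename alert clear;
  %end;
%mend simon2stage;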

For trials with safety triggers, the safety alert is built into the trial parameters program and assessed using similar comparison logic. For example, another Phase II trial studying the effects of a nutritional supplement on length of hospital stay for hematologic malignancy patients was concerned with Grade 3 Bearman toxicities. An eCRF was created to collect this particular adverse event information and a safety alert section was added to the trial specification e-mail.

Program Automation
The trigger programs and datasets reside on a UNIX server. The key to the automation is an executable file, which consecutively calls the trial specification files. The executable file runs daily at 7 a.m.

Implementation Example
A single-arm Phase II investigator-initiated ovarian trial with a Simon’s 2-stage design focusing on neoadjuvant chemotherapy with surgical debulking utilizes the trigger system. The sample size is 28 women with 9 women in Stage 1. The primary objective is to estimate the rate of maximal surgical cytoreduction, i.e., no gross residual disease. The study team chose notification when enrollment was within 2 patients of a limit, so the L=2. Since the interim analyses are based on accrual, this trial’s permanent trial dataset contains one observation per patient and captures patient identifier, on-study date and protocol identifier. After the 7th patient was enrolled, the system generated an e-mail stating the interim

Continued on page 11

Figure 1. System flowchart: flow chart example of the sequential logic utilized by the SAS programs that provide enrollment status alerts. The daily enrollment dataset created by the EDC system (only patient ID, enrollment date and protocol ID are kept) is checked against the corresponding permanent enrollment dataset. When no permanent dataset exists, or today’s dataset differs from the permanent dataset, the design macro compares current enrollment with the trial specifications and notifies the PI of enrollment status via e-mail; the permanent dataset is then replaced with today’s dataset. The design macro interprets accrual as follows:

Accrual < (N1 – L), or N1 < Accrual < (N2 – L): Stage 1 or 2 accrual period
(N1 – L) ≤ Accrual < N1, or (N2 – L) ≤ Accrual < N2: Stage 1 or 2 notification period
Accrual = N1, or Accrual = N2: Stage 1 or 2 target acquired

Page 11


Automated Monitoring and Alerts for Interim Reporting in Phase II Clinical Trials Continued from page 10

analyses should be performed after 2 more accruals. A similar e-mail was sent after the 8th patient was enrolled (see Figure 2).

Finally, when the 9th patient was enrolled, an e-mail was triggered stating the stage 1 accrual was met (see Figure 3). After data adjudication, the biostatistician completed the interim analyses; the required number of women with no gross residual disease was met and enrollment continued. The trial is still accruing and daily monitoring continues. The next e-mail will be generated when accrual reaches 26 (28 – 2 participants).

Discussion
Over the past three years, the system has been activated for 10 studies. Currently, 7 clinical trials have active programs running daily. Three of the trials have triggered alerts that notified study staff of upcoming interim analyses and/or trial accruals met. No safety alerts have triggered. For investigator-initiated trials without interim analyses, an independent program runs to check total accrual. During the implementation phase of new investigator-initiated trials, possible triggers are being considered and added to the data management of the trials. The macros can be accessed at http://ukhealthcare.uky.edu/markey/biostats-tools-software.

Advantages
The system provides enrollment confirmations for the primary investigator and biostatistician to ensure interim analyses are performed in a timely manner since the study staff is responsible for multiple clinical trials. Depending on the triggers requested, serious adverse events or other issues can also be reported and tracked in real time. With further minor tweaks to the programs and macros, the daily e-mail could create a daily or weekly PDF report of all investigator-initiated trial enrollments.

Limitations
Our system does have some limitations. If SAS 9.3 and the EDC system do not reside on the same server, the automation of the program to run daily without intervention would be challenging. However, the general concepts would still be applicable. Perhaps the largest limitation is the data entry time. The program is only beneficial if eCRF data are entered in a timely manner. At our institution, trigger generating elements for a trial must be entered into the EDC system within 24 hours. These elements are detailed prior to study initiation, so all study personnel are aware of the requirement.

Conclusion
Currently, the trigger system has been implemented for ten investigator-initiated trials at our institution. Enrollment triggers were sent for three trials and fortunately, no safety triggers have been prompted. Our system has been a welcome addition to the data management of our cancer clinical trials. The daily accrual e-mails are referenced frequently during clinical trial reviews. The logic used to develop the e-mail triggers is quite general and could easily be adapted for other EDC systems which allow direct access to the underlying database.

Acknowledgements
The authors thank Ms. Donna Gilbreath, Dr. Heidi Weiss, and Ms. Meng Liu for their review and comments which strengthened the manuscript. This research was supported by the Cancer Research Informatics and Biostatistics and Bioinformatics Shared Resource Facilities of the University of Kentucky Markey Cancer Center (P30CA177558).

References
1. Day RS. Failsafe automation of Phase II clinical trial interim monitoring for stopping trials. Clinical Trials. 2010; 7: 78-84.

2. Nolen T, Dimmick B, Ostrosky-Zeichner L, Kendrick AS, Sable C, Ngai A and Wallace D. A web-based endpoint adjudication system for interim analyses in clinical trials. Clinical Trials. 2009; 6: 60-6.

3. SAS 9.3 Macro Language: Reference. Cary, NC: SAS Institute Inc., 2011.

4. Simon R. Optimal two-stage designs for phase II clinical trials. Control Clin Trials. 1989; 10: 1-10.

5. Tilanus E. Sending E-mail from the DATA Step. SAS Global Forum. San Antonio, TX: SAS Institute Inc, 2008.

6. Worden J and Jones P. You’ve Got Mail – E-mailing Messages and Output Using SAS Email Engine. In: Inc SI, (ed.). Twenty-Ninth Annual SAS Users Group International Conference. Montreal, Canada: SAS Institute Inc, 2004.

Figure 2: Trigger E-mail Example with 1 Patient Remaining

FROM: [email protected]
DATE: Tuesday, September 11, 2012, 7:00 AM
TO: [email protected], [email protected], [email protected]
RE: Enrollment Alert for Protocol=11-GYN-XX

An interim analyses is planned for protocol=11-GYN-XX after 9 subjects are enrolled.

As of 11SEP2012, 8 subjects are enrolled. Interim analyses should be conducted after 1 additional subject is enrolled.

--Statement of Confidentiality--
This message (and any attachment) is intended only for the recipient and may contain confidential and/or privileged material. If you have received this in error, please contact the sender and delete this message immediately. Thank you.

Figure 3: Trigger E-mail Example at Interim Enrollment

FROM: [email protected]
DATE: Thursday, September 13, 2012, 7:00 AM
TO: [email protected], [email protected], [email protected]
RE: Enrollment Alert for Protocol=11-GYN-XX

An interim analyses is planned for protocol=11-GYN-XX after 9 subjects are enrolled.

As of 13SEP2012, 9 subjects are enrolled. Please run interim analyses.

--Statement of Confidentiality--
This message (and any attachment) is intended only for the recipient and may contain confidential and/or privileged material. If you have received this in error, please contact the sender and delete this message immediately. Thank you.

Continued on page 12

Page 12


Author Biographies:

Stacey Slone is a biostatistician in the Biostatistics and Bioinformatics Shared Resource Facility of the Markey Cancer Center at the University of Kentucky. She earned her MS in statistics from the University of Kentucky and has a wealth of experience in all phases of clinical trials as well as population based studies. She currently is involved with multiple investigator-initiated trials, is proficient in statistical SAS programming and leads the Data Management group within the SRF.

Emily Dressler is an assistant professor in the Division of Cancer Biostatistics at the University of Kentucky and a faculty member of the Biostatistics and Bioinformatics Shared Resource Facility of the Markey Cancer Center. She earned her PhD from the Medical University of South Carolina and has expertise in Phase I adaptive trial designs based on the continual reassessment method. She has been involved in the development of many early phase clinical trials and currently oversees the implementation, management, and interim monitoring of 8 investigator-initiated clinical trials as lead biostatistician.

Mark Stevens has 18 years of experience working as a data analyst. For the last 10 years he has served with the Markey Cancer Center Cancer Informatics Research Facility as a database analyst and administrator of the Oncore Enterprise Research System, the cancer center’s CTMS (Clinical Trials Management System) and EDC (Electronic Data Capture) System. He has prior experience working for two mid-size CROs (Clinical Research Organizations) as a statistical programmer and quality assurance auditor where he worked on clinical trials (Phase I - Phase III) and registries (Phase IV) in multiple therapeutic areas. He also has worked as a statistical programming consultant for social sciences and as a data research analyst for institutional research at the University of Kentucky.

Rachel Miller is an Assistant Professor in the Department of Obstetrics and Gynecology, Division of Gynecology Oncology, within the College of Medicine at the University of Kentucky. She is a clinician-scientist with interests in the areas of ovarian cancer including survivorship issues, specifically chemotherapy-induced cognitive impairments; early detection of ovarian cancer; novel strategies in the treatment of advanced ovarian, fallopian tube, and peritoneal cancers; and population-based ovarian cancer research. She has experience with investigator-initiated clinical trial development, has participated in numerous cooperative group trials and has played a key role in a longstanding ovarian screening study.

Automated Monitoring and Alerts for Interim Reporting in Phase II Clinical Trials Continued from page 11

Page 13

Seven Simple Rules to Better Data Management and Improved Pharmacovigilance in Non-Interventional Studies
By Tomás O’Mahoney, Ander Zaldumbide, Conal Nolan

Introduction
Non-interventional studies (NIS) are an essential part of the clinical development program. The term non-interventional includes post-marketing surveillance studies (PMS), as well as disease or drug registries, observational studies and mandated post authorization safety studies (PASS). Over 18% of the clinical studies registered on the clinicaltrials.gov site are observational studies (35,146 as of March 10, 2015)1. In clinical trials, the efficacy of an investigational product is explored in a patient population which has been selected according to strict inclusion and exclusion criteria. In non-interventional studies, patients are treated under real life conditions. The fundamental difference between a clinical trial and a NIS is that the data collection or patient-participation in the NIS does not interfere with the choice of treatment, sample collection, procedures, or the treatment itself, all of which follow standard practice of care. The patients will receive the same treatment and diagnostic procedures as they would have received if they were not included in the study.

Similar project tasks are performed in clinical trials and NIS (e.g. protocol writing, Ethics Committee (EC) applications, electronic Case Report Form (eCRF) specification and data management). This often leads pharmaceutical companies and Contract Research Organisations (CROs) to believe that they can follow the same processes and standard operating procedures (SOPs). What is unique to NIS is perhaps the sample size; we see patient numbers increasing to thousands, and shorter observation periods than in randomized clinical trials (RCTs). A recent national study we performed involved 7,000 patients across 700 sites in the Russian Federation. The approach outlined in this paper was successfully implemented in the above referenced national study with respect to the eCRF design and data management.

Over the past decade, we have seen the industry use adaptive trial design to reduce patient numbers in RCTs, yet we also see much larger numbers of patients in NIS. This begs the question: is the blanket application of procedures developed for earlier phase studies creating inefficiencies in NIS? At the European CRO Federation (EUCROF), we know from shared experience that running NIS along the same processes can lead to inefficiencies and added cost for both the clinical investigator teams and our staff.

The outsourced clinical trials market will increase revenues from 2013 to 2023, achieving strong growth with late phase trial services driving growth, according to a recent marketing report2. For quite some time there has been an opinion that NIS evidence is complementary to RCTs3 and data are usually published and made available for the scientific community; meaning that results have to be robust and reliable.

Rule 1: Understand and Identify the Important Data
Primary outcome data and safety data are important and therefore the data validation focus must be here. After that we need to stop and think: which data address the objectives?

In an observational setting, it is quite common for sponsors to add several secondary objectives to make the most of the large population examined and the period of observation. It can happen that, from the sponsor’s point of view, some of these secondary objectives could have the same strategic importance as the primary objective. Often the relative importance of the secondary objective vs. the primary one can be ambiguous due to the specific needs of the sponsor. As a result, the relative importance of the secondary data should be explicitly detailed so the correct level of data scrutiny and cleaning is applied.

I hear it said that if we don’t analyse the data, we should not collect it. However, at the design stage it is often very difficult for a sponsor representative to answer the question: “Do we really need this data?” Therefore, rule 2 may assist.

Rule 2: Only Collect the Necessary Detail
The protocol writer does not always review the work of the CRF designers, and we often use Sponsor Standard Forms. These forms are standard for early phase work, but when we conduct post approval work we should really develop new standard templates.

We must also ask whether we need to run all the standard checks that we use in other studies, i.e. start date before stop date, visit sequence dates, height checks, and weight checks. This uncertainty is compounded by the fact that CRF and database design in NIS has largely incorporated standard forms taken from clinical trials CRFs, e.g. concomitant medication pages, Serious Adverse Event (SAE) pages, vital sign pages and laboratory pages. These standard forms may not be appropriate to the more important study outcomes in NIS, but when utilised, data management feel obliged to run all the usual checks. Yet it is strange that, despite all these extensive checks in RCT software, we still have a lot of queries going back to the clinics. Should we not expect zero data queries?

In NIS, it may be that concomitant medications information is collected to support Adverse Drug Reaction (ADR) cases. Therefore, we do not require the same level of detail as in RCTs. It is probably sufficient to have the medication, the dose and an estimate of how long the patient has been receiving this medication, e.g. less than 1 year, 1-5 years or greater than 5 years. A simple design change like this can significantly reduce the data management and clinical staff workload. See Figure 1 for a suggested late phase concomitant medications data template. This will produce zero data queries.


Figure 1: Late Phase concomitant medications page template

Continued on page 14

Page 14: DATA - scdm.org · deliverables such as Study Data Tabulation Model (SDTM) data sets, define.xml file and sample CRFs. However, as of 01 October 2016 submission of Clinical Data Interchange

14 Return to CoverDATA BASICS 2015

Seven Simple Rules to Better Data Management and Improved Pharmacovigilance in Non-Interventional Studies Continued from page 13

Similarly, the standard Adverse Event/Serious Adverse Event (AE/SAE) forms for clinical trials are very often stand-alone case studies processed by the Pharmacovigilance department and used to build up the product safety profile. This results in a good picture of the safety profile of the product by the time of product launch. Of course, we can learn more from the non-controlled sample population that participate in NIS and we know that under-reporting of ADR is well documented in the literature4.

The important thing should be that all suspected ADRs are reported and the initial detail should be secondary. We would recommend that the big four are recorded, (the product, the event, the patient identifier and the credible reporter) plus one (the causality, i.e. is it related to the product?). If we capture minimal data, we are encouraging the reporting of events by making the process simpler – ask for less and you will get more. We will have the initial report with essential information – this is a good thing. By simplifying the reporting, we may get a better safety profile of the drug in real world practice. Marketing authorisation holders must have mechanisms in place to collect full and comprehensive case information at the time of initial reporting, in order to allow meaningful assessment of individual cases and expedited reporting of valid Individual Case Safety Reports (ICSRs) to competent authorities as applicable. The point here is that the eCRF should only contain the initial minimum information and this can be followed up through Pharmacovigilance processes for support data that are not included in the CRF. Uncoupling the study database from the PV database is not intuitive, but it will be more efficient.

In accordance with the current legislation5, only serious ADRs (SARs) are required to be reported to the competent authorities in NIS (see Figure 2). All SARs should be reported within 15 days of becoming aware of the adverse reaction. The periodic safety update report (PSUR) is intended to provide an evaluation of the risk-benefit balance of a medicinal product and will be submitted by marketing authorisation holders at defined time points during the post-authorisation phase. However, a PSUR may not be required for low-risk or older products: generic medicinal products (DIR Art 10(1)), well-established use medicinal products (DIR Art 10a), homeopathic medicinal products (DIR Art 14) and traditional herbal medicinal products (DIR Art 16a)6. Collection of non-related adverse events, including serious ones, is not appropriate to NIS. Yet we still often see the reporting of AEs written into the study protocol.

Figure 2: AE/SAEs, SARs, SUSARs. Serious adverse drug reactions are more relevant to NIS.

Perhaps that is because GVP Module VI, in the section on reports from non-interventional studies, indicates that only reports of adverse reactions where a possible causal relationship with the suspected medicinal product is considered by the primary source or the marketing authorisation holder should be reported; other reports of events should be included in the final study report. These other reports of events can be interpreted as AEs of special interest, events initially reported as ADRs that turn out to be non-related on review of the ICSR, and non-serious expected ADRs, which do not have to be reported to the competent authority.

There is a strong argument for removing the collection of AEs from NIS protocols altogether, and doing so is compliant with the GVP regulation of July 2012, Module VI, Reporting of Adverse Reactions.

Rule 3: Think of the Relevance of Data Queries

When your NIS does have queries, asking clinical staff to resolve only those data queries that are essential to the descriptive analysis is a good start.

More challenging, perhaps, is to think about each query and whether the data point in question is critical to the physician's treatment decisions. Let's look at an example:

The Summary of Product Characteristics (SmPC) for a product specifies that a diagnostic procedure (an MRI) is advisable within 90 days of deciding to prescribe the drug.

The observational data collected show that physicians are often prescribing the drug without performing the diagnostic procedure within the 90 days. Perhaps they are using older data from an MRI performed more than 90 days earlier, or other diagnostic evidence, to make the treatment decision. We have data, but not the data we expect.

Do we raise this as a query?
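If the team decides that such cases warrant review rather than an automatic query, a simple listing can surface them for medical assessment. The sketch below assumes hypothetical data set and variable names, with both dates stored as SAS date values:

* Flag for review, not query, patients whose MRI predates the treatment
  decision by more than 90 days, or who have no MRI date recorded;
DATA mri_review;
  SET study.baseline;
  days_gap = trt_start_dt - mri_dt;
  IF MISSING(mri_dt) OR days_gap > 90 THEN review_flag = "Review";
  ELSE review_flag = "OK";
RUN;

PROC FREQ DATA = mri_review;
  TABLES review_flag / MISSING;
RUN;

Whether anything on that listing becomes a query is then a considered decision, not an automatic one.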

We must strike a balance between data collection and burdening the clinical team or the patient. A well-designed CRF, tightly reflecting the protocol, will also assist with this.

NIS-DM Rule 4: Simplify the Query Process Flow

Resolving data queries is easier when fewer people are involved. If the Data Manager and the clinic can communicate directly to resolve the query, then we should keep the monitor/CRA out of the data cycle. We can, of course, provide read-only access to the query for the monitor/CRA and allow the monitor to write notes (not considered queries or part of the Data Management (DM) process). If the monitor/CRA wants to raise a query, they should be able to advise the Data Manager of their observations so that the Data Manager can consider this.

The query to be resolved should have minimal options, if possible. We would suggest just two query-response options: change the data or confirm it is correct.

NIS-DM Rule 5: Be "Smarter" with Edit Checks

With edit checks, one could consider categorising the data checks and planning how best to handle a query accordingly; such data categories could include "mandatory and defined", "acceptable ranges and exceptions" and "laboratory data and significance". Management of each category can be part of the NIS-DM plan.

For example, a scale such as the EDSS (Expanded Disability Status Scale) can be mandatory in many multiple sclerosis studies. It is a well-defined scale, i.e. all entries must be 0-10. Therefore, with this mandatory/defined category we DO NOT ACCEPT other values; all other values are blocked using an edit check at the time of data entry. No exceptions and no queries.
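In an EDC system this is configured as a blocking check at the point of entry. If the same rule is reproduced as a back-end listing, a minimal sketch might look like the following (the data set and variable names are hypothetical):

* Back-end reproduction of the blocking range check: EDSS values must be 0-10;
DATA edss_out_of_range;
  SET study.edss;
  IF NOT MISSING(edss) AND (edss < 0 OR edss > 10) THEN OUTPUT;
RUN;

PROC PRINT DATA = edss_out_of_range;
  VAR usubjid visit edss;
RUN;

Because the value is blocked at entry, this listing should come back empty, which is exactly the point: no exceptions and no queries.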

"SMARTER" means understanding that it is real-world data that we are collecting, which by its nature will be more variable than clinical trial data. Understanding this, we can categorise the data, allowing us to make an informed decision on whether to raise a query or not. We must look at the context of the data to decide whether the data have a good probability of being correct.

"SMARTER" also means looking at the data across sites and over time. Critical data can be compared between sites and over the course of the study. This will highlight any systemic issues with data collection and allow a corrective or preventative action to be considered while the study is still operational.
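A few lines of standard descriptive output are usually enough for this kind of review; the data set and variable names here are hypothetical:

* Compare a critical variable across sites and visits to spot systemic
  data collection issues while the study is still running;
PROC MEANS DATA = study.visits N NMISS MEAN STD;
  CLASS siteid visit;
  VAR edss;
RUN;

A site whose counts, missingness or means drift away from the others is a prompt for a corrective or preventative action rather than a batch of individual queries.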

Rule 6: Understand the Analysis Requirements of the Protocol before Writing a Data Validation Plan

The analysis section of the protocol is important. Data management personnel must understand which data are important to the statistician/analysts, including what baseline data are required for the analysis. The data management plan must reflect these needs.

When a descriptive data analysis is required and no confirmatory statistics will be applied, it is equally important to work towards a clean data set to support the analysis.

Do not, however, employ the standard checks that are routinely used in interventional clinical trials. Ask yourself “what can be eliminated from the standard list of checks?”

Equally important is planning for missing data. You will have missing data when the patients are treated under differing standards of care across different institutions and regions.

Evaluate and communicate the availability of key data elements which make up the core data set. The statistician/analyst should define the criteria for handling missing data in the analysis plan.
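One hedged sketch of such a completeness report, assuming a hypothetical core data set and hypothetical key variables:

* Availability of the key data elements that make up the core data set, by site;
PROC SQL;
  SELECT siteid,
         COUNT(*)                 AS n_patients,
         SUM(NOT MISSING(edss))   AS n_with_edss,
         SUM(NOT MISSING(mri_dt)) AS n_with_mri_date
  FROM study.core
  GROUP BY siteid;
QUIT;

These figures can feed directly into the statistician's rules for handling missing data in the analysis plan.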

Rule 7: Write an NIS Data Validation Plan

In effect, we are recommending the same approach to the preparation of any data validation plan or data management plan. Do not add in "all the usual" rules. Do not just reuse the early phase template or SOP.

For example, in most situations it is relevant to have a good classification of medications and diseases, by coding with the WHODrug dictionary and its Anatomical Therapeutic Chemical (ATC) classification, or with the Medical Dictionary for Regulatory Activities (MedDRA). It is then important to ensure that all verbatim terms are legible. It is essential that the data manager understands the purpose of coding data and why it is performed. It may not be necessary in all studies, e.g. in the case of a mature product, and if you choose not to code upfront, coding can always be completed at a later stage.

Prepopulating the AE form with terms that can be auto-coded is another approach to reduce the number of terms that require coding.
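One way to picture the benefit: terms picked from a pre-populated list, or that match the dictionary exactly, need no manual coding at all, so only the remaining free text does. The sketch below assumes hypothetical data set, dictionary and variable names; it is not the licensed WHODrug or MedDRA loading process:

* Auto-code by exact match against a coding dictionary; unmatched verbatim
  terms are the only ones left for the coder to review;
PROC SQL;
  CREATE TABLE coded_terms AS
  SELECT a.usubjid,
         a.verbatim,
         b.preferred_term
  FROM study.ae_terms AS a
       LEFT JOIN dict.terms AS b
         ON UPCASE(STRIP(a.verbatim)) = UPCASE(STRIP(b.dictionary_term));
QUIT;

Rows where preferred_term comes back missing are the ones that genuinely need a coder's attention.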

In conclusion, NIS-specific standards and processes need to be developed. As we work through the process of data cleaning, it is important to liaise with the data analysis team and to understand the study reporting requirements. Standards used in RCTs should not be applied unchanged to the NIS plan.

We are working in an environment where the standard of care differs across centres and countries, and we must therefore adjust our processes to produce data of good quality. NIS can produce good results, but it requires the processes that we use to be reviewed and adjusted. Data collection must be well defined, limited and focused where possible. A good CRF is closely aligned with the primary objectives of the study.

Along with this, we suggest developing an NIS data validation plan that incorporates some flexibility for data review, which may lead to decisions NOT to query real-world data. Expect missing data and plan how to handle it. Do not treat data queries as inevitable; change the thinking on this.

While a lot of these recommendations may be obvious, what is more obvious to us is that the same mistakes are being made across many studies by different stakeholders. So, to all multifunctional members of your NIS team, our advice is this: to enable robust, efficient NIS, do not simply edit the current SOPs. Instead, invest time in writing new NIS procedures and design eCRFs that are appropriate to NIS.

References:

1. http://clinicaltrials.gov/ct2/resources/trends

2. Pharma Clinical Trial Services: World Market 2013-2023, published in January 2013 by Visiongain.

3. Thadhani R. Formal trials versus observational studies. In: Mehta A, Beck M, Sunder-Plassmann G, editors. Fabry Disease: Perspectives from 5 Years of FOS. Oxford: Oxford PharmaGenesis; 2006. Chapter 14.

4. Hazell L, Shakir SA. Under-reporting of adverse drug reactions: a systematic review. Drug Saf. 2006;29(5):385-96.

Calderón-Larrañaga A, et al. Underreporting of recognized adverse drug reactions by primary care physicians: an exploratory study. Pharmacoepidemiology and Drug Safety. 2011;20(12):1287-1294.

5. GVP regulation July 2012 Module VI Reporting of Adverse Drug Reactions.

6. GVP regulation July 2012 Module VII Periodic Safety Update Report.

EUCROF, the European CRO Federation, aims to promote clinical research and to support close relationships and the exchange of information between member associations. The Federation develops training and educational programmes for clinical research and shares information on developments in clinical research with health care professionals at international level. EUCROF consists of members and associated members from 17 countries and represents 300 member CROs and over 15,000 employees.

Late Phase Working Group: The mission of this working group is to promote good practices in the conduct of late phase studies, including observational and non-interventional studies, through the sharing of knowledge, competence, expertise and skills, and to be a recognised stakeholder among industry and regulators in the conduct of such studies in Europe and globally. www.eucrof.eu

For further correspondence contact: [email protected]


Author Biographies:

Tomás O’Mahoney is co-founder of RealWorldEDC and has over 20 years of international clinical trial experience in Japan, the UK and Ireland. As Observational and NIS Study Operations Director at RealWorldEDC, Tomás works with sponsor companies to interpret the operational challenges of each study and to make the EDC practical for each stakeholder. Tomás holds a B.Sc. (Pharmacology) from the National University of Ireland Dublin and an MBA from UCD’s Smurfit Graduate School of Business. Tomás is a member of EUCROF’s (the European CRO Federation) Late Phase Working Group.

Conal Nolan, B.Sc., is a Data Manager and eCRF Designer with clinical trial endpoint/RealWorldEDC, focused on late phase research, and an Oracle OCA. Conal is a science graduate with experience in software development and rollout to international sites in the clinical research, laboratory software and logistics industries.

Ander Zaldumbide is a Project Leader who has developed his career in the clinical research industry, focused mainly on the observational research field. He started his research career 6 years ago as a project leader managing local observational studies and soon moved to international project management. Prior to that, he worked as a Pharmacy Manager in the UK for 5 years, which allowed him to better understand patients’ and prescribers’ needs. Ander holds an MA in MDCIF (Scientific Departments of the Pharmaceutical Industry) from the ESAME Foundation and the University of Barcelona, Spain, and a BSc in Pharmacy from the University of Navarra, Pamplona, Spain. He is a member of and an active contributor to the EUCROF Late Phase Working Group.


Improving patient safety worldwide

WHODrug™ - the independent, high-quality medical reference source built on 40 years of pharmacovigilance excellence.

• Promotes patient safety by exploring the benefits and risks of medicines

• Transforms medicine safety data into knowledge and best practice

INSPIRE. ENGAGE. TRANSFORM.

Visit our booth 408.

www.who-umc.org



Ask SASsy
By Kelly Olano

Dear SASsy,

I have used the LAG function in SAS to return the value of a variable from the previous observation, but what about a LEAD function? I need to return the value of a variable from the next observation.

Lagging Behind in Laguna Beach

Dear Lagging,

Unfortunately, the LEAD function does not exist as a SAS function, simply because SAS reads data sequentially: while previous values can be accessed, SAS cannot look ahead to the values of the next observation while processing the current one. There are, however, methods one can use to simulate a lead function. Probably the most obvious and most used method is to sort a data set in descending order and then apply the LAG function to create lead values. The data must then be sorted a second time to get the data back to the original order. An example follows:

DATA a;
  INPUT visit x;
  lagx = LAG(x);
  DATALINES;
1 10
2 20
3 30
4 40
5 50
6 60
7 70
8 80
;
RUN;

PROC SORT DATA = a OUT = b;
  BY DESCENDING visit;
RUN;

DATA c;
  SET b;
  BY DESCENDING visit;
  leadx = LAG(x);
RUN;

PROC SORT DATA = c;
  BY visit x;
RUN;

The PROC PRINT of data set c is:

Obs    visit     x    lagx    leadx
  1        1    10       .       20
  2        2    20      10       30
  3        3    30      20       40
  4        4    40      30       50
  5        5    50      40       60
  6        6    60      50       70
  7        7    70      60       80
  8        8    80      70        .


Another, perhaps easier, way to create a LEAD variable is by using the DATA step, the MERGE statement, and some data set options. The first step in creating a lead variable is to merge the a data set with itself. The key is to add the FIRSTOBS= data set option to the second data set. The FIRSTOBS=n data set option specifies the first observation that SAS will begin reading in the data set. By default, FIRSTOBS is set to 1 because we usually want processing to begin with the first observation. But when FIRSTOBS=2 is added as an option to the second merge data set, the first observation and second observation can be accessed at the same time. It is important to also include the KEEP= and RENAME= data set options so that the x variable has a unique name and the other variables do not write over each other. An example using data set a above follows:

DATA example2;
  MERGE a
        a(FIRSTOBS=2 KEEP=x RENAME=(x=leadx));
RUN;

The PROC PRINT of data set example2 is:

Obs    visit     x    lagx    leadx
  1        1    10       .       20
  2        2    20      10       30
  3        3    30      20       40
  4        4    40      30       50
  5        5    50      40       60
  6        6    60      50       70
  7        7    70      60       80
  8        8    80      70        .

As you can see, the results of the two methods are identical. Hopefully, this adds a useful tool to your SAS programming toolbox!

Happy Programming!


Author Biography:

Kelly (SASsy) Olano is a Research Database Programmer for the Data Management Center within the Division of Biostatistics and Epidemiology at Cincinnati Children’s Hospital Medical Center. She has had over 10 years of experience in SAS programming and database design for clinical trials and learning networks. She received her Bachelor of Science degree in Psychology/Mathematics and Bachelor of Arts in Education from Northern Kentucky University. She is a member of the Society for Clinical Data Management and the Greater Cincinnati SAS User’s Group. She is also a SAS Certified Base Programmer for SAS 9.

ARE YOU A DATA CHAMPION?

SCDM 2014 NextGen Technology Innovation Award Finalist!
SCDM 2014 NextGen Technology Innovation Award WINNER!

Use DATA and ANALYTICS TOOLS that turn CDMs into superheroes with power to raise trial quality.

ASK US ABOUT COMPASS.
www.bioclinica.com/compass


Submission Requirements


Publication Policy

We welcome submission of materials for publication in Data Basics. Materials should preferably be submitted in electronic form (MS Word). Acceptance of materials for publication will be at the sole discretion of the Editorial Board. The decision will be based primarily upon professional merit and suitability. Publication may be edited at the discretion of the Editorial Board.

Neither SCDM nor the Data Basics Editorial Board endorses any commercial vendors or systems mentioned or discussed in any materials published in Data Basics.

Advertising Policy

AD RATES**    x1            x2                    x3                   x4
FULL Page     $1,064 each   $1,008 each ($2,016)  $960 each ($2,880)   $906 each ($3,624)
HALF Page     $740 each     $700 each ($1,400)    $670 each ($2,010)   $630 each ($2,520)
QTR Page      $450 each     $425 each ($850)      $402 each ($1,206)   $378 each ($1,512)

**Ads are net, non-commissionable.

Advertisers purchasing multiple ad packages will have the option of placing those ads anytime within the 12-month period following receipt of payment by SCDM.

Quarter Page = (3 5/8 inches x 4 7/8 inches) Half Page-Vertical = (3 5/8 inches x 10 inches)

Half Page-Horizontal = (7 1/2 inches x 4 7/8 inches) Full Page = (7 1/2 inches x 10 inches)

MECHANICAL REQUIREMENTS: Do not send logo/photos/images from word processing software, presentation software or websites. Files should be saved in the native application/file format in which they were created at a resolution of 300 dpi or higher. Acceptable file formats include AI, EPS and high resolution PSD, JPEG, TIF and PDF.

PAYMENT: Payment must be received with advertisement. Space reservations cannot be made by telephone. There is NO agency discount. All ads must be paid in full.

CANCELLATIONS: Cancellations or changes in advertising requests by the advertiser or its agency five days or later after the submission deadline will not be accepted.

GENERAL INFORMATION: All ads must be pre-paid. Publisher is not liable for advertisement printed from faulty ad materials. Advertiser agrees to hold SCDM harmless from any and all claims or suits arising out of publication of any advertising. SCDM assumes no liability, including but not limited to compensatory or consequential damages, for any errors or omissions in connection with any ad. SCDM does not guarantee placement in specific locations or in a given issue. SCDM reserves the right to refuse or pull ads for space or content.

Please submit all forms, artwork, and payments to:

Global Headquarters
Society for Clinical Data Management, Inc
Boulevard du Souverain, 280
B-1160 Brussels, Belgium
Tel: +32-2-740.22.37
Fax: [email protected]

North America Office
Society for Clinical Data Management, Inc
7918 Jones Branch Drive, Suite 300
McLean, VA 22102, USA
Tel: +1-703-506-3260
Fax: [email protected]

India Office
Society for Clinical Data Management, Inc
203, Wing B, Citipoint
(Near Hotel Kohinoor Continental)
J. B. Nagar, Andheri-Kurla Road
Andheri (East), Mumbai – 400059, India
Tel: +91-22-61432600
Fax: [email protected]

Authors: For each article published, authors receive 0.2 CEUs.

Disclaimer: The opinions expressed in this publication are those of the authors. They do not reflect the opinions of SCDM or its members. SCDM does not endorse any products, authors or procedures mentioned in this publication.

