The data management team at DCU is involved in a wide range of tasks to ensure timely and accurate collection and processing of data. Prior to study initiation, the data management team digitizes the study protocol, develops the case report forms, conducts database end-user validation, and provides user training. During the trial operation period, data managers oversee the quality and efficiency of trial conduct and clinical data collection across all clinical sites, and provide instructions and technical support for WebDCU™ users.

DCU’s data management services include digitalization of the study protocol, CRF development, data quality assurance, and WebDCU™ support.

Scope of Data Management
Digitalization of the Study Protocol
Database development starts with digitalizing the study protocol into the WebDCU™ architecture. Data managers first determine the data points to be collected, based upon the study protocol. Next, the protocol is separated into distinct study visits, such as Baseline, Randomization, Day 30, and End of Study. Combining the study visits with the data points to be collected allows for the creation of the data collection schedule and the identification of the eCRFs required for the study. The eCRFs at each visit are further defined as required or optional, and as single or repeatable.
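
As a rough illustration of this structure (a sketch only, not WebDCU™'s internal representation), a data collection schedule can be viewed as a mapping from study visits to the eCRFs collected at each visit, each flagged as required or optional and as single or repeatable. The visit and form names in the Python sketch below are hypothetical.

    # Sketch of a data collection schedule: each study visit maps to the eCRFs
    # collected at that visit, flagged as required/optional and single/repeatable.
    # Visit and form names are illustrative only.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ScheduledForm:
        name: str         # eCRF name, e.g. "Demographics"
        required: bool    # must be completed before the visit can be closed
        repeatable: bool  # may be entered more than once per visit

    DATA_COLLECTION_SCHEDULE = {
        "Baseline": [
            ScheduledForm("Demographics", required=True, repeatable=False),
            ScheduledForm("Medical History", required=True, repeatable=False),
        ],
        "Randomization": [
            ScheduledForm("Eligibility Checklist", required=True, repeatable=False),
        ],
        "Day 30": [
            ScheduledForm("Adverse Event", required=False, repeatable=True),
            ScheduledForm("Concomitant Medications", required=False, repeatable=True),
        ],
        "End of Study": [
            ScheduledForm("Study Completion", required=True, repeatable=False),
        ],
    }

    def required_forms(visit: str) -> list[str]:
        """Return the eCRFs that must be completed at a given visit."""
        return [f.name for f in DATA_COLLECTION_SCHEDULE.get(visit, []) if f.required]

    print(required_forms("Baseline"))  # -> ['Demographics', 'Medical History']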

Data managers also define the transition logic from one visit to the next, ensuring that each subject moves appropriately through the digitalized protocol. This guarantees that required visits, such as Baseline, Randomization, Hospital Discharge, and End of Study, cannot be skipped. At the same time, it allows other study visits to be skipped as appropriate, such as the remaining inpatient days when a subject is prematurely discharged from the hospital.
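
A transition table of the kind sketched below (in Python, with hypothetical visit names) conveys the idea: each visit lists the visits a subject may move to next, so required visits cannot be bypassed while optional inpatient days can be skipped on early discharge.

    # Sketch of visit transition logic: required visits cannot be bypassed, while
    # optional visits (e.g. inpatient days) may be skipped. Visit names are
    # illustrative and not taken from any particular protocol.
    ALLOWED_TRANSITIONS = {
        "Baseline": {"Randomization"},
        "Randomization": {"Inpatient Day 1", "Hospital Discharge"},
        "Inpatient Day 1": {"Inpatient Day 2", "Hospital Discharge"},  # early discharge skips the rest
        "Inpatient Day 2": {"Hospital Discharge"},
        "Hospital Discharge": {"End of Study"},
    }

    def can_transition(current_visit: str, next_visit: str) -> bool:
        """True if the digitalized protocol allows moving directly to next_visit."""
        return next_visit in ALLOWED_TRANSITIONS.get(current_visit, set())

    assert can_transition("Randomization", "Hospital Discharge")  # inpatient days may be skipped
    assert not can_transition("Baseline", "End of Study")         # required visits cannot be skipped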

CRF Development
Based upon the study protocol and statistical analysis plan, the data management team develops the study visit transition matrix; the study data collection schedule; the CRFs; the skip patterns and conditional selection logic for the data entry user interfaces; and the data validation and protocol violation rule checks. The paper CRFs act as prototypes of the data entry user interfaces. They define the format for each data point, such as numeric, date, or text, and indicate which data points can be legitimately skipped due to skip patterns. Adobe PDF files matching the web-based eCRFs, along with detailed CRF Completion Guidelines, are posted in the study database and are available for download by the sites. DCU data managers conduct testing and end-user validation of each paper CRF and eCRF, per DCU’s Standard Operating Procedures, prior to the release of the study database.
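
As a simple illustration of a skip pattern (the field names and the rule below are assumptions, not taken from any DCU form), each data point can carry a format and an optional condition under which it may legitimately be left blank:

    # Sketch of eCRF field definitions: each data point has a format (numeric,
    # date, or text) and an optional skip condition under which it may be left
    # blank. Field names and the skip rule are hypothetical.
    FIELDS = [
        {"name": "ae_onset_date", "format": "date", "skip_if": None},
        {"name": "ae_severity", "format": "text", "skip_if": None},
        # The resolution date may legitimately be skipped while the AE is ongoing.
        {"name": "ae_resolution_date", "format": "date",
         "skip_if": lambda record: record.get("ae_ongoing") == "Yes"},
    ]

    def may_skip(field: dict, record: dict) -> bool:
        """True if the skip pattern allows this data point to be left blank."""
        return field["skip_if"] is not None and field["skip_if"](record)

    record = {"ae_onset_date": "2024-03-10", "ae_severity": "Mild", "ae_ongoing": "Yes"}
    print([f["name"] for f in FIELDS if may_skip(f, record)])  # -> ['ae_resolution_date']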

Data Quality Assurance
Project-specific quality assurance (QA) plans are developed in collaboration with the data managers, statisticians, and the sponsor. The QA processes are aimed at providing an accurate final study database. All data submitted to the central database are subject to several layers of QA. First, during database development, the data managers and statisticians work together to define the logic checks to be programmed for each data item, including rules for missing data, out-of-range values, and sequence errors. These rules are programmed to execute when site personnel save the eCRFs. The eCRF is blocked from final submission until all rule violations have been addressed by the site. WebDCU™ accommodates three levels of rule violations: rejections, warnings, and protocol violations. Rejections are logic violations that must be corrected prior to data submission, such as an AE resolution date entered as occurring prior to the date of AE onset. Warnings flag missing or suspicious data, such as a lab value outside an expected range. Protocol violations flag data points that support the occurrence of a protocol violation at the site. In the case of both warnings and protocol violations, site personnel can either correct their entry or confirm the data and provide an explanation of the anomaly.
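
A minimal sketch of how such save-time logic checks might look, using the examples given above; the field names, lab range, and consent rule are illustrative assumptions rather than WebDCU™'s actual rules:

    # Sketch of the three levels of logic checks run when an eCRF is saved:
    # a rejection (AE resolution date before onset), a warning (out-of-range lab
    # value), and a protocol violation flag. All field names and limits are
    # assumptions made for illustration.
    from datetime import date

    def check_ecrf(record: dict) -> list[tuple[str, str]]:
        """Return (level, message) pairs raised when the site saves the eCRF."""
        findings = []

        # Rejection: must be corrected before the form can be submitted.
        onset, resolution = record.get("ae_onset_date"), record.get("ae_resolution_date")
        if onset and resolution and resolution < onset:
            findings.append(("rejection", "AE resolution date precedes AE onset date"))

        # Warning: suspicious data; the site may correct or confirm with an explanation.
        creatinine = record.get("serum_creatinine")
        if creatinine is not None and not 0.5 <= creatinine <= 1.5:
            findings.append(("warning", "Serum creatinine outside the expected range"))

        # Protocol violation: data supporting the occurrence of a violation at the site.
        consent, randomized = record.get("consent_date"), record.get("randomization_date")
        if consent and randomized and consent > randomized:
            findings.append(("protocol violation", "Consent obtained after randomization"))

        return findings

    example = {
        "ae_onset_date": date(2024, 3, 10),
        "ae_resolution_date": date(2024, 3, 1),
        "serum_creatinine": 2.3,
    }
    print(check_ecrf(example))  # one rejection and one warning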

Second, as the eCRFs are submitted, data managers review each one for errors. For example, numeric values are checked for outliers within each subject, within each site, and across the overall study. General comments are reviewed to ensure that they do not contradict the data on the eCRF. Text fields are reviewed for de-identification and accuracy. Reasons for missing data are examined. Data Clarification Requests (DCRs) are sent to the site when issues arise during the data managers' CRF review. Site personnel receive email alerts and are required to respond to each DCR, editing the CRF data if needed. Data managers close a DCR once the site's response is accepted; otherwise, the site is contacted for further action. All CRFs cleaned by data managers are transferred to the project statistics team in SAS dataset format.
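
As an illustration of the kind of outlier screening described above (the robust cutoff and the example values are assumptions), a simple check might flag values that sit far from the rest of the study data so that a DCR can be raised with the site:

    # Sketch of a second-level review check: flag numeric values that lie far
    # from the rest of the study data so a Data Clarification Request (DCR) can
    # be raised. The MAD-based cutoff and the example weights are assumptions.
    from statistics import median

    def flag_outliers(values_by_site: dict[str, list[float]], cutoff: float = 3.5):
        """Yield (site, value) pairs far from the study median."""
        all_values = [v for vals in values_by_site.values() for v in vals]
        med = median(all_values)
        mad = median(abs(v - med) for v in all_values) or 1.0  # guard against zero spread
        for site, vals in values_by_site.items():
            for v in vals:
                if abs(v - med) / mad > cutoff:
                    yield site, v

    weights_kg = {"Site 01": [72.4, 81.0, 68.9], "Site 02": [75.2, 790.0]}  # 790 looks like an entry error
    print(list(flag_outliers(weights_kg)))  # -> [('Site 02', 790.0)]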

In the third level of quality assurance, DCU statisticians conduct periodic reviews of the study data in aggregate and across CRFs to identify outliers and odd trends over time, and bring them to the attention of the DCU study team. This information is shared only in aggregate (combined treatment arms) in order to protect the treatment blind. Data identified as suspicious or questionable are flagged for the data managers, who then submit DCRs to the sites via WebDCU™ for resolution. The logic checks in the database are updated to prevent propagation of the error.
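
The sketch below shows one way such a blinded aggregate review could be organized: values are summarized by visit over combined treatment arms, so odd trends over time can be spotted without revealing treatment assignment. The field names and example values are hypothetical.

    # Sketch of a third-level aggregate review: summarize a value by visit over
    # combined treatment arms (protecting the blind) so odd trends over time can
    # be spotted and handed to the data managers as DCRs. Names are assumptions.
    from collections import defaultdict
    from statistics import mean

    def blinded_trend(records: list[dict], value_field: str) -> dict[str, float]:
        """Per-visit mean over all subjects, deliberately ignoring treatment arm."""
        by_visit = defaultdict(list)
        for r in records:
            if r.get(value_field) is not None:
                by_visit[r["visit"]].append(r[value_field])
        return {visit: round(mean(vals), 2) for visit, vals in sorted(by_visit.items())}

    records = [
        {"visit": "Baseline", "systolic_bp": 142.0, "arm": "A"},
        {"visit": "Baseline", "systolic_bp": 138.0, "arm": "B"},
        {"visit": "Day 30", "systolic_bp": 131.0, "arm": "A"},
        {"visit": "Day 30", "systolic_bp": 13.0, "arm": "B"},  # suspicious: likely a dropped digit
    ]
    print(blinded_trend(records, "systolic_bp"))  # -> {'Baseline': 140.0, 'Day 30': 72.0}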

One caveat to web-based data management is its dependence on the timeliness of data entry at the clinical sites. Data managers carefully monitor this process and continuously remind investigators of outstanding data, open queries, and missing or late visits. Should a site become delinquent in this regard, the data manager will contact the site coordinator to determine the reasons for delay and suggest means to improve site performance.
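
For illustration only, this kind of timeliness monitoring can be reduced to a few per-site counts; the metrics and thresholds below are assumptions rather than DCU's actual criteria.

    # Sketch of per-site timeliness monitoring: count outstanding forms, open
    # queries (DCRs), and late visits, and flag sites that exceed an assumed
    # threshold so the data manager can follow up with the site coordinator.
    SITE_METRICS = {
        "Site 01": {"outstanding_forms": 2, "open_dcrs": 1, "late_visits": 0},
        "Site 02": {"outstanding_forms": 14, "open_dcrs": 9, "late_visits": 3},
    }

    def delinquent_sites(metrics: dict, max_outstanding: int = 5, max_open_dcrs: int = 5) -> list[str]:
        """Sites needing data manager follow-up (thresholds are assumptions)."""
        return [site for site, m in metrics.items()
                if m["outstanding_forms"] > max_outstanding
                or m["open_dcrs"] > max_open_dcrs
                or m["late_visits"] > 0]

    print(delinquent_sites(SITE_METRICS))  # -> ['Site 02']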

WebDCU™ Support
In a multicenter clinical trial, a study database for CRF data collection and a clinical trial management system to support project management activities are typically required. WebDCU™ combines these two components into one system, thus providing the tools for both CRF data management and project management tasks, such as drug tracking, medical safety monitoring, and regulatory document collection. Integrating project management tools into the central database system allows collected data to be used for trial operation management and eliminates the redundancies and discrepancies inherent in maintaining separate databases.

DCU data managers oversee the quality and efficiency of trial conduct and clinical data collection across all clinical sites, and provide technical support for WebDCU™ users. The DCU data management team trains all users of the system, which includes not only site personnel but may also include project managers, regulatory managers, financial managers, site monitors, independent medical safety monitors, central pharmacists, and endpoint adjudicators. This training is conducted via video, webinar, or in person. In addition, DCU data managers provide ongoing technical support for questions regarding WebDCU™, as well as immediate 24-hour hotline support.
