Rethinking Data Quality Best Practices In The Era Of Decentralized Clinical Trials
By Stacy Weil, Debra Jendrasek, LaRae Bennett and Joan Sutphen-Glowatz
Pandemic-related disruptions have accelerated much-needed change in clinical operations, but this change has been accompanied by questions about data collection and data quality. In a recent survey commissioned by Oracle Health Sciences, more than 75 percent of industry respondents indicated that limitations in patients’ ability to attend on-site visits sped up their adoption of decentralized clinical trial (DCT) approaches.1 A separate survey conducted by Greenphire found that 84 percent of sponsors and contract research organizations are actively seeking to increase their use of technology to better support DCTs.2 This momentum behind DCTs has, however, raised concerns among sponsors about how to collect data remotely while ensuring data reliability and quality.1
As sponsors move to DCTs, one of the most common steps they have taken has been to implement both patient- and investigator-facing technologies.1 Adopting these technologies has added layers of complexity to the planning and processes required to ensure data quality. The use of apps, ePROs, and wearable devices may increase patient convenience, provide real-time data, and reduce site burden. However, it may also require different approaches to collecting and managing data and to complying with evolving regulatory guidance. Implementing technologies such as eConsent may involve additional training and technical considerations.
Understanding the processes required to move data efficiently and securely from where it is collected to where it can be analyzed is the key to optimizing data quality in DCTs. Technology can enable these processes, but only if it has been well vetted to minimize risk to sponsors.