To comply with evolving regulatory guidelines, pharmaceutical companies face the resource-intensive process of monitoring and implementing regulatory changes, along with the risk of overlooking critical updates due to human error. This Work Product Team aims to use AI technology to streamline how regulatory updates are handled.

By automatically analyzing new regulatory changes and comparing them with internal documents, the AI system will help pinpoint which documents may be affected and identify discrepancies or omissions in internal documentation, supporting continued compliance.
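The comparison step described above can be sketched as scoring each internal document against the text of a new regulatory change and flagging those above a similarity threshold. Below is a minimal illustration using token-overlap (Jaccard) similarity; a production system would likely use an embedding model instead, and the document names, contents, and threshold are illustrative assumptions, not part of this work product:

```python
import re

def tokenize(text: str) -> set[str]:
    """Lowercase word tokens, ignoring very short words."""
    return {w for w in re.findall(r"[a-z]+", text.lower()) if len(w) > 2}

def jaccard(a: set[str], b: set[str]) -> float:
    """Jaccard similarity between two token sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def flag_affected(change_text: str, documents: dict[str, str],
                  threshold: float = 0.2) -> list[str]:
    """Return names of internal documents whose similarity to the
    regulatory change meets the threshold, highest-scoring first."""
    change = tokenize(change_text)
    scores = {name: jaccard(change, tokenize(body))
              for name, body in documents.items()}
    return sorted((n for n, s in scores.items() if s >= threshold),
                  key=lambda n: -scores[n])

# Hypothetical internal documents and regulatory change
docs = {
    "SOP-quality-records": "retention of quality records and audit trails",
    "SOP-training": "training requirements for laboratory personnel",
}
change = "new retention period for quality records and audit trails"
print(flag_affected(change, docs))  # → ['SOP-quality-records']
```

Only the records SOP shares enough vocabulary with the change to be flagged; the training SOP falls below the threshold.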

Goal

To develop an AI workflow that streamlines the impact assessment of regulatory changes on the quality management system (QMS).

Expected Outcomes

A detailed white paper outlining the end-to-end process of developing and using an AI-powered Regulatory Intelligence tool, including specifications for input formats, outputs, AI modeling techniques, infrastructure requirements, and related information.

Lead(s)

Resources

Related Work Products

  • A collaborative white paper offering insights and guidance on transforming Quality Assurance (QA) in Good Clinical Practice (GCP) and Good Pharmacovigilance Practice (GVP) through advanced analytics.

  • Accurate and timely reporting of adverse events (AEs) in clinical trials is crucial to ensuring data integrity and patient safety. However, AE under-reporting remains a challenge, often highlighted in Good Clinical Practice (GCP) audits and inspections. Traditional detection methods, such as on-site investigator audits via manual source data verification (SDV), have limitations. To address this, we aim to develop an analytics approach that can facilitate rapid, comprehensive, and near-real-time detection of AE under-reporting and over-reporting at each clinical trial site.

  • To maintain high-quality data in clinical trials, it is crucial to detect systematic anomalies in site- and subject-level time series data, which often stem from protocol misinterpretation or device miscalibration. Traditional approaches can be slow and resource-intensive. We therefore aim to develop an analytics approach that can be run with minimal effort and a high degree of reliability.
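The site-level AE screening described in the second related work product can be illustrated as a simple rate comparison: each site's observed AE count is tested against the count expected under the trial-wide reporting rate, using an exact Poisson tail probability. This is a minimal sketch, not that team's actual method; real implementations would adjust for site mix, visit schedules, and multiple testing, and the counts, threshold, and function names below are illustrative assumptions:

```python
import math

def poisson_cdf(k: int, mu: float) -> float:
    """P(X <= k) for X ~ Poisson(mu), summed exactly."""
    return sum(math.exp(-mu) * mu**i / math.factorial(i) for i in range(k + 1))

def flag_sites(ae_counts: dict, patient_counts: dict, alpha: float = 0.05) -> dict:
    """Flag sites whose AE count is improbably low (possible under-reporting)
    or high (possible over-reporting) versus the trial-wide rate per patient."""
    rate = sum(ae_counts.values()) / sum(patient_counts.values())
    flags = {}
    for site, k in ae_counts.items():
        mu = rate * patient_counts[site]          # expected AEs at this site
        p_low = poisson_cdf(k, mu)                # P(X <= k): small => too few
        p_high = 1 - poisson_cdf(k - 1, mu) if k > 0 else 1.0  # P(X >= k)
        if p_low < alpha:
            flags[site] = "under-reporting?"
        elif p_high < alpha:
            flags[site] = "over-reporting?"
    return flags

# Illustrative data: site C reports far fewer AEs per patient than its peers
ae = {"A": 20, "B": 20, "C": 4}
patients = {"A": 50, "B": 50, "C": 50}
print(flag_sites(ae, patients))  # → {'C': 'under-reporting?'}
```

Because the test is two-sided, the same screen surfaces both under- and over-reporting at a site, matching the bullet's stated aim.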
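The systematic time-series anomalies mentioned in the last bullet (e.g. a device drifting out of calibration at one site) can be illustrated with a simple rolling z-score screen: each new value is compared against the mean and standard deviation of a trailing window. This is a sketch under assumed window size, threshold, and data; robust production methods would differ:

```python
import statistics

def rolling_zscore_anomalies(series: list, window: int = 10,
                             threshold: float = 3.0) -> list:
    """Return indices where a value deviates from the trailing-window
    mean by more than `threshold` standard deviations."""
    anomalies = []
    for i in range(window, len(series)):
        past = series[i - window:i]
        mu = statistics.fmean(past)
        sd = statistics.stdev(past)
        # Skip flat windows (sd == 0) to avoid division by zero
        if sd > 0 and abs(series[i] - mu) / sd > threshold:
            anomalies.append(i)
    return anomalies

# Illustrative series: stable lab values, then a miscalibration-style jump
values = [5.0, 5.1, 4.9, 5.2, 5.0, 4.8, 5.1, 5.0, 4.9, 5.1,
          5.0, 9.5, 9.4, 9.6]
print(rolling_zscore_anomalies(values))  # → [11]
```

Only the first jump is flagged here: once anomalous values enter the trailing window they inflate its variance, which is one reason robust statistics (e.g. median and MAD) are usually preferred in practice.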