Is your SDTM review systematic and methodological?

25 February 2025 · 25 min read

Introduction

With raw datasets becoming complex, varied and multi-origin, the development of SDTM datasets and related documentation is becoming a lengthy process. It requires programmers to refer to several different documents, such as codelists, agency guidelines, therapeutic area user guides and implementation guides, along with study documents such as the protocol and the CRF completion guidelines. It is therefore important to ensure every SDTM dataset and document is reviewed thoroughly before inclusion in the regulatory submission package. This article describes a systematic, methodological way of reviewing the SDTM datasets, aCRF, define and reviewer’s guide so that the submission package complies with CDISC guidelines and agency expectations.

 

The review of SDTM documents and their data does not start with the programming activities, but much earlier. Deciding early which versions of the CDISC standard documents to use helps massively and saves significant time. Decisions should also be made on several other documents, such as QRS supplements, TAUGs and any applicable agency guidance.

 

Usually, organizations expect their SDTM expert to do the heavy lifting of review, which ideally should be the case. However, manual review of the aCRF, datasets, specifications, define and csdrg takes considerable time. There is also the human-error factor: review points get missed, and synchronization between documents and data is difficult to verify by hand. To avoid trivial errors, gaps and shortcomings, a methodological review should be performed to ensure the submission package has been prepared with the utmost quality.

 

One of the first suggestions is to build checklists for the aCRF, dataset, define and csdrg reviews. Furthermore, a well-organized, well-structured specification should be established that can be verified both manually and programmatically to eliminate trivial errors and gaps. This well-defined specification can also be the source of the dataset and define metadata. Pre-defined checks remove a significant part of the onus from the reviewer, so the review can focus on proper mapping, study-specific derivations and making the overall package fully compliant.

The process steps below explain how, and at what time, each review should be done:

Before Programming Conduct:
  1. Initial aCRF review: The aCRF is usually the first programming document drafted in a study. A systematic aCRF review does not start once the document is complete, but much earlier, while it is being drafted. It is important that the aCRF is of good quality from the outset and produced under proper guidance aligned with MSG v2.0, and that all agency requirements are fulfilled. Exporting the CRF annotations in .xfdf format makes it possible to run certain checks against the specification once the draft is ready; this cross-verifies page numbers and variable existence between the aCRF and the specification. All version decisions should be made before aCRF creation, and the reviewer must check them against the IG, the CT version, any applicable TAUG and agency guidance.

  2. Review of SDTM specification: As with the aCRF, a methodological review of SDTM starts from the specification. SDTM specifications should be built in a structured way so that they can be used to create both the datasets and the define.xml, and remain the single source of truth. A well-structured specification also allows many trivial programmatic checks to be run, eliminating a lot of inconsistencies before the actual review even starts. Reviewers can then focus on the mapping instructions, the VLM entries and ensuring the specification is filled in appropriately, so conformance issues are avoided before they occur. This focused specification review saves time and effort in making the datasets and the entire package compliant.

  3. Review of define & csdrg with data: To reduce the effort of making datasets compliant and the rework downstream, it is recommended that the define.xml be generated before dataset programming and reviewed thoroughly without the XPT files. This verifies the specification metadata and gives greater synchronization between the datasets and the define.xml. Leveraging pre-built programmatic checks eliminates many inconsistencies and reduces the effort spent on trivial review. Reviewers can then focus on variable derivations, VLM entries, codelist subsets and overall compliance with the define standard. Once the define is reviewed and finalized, the csdrg should be drafted, leaving only the few sections that depend on the data. In practice, roughly 60% of the csdrg can be ready before dataset programming even starts, and this approach ensures consistency between the specification and the define from the start.
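To illustrate the kind of trivial programmatic checks a well-structured specification enables, the sketch below validates basic SDTM metadata rules (8-character uppercase variable names, 40-character labels, Char/Num types) against a toy specification. The row layout and field names are assumptions for illustration, not any particular organization's template.

```python
# Minimal sketch of programmatic specification checks. The spec structure
# (a list of dicts) is a stand-in for a real metadata spreadsheet
# exported row by row; field names are hypothetical.

def check_spec(spec_rows):
    """Return a list of findings for basic SDTM metadata rules."""
    findings = []
    for row in spec_rows:
        name, label = row["variable"], row["label"]
        if len(name) > 8 or not name.isupper():
            findings.append(f"{name}: variable name must be <=8 uppercase characters")
        if len(label) > 40:
            findings.append(f"{name}: label exceeds 40 characters")
        if row.get("type") not in ("Char", "Num"):
            findings.append(f"{name}: type must be Char or Num")
    return findings

spec = [
    {"variable": "USUBJID", "label": "Unique Subject Identifier", "type": "Char"},
    {"variable": "aestdtc", "label": "Start Date/Time of Adverse Event", "type": "Char"},
]
for issue in check_spec(spec):
    print(issue)
```

Checks like these can run on every specification version, leaving the human reviewer free to concentrate on mapping logic rather than clerical rules.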

During Programming Conduct:
  1. Review of conformance report: To save time, conformance reports must be reviewed thoroughly both during and after dataset validation. Reviewing the conformance report while dataset programming is still in progress, rather than only after full validation, catches issues while they impact only one side of the programming, which significantly saves time and effort. It also avoids unpleasant surprises at the last stage of programming.

  2. Specification review: During programming, any major specification update must be reviewed thoroughly by an experienced reviewer. This ensures the updates align with the implementation assumptions and regulatory agency expectations. The review must also promote consistency across the aCRF, datasets, define and csdrg, so these documents can be kept current without waiting until after programming.
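One simple way to keep the in-progress conformance review focused is to summarize the report by dataset and severity, so effort goes to the worst areas first. The sketch below assumes a generic CSV export with `dataset`, `severity` and `message` columns; real validators use their own layouts, so treat the column names as placeholders.

```python
# Hedged sketch: triage a conformance report export by dataset and
# severity. The CSV columns are assumptions, not a specific validator's
# actual output format.
import csv
import io
from collections import Counter

def summarize(report_csv):
    """Count findings per (dataset, severity) pair."""
    counts = Counter()
    for row in csv.DictReader(io.StringIO(report_csv)):
        counts[(row["dataset"], row["severity"])] += 1
    return dict(counts)

report = """dataset,severity,message
AE,Error,AESTDTC value after AEENDTC
AE,Warning,Missing EPOCH
DM,Error,RFSTDTC is null for randomized subject
"""
print(summarize(report))
```

Rerunning such a summary after each programming iteration makes it easy to confirm that the error counts are trending down rather than accumulating for the final review.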

After Programming Conduct:
  1. Thorough review of SDTM datasets: In several organizations, the practice is to review datasets only once they are fully validated. As described earlier in this article, systematic reviews should instead occur throughout the study life cycle, which helps eliminate gaps in the documents and/or datasets. However, regardless of whether earlier reviews took place, once all datasets have been validated, a thorough review of the datasets and their conformance report must be performed. The review must focus on variable derivations, visit mapping and epoch creation algorithms, and the key safety and efficacy datasets. Appropriate pre-defined programmatic checks should also be run to ensure the datasets follow internal guidance (if any).

  2. Thorough review of define: Once the datasets have been reviewed in detail and updated, a detailed review of the define must be performed, along with its conformance report, both with and without data. The review must focus on VLM, formats, codelists, and active linkage with the aCRF and csdrg. It should also ensure that all unresolved issues are properly explained in the report so they can be transferred to the csdrg at a later stage.

  3. Thorough review of csdrg: In several organizations, the csdrg is the last document to be drafted and reviewed. Creating it before or during programming, however, removes the burden of finalizing it later and reduces review iterations. The review must cover everything from the correct versions of the standards through to the documented explanations of conformance issues. Additionally, all lengthy, special or complex programming notes intended to help the agency reviewer must be checked thoroughly. Lastly, the reviewer must ensure the PDF document follows all FDA PDF rules for the submission package.
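The define-versus-data synchronization checks mentioned above can also be partly automated. The sketch below compares the variables documented in a define.xml against the columns actually present in a dataset; the XML snippet is a simplified, namespace-free stand-in for a real define.xml in which each ItemDef `Name` carries "DATASET.VARIABLE", which is an assumption about the file's conventions.

```python
# Hedged sketch: cross-check variables documented in a define.xml
# against the columns actually present in a dataset. The XML below is a
# toy fragment, not a conformant Define-XML document.
import xml.etree.ElementTree as ET

def define_variables(define_xml, domain):
    """Collect variable names documented for one domain in the define."""
    root = ET.fromstring(define_xml)
    names = set()
    for item in root.iter():
        if item.tag.endswith("ItemDef"):  # tolerate namespaced tags
            ds, _, var = item.get("Name", "").partition(".")
            if ds == domain:
                names.add(var)
    return names

DEFINE = """<ODM><Study><MetaDataVersion>
  <ItemDef OID="IT.DM.USUBJID" Name="DM.USUBJID"/>
  <ItemDef OID="IT.DM.ARMCD" Name="DM.ARMCD"/>
</MetaDataVersion></Study></ODM>"""

dataset_columns = {"USUBJID", "ARMCD", "AGE"}  # e.g. read from the XPT file
documented = define_variables(DEFINE, "DM")
print("in data but not define:", sorted(dataset_columns - documented))
print("in define but not data:", sorted(documented - dataset_columns))
```

Any variable appearing on one side only is a concrete review finding: either the define is incomplete or the dataset carries an undocumented column.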

Conclusion

As depicted throughout this article, SDTM review is not a single time-point process but a multi time-point one. Once documents are initially reviewed and approved, they serve as robust documentation for programming, which is itself highly time-consuming. A methodological and systematic review is not achieved by an experienced reviewer alone; it also depends on selecting the right workflow, well-structured documents, well-defined internal standards and guidance, and the use of internal automation.

Abdulkadir Lokhandvala

25 February 2025