Heus P, Damen JAAG, Pajhouheshnia R, Scholten RJPM, Reitsma JM, Collins GS, Altman DG, Moons KGM, Hooft L. More than half of the TRIPOD items are inadequately reported in prediction modelling studies. Poster presented at: Methods for Evaluating Medical Tests and Biomarkers 2016 Symposium; July 19, 2016; Birmingham, UK. [Abstract] Diagn Progn Res. 2017 Feb 16;1(Suppl 1):27. doi:10.1186/s41512-016-0001-y


BACKGROUND: Prediction models, both diagnostic and prognostic, are developed with the aim of guiding clinical decision-making. To validate these models, evaluate their impact, and eventually use them in clinical practice, clear and comprehensive reporting of prediction modelling studies is required. To improve the reporting of prediction models, a guideline for Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD) was launched in January 2015. The TRIPOD statement is a checklist of 22 main items considered essential for the informative reporting of studies developing or validating multivariable prediction models.

OBJECTIVES: To assess the quality of reporting of prediction modelling studies that were published before the launch of TRIPOD in 2015.

METHODS: We selected the 10 journals with the highest impact factor within each of 37 clinical domains. A PubMed search was performed to identify prediction modelling studies published in May 2014. Publications that described the development and/or validation of a diagnostic or prognostic prediction model were considered eligible, as were studies evaluating the incremental value of adding a predictor to a model. The TRIPOD items were translated into a data extraction form, which was piloted extensively. Three reviewers extracted the data; disagreements on whether an item should be considered adhered to were resolved in consensus meetings with the other co-authors.

RESULTS: Our search identified 4871 references, of which 347 potentially eligible references were assessed in full text. Eventually, 148 references (covering 28 clinical domains) met our eligibility criteria. Of these, 17% described diagnostic and 83% prognostic prediction models. Model development was described in 43% of the publications, validation of an existing prediction model in 26%, the incremental value of adding a predictor to a model in 19%, and a combination of development and validation of a model in 12%. Overall, a mean of 48.4% of the TRIPOD items (at the publication level) was adhered to (range 20.7%–72.4%). Mean adherence was 46.5%, 51.4%, 47.1% and 48.5% for publications describing development, validation, incremental value, and a combination of development and validation, respectively. Reporting was incomplete for TRIPOD items concerning the title and abstract, blinding, model-building procedures, the final model, and performance measures. The source of data, eligibility criteria, study limitations and overall interpretation were adequately reported in the majority of publications.

CONCLUSIONS: There is room for improvement in the reporting of multivariable prediction models: more than half of the TRIPOD items are currently not reported or are reported inadequately. Our study can serve as a baseline measurement for future research evaluating the impact of the introduction of the TRIPOD statement.
