Table D-5. Theme #5: Evaluation of models

Respondent 16:
- "Formulation…how are they justifying the assumptions and the parameters…are they incorporating all the data…how was the weighting determined."
Respondent 15:
- "Inputs, parameters, and sensitivity analysis are key to evaluating models."
- "The structure of the model must be well documented…it is important to be able to evaluate the underlying structure of the model…how well does it characterize the pathway or natural history of the disease."
- "What journals use as transparency vis a vis publishing standards is a good place to start."
- "Models need to be based on a good CER…credibility is lost when modelers use expert judgment."
Respondent 13:
- "The structure of the model."
- "Do the results map to what we know about the data…modeling is an iterative process…how is the model performing?"
- "Who is the modeler…this is unscientific, but if I know who the modeler is, and if I know they do good work, then I am confident in the model."
- "Then there are specifics…representation of the available alternatives…assumptions and tests…exploring the uncertainty…a conservative view…what are the results of the sensitivity analysis…what are the inputs, evidence…what data was used."
Respondent 10:
- "Review the structure of the model and the basic assumptions…what are the technical aspects of the model…review the data used as inputs."
Respondent 9:
- "Understand the intervention…clear specification of the model and the framework."
Respondent 6:
- "Inspect the analytical framework…clear model specification is critical."
- "The criteria is similar to that for any primary study."
Respondent 5:
- "Traditional sources from decision analysis and cost effectiveness should be used."
- "The Weinstein text is a good source."
- "Model construction needs to be assessed…then sensitivity analyses need to be reported."
Respondent 4:
- "Assumptions and where they break down…relative versus absolute nature of the measures…utility of the model outputs by non-modelers."
Respondent 3:
- "We hope to be able to determine the model validity or quality…but it is very challenging."
Respondent 2:
- "I like to see a detailed description of model inputs and assumptions that went into it."
- "A lot of description about sensitivity analysis especially for input and assumptions that were based on evidence that was not as strong."
- "Are data input valid? Is there an appropriate search strategy to find data? Are calculations adequate? Are appropriate sensitivity analyses conducted?"
Respondent 12:
- "I look at both the structure and the fidelity of the model… The latter is to judge the representation and can be very difficult."
- "Need to judge the evidence of the input parameters… This can be done in just the same way as any other evidence review."
- "Judging the caliber of a model is very difficult… Checklists are OK, but more important is who developed the model."
- "For complex model, one should [devote] an enormous amount of time debugging, testing and exercising the model… It would be helpful to understand how much time modelers spent on this."
Respondent 14:
- "Grading scheme for grading models has to be different from the grading scheme for grading empirical evidence. However, the grading scheme for the value of evidence needs a coherent whole but before you could really do that, you would need to define different types of models. For certain types of models, a proportion of data is empirical, some of it is opinion… the proportion would affect how you view the model as well as the evidence of the model."
- "If no modeling expertise, can only tell you whether the model is useful. So, first I need to understand the model. Are the right nodes in there? Is the important information used and addressed by the model? Then, have I learned anything? Leave it to other people to evaluate whether the right type of model is used and whether it is technically sound. I trust other people who have that kind of expertise."
Respondent 1:
- "Definitely do a lot of sensitivity analysis. Should even be done with systematic reviews… For example would we get the same result in a systematic review if we take one study out? Sensitivity analysis in meta-analysis."
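The leave-one-out check the last respondent describes can be sketched in a few lines. This is a minimal illustration, assuming a fixed-effect inverse-variance meta-analysis; the effect sizes and standard errors below are invented for illustration and do not come from any review discussed in this report.

```python
# Hypothetical sketch: leave-one-out sensitivity analysis for a
# fixed-effect inverse-variance meta-analysis. All numbers below
# are illustrative only.

def pooled_effect(effects, ses):
    """Inverse-variance weighted pooled effect (fixed-effect model)."""
    weights = [1.0 / se ** 2 for se in ses]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

def leave_one_out(effects, ses):
    """Re-pool the estimate with each study removed in turn."""
    results = []
    for i in range(len(effects)):
        e = effects[:i] + effects[i + 1:]
        s = ses[:i] + ses[i + 1:]
        results.append((i, pooled_effect(e, s)))
    return results

# Illustrative data: per-study log odds ratios and standard errors;
# study 3 is a deliberate outlier.
effects = [0.30, 0.25, 0.90, 0.28]
ses = [0.10, 0.12, 0.15, 0.11]

overall = pooled_effect(effects, ses)
for i, est in leave_one_out(effects, ses):
    print(f"Without study {i + 1}: pooled effect = {est:.3f} "
          f"(overall = {overall:.3f})")
```

If removing a single study moves the pooled estimate noticeably (as dropping the outlying study 3 does here), the conclusion is sensitive to that study, which is exactly the concern the quote raises.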

From: Appendix D, Verbatim Quotes for Key Themes

Decision and Simulation Modeling in Systematic Reviews [Internet].
Kuntz K, Sainfort F, Butler M, et al.

NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.