Meta-Modelling

The Problem

Decision analytic models are frequently required for conducting cost-effectiveness analyses, and invariably a probabilistic sensitivity analysis (PSA) must be performed on any such model to assess the uncertainty in the model output induced by uncertainty in the model inputs. A PSA typically requires a large number of model runs at different input values, and more detailed investigations of input parameter uncertainty, such as value of information analysis, require a larger number still. The problem arises when the model is computationally expensive to run, for example if it is a patient simulation model, so that the model user is only able to run it a relatively small number of times. In this case, it may not be feasible to conduct a PSA using conventional Monte Carlo methods, and a model developer may be deterred from building a sufficiently realistic model because of the consequent computational burden.
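
The short Python sketch below illustrates why this is costly. The decision model, its two inputs and their distributions are all hypothetical placeholders, not any particular CHEBS model; the point is only that a conventional Monte Carlo PSA needs one full model run per sampled input set.

import numpy as np

rng = np.random.default_rng(42)

def decision_model(relative_risk, cost_treatment):
    # Toy stand-in for an expensive model: returns incremental net benefit
    # at a willingness-to-pay of 20,000 per QALY (all numbers illustrative).
    qaly_gain = 0.5 * (1.0 - relative_risk)
    return 20000 * qaly_gain - cost_treatment

n_samples = 10000
relative_risk = rng.beta(20, 10, n_samples)      # uncertain input 1 (assumed distribution)
cost_treatment = rng.gamma(100, 30, n_samples)   # uncertain input 2 (assumed distribution)

# One full model run per PSA sample -- prohibitive if a single run takes hours.
inb = np.array([decision_model(r, c) for r, c in zip(relative_risk, cost_treatment)])

print("P(treatment cost-effective) =", (inb > 0).mean())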

This problem is not unique to the field of health economics. Complex mathematical models are used in many fields of science and technology, and it is common to investigate the consequences of input uncertainty using approaches such as PSA. An established technique for dealing with computationally expensive models is to construct a meta-model (also known as an emulator). A meta-model is a statistical representation of the original model that can be used to obtain very fast approximations of the model output. A meta-model can be constructed from a relatively small number of runs of the original model, and so can be used to conduct a PSA with accuracy comparable to Monte Carlo, but with far fewer model runs.

Meta-modelling uses statistical regression to learn the input-output relationship of the model. A popular regression method for meta-modelling is Gaussian process regression, which is nonparametric and so does not require restrictive assumptions about the form of the input-output relationship in the decision model.
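
As a rough illustration of the idea (not the GEM software or the methodology of the papers listed below, and using scikit-learn's Gaussian process implementation as an assumed, convenient choice), the following sketch fits an emulator to a small number of runs of the same toy model and then performs the PSA with cheap emulator predictions in place of model runs.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ConstantKernel, RBF

rng = np.random.default_rng(0)

def decision_model(x):
    # Same toy stand-in for the expensive model (hypothetical).
    return 20000 * 0.5 * (1.0 - x[:, 0]) - x[:, 1]

# 1. A small design of training runs -- here 30 runs of the "expensive" model.
X_train = np.column_stack([rng.beta(20, 10, 30), rng.gamma(100, 30, 30)])
y_train = decision_model(X_train)

# 2. Fit the Gaussian process emulator: nonparametric regression on the
#    observed input-output pairs, with a separate length-scale per input.
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=[1.0, 1.0]),
                              normalize_y=True)
gp.fit(X_train, y_train)

# 3. PSA via the emulator: thousands of near-instant predictions instead of model runs.
X_psa = np.column_stack([rng.beta(20, 10, 10000), rng.gamma(100, 30, 10000)])
inb_hat, inb_sd = gp.predict(X_psa, return_std=True)

print("P(treatment cost-effective) approx", (inb_hat > 0).mean())

The predictive standard deviations returned by the emulator quantify the additional approximation uncertainty introduced by meta-modelling, and can be inspected to check whether the training runs are sufficient before trusting the PSA results.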

Research at CHEBS

Research in this area is being carried out by Tony O'Hagan and Jeremy Oakley, who have developed and applied meta-modelling in many different fields. Tony O'Hagan is project leader and Jeremy Oakley is an investigator on the project "Managing Uncertainty in Complex Models", a Research Councils UK funded project concerned with quantifying and reducing uncertainty in the predictions of complex models across a wide range of application areas. [http://mucm.group.shef.ac.uk/index.html]

Software for implementing Gaussian process meta-modelling is available from the GEM Software Project's website.

Publications

  • O'Hagan, A. (2006). Bayesian analysis of computer code outputs: a tutorial. Reliability Engineering and System Safety, 91, 1290-1300.

  • Oakley, J. (2005). Decision-theoretic sensitivity analysis for complex computer models. Research Report, Department of Probability and Statistics, University of Sheffield. [http://www.jeremy-oakley.staff.shef.ac.uk/]

  • Stevenson, M.D., Oakley, J. and Chilcott, J.B. (2004). Gaussian process modelling in conjunction with individual patient simulation modelling: A case study describing the calculation of cost-effectiveness ratios for the treatment of osteoporosis. Medical Decision Making, 24(1), 89-100.