Global sensitivity analysis of model outputs with dependent inputs: some methods and practical aspects
Bertrand Iooss 1
1: EDF R&D (EDF Recherche et Développement), Chatou, France

In uncertainty quantification of numerical models and investigation of experimental data, importance measures (or sensitivity indices) aim to quantify the influence of a model's inputs on its outputs. For example, in environmental pollution impact studies, sensitivity analysis makes it possible to determine which physical parameters and environmental data most influence the variability of the calculated pollutant concentration. Beyond the variance-based sensitivity indices (also known as Sobol' indices), whose interpretation is restricted by the assumption of mutual independence between the model inputs, the Shapley effects, based on cooperative game theory, have recently aroused great interest among users seeking interpretability of numerical models (computer codes or machine learning models). This talk focuses on the practical use of Shapley effects for global sensitivity analysis of model outputs. In particular, the statistical estimation techniques and their algorithmic implementations in the sensitivity R package will be discussed. Moreover, a potentially undesirable effect of the particular allocation induced by the Shapley effects will be highlighted, and an alternative allocation choice will be proposed.
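As an illustration of the kind of estimation discussed here, below is a minimal R sketch using the exact-permutation Shapley effect estimator shapleyPermEx of the sensitivity package (following its documented Xall/Xset interface). The toy linear model, the correlation value and the sample sizes are arbitrary illustrative choices, not results from the talk.

```r
# Minimal sketch: Shapley effects of a toy linear model with correlated
# Gaussian inputs, via the exact-permutation estimator of the 'sensitivity'
# package (Song, Nelson & Staum, 2016).
library(sensitivity)
library(mvtnorm)      # joint Gaussian sampling
library(condMVNorm)   # conditional Gaussian sampling

d <- 3                                      # number of inputs
mu <- rep(0, d)
ro <- 0.9                                   # correlation between X2 and X3
Covmat <- matrix(c(1, 0, 0,
                   0, 1, ro,
                   0, ro, 1), nrow = d)
modlin <- function(X) apply(X, 1, sum)      # Y = X1 + X2 + X3

# Sampler of the full input vector
Xall <- function(n) rmvnorm(n, mu, Covmat)

# Conditional sampler: X_Sj given X_Sjc fixed at xjc
Xset <- function(n, Sj, Sjc, xjc) {
  if (is.null(Sjc)) {
    if (length(Sj) == 1) rnorm(n, mu[Sj], sqrt(Covmat[Sj, Sj]))
    else rmvnorm(n, mu[Sj], Covmat[Sj, Sj])
  } else {
    rcmvnorm(n, mu, Covmat, dependent.ind = Sj,
             given.ind = Sjc, X.given = xjc)
  }
}

# Exact-permutation estimation of the Shapley effects
sh <- shapleyPermEx(model = modlin, Xall = Xall, Xset = Xset,
                    d = d, Nv = 1e4, No = 1e3, Ni = 3)
print(sh)   # effects sum to 1; the shared variance of X2, X3 is split between them
```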



  • Poster