Christian Gogu is Associate Professor at the University Toulouse III-Paul Sabatier, France. His research, carried out at the Clément Ader Institute, focuses in particular on accounting for uncertainties in the design and optimization of aeronautical systems.
Foreword
Maurice LEMAIRE
Preface
Christian GOGU
Part 1. Modeling, Propagation and Quantification of Uncertainties
Chapter 1. Uncertainty Modeling
Christian GOGU
1.1. Introduction
1.2. The usefulness of separating epistemic uncertainty from aleatory uncertainty
1.3. Probability theory
1.3.1. Theoretical context
1.3.2. Probabilistic approach for modeling aleatory uncertainties
1.3.3. Probabilistic approach for modeling epistemic uncertainties
1.4. Probability box theory (p-boxes)
1.5. Interval analysis
1.6. Fuzzy set theory
1.7. Possibility theory
1.7.1. Theoretical context
1.7.2. Comparison between probability theory and possibility theory
1.7.3. Rules for combining possibility distributions
1.8. Evidence theory
1.8.1. Theoretical context
1.8.2. Rules for combining belief mass functions
1.9. Evaluation of epistemic uncertainty modeling
1.10. References
Chapter 2. Microstructure Modeling and Characterization
François WILLOT
2.1. Introduction
2.2. Probabilistic characterization of microstructures
2.2.1. Random sets
2.2.2. Covariance
2.2.3. Granulometry
2.2.4. Minkowski functionals
2.2.5. Stereology
2.2.6. Linear erosion
2.2.7. Representative volume element
2.3. Point processes
2.3.1. Homogeneous Poisson point processes
2.3.2. Inhomogeneous Poisson point processes
2.4. Boolean models
2.4.1. Definition and Choquet capacity
2.4.2. Properties
2.4.3. Covariance
2.4.4. Other characteristics
2.5. RSA models
2.6. Random tessellations
2.6.1. Voronoi tessellation
2.6.2. Johnson-Mehl tessellation
2.6.3. Laguerre tessellation
2.6.4. Random Poisson tessellation
2.6.5. The dead-leaves model
2.6.6. Generalized random partition models
2.7. Gaussian fields
2.8. Conclusion
2.9. Acknowledgments
2.10. References
Chapter 3. Uncertainty Propagation at the Scale of Aging Civil Engineering Structures
David BOUHJITI, Julien BAROTH and Frédéric DUFOUR
3.1. Introduction
3.2. Problem positioning
3.2.1. Probabilistic formulation
3.2.2. Thermo-hydro-mechanical-leakage transfer function
3.2.3. Resulting probabilistic THM-F problem
3.3. Random field-based modeling of material properties
3.3.1. Random fields
3.3.2. Generation methods for discretized random fields
3.3.3. Random fields and autocorrelations
3.3.4. Application: contribution to modeling the cracking of reinforced concrete works by self-correlated random fields
3.4. Modeling uncertainty propagation using response surface methods
3.4.1. Probabilistic coupling strategies
3.4.2. Polynomial chaos method
3.5. Conclusion
3.6. References
Chapter 4. Reduction of Uncertainties in Multidisciplinary Analysis Based on a Polynomial Chaos Sensitivity Study
Sylvain DUBREUIL, Nathalie BARTOLI, Christian GOGU and Thierry LEFEBVRE
4.1. Introduction
4.2. MDA with model uncertainty
4.2.1. Formalism
4.2.2. Solving the random MDA
4.2.3. Approximation of the quantity of interest using sparse polynomial chaos
4.3. Sensitivity analysis and uncertainty reduction
4.3.1. Introduction
4.3.2. Sobol' indices approximated by polynomial chaos
4.4. Application to an aeroelastic test case
4.4.1. Presentation
4.4.2. Construction of disciplinary metamodels
4.4.3. Sensitivity analysis and uncertainty reduction
4.5. Conclusion
4.6. References
Part 2. Taking Uncertainties into Account: Reliability Analysis and Optimization under Uncertainties
Chapter 5. Rare-event Probability Estimation
Jean-Marc BOURINET
5.1. Introduction
5.1.1. Mapping to the multivariate standard normal space
5.1.2. Copulas and correlation
5.1.3. Isoprobabilistic transformations
5.2. MPFP-based methods
5.2.1. First-order reliability method
5.2.2. Second-order reliability method
5.3. Simulation methods
5.3.1. Crude MC simulation
5.3.2. Subset simulation
5.3.3. IS and CE methods
5.4. Sensitivity measures
5.4.1. Introduction
5.4.2. FORM
5.4.3. Crude MC simulation and subset simulation
5.5. References
Chapter 6. Adaptive Kriging-based Methods for Failure Probability Evaluation: Focus on AK Methods
Cécile MATTRAND, Pierre BEAUREPAIRE and Nicolas GAYTON
6.1. Introduction
6.2. Presentation of Kriging
6.2.1. Principle
6.2.2. Identification of Kriging hyperparameters
6.2.3. Kriging-based prediction
6.2.4. Illustration of Kriging-based prediction
6.3. Employing Kriging to calculate failure probabilities
6.3.1. The EFF function
6.3.2. The U function
6.3.3. The IMSET function
6.3.4. The SUR function
6.3.5. The H function
6.3.6. The OBJ function
6.3.7. The L function
6.3.8. Discussion
6.4. The AK-MCS method: presentation and generic principle
6.4.1. Presentation of the AK-MCS method
6.4.2. Illustration of the AK-MCS method
6.4.3. Discussion
6.5. The AK-IS method for estimating probabilities of rare events
6.5.1. Presentation of the AK-IS method
6.5.2. Illustration of the AK-IS method
6.5.3. Discussion
6.6. The AK-SYS method for system reliability problems
6.6.1. Some generalities about system reliability analysis
6.6.2. Presentation of the AK-SYS method
6.6.3. Illustration of the AK-SYS method
6.6.4. Alternatives to the AK-SYS method
6.6.5. Application to problems indexed by a subset
6.7. The AK-HDMR1 method for high-dimensional problems
6.7.1. HDMR functional decomposition
6.7.2. Presentation of the AK-HDMR1 method
6.8. Conclusion
6.9. References
Chapter 7. Global Reliability-oriented Sensitivity Analysis under Distribution Parameter Uncertainty
Vincent CHABRIDON, Mathieu BALESDENT, Guillaume PERRIN, Jérôme MORIO, Jean-Marc BOURINET and Nicolas GAYTON
7.1. Introduction
7.2. Theoretical framework and notations
7.3. Global variance-based reliability-oriented sensitivity indices
7.3.1. Introducing the Sobol' indices on the indicator function
7.3.2. Rewriting Sobol' indices on the indicator function using Bayes' Theorem
7.4. Sobol' indices on the indicator function adapted to the bi-level input uncertainty
7.4.1. Reliability analysis under distribution parameter uncertainty
7.4.2. Bi-level input uncertainty: aggregated versus disaggregated types of uncertainty
7.4.3. Disaggregated random variables
7.4.4. Extension to the bi-level input uncertainty and pick-freeze estimators
7.5. Efficient estimation using subset sampling and KDE
7.5.1. The problem of estimating the optimal distribution at failure
7.5.2. Data-driven tensorized KDE
7.5.3. Methodology based on subset sampling and data-driven tensorized G-KDE
7.6. Application examples
7.6.1. Example #1: a polynomial function toy-case
7.6.2. Example #2: a truss structure
7.6.3. Example #3: application to a launch vehicle stage fallback zone estimation
7.6.4. Summary about numerical results and discussion
7.7. Conclusion
7.8. Acknowledgments
7.9. References
Chapter 8. Stochastic Multiobjective Optimization: A Descent Algorithm
Quentin MERCIER and Fabrice POIRION
8.1. Introduction
8.2. Mathematical refresher
8.2.1. Stochastic processes
8.2.2. Convex analysis
8.3. Multiobjective optimization and common descent vector
8.3.1. Binary relations
8.3.2. Multiobjective optimization, Pareto preorder
8.3.3. Common descent vector
8.4. Descent algorithm for multiobjective optimization and its extension to the stochastic framework
8.4.1. Multiple gradient descent algorithm
8.4.2. Stochastic multiple gradient descent algorithm
8.5. Illustrations
8.5.1. Performance of the SMGDA algorithm
8.5.2. Multiobjective approach to RBDO problems
8.5.3. Rewriting the probabilistic constraint
8.6. References
List of Authors
Index
Christian GOGU
Clément Ader Institute, Paul Sabatier University, Toulouse, France
Several decades ago, the scientific and engineering community began to recognize the value of considering uncertainties in the design, optimization and risk analysis of complex systems, such as aircraft, space vehicles or nuclear power plants (Wagner 2003; Lemaire 2014). These uncertainties can manifest themselves in many forms and originate from many different sources, several of which are typical in the field of mechanics.
With respect to these different sources of uncertainty, a distinction is often made between aleatory and epistemic uncertainties (Vose 2008; National Research Council 2009), although this distinction is debatable, as will be discussed in more detail in section 1.2.
Aleatory uncertainty is also referred to as irreducible uncertainty, stochastic uncertainty, inherent uncertainty or type I uncertainty. This uncertainty typically arises from environmental stochasticity, fluctuations in time, variations in space, heterogeneities and other intrinsic differences in a system. It is often referred to as irreducible uncertainty because it cannot be reduced further except by modification of the problem under consideration. On the other hand, it can be better characterized when it is empirically estimated. For example, the characterization of the variability of a material property can be improved by increasing the number of samples used, allowing a better estimate of statistical properties such as the mean and standard deviation.
An example of aleatory uncertainty can be seen in an (unbiased) coin toss (that is, "heads or tails"). The intrinsic characteristics of the toss create uncertainty about its outcome, with a probability of 0.5 of obtaining tails. Assuming that tails is an undesirable outcome, it would be desirable to reduce the probability of obtaining tails, and thus the uncertainty about achieving the desired outcome (obtaining heads). However, without breaking the rules of the game, that is, without modifying the problem under consideration, it is not possible to reduce this uncertainty, hence the term "irreducible uncertainty". On the other hand, if this uncertainty is characterized empirically on the basis of several tosses, it can clearly be characterized better by increasing the number of tosses.
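This distinction can be made concrete with a short simulation (an illustrative sketch, not taken from the chapter): the aleatory uncertainty of a single toss is fixed at p = 0.5, but the empirical characterization of p sharpens as the number of tosses grows, since the standard error of the estimate shrinks as 1/sqrt(n).

```python
import random

def estimate_tails_probability(n_tosses, seed=0):
    """Estimate P(tails) for a fair coin from n_tosses simulated tosses."""
    rng = random.Random(seed)
    tails = sum(rng.random() < 0.5 for _ in range(n_tosses))
    return tails / n_tosses

# The uncertainty of one toss never changes, but our *characterization*
# of p improves with the sample size: the standard error of the
# estimator of a proportion is sqrt(p * (1 - p) / n).
for n in (10, 1000, 100000):
    p_hat = estimate_tails_probability(n)
    se = (0.5 * 0.5 / n) ** 0.5
    print(f"n={n:6d}  p_hat={p_hat:.3f}  standard error ~ {se:.4f}")
```

No number of tosses makes the next toss any less random; the simulation only narrows the estimate of the fixed probability.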
In terms of an engineering example, the amplitude of the gusts that an aircraft is likely to encounter during its lifetime can be seen as an aleatory, or irreducible, uncertainty. Indeed, when designing a new model of aircraft, it would be desirable for the engineer to reduce this uncertainty as much as possible in order to reduce the weight of the aircraft structure. Unfortunately, since this uncertainty is essentially related to environmental stochasticity, the engineer has no way to reduce it without changing the design being considered; the uncertainty is therefore seen as irreducible. However, it could perhaps be reduced by changing the problem under study. For example, the engineer might want to consider installing sensors that allow the aircraft to detect the amplitude of turbulence along its trajectory in advance, thus allowing pilots to undertake avoidance maneuvers. Nevertheless, such a choice would have numerous and serious consequences. (Would such an aircraft be certifiable? Is it more economical to avoid turbulence rather than to design the aircraft to withstand it? Would passenger comfort be satisfactory? etc.) Usually, the problem is thus considered fixed, and the reducible or irreducible nature of the uncertainties is assessed for that given problem.
Epistemic uncertainty, also called reducible uncertainty or type II uncertainty, is the result of a lack of knowledge. This type of uncertainty is usually associated with measurement uncertainty, a small number of experimental data, censored data, the unobservability of phenomena or scientific ignorance, all of which, in general terms, amount to a lack of knowledge of one kind or another. It is often referred to as reducible uncertainty, as it can potentially be reduced by additional actions (for example, additional studies), leading to an improvement in knowledge. It should be noted that once epistemic uncertainty has been reduced, aleatory uncertainty may remain, which could then become the predominant uncertainty and would therefore be irreducible.
An example of epistemic uncertainty would be that associated with the estimation of the age (in years) of an individual. Let us assume that we have just heard on the radio that a Nobel Prize winning scientist is going to give an acceptance speech in our town and we would like to know the age of the scientist. There is no mention of it on the radio. At this point, let us say that we can only estimate the age of this person as being between 30 and 90 years old, so there is a lot of uncertainty about their age. However, as mentioned earlier, epistemic uncertainty can be reduced through improved knowledge. If we attend the acceptance speech, we will have the opportunity to see this person, which could allow us to reduce the uncertainty about their age to between, let us say, 50 and 60 years of age. If we go to talk to them after the speech, we may even be able to get additional information that will allow us to further reduce the uncertainty about their age. In this case, the uncertainty could even be reduced to zero if we can find out their date of birth.
A typical example of epistemic uncertainty in engineering problems is measurement uncertainty. As with the scientist's age, the quantity to be measured has a true value that is fixed (considering measurements at the macroscopic, not the quantum, scale). Nevertheless, measurement instrumentation usually only allows this quantity to be determined with uncertainty. This uncertainty can be reduced by developing better instruments, based on a better knowledge of the measurement phenomena involved, hence the term reducible uncertainty, although, in general, this does not mean that the uncertainty can be reduced to zero.
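A minimal sketch of this reduction (an illustration under assumed values, not from the chapter): the quantity has one fixed true value, and averaging repeated noisy readings shrinks the epistemic uncertainty on that value by a factor of 1/sqrt(n), while the true value itself never changes.

```python
import random
import statistics

def averaged_measurement(true_value, instrument_sigma, n, seed=1):
    """Simulate n noisy readings of a fixed quantity and average them.

    The true value is fixed (no aleatory variability here); only our
    knowledge of it is uncertain, and it improves as n grows.
    """
    rng = random.Random(seed)
    readings = [true_value + rng.gauss(0.0, instrument_sigma) for _ in range(n)]
    return statistics.mean(readings)

TRUE_LENGTH = 12.70  # hypothetical fixed quantity, in mm
for n in (1, 25, 625):
    est = averaged_measurement(TRUE_LENGTH, instrument_sigma=0.5, n=n)
    # standard error of the mean: sigma / sqrt(n)
    print(f"n={n:3d}  estimate={est:.3f}  std. error ~ {0.5 / n ** 0.5:.3f}")
```

Note the residual floor: systematic instrument bias (assumed zero above) is not reduced by averaging, which is why the uncertainty cannot, in general, be driven to zero.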
Probability theory has historically provided the first framework for modeling and quantifying uncertainties. Today, it is generally accepted that aleatory uncertainties can be adequately handled by probability theory. While the probabilistic approach can also be used to model epistemic uncertainties, other alternative representations have been proposed for such uncertainties, such as interval analysis, fuzzy set theory, possibility theory or evidence theory. These alternative approaches address, in particular, the need to quantify uncertainty when little or no data (either numerical or experimental) are available. Attempts to unify all of these approaches under a generalized theory of imprecise probabilities have also been undertaken (Walley 2000; Klir 2004), without, however, leading to fully satisfactory approaches.
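The flavor of one such alternative, the probability box (p-box) of section 1.4, can be sketched in a few lines (an illustrative example with hypothetical values, not the chapter's own formulation): when a distribution parameter is only known to lie in an interval, the CDF is no longer unique but bounded between two envelope curves.

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of a normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def p_box(x, mu_low, mu_high, sigma):
    """Lower/upper CDF bounds at x for a normal variable whose mean is
    only known (epistemically) to lie in [mu_low, mu_high], sigma fixed.

    For fixed x, the normal CDF decreases as mu increases, so the
    envelope is attained at the interval endpoints."""
    lower = normal_cdf(x, mu_high, sigma)
    upper = normal_cdf(x, mu_low, sigma)
    return lower, upper

# Aleatory scatter (sigma = 1) plus epistemic ignorance of the mean:
lo, hi = p_box(x=0.5, mu_low=0.0, mu_high=1.0, sigma=1.0)
print(f"P(X <= 0.5) is only bounded: [{lo:.3f}, {hi:.3f}]")
```

The width of the interval [lo, hi] reflects the epistemic component: collecting data that narrows [mu_low, mu_high] tightens the p-box, while the aleatory scatter sigma remains.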
The purpose of this chapter is therefore to provide an overview of some of the different approaches used to represent and quantify uncertainties, both aleatory and epistemic. It is organized as follows. In section 1.2, a discussion about the need to distinguish between epistemic and aleatory uncertainty is presented. In section 1.3, an illustration of the probabilistic modeling approach is provided, including illustrations of some cases where its use may be problematic for representing epistemic uncertainty. In section 1.4, p-box theory is presented, which is an extension of probability theory that is designed to better address problematic cases in modeling epistemic uncertainties. In section 1.5, interval analysis is briefly discussed and in section 1.6 fuzzy set theory is addressed. In section 1.7, possibility theory is introduced, while evidence theory (or Dempster-Shafer theory) is presented in section 1.8. Some concluding remarks and discussions are provided in section 1.9.
The need to classify uncertainty into epistemic and aleatory has been the subject of much debate and discussion (Hoffman and Hammonds 1994; Apostolakis 1999; Der Kiureghian and Ditlevsen 2009; Lemaire 2014). It therefore seemed useful to devote a section to the challenges and the usefulness of this distinction. This debate goes back to the controversy between Niels Bohr and Albert Einstein concerning the nature of the randomness observed at the quantum scale. For the former, it was a fundamental randomness, while for the latter, it was linked to an incomplete knowledge of phenomena at these scales. To illustrate his point of view, Einstein stated that "God does not play dice". Subsequent theoretical and experimental work by physicists seems to have proved Niels Bohr right: the randomness observed at these scales is intrinsic to quantum nature. Now that the debate at the microscopic scale has been closed, one may raise the same...