Computer Simulation Validation (eBook)

Fundamental Concepts, Methodological Frameworks, and Philosophical Perspectives
eBook Download: PDF
2019 | 1st edition
XIII, 1056 pages
Springer-Verlag
978-3-319-70766-2 (ISBN)

234.33 incl. VAT
  • Download available immediately

This unique volume introduces and discusses the methods of validating computer simulations in scientific research. The core concepts, strategies, and techniques of validation are explained by an international team of pre-eminent authorities, drawing on expertise from various fields ranging from engineering and the physical sciences to the social sciences and history. The work also offers new and original philosophical perspectives on the validation of simulations.

Topics and features:
  • introduces the fundamental concepts and principles related to the validation of computer simulations, and examines philosophical frameworks for thinking about validation;
  • provides an overview of the various strategies and techniques available for validating simulations, as well as the preparatory steps that have to be taken prior to validation;
  • describes commonly used reference points and mathematical frameworks applicable to simulation validation;
  • reviews the legal prescriptions, and the administrative and procedural activities related to simulation validation;
  • presents examples of best practice that demonstrate how methods of validation are applied in various disciplines and with different types of simulation models;
  • covers important practical challenges faced by simulation scientists when applying validation methods and techniques;
  • offers a selection of general philosophical reflections that explore the significance of validation from a broader perspective.

This truly interdisciplinary handbook will appeal to a broad audience, from professional scientists spanning all natural and social sciences, to young scholars new to research with computer simulations. Philosophers of science and methodologists seeking to increase their understanding of simulation validation will also find much of value in the text.



Prof. Dr. Dr. Claus Beisbart is Professor of Philosophy of Science (Extraordinarius) at the Institute of Philosophy, University of Bern, Switzerland.

Prof. Dr. Nicole J. Saam is Professor of Sociology (Chair) at the Institute of Sociology, Friedrich-Alexander University Erlangen-Nürnberg, Germany.

Preface 6
Contents 8
Contributors 12
1 Introduction: Computer Simulation Validation 15
1.1 Introduction 15
1.2 Goals and Readership of this Handbook 17
1.3 Structure and Topics 19
1.3.1 Foundations (Parts I–II) 20
1.3.2 Methodology (Parts III–VI) 24
1.3.3 Validation at Work—Best Practice Examples (Part VII) 31
1.3.4 Challenges in Simulation Model Validation (Part VIII) 34
1.3.5 Reflecting on Simulation Validation: Philosophical Perspectives and Discussion Points (Part IX) 37
1.4 Outlook 41
References 43
Foundations—Basic Conceptions in Simulation Model Validation 46
2 What is Validation of Computer Simulations? Toward a Clarification of the Concept of Validation and of Related Notions 47
2.1 Introduction 48
2.2 Preliminaries 49
2.3 Influential Definitions of Validating Computer Simulations 52
2.4 Discussion of the Definitions 53
2.4.1 Commonalities 53
2.4.2 Difference 1: The Object of the Validation 54
2.4.3 Difference 2: The Standard of Evaluation 58
2.4.4 Difference 3: Type of Evaluation 68
2.4.5 Difference 4: Cogency (Degree of Credibility) 69
2.4.6 Difference 5: Empirical Methodology 71
2.5 Conclusions 76
References 77
3 Simulation Accuracy, Uncertainty, and Predictive Capability: A Physical Sciences Perspective 80
3.1 Introduction 81
3.2 Foundational Issues in Simulation Credibility 83
3.3 Verification Activities 86
3.3.1 Code Verification 86
3.3.2 Solution Verification 88
3.4 Validation, Calibration, and Prediction 90
3.4.1 Model Validation 90
3.4.2 Calibration and Predictive Capability 97
3.5 Concluding Remarks 104
References 105
4 Verification and Validation Principles from a Systems Perspective 109
4.1 Introduction 109
4.2 Approaches to Verification 114
4.3 Approaches to Validation 118
4.3.1 Quantitative Approaches to Validation 120
4.3.2 Qualitative Methods: Face Validation Approaches 122
4.3.3 Validation of Library Sub-models and Generic Models 123
4.4 Acceptance or Upgrading of Simulation Models 124
4.5 Discussion 125
References 127
5 Errors and Uncertainties: Their Sources and Treatment 129
5.1 Introduction 129
5.2 Verification-Related Errors/Uncertainties 131
5.2.1 Discrete Algorithm Choice and Software Programming 132
5.2.2 Numerical Approximation Errors 134
5.2.3 Conversion of Numerical Errors into Uncertainties 137
5.2.4 Estimating Total Numerical Uncertainty 138
5.3 Validation-Related Errors/Uncertainties 139
5.3.1 Experimental Measurement 139
5.3.2 Model Validation 140
5.3.3 Model Calibration 141
5.3.4 Extrapolation 142
5.4 Uncertainty Propagation-Related Uncertainties 143
5.4.1 Model Inputs 144
5.4.2 Model Parameters (i.e., Parametric Uncertainty) 145
5.5 Total Prediction Uncertainty 146
5.6 Discussion 148
5.7 Conclusions 149
References 149
Foundations—Validation as a Scientific Method: Philosophical Frameworks for Thinking about Validation 152
6 Invalidation of Models and Fitness-for-Purpose: A Rejectionist Approach 153
6.1 Setting the Scene for Model Evaluation 154
6.2 The Falsification Framework of Karl Popper 156
6.3 Simulation Models, Invalidation and Falsification 158
6.4 Fitness-for-Purpose, Verisimilitude and Likelihood 160
6.5 If All Models May Be False, When Can They Be Considered Useful? 162
6.6 Defining Fitness-for-Purpose and Model Invalidation 165
6.6.1 Using Bayes Ratios to Differentiate Between Models 168
6.6.2 Use of Implausibility Measures to Differentiate Between Models 169
6.6.3 Use of Limits of Acceptability to Define Behavioural Models 170
6.7 Epistemic Uncertainties and Model Invalidation 171
6.8 The Model Advocacy Problem 172
6.9 Conclusions 174
References 175
7 Simulation Validation from a Bayesian Perspective 180
7.1 Introduction 180
7.2 The Fundamentals of Bayesian Epistemology 182
7.2.1 Basic Tenets of Bayesian Epistemology 182
7.2.2 A Brief Discussion of Bayesian Epistemology 187
7.3 Bayesian Epistemology and the Validation of Computer Simulations 189
7.3.1 Data-Driven Validation 190
7.3.2 The Problem of the Priors in Validation 194
7.4 Discussion 199
7.5 Conclusions 205
References 205
8 Validation of Computer Simulations from a Kuhnian Perspective 209
8.1 Introduction 210
8.2 Kuhn's Philosophy of Science 211
8.3 A Revolution, but not a Kuhnian Revolution: Computer Simulations in Science 214
8.4 Validation of Simulations from a Kuhnian Perspective 215
8.4.1 Do Computer Simulations Require a New Paradigm of Validation? 216
8.4.2 Validation of Simulations and the Duhem–Quine Thesis 219
8.4.3 Validation of Social Simulations 221
8.5 Summary and Conclusions 227
References 228
9 Understanding Simulation Validation—The Hermeneutic Perspective 231
9.1 Introduction 231
9.2 Hermeneutics in Versus Hermeneutics of Validation 233
9.3 Hermeneutics in Validation 234
9.3.1 Hermeneutics According to Kleindorfer, O’Neill and Ganeshan 235
9.3.2 A Reply to Kleindorfer, O’Neill and Ganeshan 237
9.3.3 Claim C-Open—A Second View 243
9.4 Hermeneutics of Validation 243
9.4.1 The Requirement of a Hermeneutic Situation 244
9.4.2 Hermeneutic Naiveté Versus Hermeneutic Consciousness 245
9.4.3 Interdisciplinary Dialogue 245
9.4.4 The Hermeneutic Tasks 247
9.5 Discussion 249
9.6 Conclusions 250
References 251
Methodology—Preparatory Steps 253
10 Assessing the Credibility of Conceptual Models 254
10.1 Introduction 255
10.2 Simulation-Based Knowledge, Verification, Validity, and the Function of Models 256
10.3 Taking the Notion of “Credibility” Seriously 261
10.4 The Credibility of Models: Lessons from Scientific Practice 263
10.5 Empirical Fit and Causal Understanding 266
10.6 Models and the Exploration of Credible Worlds 269
10.7 Summary 272
References 273
11 The Foundations of Verification in Modeling and Simulation 275
11.1 Verification in Modeling and Simulation 275
11.2 Code Verification 277
11.3 Types of Code Verification Problems and Associated Benchmarks 281
11.4 Solution Verification 285
11.5 Solution Verification for Complex Problems 289
11.6 Conclusion and Prospectus 291
References 296
12 The Method of Manufactured Solutions for Code Verification 298
12.1 Introduction 298
12.2 Broad Description of MMS 300
12.3 Three Example Problems in MMS 301
12.3.1 Example 1 301
12.3.2 Example 2 302
12.3.3 Example 3 303
12.3.4 Complex Problems 304
12.4 Application to Code Verification 304
12.5 Features and Examples of MMS Code Verification 307
12.5.1 Radiation Transport Codes 307
12.5.2 Nonhomogeneous and Nonlinear Boundary Conditions 308
12.5.3 Shocks, Partitioning, and “Glass-Box” Verification 308
12.5.4 Shocks, Multiphase Flows, and Discontinuous Properties 309
12.5.5 Verification of Boundary Conditions 310
12.5.6 Unsteady Flows and Divergence-Free MMS 310
12.5.7 Variable Density Flows and Combustion
12.6 Attributes of MMS Code Verification 311
12.6.1 Two Multidimensional Aspects 311
12.6.2 Blind Study 311
12.6.3 Burden of MMS and Option Combinations 312
12.6.4 Code Verification for Commercial Codes 312
12.6.5 Code Verification with a Strong Completion Point 313
12.6.6 Proof? 313
12.6.7 Mere Mathematics 314
12.6.8 Irrelevance of Solution Realism to Code Verification 315
12.7 Reasons for Solution Realism in MMS 315
12.7.1 Realistic MMS in Code Verification of Glacial Ice Flow Modeling 316
12.7.2 Realistic MMS in Solution Verifications and Turbulence Models 316
12.7.3 Realistic MMS in Singularity Studies 317
12.7.4 Other Uses and Generation Methods for Realistic MMS 317
12.8 Alternative Formulations and General References for MMS 318
12.9 Conclusion 318
References 319
13 Validation Metrics: A Case for Pattern-Based Methods 322
13.1 Introduction 322
13.2 Validation Metrics 324
13.2.1 Four Types of Measurement Scales 324
13.2.2 The Desirable Properties of a Validation Metric 325
13.3 Four Families of Validation Measures 326
13.3.1 Empirical Likelihood Measures 326
13.3.2 Stochastic Area Measures 327
13.3.3 Pattern-Based Measures I: Information-Theoretic Measures 328
13.3.4 Pattern-Based Measures II: Strategic State Measures 328
13.4 Measures of Closeness or of Information Loss 329
13.4.1 Kullback–Leibler Information Loss 329
13.4.2 The Generalized Subtracted L divergence (GSL-div) 330
13.5 The Example: Models and Data 331
13.6 The State Similarity Measure (SSM) 332
13.6.1 Results for the Models 333
13.6.2 Monte Carlo Simulations of the SSM 333
13.7 Classical Possibility Theory 334
13.7.1 The Generalized Hartley Measure (GHM) for Graded Possibilities 335
13.7.2 Applying U-Uncertainty to Our Data 336
13.8 Comparing the Distances Measured by SSM and GHM 338
13.9 Conclusions 339
References 340
14 Analysing Output from Stochastic Computer Simulations: An Overview 342
14.1 Introduction 342
14.2 Preliminaries 344
14.2.1 Definitions 344
14.2.2 Background Statistical Knowledge 344
14.2.3 Setting Up the Problem 346
14.3 Working with Terminating Simulations 346
14.4 Working with Non-terminating Simulations 347
14.4.1 Welch's Method 349
14.4.2 MSER-5 351
14.5 How Many Replications? 352
14.6 Making Comparisons 353
14.6.1 Comparing Two Systems 353
14.6.2 Comparing Many Systems 355
14.7 Conclusion 355
References 356
Methodology—Points of Reference and Related Techniques 357
15 The Use of Experimental Data in Simulation Model Validation 358
15.1 Introduction 358
15.2 Data Sets for Model Development and Testing 361
15.3 Comparison Methods for Model and Target System Data Sets 362
15.3.1 Graphical Methods for System and Model Data Comparisons 362
15.3.2 Some Quantitative Measures for System and Model Comparisons in the Time Domain 363
15.3.3 Frequency-Domain Measures and Comparisons 364
15.4 System Identification and Parameter Estimation in Model Validation 365
15.4.1 A Brief Overview of System Identification and Parameter Estimation 365
15.4.2 Issues of Identifiability 367
15.4.3 Applications of System Identification and Parameter Estimation to the Processes of Validation 369
15.5 Design of Experiments and Selection of Inputs for Model Testing 371
15.6 Model Structure Optimisation 373
15.7 Experimental Data for Validation: A Physiological Modelling Example 373
15.7.1 Experimental Constraints 377
15.7.2 Experimental Design and Test Signal 378
15.8 Discussion 380
15.9 Conclusions 381
References 382
16 How to Use and Derive Stylized Facts for Validating Simulation Models 384
16.1 Introduction 384
16.2 Epistemological Foundations of Stylized Facts 386
16.2.1 Development and Definition of the Stylized Facts Concept 386
16.2.2 Using Stylized Facts for Simulation Model Validation 388
16.3 Existing Approaches to Establish Stylized Facts 392
16.4 An Alternative Process to Derive Stylized Facts 396
16.5 Conclusion and Outlook 401
References 402
17 The Users’ Judgements—The Stakeholder Approach to Simulation Validation 405
17.1 Introduction 405
17.2 Action Research and the Use of Simulation Models 407
17.2.1 Meta-Theoretical Foundations of Action Research 408
17.2.2 The Validity of Action Research Knowledge 409
17.2.3 The Use of Simulation Models in Action Research and the Subject Matter of Their Validation 410
17.3 The Logical Empiricist Versus the Post-positivist Understanding of Validity 412
17.3.1 The Logical Empiricist Understanding of Validity 412
17.3.2 The Need for a Post-positivist Understanding of Validity 413
17.4 A General Definition of Simulation Validity 414
17.4.1 Definition 415
17.4.2 Application to Socio-Ecological Simulation Models in Action Research 416
17.5 Validation Techniques Related to the Stakeholder’s Judgements 422
17.5.1 Qualitative Interviewing 422
17.5.2 Focus Groups 423
17.5.3 Role-Playing Games 424
17.5.4 Inappropriate Techniques and Related Consequences 425
17.6 Discussion 425
17.7 Conclusions 428
17.8 Outlook 428
References 429
18 Validation Benchmarks and Related Metrics 432
18.1 Introduction 432
18.2 The Concept of Validation Benchmarks 434
18.2.1 Defining Validation Benchmarks 434
18.2.2 Motivations for Using Validation Benchmarks 435
18.2.3 Sources of Benchmarks 436
18.3 The Benchmarking Process 438
18.3.1 Types of Benchmarking 438
18.3.2 Criteria of Benchmark Selection 441
18.4 A Typology of Validation Benchmarks 443
18.4.1 Strong-Sense Benchmarks 443
18.4.2 Standard Benchmarks 445
18.4.3 Yardstick Benchmarks 446
18.4.4 Touchstone Benchmarks 446
18.5 Metrics Related to Benchmarking 447
18.5.1 Basic Concepts 448
18.5.2 Measures of Accuracy 449
18.5.3 Skill Scores 451
18.5.4 Murphy–Winkler Framework and Beyond 452
18.5.5 Holistic Measurement 452
18.6 Discussion 453
18.6.1 Normalizing Simulation Validation 453
18.6.2 The Social Character of Validation Benchmarks 453
18.6.3 Between Validation and Comparison—the Limitations of Benchmarking 455
18.6.4 The Price of Efficient Benchmarks 456
18.6.5 The Devaluation of Benchmarks Proper 456
18.7 Conclusions 457
References 459
Methodology—Mathematical Frameworks and Related Techniques 461
19 Testing Simulation Models Using Frequentist Statistics 462
19.1 Introduction 462
19.2 Frequentist Statistics 464
19.2.1 Important Background 464
19.2.2 Estimation 467
19.2.3 Models of Dependence 468
19.2.4 Null Hypothesis Significance Tests 468
19.3 Statistical Model Validation: Why and How? 471
19.3.1 Why Validate? 472
19.3.2 Estimating Goodness of Fit 472
19.3.3 Testing Goodness of Fit 473
19.3.4 Tests for Splitting and Tests for Lumping 474
19.3.5 Conceptual Entry Point: TOST 476
19.3.6 A Uniformly Most Powerful Invariant Test 478
19.3.7 More Descriptive: Test of Fidelity 478
19.3.8 Statistical Validation Overview 479
19.4 Examples 481
19.4.1 Fitness for Purpose 481
19.4.2 Validation of a Theoretical Model 484
19.5 Discussion 487
19.5.1 Generalizations 488
19.5.2 Significant and Important? 489
19.5.3 Nuisance Parameters 489
19.5.4 Bayesian or Frequentist Approach? 489
19.5.5 Conclusion 491
References 491
20 Validation Using Bayesian Methods 494
20.1 Introduction 494
20.2 Fundamentals 497
20.3 Bayesian Decision Rule 499
20.4 Bayesian Univariate Hypothesis Testing 501
20.5 Multivariate Bayesian Hypothesis Testing 502
20.6 A Bayesian Measure of Evidence 503
20.7 Bayes Network 505
20.8 Non-normal Data Transformation 506
20.9 Bayesian Model Validation Process 507
20.10 Numerical Application 510
20.10.1 Example 1: Bayesian Decision Rule 510
20.10.2 Example 2: Univariate Model Validation 514
20.10.3 Example 3: Multivariate Model Validation 516
20.11 Concluding Remarks 519
References 519
21 Imprecise Probabilities 522
21.1 Introduction 522
21.2 Basics 523
21.3 Examples 525
21.3.1 Unknown Parameters 525
21.3.2 The Challenge Problems 526
21.3.3 Nonprobabilistic Odds 527
21.4 Interpretations 528
21.4.1 One-Sided Betting 528
21.4.2 Indeterminate Belief 529
21.4.3 Robustness Analysis 530
21.4.4 Evidence Theory 530
21.5 Problems 531
21.5.1 Updating 531
21.5.2 Decision-Making 532
21.6 Validation and IP 533
21.6.1 Interpretations 534
21.6.2 Problems 534
21.7 Conclusion 535
References 535
22 Objective Uncertainty Quantification 538
22.1 Introduction 538
22.2 Gene Regulatory Networks 541
22.3 Optimal Operators 544
22.4 Optimal Intervention in Regulatory Networks 545
22.5 Intrinsically Bayesian Robust Operators 547
22.6 IBR Intervention in Regulatory Networks 550
22.7 Objective Cost of Uncertainty 550
22.8 Optimal Experimental Design for Regulatory Networks 552
22.9 Discussion 554
22.10 Conclusion 555
References 556
Methodology—The Organization and Management of Simulation Validation 558
23 Standards for Evaluation of Atmospheric Models in Environmental Meteorology 559
23.1 Introduction 560
23.2 Definitions Used 561
23.2.1 Specifics of an Atmospheric Model 561
23.2.2 Modeling 562
23.2.3 Guideline 563
23.2.4 Standard 563
23.2.5 Verification 563
23.2.6 Validation 564
23.2.7 Evaluation 564
23.2.8 Model Quality Indicator 565
23.2.9 Reference Data 567
23.3 From Guidelines to Standards 567
23.3.1 Historical Background 567
23.3.2 How to Achieve a Standard 569
23.4 Generic Structure of an Evaluation Guideline 570
23.4.1 Specification of Application Area 571
23.4.2 Evaluation Steps to be Performed by the Model Developer 571
23.4.3 Evaluation Steps to be Performed by the Model User 573
23.5 Examples for Standards 574
23.5.1 Comparing Application Areas of Two Standards 574
23.5.2 Detailed Specification of an Application Area 575
23.5.3 Some Detailed Evaluation Steps to be Performed by the Model Developer 576
23.5.4 Some Detailed Evaluation Steps to be Performed by the Model User 578
23.6 Conclusions 579
References 580
24 The Management of Simulation Validation 583
24.1 Introduction 583
24.2 Simulation Terminology 585
24.3 Principles of Simulation Validation 586
24.4 Management of Simulation V&V: A Framework
24.5 Process-Oriented Simulation V&V Management
24.5.1 Simulation Validation Steps 591
24.5.2 Simulation Verification Steps 593
24.6 Draw up an Optimized V&V Scheme
24.7 Quantify Simulation V&V Results
24.8 Computer Aided Management of Simulation V&V
24.8.1 Management Platform 597
24.8.2 Other Validation Tools 598
24.9 Discussion 599
24.10 Conclusions 599
References 600
25 Valid and Reproducible Simulation Studies—Making It Explicit 603
25.1 Introduction 603
25.2 Example: A Model of the Decision to Migrate 606
25.3 Managing the Model: Domain-Specific Modeling Languages 608
25.4 Managing an Experiment: Experiment Specification Languages 611
25.5 Managing a Simulation Study: Provenance Models 613
25.6 Discussion 619
25.7 Conclusion 619
References 620
Validation at Work—Best Practice Examples 624
26 Validation of Particle Physics Simulation 625
26.1 Introduction 625
26.2 What Particle Physics is About: Example LHC 626
26.2.1 The Status of the Standard Model 626
26.2.2 The Forefront Experiment: LHC 627
26.3 Data Analysis and the Use of Simulations 628
26.3.1 From Data to Physics 628
26.3.2 The Role of Simulation for Data Analysis 629
26.4 Modeling the LHC Processes 630
26.4.1 The Matrix Element of the Hard Collision 631
26.4.2 Parton Distribution Functions: Dressing the Initial State 632
26.4.3 Dressing of the Outgoing Partons 632
26.5 Detector Simulation 633
26.6 Principles of Validation and Uncertainties 636
26.6.1 Factorization of Migration 637
26.6.2 Is Factorization Correct? 638
26.7 General Procedures of Validation in Particle Physics 638
26.8 Validation of the Physics Generators 639
26.8.1 pdfs 640
26.8.2 Pile-up in pp Scattering 642
26.9 Validation of Detector Simulation 643
26.9.1 Testing the Detector Geometry 643
26.9.2 Validation of Electron Simulation 644
26.10 How Simulation is Applied in Data Analysis 646
26.10.1 Measurement of the Higgs Cross Section 646
26.10.2 Search for a Stop Quark 647
26.11 Discussion 650
26.12 Summary and Conclusion 651
References 652
27 Validation in Fluid Dynamics and Related Fields 655
27.1 Fluid Dynamics and Related Fields 655
27.1.1 Weak Models, Strong Models, and RANS Turbulence Models 656
27.2 Separation of Verification and Validation 657
27.3 Errors and Uncertainties 657
27.4 Validation—What Does It Mean? 658
27.4.1 Issue #1. Acceptability (Pass/Fail) Criteria 659
27.4.2 Issue #2. Necessity for Experimental Data 660
27.4.3 Issue #3. Intended Use 660
27.4.4 Issue #4. The Prediction Issue 661
27.5 Validation Methodology Based on ASME V&V 20-2009
27.5.1 ASME V&V 20-2009 Background, Motivation, and Philosophy
27.5.2 Validation Metrics 662
27.5.3 Defining Validation Uncertainty Uval 663
27.5.4 Estimating Validation Uncertainty 664
27.5.5 Interpretation of Validation Results and Caveats 665
27.5.6 Observations 667
27.5.7 Importance of Case 2 668
27.5.8 Model Quality Versus Validation Quality 668
27.5.9 Forthcoming Addenda to V&V 20-2009
27.6 Model Form Errors Versus Parameter Errors 669
27.7 Model Form Uncertainty and Probability Distribution Functions 670
27.8 Weakest Link in Validation Practice 671
27.9 New Paradigm of Experiments Designed Specifically for Validation 672
27.10 Unrealistic Expectations Placed on Experimentalists 672
27.11 Can Models be Validated? A Discussion of Falsificationism Versus Validation 673
27.11.1 Truth Versus Accuracy 674
27.11.2 Summary of Falsificationism Versus Validation 675
References 676
28 Astrophysical Validation 678
28.1 Introduction 678
28.2 Approach to Verification and Validation 679
28.3 Simulation Instruments 681
28.3.1 The Flash Code 682
28.3.2 The Postprocessing Toolkit 684
28.3.3 Simulating Reactive Flow 684
28.4 Validation Examples 685
28.4.1 Overview of Flash Problems 686
28.4.2 Shocks and Fluid Instabilities 687
28.4.3 Computation of Reaction Products in Large Eddy Simulations of Supernovae 693
28.5 Discussion 697
28.6 Conclusions 699
References 699
29 Validation in Weather Forecasting 703
29.1 Introduction 703
29.2 Setting the Scene 705
29.2.1 The Atmospheric Model: State of the Art 705
29.2.2 Intended Use of the Simulation Output 709
29.3 Validation Concepts 710
29.3.1 Idealized Tests for the Verification of the Dynamical Core 711
29.3.2 Validation of Parameterizations 717
29.3.3 Comparison to Observations 718
29.4 Uncertainty Estimation via Ensemble Forecasting 723
29.5 Discussion and Summary 724
References 726
30 Validation of Climate Models: An Essential Practice 729
30.1 Introduction 729
30.2 Climate Model Validation: Emergence of Definition and Community Practice 731
30.3 Definition of Terms 734
30.4 Model Construction, Observations, Assimilation: Roles in Validation 737
30.5 Validation of Climate Models in Practice 739
30.5.1 Independence, Transparency, and Objectivity: Basic Values of Verification and Validation 740
30.5.2 Identification of Independent Observational Data 741
30.5.3 Deliberative Validation and Expert Judgment 742
30.5.4 Quantitative Evaluation 745
30.6 Discussion 750
30.7 Conclusion 751
References 752
31 Validation of Agent-Based Models in Economics and Finance 755
31.1 Introduction 756
31.2 Agent-Based Computational Economics: Common Practices 756
31.2.1 The Development of a Typical Agent-Based Model 757
31.2.2 Inputs of Agent-Based Models 759
31.2.3 Outputs of Agent-Based Models 760
31.2.4 Relation Between Input and Output 761
31.3 Agent-Based Model Validation: Theoretical Framework 761
31.4 Agent-Based Model Validation: Literature Review 763
31.4.1 Calibration and Estimation 765
31.4.2 Validation 767
31.5 A New Wave of Validation Approaches 768
31.5.1 Validation As Replication of Time Series Dynamics 769
31.5.2 Validation as Matching of Causation 770
31.5.3 Global Sensitivity Analysis via Kriging Meta-Modeling 771
31.5.4 Parameter Space Exploration and Calibration via Machine-Learning Surrogates 772
31.6 Conclusions 773
References 774
Challenges in Simulation Model Validation 780
32 Validation and Equifinality 781
32.1 Introduction 781
32.2 The Origins of Equifinality Concepts 783
32.3 Equifinality as an Empirical Result 783
32.4 Equifinality in Model Calibration in the Inexact Sciences 787
32.5 Equifinality as Behavioural Model Ensembles 788
32.6 Defining a Model Likelihood 791
32.7 Equifinality and Model Validation in the Inexact Sciences 794
32.8 Discussion 795
References 796
33 Validation and Over-Parameterization—Experiences from Hydrological Modeling 800
33.1 Introduction 800
33.1.1 Over-Parameterization Terminology 801
33.1.2 Main Types of Hydrological Models 803
33.1.3 Peculiarities of Hydrological Models 806
33.2 Types of Validation in Hydrological Modeling 807
33.2.1 Validation Based on Independent Time Periods 807
33.2.2 Validation Based on Independent Catchments 808
33.2.3 Validation Based on Independent Variables 808
33.3 Conclusions—All Models Are Wrong, but Which Are Useful? 813
Textbox: Short Description of Catchment Hydrology 815
References 816
34 Uncertainty Quantification Using Multiple Models—Prospects and Challenges 824
34.1 Introduction 824
34.2 Challenges for Uncertainty Quantification in Climate Modeling 826
34.3 Uncertainty Quantification Using Model Ensembles 828
34.4 Problems with Model Democracy 830
34.5 Beyond Model Democracy 832
34.6 Illustration of Model Weighting for Arctic Sea Ice 834
34.7 Discussion and Open Issues 836
34.8 Conclusion 840
References 841
35 Challenges to Simulation Validation in the Social Sciences. A Critical Rationalist Perspective 845
35.1 Introduction 845
35.2 Illustrative Example: Models of Social Influence 849
35.3 Challenges to Model Validation 852
35.3.1 Obscure Concepts 852
35.3.2 Abundance of Latent Concepts 854
35.3.3 Representation of Time 856
35.3.4 Interplay of Multiple Processes 857
35.3.5 Context Characteristics Matter 859
35.4 Discussion 861
35.4.1 Compare Models and Identify Critical Assumptions! 861
35.4.2 Defend Your Assumptions! 862
35.4.3 Explore Model Scope and Its Boundaries! 863
35.4.4 More Validation! 863
References 864
36 Validation and the Uniqueness of Historical Events 868
36.1 A Brief History of Simulation’s Semantics 870
36.2 History 871
36.3 Challenges 873
36.4 Uses and Potentials of Simulations in History 877
36.4.1 Big-Data and Longue Durée History 877
36.4.2 Microhistorical Research and Simulation 878
36.4.3 Digital Games and Simulation Games 879
36.5 Conclusion and Outlook 880
References 882
Reflecting on Simulation Validation: Philosophical Perspectives and Discussion Points 885
37 What is a Computer Simulation and What does this Mean for Simulation Validation? 886
37.1 Introduction 887
37.2 Preliminaries 888
37.3 Computer Simulations and Experiments 890
37.3.1 Computer Simulations as Experiments 890
37.3.2 Computer Simulations as Modeled Experiments 892
37.4 Computer Simulations, Thought Experiments and Argumentation 895
37.5 Models and Simulations 900
37.6 Conclusions 904
References 906
38 How Do the Validations of Simulations and Experiments Compare? 909
38.1 Introduction 909
38.2 Epistemology and Methodology of Validation 911
38.2.1 The Concept of Validation 911
38.2.2 Epistemology and Methodology 916
38.3 Illustration: Validation of Experiments and Simulations in the Field of Evolution 920
38.4 Discussion and Conclusion 925
References 925
39 How Does Holism Challenge the Validation of Computer Simulation? 927
39.1 Introduction 927
39.2 Holism and Modularity—Two Counteracting Concepts 929
39.2.1 Modularity—The Rational Picture 929
39.2.2 Holism—A Multifaceted Challenge 932
39.3 The Challenge Arising from Parameterization and Tuning 933
39.4 The Challenge from Kluging 937
39.5 The Limits of Validation 941
References 943
40 What Types of Values Enter Simulation Validation and What Are Their Roles? 945
40.1 Introduction 945
40.2 The Framework 946
40.3 A Defense of Epistemic Values that Assess the Credibility of Simulation Results 949
40.4 Roles of Cognitive and Social Values in Assessing the Credibility of Simulation Results 951
40.4.1 Assistance in the Assessment of Performance in Terms of Epistemic Values 951
40.4.2 Determining Minimal Probabilities for Accepting or Rejecting a Hypothesis 952
40.5 Roles of Cognitive and Social Values in Assessments of the Usefulness of Simulation Models 954
40.5.1 Accounting for the Practicability of Simulation Models 955
40.5.2 Accounting for the Relevance of Simulation Models 956
40.6 Simulation Validation as a Multi-criteria Assessment 958
40.7 Summary and Conclusion 959
References 961
41 Calibration, Validation, and Confirmation 964
41.1 Introduction 964
41.2 Computer Simulations, and Calibration 965
41.2.1 Calibration, Verification, and Validation 965
41.2.2 Adequacy for Purpose 970
41.3 Predictivism 972
41.3.1 The Paradox of Predictivism 972
41.3.2 Bayesian Confirmation Theory and the Problem of Old Evidence 974
41.3.3 Validation and Confirmation 977
41.4 The Problem of Old Evidence and Model Calibration 977
41.4.1 The Static Problem of Old Evidence 977
41.4.2 The Dynamic Problem of Old Evidence 978
41.4.3 An Argument for Predictivism 979
41.4.4 A Novel Bayesian Argument for Predictivism 981
41.5 Conclusion 983
Appendix 984
References 985
42 Should Validation and Verification be Separated Strictly? 988
42.1 Introduction 989
42.2 Preliminaries 990
42.2.1 Scientific Methods 990
42.2.2 Verification and Validation 991
42.3 The Distinction Between Verification and Validation 994
42.3.1 Verification as a Means of Validation of the Computational Model? 996
42.3.2 Verification as Means for Validation of the Conceptual Model? 999
42.4 Arguments Against a Clean Separation Between Verification and Validation 1003
42.4.1 The Separation Between Verification and Validation 1005
42.4.2 Verification and Mathematics 1008
42.5 Conclusions 1009
References 1010
43 The Multidimensional Epistemology of Computer Simulations: Novel Issues and the Need to Avoid the Drunkard’s Search Fallacy 1012
43.1 Introduction: Computer Simulations, a Revolutionary Epistemology? 1013
43.2 Methodological and Conceptual Preliminaries 1014
43.3 Dimensions of Computational Inquiries, or Where Things Can Go Wrong Epistemically 1017
43.3.1 The Production of Computational Results: Can We Control the Beast? 1018
43.3.2 The Reception and Post Hoc Assessment of Computational Results 1026
43.4 Should Epistemologists of Science Bother, After All? 1030
43.4.1 Target Models, Actually Investigated Models, and Failure 1030
43.4.2 The Valuable Redundancy Argument 1031
43.4.3 The Procrustean Objection 1031
43.4.4 The Absence of Data Argument and the Ostrich Strategy 1033
43.5 Conclusion and Moral 1034
References 1036
Index 1039

Published (per publisher): 9 April 2019
Series: Simulation Foundations, Methods and Applications
Additional info: XIII, 1074 p., 105 illus., 58 illus. in color
Language: English
Subject areas:
  Humanities › Philosophy › Epistemology / Philosophy of Science
  Mathematics / Computer Science › Computer Science
  Mathematics / Computer Science › Mathematics
Keywords: computer simulation • Epistemology • error • Methodology • Simulation Model • Uncertainty • Validation • verification
ISBN-10 3-319-70766-3 / 3319707663
ISBN-13 978-3-319-70766-2 / 9783319707662
PDF (watermarked)
Size: 22.4 MB

DRM: Digital watermark
This eBook contains a digital watermark and is therefore personalized to you. If the eBook is improperly passed on to third parties, it can be traced back to its source.

File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly suitable for reference and technical books with columns, tables, and figures. A PDF can be displayed on almost all devices, but is only of limited use on small displays (smartphone, e-reader).

System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You will need a PDF viewer, e.g. Adobe Reader or Adobe Digital Editions.
eReader: This eBook can be read with (almost) all eBook readers. However, it is not compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You will need a PDF viewer, e.g. the free Adobe Digital Editions app.

Additional feature: Online reading
In addition to downloading it, you can also read this eBook online in your web browser.

Buying eBooks from abroad
For tax law reasons, we can only sell eBooks within Germany and Switzerland. Regrettably, we cannot fulfill eBook orders from other countries.
