Introduction to Probability Models - Sheldon M. Ross

Introduction to Probability Models (eBook)

eBook Download: EPUB
2006 | 10th edition
800 pages
Elsevier Science (publisher)
978-0-12-375687-9 (ISBN)
System requirements
77.35 incl. VAT
  • Download available immediately
Introduction to Probability Models, Tenth Edition, provides an introduction to elementary probability theory and stochastic processes. There are two approaches to the study of probability theory. One is heuristic and nonrigorous, and attempts to develop in students an intuitive feel for the subject that enables them to think probabilistically. The other approach attempts a rigorous development of probability by using the tools of measure theory. The first approach is employed in this text.

The book begins by introducing basic concepts of probability theory, such as the random variable, conditional probability, and conditional expectation. This is followed by discussions of stochastic processes, including Markov chains and Poisson processes. The remaining chapters cover queueing, reliability theory, Brownian motion, and simulation. Many examples are worked out throughout the text, along with exercises to be solved by students.
This book will be particularly useful to those interested in learning how probability theory can be applied to the study of phenomena in fields such as engineering, computer science, management science, the physical and social sciences, and operations research. Ideally, this text would be used in a one-year course in probability models, a one-semester course in introductory probability theory, or a course in elementary stochastic processes.

New to this Edition:

  • 65% new chapter material, including coverage of finite capacity queues, insurance risk models, and Markov chains
  • Contains compulsory material for the new Exam 3 of the Society of Actuaries, including several sections that appear in the new exams
  • Updated data; a list of commonly used notations and equations; and a robust ancillary package, including an ISM, an SSM, a test bank, and a companion website
  • Includes the SPSS PASW Modeler and SAS JMP software packages, which are widely used in the field

Hallmark features:

  • Superior writing style
  • Excellent exercises and examples covering a wide breadth of probability topics
  • Real-world applications in engineering, science, business and economics


Sheldon M. Ross is a professor in the Department of Industrial Engineering and Operations Research at the University of Southern California. He received his Ph.D. in statistics at Stanford University in 1968. He has published many technical articles and textbooks in the areas of statistics and applied probability. Among his texts are A First Course in Probability, Introduction to Probability Models, Stochastic Processes, and Introductory Statistics. Professor Ross is the founding and continuing editor of the journal Probability in the Engineering and Informational Sciences. He is a Fellow of the Institute of Mathematical Statistics, and a recipient of the Humboldt US Senior Scientist Award.

Front Cover 1
Title Page 4
Copyright Page 5
Table of Contents 6
Preface 12
Chapter 1. Introduction to Probability Theory 18
1.1 Introduction 18
1.2 Sample Space and Events 18
1.3 Probabilities Defined on Events 21
1.4 Conditional Probabilities 24
1.5 Independent Events 27
1.6 Bayes’ Formula 29
Exercises 32
References 37
Chapter 2. Random Variables 38
2.1 Random Variables 38
2.2 Discrete Random Variables 42
2.2.1 The Bernoulli Random Variable 43
2.2.2 The Binomial Random Variable 44
2.2.3 The Geometric Random Variable 46
2.2.4 The Poisson Random Variable 47
2.3 Continuous Random Variables 48
2.3.1 The Uniform Random Variable 49
2.3.2 Exponential Random Variables 51
2.3.3 Gamma Random Variables 51
2.3.4 Normal Random Variables 51
2.4 Expectation of a Random Variable 53
2.4.1 The Discrete Case 53
2.4.2 The Continuous Case 55
2.4.3 Expectation of a Function of a Random Variable 57
2.5 Jointly Distributed Random Variables 61
2.5.1 Joint Distribution Functions 61
2.5.2 Independent Random Variables 65
2.5.3 Covariance and Variance of Sums of Random Variables 67
2.5.4 Joint Probability Distribution of Functions of Random Variables 76
2.6 Moment Generating Functions 79
2.6.1 The Joint Distribution of the Sample Mean and Sample Variance from a Normal Population 88
2.7 The Distribution of the Number of Events that Occur 91
2.8 Limit Theorems 94
2.9 Stochastic Processes 101
Exercises 103
References 112
Chapter 3. Conditional Probability and Conditional Expectation 114
3.1 Introduction 114
3.2 The Discrete Case 114
3.3 The Continuous Case 119
3.4 Computing Expectations by Conditioning 123
3.4.1 Computing Variances by Conditioning 134
3.5 Computing Probabilities by Conditioning 139
3.6 Some Applications 157
3.6.1 A List Model 157
3.6.2 A Random Graph 158
3.6.3 Uniform Priors, Polya’s Urn Model, and Bose–Einstein Statistics 166
3.6.4 Mean Time for Patterns 170
3.6.5 The k-Record Values of Discrete Random Variables 174
3.6.6 Left Skip Free Random Walks 177
3.7 An Identity for Compound Random Variables 183
3.7.1 Poisson Compounding Distribution 186
3.7.2 Binomial Compounding Distribution 188
3.7.3 A Compounding Distribution Related to the Negative Binomial 189
Exercises 190
Chapter 4. Markov Chains 208
4.1 Introduction 208
4.2 Chapman–Kolmogorov Equations 212
4.3 Classification of States 221
4.4 Limiting Probabilities 231
4.5 Some Applications 247
4.5.1 The Gambler’s Ruin Problem 247
4.5.2 A Model for Algorithmic Efficiency 251
4.5.3 Using a Random Walk to Analyze a Probabilistic Algorithm for the Satisfiability Problem 254
4.6 Mean Time Spent in Transient States 260
4.7 Branching Processes 262
4.8 Time Reversible Markov Chains 266
4.9 Markov Chain Monte Carlo Methods 277
4.10 Markov Decision Processes 282
4.11 Hidden Markov Chains 286
4.11.1 Predicting the States 290
Exercises 292
References 307
Chapter 5. The Exponential Distribution and the Poisson Process 308
5.1 Introduction 308
5.2 The Exponential Distribution 309
5.2.1 Definition 309
5.2.2 Properties of the Exponential Distribution 311
5.2.3 Further Properties of the Exponential Distribution 318
5.2.4 Convolutions of Exponential Random Variables 325
5.3 The Poisson Process 329
5.3.1 Counting Processes 329
5.3.2 Definition of the Poisson Process 330
5.3.3 Interarrival and Waiting Time Distributions 333
5.3.4 Further Properties of Poisson Processes 336
5.3.5 Conditional Distribution of the Arrival Times 342
5.3.6 Estimating Software Reliability 353
5.4 Generalizations of the Poisson Process 356
5.4.1 Nonhomogeneous Poisson Process 356
5.4.2 Compound Poisson Process 363
5.4.3 Conditional or Mixed Poisson Processes 368
Exercises 371
References 387
Chapter 6. Continuous-Time Markov Chains 388
6.1 Introduction 388
6.2 Continuous-Time Markov Chains 389
6.3 Birth and Death Processes 391
6.4 The Transition Probability Function Pij(t) 398
6.5 Limiting Probabilities 407
6.6 Time Reversibility 414
6.7 Uniformization 423
6.8 Computing the Transition Probabilities 426
Exercises 429
References 436
Chapter 7. Renewal Theory and Its Applications 438
7.1 Introduction 438
7.2 Distribution of N(t) 440
7.3 Limit Theorems and Their Applications 444
7.4 Renewal Reward Processes 456
7.5 Regenerative Processes 464
7.5.1 Alternating Renewal Processes 467
7.6 Semi-Markov Processes 474
7.7 The Inspection Paradox 477
7.8 Computing the Renewal Function 480
7.9 Applications to Patterns 483
7.9.1 Patterns of Discrete Random Variables 484
7.9.2 The Expected Time to a Maximal Run of Distinct Values 491
7.9.3 Increasing Runs of Continuous Random Variables 493
7.10 The Insurance Ruin Problem 495
Exercises 501
References 512
Chapter 8. Queueing Theory 514
8.1 Introduction 514
8.2 Preliminaries 515
8.2.1 Cost Equations 516
8.2.2 Steady-State Probabilities 517
8.3 Exponential Models 519
8.3.1 A Single-Server Exponential Queueing System 519
8.3.2 A Single-Server Exponential Queueing System Having Finite Capacity 528
8.3.3 Birth and Death Queueing Models 534
8.3.4 A Shoe Shine Shop 539
8.3.5 A Queueing System with Bulk Service 541
8.4 Network of Queues 544
8.4.1 Open Systems 544
8.4.2 Closed Systems 549
8.5 The System M/G/1 555
8.5.1 Preliminaries: Work and Another Cost Identity 555
8.5.2 Application of Work to M/G/1 556
8.5.3 Busy Periods 557
8.6 Variations on the M/G/1 558
8.6.1 The M/G/1 with Random-Sized Batch Arrivals 558
8.6.2 Priority Queues 560
8.6.3 An M/G/1 Optimization Example 563
8.6.4 The M/G/1 Queue with Server Breakdown 567
8.7 The Model G/M/1 570
8.7.1 The G/M/1 Busy and Idle Periods 575
8.8 A Finite Source Model 576
8.9 Multiserver Queues 579
8.9.1 Erlang’s Loss System 580
8.9.2 The M/M/k Queue 581
8.9.3 The G/M/k Queue 582
8.9.4 The M/G/k Queue 584
Exercises 585
References 595
Chapter 9. Reliability Theory 596
9.1 Introduction 596
9.2 Structure Functions 597
9.2.1 Minimal Path and Minimal Cut Sets 599
9.3 Reliability of Systems of Independent Components 603
9.4 Bounds on the Reliability Function 607
9.4.1 Method of Inclusion and Exclusion 608
9.4.2 Second Method for Obtaining Bounds on r(p) 617
9.5 System Life as a Function of Component Lives 619
9.6 Expected System Lifetime 627
9.6.1 An Upper Bound on the Expected Life of a Parallel System 631
9.7 Systems with Repair 633
9.7.1 A Series Model with Suspended Animation 637
Exercises 640
References 646
Chapter 10. Brownian Motion and Stationary Processes 648
10.1 Brownian Motion 648
10.2 Hitting Times, Maximum Variable, and the Gambler’s Ruin Problem 652
10.3 Variations on Brownian Motion 653
10.3.1 Brownian Motion with Drift 653
10.3.2 Geometric Brownian Motion 653
10.4 Pricing Stock Options 655
10.4.1 An Example in Options Pricing 655
10.4.2 The Arbitrage Theorem 657
10.4.3 The Black-Scholes Option Pricing Formula 661
10.5 White Noise 666
10.6 Gaussian Processes 668
10.7 Stationary and Weakly Stationary Processes 671
10.8 Harmonic Analysis of Weakly Stationary Processes 676
Exercises 678
References 682
Chapter 11. Simulation 684
11.1 Introduction 684
11.2 General Techniques for Simulating Continuous Random Variables 689
11.2.1 The Inverse Transformation Method 689
11.2.2 The Rejection Method 690
11.2.3 The Hazard Rate Method 694
11.3 Special Techniques for Simulating Continuous Random Variables 697
11.3.1 The Normal Distribution 697
11.3.2 The Gamma Distribution 701
11.3.3 The Chi-Squared Distribution 701
11.3.4 The Beta (n, m) Distribution 702
11.3.5 The Exponential Distribution—The Von Neumann Algorithm 703
11.4 Simulating from Discrete Distributions 705
11.4.1 The Alias Method 708
11.5 Stochastic Processes 713
11.5.1 Simulating a Nonhomogeneous Poisson Process 714
11.5.2 Simulating a Two-Dimensional Poisson Process 720
11.6 Variance Reduction Techniques 723
11.6.1 Use of Antithetic Variables 724
11.6.2 Variance Reduction by Conditioning 727
11.6.3 Control Variates 732
11.6.4 Importance Sampling 734
11.7 Determining the Number of Runs 739
11.8 Generating from the Stationary Distribution of a Markov Chain 740
11.8.1 Coupling from the Past 740
11.8.2 Another Approach 742
Exercises 743
References 751
Appendix: Solutions to Starred Exercises 752
Index 792

Preface


This text is intended as an introduction to elementary probability theory and stochastic processes. It is particularly well suited for those wanting to see how probability theory can be applied to the study of phenomena in fields such as engineering, computer science, management science, the physical and social sciences, and operations research.

It is generally felt that there are two approaches to the study of probability theory. One approach is heuristic and nonrigorous and attempts to develop in the student an intuitive feel for the subject that enables him or her to “think probabilistically.” The other approach attempts a rigorous development of probability by using the tools of measure theory. It is the first approach that is employed in this text. However, because it is extremely important in both understanding and applying probability theory to be able to “think probabilistically,” this text should also be useful to students interested primarily in the second approach.

New to This Edition


The tenth edition includes new text material, examples, and exercises chosen not only for their inherent interest and applicability but also for their usefulness in strengthening the reader’s probabilistic knowledge and intuition. The new text material includes Section 2.7, which builds on the inclusion/exclusion identity to find the distribution of the number of events that occur; and Section 3.6.6 on left skip free random walks, which can be used to model the fortunes of an investor (or gambler) who always invests 1 and then receives a nonnegative integral return. Section 4.2 has additional material on Markov chains that shows how to modify a given chain when trying to determine such things as the probability that the chain ever enters a given class of states by some time, or the conditional distribution of the state at some time given that the class has never been entered. A new remark in Section 7.2 shows that results from the classical insurance ruin model also hold in other important ruin models. There is new material on exponential queueing models, including, in Section 8.3.2, a determination of the mean and variance of the number of lost customers in a busy period of a finite capacity queue, as well as the new Section 8.3.3 on birth and death queueing models. Section 11.8.2 gives a new approach that can be used to simulate the exact stationary distribution of a Markov chain that satisfies a certain property.

Among the newly added examples are 1.11, which is concerned with a multiple player gambling problem; 3.20, which finds the variance in the matching rounds problem; 3.30, which deals with the characteristics of a random selection from a population; and 4.25, which deals with the stationary distribution of a Markov chain.

Course


Ideally, this text would be used in a one-year course in probability models. Other possible courses would be a one-semester course in introductory probability theory (involving Chapters 1–3 and parts of others) or a course in elementary stochastic processes. The textbook is designed to be flexible enough to be used in a variety of possible courses. For example, I have used Chapters 5 and 8, with smatterings from Chapters 4 and 6, as the basis of an introductory course in queueing theory.

Examples and Exercises


Many examples are worked out throughout the text, and there are also a large number of exercises to be solved by students. More than 100 of these exercises have been starred and their solutions provided at the end of the text. These starred problems can be used for independent study and test preparation. An Instructor’s Manual, containing solutions to all exercises, is available free to instructors who adopt the book for class.

Organization


Chapters 1 and 2 deal with basic ideas of probability theory. In Chapter 1 an axiomatic framework is presented, while in Chapter 2 the important concept of a random variable is introduced. Subsection 2.6.1 gives a simple derivation of the joint distribution of the sample mean and sample variance of a normal data sample.

Chapter 3 is concerned with the subject matter of conditional probability and conditional expectation. “Conditioning” is one of the key tools of probability theory, and it is stressed throughout the book. When properly used, conditioning often enables us to easily solve problems that at first glance seem quite difficult. The final section of this chapter presents applications to (1) a computer list problem, (2) a random graph, and (3) the Polya urn model and its relation to the Bose-Einstein distribution. Subsection 3.6.5 presents k-record values and the surprising Ignatov’s theorem.
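To give a concrete sense of the conditioning technique, here is a minimal Python sketch (not taken from the book; the Poisson and exponential parameters are chosen purely for illustration). It computes the mean and variance of a compound random variable S = X_1 + ... + X_N by conditioning on N, and checks the closed forms against simulation.

    import numpy as np

    rng = np.random.default_rng(0)
    lam, mu = 3.0, 2.0   # illustrative: N ~ Poisson(lam), X_i ~ Exponential(rate mu)

    # Conditioning on N gives E[S] = E[N]E[X] and Var(S) = E[N]Var(X) + Var(N)(E[X])^2
    mean_cond = lam * (1 / mu)
    var_cond = lam * (1 / mu**2) + lam * (1 / mu)**2   # = lam * E[X^2] for a compound Poisson

    # Monte Carlo check
    sims = np.array([rng.exponential(1 / mu, rng.poisson(lam)).sum() for _ in range(50_000)])
    print(mean_cond, sims.mean())   # the two values should agree closely
    print(var_cond, sims.var())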

In Chapter 4 we come into contact with our first random, or stochastic, process, known as a Markov chain, which is widely applicable to the study of many real-world phenomena. Applications to genetics and production processes are presented. The concept of time reversibility is introduced and its usefulness illustrated. Subsection 4.5.3 presents an analysis, based on random walk theory, of a probabilistic algorithm for the satisfiability problem. Section 4.6 deals with the mean times spent in transient states by a Markov chain. Section 4.9 introduces Markov chain Monte Carlo methods. In the final section we consider a model for optimally making decisions known as a Markovian decision process.
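As a small concrete illustration of a Markov chain calculation (the three-state transition matrix below is hypothetical, not an example from the text), the stationary distribution of an ergodic chain solves pi P = pi together with the normalization sum_j pi_j = 1, and the rows of a high power of P converge to it:

    import numpy as np

    P = np.array([[0.7, 0.2, 0.1],
                  [0.3, 0.5, 0.2],
                  [0.2, 0.3, 0.5]])   # hypothetical transition matrix

    # Replace one balance equation with the normalization condition and solve
    A = np.vstack([(P.T - np.eye(3))[:-1], np.ones(3)])
    b = np.array([0.0, 0.0, 1.0])
    pi = np.linalg.solve(A, b)

    print(pi)
    print(np.linalg.matrix_power(P, 50)[0])   # each row of P^50 is close to pi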

In Chapter 5 we are concerned with a type of stochastic process known as a counting process. In particular, we study a kind of counting process known as a Poisson process. The intimate relationship between this process and the exponential distribution is discussed. New derivations for the Poisson and nonhomogeneous Poisson processes are discussed. Examples relating to analyzing greedy algorithms, minimizing highway encounters, collecting coupons, and tracking the AIDS virus, as well as material on compound Poisson processes, are included in this chapter. Subsection 5.2.4 gives a simple derivation of the convolution of exponential random variables.
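The following sketch (the rate and time horizon are arbitrary illustrative values, not an example from the text) uses the exponential interarrival characterization to simulate a Poisson process and checks that the expected number of events in [0, t] is lam * t:

    import numpy as np

    rng = np.random.default_rng(1)
    lam, t = 2.0, 10.0   # illustrative rate and horizon

    def poisson_count(lam, t):
        """Number of arrivals in [0, t]: sum i.i.d. Exponential(lam) interarrival times."""
        total, n = 0.0, 0
        while True:
            total += rng.exponential(1 / lam)
            if total > t:
                return n
            n += 1

    counts = [poisson_count(lam, t) for _ in range(20_000)]
    print(np.mean(counts), lam * t)   # sample mean of N(t) versus lam * t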

Chapter 6 considers Markov chains in continuous time with an emphasis on birth and death models. Time reversibility is shown to be a useful concept, as it is in the study of discrete-time Markov chains. Section 6.7 presents the computationally important technique of uniformization.
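A minimal sketch of the uniformization idea, assuming a hypothetical two-state generator Q (not an example from the text): choose a rate nu at least as large as every leaving rate, form the uniformized discrete-time chain P = I + Q/nu, and write P(t) as a Poisson-weighted mixture of the powers of P:

    import numpy as np
    from math import exp, factorial

    Q = np.array([[-1.0,  1.0],
                  [ 2.0, -2.0]])      # hypothetical generator matrix
    nu = max(-np.diag(Q))             # uniformization rate
    P = np.eye(2) + Q / nu            # uniformized discrete-time chain

    def transition_matrix(t, terms=60):
        """P(t) = sum_n e^(-nu t) (nu t)^n / n! * P^n, truncated after `terms` terms."""
        out, Pn = np.zeros_like(Q), np.eye(2)
        for n in range(terms):
            out += exp(-nu * t) * (nu * t)**n / factorial(n) * Pn
            Pn = Pn @ P
        return out

    print(transition_matrix(1.0))     # rows approach the limiting probabilities (2/3, 1/3) as t grows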

Chapter 7, the renewal theory chapter, is concerned with a type of counting process more general than the Poisson. By making use of renewal reward processes, limiting results are obtained and applied to various fields. Section 7.9 presents new results concerning the distribution of time until a certain pattern occurs when a sequence of independent and identically distributed random variables is observed. In Subsection 7.9.1, we show how renewal theory can be used to derive both the mean and the variance of the length of time until a specified pattern appears, as well as the mean time until one of a finite number of specified patterns appears. In Subsection 7.9.2, we suppose that the random variables are equally likely to take on any of m possible values, and compute an expression for the mean time until a run of m distinct values occurs. In Subsection 7.9.3, we suppose the random variables are continuous and derive an expression for the mean time until a run of m consecutive increasing values occurs.
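As a quick numerical illustration of the pattern results described above (a Monte Carlo sketch, not the renewal-theoretic derivation used in the book): with a fair coin, the mean number of flips until the pattern HH first appears is 6, while for HT it is 4.

    import numpy as np

    rng = np.random.default_rng(3)

    def flips_until(pattern, p=0.5):
        """Coin flips (1 = heads, probability p) until `pattern` first appears as a suffix."""
        seq, n = [], 0
        while True:
            seq.append(int(rng.random() < p))
            n += 1
            if seq[-len(pattern):] == pattern:
                return n

    print(np.mean([flips_until([1, 1]) for _ in range(50_000)]))   # close to 6
    print(np.mean([flips_until([1, 0]) for _ in range(50_000)]))   # close to 4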

Chapter 8 deals with queueing, or waiting line, theory. After some preliminaries dealing with basic cost identities and types of limiting probabilities, we consider exponential queueing models and show how such models can be analyzed. Included in the models we study is the important class known as a network of queues. We then study models in which some of the distributions are allowed to be arbitrary. Included are Subsection 8.6.3 dealing with an optimization problem concerning a single server, general service time queue, and Section 8.8, concerned with a single server, general service time queue in which the arrival source is a finite number of potential users.
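For the single-server exponential (M/M/1) model, the steady-state quantities have simple closed forms; the short sketch below (arrival and service rates chosen arbitrarily, with lam < mu) computes them and verifies Little's law L = lam * W:

    # M/M/1 steady-state quantities
    lam, mu = 3.0, 5.0                  # illustrative arrival and service rates (lam < mu)
    rho = lam / mu                      # traffic intensity

    L = rho / (1 - rho)                 # average number in system
    W = 1 / (mu - lam)                  # average time in system
    Lq = rho**2 / (1 - rho)             # average number waiting in queue
    Wq = lam / (mu * (mu - lam))        # average waiting time in queue

    assert abs(L - lam * W) < 1e-9      # Little's law: L = lam * W
    print(L, W, Lq, Wq)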

Chapter 9 is concerned with reliability theory. This chapter will probably be of greatest interest to the engineer and operations researcher. Subsection 9.6.1 illustrates a method for determining an upper bound for the expected life of a parallel system of not necessarily independent components and Subsection 9.7.1 analyzes a series structure reliability model in which components enter a state of suspended animation when one of their cohorts fails.
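To illustrate structure functions and the reliability of a system of independent components, here is a short sketch (the 2-out-of-3 system and the component reliabilities are hypothetical) that computes r(p) = E[phi(X)] by summing over all component state vectors:

    import numpy as np
    from itertools import product

    def phi(x):
        """Structure function of a 2-out-of-3 system: works when at least two components work."""
        return int(sum(x) >= 2)

    p = [0.9, 0.8, 0.95]   # illustrative component reliabilities

    r = sum(phi(x) * np.prod([pi if xi else 1 - pi for pi, xi in zip(p, x)])
            for x in product([0, 1], repeat=3))
    print(r)               # system reliability r(p)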

Chapter 10 is concerned with Brownian motion and its applications. The theory of options pricing is discussed. Also, the arbitrage theorem is presented and its relationship to the duality theorem of linear programming is indicated. We show how the arbitrage theorem leads to the Black–Scholes option pricing formula.
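The Black-Scholes formula for a European call on a non-dividend-paying stock can be evaluated directly; in the sketch below the spot, strike, maturity, interest rate, and volatility are arbitrary illustrative inputs:

    from math import log, sqrt, exp, erf

    def norm_cdf(x):
        """Standard normal distribution function."""
        return 0.5 * (1 + erf(x / sqrt(2)))

    def black_scholes_call(s, k, t, r, sigma):
        """Black-Scholes price of a European call option (no dividends)."""
        d1 = (log(s / k) + (r + sigma**2 / 2) * t) / (sigma * sqrt(t))
        d2 = d1 - sigma * sqrt(t)
        return s * norm_cdf(d1) - k * exp(-r * t) * norm_cdf(d2)

    print(black_scholes_call(s=100, k=100, t=1.0, r=0.05, sigma=0.2))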

Chapter 11 deals with simulation, a powerful tool for analyzing stochastic models that are analytically intractable. Methods for generating the values of arbitrarily distributed random variables are discussed, as are variance reduction methods for increasing the efficiency of the simulation. Subsection 11.6.4 introduces the valuable simulation technique of importance sampling, and indicates the usefulness of tilted distributions when applying this method.
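As a minimal example of the inverse transformation method of Subsection 11.2.1 (the rate below is an arbitrary illustrative value): since the exponential distribution function F(x) = 1 - e^(-lam x) inverts to F^(-1)(u) = -ln(1 - u)/lam, and 1 - U has the same distribution as U, an exponential can be simulated by setting X = -ln(U)/lam:

    import numpy as np

    rng = np.random.default_rng(2)
    lam = 1.5                            # illustrative rate

    u = rng.random(100_000)              # U ~ Uniform(0, 1)
    x = -np.log(u) / lam                 # inverse transform: X ~ Exponential(lam)

    print(x.mean(), 1 / lam)             # sample mean versus theoretical mean 1/lam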

Acknowledgments


We would like to acknowledge with thanks the helpful suggestions made by the many reviewers of the text. These...

Publication date (per publisher) 11.12.2006
Language: English
Subject areas: Mathematics / Computer Science > Mathematics > Statistics
Mathematics / Computer Science > Mathematics > Probability / Combinatorics
Technology
Economics > Business Administration / Management > Planning / Organization
ISBN-10 0-12-375687-1 / 0123756871
ISBN-13 978-0-12-375687-9 / 9780123756879
EPUB (Adobe DRM)

Copy protection: Adobe DRM
Adobe DRM is a copy protection scheme intended to protect the eBook against misuse. When you download the eBook, it is authorized to your personal Adobe ID. You can then read the eBook only on devices that are also registered to your Adobe ID.

File format: EPUB (Electronic Publication)
EPUB is an open standard for eBooks and is particularly well suited to fiction and non-fiction. The reflowable text adapts dynamically to the display and font size, which also makes EPUB a good fit for mobile reading devices.

System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You need an Adobe ID and the free Adobe Digital Editions software. We advise against using the OverDrive Media Console, as it has frequently been found to cause problems with Adobe DRM.
eReader: This eBook can be read with (almost) all eBook readers. It is not compatible with the Amazon Kindle, however.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You need an Adobe ID and a free app.

Buying eBooks from abroad
For tax law reasons we can sell eBooks only within Germany and Switzerland. Regrettably, we cannot fulfill eBook orders from other countries.
