Robert P. Dobrow

Probability



This second edition contains modest changes from the first, including some reorganization of material. It assumes knowledge of differential and integral calculus (two semesters of calculus, rather than three). Double integrals are introduced to work with joint distributions in the continuous case, with an appendix providing instruction in working with them. While the material in this book stands on its own as a “terminal” course, it also prepares students planning to take upper-level courses in statistics, stochastic processes, and actuarial science.

      There are several excellent probability textbooks available at the undergraduate level, and we are indebted to many, starting with the classic An Introduction to Probability Theory and Its Applications by William Feller.

      Our approach is tailored to our students and based on the experience of teaching probability at a liberal arts college. Our students are not only math majors but come from disciplines throughout the natural and social sciences, especially biology, physics, computer science, and economics. Sometimes we will even get a philosophy, English, or art history major. They tend to be sophomores and juniors. These students love to see connections with “real-life” problems, with applications that are “cool” and compelling. They are fairly computer literate. Their mathematical coursework may not be extensive, but they like problem solving, and they respond well to the many games, simulations, paradoxes, and challenges that the subject offers.

      In addition to simulation, another emphasis of the book is on applications. We try to motivate the use of probability throughout the sciences and find examples from subjects as diverse as homelessness, genetics, meteorology, and cryptography. At the same time, the book does not forget its roots, and there are many classical chestnuts like the problem of points, Buffon's needle, coupon collecting, and Montmort's problem of coincidences. Within the context of the examples, when male and female are referred to (such as in the example on colorblindness affecting males more than females), we note that this refers to biological sex, not gender identity. As such, we use the term “sex” not “gender” in the text.

      Following is a synopsis of the book's 11 chapters.

       Chapter 1 begins with basics and general principles: random experiment, sample space, and event. Probability functions are defined and important properties derived. Counting, including the multiplication principle, permutations, and combinations (binomial coefficients) are introduced in the context of equally likely outcomes. A first look at simulation gives accessible examples of simulating several of the probability calculations from the chapter.
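The kind of accessible simulation Chapter 1 describes can be sketched as follows. The particular example (de Méré's classic question of getting at least one six in four rolls of a die) and the Python code are our own illustration; the book's choice of examples and computing environment may differ.

```python
import random

def at_least_one_six(trials=100_000, seed=1):
    """Estimate P(at least one six in four rolls of a fair die).

    The exact answer, by counting equally likely outcomes,
    is 1 - (5/6)**4, about 0.5177.
    """
    random.seed(seed)
    hits = sum(
        any(random.randint(1, 6) == 6 for _ in range(4))
        for _ in range(trials)
    )
    return hits / trials

print(round(at_least_one_six(), 3))
```

Comparing the simulated frequency against the exact counting answer is the pattern the chapter's simulation examples follow.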

       Chapter 2 emphasizes conditional probability, along with the law of total probability and Bayes' formula. There is substantial discussion of the birthday problem. It closes with a discussion of independence.
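The birthday problem lends itself naturally to simulation. The sketch below, in Python, is our own illustration of the computation (the book may treat it analytically, by simulation, or both); the function name and defaults are ours.

```python
import random

def birthday_match_prob(n=23, trials=100_000, seed=1):
    """Estimate the probability that among n people at least two
    share a birthday (365 equally likely birthdays assumed).

    The exact probability for n = 23 is about 0.507.
    """
    random.seed(seed)
    matches = sum(
        len({random.randrange(365) for _ in range(n)}) < n
        for _ in range(trials)
    )
    return matches / trials

print(round(birthday_match_prob(), 3))
```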

       Random variables are the focus of Chapter 3. The most important discrete distributions—binomial, Poisson, and uniform—are introduced early and serve as a regular source of examples for the concepts to come.

       Chapter 4 contains extensive material on discrete random variables, including expectation, functions of random variables, and variance. Joint discrete distributions are introduced. Properties of expectation, such as linearity, are presented, as well as the method of indicator functions. Covariance and correlation are first introduced here.

       Chapter 5 highlights several families of discrete distributions: geometric, negative binomial, hypergeometric, multinomial, and Benford's law. Moment-generating functions are introduced to explore relationships between some distributions.

       Continuous probability begins with Chapter 6. Expectation, variance, and joint distributions are explored in the continuous setting. The chapter introduces the uniform and exponential distributions.

       Chapter 7 highlights several important continuous distributions starting with the normal distribution. There is substantial material on the Poisson process, constructing the process by means of probabilistic arguments from i.i.d. exponential inter-arrival times. The gamma and beta distributions are presented. There is also a section on the Pareto distribution with discussion of power law and scale invariant distributions. Moment-generating functions are used again to illustrate relationships between some distributions.
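The construction of the Poisson process from i.i.d. exponential interarrival times can be sketched in a few lines. This Python version is our own illustration of that construction, not code from the book.

```python
import random

def poisson_count(rate, t):
    """Number of arrivals in [0, t] of a Poisson process, built by
    summing i.i.d. Exponential(rate) interarrival times.

    The resulting count has the Poisson(rate * t) distribution.
    """
    time, count = 0.0, 0
    while True:
        time += random.expovariate(rate)
        if time > t:
            return count
        count += 1

# The mean count should be close to rate * t (here 2.0 * 10.0 = 20):
random.seed(1)
avg = sum(poisson_count(2.0, 10.0) for _ in range(10_000)) / 10_000
print(round(avg, 1))
```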

       Chapter 8 examines methods for finding densities of functions of random variables. This includes maximums, minimums, and sums of independent random variables (via the convolution formula). Transformations of two or more random variables are presented next. Finally, there is material on geometric probability.
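One standard fact of this kind, that the minimum of independent exponentials is again exponential with the rates added, is easy to check by simulation. The check below is our own Python illustration, verifying the result through the sample mean.

```python
import random

def min_of_exponentials(rate1, rate2, trials=100_000, seed=1):
    """Simulate min(X, Y) for independent X ~ Exponential(rate1)
    and Y ~ Exponential(rate2), and return the sample mean.

    The minimum is Exponential(rate1 + rate2), so the mean
    should be close to 1 / (rate1 + rate2).
    """
    random.seed(seed)
    total = sum(
        min(random.expovariate(rate1), random.expovariate(rate2))
        for _ in range(trials)
    )
    return total / trials

print(round(min_of_exponentials(1.0, 2.0), 3))  # close to 1/3
```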

       Chapter 9 is devoted to conditional distributions, both in the discrete and continuous settings. Conditional expectation and variance are emphasized as well as computing probabilities by conditioning. The bivariate normal is introduced here to illustrate many of the conditional properties.

       The important limit theorems of probability—law of large numbers and central limit theorem—are the topics of Chapter 10. Applications of the strong law of large numbers are included via the method of moments and Monte Carlo integration. Moment-generating functions are used to prove the central limit theorem.
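Monte Carlo integration, one of the applications of the strong law of large numbers mentioned above, can be sketched briefly. The Python code and the choice of test integral are our own illustration.

```python
import random

def mc_integrate(f, a, b, n=200_000, seed=1):
    """Monte Carlo estimate of the integral of f over [a, b]:
    (b - a) times the average of f at n uniform random points.

    The strong law of large numbers guarantees convergence
    of the sample average to E[f(U)] as n grows.
    """
    random.seed(seed)
    return (b - a) * sum(f(random.uniform(a, b)) for _ in range(n)) / n

# The integral of x^2 over [0, 1] is exactly 1/3:
print(round(mc_integrate(lambda x: x * x, 0.0, 1.0), 3))
```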

       Chapter 11 has optional material for supplementary discussion and/or projects. These three sections center on random walks on graphs and Markov chains, culminating in an introduction to Markov chain Monte Carlo. The treatment does not assume linear algebra and is meant as a broad-strokes introduction.

      There is more than enough material in this book for a one-semester course. The range of topics allows much latitude for the instructor. We feel that essential material for a first course would include Chapters 1–4, 6, and parts of Chapters 7, 9, and 10.