alt="image"/>
which has four elements, and as all elements have the same chance of occurring, it can be endowed with the uniform probability.
1.2.2. Random variables
Let us now recall the definition of a generic random variable, and then the specific case of discrete random variables.
DEFINITION 1.9.– Let (Ω, 𝒜, ℙ) be a probability space and (E, ℰ) be a measurable space. A random variable on the probability space (Ω, 𝒜, ℙ) taking values in the measurable space (E, ℰ) is any mapping X : Ω → E such that, for any B in ℰ, X⁻¹(B) ∈ 𝒜; in other words, X : Ω → E is a random variable if it is an (𝒜, ℰ)-measurable mapping. We then write the event "X belongs to B" as {X ∈ B} = {ω ∈ Ω : X(ω) ∈ B} = X⁻¹(B). In the specific case where E = ℝ and ℰ = ℬ(ℝ), the Borel σ-algebra on ℝ, the mapping X is called a real random variable. If E = ℝᵈ with d ≥ 2 and ℰ = ℬ(ℝᵈ), the mapping X is said to be a real random vector.
EXAMPLE 1.12.– Let us return to the experiment where a six-sided die is rolled, where the set of possible outcomes is Ω = {1, 2, 3, 4, 5, 6}, endowed with the uniform probability. Consider the following game:
– if the result is even, you win 10;
– if the result is odd, you win 20.
This game can be modeled using the random variable X : Ω → {10, 20} defined by X(ω) = 10 if ω is even and X(ω) = 20 if ω is odd.
This mapping is a random variable, since for any B ∈ 𝒫({10, 20}), we have X⁻¹(∅) = ∅, X⁻¹({10}) = {2, 4, 6}, X⁻¹({20}) = {1, 3, 5} and X⁻¹({10, 20}) = Ω, and all these events are in 𝒫(Ω).
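As a concrete check of Definition 1.9, here is a minimal sketch in plain Python (the helper names preimage and power_set are ours, not from the text): it enumerates X⁻¹(B) for every B ⊆ {10, 20} in the die game and verifies that each preimage is indeed an event of 𝒫(Ω).

```python
from itertools import combinations

# Sample space of the die roll, endowed with the sigma-algebra of all its subsets.
omega = {1, 2, 3, 4, 5, 6}

# The mapping X of Example 1.12: the result earns 10 if even, 20 if odd.
def X(w):
    return 10 if w % 2 == 0 else 20

def preimage(B):
    """Return X^{-1}(B) = {w in omega : X(w) in B}."""
    return {w for w in omega if X(w) in B}

def power_set(s):
    """All subsets of s, i.e. the collection P(s) for a finite set s."""
    s = list(s)
    return [set(c) for r in range(len(s) + 1) for c in combinations(s, r)]

# For every B in P({10, 20}), the preimage X^{-1}(B) is an event of P(omega).
for B in power_set({10, 20}):
    inv = preimage(B)
    assert inv <= omega            # X^{-1}(B) is a subset of omega, hence an event
    print(sorted(B), "->", sorted(inv))
```

The loop prints exactly the four preimages listed above (∅, {2, 4, 6}, {1, 3, 5} and Ω), which is what makes X measurable, hence a random variable.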
DEFINITION 1.10.– The distribution of a random variable X defined on (Ω, 𝒜, ℙ) taking values in (E, ℰ) is the mapping ℙX : ℰ → [0, 1] such that, for any B ∈ ℰ, ℙX(B) = ℙ(X⁻¹(B)) = ℙ(X ∈ B). The distribution of X is a probability distribution on (E, ℰ); it is also called the image distribution of ℙ by X.
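Continuing with the die game, the sketch below (helper names are ours) computes the image distribution ℙX of Definition 1.10 from the uniform probability ℙ on Ω, by summing ℙ({ω}) over the preimage X⁻¹(B).

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}
P = {w: Fraction(1, 6) for w in omega}     # uniform probability on omega

def X(w):                                  # random variable of Example 1.12
    return 10 if w % 2 == 0 else 20

def P_X(B):
    """Image distribution: P_X(B) = P(X^{-1}(B)) = sum of P({w}) over the preimage."""
    return sum((P[w] for w in omega if X(w) in B), Fraction(0))

print(P_X({10}))       # 1/2 : the result is even
print(P_X({20}))       # 1/2 : the result is odd
print(P_X({10, 20}))   # 1   : the whole target space
```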
DEFINITION 1.11.– A real random variable X is discrete if X(Ω) is at most countable; in other words, if X(Ω) = {xi, i ∈ I}, where I ⊂ ℕ. In this case, the probability distribution of X is characterized by the family of values ℙ(X = xi), i ∈ I.
EXAMPLE 1.13.– Uniform distribution: Let N ∈ ℕ* and let X be a random variable on (Ω, 𝒜, ℙ) such that X(Ω) = {x1, ..., xN} and, for any i ∈ {1, ..., N}, ℙ(X = xi) = 1/N. It is then said that X follows a uniform distribution on {x1, ..., xN}.
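As an illustration of Definition 1.11 and Example 1.13, the family ℙ(X = xi) = 1/N fully specifies the distribution. The short simulation below (the values x1, ..., xN chosen here are ours, purely for illustration) compares these theoretical weights with empirical frequencies.

```python
import random
from collections import Counter

values = [2, 3, 5, 7]            # some x_1, ..., x_N (here N = 4), chosen arbitrarily
N = len(values)

# Theoretical distribution: P(X = x_i) = 1/N for every i.
law = {x: 1 / N for x in values}

# Empirical check: draw many independent copies of X and compare frequencies.
n_draws = 100_000
counts = Counter(random.choice(values) for _ in range(n_draws))
for x in values:
    print(x, law[x], round(counts[x] / n_draws, 3))
```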
EXAMPLE 1.14.– The Bernoulli distribution: Let p ∈ [0, 1]. Let X be a random variable on (Ω, 𝒜, ℙ) such that X(Ω) = {0, 1}, ℙ(X = 1) = p and ℙ(X = 0) = 1 − p. It is then said that X follows a Bernoulli distribution with parameter p, and we write X ∼ ℬ(p).
The Bernoulli distribution models random experiments with two possible outcomes: success, with probability p, and failure, with probability 1 − p. This is the case in the following game. A coin is tossed N times. This experiment is modeled by Ω = {T, H}ᴺ, endowed with the σ-algebra of its subsets and the uniform distribution. For 1 ≤ n ≤ N, we consider the mappings Xn from Ω to ℝ defined by Xn(ω) = 1 if the nth toss is tails (T) and Xn(ω) = 0 otherwise, that is, Xn is the number of tails at the nth toss. Thus, Xn, 1 ≤ n ≤ N, are real random variables with Bernoulli distribution with parameter 1/2 if the coin is balanced.
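A minimal sketch of this coin-toss model, assuming a balanced coin (p = 1/2): each outcome ω is drawn uniformly from {T, H}ᴺ, and Xn(ω) is the indicator that the nth toss is tails, hence ℬ(1/2)-distributed. The function names are ours.

```python
import random

N = 10                                     # number of tosses
n_draws = 100_000

def draw_omega():
    """Draw an outcome omega uniformly from {T, H}^N."""
    return tuple(random.choice("TH") for _ in range(N))

def X(n, omega):
    """X_n(omega) = 1 if the nth toss (1-indexed) is tails, 0 otherwise."""
    return 1 if omega[n - 1] == "T" else 0

# Empirical check that X_1 follows a Bernoulli(1/2) distribution.
freq_tails = sum(X(1, draw_omega()) for _ in range(n_draws)) / n_draws
print(freq_tails)                          # close to 0.5 for a balanced coin
```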
EXAMPLE 1.15.– Binomial distribution: Let p ∈ [0, 1] and N ∈ ℕ*. Let X be a random variable on (Ω, 𝒜, ℙ) such that X(Ω) = {0, 1, ..., N} and, for any k ∈ {0, 1, ..., N}, ℙ(X = k) = C(N, k) p^k (1 − p)^(N−k), where C(N, k) = N!/(k!(N − k)!) denotes the binomial coefficient. It is then said that X follows a binomial distribution with parameters N and p, and we write X ∼ ℬ(N, p).
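The sketch below evaluates this mass function ℙ(X = k) = C(N, k) p^k (1 − p)^(N−k) for parameter values chosen by us, and compares it with the empirical frequencies obtained by counting successes among N independent trials with success probability p, a standard way of producing a binomial random variable.

```python
import random
from math import comb
from collections import Counter

N, p = 5, 0.3                    # parameters chosen for illustration
n_draws = 100_000

# Theoretical mass function: P(X = k) = C(N, k) p^k (1 - p)^(N - k).
pmf = {k: comb(N, k) * p**k * (1 - p)**(N - k) for k in range(N + 1)}

# Simulation: X counts the successes among N independent trials.
def draw_X():
    return sum(random.random() < p for _ in range(N))

counts = Counter(draw_X() for _ in range(n_draws))
for k in range(N + 1):
    print(k, round(pmf[k], 4), round(counts[k] / n_draws, 4))
```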