## 1. Concept of Probability

Probability deals with the analysis of random phenomena. It is a way of assigning to every event a value between zero and one, with the requirement that the event made up of all possible outcomes is assigned the value one.

## 2. Experiment

An operation which results in some well-defined outcomes is called an experiment.

### 2.1. Random Experiment

An experiment whose outcome cannot be predicted with certainty is called a random experiment. In other words, if an experiment is performed many times under similar conditions and the outcome of each time is not the same, then this experiment is called a random experiment.

#### Example:

A). Tossing of a fair coin

B). Throwing of an unbiased die

C). Drawing of a card from a well shuffled pack of 52 playing cards

## 3. Sample Space

The set of all possible outcomes of a random experiment is called the sample space for that experiment. It is usually denoted by S.

#### Example:

A). When a die is thrown, any one of the numbers 1, 2, 3, 4, 5, 6 can come up.

Therefore, sample space:

S = {1, 2, 3, 4, 5, 6}

B). When a coin is tossed either a head or tail will come up, then the sample space w.r.t. the tossing of the coin is:

S = {H, T}

C). When two coins are tossed, then the sample space is

S = {(H,H); (H,T); (T,H); (T,T)}
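Small sample spaces like these can be enumerated programmatically. A minimal Python sketch (illustrative only) builds the one-die, one-coin, and two-coin sample spaces, using `itertools.product` for the ordered pairs:

```python
from itertools import product

# Sample space for one throw of a die
die = {1, 2, 3, 4, 5, 6}

# Sample space for one toss of a coin
coin = {"H", "T"}

# Sample space for tossing two coins: all ordered pairs of outcomes
two_coins = set(product(coin, repeat=2))

print(sorted(two_coins))  # [('H', 'H'), ('H', 'T'), ('T', 'H'), ('T', 'T')]
print(len(two_coins))     # 4
```

The same `product(coin, repeat=n)` call generates the sample space for any number of coins.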

### 3.1 Sample Point or Event Point

Each element of the sample space is called a sample point or an event point.

#### Example:

When a die is thrown, the sample space is S = {1, 2, 3, 4, 5, 6} where 1, 2, 3, 4, 5 and 6 are the sample points.

### 3.2 Discrete Sample Space

A sample space S is called a discrete sample space if S is a finite set.

## 4. Event

A subset of the sample space is called an event.

### 4.1. Problem of Events

Sample space S plays the same role as universal set for all problems related to the particular experiment.

(i). $\phi$ is also a subset of S and is called an impossible event.

(ii). S is also a subset of S and is called a sure event or a certain event.

## 5. Types of Events

### A. Simple Event or Elementary Event

An event is called a Simple Event if it is a singleton subset of the sample space S.

#### Example:

A). When a coin is tossed, then the sample space is

S = {H, T}

Then A = {H} occurrence of head and

B = {T} occurrence of tail are called Simple events.

B). When two coins are tossed, then the sample space is

S = {(H,H); (H,T); (T,H); (T,T)}

Then A = {(H,T)}, the occurrence of a head on the $1^{st}$ coin and a tail on the $2^{nd}$, is called a Simple event.

### B. Mixed Event or Compound Event or Composite Event

A subset of the sample space S which contains more than one element is called a mixed event. Equivalently, when two or more events occur together, their joint occurrence is called a Compound Event.

#### Example:

When a die is thrown, then the sample space is

S = {1, 2, 3, 4, 5, 6}

Then A = {2, 4, 6}, the event of occurrence of an even number, and B = {1, 2, 4}, the event of occurrence of a power of 2, are Mixed events.

Compound events are of two types:

(i). Independent Events, and

(ii). Dependent Events

### C. Equally Likely Events

Outcomes are said to be equally likely when we have no reason to believe that one is more likely to occur than another.

#### Example:

When an unbiased die is thrown all the six faces 1, 2, 3, 4, 5, 6 are equally likely to come up.

### D. Exhaustive Events

A set of events is said to be exhaustive if one of them must necessarily happen every time the experiment is performed.

#### Example:

When a die is thrown events 1, 2, 3, 4, 5, 6 form an exhaustive set of events.

#### Important:

We can say that the total number of elementary events of a random experiment is called the exhaustive number of cases.

### E. Mutually Exclusive Events

Two or more events are said to be mutually exclusive if the occurrence of one of them rules out the occurrence of the others. Thus, two or more events are mutually exclusive if no two of them can occur together.

Hence, $A_1, A_2, A_3,..., A_n$ are mutually exclusive if and only if $A_i \cap A_j = \phi , \text{ for }i \neq j$

#### Example:

A). When a coin is tossed the event of occurrence of a head and the event of occurrence of a tail are mutually exclusive events because we cannot have both head and tail at the same time.

B). When a die is thrown, the sample space is S = {1, 2, 3, 4, 5, 6}

Let A be the event of occurrence of a number greater than 4, i.e., A = {5, 6},

B be the event of occurrence of an odd number, i.e., B = {1, 3, 5}, and

C be the event of occurrence of an even number, i.e., C = {2, 4, 6}.

Here, events B and C are Mutually Exclusive, but the events A and B, or A and C, are not Mutually Exclusive.
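The pairwise-disjointness condition $A_i \cap A_j = \phi$ can be checked directly with set intersections. A minimal sketch for the die example above (the helper `mutually_exclusive` is an illustrative name, not a standard function):

```python
A = {5, 6}     # number greater than 4
B = {1, 3, 5}  # odd number
C = {2, 4, 6}  # even number

def mutually_exclusive(*events):
    """True if no two of the given events share an outcome."""
    events = list(events)
    return all(events[i].isdisjoint(events[j])
               for i in range(len(events))
               for j in range(i + 1, len(events)))

print(mutually_exclusive(B, C))  # True:  B and C share no outcome
print(mutually_exclusive(A, B))  # False: A and B share {5}
print(mutually_exclusive(A, C))  # False: A and C share {6}
```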

### F. Independent Events or Mutually Independent Events

Two or more events are said to be independent if the occurrence or non-occurrence of any of them does not affect the probability of occurrence or non-occurrence of the other events.

#### Example:

Let a bag contain 3 Red and 2 Black balls. Two balls are drawn one by one with replacement.

Let A be the event of occurrence of a red ball in the first draw,

and B be the event of occurrence of a black ball in the second draw.

Then the probability of occurrence of B is not affected by whether A has occurred before B, since the first ball is replaced in the bag and once again we select one ball out of the 5 (3R + 2B) given balls for event B.
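Independence under replacement can be illustrated with a small Monte Carlo simulation (a sketch under the assumptions of the example): the estimated probability of a black ball on the second draw stays near 2/5 whether or not the first draw was red.

```python
import random

random.seed(0)
bag = ["R", "R", "R", "B", "B"]  # 3 red, 2 black

trials = 100_000
black_second = 0                  # count of black on second draw overall
red_first = 0                     # count of red on first draw
black_second_given_red_first = 0  # count of black second among those

for _ in range(trials):
    first = random.choice(bag)   # drawn and replaced
    second = random.choice(bag)  # bag unchanged, so draws are independent
    if second == "B":
        black_second += 1
    if first == "R":
        red_first += 1
        if second == "B":
            black_second_given_red_first += 1

# Both estimates are close to 2/5 = 0.4
print(black_second / trials)
print(black_second_given_red_first / red_first)
```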

## 6. Occurrence of an Event

For a random experiment, let E be an event

Let E = {a, b, c}. If the outcome of the experiment is either a, b, or c, then we say the event E has occurred.

Sample Space: the outcomes of any type

Event: the outcomes of a particular type

### 6.1. Probability of Occurrence of an Event

Let S be the sample space; then the probability of occurrence of an event E is denoted by $P(E)$ and is defined as

$P(E) = \dfrac{n(E)}{n(S)}$ $= \dfrac{\text{number of elements in E}}{\text{number of elements in S}}$

$P(E) = \dfrac{\text{number of favourable/particular cases}}{\text{total number of cases}}$

#### Example:

A). When a coin is tossed, then the sample space is S = {H, T}

Let E be the event of occurrence of a head, i.e.,

E = {H}, so $P(E) = \dfrac{n(E)}{n(S)} = \dfrac{1}{2}$

B). When a die is thrown, sample space S = {1, 2, 3, 4, 5, 6}

Let A be the event of occurrence of an odd number

and B be the event of occurrence of a number greater than 4

A = {1, 3, 5} and B = {5, 6}

$\text{P(A)} = \text{Probability of occurrence of an odd number}$ $= \dfrac{n(A)}{n(S)}$

$= \dfrac{3}{6} = \dfrac{1}{2}$ and

$P(B) = \text{Probability of occurrence of a number greater than 4}$ $= \dfrac{n(B)}{n(S)}$

$= \dfrac{2}{6} = \dfrac{1}{3}$
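The counting formula $P(E) = \dfrac{n(E)}{n(S)}$ translates directly into code. A minimal sketch (the `prob` helper is an illustrative name) uses Python's `Fraction` to keep the probabilities exact:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}            # sample space for one die throw

def prob(E, S=S):
    """P(E) = n(E) / n(S) for equally likely outcomes."""
    return Fraction(len(E), len(S))

A = {x for x in S if x % 2 == 1}  # odd number
B = {x for x in S if x > 4}       # number greater than 4

print(prob(A))  # 1/2
print(prob(B))  # 1/3
```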

## 7. Basic Axioms of Probability

Let $S$ denote the sample space of a random experiment.

1. For any event $E$, $P(E) \geq 0$

2. $P(S) = 1$

3. For a finite or countably infinite sequence of pairwise disjoint events $E_1, E_2, ...$

$P(E_1 \cup E_2 \cup E_3 \cup ... ) = \sum\limits_{i}P(E_i)$

## 8. Some Basic Probability Theorems

### 8.1 Theorem 1

Let E' be the complement of E defined by $E'= S-E$

The following always holds: $P(E)=1-P(E')$

#### Proof:

Since $E'= S-E$, we have $E \cup E'=S$ and $E \cap E'= \phi$. Hence

$P(E)+ P(E')=P(E \cup E')$ (By Axiom 3)

$=P(S)$

$=1$ (By Axiom 2)

Solving the equation for $P(E)$ completes the proof.
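A quick numeric check of the complement rule on the die (a sketch; taking E as the event "number greater than 4" for illustration):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
E = {5, 6}
E_complement = S - E   # E' = S - E

P_E = Fraction(len(E), len(S))
P_Ec = Fraction(len(E_complement), len(S))

print(P_E, P_Ec)        # 1/3 2/3
print(P_E == 1 - P_Ec)  # True: P(E) = 1 - P(E')
```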

### 8.2 Theorem 2

$P( \phi ) = 0$

#### Proof:

Since $\phi' = S - \phi = S$, by Theorem 1 we have

$P( \phi ) = 1-P(S)$

$=1-1$ (By Axiom 2)

$=0$

### 8.3 Theorem 3

If $A \subseteq B$ ,then $P(A) \leq P(B)$

#### Proof:

Since $B=A \cup (B-A)$, where $A$ and $B-A$ are disjoint, Axiom 3 implies

$P(B)=P(A)+P(B-A)$

By Axiom 1, $P(B-A) \geq 0$.

Hence $P(A) \leq P(B)$.

### 8.4 Theorem 4

For any event E, $0 \leq P(E) \leq 1$

#### Proof:

Since Axiom 1 gives us $P(E) \geq 0$, we need only show that $P(E) \leq 1$.

But $E \subseteq S$ implies $P(E) \leq P(S)$ by Theorem 3.

Therefore $P(E) \leq P(S) = 1$ by Axiom 2.

### 8.5 Theorem 5

For any two events A and B, $P(A \cup B)=P(A)+P(B)-P(A \cap B)$

#### Proof:

Since $A$, $B$, and $A \cup B$ can be partitioned as follows:

$A = (A \cap B') \cup (A \cap B)$

$B= (B \cap A' ) \cup (A \cap B)$

$A \cup B= (A \cap B') \cup (B \cap A' ) \cup (A \cap B)$

We have

$P(A)= P(A \cap B' )+P(A \cap B)$

$P(B)=P(B \cap A')+P(A \cap B)$

$P(A \cup B)= P(A \cap B' )+P(B \cap A')+P(A \cap B )$

Therefore,

$P(A \cup B)= P(A \cap B')+ P(B \cap A')+P(A \cap B)$

$= [P(A \cap B' )+P(A \cap B)]+ [P(B \cap A')+P(A \cap B)]-P(A \cap B)$

$=P(A)+P(B)-P(A \cap B)$
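Theorem 5 can be verified exhaustively on a small sample space. A sketch checking $P(A \cup B)=P(A)+P(B)-P(A \cap B)$ for the die events used earlier (A = odd number, B = number greater than 4):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}

def prob(E):
    """P(E) = n(E) / n(S) for equally likely outcomes."""
    return Fraction(len(E), len(S))

A = {1, 3, 5}  # odd number
B = {5, 6}     # number greater than 4

lhs = prob(A | B)                      # P(A union B)
rhs = prob(A) + prob(B) - prob(A & B)  # P(A) + P(B) - P(A intersect B)

print(lhs, rhs)    # 2/3 2/3
print(lhs == rhs)  # True
```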

### 8.6 Theorem 6 - (Addition Theorem for Three Events)

$P(A \cup B \cup C)=P(A)+P(B)+P(C)-P(AB)-P(AC)-P(BC)+P(ABC)$

#### Definition - Conditional probability:

$P(A | B)=\dfrac{P(A \cap B)}{P(B)}$ provided $P(B) \neq 0$
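The definition can be applied by direct counting. A sketch computing $P(A | B)$ on the die, taking A = odd number and B = number greater than 4 as illustrative choices:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}

def prob(E):
    """P(E) = n(E) / n(S) for equally likely outcomes."""
    return Fraction(len(E), len(S))

A = {1, 3, 5}  # odd number
B = {5, 6}     # number greater than 4

# P(A | B) = P(A intersect B) / P(B), defined only when P(B) != 0
P_A_given_B = prob(A & B) / prob(B)
print(P_A_given_B)  # 1/2
```

Given that the throw exceeded 4, the only outcomes are 5 and 6, and exactly one of them is odd, which matches the computed 1/2.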

### 8.7 Theorem 7

$P(A \cap B)=P(A)*P(B | A)$

$=P(B)*P(A | B)$

### 8.8 Theorem 8

$P(A \cap B \cap C)=P(A)*P(B|A)*P(C|A \cap B)$

### 8.9 Theorem 9

If $A_1, A_2, A_3, ..., A_n$ are mutually independent events, then

$P(A_1 \cap A_2 \cap ... \cap A_n) = P(A_1)*P(A_2)* ... *P(A_n)$

### 8.10 Theorem 10 - (Theorem of total Probability)

#### Theorem of Total Probability:

If $B_1,B_2,B_3, ... ,B_n$ are mutually exclusive events with

$\bigcup\limits_{i} B_i = S$, then

$P(A)=P(A \cap B_1)+ P(A \cap B_2)+ P(A \cap B_3)+ ... + P(A \cap B_n)$

### 8.11 Theorem 11 - (Bayes' Theorem)

#### Bayes' Theorem:

$P(A | B)=\dfrac{P(B|A)*P(A)}{P(B|A)*P(A)+P(B|A')*P(A')}$
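A numeric sketch of Bayes' theorem; the prior $P(A) = 0.01$ and the conditional probabilities below are made-up numbers for illustration only. Note that the denominator is exactly the total-probability expansion of $P(B)$ over $A$ and $A'$:

```python
from fractions import Fraction

# Hypothetical numbers for illustration:
P_A = Fraction(1, 100)          # P(A):  prior probability of A
P_B_given_A = Fraction(9, 10)   # P(B | A)
P_B_given_Ac = Fraction(1, 20)  # P(B | A')

P_Ac = 1 - P_A                  # P(A')

# Denominator: total probability  P(B) = P(B|A)P(A) + P(B|A')P(A')
P_B = P_B_given_A * P_A + P_B_given_Ac * P_Ac

# Bayes' theorem: P(A | B) = P(B|A)P(A) / P(B)
P_A_given_B = P_B_given_A * P_A / P_B
print(P_A_given_B)  # 2/13
```

Even though $P(B | A)$ is high, the posterior $P(A | B)$ stays modest because the prior $P(A)$ is small; this is the typical behaviour Bayes' theorem captures.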

Lofoya.com, 2016