Aptitude Questions and Answers

- 1. Concept of Probability
- 2. Experiment
- 3. Sample Space
- 4. Event
- 5. Types of Events
- 6. Occurrence of an Event
- 7. Basic Axioms of Probability
- 8. Some Basic Probability Theorems

Probability deals with the analysis of random phenomena. It is a way of assigning every event a value between zero and one, with the requirement that the event made up of all possible results is assigned a value of one.

An operation which results in some well-defined outcomes is called an experiment.

An experiment whose outcome cannot be predicted with certainty is called a random experiment. In other words, if an experiment is performed many times under similar conditions and the outcome of each time is not the same, then this experiment is called a random experiment.

**A).** Tossing of a fair coin

**B).** Throwing of an unbiased die

**C).** Drawing of a card from a well shuffled pack of 52 playing cards

The set of all possible outcomes of a random experiment is called the sample space for that experiment. It is usually denoted by S.

**A).** When a die is thrown, any one of the numbers 1, 2, 3, 4, 5, 6 can come up.

Therefore, sample space:

S = {1, 2, 3, 4, 5, 6}

**B).** When a coin is tossed either a head or tail will come up, then the sample space w.r.t. the tossing of the coin is:

S = {H, T}

**C).** When two coins are tossed, then the sample space is

S = {(H,H); (H,T); (T,H); (T,T)}

Each element of the sample spaces is called a sample point or an event point.

When a die is thrown, the sample space is S = {1, 2, 3, 4, 5, 6} where 1, 2, 3, 4, 5 and 6 are the sample points.

A sample space S is called a discrete sample space if S is a finite set.

A subset of the sample space is called an event.

Sample space S plays the same role as universal set for all problems related to the particular experiment.

**(i).** $ \phi $ is also a subset of S and is called an impossible event.

**(ii).** S is itself a subset of S, which is called a sure event or a certain event.

An event is called a Simple Event if it is a singleton subset of the sample space S.

**A)**. When a coin is tossed, then the sample space is

S = {H, T}

Then A = {H} occurrence of head and

B = {T} occurrence of tail are called Simple events.

**B).** When two coins are tossed, then the sample space is

S = {(H,H); (H,T); (T,H); (T,T)}

Then A = {(H,T)}, the occurrence of a head on the $1^{st}$ coin and a tail on the $2^{nd}$, is called a Simple event.
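The two-coin sample space and the simple event above can be checked in a few lines of Python; this is a minimal sketch in which `itertools.product` enumerates all ordered pairs of faces:

```python
from itertools import product

# Sample space for tossing two coins: every ordered pair of faces.
S = set(product(["H", "T"], repeat=2))
print(len(S))  # 4 outcomes

# Simple event: head on the 1st coin and tail on the 2nd.
A = {("H", "T")}
print(A <= S)  # True: an event is always a subset of the sample space
```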

A subset of the sample space S which contains more than one element is called a mixed event or when two or more events occur together, their joint occurrence is called a **Compound Event**.

When a die is thrown, the sample space is

S = {1, 2, 3, 4, 5, 6}

Then A = {2, 4, 6}, the event of occurrence of an even number, and B = {1, 2, 4}, the event of occurrence of a power of 2, are Mixed events.

Compound events are of two types:

**(i).** Independent Events, and

**(ii).** Dependent Events

Outcomes are said to be equally likely when we have no reason to believe that one is more likely to occur than the other.

When an unbiased die is thrown all the six faces 1, 2, 3, 4, 5, 6 are equally likely to come up.

A set of events is said to be exhaustive if one of them must necessarily happen every time the experiment is performed.

When a die is thrown events 1, 2, 3, 4, 5, 6 form an exhaustive set of events.

We can say that the total number of elementary events of a random experiment is called the exhaustive number of cases.
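Exhaustiveness amounts to the union of the events being the whole sample space; a quick sketch for the die (the particular event choices are illustrative):

```python
# Die sample space and three events that together cover it.
S = {1, 2, 3, 4, 5, 6}
events = [{1, 2}, {3, 4}, {5, 6}]

# Exhaustive: at least one event must happen, i.e. their union is S.
print(set().union(*events) == S)  # True
```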

Two or more events are said to be mutually exclusive if the occurrence of one of them rules out the occurrence of the others. Thus, two or more events are mutually exclusive if no two of them can occur together.

Hence, $A_1, A_2, A_3,..., A_n$ are mutually exclusive if and only if $A_i \cap A_j = \phi , \text{ for }i \neq j$

**A).** When a coin is tossed the event of occurrence of a head and the event of occurrence of a tail are mutually exclusive events because we cannot have both head and tail at the same time.

**B).** When a die is thrown, the sample space is S = {1, 2, 3, 4, 5, 6}

Let A be the event of occurrence of a number greater than 4, i.e., A = {5, 6},

B be the event of occurrence of an odd number, i.e., B = {1, 3, 5}, and

C be the event of occurrence of an even number, i.e., C = {2, 4, 6}.

Here, events B and C are Mutually Exclusive, but events A and B, or A and C, are not Mutually Exclusive.
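The pairwise-disjoint condition $A_i \cap A_j = \phi$ can be tested directly on these die events; a sketch (the helper name is illustrative):

```python
from itertools import combinations

def mutually_exclusive(events):
    """True if no two of the given events share an outcome."""
    return all(a.isdisjoint(b) for a, b in combinations(events, 2))

A = {5, 6}      # number greater than 4
B = {1, 3, 5}   # odd number
C = {2, 4, 6}   # even number

print(mutually_exclusive([B, C]))  # True: no common outcome
print(mutually_exclusive([A, B]))  # False: both contain 5
```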

Two or more events are said to be independent if the occurrence or non-occurrence of any of them does not affect the probability of occurrence or non-occurrence of the others; that is, none of them influences the occurrence or non-occurrence of the other events.

Let a bag contain 3 Red and 2 Black balls. Two balls are drawn one by one **with replacement**.

Let A be the event of occurrence of a red ball in the first draw, and

B be the event of occurrence of a black ball in the second draw.

Then the probability of occurrence of B is not affected if A occurs before B, as the drawn ball has been replaced in the bag and once again one ball is selected out of the 5 (3R + 2B) given balls for event B.
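Because the first ball is replaced, the second draw always faces the same 5 balls, so P(B) stays 2/5 whether or not A occurred; exact fractions make this explicit (a sketch of the bag above):

```python
from fractions import Fraction

red, black = 3, 2
total = red + black

# With replacement, the second draw always sees all 5 balls,
# so conditioning on the first draw changes nothing.
p_black_second = Fraction(black, total)                  # P(B)
p_black_second_given_red_first = Fraction(black, total)  # P(B | A)

print(p_black_second)  # 2/5
print(p_black_second == p_black_second_given_red_first)  # True
```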

For a random experiment, let E be an event, say E = {a, b, c}. If the outcome of the experiment is either a, b, or c, then we say the event E has occurred.

**Sample Space:** The outcomes of any type

**Event:** The outcomes of a particular type

Let S be the sample space; then the probability of occurrence of an event E is denoted by $P(E)$ and is defined as

$P(E) = \dfrac{n(E)}{n(S)}$ $= \dfrac{\text{number of elements in E}}{\text{number of elements in S}}$

$P(E) = \dfrac{\text{number of favourable/particular cases}}{\text{total number of cases}}$

**A).** When a coin is tossed, then the sample space is S = {H, T}

Let E be the event of occurrence of a head:

E = {H}

Then $P(E) = \dfrac{n(E)}{n(S)} = \dfrac{1}{2}$

**B).** When a die is thrown, sample space S = {1, 2, 3, 4, 5, 6}

Let A be the event of occurrence of an odd number,

and B be the event of occurrence of a number greater than 4:

A = {1, 3, 5} and B = {5, 6}

$\text{P(A)} = \text{Probability of occurrence of an odd number}$ $= \dfrac{n(A)}{n(S)}$

$= \dfrac{3}{6} = \dfrac{1}{2}$ and

$P(B) = \text{Probability of occurrence of a number greater than 4}$ $= \dfrac{n(B)}{n(S)}$

$= \dfrac{2}{6} = \dfrac{1}{3}$
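The two die events above can be computed with exact fractions; a minimal sketch:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # die throw
A = {1, 3, 5}            # odd number
B = {5, 6}               # number greater than 4

def P(E):
    """P(E) = n(E) / n(S), as an exact fraction."""
    return Fraction(len(E), len(S))

print(P(A))  # 1/2
print(P(B))  # 1/3
```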

Let $S$ denote the sample space of a random experiment.

1. For any event $E$, $P(E) \geq 0$

2. $P(S) = 1$

3. For a finite or countably infinite sequence of disjoint events $E_1, E_2, ...$

$P(E_1 \cup E_2 \cup E_3 \cup ... ) = \sum\limits_{i}P(E_i)$

Let E' be the complement of E defined by $ E'= S-E$

The following always holds: $P(E)=1-P(E')$

Since $ E'= S-E$, we have $E \cup E'=S$ and $E \cap E'= \phi $. Hence

$P(E)+ P(E')=P(E \cup E')$ (By Axiom 3)

$=P(S)$

$=1$ (By Axiom 2)

Solving the equation for $P(E)$ completes the proof.
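Theorem 1 can be sanity-checked on a concrete event; a sketch reusing the die sample space:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
E = {1, 3, 5}            # odd number
E_comp = S - E           # E' = S - E

def P(X):
    return Fraction(len(X), len(S))

print(P(E) == 1 - P(E_comp))  # True: P(E) = 1 - P(E')
```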

$P( \phi ) = 0$

Since $ \phi ' = S$, by Theorem 1 we have

$P( \phi ) = 1-P( \phi ') = 1-P(S)$

$=1-1$ (By Axiom 2)

$=0$

If $A \subseteq B$, then $P(A) \leq P(B)$

Since $B=A \cup (B-A)$ where $A$ and $B-A$ are disjoint, Axiom 3 implies

$P(B)=P(A)+P(B-A)$.

By Axiom 1, $P(B-A) \geq 0$.

Hence $P(A) \leq P(B)$.

For any event E, $0 \leq P(E) \leq 1$

Since Axiom 1 gives us $P(E) \geq 0$, we need only show that $P(E) \leq 1$.

But $E \subseteq S$ implies $P(E) \leq P(S)$ by Theorem 3.

Therefore $P(E) \leq 1$ by Axiom 2.

For any two events A and B, $P(A \cup B)=P(A)+P(B)-P(A \cap B)$

A, B, and $A \cup B$ can be partitioned as follows:

$A = (A \cap B') \cup (A \cap B)$

$B= (B \cap A' ) \cup (A \cap B)$

$A \cup B= (A \cap B') \cup (B \cap A' ) \cup (A \cap B) $

We have

$P(A)= P(A \cap B' )+P(A \cap B)$

$P(B)=P(B \cap A')+P(A \cap B)$

$P(A \cup B)= P(A \cap B' )+P(B \cap A')+P(A \cap B )$

Therefore,

$P(A \cup B)= P(A \cap B')+ P(B \cap A')+P(A \cap B) $

$= [P(A \cap B' )+P(A \cap B)]+ [P(B \cap A')+P(A \cap B)]-P(A \cap B)$

$=P(A)+P(B)-P(A \cap B)$
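The addition theorem is easy to verify with sets; a sketch using the earlier die events:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {1, 3, 5}            # odd number
B = {5, 6}               # number greater than 4

def P(E):
    return Fraction(len(E), len(S))

lhs = P(A | B)                   # P(A ∪ B)
rhs = P(A) + P(B) - P(A & B)     # P(A) + P(B) - P(A ∩ B)
print(lhs)         # 2/3
print(lhs == rhs)  # True
```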

$P(A \cup B \cup C)=P(A)+P(B)+P(C)-P(A \cap B)-P(A \cap C)-P(B \cap C)+P(A \cap B \cap C)$

$P( A | B)=\dfrac{P(A \cap B)}{P(B)}$ provided $P(B) \neq 0$

$P(A \cap B)=P(A)*P(B | A)$

$=P(B)*P(A | B)$

$P(A \cap B \cap C)=P(A)*P(B|A)*P(C|A \cap B)$
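Conditional probability and the multiplication rule can be checked on the die events; a minimal sketch:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {5, 6}               # number greater than 4
B = {1, 3, 5}            # odd number

def P(E):
    return Fraction(len(E), len(S))

p_A_given_B = P(A & B) / P(B)          # P(A|B) = P(A ∩ B) / P(B)
print(p_A_given_B)                     # 1/3
print(P(A & B) == P(B) * p_A_given_B)  # True: multiplication rule
```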

If $A_1,A_2,A_3......A_n$ are mutually independent events, then

$P(A_1 \cap A_2 \cap ... \cap A_n) = P(A_1)*P(A_2)* ... *P(A_n)$
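For mutually independent events the joint probability is simply the product of the individual probabilities; for example, three independent fair-coin tosses (an illustrative setup):

```python
from fractions import Fraction
from math import prod

# P(head) on each of three independent fair tosses.
p_heads = [Fraction(1, 2)] * 3

# P(all three heads) = product of the individual probabilities.
print(prod(p_heads))  # 1/8
```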

If $B_1,B_2,B_3, ... ,B_n$ are mutually exclusive events with

$\bigcup\limits_{i} B_i=S$, then

$P(A)=P(A \cap B_1)+ P(A \cap B_2)+ P(A \cap B_3)+ ... + P(A \cap B_n)$

$P(A | B)=\dfrac{P(B|A)*P(A)}{P(B|A)*P(A)+P(B|A')*P(A')}$
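The total-probability and Bayes formulas above, applied to a small hypothetical two-bag setup (all numbers are illustrative, not from the text):

```python
from fractions import Fraction

# Hypothetical setup: bag 1 holds 3 red / 2 black, bag 2 holds 1 red / 4 black;
# a bag is chosen at random, then one ball is drawn from it.
p_B1, p_B2 = Fraction(1, 2), Fraction(1, 2)
p_red_B1 = Fraction(3, 5)        # P(red | bag 1)
p_red_B2 = Fraction(1, 5)        # P(red | bag 2)

# Total probability: P(red) = P(red|B1)P(B1) + P(red|B2)P(B2)
p_red = p_red_B1 * p_B1 + p_red_B2 * p_B2
print(p_red)  # 2/5

# Bayes' theorem: P(B1 | red)
print(p_red_B1 * p_B1 / p_red)  # 3/4
```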