
BASICS OF PROBABILITY

Unit Structure

1.0 Objectives

1.1 Introduction

1.2 Some Terminologies and notations

1.3 Different Approaches of Probability

1.4 Chapter End Exercises

1.0 Objectives

After going through this chapter you will learn

- What is a random experiment? How it forms the basis for "probability".

- The notion of sample space and its types.

- Various types of events.

- Operations on events and the laws these operations obey.

- The mathematical and statistical definitions of probability and their limitations.

1.1 Introduction

In basic sciences we usually come across deterministic experiments, whose results are not uncertain. The theory of probability is based on statistical or random experiments. These experiments have the following peculiar features.

Definition 1.1. Random Experiment: A non-deterministic experiment is called a random experiment if

1. It is not known in advance what the result of a performance (trial) of the experiment will be.

2. It is possible to list all possible outcomes of the experiment prior to conducting it.

3. Under identical conditions, it is possible to repeat the experiment as many times as one wishes.

1munotes.in


SET THEORY AND LOGIC & ELEMENTARY PROBABILITY THEORY

Definition 1.2. Sample space: The collection of all possible outcomes of a random experiment is known as the sample space.

The sample space is denoted by Ω, and an element of Ω by ω.

1. Each ω represents a single outcome of the experiment.

2. The elements of Ω are called sample points, and the total number of sample points is denoted by #Ω.

3. The number of elements of Ω may be finite, or Ω may have one-one correspondence with ℕ, or with ℝ.

4. Depending on its nature, Ω is called finite, countable or uncountable.

Example 1.1.

1. A roulette wheel with a pointer fixed at the center is spun. When it comes to rest, the angle made by the pointer with the positive direction is noted. This experiment is random, since we do not know before spinning where the pointer will rest. But it may make any angle in (0°, 360°); thus here the sample space is Ω = (0°, 360°). It is a subset of ℝ, and it is uncountable.

2. A coin is tossed until it turns up head. The number of tosses before we get a head is noted. This is a random experiment. The corresponding sample space Ω = {0, 1, 2, 3, ...} has one-one correspondence with ℕ, so it is countable.

3. A gambler enters a casino with some initial capital. His policy is to continue to bet for a unit stake until either his fortune reaches "C" or his funds are exhausted. The gambler's fortune after any game, though uncertain, can be listed. The sample space of this random experiment is Ω = {0, 1, 2, 3, ..., C}. Here the sample space is finite.

1.2 Some Terminologies and notations

Event: Any subset of Ω is termed an event. Thus, corresponding to a random experiment, a phenomenon which may or may not be observed as a result of the experiment is called an event.

Note: An event is made up of one or many outcomes. Outcomes which entail the happening of the event are said to be favorable to the event. An event is generally denoted by capital letters. The number of sample points in an event A is denoted by #A.

Algebra of events: Since events are sets, the algebraic operations on sets work for events.

- Union of two events: If A and B are two events of Ω, then their union is an event representing the occurrence of at least one of them (A or B or both), denoted by A ∪ B.

Thus, A ∪ B = {ω ∈ Ω : ω ∈ A, or ω ∈ B, or ω ∈ both A and B}.

- Intersection of two events: If A and B are two events of Ω, then their intersection is an event representing the simultaneous occurrence of both A and B, denoted by A ∩ B.

Thus, A ∩ B = {ω ∈ Ω : ω ∈ A and ω ∈ B}.

- Complement of an event: Non-occurrence of an event is its complementary event. The complement of an event A is denoted by Aᶜ. It contains the ω that are not in A. Thus, Aᶜ = {ω ∈ Ω : ω does not belong to A}.

- Relative complementation: Out of two events, the occurrence of exactly one event is the relative complement of the other. In particular, if an event A occurs but B does not, it is the complement of B relative to A. It is denoted by A − B or A ∩ Bᶜ. This event contains all sample points of A that are not in B. Similarly, B − A or B ∩ Aᶜ represents an event that contains all sample points of B that are not in A. Thus, A − B = {ω ∈ Ω : ω ∈ A and ω does not belong to B}.

- Finite union and countable union: Let A₁, A₂, ..., Aₙ be events of the sample space. ∪ᵢ₌₁ⁿ Aᵢ is called the finite union of the events. If n → ∞ we have ∪ᵢ₌₁^∞ Aᵢ, which is called the countable union of the events.

- Finite intersection and countable intersection: Let A₁, A₂, ..., Aₙ be events of the sample space. ∩ᵢ₌₁ⁿ Aᵢ is called the finite intersection of the events. If n → ∞ we have ∩ᵢ₌₁^∞ Aᵢ, which is called the countable intersection of the events.

## Page 4

4SET THEORY AND LOGIC & ELEMENTRY PROBABILITY THEORY

Laws of operations: Union and intersection are set operations; they obey the following laws.

- Commutative law: (i) A ∪ B = B ∪ A and (ii) A ∩ B = B ∩ A.

- Reflexive law: (i) A ∪ A = A and (ii) A ∩ A = A.

- Associative law: (i) (A ∪ B) ∪ C = A ∪ (B ∪ C) and (ii) (A ∩ B) ∩ C = A ∩ (B ∩ C).

- Distributive law: (i) A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C) and (ii) A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C).

- De Morgan's law: (i) (A ∪ B)ᶜ = Aᶜ ∩ Bᶜ and (ii) (A ∩ B)ᶜ = Aᶜ ∪ Bᶜ.

Impossible event: An event corresponding to the empty set ∅.

Certain event: An event corresponding to Ω.

Mutually exclusive events: When the occurrence of one event excludes the occurrence of the other, the two events are called mutually exclusive events. Alternately, when two events cannot occur simultaneously, they are called mutually exclusive. Here A ∩ B = ∅.

Exhaustive events: Two events are said to be exhaustive if together they form the sample space. Alternately, when all sample points are included in them, they are called exhaustive events. Here A ∪ B = Ω.

Equally likely events: If we have no reason to expect any of the events in preference to the others, we call the events equally likely.

Indicator function: The indicator function of an event A is denoted by I_A(ω) and defined as

I_A(ω) = 1 if ω ∈ A, and I_A(ω) = 0 if ω ∉ A.    (1.1)

Partition of the sample space: Let A₁, A₂, ..., Aₙ be events of the sample space such that they are mutually exclusive and exhaustive; then they are said to form a (finite) partition of the sample space.
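As an illustrative sketch (not from the text), the event operations, the indicator function (1.1) and the partition conditions can be checked with Python's built-in sets; the function names here are our own.

```python
# Events as Python sets over a small, hypothetical sample space.
omega = {1, 2, 3, 4, 5, 6}          # sample space Ω
A = {1, 3, 5}
B = {1, 2, 3, 4}

# De Morgan's law: (A ∪ B)ᶜ = Aᶜ ∩ Bᶜ
lhs = omega - (A | B)
rhs = (omega - A) & (omega - B)
assert lhs == rhs

def indicator(event, w):
    """I_A(ω): 1 if ω ∈ A, else 0 (equation 1.1)."""
    return 1 if w in event else 0

def is_partition(blocks, omega):
    """Mutually exclusive and exhaustive?"""
    union = set()
    for b in blocks:
        if union & b:            # overlap found -> not mutually exclusive
            return False
        union |= b
    return union == omega        # exhaustive

print(indicator(A, 3), indicator(A, 2))               # 1 0
print(is_partition([{1, 2}, {3, 4}, {5, 6}], omega))  # True
```

The partition check mirrors the definition exactly: pairwise disjointness is tested incrementally, and exhaustiveness by comparing the accumulated union with Ω.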

## Page 5

5Chapter 1: Basics of Probability

So, if A₁, A₂, ..., Aₙ form a partition of a sample space, then for every i ≠ j, i, j = 1, 2, ..., n, Aᵢ ∩ Aⱼ = ∅, and ∪ᵢ₌₁ⁿ Aᵢ = Ω.

Note: The concepts of mutually exclusive and exhaustive events, and hence of partition, can be generalized to countably many events A₁, A₂, ...

Example 1.2. Ω = {e₁, e₂, e₃, e₄, e₅}. If A = {e₁, e₃, e₅} and B = {e₁, e₂, e₃, e₄}, answer the following: (i) Are A, B mutually exclusive? (ii) Are A, B exhaustive? (iii) If C = {e₂, e₄}, find A ∪ B ∪ C and A ∩ B ∩ C.

Solution: (i) Since A ∩ B = {e₁, e₃}, which is non-null, A and B are not mutually exclusive. (ii) A ∪ B = Ω, so A, B are exhaustive.

(iii) A ∪ B ∪ C = Ω and A ∩ B ∩ C = ∅.

1.3 Different Approaches of Probability

Definition 1.3. Classical or mathematical definition (Laplace): If a random experiment results in N mutually exclusive, exhaustive and equally likely outcomes, M of which are favorable to the occurrence of the event A, then the probability of the event A is defined as the ratio M/N, and denoted by P(A):

P(A) = M/N = #A / #Ω

This definition has limitations:

1. It is not applicable when the outcomes are not equally likely.

2. We may not always come across a random experiment that results in a finite number of outcomes.

3. Even if the outcomes are finite, they may not be enumerable, or it may not be possible to count the number favorable to the event of interest.
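The classical rule P(A) = #A/#Ω for a finite sample space can be sketched in a few lines; the helper name below is our own, not the text's.

```python
from fractions import Fraction

def classical_probability(event, omega):
    """P(A) = #A / #Ω, valid only for finite, equally likely outcomes."""
    event, omega = set(event), set(omega)
    assert event <= omega, "event must be a subset of the sample space"
    return Fraction(len(event), len(omega))

# A fair six-sided die: P(even face) = 3/6 = 1/2
omega = range(1, 7)
evens = {2, 4, 6}
print(classical_probability(evens, omega))  # 1/2
```

Using `Fraction` keeps the ratio exact, which matches the definition's M/N form.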

## Page 6

6SET THEORY AND LOGIC & ELEMENTRY PROBABILITY THEORY

Definition 1.4. Empirical or statistical definition (von Mises): If a random experiment is conducted N times, out of which M times it results in outcomes favorable to an event A, then the limiting value of the ratio M/N is called the probability of A:

P(A) = lim_{N→∞} M/N

Some remarks on this definition:

1. It gives a stabilized value of the relative frequency, and overcomes to some extent the drawbacks of the classical approach.

2. This definition also has limitations. First, it may not be possible to repeat the experiment under identical conditions a large number of times, due to budgeted time and cost.

3. On repeating the experiment a large number of times, the conditions no longer remain identical.

4. Since it is based on the concept of a limit, the drawbacks of limits are present in this definition as well. However, it works satisfactorily and is widely used.

Example 1.3. What is the probability that a positive integer selected at random from the set of positive integers not exceeding 100 is divisible by (i) 5, (ii) 5 or 3, (iii) 5 and 3?

Solution: Ω = {1, 2, ..., 100}, so #Ω = 100.

(i) Let A be the event that the number is divisible by 5, so A = {5, 10, ..., 100}, and #A = 20.

P(A) = #A/#Ω = 20/100 = 0.2

(ii) Let B be the event that the number is divisible by 3, so B = {3, 6, ..., 99}, and #B = 33.

P(B) = #B/#Ω = 33/100 = 0.33

(iii) Let C be the event that the number is divisible by 5 or 3, C = A ∪ B, so #C = 20 + 33 − 6 = 47.

P(C) = #C/#Ω = 47/100 = 0.47

(iv) Let D be the event that the number is divisible by 5 and 3, D = A ∩ B, so #D = #(A ∩ B) = 6.

P(D) = #D/#Ω = 6/100 = 0.06
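Example 1.3 can be confirmed by brute-force enumeration; this is our own check, not part of the text.

```python
# Enumerate Ω = {1, ..., 100} and count favorable outcomes directly.
omega = range(1, 101)
A = {n for n in omega if n % 5 == 0}    # divisible by 5
B = {n for n in omega if n % 3 == 0}    # divisible by 3

p = lambda event: len(event) / 100      # classical rule #A/#Ω
print(p(A), p(B), p(A | B), p(A & B))   # 0.2 0.33 0.47 0.06
```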

Example 1.4. What is the probability that in a random arrangement of the letters of the word "REGULATIONS"

(i) all vowels are together, (ii) no two vowels are together?

Solution: Since there are 11 letters in the word, they can be arranged in 11! distinct ways, so #Ω = 11!.

(i) Let A be the event that the random arrangement has all vowels together. The 5 vowels form one group to be kept together; with the remaining 6 consonants this is a random arrangement of 7 entities in all, which can be done in 7! ways. Within the group of 5 vowels the arrangement can be done in 5! ways. So #A = 7! × 5!.

P(A) = #A/#Ω = 7!5!/11! ≈ 0.01515

(ii) Let B be the event that the random arrangement has no two vowels together. The consonants can be arranged as *C*C*C*C*C*C*, where C stands for a consonant. The 5 vowels can be placed in the 7 starred positions in ⁷P₅ ways and the 6 consonants arranged in 6! ways, so #B = ⁷P₅ × 6! = (7!/2!) × 6!.

P(B) = #B/#Ω = 7! 6!/(2! 11!) ≈ 0.04545
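The two counts in Example 1.4 can be checked numerically (a sketch of ours using Python's math module):

```python
import math

n_all = math.factorial(11)                      # #Ω = 11!
p_together = math.factorial(7) * math.factorial(5) / n_all   # vowels in one block
p_separated = math.perm(7, 5) * math.factorial(6) / n_all    # vowels in 7 gaps

print(round(p_together, 5), round(p_separated, 5))  # 0.01515 0.04545
```

Note that the two answers reduce to the exact fractions 1/66 and 1/22 respectively.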

Example 1.5. From a well-shuffled pack of 52 cards, four cards are selected without replacing the selected cards. Jack, queen, king and ace cards are treated as honor cards. a) What is the probability that there are (i) all honor cards, (ii) more honor cards, that is, 4 or 3 honor cards? b) What will these probabilities be if the cards are drawn with replacement?

Solution:

a) Since there are 52 cards in the pack, 4 can be selected without replacement in ⁵²C₄ distinct ways, so #Ω = ⁵²C₄ = 270725.

i) Let A be the event that the random selection has 4 honor cards. Since there are in all 4 × 4 = 16 honor cards, #A = ¹⁶C₄ = 1820.

P(A) = #A/#Ω = ¹⁶C₄/⁵²C₄ ≈ 0.0067

ii) Let B be the event that the random selection has more honor cards, that is, 4 or 3 honor cards. #B = ¹⁶C₄ + ¹⁶C₃ × ³⁶C₁ = 1820 + 560 × 36 = 21980.

P(B) = #B/#Ω = 21980/270725 ≈ 0.08119

b) 4 cards can be selected with replacement in 52⁴ ways, so #Ω = 52⁴ = 7311616.

i) Let C be the event that the random selection has 4 honor cards. #C = 16⁴.

P(C) = #C/#Ω = 16⁴/52⁴ ≈ 0.00896

ii) Let D be the event that the random selection has more honor cards, that is, 4 or 3 honor cards. Since the draws are ordered, the single non-honor card can occupy any of the 4 positions:

#D = 16⁴ + 4 × 16³ × 36 = 65536 + 589824 = 655360

P(D) = #D/#Ω = 655360/7311616 ≈ 0.08963
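The counts in Example 1.5 can be verified combinatorially; this sketch is our own.

```python
from math import comb

# Without replacement: unordered 4-card hands.
n = comb(52, 4)                                   # 270725 equally likely hands
p_all_honor = comb(16, 4) / n
p_more_honor = (comb(16, 4) + comb(16, 3) * 36) / n

# With replacement the 4 draws are ordered: 52**4 equally likely sequences.
m = 52 ** 4
q_all_honor = 16 ** 4 / m
q_more_honor = (16 ** 4 + 4 * 16 ** 3 * 36) / m   # 4 positions for the non-honor card

print(round(p_all_honor, 4), round(p_more_honor, 5))
print(round(q_all_honor, 5), round(q_more_honor, 5))
```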

Example 1.6. In a party of 22 people, find the probability that (i) all have different birthdays, (ii) exactly two persons have the same birthday, (iii) 11 persons have their birthday in the same month and the remaining 11 in different months.

Solution: We assume that none of them has a birthday on 29th February.

(i) All 22 people can have any of 365 days as their birthday, in 365²² ways. Thus #Ω = 365²².

Let A be the event that all have different birthdays; #A = ³⁶⁵P₂₂.

Hence P(A) = ³⁶⁵P₂₂/365²² ≈ 0.5243.

(ii) Let B be the event that two have the same birthday and the remaining 20 have different birthdays. Any 2 out of 22 can be chosen to have the same birthday in ²²C₂ ways, and the 21 distinct birthdays can then be chosen from 365 days in ³⁶⁵P₂₁ ways.

#B = ²²C₂ × ³⁶⁵P₂₁

Hence P(B) ≈ 0.352.

(iii) Let C be the event that 11 have their birthday in the same month and the remaining 11 in different months.

Now #Ω = 12²².

And #C = ²²C₁₁ × 12 × 11!

Hence P(C) = #C/#Ω ≈ 6.1 × 10⁻¹⁰.
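Parts (i) and (ii) of Example 1.6 can be confirmed numerically (our own sketch):

```python
from math import comb, perm

n = 365 ** 22
p_distinct = perm(365, 22) / n                  # (i) all 22 birthdays different
p_one_pair = comb(22, 2) * perm(365, 21) / n    # (ii) exactly one shared pair

print(round(p_distinct, 4), round(p_one_pair, 3))  # 0.5243 0.352
```

This is the classic birthday-problem computation: with 22 people the chance that all birthdays differ is still just above one half.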

The notion of probability is given a modern approach which is based on measure theory. For this it is necessary to introduce classes of subsets of Ω. In the next chapter we will discuss various classes of sets.

1.4 Chapter End Exercises

1. Cards are to be prepared bearing a four-digit number formed by choosing digits among 1, 4, 5, 6 and 8. Find the probability that a randomly chosen card among them bears (i) an even number, (ii) a number divisible by 4, (iii) a number with all four digits the same.

2. A sample of 50 people is surveyed for their blood group. 22 people have 'A' blood group, 5 have 'B' blood group, 21 have 'O' blood group and 2 have 'AB' blood group. Find the probability that a randomly chosen person has (i) either 'A' or 'B' blood group, (ii) neither 'A' nor 'B' blood group.

3. A roulette wheel has 40 spaces numbered from 1 to 40. Find the probability of getting (i) a number greater than 25, (ii) an odd number, (iii) a prime number.


4. A, B, C form a partition. If the event A is twice as likely as B, and event C is thrice as likely as A, find their respective probabilities.

5. What is the probability that in a random arrangement of the letters of the word "CHILDREN"

(i) all vowels are together,

(ii) no two vowels are together?

6. A committee of 5 is to be formed from among a coordinator, a chairperson, five research guides and three research students. What is the probability that the committee (i) has neither the coordinator nor the chairperson, (ii) consists of all research guides, (iii) has none of the students?

7. 9 people are randomly seated at a round table. What is the probability that a

particular couple sit next to each other?

8. In a box there are 10 bulbs, out of which 4 are not working. An electrician selects 3 bulbs from the box at random. What is the probability that at least one of the bulbs is working?

9. Ω = {1, 2, ..., 50}. A denotes the numbers divisible by 5, B denotes the numbers up to 30, C the numbers greater than 25 and D the numbers less than or equal to 4. Answer the following:

(i) Which events are exhaustive?

(ii) Which events are mutually exclusive?

(iii) Give a pair of events which is mutually exclusive but not exhaustive.

(iv) Give a pair of events which is not mutually exclusive but exhaustive.

(v) Give a pair of events which is neither mutually exclusive nor

exhaustive.

10. A pair of fair dice is thrown. What is the probability that the sum of the numbers on the faces of the dice is (i) 6, 7 or 8, (ii) divisible by 5, (iii) a prime number?

11. What is the probability that in a group of 25 people (i) all have different birthdays, (ii) 11 have birthdays in different months and 14 in the same month?

12. Five letters are to be kept in five self-addressed envelopes. What is the probability that (i) all go to the correct envelopes, (ii) none of them goes to the correct envelope?


13. The coefficients a, b, c of the quadratic equation ax² + bx + c = 0 are obtained by throwing a die thrice. Find the probability that the equation has real roots.

14. What is the probability that there are 53 Thursdays and 53 Fridays in a leap

year?

15. A sequence of 10 bits is randomly generated. What is the probability that (i) at least one of these bits is 0, (ii) the sequence has an equal number of 0s and 1s?

16. The odds against an event A are 3:5; the odds in favor of an event B are 7:5. What are the probabilities of the events?

17. In a group of 12 persons, what is the probability that (i) each of them has a different birthday, (ii) each of them has a birthday in a different calendar month?

18. A, B, C are mutually exclusive, with P(A) = (1 − 3x)/4, P(B) = (4x − 1)/2 and P(C) = (1 − x)/6. (i) Show that the range for x is 1/4 ≤ x ≤ 1/3. (ii) Are they exhaustive?

19. Express A ∪ B ∪ C as a union of three mutually exclusive events.


2

FIELDS AND SIGMA FIELDS

Unit Structure

2.0 Objectives

2.1 Class of Sets

2.2 Field

2.3 σ-field and Borel σ-field

2.4 Limit of a sequence of events

2.5 Chapter End Exercises

2.0 Objectives

After going through this chapter you will learn

- A class of sets and the various closure properties that it may satisfy.

- The concept of a field and its properties.

- Sigma fields and their properties.

- The Borel sigma field and minimal sigma fields.

- Limit superior and limit inferior of a sequence of events.

2.1 Class of Sets

Before introducing the modern approach to probability, we need to define some terms from measure theory. Subsequent sections also explain their role in probability theory.

A collection of subsets of Ω is termed a class of subsets of Ω. It plays an important role in measure theory. Classes may have closure properties with respect to different set operations. Let A be a class of subsets of Ω.


Complement: A is said to be closed under complement if for any set A ∈ A, Aᶜ ∈ A.

Union: A is said to be closed under union if for any sets A, B ∈ A, A ∪ B ∈ A.

Intersection: A is said to be closed under intersection if for any sets A, B ∈ A, A ∩ B ∈ A.

Finite union and countable union: A is said to be closed under finite unions if for any sets A₁, A₂, ..., Aₙ ∈ A, ∪ᵢ₌₁ⁿ Aᵢ ∈ A. Further, if n → ∞ and ∪ᵢ₌₁^∞ Aᵢ ∈ A, then A is said to be closed under countable unions.

Finite intersection and countable intersection: A is said to be closed under finite intersections if for any sets A₁, A₂, ..., Aₙ ∈ A, ∩ᵢ₌₁ⁿ Aᵢ ∈ A. Further, if n → ∞ and ∩ᵢ₌₁^∞ Aᵢ ∈ A, then A is said to be closed under countable intersections.

Note: Closure under a countable operation implies closure under the corresponding finite operation.

2.2 Field

Definition 2.1. Field :

A class of subsets of a non̺empty set : is called a field on : if

1. . :

2. It is closed under complement .

3. It is closed under finite Union

Notationally

A class of subsets of a non ̺empty set : is called a field on : if

1. :

2. for any set A, .A

3. for any sets 12,AA , ... , 1 ., *n

ni iAA

Following points we should keep, in mind regarding field.

x Closure for complement and finite Union implies closure for

intersection.So,field is closed under finite intersections.

x ^`,:M is a field. munotes.in


- The power set P(Ω), which is the set of all subsets of Ω, is a field.

- For any A ⊂ Ω, {∅, Ω, A, Aᶜ} is the smallest field containing A.

- For any sets A, B ∈ F, A − B ∈ F, hence A Δ B ∈ F.

- If F₁ and F₂ are two fields on Ω, then F₁ ∩ F₂ is a field.

- A field is also called an algebra.

Example 2.1. Ω = {1, 2, 3}; F₁ = {∅, Ω, {1}, {2, 3}} and F₂ = {∅, Ω, {2}, {1, 3}} are two fields on Ω. Is the union of these two fields a field?

Solution: F₁ ∪ F₂ = {∅, Ω, {1}, {2}, {2, 3}, {1, 3}}.

Let A = {1} ∪ {2} = {1, 2} ∉ F₁ ∪ F₂.

Therefore F₁ ∪ F₂ is not a field.

Example 2.2. F = {A ⊂ Ω : A is finite}. Is F a field?

Solution: No. If Ω is infinite then Ω does not belong to F, and hence F cannot be a field.

Example 2.3. Complete the following class to obtain a field. Given Ω = [0, 1] and C = {[0, 1/2)}.

Solution: Add ∅, [0, 1] and [1/2, 1] to C, so that {∅, [0, 1], [0, 1/2), [1/2, 1]} is a field.
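A small checker (our own sketch, not from the text) makes the closure requirements of Definition 2.1 concrete and confirms Example 2.1. For a finite class, closure under pairwise unions suffices for closure under finite unions.

```python
from itertools import combinations

def is_field(classe, omega):
    """Check Definition 2.1: Ω ∈ F, closed under complement and pairwise union."""
    fam = {frozenset(s) for s in classe}
    omega = frozenset(omega)
    if omega not in fam:
        return False
    if any(omega - s not in fam for s in fam):      # complement closure
        return False
    return all(a | b in fam for a, b in combinations(fam, 2))  # union closure

omega = {1, 2, 3}
F1 = [set(), omega, {1}, {2, 3}]
F2 = [set(), omega, {2}, {1, 3}]
print(is_field(F1, omega), is_field(F2, omega))   # True True
print(is_field(F1 + F2, omega))                   # False: {1} ∪ {2} = {1,2} is missing
```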

2.3 σ-field and Borel σ-field

Definition 2.2. σ-field: A class C of subsets of a non-empty set Ω is called a σ-field on Ω if

1. Ω ∈ C.

2. It is closed under complement.

3. It is closed under countable unions.


Notationally, a class C of subsets of a non-empty set Ω is called a σ-field on Ω if

1. Ω ∈ C.

2. For any set A ∈ C, Aᶜ ∈ C.

3. For any sets A₁, A₂, ... ∈ C, ∪ᵢ₌₁^∞ Aᵢ ∈ C.

- A field which is closed under countable unions is a σ-field.

- Like fields, the intersection of arbitrary σ-fields is also a σ-field, but their union need not be a σ-field.

- The power set P(Ω), which is the collection of all subsets of Ω, is a σ-field.

- The class consisting of all countable sets and complements of countable sets is a σ-field.

Example 2.4. Let C be the class of subsets A of Ω such that either A or its complement is finite. Is C a field? Is it a σ-field?

Solution:

(I) C = {A ⊂ Ω : A is finite or Aᶜ is finite}.

(i) Note that C is closed under complementation, since either of A or Aᶜ is finite.

(ii) If A, B ∈ C are both finite, then A ∪ B is finite. If A is finite and B is infinite, A ∪ B is infinite; but (A ∪ B)ᶜ = Aᶜ ∩ Bᶜ, and since Bᶜ is finite, (A ∪ B)ᶜ is finite, hence A ∪ B ∈ C. Similarly, we can check the case when both A and B are infinite.

Thus, for any A, B ∈ C, A ∪ B ∈ C. So C is a field.

(II) But if the Aᵢ are finite, ∪ᵢ₌₁^∞ Aᵢ need not belong to C.

C is not closed under countable unions, hence it cannot be a σ-field.

Definition 2.3. Minimal σ-field: A σ-field of subsets of Ω is called the minimal σ-field over a class C if it is the smallest σ-field containing C.


- The minimal σ-field can be generated by taking the intersection of all σ-fields containing C.

- If A is a family of subsets of Ω, and C(A) = ∩ {C : C is a σ-field containing A}, which is the intersection of all σ-fields containing A, then C(A) is the minimal σ-field over A.

- If A itself is a σ-field, then C(A) = A.

Henceforth we term the pair (Ω, C) a sample space.

In probability theory Ω = ℝ has specific features, and the sample space (ℝ, B) plays a vital role.

Definition 2.4. Borel σ-field: Let C be the class of all open intervals (−∞, x), where x ∈ ℝ. The minimal σ-field generated by C is called the Borel σ-field, and is denoted by B.

The Borel σ-field has the following features.

1. It is clear that [x, ∞) is the complement of (−∞, x), but it does not belong to C. Thus C is not closed under complement. C is also not closed under countable intersections, as

∩ₙ₌₁^∞ (−∞, x + 1/n) = (−∞, x].

But B is closed under complements as well as countable unions and intersections. Hence B contains all intervals of the type [x, ∞).

2. (−∞, x] = ∩ₙ₌₁^∞ (−∞, x + 1/n). So B contains all intervals of the type (−∞, x].

3. (x, ∞) is the complement of (−∞, x]. Thus B contains all intervals of the type (x, ∞).

4. (a, b) = (−∞, b) ∩ (a, ∞), where a < b. So B contains all intervals of the type (a, b), and B even contains intervals of the type [a, b) and (a, b] for all a, b ∈ ℝ.

Note that the sets of B are called Borel sets.


2.4 Limit of a sequence of events

In this section the concept of the limit of a sequence of events is introduced.

Definition 2.5. Limit superior: Let {Aₙ} be a sequence of events of the space (Ω, C). The limit superior of Aₙ is the event which contains all points of Ω that belong to Aₙ for infinitely many n. It is denoted by lim supₙ Aₙ, termed the limit superior of Aₙ.

- ω ∈ lim sup Aₙ iff for each n ≥ 1 there exists an integer m ≥ n such that ω ∈ Aₘ.

- Thus lim sup Aₙ = ∩ₙ₌₁^∞ ∪ₘ₌ₙ^∞ Aₘ.

- It can be clearly seen that lim sup Aₙ ∈ C.

- lim sup Aₙ = {Aₙ i.o.}, where i.o. stands for "infinitely often".

Definition 2.6. Limit inferior: Let {Aₙ} be a sequence of events of the space (Ω, C). The limit inferior of Aₙ is the event which contains all points of Ω that belong to all but finitely many of the Aₙ. It is denoted by lim infₙ Aₙ, termed the limit inferior of Aₙ.

- ω ∈ lim inf Aₙ iff there exists some n ≥ 1 such that ω ∈ Aₘ for all m ≥ n.

- Thus lim inf Aₙ = ∪ₙ₌₁^∞ ∩ₘ₌ₙ^∞ Aₘ.

- It can be clearly seen that lim inf Aₙ ∈ C.

- lim inf Aₙ ⊆ lim sup Aₙ.

- If lim Aₙ exists, lim Aₙ = lim inf Aₙ = lim sup Aₙ.

Definition 2.7. Let {Aₙ} be a sequence of events of the space (Ω, C) such that A₁ ⊂ A₂ ⊂ ...; then {Aₙ} is called an expanding or increasing sequence, and lim Aₙ = ∪ₙ₌₁^∞ Aₙ.

Definition 2.8. Let {Aₙ} be a sequence of events of the space (Ω, C) such that A₁ ⊃ A₂ ⊃ ...; then {Aₙ} is called a contracting or decreasing sequence, and lim Aₙ = ∩ₙ₌₁^∞ Aₙ.
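For a sequence that alternates between two events, lim inf and lim sup can be computed directly from the formulas above. Below is our own finite-horizon sketch (the truncation at 48 keeps every tail long enough to contain both A and B, so the approximation is exact for a periodic sequence):

```python
def tail_union(seq, n):
    """∪ₘ≥ₙ Aₘ over the finite horizon."""
    return set().union(*seq[n:])

def tail_inter(seq, n):
    """∩ₘ≥ₙ Aₘ over the finite horizon."""
    out = set(seq[n])
    for s in seq[n:]:
        out &= s
    return out

A, B = {1, 2, 3}, {3, 4}
seq = [A if n % 2 == 1 else B for n in range(1, 51)]  # A, B, A, B, ...

lim_sup = set().union(*seq)          # start from the full union
for n in range(48):
    lim_sup &= tail_union(seq, n)    # ∩ₙ ∪ₘ≥ₙ Aₘ
lim_inf = set()
for n in range(48):
    lim_inf |= tail_inter(seq, n)    # ∪ₙ ∩ₘ≥ₙ Aₘ

print(lim_sup)  # A ∪ B: points occurring infinitely often
print(lim_inf)  # A ∩ B: points occurring in all but finitely many Aₙ
```

Since lim inf = A ∩ B differs from lim sup = A ∪ B (whenever A ≠ B), the limit of the alternating sequence does not exist, as in Exercise 8 below.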


Remark 2.1. Let {Aₙ} be a sequence of events of the space (Ω, C). (i) Cₙ = ∪ₘ₌ₙ^∞ Aₘ is a decreasing sequence, and Cₙ ↓ C, where C = lim sup Aₙ = ∩ₙ₌₁^∞ Cₙ. (ii) Bₙ = ∩ₘ₌ₙ^∞ Aₘ is an increasing sequence, and Bₙ ↑ B, where B = lim inf Aₙ = ∪ₙ₌₁^∞ Bₙ.

Remark 2.2. Let {Aₙ} be a sequence of events of the space (Ω, C). Then ∪ₘ₌ₙ^∞ Aₘ is also called supₘ≥ₙ Aₘ, and ∩ₘ₌ₙ^∞ Aₘ is also called infₘ≥ₙ Aₘ.

2.5 Chapter End Exercises

1. F is a field. If A, B ∈ F, then show that A ∩ B and A Δ B are also events of F.

2. Ω = {1, 2, 3, 4}. Which of the following classes is a field on Ω?

(i) C₁ = {∅, Ω, {1}, {4}, {2, 3}}

(ii) C₂ = {∅, Ω, {2}, {3}, {2, 3}, {1, 4}, {1, 2, 4}, {1, 3, 4}}

3. Complete the following class to obtain a field. Given Ω = (0, 1] and (i) C = {(0, 1/2], (1/2, 1]} (ii) C = {(0, 1/2], (1/2, 1], (0, 1/3], (1/3, 1]}.

4. Let C be the class of subsets A of Ω such that either A or its complement is countable. Is C a σ-field?

5. A, B, C form a partition of Ω. Obtain the smallest field containing A and C.

6. Show that the following are Borel sets.

(i) [a, b] (ii) {a} (iii) Any finite set (iv) Any countable set (v) The set of rational numbers (vi) The set of natural numbers

7. C is a σ-field on Ω = [0, 1] such that [1/(n+1), 1/n] ∈ C for n = 1, 2, .... Show that the following are events of C: (i) (1/n, 1] (ii) (0, 1/n]

8. Let Aₙ = A for n = 1, 3, 5, ... and Aₙ = B for n = 2, 4, 6, .... Find lim inf Aₙ and lim sup Aₙ, and show that lim Aₙ does not exist.


9. Prove that (i) lim sup (Aₙ ∪ Bₙ) = lim sup Aₙ ∪ lim sup Bₙ, (ii) (lim sup Aₙ)ᶜ = lim inf Aₙᶜ, (iii) lim sup (Aₙ ∩ Bₙ) ⊆ lim sup Aₙ ∩ lim sup Bₙ.

10. Are the above results true for lim inf?

11. If Aₙ ↑ A, then show that Aₙᶜ ↓ Aᶜ.


3

PROBABILITY MEASURE AND LEBESGUE MEASURE

Unit Structure

3.0 Objectives

3.1 Probability Measure

3.2 Lebesgue Measure and Integral

3.3 Discrete and absolutely continuous probability measures

3.4 Chapter End Exercises

3.0 Objectives

After going through this chapter you will learn

- A function defined on the sample space, called a probability measure.

- Types of probability measures: discrete and continuous.

- Lebesgue measure and the Lebesgue integral.

- Properties of the probability function.

- Probability of the limit of a sequence of events.

3.1 Probability Measure

The modern approach to probability is based on measure theory. The following definition is due to Kolmogorov (1933).

Definition 3.1. Axiomatic definition of probability: Let C be the σ-field associated with the sample space Ω. A function P(·) defined on C taking values in [0, 1] is called a probability measure, or simply a probability, if it satisfies the following axioms.

1. P(A) ≥ 0 for all A ∈ C.

2. P(Ω) = 1.

3. If A₁, A₂, ... is a sequence of mutually exclusive events of C, then

P(∪ᵢ₌₁^∞ Aᵢ) = Σᵢ₌₁^∞ P(Aᵢ)    (3.1)


- The axioms are respectively called non-negativity, normality and countable additivity.

- In this definition probabilities have already been assigned to the events, by some method or by past information.

- The triplet (Ω, C, P) is called a probability space.

- Depending on Ω, different types of probability spaces arise.

- If Ω is finite or countable (at most countable), the probability space is discrete.

- If Ω has one-one correspondence with ℝ, the probability space is continuous.

Properties of the probability function.

Complement: P(Aᶜ) = 1 − P(A).

Proof:

Ω = A ∪ Aᶜ

A and Aᶜ are mutually exclusive events, so

P(Ω) = P(A) + P(Aᶜ)   ... by the countable additivity axiom.

L.H.S. = 1   ... by the normality axiom.

R.H.S. = P(A) + P(Aᶜ). Hence

1 = P(A) + P(Aᶜ)

P(Aᶜ) = 1 − P(A)

Monotonicity: If A and B are in C such that A ⊂ B, then P(A) ≤ P(B).

Proof: B = A ∪ (B − A).

Since A and B − A are mutually exclusive, by the countable additivity axiom

P(B) = P(A) + P(B − A)

So P(A) ≤ P(B), as P(B − A) ≥ 0.


Subtractivity: If A and B are in C such that A ⊂ B, then P(B − A) = P(B) − P(A).

Proof: From the above proof,

P(B) = P(A) + P(B − A)

Thus

P(B − A) = P(B) − P(A)

Similarly, we can say that if A and B are in C such that B ⊂ A, then P(A − B) = P(A) − P(B).

Continuity: If limₙ→∞ Aₙ = A, then limₙ→∞ P(Aₙ) = P(A).

Theorem 3.1. (i) If {Aₙ} is an expanding (increasing) sequence of events of the space (Ω, C), then

limₙ→∞ P(Aₙ) = P(∪ₙ₌₁^∞ Aₙ)    (3.2)

(ii) If {Aₙ} is a contracting (decreasing) sequence of events of the space (Ω, C), then

limₙ→∞ P(Aₙ) = P(∩ₙ₌₁^∞ Aₙ)    (3.3)

Proof: (i) Let {Aₙ} be the sequence of increasing events, so A₁ ⊂ A₂ ⊂ .... Let B₁ = A₁ and Bⱼ = Aⱼ − Aⱼ₋₁ for j ≥ 2; the Bⱼ are mutually exclusive.

So Aₙ = ∪ⱼ₌₁ⁿ Bⱼ and

∪ⱼ₌₁^∞ Aⱼ = ∪ⱼ₌₁^∞ Bⱼ    (3.4)

By (3.4),

P(∪ⱼ₌₁^∞ Aⱼ) = P(∪ⱼ₌₁^∞ Bⱼ)

= Σⱼ₌₁^∞ P(Bⱼ)   ... by countable additivity

= limₙ→∞ Σⱼ₌₁ⁿ P(Bⱼ)   ... by the definition of the sum of a series

= limₙ→∞ P(∪ⱼ₌₁ⁿ Bⱼ)   ... by finite additivity

= limₙ→∞ P(Aₙ)   ... by the definition of Aₙ.

Hence limₙ→∞ P(Aₙ) = P(∪ₙ₌₁^∞ Aₙ).

(ii) Let {Aₙ} be the sequence of decreasing events, so A₁ ⊃ A₂ ⊃ .... Hence {Aₙᶜ} is a sequence of increasing events: A₁ᶜ ⊂ A₂ᶜ ⊂ .... Applying the result in (i) to {Aₙᶜ},

limₙ→∞ P(Aₙᶜ) = P(∪ₙ₌₁^∞ Aₙᶜ)    (3.5)


R.H.S. = P(∪ₙ₌₁^∞ Aₙᶜ) = P((∩ₙ₌₁^∞ Aₙ)ᶜ) = 1 − P(∩ₙ₌₁^∞ Aₙ)   ... by De Morgan's law and complementation.

L.H.S. = limₙ→∞ [1 − P(Aₙ)] = 1 − limₙ→∞ P(Aₙ)   ... by complementation.

Equating the two sides,

limₙ→∞ P(Aₙ) = P(∩ₙ₌₁^∞ Aₙ)

Theorem 3.2. The continuity property of probability.

If limₙ→∞ Aₙ = A, then limₙ→∞ P(Aₙ) = P(A)    (3.6)

Proof: lim inf Aₙ = ∪ₙ₌₁^∞ ∩ₘ₌ₙ^∞ Aₘ = ∪ₙ₌₁^∞ Bₙ,

where

Bₙ = ∩ₘ₌ₙ^∞ Aₘ

These Bₙ are increasing events: Bₙ ↑ B = ∪ₙ₌₁^∞ Bₙ, say.

Using (3.2),

limₙ→∞ P(Bₙ) = P(∪ₙ₌₁^∞ Bₙ) = P(B)

lim sup Aₙ = ∩ₙ₌₁^∞ ∪ₘ₌ₙ^∞ Aₘ = ∩ₙ₌₁^∞ Cₙ,

where

Cₙ = ∪ₘ₌ₙ^∞ Aₘ

These Cₙ are decreasing events: Cₙ ↓ C = ∩ₙ₌₁^∞ Cₙ, say.

Using (3.3),

limₙ→∞ P(Cₙ) = P(∩ₙ₌₁^∞ Cₙ) = P(C)

Now consider,

∩ₘ₌ₙ^∞ Aₘ ⊆ Aₙ ⊆ ∪ₘ₌ₙ^∞ Aₘ    (3.7)

i.e., Bₙ ⊆ Aₙ ⊆ Cₙ.

By the monotone property of P,

P(Bₙ) ≤ P(Aₙ) ≤ P(Cₙ)

Taking limits,

limₙ→∞ P(Bₙ) ≤ limₙ→∞ P(Aₙ) ≤ limₙ→∞ P(Cₙ)

So,

P(B) ≤ limₙ→∞ P(Aₙ) ≤ P(C)

But lim Aₙ = A means

A = lim inf Aₙ = lim sup Aₙ, i.e., A = B = C,


which implies

P(B) = P(A) = P(C)

Hence limₙ→∞ P(Aₙ) = P(A)    (3.8)

Example 3.1. Which of the following are probability functions?

(i) Ω = {1, 2, 3, ...}, with C a σ-field on Ω. A function P defined on the space (Ω, C) by

P({i}) = 1/2^i for i ∈ Ω

Solution: a)

P(Ω) = Σᵢ₌₁^∞ 1/2^i = 1

b) P(A) ≥ 0 for all A ∈ C.

c) Defining the mutually exclusive events Aᵢ = {i}, we can verify countable additivity:

P(∪ᵢ₌₁^∞ Aᵢ) = Σᵢ₌₁^∞ P(Aᵢ)    (3.9)

By a), b) and c), P is a probability function.

(ii) Ω = (0, ∞), with the Borel σ-field B defined on Ω. A function P defined on the space (Ω, B) by, for any I ∈ B,

P(I) = ∫_I e^{−x} dx    (3.10)

Solution: a)

P(Ω) = ∫₀^∞ e^{−x} dx = 1

b) P(A) ≥ 0 for all A ∈ B.

c) For Aᵢ = (i − 1, i], i ≥ 1, we can verify countable additivity:

P(∪ᵢ₌₁^∞ Aᵢ) = ∫_{∪ᵢ Aᵢ} e^{−x} dx    (3.11)

The Aᵢ are mutually exclusive. From the properties of integrals,

∫_{∪ᵢ Aᵢ} e^{−x} dx = Σᵢ₌₁^∞ ∫_{Aᵢ} e^{−x} dx = Σᵢ₌₁^∞ P(Aᵢ)    (3.12)

By a), b), c), P is a probability function.


(iii) Ω = (−∞, ∞), with C a σ-field on Ω. A function P defined on the space (Ω, C) by, for any I ∈ C,

P(I) = 1/2 if I = [0, ∞), and P(I) = 1 if I = (−∞, ∞).

Solution: a) Here additivity fails: the assignments cannot be reconciled with axiom 3, so P is not a probability function.
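Parts (i) and (ii) of Example 3.1 can be checked numerically; the quadrature step below is a crude midpoint Riemann sum of our own, not a method from the text.

```python
import math

# (i) Σ 1/2**i over i = 1..∞ should equal 1 (geometric series).
s = sum(2.0 ** -i for i in range(1, 60))
print(s)  # ≈ 1.0

# (ii) ∫₀^∞ e^(−x) dx ≈ 1, via a midpoint Riemann sum on (0, 40];
# the tail beyond 40 is below e^(−40) and is negligible.
dx = 1e-3
integral = sum(math.exp(-(k + 0.5) * dx) * dx for k in range(int(40 / dx)))
print(integral)  # ≈ 1.0
```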

Theorem 3.3. Borel–Cantelli lemma

If Σₙ P(Aₙ) < ∞, then P(lim sup Aₙ) = P(Aₙ i.o.) = 0.

Proof:

P(lim sup Aₙ) = P(∩ₙ₌₁^∞ ∪ₘ₌ₙ^∞ Aₘ)

≤ P(∪ₘ₌ₙ^∞ Aₘ)   for every n

≤ Σₘ₌ₙ^∞ P(Aₘ)   as the events need not be mutually exclusive.

If Σₙ P(Aₙ) < ∞, then the tail sum Σₘ₌ₙ^∞ P(Aₘ) tends to zero as n → ∞. Hence the proof.

Remark 3.1. The other half of the above result is stated as follows, but it needs independent events.

If the Aₙ are independent events of the sample space and Σₙ P(Aₙ) = ∞, then P(lim sup Aₙ) = P(Aₙ i.o.) = 1.

3.2 Lebesgue Measure and Integral

Definition 3.2. Lebesgue measure: A function μ defined on the space (ℝ, B) is called Lebesgue measure if it satisfies the following:

1. μ((a, b]) = b − a.

2. μ(∅) = 0.

3. If the Eᵢ are mutually exclusive intervals of B, then

μ(∪ᵢ₌₁^∞ Eᵢ) = Σᵢ₌₁^∞ μ(Eᵢ)


Example 3.2. Find the Lebesgue measure of the following sets.

(i) $\left[\frac{1}{9}, \frac{1}{6}\right)$ (ii) $\left[\frac{1}{3}, \frac{1}{2}\right)$ (iii) $\left[\frac{2}{3}, \frac{7}{9}\right]$ (iv) $\left\{\frac{1}{n} : n = 1, 2, \ldots\right\}$ (v) $\left\{x : |x - n| < \frac{1}{2^n} \text{ for } n \in \mathbb{N}\right\}$

Solution:

(i) $\mu\left(\left[\frac{1}{9}, \frac{1}{6}\right)\right) = \frac{1}{6} - \frac{1}{9} = \frac{1}{18}$

Similarly, since the measure of an interval is its length, (ii) $\frac{1}{6}$ and (iii) $\frac{1}{9}$.

(iv) $\mu\left(\left\{\frac{1}{n} : n = 1, 2, \ldots\right\}\right) = \sum_{n=1}^{\infty} \mu\left(\left\{\frac{1}{n}\right\}\right) = 0$, since singletons have measure zero.

(v) $|x - n| < \frac{1}{2^n}$ implies $n - \frac{1}{2^n} < x < n + \frac{1}{2^n}$, so $x$ lies in mutually exclusive intervals of length $\frac{2}{2^n} = \frac{1}{2^{n-1}}$. Hence

$$\mu\left(\left\{x : |x - n| < \frac{1}{2^n}\right\}\right) = \sum_{n=1}^{\infty} \frac{1}{2^{n-1}} = 2$$
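The total measure in part (v) is an ordinary geometric series. As a quick numerical sketch (the code and variable names below are ours, not part of the text's measure-theoretic development), the interval lengths $2/2^n$ do indeed total 2:

```python
# Numeric check of Example 3.2 (v): the disjoint intervals (n - 1/2^n, n + 1/2^n)
# have length 2/2^n = 1/2^(n-1); their total Lebesgue measure is a geometric series.
lengths = [2 / 2**n for n in range(1, 60)]  # lengths of the first 59 intervals
total = sum(lengths)
print(total)
```

The truncation at 59 terms is arbitrary; the omitted tail is below floating-point precision.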

Remark 3.2.

- If $\Omega = [0, 1]$ then $\mu$ restricted to $\Omega$ is a probability measure.

- $\mu$ is actually an extended measure; it is a $\sigma$-finite measure.

- Since $\mu(\{x\}) = \lim_n \mu\left(\left(x - \frac{1}{n}, x\right]\right) = \lim_n \frac{1}{n} = 0$ by continuity, $\mu(\{x\}) = 0$.

- From the above, $\mu((a, b]) = \mu([a, b]) = b - a$.

- A set whose $\mu$-measure is zero is called a $\mu$-null set.

Definition 3.3. A function $f$ defined on $\mathbb{R}$ is called a Borel function if the inverse image of every Borel set is a Borel set.

Definition 3.4. Lebesgue integral: The Lebesgue integral is a mapping on non-negative Borel functions $f$ which satisfies the following:

1. $\int f\, d\mu \in [0, \infty]$

2. $\int I_A\, d\mu = \mu(A)$ for any Borel set $A \subseteq \mathbb{R}$


3. $\int (f + g)\, d\mu = \int f\, d\mu + \int g\, d\mu$ and $\int c f\, d\mu = c \int f\, d\mu$, where $c \ge 0$

4. $\lim_n \int f_n\, d\mu = \int f\, d\mu$ if $f_n(x) \uparrow f(x)$ for every $x$

- For any nonnegative piecewise continuous function $f$,

$$\int_{(a, b]} f\, d\mu = \int_a^b f(x)\, dx = F(b) - F(a) \qquad (3.13)$$

where $F$ is an antiderivative of $f$.

- $F$ is a nondecreasing function, with $F(\infty) - F(-\infty)$ finite.

- If we divide $F$ by $F(\infty) - F(-\infty)$ we get a probability measure.

- We will revisit this construction in the next chapters.

3.3 Discrete and absolutely continuous probability measures

Definition 3.5. Density function: A non-negative Borel function $f : \mathbb{R} \to [0, \infty)$ is called a density if

$$\int f\, d\mu = 1 \qquad (3.14)$$

Theorem 3.4. If $f$ is a density, then $P$ satisfying

$$P(A) = \int_A f\, d\mu \qquad (3.15)$$

is a probability measure on the Borel subsets $A$ of $\mathbb{R}$.

proof: Since $f$ is a density on $\mathbb{R}$,

$$P(\mathbb{R}) = \int f\, d\mu = 1 \qquad (3.16)$$

Now consider mutually exclusive Borel sets $A_1, A_2, \ldots$. Let $B_n = \bigcup_{i=1}^{n} A_i$ and $B = \bigcup_{i=1}^{\infty} A_i$. Since $I_{B_n} \uparrow I_B$, by monotone convergence of the Lebesgue integral,

$$P(B_n) = \int I_{B_n} f\, d\mu \to \int I_B f\, d\mu = P(B) \qquad (3.17)$$

Thus $P$ is countably additive, hence it is a probability measure.

Example 3.3. Find the constant $k$ if the following are density functions.

(i) $f(x) = k\, I_{[-2,3]}(x)$

(ii) $f(x) = k e^{-2x}$; $x > 0$

Solution: (i) The density is 0 outside $[-2, 3]$ and equals $k$ on $[-2, 3]$:

$$\int f(x)\, dx = \int_{-2}^{3} k\, dx = 5k = 1 \qquad (3.18)$$

so $k = \frac{1}{5}$.

(ii) The density is 0 outside $(0, \infty)$ and on $(0, \infty)$ it is $f(x) = k e^{-2x}$:

$$\int f(x)\, dx = \int_0^{\infty} k e^{-2x}\, dx = \frac{k}{2} = 1 \qquad (3.19)$$

so $k = 2$.
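As a sanity check on (ii), the integral of $2e^{-2x}$ over $(0, \infty)$ can be approximated numerically. This is only a rough Riemann-sum sketch; the helper name `riemann`, the truncation point and the step count are our own choices, not part of the text:

```python
import math

def riemann(f, a, b, n=200000):
    """Left Riemann sum of f on [a, b] with n subintervals."""
    h = (b - a) / n
    return sum(f(a + i * h) for i in range(n)) * h

# With k = 2 the density 2*exp(-2x) on (0, oo) should integrate to 1;
# we truncate the integral at 20, where the tail exp(-40) is negligible.
total = riemann(lambda x: 2 * math.exp(-2 * x), 0.0, 20.0)
print(total)
```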

Definition 3.6. Absolutely continuous probability measure: A probability measure $P$ on the Borel subsets $A$ of $\mathbb{R}$ is said to be an absolutely continuous probability measure if there exists a density $f$ such that

$$P(A) = \int_A f\, d\mu \qquad (3.20)$$

Definition 3.7. Dirac measure: Let $\Omega$ be an at most countable arbitrary set, and let $\mathcal{F}$ be the family of subsets of $\Omega$. A measure $\delta_\omega$ on $\mathcal{F}$ defined as

$$\delta_\omega(A) = \begin{cases} 1 & \omega \in A \\ 0 & \omega \notin A \end{cases} \qquad (3.21)$$

is called the Dirac measure concentrated at $\omega$.

Definition 3.8. Let $\Omega$ be an at most countable arbitrary set, $\mathcal{F}$ the family of subsets of $\Omega$, and $\delta_{\omega_k}$ Dirac measures on $\mathcal{F}$. A probability measure $P$ defined as

$$P(A) = \sum_{k=1}^{\infty} \alpha_k \delta_{\omega_k}(A) \qquad (3.22)$$

such that $\alpha_k \ge 0$ and $\sum_{k=1}^{\infty} \alpha_k = 1$ is said to be a discrete probability measure.

Remark 3.3. The Dirac measure is a probability measure.

- $\Omega \setminus \{\omega\}$ is the largest set with measure 0, and every subset of it also has measure 0. The smallest set with measure 1 is $\{\omega\}$.


- For $\omega_1, \omega_2 \in \Omega$, $P(A) = \alpha\, \delta_{\omega_1}(A) + (1 - \alpha)\, \delta_{\omega_2}(A)$ is a probability measure, where $0 < \alpha < 1$.

- Sometimes we come across measures which are neither discrete nor absolutely continuous. The following theorem covers such mixed probability measures.

Theorem 3.5. If $P_1$ and $P_2$ are two probability measures, then $P(A) = \alpha P_1(A) + (1 - \alpha) P_2(A)$ is a probability measure, where $0 < \alpha < 1$.

Proof:

1. $P(\Omega) = \alpha P_1(\Omega) + (1 - \alpha) P_2(\Omega) = \alpha + (1 - \alpha) = 1$, as both $P_1$ and $P_2$ are probability measures.

2. $P(A) = \alpha P_1(A) + (1 - \alpha) P_2(A) \ge 0$, as $P_i(A) \ge 0$, $i = 1, 2$.

3. Let $A_n$ be a countable sequence of mutually exclusive events. By countable additivity of $P_1$ and $P_2$,

$$P\left(\bigcup_{n=1}^{\infty} A_n\right) = \alpha P_1\left(\bigcup_{n=1}^{\infty} A_n\right) + (1 - \alpha) P_2\left(\bigcup_{n=1}^{\infty} A_n\right) = \alpha \sum_{n=1}^{\infty} P_1(A_n) + (1 - \alpha) \sum_{n=1}^{\infty} P_2(A_n) = \sum_{n=1}^{\infty} P(A_n)$$

This shows that $P$ is a probability measure.

Remark 3.4. A generalization of the above result can be stated as: if the $P_i$'s are probability measures, then $P(A) = \sum_i \alpha_i P_i(A)$ is a probability measure, where $0 \le \alpha_i \le 1$ and $\sum_i \alpha_i = 1$.

Example 3.4. Find $P(A)$ if the following are density functions, $P$ is the absolutely continuous probability measure w.r.t. them, and $A = (0, 2]$.

(i) $f(x) = \frac{1}{6} I_{[-3,3]}(x)$. (ii) $f(x) = e^{-x}$, $x > 0$.

Solution: (i)

$$P((0, 2]) = \int_0^2 \frac{1}{6}\, dx = \frac{1}{3}$$

(ii)

$$P((0, 2]) = \int_0^2 e^{-x}\, dx = 1 - e^{-2} \approx 0.8647$$
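Both probabilities in Example 3.4 have closed forms, so they are easy to confirm directly. The snippet below is our own check, not part of the text:

```python
import math

# (i) uniform density 1/6 on [-3, 3]: P((0, 2]) is just (length of (0,2]) / 6
p_uniform = (2 - 0) * (1 / 6)

# (ii) exponential density e^{-x} on (0, oo): P((0, 2]) = 1 - e^{-2}
p_exp = 1 - math.exp(-2)

print(p_uniform, p_exp)
```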

## Page 30

30SET THEORY AND LOGIC & ELEMENTRY PROBABILITY THEORY

Example 3.5. Define $P$ by $P(A) = \frac{3}{8}\delta_2(A) + \frac{1}{8}\delta_3(A) + \frac{1}{2}P_3(A)$, where $P_3$ has density $f(x) = \frac{1}{6} I_{[-3,3]}(x)$. Compute $P([2, 3])$.

Solution: $\delta_2([2, 3]) = \delta_3([2, 3]) = 1$, and

$$P_3([2, 3]) = \int_2^3 \frac{1}{6}\, dx = \frac{1}{6}, \quad \text{so} \quad P([2, 3]) = \frac{3}{8} + \frac{1}{8} + \frac{1}{2} \cdot \frac{1}{6} = \frac{1}{2} + \frac{1}{12} = \frac{7}{12}$$
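A mixture of Dirac measures and an absolutely continuous part can be evaluated with exact rational arithmetic. The helper names `dirac` and `p3` below are our own, and the mixture weights follow the example above:

```python
from fractions import Fraction

def dirac(point, a, b):
    """delta_point([a, b]): 1 if the point lies in [a, b], else 0."""
    return Fraction(1 if a <= point <= b else 0)

def p3(a, b):
    """P_3([a, b]) for the uniform density (1/6) * I_[-3,3]."""
    lo, hi = max(a, -3), min(b, 3)
    return Fraction(max(hi - lo, 0), 6)

# P(A) = (3/8) d2(A) + (1/8) d3(A) + (1/2) P3(A), evaluated at A = [2, 3]
P = (Fraction(3, 8) * dirac(2, 2, 3) + Fraction(1, 8) * dirac(3, 2, 3)
     + Fraction(1, 2) * p3(2, 3))
print(P)  # 7/12
```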

3.4 Chapter End Exercises

1. Find the constant $k$ if the following are density functions.

(i) $f(x) = k\, I_{[-5,5]}(x)$

(ii) $f(x) = k x^3$; $0 < x < 1$

2. Find $P(A)$ if the following are density functions and $P$ is the absolutely continuous probability measure w.r.t. them, with $A = (-1, 0.5]$.

(i) $f(x) = \frac{1}{4} I_{[0,4]}(x)$

(ii) $f(x) = 6x(1 - x)$; $0 < x < 1$

3. $\Omega = (-\infty, \infty)$, with a $\sigma$-field $C$ defined on $\Omega$. A function $P$ is defined on the space $(\Omega, C)$ as, for any $I \in C$,

$$P(I) = \begin{cases} 1 & I \text{ is finite} \\ 0 & I \text{ is infinite} \end{cases}$$

Is $P$ a probability function?

4. Find the Lebesgue measure of the following sets: $(-\infty, 0]$, $[1, \infty)$, $(-\infty, \infty)$, $(2, 8.5]$.

5. Show that the Dirac measure is a probability measure.

6. Define $P$ by $P(A) = \frac{1}{4}\delta_1(A) + \frac{1}{4}\delta_2(A) + \frac{1}{2}P_3(A)$, where $P_3$ has density $f(x) = 2e^{-2x}$, $x > 0$. Compute $P([1, 3))$.


7. Define $P$ by $P(A) = \frac{3}{8}\delta_1(A) + \frac{1}{8}P_2(A) + \frac{1}{2}P_3(A)$, where $P_2$ has density $f(x) = \frac{1}{2} I_{[-1,1]}(x)$ and $P_3$ has density $f(x) = 2x$; $0 < x < 1$. Compute $P([0, 1])$.

8. If $P(A_n \triangle A) \to 0$ as $n \to \infty$, then show that $P(A_n) \to P(A)$.

9. Show that $P(A \cap B \cap C) \ge P(A) + P(B) + P(C) - 2$.

10. If $A \cap B$ implies $C$, then show that $P(A) + P(B) - 1 \le P(C)$.


4

CONDITIONAL PROBABILITY AND

INDEPENDENCE

Unit Structure

4.0 Objectives

4.1 Conditional Probability and multiplication theorem

4.2 Independence of the events

4.3 Bayes’ Theorem

4.4 Chapter End Exercises

4.0 Objectives

After going through this chapter, you will learn

- Conditional probability and its role in finding the probability of simultaneous occurrence of events.

- Notion of independence of events and its consequences.

- Total probability theorem.

- Bayes' theorem and its use to find posterior probabilities.

4.1 Conditional Probability and multiplication theorem

Let $B$ be an arbitrary event of $\Omega$, and let $\mathcal{A}$ be the class of events of $\Omega$. Define

$$\mathcal{A}_B = \{A \cap B \mid A \in \mathcal{A}\}$$

We can easily verify that $\mathcal{A}_B$ is a $\sigma$-field, and $(B, \mathcal{A}_B)$ is a measurable space. The measure $P$ on this space is not a probability, as $P(B) \ne 1$ in general. Let $P_B$ be defined as

$$P_B(A \cap B) = \frac{P(A \cap B)}{P(B)} \qquad (4.1)$$


$P_B$ is called the conditional probability measure, or simply the conditional probability, of an event $A$.

Theorem 4.1. Let $B$ be an arbitrary event of $\Omega$ with $P(B) > 0$, $\mathcal{A}$ the class of events of $\Omega$, and $\mathcal{A}_B = \{A \cap B \mid A \in \mathcal{A}\}$. Then

$$P_B(A) = \frac{P(A \cap B)}{P(B)} \qquad (4.2)$$

is a probability measure on $(B, \mathcal{A}_B)$.

proof:

1. $P_B(A) \ge 0$ for all $A \in \mathcal{A}$.

2. $P_B(B) = P(B \cap B)/P(B) = 1$.

3. If $A_1, A_2, \ldots$ are mutually exclusive sets of $\mathcal{A}_B$, then $P_B(\cup_i A_i) = \sum_i P(A_i \cap B)/P(B) = \sum_i P_B(A_i)$.

From the above it is clear that $P_B$ is a probability measure on $(B, \mathcal{A}_B)$.

Remark 4.1. The conditional probability is denoted by $P(A \mid B)$ and called the conditional probability of an event $A$ given that event $B$ has occurred. Thus it is necessary to have $P(B) > 0$.

- $P(A \mid A) = 1$

- $P(\Omega \mid A) = 1$

- $P(A \mid B) \ne P(B \mid A)$ in general

- From the definition of conditional probability, it follows that

$$P(A \cap B) = P(B)\, P(A \mid B)$$

which is also known as the multiplication theorem on probability.

- For three events the multiplication theorem on probability is stated as

$$P(A_1 \cap A_2 \cap A_3) = P(A_1 \mid A_2 \cap A_3)\, P(A_2 \mid A_3)\, P(A_3)$$


The conditional probability is not defined when the probability of the given event is zero. Conditional probability leads to another concept related to events, known as independence.

Example 4.1. Show that

$$P(A \cup B \mid C) = P(A \mid C) + P(B \mid C) - P(A \cap B \mid C)$$

Solution:

$$P(A \cup B \mid C) = \frac{P((A \cup B) \cap C)}{P(C)} \quad \text{... by definition of conditional probability}$$

$$= \frac{P((A \cap C) \cup (B \cap C))}{P(C)} \quad \text{... by the distributive law}$$

$$= \frac{P(A \cap C) + P(B \cap C) - P(A \cap B \cap C)}{P(C)} \quad \text{... by the addition theorem on probability}$$

$$= P(A \mid C) + P(B \mid C) - P(A \cap B \mid C) \quad \text{... by definition of conditional probability}$$

Example 4.2. The probability that it rains today is 0.4, the probability that it will rain tomorrow is 0.5, and the probability that it rains today and will rain tomorrow is 0.3. Given that it has rained today, what is the probability that it will rain tomorrow?

Solution: Let

$P(A) = P(\text{it rains today}) = 0.4$

$P(B) = P(\text{it will rain tomorrow}) = 0.5$

$P(A \cap B) = P(\text{it rains today and will rain tomorrow}) = 0.3$

The required probability is

$$P(B \mid A) = \frac{P(A \cap B)}{P(A)} = \frac{0.3}{0.4} = 0.75$$


Example 4.3. A box contains cards numbered 1 to 25. A card bearing an even number was drawn, but the number was not known. What is the probability that it is a card bearing a number divisible by 5?

Solution: $\Omega = \{1, 2, 3, \ldots, 25\}$

$P(B) = P(\text{even-numbered card}) = \frac{12}{25}$

$P(A) = P(\text{card with number divisible by 5}) = \frac{5}{25}$

$P(A \cap B) = P(\text{an even number divisible by 5}) = \frac{2}{25}$

The required probability is

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)} = \frac{2}{12}$$
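Because the sample space here is small, the conditional probability can be checked by direct enumeration. This is our own illustration of the definition $P(A \mid B) = P(A \cap B)/P(B)$:

```python
from fractions import Fraction

# Enumerate Example 4.3: cards numbered 1..25, all equally likely.
cards = range(1, 26)
B = {c for c in cards if c % 2 == 0}   # even-numbered card drawn
A = {c for c in cards if c % 5 == 0}   # number divisible by 5

p = lambda E: Fraction(len(E), 25)     # probability of an event
cond = p(A & B) / p(B)                 # P(A | B) = P(A ∩ B) / P(B)
print(cond)  # 1/6
```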

4.2 Independence of the events

Two events are said to be independent when the occurrence or nonoccurrence of one does not depend on the occurrence or nonoccurrence of the other. Since occurrence and nonoccurrence are measured in terms of probability, instead of plain independence we speak of stochastic independence, or independence in the probability sense. Let us first define independence of two events; we will later call it pairwise independence.

Definition 4.1. Independence of events: Let $(\Omega, \mathcal{A}, P)$ be a probability space. Events $A$ and $B$ of this space are said to be stochastically independent, or independent in the probability sense, if and only if $P(A \cap B) = P(A)P(B)$.

- The above definition works for any pair of events, even when either $P(A)$ or $P(B)$ is equal to zero.

- The property of independence is symmetric: if $A$ is independent of $B$, then $B$ is independent of $A$.

- If $A$ and $B$ are independent then conditional and unconditional probabilities are the same. That means if $A$ is independent of $B$, then $P(A \mid B) = P(A)$ and $P(B \mid A) = P(B)$.


Theorem 4.2. If events $A$ and $B$ are independent, so are (i) $\bar{A}$ and $B$, (ii) $\bar{B}$ and $A$, (iii) $\bar{A}$ and $\bar{B}$.

proof: (i) Consider

$$P(\bar{A} \cap B) = P(B) - P(A \cap B)$$

Since $A$ and $B$ are independent, $P(A \cap B) = P(A)P(B)$. Thus

$$P(\bar{A} \cap B) = P(B) - P(A)P(B) = P(B)[1 - P(A)] = P(\bar{A})P(B)$$

So $\bar{A}$ and $B$ are independent. Similarly we can prove (ii).

(iii) Consider

$$P(\bar{A} \cap \bar{B}) = P(\overline{A \cup B}) \quad \text{... by De Morgan's law}$$

$$= 1 - P(A \cup B) = 1 - [P(A) + P(B) - P(A)P(B)] \quad \text{... since } A \text{ and } B \text{ are independent}$$

$$= [1 - P(A)][1 - P(B)] = P(\bar{A})P(\bar{B})$$

So $\bar{A}$ and $\bar{B}$ are independent.

Definition 4.2. Events $A_i$, $i = 1, \ldots, n$, are mutually or completely independent if and only if for every sub-collection $A_{i_1}, \ldots, A_{i_k}$,

$$P(A_{i_1} \cap A_{i_2} \cap \cdots \cap A_{i_k}) = \prod_{j=1}^{k} P(A_{i_j})$$

for $k = 2, \ldots, n$.


Remark 4.2. If the above condition holds for $k = 2$, we say that the events are pairwise independent. There are $\binom{n}{2}$ such pairs, and that many conditions have to be checked. For $n$ events to be completely independent, $2^n - n - 1$ conditions have to be checked.

Remark 4.3. If $A, B, C$ are three events:

- They are pairwise independent if

1. $P(A \cap B) = P(A)P(B)$

2. $P(A \cap C) = P(A)P(C)$

3. $P(B \cap C) = P(B)P(C)$

- They are completely independent if, in addition,

4. $P(A \cap B \cap C) = P(A)P(B)P(C)$

4.3 Bayes’ Theorem

It is possible to find the probability of an event if the conditional probabilities of that event given various situations are known. The situations need to be exhaustive and non-overlapping.

Example 4.4. An urn contains 5 white and 7 black balls. 3 balls are drawn in succession. What is the probability that all are white, if balls are drawn (i) with replacement (ii) without replacement?

Solution: Let $A_i$ be the event that the $i$-th drawn ball is white, $i = 1, 2, 3$.

(i) When balls are drawn without replacement, the events are not independent. Using the multiplication theorem, the required probability is $P(A_1 \cap A_2 \cap A_3)$:

$$P(A_1)\, P(A_2 \mid A_1)\, P(A_3 \mid A_1 \cap A_2) = \frac{5}{12} \times \frac{4}{11} \times \frac{3}{10} = \frac{1}{22}$$

(ii) When balls are drawn with replacement, the events are independent:

$$P(A_1 \cap A_2 \cap A_3) = P(A_1)P(A_2)P(A_3) = \left(\frac{5}{12}\right)^3$$


Example 4.5. A problem is given to three students whose chances of solving it are 0.2, 0.3 and 0.5 respectively. If all of them attempt the problem independently, find the probability that (i) none of them solves it; (ii) the problem is solved by exactly two students; (iii) the problem is solved.

Solution: Let $A_i$ be the event that the $i$-th student solves the problem, $i = 1, 2, 3$.

(i) $P(\text{none of them solves it}) = P(\bar{A}_1 \cap \bar{A}_2 \cap \bar{A}_3)$. Since they solve the problem independently, the $A_i$ are independent, and so are the $\bar{A}_i$:

$$P(\bar{A}_1)P(\bar{A}_2)P(\bar{A}_3) = 0.8 \times 0.7 \times 0.5 = 0.28$$

(ii) $P(\text{the problem is solved by exactly two students})$

$$= P(A_1 \cap A_2 \cap \bar{A}_3) + P(A_1 \cap \bar{A}_2 \cap A_3) + P(\bar{A}_1 \cap A_2 \cap A_3)$$

$$= 0.2 \times 0.3 \times 0.5 + 0.2 \times 0.7 \times 0.5 + 0.8 \times 0.3 \times 0.5 = 0.03 + 0.07 + 0.12 = 0.22$$

(iii) $P(\text{the problem is solved}) = 1 - P(\text{none of them solves it}) = 0.72$
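With only three independent students there are just $2^3$ outcomes, so all three probabilities can be found by brute-force enumeration. The helper `event_prob` is our own name for this sketch:

```python
from itertools import product

# Enumerate which of the three independent students solve the problem
# and add up the probability of each of the 8 outcomes.
probs = [0.2, 0.3, 0.5]

def event_prob(pred):
    total = 0.0
    for outcome in product([0, 1], repeat=3):   # 1 = solves, 0 = does not
        w = 1.0
        for p, s in zip(probs, outcome):
            w *= p if s else (1 - p)
        if pred(outcome):
            total += w
    return total

none   = event_prob(lambda o: sum(o) == 0)
two    = event_prob(lambda o: sum(o) == 2)
solved = event_prob(lambda o: sum(o) >= 1)
print(round(none, 2), round(two, 2), round(solved, 2))
```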

Example 4.6. $\Omega = \{(1,1,1), (1,2,2), (2,1,2), (2,2,1)\}$, with all outcomes equally likely. $A$: first number is 1; $B$: second number is 1; $C$: third number is 1. Examine whether $A, B, C$ are completely independent.

Solution:

$$P(A) = P(B) = P(C) = 0.5, \qquad P(A \cap B) = P(B \cap C) = P(A \cap C) = 0.25$$

so $P(A \cap B) = P(A)P(B)$, $P(A \cap C) = P(A)P(C)$ and $P(B \cap C) = P(B)P(C)$, and $A, B, C$ are pairwise independent. But $P(A \cap B \cap C) = 0.25 \ne 0.125 = P(A)P(B)P(C)$, hence they are not completely independent.
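The pairwise and complete independence conditions of Remark 4.3 can be tested mechanically on this sample space (the classical Bernstein example, as stated above). The check below is our own sketch:

```python
from fractions import Fraction

# Four equally likely outcomes; A, B, C fix the first, second, third coordinate to 1.
omega = [(1, 1, 1), (1, 2, 2), (2, 1, 2), (2, 2, 1)]
p = lambda E: Fraction(len(E), len(omega))

A = {w for w in omega if w[0] == 1}
B = {w for w in omega if w[1] == 1}
C = {w for w in omega if w[2] == 1}

pairwise = all(p(X & Y) == p(X) * p(Y)
               for X, Y in [(A, B), (A, C), (B, C)])
complete = p(A & B & C) == p(A) * p(B) * p(C)
print(pairwise, complete)  # pairwise holds, complete independence fails
```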

Theorem 4.3. Theorem of total probability: $A_1, A_2, \ldots, A_n$ form a partition of $\Omega$. Let $B$ be another event, $B \subseteq \Omega$. Then we can find the probability of $B$ by the following relation:

$$P(B) = \sum_{i=1}^{n} P(B \mid A_i)\, P(A_i) \qquad (4.5)$$

proof: $A_1, A_2, \ldots, A_n$ form a partition of the sample space, so for every $i \ne j$, $A_i \cap A_j = \emptyset$, and $\bigcup_{i=1}^{n} A_i = \Omega$. Since $B \subseteq \Omega$,

$$P(B) = P(B \cap \Omega) = P\left(B \cap \bigcup_{i=1}^{n} A_i\right) = P\left(\bigcup_{i=1}^{n} (B \cap A_i)\right) \quad \text{... by the distributive law}$$

$$= \sum_{i=1}^{n} P(B \cap A_i) \quad \text{... by finite additivity of the probability function}$$

$$= \sum_{i=1}^{n} P(B \mid A_i)\, P(A_i) \quad \text{... by the multiplication theorem}$$

Hence $P(B) = \sum_{i=1}^{n} P(B \mid A_i)\, P(A_i)$.

- Though the theorem is proved for a finite partition, it is also true for a countable partition.

- At least two of the $A_i$'s should have nonzero probability.

- If $B$ is an effect and the $A_i$'s are different causes, $P(B)$ summarizes the chance of the effect due to all possible causes.

Example 4.7. Screws are manufactured by two machines A and B. The chances of producing defective screws by machines A and B are 4% and 1% respectively. In a large consignment, A produced 70% and B produced 30% of the screws. What is the probability that a randomly selected screw from this consignment is defective?

Solution: Let A be the event that a screw was manufactured by machine A, B the event that it was manufactured by machine B, and D the event that the screw is defective. Given $P(A) = 0.7$, $P(B) = 0.3$, $P(D \mid A) = 0.04$, $P(D \mid B) = 0.01$. By the theorem of total probability,

$$P(D) = P(A)P(D \mid A) + P(B)P(D \mid B) = 0.031$$

Example 4.8. A ball is selected at random from a box containing 3 white and 7 black balls. If the ball selected is white it is removed and then a second ball is drawn. If the first ball is black it is put back along with 2 additional black balls and then a second ball is drawn. What is the probability that the second ball drawn is white?

Solution: Let $A_W$ be the event that the first ball drawn is white, $A_B$ the event that it is black, and $D$ the event that the second ball drawn is white. Given $P(A_W) = 0.3$, $P(A_B) = 0.7$, $P(D \mid A_W) = \frac{2}{9}$, $P(D \mid A_B) = \frac{3}{12}$. By the total probability theorem,

$$P(D) = P(A_W)P(D \mid A_W) + P(A_B)P(D \mid A_B) = 0.2417$$
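The total-probability computation in Example 4.8 is exact in rational arithmetic; the snippet below is our own check of the weighted sum:

```python
from fractions import Fraction

# P(D) = P(A_W) P(D|A_W) + P(A_B) P(D|A_B)
p_first_white = Fraction(3, 10)
p_first_black = Fraction(7, 10)
p_white_given_white = Fraction(2, 9)    # one white removed: 2 white out of 9
p_white_given_black = Fraction(3, 12)   # 2 blacks added: 3 white out of 12

p_second_white = (p_first_white * p_white_given_white
                  + p_first_black * p_white_given_black)
print(p_second_white, float(p_second_white))
```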

Theorem 4.4. Bayes' theorem: $A_1, A_2, \ldots, A_n$ form a partition of $\Omega$, and $B \subseteq \Omega$ is another event with $P(B) > 0$. Then

$$P(A_j \mid B) = \frac{P(B \mid A_j)\, P(A_j)}{\sum_i P(B \mid A_i)\, P(A_i)} \qquad (4.6)$$

proof:

$$P(A_j \mid B) = \frac{P(A_j \cap B)}{P(B)} = \frac{P(B \mid A_j)\, P(A_j)}{P(B)}$$

by the multiplication theorem, and using $P(B)$ from the total probability theorem the proof of the theorem follows.

- This theorem is useful for posterior analysis of cause and effect.

- $P(A_i)$ are the prior probabilities of the $i$-th cause, whereas $P(A_i \mid B)$ is the posterior probability of the cause $A_i$ given that the effect $B$ is observed.

Example 4.9. Three people $X, Y, Z$ have been nominated for the manager's post. Their chances of being selected are 0.4, 0.35 and 0.25 respectively. If $X$ is selected, the probability that he will introduce a bonus scheme is 0.6; the respective chances for $Y$ and $Z$ are 0.3 and 0.4. If it is known that a bonus scheme has been introduced, what is the probability that $X$ was selected as manager?

Solution: Let $B$ be the event that a bonus scheme is introduced, and let $X, Y, Z$ also denote the events that $X, Y, Z$ are selected. Thus $P(X) = 0.4$, $P(Y) = 0.35$, $P(Z) = 0.25$, $P(B \mid X) = 0.6$, $P(B \mid Y) = 0.3$, $P(B \mid Z) = 0.4$. By Bayes' theorem,

$$P(X \mid B) = \frac{P(B \mid X) P(X)}{P(B \mid X) P(X) + P(B \mid Y) P(Y) + P(B \mid Z) P(Z)} = \frac{0.4 \times 0.6}{0.4 \times 0.6 + 0.35 \times 0.3 + 0.25 \times 0.4} = \frac{0.24}{0.445} \approx 0.5393$$

Example 4.10. 1% of a population suffers from a dreadful disease. A suspected person undergoes a test which makes a correct diagnosis 90% of the time. Find the probability that a person really has the disease, given that the test resulted positive.

Solution: Let $A_1$ be the event that the person really has the disease, $A_2$ the event that the person is healthy, and $D$ the event that the test is positive. Then

$$P(D \mid A_1) = 0.9, \quad P(D \mid A_2) = 0.1, \quad P(A_1) = 0.01, \quad P(A_2) = 0.99$$

$$P(D) = P(A_1) P(D \mid A_1) + P(A_2) P(D \mid A_2) = 0.108$$

The required probability is

$$P(A_1 \mid D) = \frac{P(A_1) P(D \mid A_1)}{P(A_1) P(D \mid A_1) + P(A_2) P(D \mid A_2)} = \frac{0.009}{0.108} \approx 0.0833$$
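The Bayes computation of Example 4.10 is a two-line calculation; the check below is our own restatement of it in code:

```python
# Prior: 1% disease; test correct 90% of the time (so 10% false positives).
p_disease, p_healthy = 0.01, 0.99
p_pos_given_disease, p_pos_given_healthy = 0.9, 0.1

# Total probability of a positive test, then the posterior via Bayes' theorem.
p_pos = p_disease * p_pos_given_disease + p_healthy * p_pos_given_healthy
posterior = p_disease * p_pos_given_disease / p_pos
print(round(p_pos, 3), round(posterior, 4))
```

Note how the low prior keeps the posterior small even though the test is 90% accurate.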

4.4 Chapter End Exercises

1. What is the probability that (i) husband, wife and daughter have the same birthday; (ii) two children have birthdays in March?

2. Four soldiers A, B, C and D fire at a target. Their chances of hitting the target are 0.4, 0.3, 0.75 and 0.6 respectively. They fire simultaneously. What is the chance that (i) the target is not hit; (ii) the target is hit by exactly one of them?

3. If $A$, $B$, $C$ are independent, show that (i) $A$ and $B \cap C$ are independent; (ii) $A$ and $B \cup C$ are independent; (iii) $\bar{A}$ and $\bar{B} \cap \bar{C}$ are independent.


4. $\Omega = \{1, 2, 3, 4\}$, $A = \{1, 2\}$. List all $B$ such that $A$ and $B$ are independent.

5. If $P(A \mid B) < P(A)$ then $P(A \mid \bar{B}) > P(A)$, and vice versa.

6. Show that $P(A \mid B) \ge 1 - \frac{P(\bar{A})}{P(B)}$ for $P(B) \ne 0$; hence prove that $P(A \cap B) \ge P(A) + P(B) - 1$.

7. Examine for pairwise and mutual independence the events $K$, $R$ and $S$, which are respectively getting a king, a red card and a spade in a random draw from a well-shuffled pack of 52 cards.

8. Urn A contains numbers 1 to 10 and urn B contains numbers 6 to 15. An urn is selected at random and from it a number is drawn at random. What is the probability that urn B was selected, if the number drawn is less than 7?

9. In a population of 55% males and 45% females, 4% of the males and 1% of the females are colorblind. Find the probability that a randomly selected person is colorblind.

10. A man is equally likely to drive by one of three routes A, B and C from his home to office. The chances of being late to the office are 0.2, 0.4 and 0.3, provided he has chosen routes A, B, C respectively. If he was late on Friday, what is the probability that he chose route C?


5

RANDOM VARIABLE AND ITS

DISTRIBUTION FUNCTION

Unit Structure

5.0 Objectives

5.1 Random Variable

5.2 Distribution Function

5.3 Discrete random variable and its p.m.f

5.4 Continuous random variable and its p.d.f

5.5 Chapter End Exercises

5.0 Objectives

After going through this chapter you will learn

- A real-valued function defined on the sample space, known as a random variable.

- Discrete and continuous r.v.s.

- Distribution function of a r.v. and its association with the probability measure.

- Properties of the distribution function.

- Probability mass function of a r.v.

- Probability density function of a r.v.

5.1 Random Variable

Definition 5.1. Random variable: Let $(\Omega, C)$ be a measurable space. A real-valued function $X$ defined on this space is called a random variable if every inverse image of a Borel set is an event; that is, for all $B \in \mathcal{B}$ we have

$$X^{-1}(B) = \{\omega \mid X(\omega) \in B\} \in C$$

- Random variable is abbreviated 'r.v.'

- $X$ is a r.v. iff $\{X \le x\} \in C$ for each $x \in \mathbb{R}$.

- $X : \Omega \to \mathbb{R}$. Further, $\{\omega : X(\omega) \in B\}$ is an event, i.e. it belongs to $C$.


- In Chapter 2 we have seen that all intervals (semi-open, semi-closed, singletons) are in $\mathcal{B}$. That is, $B$ may be $(-\infty, a]$ or $[a, \infty)$ etc., so $\{X \le a\}$, $\{a < X \le b\}$, $\{a \le X \le b\}$, $\{X = a\}$ are all inverse images of Borel sets, and hence are events.

Example 5.1. Show that the indicator function is a r.v.

Solution: The indicator function is defined for $A \subseteq \Omega$ as

$$I_A(\omega) = \begin{cases} 1 & \omega \in A \\ 0 & \omega \notin A \end{cases} \qquad (5.1)$$

$I_A(\omega)$ is a r.v. on $(\Omega, C)$ iff $A \in C$.

Example 5.2. Consider a sample space $(\Omega, C)$, where $\Omega = \{HH, HT, TH, TT\}$ and $C$ is a $\sigma$-field. If $X(\omega)$ = number of heads in $\omega$, is $X$ a r.v.?

Solution: If $X(\omega)$ = number of heads in $\omega$, then $X(HH) = 2$, $X(HT) = X(TH) = 1$, $X(TT) = 0$, and

$$X^{-1}((-\infty, x]) = \begin{cases} \emptyset & x < 0 \\ \{TT\} & 0 \le x < 1 \\ \{TT, HT, TH\} & 1 \le x < 2 \\ \Omega & x \ge 2 \end{cases} \qquad (5.2)$$

All the sets $X^{-1}((-\infty, x])$ are events, so $X$ is a r.v.

Example 5.3. If $X$ is a r.v., are the following functions r.v.s? (i) $aX + b$ (ii) $\frac{1}{X}$

Solution: Since $X$ is a r.v., $\{\omega : X \le x\} \in C$ for every $x$.

Case I: $a > 0$. Then

$$\{aX + b \le x\} = \left\{\omega : X \le \frac{x - b}{a}\right\} \in C$$

Case II: $a < 0$. Then

$$\{aX + b \le x\} = \left\{\omega : X \ge \frac{x - b}{a}\right\}$$

which is the complement of $\{\omega : X < \frac{x - b}{a}\} \in C$, so $\{aX + b \le x\} \in C$.

Case III: $a = 0$. Then

$$\{aX + b \le x\} = \{\omega : b \le x\} = \begin{cases} \Omega & b \le x \\ \emptyset & b > x \end{cases} \qquad (5.3)$$

Thus $aX + b$ is a r.v.

(ii) For $x > 0$,

$$\left\{\frac{1}{X} \le x\right\} = \left\{X \ge \frac{1}{x}\right\} \cup \{X < 0\}$$

for $x < 0$,

$$\left\{\frac{1}{X} \le x\right\} = \left\{\frac{1}{x} \le X < 0\right\}$$

and for $x = 0$, $\{\frac{1}{X} \le 0\} = \{X < 0\}$. All these sets belong to $C$, so $\{\frac{1}{X} \le x\}$ is an event; hence $\frac{1}{X}$ is a r.v. (on the set where $X \ne 0$).

Example 5.4. $\Omega = \{1, 2, 3, 4\}$, $C = \{\emptyset, \Omega, \{1\}, \{2, 3, 4\}\}$. Is $X(\omega) = \omega + 1$ a random variable with respect to $(\Omega, C)$?

Solution: The inverse image of $\{3\}$ is $\{\omega : X(\omega) = 3\} = \{2\} \notin C$. So $X(\omega) = \omega + 1$ is not a random variable.

5.2 Distribution Function

Define a probability measure $P_X$ by $P_X(B) = P(X \in B)$; this is a mapping $P_X : \mathcal{B} \to [0, 1]$, where $\mathcal{B}$ is the $\sigma$-field of Borel sets. We also define a point function $F$ associated with the probability space $(\Omega, C, P)$.

Definition 5.2. Distribution function of a random variable: A mapping $F_X : \mathbb{R} \to [0, 1]$ defined by $F_X(x) = P(\{\omega : X(\omega) \le x\})$ is called the distribution function $F_X(x)$ of $X$.

Example 5.5. One of the numbers 2, 3, ..., 12 is chosen at random by throwing a pair of dice and adding the numbers shown on the two faces. You win $9 in case 2, 3, 11 or 12 comes out, lose $10 if the outcome is 7, and otherwise you do not lose or win anything. Find $P(X > 0)$ and $P(X < 0)$.

Solution: $\Omega = \{(a, b) : a, b \in \{1, 2, 3, 4, 5, 6\}\}$; define $X : \Omega \to \mathbb{R}$ as

$$X((a, b)) = \begin{cases} 9 & a + b \in \{2, 3, 11, 12\} \\ -10 & a + b = 7 \\ 0 & a + b \in \{4, 5, 6, 8, 9, 10\} \end{cases} \qquad (5.4)$$

$$P(X > 0) = P(\{(1,1), (1,2), (2,1), (6,5), (5,6), (6,6)\}) = \frac{6}{36} = \frac{1}{6}, \qquad P(X < 0) = P(\{\omega : a + b = 7\}) = \frac{6}{36} = \frac{1}{6}$$
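The 36 equally likely dice outcomes can be enumerated directly to confirm both probabilities; the function name `X` mirrors the random variable defined above:

```python
from fractions import Fraction

# Map each outcome (a, b) of two dice to the winnings X((a, b)).
def X(a, b):
    s = a + b
    if s in (2, 3, 11, 12):
        return 9
    if s == 7:
        return -10
    return 0

omega = [(a, b) for a in range(1, 7) for b in range(1, 7)]
p_win  = Fraction(sum(1 for a, b in omega if X(a, b) > 0), 36)
p_lose = Fraction(sum(1 for a, b in omega if X(a, b) < 0), 36)
print(p_win, p_lose)  # both 1/6
```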

Example 5.6. $X(\omega) = -1$ for $\omega \in A$, $X(\omega) = -2$ for $\omega \in B$, and $X(\omega) = 2$ otherwise, where $A$ and $B$ are disjoint. Find the d.f. of $X$, given $P(A) = \frac{1}{3}$, $P(B) = \frac{1}{2}$.

Solution: Since $X$ is a r.v., each inverse image must be an event:

$$X^{-1}((-\infty, x]) = \begin{cases} \emptyset & x < -2 \\ B & -2 \le x < -1 \\ A \cup B & -1 \le x < 2 \\ \Omega & x \ge 2 \end{cases}$$

If $F_X(x)$ is the d.f. of $X$, then $F_X(x) = P_X((-\infty, x])$:

$$F_X(x) = \begin{cases} 0 & x < -2 \\ P(B) = \frac{1}{2} & -2 \le x < -1 \\ P(A) + P(B) = \frac{5}{6} & -1 \le x < 2 \\ P(\Omega) = 1 & x \ge 2 \end{cases}$$

- We can establish a suitable correspondence between $P_X$ and $F_X$ as $F_X(x) = P_X((-\infty, x])$.

- $P(a < X \le b) = F_X(b) - F_X(a)$.

The distribution function (d.f.) has the following properties.

- $F_X(x)$ is non-negative for all $x$.


proof: We can easily verify this, as $F_X(x) = P(X \le x)$ and $P$ is a probability measure.

- $F_X(x)$ is monotonically non-decreasing.

proof: Let $x_1, x_2$ be such that $x_1 \le x_2$. Then $(-\infty, x_1] \subseteq (-\infty, x_2]$. Using the monotone property of $P_X$, $P_X((-\infty, x_1]) \le P_X((-\infty, x_2])$, i.e. $F_X(x_1) \le F_X(x_2)$.

- $F_X(x)$ is right continuous.

proof: Consider a sequence $x_n \downarrow x$ such that $x_1 > x_2 > \cdots$, and events $B_n = (x, x_n]$. As seen in Chapter 1, the $B_n$ are decreasing events with $B_n \downarrow \cap_n B_n = \emptyset$, and using the continuity property of $P$,

$$0 = P(\emptyset) = \lim_n P(B_n) = \lim_n P_X((x, x_n]) = \lim_n F_X(x_n) - F_X(x)$$

This implies right continuity:

$$\lim_n F_X(x_n) = F_X(x) \qquad (5.5)$$

(i) $\lim_{x \to -\infty} F_X(x) = F_X(-\infty) = 0$.

(ii) $\lim_{x \to \infty} F_X(x) = F_X(\infty) = 1$.

proof: (i) Let $x_1 > x_2 > \cdots$ with $x_n \to -\infty$, and events $B_n = (-\infty, x_n]$. As seen in Chapter 1, the $B_n$ are decreasing events with $B_n \downarrow \emptyset$, and using the continuity property of $P$,

$$0 = P(\emptyset) = P(\lim_n B_n) = \lim_n P(B_n) = \lim_n P_X((-\infty, x_n]) \qquad (5.6)$$

so

$$\lim_{n \to \infty} F_X(x_n) = F_X(-\infty) = 0 \qquad (5.7)$$

(ii) Let $x_1 < x_2 < \cdots$ with $x_n \to \infty$, and events $B_n = (-\infty, x_n]$. As seen in Chapter 1, the $B_n$ are increasing events with $\cup_n B_n = \Omega$, and using the continuity property of $P$,

$$1 = P\left(\bigcup_n B_n\right) = P(\lim_n B_n) = \lim_n P(B_n) \qquad (5.8)$$

that is, $\lim_n P_X((-\infty, x_n]) = \lim_n F_X(x_n) = F_X(\infty)$, so

$$\lim_{n \to \infty} F_X(x_n) = 1 \qquad (5.9)$$

Theorem 5.1. Every d.f. is the d.f. of some r.v.

Remark 5.1. If $X$ is a r.v. on $(\Omega, C, P)$, then $F_X(x) = P(\{\omega : X(\omega) \le x\})$ is its associated d.f. By the above theorem, for every d.f. there is a r.v. on some probability space. Thus given a r.v. there exists a d.f., and conversely.

Example 5.7. Write the d.f.s of the following r.v.s:

(i) $X(\omega) = c$ for all $\omega \in \Omega$, where $c$ is a constant.

(ii) $X$ is the number of heads in tossing two coins.

Solution: (i) $F_X(x) = P(X \le x) = P(\emptyset) = 0$ if $x < c$, and $F_X(x) = P(X \le x) = P(\Omega) = 1$ if $x \ge c$.

(ii) $X$ = number of heads, $\Omega = \{HH, HT, TH, TT\}$; $P(X = 0) = \frac{1}{4}$, $P(X = 2) = \frac{1}{4}$, $P(X = 1) = \frac{2}{4} = \frac{1}{2}$.

$$F_X(x) = P(X \le x) = \begin{cases} 0 & x < 0 \\ \frac{1}{4} & 0 \le x < 1 \\ \frac{3}{4} & 1 \le x < 2 \\ 1 & x \ge 2 \end{cases}$$

5.3 Discrete random variable and its p.m.f

Definition 5.3. Discrete random variable: A random variable $X$ is called discrete if there exists an at most countable set $D$ such that $P(X \in D) = 1$.

- The set $D$ contains countably many points $\{x_i\}$, each carrying non-negative mass. They are called jump points, or points of increase, of the d.f. As seen before in Chapter 1, $\{\omega : X(\omega) = x_i\}$ is an event. We can assign $P(\{\omega : X(\omega) = x_i\})$, denoted $p(x_i)$, such that (i) $p(x_i) \ge 0$ and (ii) $\sum_i p(x_i) = 1$.

- $X$ is a discrete random variable if and only if $P_X$ is a discrete probability measure.

- The distribution function of a discrete r.v. is a step function. $P(X = x) = F_X(x) - F_X(x^-)$ is its jump at $x$, where $F_X(x^-) = \lim_{h \downarrow 0} F_X(x - h)$.

- A random variable has its characteristic probability law. For a discrete r.v. it is called the probability mass function (p.m.f.).

- For $X$ discrete,

(i) $P(a < X \le b) = F_X(b) - F_X(a)$

(ii) $P(a < X < b) = F_X(b) - F_X(a) - P(X = b)$

(iii) $P(a \le X \le b) = F_X(b) - F_X(a) + P(X = a)$

(iv) $P(a \le X < b) = F_X(b) - F_X(a) + P(X = a) - P(X = b)$

Definition 5.4. Probability mass function: A collection $p(x_i)$ representing $P(X = x_i)$ and satisfying (i) $p(x_i) \ge 0$ and (ii) $\sum_i p(x_i) = 1$ is called the probability mass function (p.m.f.) of a discrete random variable $X$.

Example 5.8. Let $X$ be the number of tosses of a coin up to and including the toss showing heads for the first time. (i) Write the p.m.f. of $X$ and hence find $P(X \text{ is even})$. (ii) Also write the d.f. of $X$.

Solution: (i) Let $p$ be the chance of showing heads; $q = 1 - p$ is the chance of showing tails.

$$P(X = x) = P(\text{first } x - 1 \text{ tosses are tails, } x\text{-th toss is heads}) = p q^{x - 1}; \quad x = 1, 2, \ldots$$

$$P(X \text{ is even}) = \sum_{i} P(X = 2i) = p\left[q + q^3 + q^5 + \cdots\right] = pq \sum_{i=1}^{\infty} q^{2(i - 1)}$$


Using the infinite geometric series with common ratio $q^2$,

$$P(X \text{ is even}) = \frac{pq}{1 - q^2} = \frac{q}{1 + q}$$

(ii) The d.f. of $X$ is

$$F_X(x) = P(X \le x) = \sum_{i \le x} p q^{i - 1}$$

$$F_X(x) = \begin{cases} 0 & x < 1 \\ 1 - q^n & n \le x < n + 1;\ n = 1, 2, \ldots \end{cases}$$
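The identity $P(X \text{ even}) = q/(1+q)$ can be checked numerically by summing the p.m.f. over even $x$. Below is our own sketch for a fair coin ($p = 1/2$ is an arbitrary choice for the check):

```python
# Geometric p.m.f. p * q^(x-1); compare the partial sum over even x
# against the closed form q / (1 + q).
p = 0.5
q = 1 - p
p_even = sum(p * q ** (x - 1) for x in range(2, 200, 2))  # x = 2, 4, ..., 198
print(p_even, q / (1 + q))
```

The truncation at $x = 198$ is harmless since the omitted tail is below floating-point precision.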

5.4 Continuous random variable and its p.d.f

Definition 5.5. Continuous random variable: A random variable $X$ defined on $(\Omega, C, P)$ with d.f. $F_X$ is said to be continuous if $F_X$ is absolutely continuous.

$F_X$ is absolutely continuous if there exists a density $f_X : \mathbb{R} \to [0, \infty)$ such that

$$P(X \in (a, b]) = P_X((a, b]) = \int_a^b f_X(x)\, dx \qquad (5.10)$$

for every $a, b$. This function $f_X$ is called the probability density function (p.d.f.) of the continuous r.v. $X$.

Definition 5.6. Probability density function: If $f_X$ is the p.d.f. of a continuous r.v. $X$ with d.f. $F_X$, it satisfies

1. $f_X \ge 0$

2. $\int_{-\infty}^{\infty} f_X(x)\, dx = 1$

3. $P(X \in (a, b]) = \int_a^b f_X(x)\, dx = F_X(b) - F_X(a)$

- For $P_X$ absolutely continuous and $f_X$ continuous at $x$,

$$f_X(x) = \frac{dF_X}{dx} \qquad (5.11)$$

- For a continuous r.v., $F_X$ is continuous from the right as well as the left: $F_X(x) = F_X(x^-) = F_X(x^+)$, where $F_X(x^-) = \lim_{h \downarrow 0} F_X(x - h)$.

- From the above it is clear that $P(X = x) = 0$ for a continuous r.v.

- For $X$ continuous,

$$P(a < X < b) = P(a \le X < b) = P(a < X \le b) = P(a \le X \le b) = F_X(b) - F_X(a)$$

Example 5.9. A coin is tossed. If it shows heads you pay Rs. 2. If it shows tails you spin a wheel which gives you an amount distributed with uniform probability between Rs. 0 and Rs. 10. Your gain or loss is a random variable; find its distribution function and use it to compute the probability that you will win at least 5.

Solution: $P(X = -2) = \frac{1}{2}$, and for $x \in [0, 10]$ the wheel amount has density $f(x) = \frac{1}{10}$, with d.f. $\frac{x}{10}$. So

$$F_X(x) = \begin{cases} 0 & x < -2 \\ \frac{1}{2} & -2 \le x < 0 \\ \frac{1}{2} + \frac{x}{20} & 0 \le x < 10 \\ 1 & x \ge 10 \end{cases}$$

$$P(X \ge 5) = 1 - F_X(5) = 1 - \left(\frac{1}{2} + \frac{5}{20}\right) = \frac{1}{4}$$

Example 5.10. A r.v. $X$ has p.d.f.

$$f(x) = \begin{cases} \frac{k}{x^2} & x \ge 100 \\ 0 & \text{otherwise} \end{cases}$$

Find (i) $k$ (ii) $P(150 < X < 200)$ (iii) $M$ such that $P(X < M) = \frac{1}{2}$.

Solution: (i) $\int_{-\infty}^{\infty} f_X(x)\, dx = 1$, so

$$\int_{100}^{\infty} \frac{k}{x^2}\, dx = k \left[-\frac{1}{x}\right]_{100}^{\infty} = \frac{k}{100} = 1$$

which gives $k = 100$.

(ii)

$$P(150 < X < 200) = \int_{150}^{200} \frac{100}{x^2}\, dx = 100\left(\frac{1}{150} - \frac{1}{200}\right) = \frac{1}{6}$$

(iii) $M$ such that $P(X < M) = \frac{1}{2}$:

$$\int_{100}^{M} \frac{100}{x^2}\, dx = 1 - \frac{100}{M} = \frac{1}{2}$$

gives $M = 200$.
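Since the d.f. here has the closed form $F(x) = 1 - 100/x$ for $x \ge 100$, each requested probability follows directly from it; the code below is our own check:

```python
# d.f. of the density f(x) = 100/x^2 on [100, oo): F(x) = 1 - 100/x.
def F(x):
    return 0.0 if x < 100 else 1.0 - 100.0 / x

p_mid = F(200) - F(150)   # P(150 < X < 200)
M = 200                   # candidate median: F(200) should equal 1/2
print(p_mid, F(M))
```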

5.5 Chapter End Exercises

1. Let $\Omega = \{-2, -1, 0, 1, 2\}$. Find the smallest $\sigma$-field on $\Omega$ such that $X(\omega) = |\omega|$ is a random variable on $\Omega$.

2. Two dice are rolled. Let the r.v. $X$ be the larger of the two numbers shown. Compute $P_X([2, 4])$.

3. $\Omega = [0, 1]$ and $C$ is the $\sigma$-field of Borel sets in $\Omega$. Is (i) $X(\omega) = \omega$ (ii) $X(\omega) = \omega^2$ a random variable on $\Omega$ w.r.t. $C$?

4. A r.v. $X$ has p.d.f.

$$f(x) = \begin{cases} 1 + x & -1 \le x < 0 \\ 1 - x & 0 \le x < 1 \\ 0 & \text{otherwise} \end{cases}$$

Find its d.f., and hence $P(X > 0.5)$.

5. A r.v. $X$ has d.f.

$$F_X(x) = \begin{cases} 0 & x < 0 \\ \frac{x}{2} & 0 \le x < 1 \\ \frac{1}{2} & 1 \le x < 2 \\ \frac{x}{2} - \frac{1}{2} & 2 \le x < 3 \\ 1 & x \ge 3 \end{cases}$$

Find its p.d.f.

6. If $X$ is a r.v., are the following functions r.v.s? (i) $-X$ (ii) $|X|$ (iii) $X^2$


6

SOME SPECIAL RANDOM VARIABLES AND

THEIR DISTRIBUTIONS

Unit Structure

6.0 Objectives

6.1 Bernoulli and Binomial distribution

6.2 Poisson distribution

6.3 Normal Distribution

6.4 Chapter End Exercises

6.0 Objectives

After going through this chapter you will learn

- Bernoulli and Binomial distributions and their properties.

- Poisson distribution and its relation with the Binomial distribution.

- Normal distribution and its applications.

6.1 Bernoulli and Binomial distribution

In this chapter we will come across some typical r.v.s and their distributions. In real-life situations we come across many experiments which result in only two mutually exclusive outcomes. Generally the outcome of interest is called 'Success' and the other 'Failure'. We assign a positive probability 'p' to success and $1 - p$ to failure.

Definition 6.1. Bernoulli r.v.: A r.v. $X$ assuming values 1 and 0 with probabilities 'p' and $1 - p$ is called a Bernoulli r.v.

- Thus a Bernoulli r.v. is the same as the indicator function $I_A$, with 'A' as success.

- The probability law of the Bernoulli r.v. is also written as

$$P(X = x) = p^x (1 - p)^{1 - x}, \quad x = 0, 1$$

- Henceforward we denote $1 - p$ by 'q'. Note that $p + q = 1$.


Example 6.1. An indicator function $I_A$ is a Bernoulli r.v. if we assign probabilities $P(A) = p$ and $P(\bar{A}) = 1 - p$.

When the trial of a Bernoulli experiment is repeated independently a finite number of times, say $n$ times, it gives rise to the binomial situation. If we count the total number of successes in such $n$ trials, it is a Binomial r.v. The probability law for the Binomial r.v. is the general term of the binomial expansion of $(q + p)^n$.

Definition 6.2. Binomial distribution: A r.v. $X$ assuming values $0, 1, 2, \ldots, n$ is said to follow the Binomial distribution if its p.m.f. is given by

$$P(X = x) = \begin{cases} \binom{n}{x} p^x q^{n - x} & x = 0, 1, \ldots, n;\ 0 < p < 1;\ p + q = 1 \\ 0 & \text{otherwise} \end{cases} \qquad (6.1)$$

• The notation X → B(n, p) is used to show that X follows the Binomial distribution with parameters n and p.

• A Bernoulli r.v. is the particular case of the Binomial with n = 1, and the Binomial arises as the sum of n independent Bernoulli r.v.s.

• The Binomial probabilities can be evaluated by the following recurrence relation: starting from P[X = 0] = q^n and then using the recursive formula

P[X = x + 1] = ((n − x)/(x + 1)) (p/q) P[X = x].  (6.2)

This is the forward formula. We can also start from P[X = n] = p^n and use the equation as a backward formula.

• A r.v. counting the number of successes in n Bernoulli trials follows B(n, p), and the r.v. counting the number of failures, Y = n − X say, in n Bernoulli trials follows B(n, q).

• We can easily verify that P[X ∈ {0, 1, …, n}] = 1; hence the Binomial is a discrete r.v.
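The forward recurrence above is easy to check numerically. The following is a minimal sketch (function names are ours, not from the text) that builds the whole B(n, p) p.m.f. from P[X = 0] = q^n and compares it with the direct formula C(n, x) p^x q^(n−x):

```python
import math

def binomial_pmf_recursive(n, p):
    """Binomial probabilities via the forward recurrence (6.2):
    P[X=0] = q^n,  P[X=x+1] = ((n-x)/(x+1)) * (p/q) * P[X=x]."""
    q = 1.0 - p
    probs = [q ** n]
    for x in range(n):
        probs.append(probs[-1] * (n - x) / (x + 1) * (p / q))
    return probs

# Cross-check against the direct formula C(n,x) p^x q^(n-x).
n, p = 10, 0.3
rec = binomial_pmf_recursive(n, p)
direct = [math.comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]
assert all(abs(a - b) < 1e-12 for a, b in zip(rec, direct))
assert abs(sum(rec) - 1.0) < 1e-12   # the p.m.f. sums to 1
```

The recurrence avoids recomputing factorials at every step, which is why tables of Binomial probabilities were traditionally built this way.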

Example 6.2. If X → B(n, p) and Y = n − X → B(n, q), show that

P_X[X = r] = P_Y[Y = n − r] = C(n, r) p^r q^(n−r).

Solution: P_X[X = r] = C(n, r) p^r q^(n−r), and P_Y[Y = n − r] = C(n, n − r) q^(n−r) p^r. But C(n, n − r) = C(n, r), so

P_X[X = r] = P_Y[Y = n − r] = C(n, r) p^r q^(n−r).

A head is thrice as likely as a tail for a coin. It is flipped 4 times. (i) Write the p.m.f. of X, representing the number of heads observed in this experiment. (ii) Find the probability of getting 3 or 4 heads.

Solution: P(H) = 3 P(T), so p = P(H) = 3/4; tossing the coin once is a Bernoulli trial. X, counting the number of heads observed in tossing this biased coin 4 times, follows B(n = 4, p = 3/4):

P[X = x] = C(4, x) (3/4)^x (1/4)^(4−x),  x = 0, 1, 2, 3, 4;
P[X = x] = 0 otherwise.  (6.3)

Required prob. = P[X ≥ 3] = 1 − P[X ≤ 2].

P[X ≤ 2] = (1/4)^4 + 4 (3/4)(1/4)^3 + 6 (3/4)^2 (1/4)^2 = 67/256 ≈ 0.2617.

So P[X ≥ 3] = 189/256 ≈ 0.7383.

Example 6.3. It is found that 60% of the health-care victims are senior citizens. If a random sample of 10 victims is taken, what is the probability of getting exactly 3 senior citizens in this sample?

Solution: X is the number of victims who are senior citizens in this sample. X follows B(n = 10, p = 0.6):

P[X = 3] = C(10, 3) (0.6)^3 (0.4)^7 ≈ 0.04247.

Example 6.4. X follows B(n = 6, p) such that P[X = 2] = P[X = 4]. Find p.

Solution: P[X = 2] = P[X = 4] gives

C(6, 2) p^2 q^4 = C(6, 4) p^4 q^2,

and since C(6, 2) = C(6, 4), q^2 = p^2, which means p = q = 1/2.

Example 6.5. X follows B(n, p), Y follows B(m, p), and X, Y are independent. Find the probability distribution of X + Y.

Solution: Consider P[X + Y = k], for k = 0, 1, …, m + n:

P[X + Y = k] = P[∪_x {X = x, Y = k − x}]
= Σ_{x=0}^{n} P[X = x, Y = k − x]  (6.4)
= Σ_{x=0}^{n} P[X = x] P[Y = k − x]  (since the r.v.s are independent)
= Σ_{x=0}^{n} C(n, x) p^x q^(n−x) C(m, k − x) p^(k−x) q^(m−k+x)  (6.5)
= [Σ_{x=0}^{n} C(n, x) C(m, k − x)] p^k q^(n+m−k)  (6.6)
= C(n + m, k) p^k q^(n+m−k),  (6.7)

the last step by Vandermonde's identity. Thus X + Y follows B(n + m, p).
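The convolution step in Example 6.5 can be verified directly: convolving the p.m.f.s of two independent Binomials with the same p reproduces the B(n + m, p) p.m.f. A small sketch (helper names are ours):

```python
import math

def binom_pmf(n, p):
    """Full B(n, p) p.m.f. as a list indexed by x."""
    return [math.comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

def convolve(px, py):
    """p.m.f. of X+Y for independent X, Y: P[X+Y=k] = sum_x P[X=x] P[Y=k-x]."""
    out = [0.0] * (len(px) + len(py) - 1)
    for i, a in enumerate(px):
        for j, b in enumerate(py):
            out[i + j] += a * b
    return out

n, m, p = 4, 6, 0.35
lhs = convolve(binom_pmf(n, p), binom_pmf(m, p))
rhs = binom_pmf(n + m, p)          # the claimed B(n+m, p) distribution
assert all(abs(a - b) < 1e-12 for a, b in zip(lhs, rhs))
```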

6.2 Poisson distribution

Now let us introduce another commonly used discrete r.v. Many a time we come across a r.v. counting the number of occurrences in a fixed duration of time. For example: the number of deaths due to malaria per month in Mumbai, or the number of accidents per hour on an express highway. The number of defects in cloth per square metre is a similar occasion where the Poisson distribution is appropriate.

Definition 6.3. Poisson distribution: A discrete r.v. X assuming values 0, 1, 2, … is said to follow the Poisson distribution if its p.m.f. is given by

P[X = x] = e^(−λ) λ^x / x!,  x = 0, 1, 2, …; λ > 0;
P[X = x] = 0 otherwise.  (6.8)


• The notation X → P(λ) is used to show that X follows the Poisson distribution with parameter λ.

• The Poisson probabilities can be evaluated by the following recurrence relation: starting from P[X = 0] = e^(−λ) and then using the recursive formula

P[X = x + 1] = (λ/(x + 1)) P[X = x],  (6.9)

all probabilities can be evaluated. Tables are also available for various values of λ.

• We can easily verify that P[X ∈ {0, 1, …}] = 1; hence the Poisson is a discrete r.v.

• In the Binomial situation, if the number of trials is very large and the chance of success is very small, but the average number np is fixed, say λ, then the Binomial probabilities tend to the Poisson probabilities as n becomes very large.

Theorem 6.1. Poisson as a limiting distribution of the Binomial: Let X → B(n, p); if p is very small and n becomes very large, but np remains constant, = λ say, then the Binomial probabilities tend to the Poisson probabilities:

lim_{n→∞} C(n, x) p^x q^(n−x) = e^(−λ) λ^x / x!,  x = 0, 1, …  (6.10)

Proof: X → B(n, p), so P[X = x] = C(n, x) p^x q^(n−x). By putting p = λ/n,

C(n, x) (λ/n)^x (1 − λ/n)^(n−x) = [n! / (x!(n − x)!)] (λ^x / n^x) (1 − λ/n)^(n−x).

Now

lim_{n→∞} n(n − 1)(n − 2)⋯(n − x + 1)/n^x = 1,
lim_{n→∞} (1 − λ/n)^n = e^(−λ),  lim_{n→∞} (1 − λ/n)^(−x) = 1.

Hence

lim_{n→∞} C(n, x) p^x q^(n−x) = e^(−λ) λ^x / x!,  x = 0, 1, …  (6.11)

The limiting distribution of the Binomial is the Poisson.
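Theorem 6.1 can be illustrated numerically: holding λ = np fixed and letting n grow, the worst-case gap between the B(n, λ/n) and P(λ) probabilities shrinks. A quick sketch (function names are ours):

```python
import math

def binom_pmf(n, p, x):
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

def poisson_pmf(lam, x):
    return math.exp(-lam) * lam**x / math.factorial(x)

lam = 4.0
# Maximum pointwise gap between B(n, lam/n) and P(lam) over x = 0..11.
errs = [max(abs(binom_pmf(n, lam / n, x) - poisson_pmf(lam, x))
            for x in range(12))
        for n in (10, 100, 1000)]
assert errs[0] > errs[1] > errs[2]   # the gap shrinks as n grows
assert errs[-1] < 1e-3               # already small at n = 1000
```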

Example 6.6. A sales firm receives on average three toll-free calls per hour. For any given hour, find the probability that the firm receives (i) at most three calls; (ii) at least three calls.

Solution: X = no. of toll-free calls the firm receives; X → P(λ = 3).

(i) Required prob. = P[X ≤ 3] = Σ_{x=0}^{3} e^(−λ) λ^x / x! = 0.6472.

(ii) Required prob. = P[X ≥ 3] = Σ_{x=3}^{∞} e^(−λ) λ^x / x! = 1 − P[X ≤ 2] = 1 − 0.4232 = 0.5768.
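The two tail probabilities of Example 6.6 are easy to recompute by summing the Poisson p.m.f.; a minimal sketch (the helper name is ours):

```python
import math

def poisson_cdf(lam, k):
    """P[X <= k] for X ~ P(lam), summing e^-lam lam^x / x!."""
    return sum(math.exp(-lam) * lam**x / math.factorial(x)
               for x in range(k + 1))

lam = 3.0
at_most_3 = poisson_cdf(lam, 3)          # part (i)
at_least_3 = 1.0 - poisson_cdf(lam, 2)   # part (ii): complement of P[X <= 2]
assert abs(at_most_3 - 0.6472) < 5e-4
assert abs(at_least_3 - 0.5768) < 5e-4
```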

Example 6.7. A r.v. X → P(λ) is such that P[X = 4] = P[X = 5]. Find λ.

Solution: As P[X = 4] = P[X = 5],

e^(−λ) λ^4 / 4! = e^(−λ) λ^5 / 5!,

so λ = 5.

Example 6.8. A safety device in a laboratory is set to activate an alarm if it registers 5 or more radioactive particles within one second. If the background radiation is such that the number of particles reaching the device has the Poisson distribution with parameter λ = 0.5, how likely is it that the alarm will be activated within a given one-second period?

Solution: Let X be the number of particles reaching the safety device within a one-second period; X → P(λ = 0.5). The alarm will be activated if X ≥ 5:

P[X ≥ 5] = Σ_{x=5}^{∞} e^(−λ) λ^x / x! ≈ 0.00017.

Example 6.9. X1 → P(λ1) and, independently, a r.v. X2 → P(λ2). Show that X1 + X2 → P(λ1 + λ2).

Solution: Consider P[X1 + X2 = k], for k = 0, 1, …:

P[X1 + X2 = k] = P[∪_x {X1 = x, X2 = k − x}] = Σ_{x=0}^{k} P[X1 = x, X2 = k − x]
= Σ_{x=0}^{k} P[X1 = x] P[X2 = k − x]  (since the r.v.s are independent)
= Σ_{x=0}^{k} [e^(−λ1) λ1^x / x!] [e^(−λ2) λ2^(k−x) / (k − x)!]
= [e^(−(λ1+λ2)) / k!] Σ_{x=0}^{k} C(k, x) λ1^x λ2^(k−x)
= e^(−(λ1+λ2)) (λ1 + λ2)^k / k!.

Thus X1 + X2 → P(λ1 + λ2).
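The Poisson additivity shown in Example 6.9 can also be checked term by term: the convolution of P(λ1) and P(λ2) p.m.f.s equals the P(λ1 + λ2) p.m.f. A short sketch (helper name is ours):

```python
import math

def poisson_pmf(lam, x):
    return math.exp(-lam) * lam**x / math.factorial(x)

lam1, lam2 = 1.5, 2.5
for k in range(10):
    # Convolution: sum over x of P[X1 = x] P[X2 = k - x].
    conv = sum(poisson_pmf(lam1, x) * poisson_pmf(lam2, k - x)
               for x in range(k + 1))
    assert abs(conv - poisson_pmf(lam1 + lam2, k)) < 1e-12
```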


Example 6.10. 2% of students are left-handed. In a class of 200 students, find the probability that exactly 5 are left-handed.

Solution: X is the no. of left-handed students among 200; p = 0.02 and n = 200, thus X has the B(n = 200, p = 0.02) distribution. Using the Poisson as a limiting distribution of the Binomial, X → P(λ = np = 4):

P[X = 5] ≈ e^(−4) 4^5 / 5! = 0.1563.

6.3 Normal Distribution

Definition 6.4. Normal Distribution: A continuous r.v. is said to follow the Normal distribution if its p.d.f. is given by

f(x) = [1/(σ√(2π))] e^(−(1/2)((x−μ)/σ)²),  −∞ < x < ∞; −∞ < μ < ∞, σ > 0.  (6.12)

• The notation X → N(μ, σ²) is used to show that X follows the Normal distribution with parameters μ and σ².

• The Normal distribution is applicable to a wide range of situations in real life.

• μ = mean of X and σ² = variance of X.

• When μ = 0 and σ² = 1 it is called the standard normal distribution. Tables for P[X ≤ x] are available for this distribution.

• Since for any X → N(μ, σ²) a linear combination Y = aX + b also has a Normal distribution, with parameters aμ + b and a²σ².

• If X → N(μ, σ²) then Z = (X − μ)/σ has the standard normal distribution.

• We denote by Φ(z) = P[Z ≤ z] the d.f. of N(0, 1).

• If X1 → N(μ1, σ1²) and X2 → N(μ2, σ2²), and X1, X2 are independent r.v.s, then X1 − X2 → N(μ1 − μ2, σ1² + σ2²).


Example 6.11. The scores of a test are normally distributed with N(μ = 100, σ = 15). Find the probability that a score is below 112.

Solution: The score is denoted by X; X → N(μ = 100, σ = 15).

P[X < 112] = P[Z < (112 − 100)/15] = Φ(0.8) = 0.7881,

by the normal tables, since Z = (X − μ)/σ has the standard normal distribution.

Example 6.12. X1 → N(4, 1.5²) and X2 → N(2, 2²); if X1, X2 are independent r.v.s, find P[X1 − X2 ≥ 1].

Solution: As X1, X2 are independent r.v.s, X1 − X2 → N(2, 2.25 + 4) = N(2, 6.25), i.e. s.d. = 2.5. So

P[X1 − X2 ≥ 1] = P[Z ≥ (1 − 2)/2.5] = 1 − Φ(−0.4) = Φ(0.4) = 0.6554.
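Normal-table lookups like those in Examples 6.11 and 6.12 can be reproduced with the error function, since Φ(z) = (1 + erf(z/√2))/2. A minimal sketch:

```python
import math

def Phi(z):
    """Standard normal d.f. via the error function: Phi(z) = (1+erf(z/sqrt 2))/2."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Example 6.11: X ~ N(100, 15^2), P[X < 112] = Phi(0.8).
assert abs(Phi((112 - 100) / 15) - 0.7881) < 5e-4

# Example 6.12: X1 - X2 ~ N(2, 6.25), P[X1 - X2 >= 1] = Phi(0.4).
assert abs((1 - Phi((1 - 2) / 2.5)) - 0.6554) < 5e-4
```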

6.4 Chapter End Exercises

1. In order to qualify for the police academy, candidates must have a score in the top 10% in the general ability test. If the test scores follow X → N(μ = 200, σ = 20), find the lowest possible score to qualify.

2. X1 → N(10, 2.5²) and X2 → N(12, 2²); if X1, X2 are independent r.v.s, find the mean and variance of X1 − X2.

3. On average 0.2% of the screws are defective. Find the probability that in a random sample of 200 such screws we get exactly 3 defective screws.

4. X → P(λ). Find λ if P[X = 4]/P[X = 3] = 3/8.

5. X follows B(n = 5, p). Find p if P[X = 4]/P[X = 3] = 3/8.

6. A video tape has on average one defect every 1000 feet. What is the probability of at least one defect in 3000 feet?


7. 3% of all cars fail the emission inspection. Find the probability that in a sample of 90 cars, three will fail. Use (i) the Binomial distribution, (ii) the Poisson approximation to the Binomial.

8. If a student randomly guesses at five multiple-choice questions, find the probability of getting three or more correct answers. There are four possible options for each question.

9. X → N(μ = 10, σ = 3). Find the probability that (i) X is less than 13 but greater than 7; (ii) Y ≤ 26, where Y = 2X + 3; (iii) X² ≤ 100; (iv) X > 8.

10. X_i follows B(n_i = 5, p = 1/3), where i = 1, 2. (i) Write the p.m.f. of X1 + X2. (ii) Find P[X1 + X2 ≤ 3].


7

TWO-DIMENSIONAL R.V.S

Unit Structure

7.0 Objectives

7.1 Probability Distributions of two-dimensional discrete r.v.s

7.2 Probability Distributions of two dimensional continuous r.v. s

7.3 Conditional Probability distributions

7.4 Independence of r.v.s

7.5 Chapter End Exercises

7.0 Objectives

After going through this chapter, you will learn

• Two-dimensional discrete r.v.s and their joint probability mass function.

• Two-dimensional continuous r.v.s and their joint probability density function.

• Finding marginal probability laws from the joint probability function of two-dimensional r.v.s.

• The conditional distributions of the r.v.s.

• The notion of independence of r.v.s and its consequences.

7.1 Probability Distributions of two-dimensional discrete r.v.s

The notion of a r.v. can be extended to the multivariate case. In particular, if X and Y are two r.v.s defined on the same probability space (Ω, C, P), then {(X, Y) ∈ B} ∈ C for any Borel set B in ℝ². Note that this Borel σ-field is generated by rectangles (a, b] × (c, d]. The mapping (X, Y): Ω → ℝ² is a two-dimensional r.v.

Definition 7.1. Joint Probability Distribution of a two-dimensional r.v.: The probability measure P_{X,Y} defined on ℝ² is called the Joint Probability Distribution of the two-dimensional r.v. (X, Y), where

P_{X,Y}(B) = P[(X, Y) ∈ B] for every Borel set B ⊆ ℝ².  (7.1)

Definition 7.2. Two-dimensional discrete r.v.: The two-dimensional random variable (X, Y) is called discrete if there exists an at most countable set D such that P_{X,Y}(D) = 1.

Two r.v.s are jointly discrete if and only if each of them is discrete.

• The joint probability law P_{X,Y} of a two-dimensional discrete r.v. satisfies the following:

1. P_{X,Y}(x, y) ≥ 0 for all x, y;
2. Σ_x Σ_y P_{X,Y}(x, y) = 1.

• The joint probability law P_{X,Y} of a two-dimensional discrete r.v. is also called the joint probability mass function of (X, Y).

• P_X(x) = Σ_y P_{X,Y}(x, y) is called the marginal p.m.f. of X.

• P_Y(y) = Σ_x P_{X,Y}(x, y) is called the marginal p.m.f. of Y.

• Marginal p.m.f.s are proper p.m.f.s of one-dimensional discrete r.v.s.

Example 7.1. Verify which of the following are joint probability mass functions; if so, find the constant K.

(i) P_{X,Y}(x, y) = K(x + y) for x = 1, 2, 3; y = 1, 2.
(ii) P_{X,Y}(x, y) = K(x + y) for x = −1, 0, 1; y = −1, 0, 1.

Solution: (i) P_{X,Y}(x, y) ≥ 0 for all x, y if K > 0, and

Σ_x Σ_y P_{X,Y}(x, y) = P(1,1) + P(1,2) + P(2,1) + P(2,2) + P(3,1) + P(3,2) = 21K = 1.

So, for P_{X,Y}(x, y) to be a proper joint p.m.f., K = 1/21.

(ii) P(−1, 1) = 0, and we cannot have positive probability for the remaining pairs if K is selected negative, while for K > 0 the pair (−1, −1) gets probability −2K < 0. This means that for no K is P_{X,Y}(x, y) ≥ 0 throughout; P_{X,Y}(x, y) is not a proper joint p.m.f.

Example 7.2. Two cards are drawn from a pack of cards. Let X denote the no. of heart cards and Y the no. of red cards. Find the joint p.m.f. of the r.v. (X, Y). Hence find P[X = Y].

Solution: x, y = 0, 1, 2. The following will be the joint p.m.f. of (X, Y). For instance,

P[X = 1, Y = 2] = P[1 heart, 1 other red card] = (13 × 13)/C(52, 2) = 13/102,

and so on.

Y↓ X→  | 0      | 1      | 2     | P[Y = y]
0      | 25/102 | 0      | 0     | 25/102
1      | 26/102 | 26/102 | 0     | 52/102
2      | 6/102  | 13/102 | 6/102 | 25/102
P[X = x] | 57/102 | 39/102 | 6/102 | 1

And

P[X = Y] = P(0,0) + P(1,1) + P(2,2) = (25 + 26 + 6)/102 = 19/34.

Example 7.3. Using the above joint p.m.f. of (X, Y), find the marginal p.m.f.s of the r.v.s X, Y.

Solution: P_X(x) = Σ_y P_{X,Y}(x, y) is the marginal p.m.f. of X, and P_Y(y) = Σ_x P_{X,Y}(x, y) is the marginal p.m.f. of Y:

X      | 0     | 1     | 2
P_X(x) | 19/34 | 13/34 | 1/17

Y      | 0      | 1      | 2
P_Y(y) | 25/102 | 52/102 | 25/102
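The card-drawing table above can be rebuilt by brute-force enumeration of all C(52, 2) = 1326 hands, with exact fractions. A sketch (the suit encoding is ours), which also recovers P[X = Y] = 19/34:

```python
from fractions import Fraction
from itertools import combinations

# Suits: H = hearts (red), D = diamonds (red), C, S (black); 13 ranks each.
deck = [(s, r) for s in "HDCS" for r in range(13)]
joint = {}
for hand in combinations(deck, 2):
    x = sum(1 for s, _ in hand if s == "H")        # number of hearts
    y = sum(1 for s, _ in hand if s in "HD")       # number of red cards
    joint[(x, y)] = joint.get((x, y), Fraction(0)) + Fraction(1, 1326)

assert joint[(1, 2)] == Fraction(13, 102)
pX = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1, 2)}
assert pX[0] == Fraction(19, 34) and pX[2] == Fraction(1, 17)
assert sum(joint.get((k, k), 0) for k in (0, 1, 2)) == Fraction(19, 34)
```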

7.2 Probability Distributions of two dimensional continuous r.v.s

Definition 7.3. Two-dimensional continuous r.v. and its joint probability density function: The two-dimensional random variable (X, Y) is called continuous if there exists a function f_{X,Y}: ℝ² → [0, ∞) satisfying

1. f_{X,Y}(x, y) ≥ 0 for all x, y;
2. ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y}(x, y) dx dy = 1;
3. P[a ≤ X ≤ b, c ≤ Y ≤ d] = ∫_a^b ∫_c^d f_{X,Y}(x, y) dy dx, where a, b, c and d are real.

• g_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy is called the marginal p.d.f. of X.

• h_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx is called the marginal p.d.f. of Y.

• Marginal p.d.f.s are proper p.d.f.s of one-dimensional continuous r.v.s.

Example 7.4. Verify which of the following are joint probability density functions of (X, Y); if so, find the constant K.

(i) f_{X,Y}(x, y) = K e^(−x−y) for x ≥ 0, y ≥ 0.
(ii) f_{X,Y}(x, y) = K x y for 0 < x < y < 1.

Solution: (i) f_{X,Y}(x, y) ≥ 0 for all x, y if K > 0, and

∫∫ f_{X,Y}(x, y) dx dy = K ∫_0^∞ ∫_0^∞ e^(−x−y) dx dy = K ∫_0^∞ e^(−y) (−e^(−x))|_0^∞ dy = K.

So, for f_{X,Y}(x, y) to be a proper joint p.d.f., K = 1.

(ii) f_{X,Y}(x, y) ≥ 0 for all x, y if K > 0, and

∫∫ f_{X,Y}(x, y) dx dy = ∫_0^1 ∫_0^y K x y dx dy = K ∫_0^1 y (x²/2)|_0^y dy = K (y⁴/8)|_0^1 = K/8.

So, for f_{X,Y}(x, y) to be a proper joint p.d.f., K = 8.

Example 7.5. For the above two joint p.d.f.s find (I) P[X < 0.5, Y < 0.5]; (II) the marginal p.d.f.s of X and Y.

Solution: (i) f_{X,Y}(x, y) = e^(−x−y) for x ≥ 0, y ≥ 0.

(I) P[X < 0.5, Y < 0.5] = ∫_0^0.5 ∫_0^0.5 e^(−x−y) dx dy = (1 − e^(−0.5))².

(II) g_X(x) = ∫ f_{X,Y}(x, y) dy is the marginal p.d.f. of X:

g_X(x) = ∫_0^∞ e^(−x−y) dy = e^(−x) for x ≥ 0.

h_Y(y) = ∫ f_{X,Y}(x, y) dx is the marginal p.d.f. of Y:

h_Y(y) = ∫_0^∞ e^(−x−y) dx = e^(−y) for y ≥ 0.

(ii) f_{X,Y}(x, y) = 8xy for 0 < x < y < 1.

(I) P[X < 0.5, Y < 0.5] = ∫_0^0.5 ∫_0^y 8xy dx dy = ∫_0^0.5 8y (x²/2)|_0^y dy = ∫_0^0.5 4y³ dy = (0.5)⁴ = 0.0625.

(II) Marginal p.d.f. of X: g_X(x) = ∫_x^1 8xy dy = 4x(1 − x²) for 0 < x < 1.

Marginal p.d.f. of Y: h_Y(y) = ∫_0^y 8xy dx = 4y³ for 0 < y < 1.
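The integrals in part (ii) can be sanity-checked by a crude numerical double integral over the unit square; a sketch under that assumption (helper names are ours, tolerances are loose because of the discontinuity along x = y):

```python
def double_int(f, n=400):
    """Midpoint Riemann sum of f over the unit square [0,1] x [0,1]."""
    h = 1.0 / n
    return sum(f((i + 0.5) * h, (j + 0.5) * h)
               for i in range(n) for j in range(n)) * h * h

f = lambda x, y: 8 * x * y if x < y else 0.0       # joint p.d.f. of part (ii)
assert abs(double_int(f) - 1.0) < 1e-2             # K = 8 normalizes it
quad = lambda x, y: f(x, y) if (x < 0.5 and y < 0.5) else 0.0
assert abs(double_int(quad) - 0.0625) < 1e-3       # P[X < 0.5, Y < 0.5]
```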

Example 7.6. The joint probability density function of (X, Y) is

f(x, y) = (1 + xy)/4 for −1 < x < 1, −1 < y < 1; 0 otherwise.

Find P[X² ≤ u, Y² ≤ v].

Solution: P[X² ≤ u, Y² ≤ v] = P[−√u ≤ X ≤ √u, −√v ≤ Y ≤ √v]

= ∫_{−√u}^{√u} ∫_{−√v}^{√v} (1 + xy)/4 dy dx = √(uv),

since the xy term integrates to 0 by symmetry.

7.3 Conditional Probability distributions

Definition 7.4. Conditional probability mass function: Let the joint probability law of a two-dimensional discrete r.v. (X, Y) be P_{X,Y}, and the marginal p.m.f.s of X and Y be P_X(x), P_Y(y) respectively. Then the conditional p.m.f. of X given Y = y is given by

P_{X|Y=y}(x) = P_{X,Y}(x, y) / P_Y(y) for all x, provided P_Y(y) ≠ 0.  (7.2)

And the conditional p.m.f. of Y given X = x is given by

P_{Y|X=x}(y) = P_{X,Y}(x, y) / P_X(x) for all y, provided P_X(x) ≠ 0.  (7.3)

Note that conditional p.m.f.s are proper p.m.f.s.

Definition 7.5. Conditional probability density function: Let the joint probability law of a two-dimensional continuous r.v. (X, Y) be f_{X,Y}(x, y), and the marginal p.d.f.s of X and Y be g_X(x), h_Y(y) respectively. Then the conditional p.d.f. of X given Y = y is given by

g_{X|Y=y}(x) = f_{X,Y}(x, y) / h_Y(y) for all x, provided h_Y(y) ≠ 0.  (7.4)

And the conditional p.d.f. of Y given X = x is given by

h_{Y|X=x}(y) = f_{X,Y}(x, y) / g_X(x) for all y, provided g_X(x) ≠ 0.  (7.5)

Note that conditional p.d.f.s are proper p.d.f.s.

Example 7.7. There are 4 tickets in a bowl; two are numbered 1 and the other two are numbered 2. Two tickets are chosen at random from the bowl. X denotes the smaller of the numbers on the tickets drawn, and Y denotes the larger of the numbers on the tickets drawn.

(i) Find the joint p.m.f. of the r.v. (X, Y).
(ii) Find the conditional p.m.f. of Y given X = 2.
(iii) Find the conditional p.m.f. of X given Y = 2.

Solution: Ω = {(1,1), (1,2), (2,1), (2,2)}.

Joint p.m.f. of the r.v. (X, Y):

Y↓ X→  | 1   | 2   | P_Y(y)
1      | 1/4 | 0   | 1/4
2      | 1/2 | 1/4 | 3/4
P_X(x) | 3/4 | 1/4 | 1

The conditional p.m.f. of Y given X = 2 is given by

P_{Y|X=2}(y) = P_{X,Y}(2, y) / P_X(2) = P_{X,Y}(2, y)/(1/4) for y = 1, 2:  (7.6)

y            | 1 | 2
P_{Y|X=2}(y) | 0 | 1

The conditional p.m.f. of X given Y = 2 is given by

P_{X|Y=2}(x) = P_{X,Y}(x, 2) / P_Y(2) = P_{X,Y}(x, 2)/(3/4) for x = 1, 2:  (7.7)

x            | 1   | 2
P_{X|Y=2}(x) | 2/3 | 1/3

Example 7.8. The joint p.d.f. of the r.v. (X, Y) is f_{X,Y}(x, y) = 8xy for 0 < x < y < 1.

(i) Find the conditional p.d.f. of X given Y = y. (ii) Find the conditional p.d.f. of Y given X = x.

Solution: f_{X,Y}(x, y) = 8xy for 0 < x < y < 1.

Marginal p.d.f. of X: g_X(x) = ∫_x^1 8xy dy = 4x(1 − x²) for 0 < x < 1.

Marginal p.d.f. of Y: h_Y(y) = ∫_0^y 8xy dx = 4y³ for 0 < y < 1.

(i) The conditional p.d.f. of X given Y = y is given by

g_{X|Y=y}(x) = f_{X,Y}(x, y)/h_Y(y) = 8xy/(4y³) = 2x/y² for 0 < x < y.  (7.8)

(ii) The conditional p.d.f. of Y given X = x is given by

h_{Y|X=x}(y) = f_{X,Y}(x, y)/g_X(x) = 8xy/(4x(1 − x²)) = 2y/(1 − x²) for x < y < 1.  (7.9)

7.4 Independence of r.v.s

Definition 7.6. Independence of r.v.s: Let (X, Y) be a two-dimensional r.v.; X and Y are said to be independent if and only if the events {X ∈ A} and {Y ∈ B} are independent for any Borel sets A, B.

• For a two-dimensional discrete r.v., X, Y are independent if and only if the joint probability mass function is equal to the product of the marginal p.m.f.s, that is,

P_{X,Y}(x, y) = P_X(x) P_Y(y) for all x, y.  (7.10)

• For a two-dimensional continuous r.v., X, Y are independent if and only if the joint probability density is equal to the product of the marginal p.d.f.s, that is,

f_{X,Y}(x, y) = g_X(x) h_Y(y) for all x, y.  (7.11)

• When the r.v.s are independent, their conditional p.m.f.s/p.d.f.s are the same as the marginal p.m.f.s/p.d.f.s.

Example 7.9. 1. Verify whether (X, Y) are independent r.v.s:

(i) The joint p.m.f. of (X, Y) is P(0,0) = 1/9, P(1,1) = 1/9, P(0,1) = 5/9, P(1,0) = 2/9.
(ii) The joint p.d.f. of (X, Y) is f_{X,Y}(x, y) = 8xy for 0 < x < y < 1.

Solution: (i) The joint p.m.f. of (X, Y) is:

Y↓ X→  | 0   | 1   | P_Y(y)
0      | 1/9 | 2/9 | 1/3
1      | 5/9 | 1/9 | 2/3
P_X(x) | 2/3 | 1/3 | 1

P_X(0) P_Y(0) = (2/3)(1/3) = 2/9, while P_{X,Y}(0, 0) = 1/9, so

P_{X,Y}(0, 0) ≠ P_X(0) P_Y(0);

therefore X, Y are not independent.

(ii) The joint density is f_{X,Y}(x, y) = 8xy for 0 < x < y < 1. Marginal p.d.f. of X: g_X(x) = ∫_x^1 8xy dy = 4x(1 − x²) for 0 < x < 1. Marginal p.d.f. of Y: h_Y(y) = ∫_0^y 8xy dx = 4y³ for 0 < y < 1.

f_{X,Y}(x, y) ≠ g_X(x) × h_Y(y),

so X, Y are not independent.
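The criterion (7.10) lends itself to a mechanical check over a finite support. A sketch for the discrete p.m.f. of Example 7.9(i), using exact fractions:

```python
from fractions import Fraction as F

# Joint p.m.f. of Example 7.9(i), keyed by (x, y) with x, y in {0, 1}.
joint = {(0, 0): F(1, 9), (1, 0): F(2, 9), (0, 1): F(5, 9), (1, 1): F(1, 9)}
pX = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
pY = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

independent = all(joint[(x, y)] == pX[x] * pY[y]
                  for x in (0, 1) for y in (0, 1))
assert pX[0] == F(2, 3) and pY[0] == F(1, 3)
assert not independent   # P(0,0) = 1/9 != 2/9 = P_X(0) P_Y(0)
```

A single failing cell is enough to refute independence, but all cells must pass to confirm it.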


7.5 Chapter End Exercises

1. Verify which of the following are joint probability mass functions of (X, Y); if so, find the constant K.

(i) P_{X,Y}(x, y) = K x for x = 1, 2, 3; y = 0, 1, 2.
(ii) P_{X,Y}(x, y) = K x/y for x = 1, 2; y = 1, 4.

2. Verify which of the following are joint probability density functions of (X, Y). If so, (i) find the constant K; (ii) find the marginal p.d.f.s of X, Y; (iii) verify whether they are independent.

(a) f_{X,Y}(x, y) = K for x, y ∈ [0, 1].
(b) f_{X,Y}(x, y) = λμ e^(−λx−μy) for x ≥ 0, y ≥ 0; λ, μ > 0.

3. Given the following joint probability mass functions of (X, Y): (i) find the conditional p.m.f. of Y given X = 2; (ii) find the conditional p.m.f. of X given Y = 2; (iii) also verify their independence.

(I)

Y↓ X→    | 1    | 2    | 3    | P[Y = y]
0        | 1/18 | 2/18 | 3/18 | 1/3
1        | 1/18 | 2/18 | 3/18 | 1/3
2        | 1/18 | 2/18 | 3/18 | 1/3
P[X = x] | 1/6  | 1/3  | 1/2  | 1

(II) Two fair dice are tossed. X is the maximum of the numbers on the two faces and Y is the sum of the numbers on them.


4. The joint p.d.f. of the r.v. (X, Y) is f_{X,Y}(x, y) = 2 for 0 < x < y < 1. (i) Find the conditional p.d.f. of Y given X = x. (ii) Find the conditional p.d.f. of X given Y = y.

5. Find the constant K if the joint p.m.f. of (X, Y) is given as P(x, y) = K (1/3)^x (1/4)^y for x, y = 1, 2, …. Also verify whether X, Y are independent.


8

EXPECTATION, VARIANCE AND

THEIR PROPERTIES

Unit Structure

8.0 Objectives

8.1 Expectation of a r.v.

8.2 Variance of a r.v.

8.3 Characteristic function of a r.v.

8.4 Chapter End Exercises

8.0 Objectives

After going through this chapter you will learn

• The expected value of functions of r.v.s.

• Properties of expectation.

• Variance and its role in studying a r.v.

• The characteristic function and its properties.

8.1 Expectation of a r.v.

Definition 8.1. Expectation:

Case (I): The expected value of a discrete r.v. X assuming values x1, x2, …, with p.m.f. p(x_i) = P[X = x_i], is defined as

E(X) = Σ_i x_i p(x_i),  (8.1)

provided the sum is convergent.

Case (II): The expected value of a continuous r.v. X with p.d.f. f(x) is defined as

E(X) = ∫_{−∞}^{∞} x f(x) dx,  (8.2)

provided the integral is convergent.

The expected value of a r.v. is its average, simply called the mean of the r.v.


Example 8.1. Find the expectation of X if:

(i) X is a r.v. assuming values 0, 1, 2, …, n with probability proportional to C(n, x);
(ii) X → B(n, p);
(iii) X → P(λ);
(iv) X is the no. of tosses of a coin up to and including the first toss showing heads.

Solution: (i) X assumes values 0, 1, 2, …, n with probability proportional to C(n, x), so P[X = x] = K C(n, x). Since Σ_{x=0}^{n} K C(n, x) = K 2^n = 1, K = 2^(−n).

By the definition of expectation,

E(X) = Σ_{x=0}^{n} x p(x) = Σ_{x=0}^{n} x C(n, x)/2^n.

As x C(n, x) = n C(n−1, x−1),

E(X) = (n/2^n) Σ_{x=1}^{n} C(n−1, x−1) = n 2^(n−1)/2^n = n/2.

(iii) X → P(λ):

E(X) = Σ_{x=0}^{∞} x p(x) = Σ_{x=0}^{∞} x e^(−λ) λ^x/x! = λ e^(−λ) Σ_{x=1}^{∞} λ^(x−1)/(x − 1)!.

Using e^λ = 1 + λ + λ²/2! + ⋯,

E(X) = λ e^(−λ) e^λ = λ.

(iv) Let X be the no. of tosses of a coin up to and including the first toss showing heads. Let p be the chance of showing a head, and q = 1 − p the chance of showing a tail. P[X = x] = p q^(x−1) for x = 1, 2, ….

E(X) = Σ_{x=1}^{∞} x p(x) = Σ_{x=1}^{∞} x p q^(x−1) = p[1 + 2q + 3q² + ⋯] = p(1 − q)^(−2) = 1/p.

(For part (ii), E(X) = np; this follows by writing the Binomial X as a sum of n independent Bernoulli r.v.s, and is asked again in the chapter-end exercises.)

Example 8.2. Find the expectation of the r.v. X if:

(i) X assumes values in (0, ∞) with p.d.f. f(x) = λ e^(−λx). [This is the Exponential distribution with parameter λ.]
(ii) X assumes values in (a, b), a, b real numbers, with p.d.f. f(x) = constant. [This is the Rectangular or Uniform distribution.]
(iii) X → N(μ, σ²).

Solution: (i) Since X is an absolutely continuous r.v. with density f(x),

E(X) = ∫_{−∞}^{∞} x f(x) dx  (8.3)
= ∫_0^∞ x λ e^(−λx) dx  (8.4)
= (−x e^(−λx))|_0^∞ + ∫_0^∞ e^(−λx) dx = 0 + 1/λ = 1/λ,

integrating by parts.

(ii) Since the density is constant over (a, b), ∫_a^b K dx = 1 gives f(x) = K = 1/(b − a), so

E(X) = ∫_a^b x/(b − a) dx = (x²/2)|_a^b /(b − a) = (b² − a²)/(2(b − a)) = (a + b)/2.  (8.5)

(iii)

f(x) = [1/(σ√(2π))] e^(−(1/2)((x−μ)/σ)²),  −∞ < x < ∞; σ > 0,  (8.6)

E(X) = ∫_{−∞}^{∞} x [1/(σ√(2π))] e^(−(1/2)((x−μ)/σ)²) dx.  (8.7)

Put z = (x − μ)/σ; then x = σz + μ and dx = σ dz:

E(X) = (1/√(2π)) ∫_{−∞}^{∞} (σz + μ) e^(−z²/2) dz
= (σ/√(2π)) ∫_{−∞}^{∞} z e^(−z²/2) dz + μ (1/√(2π)) ∫_{−∞}^{∞} e^(−z²/2) dz  (8.8)
= σ × 0 + μ = μ,  (8.9)

since the first integrand is an odd function, and (1/√(2π)) ∫_{−∞}^{∞} e^(−z²/2) dz = 1.
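The exponential and uniform means above can be confirmed by crude numerical integration; a sketch under the assumption that a finite upper limit stands in for ∞ (helper names are ours):

```python
import math

def integrate(f, a, b, n=100_000):
    """Midpoint rule for the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Exponential(lam): E[X] = 1/lam; 50/lam stands in for infinity here.
lam = 2.0
mean_exp = integrate(lambda x: x * lam * math.exp(-lam * x), 0.0, 50.0 / lam)
assert abs(mean_exp - 1.0 / lam) < 1e-6

# Uniform(a, b): E[X] = (a+b)/2.
a, b = 1.0, 5.0
mean_unif = integrate(lambda x: x / (b - a), a, b)
assert abs(mean_unif - (a + b) / 2) < 1e-9
```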

Property 8.1. Properties of Expectation:

• Expectation of a function of a r.v.: If g(X) is a function of a r.v. X, then the expected value of g(X), denoted E[g(X)], is defined as

Case (I), discrete r.v.:

E[g(X)] = Σ_i g(x_i) p(x_i),  (8.10)

provided the sum is convergent.

Case (II), continuous r.v.:

E[g(X)] = ∫_{−∞}^{∞} g(x) f(x) dx,  (8.11)

provided the integral is convergent.

• Expectation of a constant: E[C] = C, where C is a constant.

• Effect of change of origin: E[X + A] = E[X] + A, where A is a constant.

• Effect of change of scale: E[AX] = A E[X], where A is a constant.

• Linearity: combining the above two we may write E[AX + B] = A E[X] + B, where A, B are constants.

• Monotonicity: if X ≥ Y, then E[X] ≥ E[Y].

Example 8.3. A r.v. X has mean 1. Find the mean of the following r.v.s: (i) −X, (ii) 2X, (iii) (X + 3)/2, (iv) (X + 2)/2.

Solution: (i) E(−X) = −E(X) = −1. (ii) E(2X) = 2E(X) = 2.

(iii) E((X + 3)/2) = (E(X) + 3)/2 = 2.

(iv) E((X + 2)/2) = (E(X) + 2)/2 = 1.5.


8.2 Variance of a r.v.

The function of a r.v. g(X) = X^r has a special role in the study of a r.v.

Definition 8.2. r-th raw moment of a r.v.: The r-th raw moment of a r.v. X is defined as E[X^r] and is denoted by μ′_r.

• We can check that for r = 1 we get the first raw moment and it is equal to the mean: μ′_1 = E(X).

• The r-th raw moment of a r.v. X is also called the moment about zero.

• Moments can also be defined around an arbitrary origin, that is, E[(X − A)^r].

• In particular, if the arbitrary origin is the mean, the moments are called central moments. So μ_r = E[(X − μ′_1)^r] is called the r-th central moment. The second central moment is an important tool in the study of a r.v., and it gives an idea about the spread or scatter of the values of the variable.

Example 8.4. Show that the first central moment μ_1 = E(X − μ′_1) = 0.

Solution: μ_1 = E(X − μ′_1) = E(X) − μ′_1 = 0, since μ′_1 = E(X).

Example 8.5. Show that E(X − a)² is minimum when a = E(X); hence the variance is the least mean square deviation.

Solution: Consider

(d/da) E(X − a)² = E[(d/da)(X − a)²] = −2 E(X − a) = 0

when a = E(X), provided (d²/da²) E(X − a)² = 2 > 0. Thus V(X) = E(X − E(X))², the mean square deviation about the mean, is minimum.

Definition 8.3. Variance of a r.v.: The variance of a r.v. is its second central moment.

• [Variance of a r.v.] If X is a r.v., then the variance of X, denoted by V(X), is defined as:

Case (I), discrete r.v.:

V(X) = Σ_i (x_i − μ′_1)² p(x_i),  (8.12)

provided the sum is convergent.

Case (II), continuous r.v.:

V(X) = ∫_{−∞}^{∞} (x − μ′_1)² f(x) dx,  (8.13)

provided the integral is convergent.

• V(X) = E[X²] − (E[X])²; for computational purposes we use this formula.

• Variance of a constant: V[C] = 0, where C is a constant.

• Effect of change of origin: V[X + A] = V[X], where A is a constant.

• Effect of change of scale: V[AX] = A² V[X], where A is a constant.

• Combining the above two we may write V[AX + B] = A² V[X], where A, B are constants.

• The positive square root of the variance is called the standard deviation (s.d.) of the r.v.

Example 8.6. A r.v. X has variance 4. Find the variance and s.d. of the following r.v.s: (i) −X, (ii) 2X, (iii) (X + 3)/2, (iv) (X + 2)/2.

Solution:

(i) V(−X) = V(X) = 4, s.d. = 2.
(ii) V(2X) = 2² V(X) = 16, s.d. = 4.
(iii) V((X + 3)/2) = V(X)/2² = 1, s.d. = 1.
(iv) V((X + 2)/2) = V(X)/2² = 1, s.d. = 1.

Example 8.7. Find the variance of the following r.v.s:

(i) X → P(λ);
(ii) X has the Exponential distribution with parameter λ;
(iii) X has the Uniform distribution over (a, b).

Solution: (i) X → P(λ), so, as shown in the above exercise, E(X) = λ.

Now consider E[X(X − 1)] = E[X²] − E[X]:

E[X(X − 1)] = Σ_{x=0}^{∞} x(x − 1) p(x) = Σ_{x=2}^{∞} x(x − 1) e^(−λ) λ^x/x! = λ² e^(−λ) Σ_{x=2}^{∞} λ^(x−2)/(x − 2)!.

Using e^λ = 1 + λ + λ²/2! + ⋯,

E[X(X − 1)] = λ² e^(−λ) e^λ = λ².

So E[X²] = E[X(X − 1)] + E[X] = λ² + λ, and

V(X) = E[X²] − (E[X])² = λ² + λ − λ² = λ.

Thus if X → P(λ), then V(X) = λ.

(ii) X has the Exponential distribution with parameter λ:

E[X²] = ∫_0^∞ x² λ e^(−λx) dx  (8.14)
= (−x² e^(−λx))|_0^∞ + 2 ∫_0^∞ x e^(−λx) dx = 2/λ²,

since E(X) = ∫_0^∞ x λ e^(−λx) dx = 1/λ. Hence

V(X) = E[X²] − (E[X])² = 2/λ² − (1/λ)² = 1/λ².

(iii) Since X has the Uniform distribution over (a, b), as seen above, E(X) = (a + b)/2.

E[X²] = ∫_a^b x²/(b − a) dx = (x³/3)|_a^b /(b − a) = (b³ − a³)/(3(b − a)) = (a² + ab + b²)/3.  (8.15)

V(X) = E[X²] − (E[X])² = (a² + ab + b²)/3 − ((a + b)/2)² = (b − a)²/12.

8.3 Characteristic function of a r.v.

A complex-valued function of a r.v. that is useful for studying various properties of a r.v. is known as the characteristic function (ch.f.).

Definition 8.4. Characteristic function: For a r.v. X, the complex-valued function denoted by Φ_X(t) is defined as Φ_X(t) = E[e^(itX)], where t ∈ ℝ and i = √(−1).

I) For a discrete r.v.: A discrete r.v. X having p.m.f. P_X has ch.f. given by

Φ_X(t) = Σ_x P_X(x) e^(itx),  (8.16)

t ∈ ℝ and i = √(−1).

II) For a continuous r.v.: A continuous r.v. X having p.d.f. f_X(x) has ch.f. given by

Φ_X(t) = ∫_{−∞}^{∞} f_X(x) e^(itx) dx,  (8.17)

t ∈ ℝ and i = √(−1).

• We can also write Φ_X(t) = ∫_{−∞}^{∞} e^(itx) dF(x), which includes all r.v.s.

• Φ_X(t) = E[e^(itX)] = E[cos tX] + i E[sin tX].

• Re Φ_X(t) = E[cos tX] is the real part of Φ_X(t), and Im Φ_X(t) = E[sin tX] is the imaginary part of Φ_X(t).

Example 8.8. Find the ch.f.s of the following r.v.s:

(i) X → B(n, p);
(ii) X has p.m.f. p(x) = p q^x, x = 0, 1, 2, … (the Geometric distribution with parameter p);
(iii) X → N(μ, σ²).

Solution: (i)

Φ_X(t) = Σ_{x=0}^{n} C(n, x) p^x q^(n−x) e^(itx)  (8.18)

(t ∈ ℝ and i = √(−1))

= Σ_{x=0}^{n} C(n, x) (p e^(it))^x q^(n−x) = (q + p e^(it))^n,  (8.19)

using the Binomial expansion. The ch.f. of X → B(n, p) is Φ_X(t) = (q + p e^(it))^n.

(ii)

Φ_X(t) = Σ_{x=0}^{∞} p q^x e^(itx) = p/(1 − q e^(it)),  (8.20)

using the geometric series with common ratio q e^(it), |q e^(it)| < 1. So the ch.f. of the Geometric distribution with parameter p is Φ_X(t) = p/(1 − q e^(it)).

(iii)

Φ_X(t) = ∫_{−∞}^{∞} e^(itx) [1/(σ√(2π))] e^(−(1/2)((x−μ)/σ)²) dx.  (8.21)

Put z = (x − μ)/σ; then x = σz + μ and dx = σ dz:

Φ_X(t) = e^(iμt) (1/√(2π)) ∫_{−∞}^{∞} e^(itσz) e^(−z²/2) dz
= e^(iμt − σ²t²/2) (1/√(2π)) ∫_{−∞}^{∞} e^(−(z − itσ)²/2) dz  (8.22)
= e^(iμt − σ²t²/2),  (8.23)

since (1/√(2π)) ∫_{−∞}^{∞} e^(−(z − itσ)²/2) dz = 1.

If X → N(μ, σ²), then its ch.f. is Φ_X(t) = e^(iμt − σ²t²/2).

Property 8.2. Properties of the ch.f.:

• Φ_X(0) = 1.

• |Φ_X(t)| ≤ 1 for all t.

• Φ_X(−t) = conj(Φ_X(t)), the complex conjugate of Φ_X(t).

• Φ_X(t) is a uniformly continuous function of t.

• Φ_{aX+b}(t) = e^(ibt) Φ_X(at).

• Φ_X(t) generates the moments of the r.v. X. The coefficient of (it)^r/r! in the expansion of Φ_X is the r-th moment of X. We can also get it from the r-th derivative of Φ_X:

μ′_r = (1/i^r) (d^r/dt^r) Φ_X(t)|_{t=0}.  (8.24)

• If X and Y are independent r.v.s, the ch.f. of X + Y is equal to the product of their ch.f.s.

• The product of any two ch.f.s is also a ch.f. Thus any power of Φ_X(t) is also a ch.f.


Example 8.9. Find the ch.f. of X which is a Uniform r.v. over (0, 1). Hence find that of −X.

Solution: f(x) = 1 for 0 < x < 1.

Φ_X(t) = ∫_0^1 e^(itx) dx = (e^(it) − 1)/(it).

Φ_{−X}(t) = Φ_X(−t) = (e^(−it) − 1)/(−it) = (1 − e^(−it))/(it).

Example 8.10. Using the above result, find the ch.f. of X − Y if X, Y are i.i.d. Uniform (0, 1).

Solution: Φ_X(t) = (e^(it) − 1)/(it), Φ_Y(t) = (e^(it) − 1)/(it).

X and Y are independent, so X and −Y are also independent; by the property of ch.f.s,

Φ_{X−Y}(t) = Φ_X(t) Φ_{−Y}(t) = [(e^(it) − 1)/(it)] [(e^(−it) − 1)/(−it)] = (2 − e^(it) − e^(−it))/t².

Example 8.11. Is cos² t a ch.f.?

Solution: cos t = (e^(it) + e^(−it))/2, so

cos² t = (e^(2it) + 2 + e^(−2it))/4 = E[e^(itX)],

where X has p.m.f.:

X    | −2  | 0   | 2
p(x) | 1/4 | 1/2 | 1/4

So cos² t is a ch.f.
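Example 8.11 can be verified with complex arithmetic: the ch.f. of the three-point distribution above equals cos² t at every t. A sketch (the helper name is ours):

```python
import cmath, math

def chf(t, support, probs):
    """Characteristic function E[e^{itX}] of a discrete r.v."""
    return sum(p * cmath.exp(1j * t * x) for x, p in zip(support, probs))

# X takes -2, 0, 2 with probabilities 1/4, 1/2, 1/4.
for t in (0.0, 0.3, 1.0, 2.5):
    val = chf(t, (-2, 0, 2), (0.25, 0.5, 0.25))
    assert abs(val - math.cos(t) ** 2) < 1e-12   # matches cos^2 t (real-valued)
```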

Theorem 8.1. Let X1, X2, …, Xn be r.v.s with d.f.s F_{X1}, F_{X2}, …, F_{Xn} respectively and ch.f.s Φ_{X1}, Φ_{X2}, …, Φ_{Xn} respectively. Then for constants a1, a2, …, an such that a_i ≥ 0 and Σ_i a_i = 1, Σ_i a_i Φ_{Xi} is a ch.f., namely that of Σ_i a_i F_{Xi}.

The following theorem characterizes the ch.f. and its density or d.f. uniquely.


Theorem 8.2. Inversion theorem: If X is an absolutely continuous r.v. such that

∫_{−∞}^{∞} |Φ_X(t)| dt < ∞,

then its p.d.f. is given by

f_X(x) = (1/2π) ∫_{−∞}^{∞} e^(−itx) Φ_X(t) dt.  (8.25)

Example 8.12. Show that e^(Φ(t)−1) is a ch.f. if Φ(t) is the ch.f. of some d.f. F(x).

Solution:

e^(Φ(t)−1) = e^(−1) e^(Φ(t)) = e^(−1) [1 + Φ(t) + Φ(t)²/2! + Φ(t)³/3! + ⋯] = Σ_{j=0}^{∞} a_j Φ_j(t),

where a_j = e^(−1)/j! with Σ_{j=0}^{∞} a_j = 1, and Φ_j(t) = Φ(t)^j is the ch.f. of the j-fold convolution of F(x). Hence, by Theorem 8.1, e^(Φ(t)−1) is a ch.f.

Example 8.13. f(x) = e^(−|x|)/2 for −∞ < x < ∞. Find its ch.f.

Solution:

Φ_X(t) = ∫_{−∞}^{∞} e^(itx) e^(−|x|)/2 dx
= (1/2) ∫_{−∞}^{0} e^((1+it)x) dx + (1/2) ∫_0^∞ e^((it−1)x) dx
= (1/2) [1/(1 + it) + 1/(1 − it)] = 1/(1 + t²).

Conversely, let Φ_X(t) = e^(−|t|); find its p.d.f.

Solution: Since, from the above example, the ch.f. of f(x) = e^(−|x|)/2 is 1/(1 + t²), we write

∫_{−∞}^{∞} e^(itx) (e^(−|x|)/2) dx = 1/(1 + t²).

Replacing t by −y in the above equation gives

∫_{−∞}^{∞} e^(−iyx) (e^(−|x|)/2) dx = 1/(1 + y²).

Renaming the integration variable x as t in the above equation and multiplying by 1/π gives

(1/π) 1/(1 + y²) = (1/2π) ∫_{−∞}^{∞} e^(−iyt) e^(−|t|) dt.

By the inversion theorem, e^(−|t|) is the ch.f. of the r.v. whose density is

f(y) = 1/(π(1 + y²)) for −∞ < y < ∞.

This is the Cauchy density.

8.4 Chapter End Exercises

1. Find the expectation and variance of X if:

(i) X → B(n, p);
(ii) X has the Uniform distribution over (−1, 1);
(iii) the p.d.f. of X is

f(x) = 1 + x for −1 ≤ x ≤ 0; 1 − x for 0 ≤ x ≤ 1; 0 otherwise;  (8.26)

(iv) X has p.m.f. p(x) = p q^x, x = 0, 1, 2, ….

2. Find the ch.f. of (i) X having the Poisson distribution with parameter λ; (ii) X1 + X2, where X1 → N(μ1, σ1²) and X2 → N(μ2, σ2²) are independent r.v.s.

3. Are the following ch.f.s? (i) cos t; (ii) Re Φ_X(t); (iii) Σ_k Φ_{X_k}(t) p_k, where {p_k} is a p.m.f.; (iv) 1/(1 − t).


9

THEOREMS ON EXPECTATION AND

CONDITIONAL EXPECTATION

Unit Structure

9.0 Objectives

9.1 Expectation of a function of two dimensional r.v.s

9.2 Conditional Expectation

9.3 Chapter End Exercises

9.0 Objectives

After going through this chapter you will learn

• Expectation of a function of a two-dimensional r.v.

• Theorems on expectation.

• Some inequalities based on expectations.

• Conditional expectation and its relation with simple expectation.

9.1 Expectation of a function of two dimensional r.v.s

Definition 9.1. Expectation of $g(X, Y)$: Let $g(X, Y)$ be a function of the two dimensional r.v.s $(X, Y)$; its expected value, denoted by $E[g(X, Y)]$, is defined as follows.

Case (I): Discrete r.v.s with joint p.m.f. $P_{X,Y}(x_i, y_j)$:

$$E[g(X, Y)] = \sum_{j=1}^{n}\sum_{i=1}^{m} g(x_i, y_j)\,p(x_i, y_j) \qquad (9.1)$$

provided the sum is convergent.

Case (II): Continuous r.v.s with joint p.d.f. $f_{X,Y}(x, y)$:

$$E[g(X, Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x, y)\,f(x, y)\,dx\,dy \qquad (9.2)$$

provided the integral is convergent.


Theorem 9.1. Addition theorem on expectation: Let $(X, Y)$ be two dimensional r.v.s; then

$$E[X + Y] = E[X] + E[Y] \qquad (9.3)$$

Proof: We assume that the r.v.s $(X, Y)$ are continuous with joint p.d.f. $f_{X,Y}(x, y)$ and marginal p.d.f.s $g_X(x)$, $h_Y(y)$.

$$E[X + Y] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x + y)\,f_{X,Y}(x, y)\,dx\,dy$$

$$= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\,f_{X,Y}(x, y)\,dx\,dy + \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} y\,f_{X,Y}(x, y)\,dx\,dy$$

$$= \int_{-\infty}^{\infty} x\left[\int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dy\right]dx + \int_{-\infty}^{\infty} y\left[\int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dx\right]dy$$

$$= \int_{-\infty}^{\infty} x\,g_X(x)\,dx + \int_{-\infty}^{\infty} y\,h_Y(y)\,dy \quad \text{from the definition of } g_X(x),\ h_Y(y)$$

$$= E[X] + E[Y] \quad \text{from the definition of } E[X] \text{ and } E[Y]$$

Hence the proof.

Theorem 9.2. Multiplication theorem on expectation: Let $(X, Y)$ be two dimensional independent r.v.s; then

$$E[XY] = E[X]\,E[Y] \qquad (9.4)$$

Proof: We assume that the r.v.s $(X, Y)$ are continuous with joint p.d.f. $f_{X,Y}(x, y)$ and marginal p.d.f.s $g_X(x)$, $h_Y(y)$.

$$E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\,f_{X,Y}(x, y)\,dx\,dy$$

$$= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\,g_X(x)\,h_Y(y)\,dx\,dy \quad \text{by independence of } X \text{ and } Y$$

$$= \int_{-\infty}^{\infty} x\,g_X(x)\,dx \int_{-\infty}^{\infty} y\,h_Y(y)\,dy$$

$$= E[X]\,E[Y] \quad \text{from the definition of } E[X] \text{ and } E[Y]$$

Hence the proof.

The above theorems can be generalized to $n$ variables $X_1, X_2, \ldots, X_n$:

$$E[X_1 + X_2 + \cdots + X_n] = E[X_1] + E[X_2] + \cdots + E[X_n] \qquad (9.5)$$

For $n$ independent random variables $X_1, X_2, \ldots, X_n$:

$$E[X_1 X_2 \cdots X_n] = E[X_1]\,E[X_2]\cdots E[X_n] \qquad (9.6)$$
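A quick Monte Carlo illustration of (9.3) and (9.4) (not from the text; the distributions are arbitrary choices): for independent $X \sim U(0,1)$ and $Y \sim U(0,2)$, $E[X+Y] = 1.5$ and $E[XY] = E[X]E[Y] = 0.5$.

```python
import random

random.seed(0)
n = 200_000
xs = [random.random() for _ in range(n)]        # X ~ Uniform(0, 1)
ys = [2 * random.random() for _ in range(n)]    # Y ~ Uniform(0, 2), independent of X

mean = lambda v: sum(v) / len(v)
e_sum = mean([x + y for x, y in zip(xs, ys)])   # estimates E[X + Y] = 1.5
e_prod = mean([x * y for x, y in zip(xs, ys)])  # estimates E[XY] = E[X]E[Y] = 0.5
```

Note that the multiplication theorem needs independence: sampling the same uniform for both coordinates would make `e_prod` estimate $E[X^2] = 1/3$, not $E[X]^2 = 1/4$.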

Example 9.1. Show that $E[aX + bY + c] = aE[X] + bE[Y] + c$.

Solution: Consider continuous r.v.s $(X, Y)$ with joint p.d.f. $f_{X,Y}(x, y)$ and marginal p.d.f.s $g_X(x)$, $h_Y(y)$.

$$E[aX + bY + c] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (ax + by + c)\,f_{X,Y}(x, y)\,dx\,dy \qquad (9.7)$$

$$= a\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\,f_{X,Y}(x, y)\,dx\,dy + b\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} y\,f_{X,Y}(x, y)\,dx\,dy + c\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dx\,dy$$

$$= a\int_{-\infty}^{\infty} x\,g_X(x)\,dx + b\int_{-\infty}^{\infty} y\,h_Y(y)\,dy + c \quad \text{by the property of the joint p.d.f. and the definition of } g_X(x),\ h_Y(y)$$

$$= aE[X] + bE[Y] + c \quad \text{by the definition of } E[X] \text{ and } E[Y]$$

If $X \ge Y$, then prove that $E[X] \ge E[Y]$.

Solution: If $X \ge Y$, then $X - Y \ge 0$, and hence $E[X - Y] \ge 0$. From the property of expectation and the addition theorem, $E[X] - E[Y] \ge 0$, or $E[X] \ge E[Y]$.

Example 9.2. Prove for any two r.v.s $X$, $Y$:

$$\left(E[XY]\right)^2 \le E[X^2]\,E[Y^2]$$

[This is the Cauchy–Schwarz inequality.]

Solution: Consider the function $h(a) = E\left[(X - aY)^2\right]$:

$$h(a) = E[X^2] - 2a\,E[XY] + a^2\,E[Y^2]$$

$$\frac{d\,h(a)}{da} = 2a\,E[Y^2] - 2E[XY] = 0$$

gives

$$a = \frac{E[XY]}{E[Y^2]}$$

And

$$\frac{d^2 h(a)}{da^2} = 2E[Y^2] \ge 0$$

Thus $h(a)$ is minimum when $a = \dfrac{E[XY]}{E[Y^2]}$:

$$h\!\left(\frac{E[XY]}{E[Y^2]}\right) = E[X^2] - 2\,\frac{\left(E[XY]\right)^2}{E[Y^2]} + \frac{\left(E[XY]\right)^2}{E[Y^2]} = E[X^2] - \frac{\left(E[XY]\right)^2}{E[Y^2]}$$

Since $h(a) = E\left[(X - aY)^2\right] \ge 0$,

$$E[X^2] - \frac{\left(E[XY]\right)^2}{E[Y^2]} \ge 0$$

which gives

$$\left(E[XY]\right)^2 \le E[X^2]\,E[Y^2]$$

Example 9.3. Show that

$$E\!\left[\frac{1}{X^2}\right] \ge \frac{1}{E[X^2]}$$

Solution: Using the Cauchy–Schwarz inequality with $Y = \dfrac{1}{X}$,

$$1 = \left(E\!\left[X \cdot \frac{1}{X}\right]\right)^2 \le E[X^2]\;E\!\left[\frac{1}{X^2}\right]$$

Dividing the inequality by $E[X^2] > 0$ gives

$$E\!\left[\frac{1}{X^2}\right] \ge \frac{1}{E[X^2]}$$

Example 9.4. Show that for any two r.v.s $X$, $Y$:

$$\sqrt{E\left[(X+Y)^2\right]} \le \sqrt{E[X^2]} + \sqrt{E[Y^2]}$$

Solution: Consider

$$E\left[(X+Y)^2\right] = E[X^2] + 2E[XY] + E[Y^2]$$

$$\le E[X^2] + 2\sqrt{E[X^2]\,E[Y^2]} + E[Y^2] \quad \text{by the Cauchy–Schwarz inequality}$$

$$= \left(\sqrt{E[X^2]} + \sqrt{E[Y^2]}\right)^2$$

Taking the square root,

$$\sqrt{E\left[(X+Y)^2\right]} \le \sqrt{E[X^2]} + \sqrt{E[Y^2]}$$
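The sample analogue of the Cauchy–Schwarz inequality holds exactly for any data set, since $\left(\sum x_i y_i\right)^2 \le \sum x_i^2 \sum y_i^2$, so a simulated check (with arbitrarily chosen, deliberately correlated distributions) must satisfy the bound of Example 9.2:

```python
import random

random.seed(1)
n = 50_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [x + random.gauss(0, 2) for x in xs]   # Y deliberately correlated with X

e = lambda v: sum(v) / len(v)
lhs = e([x * y for x, y in zip(xs, ys)]) ** 2           # sample (E[XY])^2
rhs = e([x * x for x in xs]) * e([y * y for y in ys])   # sample E[X^2] E[Y^2]
```

Equality would hold only if the samples were exactly proportional, i.e. $Y = aX$ for a constant $a$.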


9.2 Conditional Expectation

Definition 9.2. Conditional Expectation of X:

Case (I): Let $(X, Y)$ be a two dimensional discrete r.v. with joint probability mass function $P_{X,Y}(x_i, y_j)$, $i = 1$ to $n$, $j = 1$ to $m$. The conditional expected value of $X$ given $Y = y_j$, denoted by $E[X \mid Y = y_j]$, is defined as

$$E[X \mid Y = y_j] = \sum_{i=1}^{n} x_i\,P(X = x_i \mid Y = y_j) \qquad (9.8)$$

Case (II): Let $(X, Y)$ be a two dimensional continuous r.v. with joint probability density function $f_{X,Y}(x, y)$. The conditional expected value of $X$ given $Y = y$, denoted by $E[X \mid Y = y]$, is defined as

$$E[X \mid Y = y] = \int_{-\infty}^{\infty} x\,g_{X|Y}(x \mid y)\,dx \qquad (9.9)$$

Similarly, using conditional p.m.f.s or p.d.f.s of $Y$ given $X$, we can define the conditional expected value of $Y$.

Definition 9.3. Conditional Expectation of Y:

Case (I): Let $(X, Y)$ be a two dimensional discrete r.v. with joint probability mass function $P_{X,Y}(x_i, y_j)$, $i = 1$ to $n$, $j = 1$ to $m$. The conditional expected value of $Y$ given $X = x_i$, denoted by $E[Y \mid X = x_i]$, is defined as

$$E[Y \mid X = x_i] = \sum_{j=1}^{m} y_j\,P(Y = y_j \mid X = x_i) \qquad (9.10)$$

Case (II): Let $(X, Y)$ be a two dimensional continuous r.v. with joint probability density function $f_{X,Y}(x, y)$. The conditional expected value of $Y$ given $X = x$, denoted by $E[Y \mid X = x]$, is defined as

$$E[Y \mid X = x] = \int_{-\infty}^{\infty} y\,h_{Y|X}(y \mid x)\,dy \qquad (9.11)$$

Theorem 9.3. For any two r.v.s $X$ and $Y$:

$$E_Y\left[E[X \mid Y = y]\right] = E[X] \quad \text{and} \quad E_X\left[E[Y \mid X = x]\right] = E[Y]$$

Proof: Consider

$$E[X \mid y] = \int_{-\infty}^{\infty} x\,g_{X|Y}(x \mid y)\,dx$$

Multiply both sides of the above equation by $h(y)$ and integrate with respect to $y$ to get

$$\int_{-\infty}^{\infty} h(y)\,E[X \mid y]\,dy = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\,h(y)\,g_{X|Y}(x \mid y)\,dx\,dy$$

The L.H.S. becomes $E_Y\left[E[X \mid y]\right]$, and since $h(y)\,g_{X|Y}(x \mid y) = f_{X,Y}(x, y)$, the R.H.S. becomes

$$\int_{-\infty}^{\infty} x\left[\int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dy\right]dx$$

$$= \int_{-\infty}^{\infty} x\,g_X(x)\,dx \quad \text{by definition of } g_X(x) \qquad (9.12)$$

$$= E[X] \qquad (9.13)$$

Hence the proof. We can similarly prove $E_X\left[E[Y \mid X = x]\right] = E[Y]$.

Example 9.5. Find the conditional means of $X$ and $Y$ for the following r.v. The joint p.d.f. of $(X, Y)$ is $f(x, y) = 8xy$ for $0 < x < y < 1$.

Solution: From the definition of the conditional p.d.f., the conditional p.d.f. of $X$ given $Y = y$ is

$$g_{X|Y=y}(x) = \frac{2x}{y^2} \quad \text{for } 0 < x < y$$

$$E[X \mid y] = \int_{0}^{y} x\,\frac{2x}{y^2}\,dx = \frac{2y}{3}$$

Similarly, the conditional p.d.f. of $Y$ given $X = x$ is

$$h_{Y|X=x}(y) = \frac{2y}{1 - x^2} \quad \text{for } x < y < 1$$

$$E[Y \mid x] = \int_{x}^{1} y\,\frac{2y}{1 - x^2}\,dy = \frac{2}{3}\,\frac{1 - x^3}{1 - x^2}$$
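The joint density above can be sampled exactly: the marginal $f_Y(y) = 4y^3$ gives $Y = U^{1/4}$ by inverse-CDF sampling, and the conditional density $2x/y^2$ gives $X = y\sqrt{U}$. This also illustrates Theorem 9.3: $E\left[E[X \mid Y]\right] = E[2Y/3]$ should match $E[X] = 8/15$. A sketch, with the sampler derived here rather than taken from the text:

```python
import random

random.seed(2)
n = 200_000
tot_x, tot_cond = 0.0, 0.0
for _ in range(n):
    # Sample (X, Y) from f(x, y) = 8xy on 0 < x < y < 1:
    y = random.random() ** 0.25      # marginal f_Y(y) = 4y^3   =>  Y = U^(1/4)
    x = y * random.random() ** 0.5   # conditional 2x/y^2       =>  X = y*sqrt(U)
    tot_x += x                       # accumulates E[X]
    tot_cond += 2 * y / 3            # accumulates E[E[X|Y]] via E[X|y] = 2y/3

ex, e_cond = tot_x / n, tot_cond / n   # both should be near 8/15 ~ 0.5333
```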

Example 9.6. Find $E[X \mid Y = 2]$ and $E[Y \mid X = 2]$ if the joint p.m.f. of the r.v.s $(X, Y)$ is as given below.

| Y↓ X→ | 1 | 2 | $P_Y(y)$ |
|---|---|---|---|
| 1 | 1/4 | 0 | 1/4 |
| 2 | 1/2 | 1/4 | 3/4 |
| $P_X(x)$ | 3/4 | 1/4 | 1 |

Solution: The conditional p.m.f. of $Y$ given $X = 2$ is

| y | 1 | 2 |
|---|---|---|
| $P_{Y\mid X=2}(y)$ | 0 | 1 |

$$E[Y \mid X = 2] = \sum_y y\,P(Y = y \mid X = 2) = 2$$

The conditional p.m.f. of $X$ given $Y = 2$ is

| x | 1 | 2 |
|---|---|---|
| $P_{X\mid Y=2}(x)$ | 2/3 | 1/3 |

$$E[X \mid Y = 2] = \sum_x x\,P(X = x \mid Y = 2) = \frac{4}{3}$$
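The conditional means above can be checked mechanically from the table, using exact rational arithmetic (the helper names are ours, not the text's):

```python
from fractions import Fraction as F

# Joint p.m.f. of (X, Y) from the table in the example above
p = {(1, 1): F(1, 4), (2, 1): F(0), (1, 2): F(1, 2), (2, 2): F(1, 4)}

def e_y_given_x(x0):
    marg = sum(pr for (x, y), pr in p.items() if x == x0)            # P_X(x0)
    return sum(y * pr for (x, y), pr in p.items() if x == x0) / marg

def e_x_given_y(y0):
    marg = sum(pr for (x, y), pr in p.items() if y == y0)            # P_Y(y0)
    return sum(x * pr for (x, y), pr in p.items() if y == y0) / marg
```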

9.3 Chapter End Exercises

1. If $X$ and $Y$ are independent, then show that the conditional means are the same as the simple means.

2. Show that $E|X - \mu| \le \sqrt{E(X - \mu)^2}$.

3. Find the conditional means of $X$ and $Y$ for the following r.v. The joint p.d.f. is $f(x, y) = 2$ for $0 < x < y < 1$.

4. Given the joint p.m.f. of $(X, Y)$ as $P[X = x, Y = y] = \dfrac{x + 3y}{24}$ for $x, y = 1, 2$, find the conditional mean of $Y$ given $X = 1$ and the conditional mean of $X$ given $Y = 1$.

5. For the above problems 3 and 4, find $E[XY]$ and $E[X + Y]$.

6. Two balls are drawn from an urn containing one yellow, two red and three blue balls. Let $X$ be the number of red balls drawn, and $Y$ be the number of blue balls drawn. Find the joint p.m.f. of $(X, Y)$, and hence $E[X \mid Y = 2]$ and $E[Y \mid X = 2]$.


10

LAW OF LARGE NUMBERS AND

CENTRAL LIMIT THEOREM

Unit Structure

10.0 Objectives

10.1 Chebyshev’s inequality

10.2 Modes of Convergence

10.3 Laws of large numbers

10.4 Central Limit Theorem

10.5 Chapter End Exercises

10.0 Objectives

After going through this chapter you will learn

• Chebyshev's inequality and its applications.

• Various modes of convergence and their interrelations.

• Weak law of large numbers and a sufficient condition for a sequence to obey this law.

• Strong law of large numbers and a sufficient condition for a sequence to obey this law.

• CLT: an important theorem for finding probabilities using the Normal approximation.

10.1 Chebyshev’s inequality

In this chapter we will study asymptotic behavior of the sequence of r.v.s.

Theorem 10.1. Let $X$ be a non-negative r.v. with finite mean; then for any $C > 0$:

$$P[X \ge C] \le \frac{E[X]}{C} \qquad (10.1)$$

Proof:

$$E[X] = \int_{-\infty}^{\infty} x\,f_X(x)\,dx \qquad (10.2)$$

Consider the event $U = \{X \ge C\}$. So,

$$E[X] = \int_{X < C} x\,f_X(x)\,dx + \int_{X \ge C} x\,f_X(x)\,dx \qquad (10.3)$$

$$\ge \int_{X \ge C} x\,f_X(x)\,dx$$

$$\ge C\int_{X \ge C} f_X(x)\,dx$$

$$= C\,P[X \ge C]$$

Hence we get

$$P[X \ge C] \le \frac{E[X]}{C} \qquad (10.4)$$

The following inequality follows directly from the above theorem.

Theorem 10.2. Chebyshev's inequality: Let $X$ be a r.v. with mean $\mu$ and variance $\sigma^2$; then for any $C > 0$:

$$P\left[|X - \mu| \ge C\right] \le \frac{\sigma^2}{C^2}$$

Proof: $|X - \mu| \ge C$ implies $(X - \mu)^2 \ge C^2$. So

$$P\left[|X - \mu| \ge C\right] = P\left[(X - \mu)^2 \ge C^2\right] \le \frac{E\left[(X - \mu)^2\right]}{C^2}$$

using the above theorem for the non-negative r.v. $(X - \mu)^2$, whose mean is $\sigma^2$. Hence

$$P\left[|X - \mu| \ge C\right] \le \frac{\sigma^2}{C^2} \qquad (10.5)$$

• The inequality can also be written as

$$P\left[|X - \mu| < C\right] \ge 1 - \frac{\sigma^2}{C^2} \qquad (10.6)$$

This gives a lower bound on the probability that the r.v. deviates from its mean by less than $C$.

• If $C$ is replaced by $k\sigma$, where $k > 0$, the inequality reduces to the upper bound

$$P\left[|X - \mu| \ge k\sigma\right] \le \frac{1}{k^2} \qquad (10.7)$$

• By complementation we can also write a lower bound:

$$P\left[|X - \mu| < k\sigma\right] \ge 1 - \frac{1}{k^2} \qquad (10.8)$$

• If $k = 2$, the lower bound is $\frac{3}{4}$, which means that 75% of the time the r.v. assumes values in $(\mu - 2\sigma, \mu + 2\sigma)$.

• The bounds given by Chebyshev's inequality are theoretical rather than practical, in the sense that the bounds are rarely attained by a r.v.

• The inequality is useful when information about the probability distribution of the r.v. is not available but the mean and variance are known.

Example 10.1. A r.v. $X$ has mean 40 and variance 12. Find a bound for the probability $P[X \le 32] + P[X \ge 48]$.

Solution:

$$P[X \le 32] + P[X \ge 48] = P\left[|X - 40| \ge 8\right]$$

By Chebyshev's inequality,

$$P\left[|X - 40| \ge 8\right] \le \frac{\text{variance}}{8^2} = \frac{12}{64} = 0.1875$$

Example 10.2. An unbiased coin is tossed 400 times; find a bound on the probability that the number of heads lies between 160 and 240.

Solution: $X$ has the $B\left(400, \frac{1}{2}\right)$ distribution, so $X$ has mean 200 and variance 100.

$$P[160 < X < 240] = P\left[|X - 200| < 40\right]$$

By Chebyshev's inequality, with $k = 4$ and $\sigma = 10$,

$$P\left[|X - 200| < 40\right] \ge 1 - \frac{1}{16}$$

This gives the lower bound 0.9375.
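Since $X$ here is $B(400, \frac{1}{2})$, the Chebyshev lower bound 0.9375 can be compared against the exact binomial probability, computed by brute force (`math.comb` requires Python 3.8 or later):

```python
from math import comb

n = 400
# Exact P(160 < X < 240) = P(161 <= X <= 239) for X ~ B(400, 1/2)
exact = sum(comb(n, k) for k in range(161, 240)) * 0.5 ** n
chebyshev_lower = 1 - 100 / 40 ** 2   # 1 - sigma^2 / C^2 = 0.9375
```

The exact probability is far closer to 1 than 0.9375, illustrating the remark above that Chebyshev bounds are rarely tight.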


10.2 Modes of Convergence

The modes of convergence are introduced in order to state the laws of large numbers that follow.

Definition 10.1. Convergence in Probability: Let $(\Omega, \mathcal{A}, P)$ be a probability space and $\{X_n\}$ a sequence of r.v.s. $X_n$ is said to converge in probability to a r.v. $X$ from the same space if, for any $\varepsilon > 0$,

$$\lim_{n \to \infty} P\left(|X_n - X| \ge \varepsilon\right) = 0 \qquad (10.9)$$

We say that $X_n \xrightarrow{P} X$.

Definition 10.2. Almost Sure Convergence: Let $(\Omega, \mathcal{A}, P)$ be a probability space and $\{X_n\}$ a sequence of r.v.s. $X_n$ is said to converge almost surely to a r.v. $X$ from the same space if, for any $\varepsilon > 0$,

$$P\left(\lim_{n \to \infty} |X_n - X| > \varepsilon\right) = 0 \qquad (10.10)$$

We say that $X_n \xrightarrow{a.s.} X$.

Definition 10.3. Convergence in Distribution: Let $(\Omega, \mathcal{A}, P)$ be a probability space and $\{X_n\}$ a sequence of r.v.s with d.f.s $\{F_n\}$. $X_n$ is said to converge in distribution to a r.v. $X$ from the same space if there exists a d.f. $F$ of $X$ such that $F_n$ converges to $F$ at all continuity points of $F$.

• Almost sure convergence implies convergence in probability.

• Convergence in probability implies convergence in distribution.

10.3 Laws of large numbers

Theorem 10.3. Weak Law of Large Numbers (WLLN): Let $X_1, X_2, \ldots, X_n$ be independent r.v.s with means $\mu_1, \mu_2, \ldots, \mu_n$ respectively and finite variances $\sigma_1^2, \sigma_2^2, \ldots, \sigma_n^2$ respectively, and let $S_n = \sum X_i$. If, for any $\varepsilon > 0$,

$$\lim_{n \to \infty} P\left(\left|\frac{S_n}{n} - \frac{\sum_{i=1}^{n}\mu_i}{n}\right| \ge \varepsilon\right) = 0 \qquad (10.11)$$

we say the Weak Law of Large Numbers (WLLN) holds for the sequence of r.v.s $\{X_i\}$.

Proof: Consider

$$P\left(\left|\frac{S_n}{n} - \frac{\sum_{i=1}^{n}\mu_i}{n}\right| \ge \varepsilon\right) = P\left(\left|\frac{S_n}{n} - E\left(\frac{S_n}{n}\right)\right| \ge \varepsilon\right) \le \frac{Var\left(\frac{S_n}{n}\right)}{\varepsilon^2}$$

because of the independence of the r.v.s and by Chebyshev's inequality. Further,

$$\frac{Var\left(\frac{S_n}{n}\right)}{\varepsilon^2} = \frac{\sum_{i=1}^{n}\sigma_i^2}{n^2\varepsilon^2} \qquad (10.12)$$

Taking the limit as $n$ tends to $\infty$ on both sides of the inequality, the R.H.S. limit is zero since the variances are finite; finally

$$\lim_{n \to \infty} P\left(\left|\frac{S_n}{n} - \frac{\sum_{i=1}^{n}\mu_i}{n}\right| \ge \varepsilon\right) = 0 \qquad (10.13)$$

Theorem 10.4. Khintchine's Weak Law of Large Numbers (WLLN): Let $X_1, X_2, \ldots, X_n$ be i.i.d. r.v.s with common mean $\mu$, and let $S_n = \sum X_i$. Then, for any $\varepsilon > 0$,

$$\lim_{n \to \infty} P\left(\left|\frac{S_n}{n} - \mu\right| \ge \varepsilon\right) = 0 \qquad (10.14)$$

and we say the Weak Law of Large Numbers (WLLN) holds for the sequence of r.v.s $\{X_i\}$.

The law can be equivalently stated using complementation as

$$\lim_{n \to \infty} P\left(\left|\frac{S_n}{n} - \mu\right| < \varepsilon\right) = 1 \qquad (10.15)$$

• In short, WLLN holds for the sequence if $\frac{S_n}{n} \xrightarrow{P} \frac{\sum\mu_i}{n}$.

• The limiting value of the chance that the average of the r.v.s is close to the mean is 1 as $n$ approaches $\infty$.

• The assumption of finite variance is required for non identically distributed r.v.s. The condition for WLLN to hold for such a sequence is that $\frac{Var(S_n)}{n^2}$ tends to zero as $n$ approaches infinity. For i.i.d. r.v.s only the existence of a finite mean is required.

• The above law is a weak law in the sense that there is another law which implies this law.

Example 10.3. Examine whether WLLN holds for the following sequences of independent r.v.s.

1.

$$X_k = \begin{cases} 2^k & \text{with prob } \frac{1}{2} \\ -2^k & \text{with prob } \frac{1}{2} \end{cases}$$

Solution: $E[X_k] = 0$ and $V[X_k] = 2^{2k}$, so $V(S_n) = \sum_{k=1}^{n} 2^{2k}$ and $\frac{V(S_n)}{n^2}$ does not tend to zero as $n$ approaches $\infty$. WLLN does not hold for the sequence.

2.

$$X_k = \begin{cases} \pm 2^k & \text{with prob } 2^{-(2k+1)} \text{ each} \\ 0 & \text{with prob } 1 - 2^{-2k} \end{cases}$$

Solution: $E[X_k] = 0$ and $V[X_k] = 1$, so $V(S_n) = n$ and $\frac{V(S_n)}{n^2}$ tends to zero as $n$ approaches $\infty$; therefore WLLN holds for the sequence.
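WLLN behavior can also be seen empirically (a sketch with arbitrarily chosen simulation sizes): for i.i.d. fair-coin indicators, the fraction of runs in which $\left|\frac{S_n}{n} - \frac{1}{2}\right| \ge 0.1$ shrinks as $n$ grows.

```python
import random

random.seed(3)

def freq_deviating(n, eps, trials=1000):
    # Empirical P(|S_n/n - 1/2| >= eps) for sums of fair-coin indicators
    count = 0
    for _ in range(trials):
        s = sum(random.random() < 0.5 for _ in range(n))
        if abs(s / n - 0.5) >= eps:
            count += 1
    return count / trials

small = freq_deviating(100, 0.1)    # n = 100: deviations still occur
tiny = freq_deviating(2000, 0.1)    # n = 2000: deviations essentially vanish
```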

Theorem 10.5. Strong Law of Large Numbers (SLLN): Let $X_1, X_2, \ldots, X_n$ be independent r.v.s with means $\mu_1, \mu_2, \ldots, \mu_n$ respectively and finite variances $\sigma_1^2, \sigma_2^2, \ldots, \sigma_n^2$ respectively, and let $S_n = \sum X_i$. If, for any $\varepsilon > 0$,

$$P\left[\lim_{n \to \infty}\left|\frac{S_n}{n} - \frac{\sum_{i=1}^{n}\mu_i}{n}\right| > \varepsilon\right] = 0 \qquad (10.16)$$

we say that the Strong Law of Large Numbers (SLLN) holds for the sequence of r.v.s $\{X_i\}$.

• In short, SLLN holds for the sequence if $\frac{S_n}{n} \xrightarrow{a.s.} \frac{\sum\mu_i}{n}$.

• The average of the r.v.s becomes close to the mean as $n$ approaches $\infty$ with very high probability, that is, almost surely.

• The assumption of finite variance is required for non identically distributed r.v.s. The condition for SLLN to hold for such a sequence is that

$$\sum_{i=1}^{\infty} \frac{V(X_i)}{i^2} < \infty$$

This condition is known as Kolmogorov's condition. For i.i.d. r.v.s only the existence of a finite mean is required.

• The above law is a strong law in the sense that it implies the weak law.

Example 10.4. Examine whether SLLN holds for the following sequences of independent r.v.s.

1.

$$X_k = \begin{cases} \pm k & \text{with prob } \frac{1}{2\sqrt{k}} \text{ each} \\ 0 & \text{with prob } 1 - \frac{1}{\sqrt{k}} \end{cases}$$

Solution: $E[X_k] = 0$ and $V[X_k] = k^{3/2}$, so $\sum_k \frac{V(X_k)}{k^2} = \sum_k \frac{1}{\sqrt{k}} = \infty$. SLLN does not hold for the sequence.

2.

$$X_k = \begin{cases} 2^{-k} & \text{with prob } \frac{1}{2} \\ -2^{-k} & \text{with prob } \frac{1}{2} \end{cases}$$

Solution: $E[X_k] = 0$ and $V[X_k] = 2^{-2k}$, so $\sum_k \frac{V(X_k)}{k^2} < \infty$. SLLN holds for the sequence.

The laws of large numbers only tell us that the difference between the average of the r.v.s and their mean becomes small. The following theorem goes further: it gives the limiting probability that this difference, suitably normalized, is less than any given number.

10.4 Central Limit Theorem

The Central Limit Theorem is basically used to find approximate probabilities using the Normal distribution. The theorem was initially proved for Bernoulli r.v.s; it has since been proved by many mathematicians and statisticians under different sets of conditions.

Theorem 10.6. Central Limit Theorem of Lindeberg–Lévy (CLT): Let $X_1, X_2, \ldots, X_n$ be i.i.d. r.v.s with common mean $\mu$ and common variance $\sigma^2$, and let $S_n = \sum X_i$. For any $a$,

$$\lim_{n \to \infty} P\left(\frac{S_n - n\mu}{\sigma\sqrt{n}} \le a\right) = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{a} e^{-\frac{x^2}{2}}\,dx \qquad (10.17)$$

We say that CLT holds for the sequence of r.v.s $\{X_i\}$.

• This theorem is useful for finding probabilities using the normal approximation. Normal distribution tables are available for the d.f. of $N(0, 1)$: for all $a$,

$$\Phi(a) = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{a} e^{-\frac{x^2}{2}}\,dx \qquad (10.18)$$

• Standardizing,

$$P\left(\left|\frac{S_n}{n} - \mu\right| \le \varepsilon\right) = P\left(\left|\frac{S_n - n\mu}{\sigma\sqrt{n}}\right| \le \frac{\varepsilon\sqrt{n}}{\sigma}\right) \qquad (10.19)$$

Taking $a = \frac{\varepsilon\sqrt{n}}{\sigma}$, which approaches $\infty$ as $n \to \infty$ for any fixed $\varepsilon > 0$, the above probability becomes

$$\lim_{n \to \infty} P\left(\left|\frac{S_n}{n} - \mu\right| \le \varepsilon\right) = \Phi(\infty) = 1 \qquad (10.20)$$

• Thus $\frac{S_n}{n} \xrightarrow{P} \mu$, and WLLN holds. CLT gives the probability bound for $\frac{S_n}{n} - \mu$, whereas WLLN gives only the limiting value.

• If $X_1, X_2, \ldots, X_n$ are i.i.d. Bernoulli r.v.s, CLT becomes

$$\lim_{n \to \infty} P\left(\frac{S_n - np}{\sqrt{npq}} \le a\right) = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{a} e^{-\frac{x^2}{2}}\,dx \qquad (10.21)$$

Example 10.5. A fair coin is tossed 10,000 times independently. Find the probability that the number of heads (i) differs by less than 1% from 5000, (ii) is greater than 5100.

Solution: Let $S_n$ be the number of heads in 10,000 independent tosses of a fair coin. $E[S_n] = 5000$ and $V(S_n) = 2500$.

(i) By CLT,

$$P\left(|S_n - 5000| < 50\right) = P\left(\frac{|S_n - 5000|}{\sqrt{2500}} < 1\right) \qquad (10.22)$$

$$\approx \frac{1}{\sqrt{2\pi}}\int_{-1}^{1} e^{-\frac{x^2}{2}}\,dx = 2\Phi(1) - 1 = 0.6826 \qquad (10.23)$$

(ii)

$$P\left(S_n > 5100\right) = P\left(\frac{S_n - 5000}{\sqrt{2500}} > \frac{5100 - 5000}{\sqrt{2500}}\right) \qquad (10.24)$$

$$\approx \frac{1}{\sqrt{2\pi}}\int_{2}^{\infty} e^{-\frac{x^2}{2}}\,dx = 1 - \Phi(2) = 0.0228 \qquad (10.25)$$
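The two normal-approximation values above can be reproduced with `math.erf`, using $\Phi(a) = \frac{1}{2}\left(1 + \mathrm{erf}\left(\frac{a}{\sqrt{2}}\right)\right)$; the helper name is ours:

```python
import math

def phi(a):
    # Standard normal d.f. via the error function
    return 0.5 * (1 + math.erf(a / math.sqrt(2)))

approx_i = 2 * phi(1) - 1    # P(|S_n - 5000| < 50), about 0.6826
approx_ii = 1 - phi(2)       # P(S_n > 5100), about 0.0228
```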

Example 10.6. How many independent tosses of a fair die are required for the probability that the average number of sixes differs from $\frac{1}{6}$ by less than 6% (i.e., by less than 0.01) to be at least 0.95?

Solution: Let $\frac{S_n}{n}$ be the average number of sixes in $n$ independent tosses of a fair die. $E\left(\frac{S_n}{n}\right) = \frac{1}{6}$ and $V\left(\frac{S_n}{n}\right) = \frac{5}{36n}$. By CLT we need

$$P\left(\left|\frac{S_n}{n} - \frac{1}{6}\right| < 0.01\right) \ge 0.95 \qquad (10.26)$$

$$P\left(\left|\frac{S_n}{n} - \frac{1}{6}\right| < 0.01\right) \approx \frac{1}{\sqrt{2\pi}}\int_{-0.01\sqrt{\frac{36n}{5}}}^{0.01\sqrt{\frac{36n}{5}}} e^{-\frac{x^2}{2}}\,dx = 2\Phi\left(0.01\sqrt{\frac{36n}{5}}\right) - 1 \ge 0.95 \qquad (10.27)$$

Since $2\Phi(1.96) - 1 = \Phi(1.96) - \Phi(-1.96) = 0.95$, it requires

$$0.01\sqrt{\frac{36n}{5}} \ge 1.96 \qquad (10.28)$$

which gives $n \ge \dfrac{1.96^2 \times 5}{36 \times 0.01^2} = 5335.6$. Toss the die at least 5336 times to get the result.
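The final arithmetic of the example in code form: solving $0.01\sqrt{\frac{36n}{5}} \ge 1.96$ for the smallest integer $n$.

```python
import math

# 2*Phi(0.01*sqrt(36n/5)) - 1 >= 0.95  <=>  0.01*sqrt(36n/5) >= 1.96
n_min = math.ceil(1.96 ** 2 * 5 / (36 * 0.01 ** 2))   # smallest admissible n
```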


10.5 Chapter End Exercises

1. $X$ is a r.v. assuming the values $-1, 0, 1$ with probabilities $0.125, 0.75, 0.125$ respectively. Find the bounds on $P[|X| \ge 1]$.

2. Find $K$ such that the probability of the number of heads lying between 450 and $K$ is 0.9, in 1000 tosses of a fair coin.

3. $f(x) = e^{-x}$, $x > 0$. Find the bound on the probability $P[|X - 1| > 2]$, and compare it with the actual probability.

4. Examine whether SLLN holds for the following sequence of independent r.v.s:

$$X_k = \begin{cases} \pm 2^k & \text{with prob } 2^{-(2k+1)} \text{ each} \\ 0 & \text{with prob } 1 - 2^{-2k} \end{cases}$$

5. Examine whether WLLN holds for the following sequence of independent r.v.s:

$$X_k = \begin{cases} k & \text{with prob } \frac{1}{2} \\ -k & \text{with prob } \frac{1}{2} \end{cases}$$

6. Suppose a large lot contains 1% defectives. By using CLT, find the approximate probability of getting at least 20 defectives in a random sample of 1000 units.

7. $\{X_i\}$ is a sequence of independent r.v.s such that $E[X_i] = 0$ and $V[X_i] = \frac{1}{3}$. Let $S_n = \sum_{i=1}^{100} X_i$; find approximately $P[S_n > 0.2]$.