Lecture Notes for Probability and Random Processes

Dr. Tom Hunt, Teacher, United States
Published: 23-07-2017
A Course Material on Probability and Random Processes

QUALITY CERTIFICATE

This is to certify that the e-course material
Subject Code: MA6451
Subject: Probability and Random Processes
Class: II Year ECE
is being prepared by me and meets the knowledge requirement of the university curriculum.

CONTENTS

UNIT I RANDOM VARIABLES
1. Introduction
2. Discrete Random Variables
3. Continuous Random Variables
4. Moments
5. Moment Generating Functions
6. Binomial Distribution
7. Poisson Distribution
8. Geometric Distribution
9. Uniform Distribution
10. Exponential Distribution
11. Gamma Distribution

UNIT II TWO-DIMENSIONAL RANDOM VARIABLES
12. Introduction
13. Joint Distribution
14. Marginal and Conditional Distribution
15. Covariance
16. Correlation Coefficient
17. Problems
18. Linear Regression
19. Transformation of Random Variables
20. Problems

UNIT III RANDOM PROCESSES
21. Introduction
22. Classification
23. Stationary Processes
24. Markov Processes
25. Poisson Processes
26. Random Telegraph Processes

UNIT IV CORRELATION AND SPECTRAL DENSITIES
27. Introduction
28. Auto-Correlation Functions
29. Properties
30. Cross-Correlation Functions
31. Properties
32. Power Spectral Density
33. Properties
34. Cross Spectral Density
35. Properties

UNIT V LINEAR SYSTEMS WITH RANDOM INPUTS
36. Introduction
37. Linear Time-Invariant Systems
38. Problems
39. Linear Systems with Random Inputs
40. Auto-Correlation and Cross-Correlation Functions of Inputs and Outputs
41. System Transfer Function
42. Problems

MA6451 PROBABILITY AND RANDOM PROCESSES (L T P C: 3 1 0 4)

OBJECTIVES: To provide the necessary basic concepts in probability and random processes for applications such as random signals and linear systems in communication engineering.

UNIT I RANDOM VARIABLES (9+3)
Discrete and continuous random variables – Moments – Moment generating functions – Binomial, Poisson, Geometric, Uniform, Exponential, Gamma and Normal distributions.

UNIT II TWO-DIMENSIONAL RANDOM VARIABLES (9+3)
Joint distributions – Marginal and conditional distributions – Covariance – Correlation and linear regression – Transformation of random variables.

UNIT III RANDOM PROCESSES (9+3)
Classification – Stationary process – Markov process – Poisson process – Random telegraph process.

UNIT IV CORRELATION AND SPECTRAL DENSITIES (9+3)
Auto-correlation functions – Cross-correlation functions – Properties – Power spectral density – Cross spectral density – Properties.

UNIT V LINEAR SYSTEMS WITH RANDOM INPUTS (9+3)
Linear time-invariant system – System transfer function – Linear systems with random inputs – Auto-correlation and cross-correlation functions of input and output.

TOTAL (L: 45 + T: 15): 60 PERIODS

OUTCOMES: The students will gain exposure to various distribution functions, acquire skills in handling situations involving more than one random variable, and be able to analyze the response of linear time-invariant systems to random inputs.

TEXT BOOKS:
1. Ibe, O.C., "Fundamentals of Applied Probability and Random Processes", Elsevier, 1st Indian Reprint, 2007.
2. Peebles, P.Z., "Probability, Random Variables and Random Signal Principles", Tata McGraw Hill, 4th Edition, New Delhi, 2002.

REFERENCES:
1. Yates, R.D. and Goodman, D.J., "Probability and Stochastic Processes", 2nd Edition, Wiley India Pvt. Ltd., Bangalore, 2012.
2. Stark, H. and Woods, J.W., "Probability and Random Processes with Applications to Signal Processing", 3rd Edition, Pearson Education, Asia, 2002.
3. Miller, S.L. and Childers, D.G., "Probability and Random Processes with Applications to Signal Processing and Communications", Academic Press, 2004.
4. Hwei Hsu, "Schaum's Outline of Theory and Problems of Probability, Random Variables and Random Processes", Tata McGraw Hill Edition, New Delhi, 2004.
5. Cooper, G.R. and McGillem, C.D., "Probabilistic Methods of Signal and System Analysis", 3rd Indian Edition, Oxford University Press, New Delhi, 2012.

UNIT I RANDOM VARIABLES

Introduction
Consider an experiment of throwing a coin twice. The outcomes HH, HT, TH, TT constitute the sample space. Each of these outcomes can be associated with a number by specifying a rule of association (e.g., the number of heads). Such a rule of association is called a random variable. We denote a random variable by a capital letter (X, Y, etc.) and any particular value of the random variable by x or y. Thus a random variable X can be considered as a function that maps all elements of the sample space S into points on the real line. The notation X(s) = x means that x is the value associated with the outcome s by the random variable X.

1.1 SAMPLE SPACE
Consider an experiment of throwing a coin twice. The outcomes S = {HH, HT, TH, TT} constitute the sample space.

1.2 RANDOM VARIABLE
In this sample space each of these outcomes can be associated with a number by specifying a rule of association (e.g., the number of heads). Such a rule of association is called a random variable. We denote a random variable by a capital letter (X, Y, etc.) and any particular value of the random variable by x or y. Here

S = {HH, HT, TH, TT}
X(S) = {2, 1, 1, 0}

Thus a random variable X can be considered as a function that maps all elements of the sample space S into points on the real line. The notation X(s) = x means that x is the value associated with the outcome s by the random variable X.

Example 1
In the experiment of throwing a coin twice the sample space is S = {HH, HT, TH, TT}. Let X be the random variable defined by X(s) = the number of heads.

Note
Any random variable whose only possible values are 0 and 1 is called a Bernoulli random variable.

1.2.1 DISCRETE RANDOM VARIABLE
Definition: A discrete random variable is a random variable X whose possible values constitute a finite set of values or a countably infinite set of values.
Examples: All the random variables of Example 1 are discrete random variables.

Remark
The meaning of P(X ≤ a): P(X ≤ a) is simply the probability of the set of outcomes s in the sample space for which X(s) ≤ a, i.e.,
P(X ≤ a) = P{s : X(s) ≤ a}.
In Example 1 above we have
P(X ≤ 1) = P{HT, TH, TT} = 3/4.
Here P(X ≤ 1) = 3/4 means that the probability that the random variable X (the number of heads) is less than or equal to 1 is 3/4.

Distribution function of the random variable X (or cumulative distribution function of the random variable X)
Def: The distribution function of a random variable X, defined on (−∞, ∞), is given by
F(x) = P(X ≤ x) = P{s : X(s) ≤ x}.

Note
Let the random variable X take the values x1, x2, …, xn with probabilities p1, p2, …, pn, and let x1 < x2 < … < xn. Then we have

F(x) = 0 for −∞ < x < x1,
F(x) = P(X ≤ x1) = P(X < x1) + P(X = x1) = 0 + p1 = p1 for x1 ≤ x < x2,
F(x) = P(X ≤ x2) = P(X < x1) + P(X = x1) + P(X = x2) = p1 + p2 for x2 ≤ x < x3,
and in general
F(x) = P(X ≤ xn) = P(X < x1) + P(X = x1) + … + P(X = xn) = p1 + p2 + … + pn = 1 for x ≥ xn.
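To make the mapping X(s) = x concrete, here is a minimal Python sketch (not part of the original notes) that enumerates the two-toss sample space, builds X as the number-of-heads map, and evaluates P(X ≤ 1) exactly as derived above; the names S, X and cdf are illustrative only.

```python
from itertools import product

# Sample space for two coin tosses: S = {HH, HT, TH, TT},
# each outcome having probability 1/4.
S = ["".join(toss) for toss in product("HT", repeat=2)]

# The random variable X maps each outcome s to the number of heads.
X = {s: s.count("H") for s in S}   # {'HH': 2, 'HT': 1, 'TH': 1, 'TT': 0}

# F(a) = P(X <= a) = P{s : X(s) <= a}; outcomes are equally likely.
def cdf(a):
    return sum(1 for s in S if X[s] <= a) / len(S)

print(X)        # {'HH': 2, 'HT': 1, 'TH': 1, 'TT': 0}
print(cdf(1))   # 0.75, i.e. P(X <= 1) = 3/4 from outcomes {HT, TH, TT}
```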
1.2.2 PROPERTIES OF DISTRIBUTION FUNCTIONS
Property 1: P(a < X ≤ b) = F(b) − F(a), where F(x) = P(X ≤ x).
Property 2: P(a ≤ X ≤ b) = P(X = a) + F(b) − F(a).
Property 3: P(a < X < b) = P(a < X ≤ b) − P(X = b) = F(b) − F(a) − P(X = b), by Property 1.

1.2.3 PROBABILITY MASS FUNCTION (OR) PROBABILITY FUNCTION
Let X be a one-dimensional discrete random variable which takes the values x1, x2, …. To each possible outcome xi we can associate a number pi, i.e., P(X = xi) = p(xi) = pi, called the probability of xi. The numbers pi = p(xi) satisfy the following conditions:
(i) p(xi) ≥ 0 for all i, and (ii) Σ (i = 1 to ∞) p(xi) = 1.
The function p(x) satisfying the above two conditions is called the probability mass function (or probability distribution) of the random variable X. The probability distribution {xi, pi} can be displayed in the form of a table as shown below.

X = xi      : x1   x2   …   xi
P(X = xi)   : p1   p2   …   pi

Notation
Let S be a sample space. The set of all outcomes s in S such that X(s) = x is denoted by {X = x}.
P(X = x) = P{s : X(s) = x}
Similarly P(X ≤ a) = P{s : X(s) ∈ (−∞, a]} and P(a < X ≤ b) = P{s : X(s) ∈ (a, b]}.
P(X = a or X = b) = P({X = a} ∪ {X = b})
P(X = a and X = b) = P({X = a} ∩ {X = b}), and so on.

Theorem 1
If X1 and X2 are random variables and K is a constant, then KX1, X1 + X2, X1X2, K1X1 + K2X2 and X1 − X2 are also random variables.

Theorem 2
If X is a random variable and f(·) is a continuous function, then f(X) is a random variable.

Note
If F(x) is the distribution function of a one-dimensional random variable, then
I. 0 ≤ F(x) ≤ 1
II. If x < y, then F(x) ≤ F(y)
III. F(−∞) = lim (x → −∞) F(x) = 0
IV. F(∞) = lim (x → ∞) F(x) = 1
V. If X is a discrete random variable taking values x1, x2, x3, … where x1 < x2 < … < x(i−1) < xi < …, then P(X = xi) = F(xi) − F(x(i−1)).

Example 1.2.1
A random variable X has the following probability function:

Values of X        : 0   1    2    3    4    5     6     7     8
Probability p(x)   : a   3a   5a   7a   9a   11a   13a   15a   17a

(i) Determine the value of a.
(ii) Find P(X < 3), P(X ≥ 3) and P(0 < X < 5).
(iii) Find the distribution function of X.

Solution
Table 1
Values of X : 0   1    2    3    4    5     6     7     8
p(x)        : a   3a   5a   7a   9a   11a   13a   15a   17a

(i) We know that if p(x) is a probability mass function then Σ (i = 0 to 8) p(xi) = 1.
p(0) + p(1) + p(2) + p(3) + p(4) + p(5) + p(6) + p(7) + p(8) = 1
a + 3a + 5a + 7a + 9a + 11a + 13a + 15a + 17a = 1
81a = 1
a = 1/81
Putting a = 1/81 in Table 1, we get Table 2.

Table 2
X = x : 0      1      2      3      4      5       6       7       8
p(x)  : 1/81   3/81   5/81   7/81   9/81   11/81   13/81   15/81   17/81

(ii) P(X < 3) = p(0) + p(1) + p(2) = 1/81 + 3/81 + 5/81 = 9/81
P(X ≥ 3) = 1 − P(X < 3) = 1 − 9/81 = 72/81
P(0 < X < 5) = p(1) + p(2) + p(3) + p(4), since 0 and 5 are not included
= 3/81 + 5/81 + 7/81 + 9/81 = 24/81

(iii) To find the distribution function of X we use Table 2:

X = x   F(x) = P(X ≤ x)
0       F(0) = p(0) = 1/81
1       F(1) = p(0) + p(1) = 1/81 + 3/81 = 4/81
2       F(2) = p(0) + p(1) + p(2) = 4/81 + 5/81 = 9/81
3       F(3) = p(0) + p(1) + p(2) + p(3) = 9/81 + 7/81 = 16/81
4       F(4) = p(0) + p(1) + … + p(4) = 16/81 + 9/81 = 25/81
5       F(5) = p(0) + p(1) + … + p(5) = 25/81 + 11/81 = 36/81
6       F(6) = p(0) + p(1) + … + p(6) = 36/81 + 13/81 = 49/81
7       F(7) = p(0) + p(1) + … + p(7) = 49/81 + 15/81 = 64/81
8       F(8) = p(0) + p(1) + … + p(8) = 64/81 + 17/81 = 81/81 = 1
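The arithmetic of Example 1.2.1 can be verified exactly with Python's standard fractions module. The sketch below (not part of the original notes; the names coeffs, p and F are illustrative) recovers a = 1/81 and the running-sum distribution function.

```python
from fractions import Fraction

# p(x) is proportional to 1, 3, 5, ..., 17 for x = 0..8; the
# coefficients sum to 81, so a = 1/81, as derived above.
coeffs = [2 * x + 1 for x in range(9)]     # 1, 3, 5, ..., 17
a = Fraction(1, sum(coeffs))               # a = 1/81
p = {x: coeffs[x] * a for x in range(9)}

print(sum(p.values()))                     # 1 (total probability)
print(sum(p[x] for x in range(3)))         # P(X < 3)  = 9/81  = 1/9
print(1 - sum(p[x] for x in range(3)))     # P(X >= 3) = 72/81 = 8/9
print(sum(p[x] for x in range(1, 5)))      # P(0 < X < 5) = 24/81 = 8/27

# Distribution function F(x) = P(X <= x) as a running sum.
F, total = {}, Fraction(0)
for x in range(9):
    total += p[x]
    F[x] = total
print(F[4], F[8])                          # 25/81 and 1
```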
1.3 CONTINUOUS RANDOM VARIABLE
Def: A random variable X which takes all possible values in a given interval is called a continuous random variable.
Example: Age, height and weight are continuous random variables.

1.3.1 PROBABILITY DENSITY FUNCTION
Consider a continuous random variable X specified on a certain interval (a, b) (which can also be an infinite interval (−∞, ∞)). If there is a function y = f(x) such that
lim (Δx → 0) P(x < X < x + Δx) / Δx = f(x),
then this function f(x) is termed the probability density function (or simply the density function) of the random variable X. It is also called the frequency function or distribution density. The curve y = f(x) is called the probability curve or the distribution curve.

Remark
If f(x) is the p.d.f. of the random variable X, then the probability that a value of X will fall in some interval (a, b) is equal to the definite integral of the function f(x) from a to b:
P(a < X < b) = ∫ (a to b) f(x) dx, or equivalently P(a ≤ X ≤ b) = ∫ (a to b) f(x) dx.

1.3.2 PROPERTIES OF P.D.F
The p.d.f. f(x) of a random variable X has the following properties:
(i) f(x) ≥ 0 for −∞ < x < ∞; (ii) ∫ (−∞ to ∞) f(x) dx = 1.

Remark
1. In the case of a discrete random variable the probability at a point, say at x = c, need not be zero. But in the case of a continuous random variable X the probability at a point is always zero:
P(X = c) = ∫ (c to c) f(x) dx = 0.
2. If X is a continuous random variable, then
P(a ≤ X ≤ b) = P(a ≤ X < b) = P(a < X ≤ b) = P(a < X < b).

IMPORTANT DEFINITIONS IN TERMS OF P.D.F
If f(x) is the p.d.f. of a random variable X which is defined on the interval (a, b), then:
(i) Arithmetic mean = ∫ (a to b) x f(x) dx
(ii) Harmonic mean H: 1/H = ∫ (a to b) (1/x) f(x) dx
(iii) Geometric mean G: log G = ∫ (a to b) log x · f(x) dx
(iv) Moments about the origin = ∫ (a to b) x^r f(x) dx
(v) Moments about any point A = ∫ (a to b) (x − A)^r f(x) dx
(vi) Moment about the mean: µr = ∫ (a to b) (x − mean)^r f(x) dx
(vii) Variance: µ2 = ∫ (a to b) (x − mean)² f(x) dx
(viii) Mean deviation about the mean: M.D. = ∫ (a to b) |x − mean| f(x) dx

1.3.3 MATHEMATICAL EXPECTATION
Def: Let X be a continuous random variable with probability density function f(x). Then the mathematical expectation of X is denoted by E(X) and is given by
E(X) = ∫ (−∞ to ∞) x f(x) dx.
The r-th raw moment is denoted by
µr' = ∫ (−∞ to ∞) x^r f(x) dx = E(X^r).
Thus
µ1' = E(X) (first moment about the origin)
µ2' = E(X²) (second moment about the origin)
∴ Mean = X̄ = µ1' = E(X), and
Variance = µ2' − (µ1')² = E(X²) − [E(X)]².

(a) r-th moment about the mean
Now
E[(X − E(X))^r] = ∫ (−∞ to ∞) (x − E(X))^r f(x) dx = ∫ (−∞ to ∞) (x − X̄)^r f(x) dx.
Thus
µr = ∫ (−∞ to ∞) (x − X̄)^r f(x) dx, … (B)
where µr = E[(X − E(X))^r]. This gives the r-th moment about the mean, and it is denoted by µr.

Put r = 1 in (B):
µ1 = ∫ (−∞ to ∞) (x − X̄) f(x) dx
= ∫ x f(x) dx − X̄ ∫ f(x) dx
= X̄ − X̄ (since ∫ f(x) dx = 1)
∴ µ1 = 0.

Put r = 2 in (B):
µ2 = ∫ (−∞ to ∞) (x − X̄)² f(x) dx = E[(X − E(X))²] = Variance,
which gives the variance in terms of expectations.

Note
Let g(X) = K (a constant). Then
E[g(X)] = E(K) = ∫ (−∞ to ∞) K f(x) dx = K ∫ (−∞ to ∞) f(x) dx = K · 1 = K (since ∫ f(x) dx = 1).
Thus E(K) = K, i.e., E[a constant] = constant.
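As a numeric illustration of these expectation formulas (not from the notes), the sketch below approximates the defining integrals with a plain midpoint rule for the density f(x) = 6x(1 − x) on (0, 1), the same density used in Example 1.4.1 below; the helper name integrate is illustrative and the rule and step count are arbitrary choices.

```python
# Midpoint-rule quadrature keeps the sketch dependency-free.
def integrate(g, a, b, n=100_000):
    h = (b - a) / n
    return sum(g(a + (k + 0.5) * h) for k in range(n)) * h

f = lambda x: 6 * x * (1 - x)                    # density on (0, 1)

total = integrate(f, 0, 1)                       # property (ii): should be 1
mean  = integrate(lambda x: x * f(x), 0, 1)      # mu_1' = E(X)
ex2   = integrate(lambda x: x**2 * f(x), 0, 1)   # mu_2' = E(X^2)
mu1   = integrate(lambda x: (x - mean) * f(x), 0, 1)  # first central moment

print(round(total, 6))          # 1.0
print(round(mean, 6))           # 0.5
print(round(ex2 - mean**2, 6))  # variance = E(X^2) - [E(X)]^2 = 0.05
print(round(mu1, 6))            # 0.0, confirming mu_1 = 0
```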
1.3.4 EXPECTATION (DISCRETE R.V.'S)
Let X be a discrete random variable with p.m.f. p(x). Then
E(X) = Σx x p(x).
For a discrete random variable X,
E(X^r) = Σx x^r p(x) (by definition).
If we denote E(X^r) = µr', then
µr' = E(X^r) = Σx x^r p(x).
Put r = 1: Mean = µ1' = Σx x p(x).
Put r = 2: µ2' = E(X²) = Σx x² p(x).
∴ µ2 = µ2' − (µ1')² = E(X²) − [E(X)]².
The r-th moment about the mean is
µr = E[(X − E(X))^r] = Σx (x − X̄)^r p(x), where E(X) = X̄.
Put r = 2: Variance = µ2 = Σx (x − X̄)² p(x).

1.3.5 ADDITION THEOREM (EXPECTATION)
Theorem 1
If X and Y are two continuous random variables with p.d.f.'s fX(x) and fY(y), then
E(X + Y) = E(X) + E(Y).

1.3.6 MULTIPLICATION THEOREM OF EXPECTATION
Theorem 2
If X and Y are independent random variables, then E(XY) = E(X) · E(Y).
Note: If X1, X2, …, Xn are n independent random variables, then
E[X1 X2 … Xn] = E(X1) E(X2) … E(Xn).

Theorem 3
If X is a random variable with p.d.f. f(x) and a is a constant, then
(i) E[a G(X)] = a E[G(X)]
(ii) E[G(X) + a] = E[G(X)] + a,
where G(X) is a function of X which is also a random variable.

Theorem 4
If X is a random variable with p.d.f. f(x) and a and b are constants, then
E[aX + b] = a E(X) + b.
Corollary 1: If we take a = 1 and b = −E(X) = −X̄, then we get
E(X − X̄) = E(X) − E(X) = 0.
Note
E(1/X) ≠ 1/E(X)
E[log X] ≠ log E(X)
E(X²) ≠ [E(X)]²

1.3.7 EXPECTATION OF A LINEAR COMBINATION OF RANDOM VARIABLES
Let X1, X2, …, Xn be any n random variables and let a1, a2, …, an be constants. Then
E[a1X1 + a2X2 + … + anXn] = a1E(X1) + a2E(X2) + … + anE(Xn).
Result
If X is a random variable, then Var(aX + b) = a² Var(X), where a and b are constants.

Covariance
If X and Y are random variables, then the covariance between them is defined as
Cov(X, Y) = E{[X − E(X)][Y − E(Y)]}
= E[XY − X E(Y) − E(X) Y + E(X) E(Y)]
Cov(X, Y) = E(XY) − E(X) · E(Y). … (A)
If X and Y are independent, then E(XY) = E(X) E(Y). … (B)
Substituting (B) in (A), we get Cov(X, Y) = 0.
∴ If X and Y are independent, then Cov(X, Y) = 0.
Note
(i) Cov(aX, bY) = ab Cov(X, Y)
(ii) Cov(X + a, Y + b) = Cov(X, Y)
(iii) Cov(aX + b, cY + d) = ac Cov(X, Y)
(iv) Var(X1 + X2) = Var(X1) + Var(X2) + 2 Cov(X1, X2); if X1 and X2 are independent, Var(X1 + X2) = Var(X1) + Var(X2).

EXPECTATION TABLE

Discrete R.V.'s:
1. E(X) = Σ x p(x)
2. E(X^r) = µr' = Σx x^r p(x)
3. Mean = µ1' = Σ x p(x)
4. µ2' = Σ x² p(x)
5. Variance = µ2' − (µ1')² = E(X²) − [E(X)]²

Continuous R.V.'s:
1. E(X) = ∫ (−∞ to ∞) x f(x) dx
2. E(X^r) = µr' = ∫ (−∞ to ∞) x^r f(x) dx
3. Mean = µ1' = ∫ (−∞ to ∞) x f(x) dx
4. µ2' = ∫ (−∞ to ∞) x² f(x) dx
5. Variance = µ2' − (µ1')² = E(X²) − [E(X)]²
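A quick simulation (not part of the original notes) illustrates the addition theorem and the covariance result above: for independent X (a fair die) and Y (uniform on (0, 1)), the sample averages show E(X + Y) ≈ E(X) + E(Y) and Cov(X, Y) ≈ 0. The sample size N and seed are arbitrary choices.

```python
import random

random.seed(1)
N = 200_000

xs = [random.randint(1, 6) for _ in range(N)]   # fair die: E(X) = 3.5
ys = [random.random() for _ in range(N)]        # Uniform(0, 1): E(Y) = 0.5

mean = lambda v: sum(v) / len(v)
ex, ey = mean(xs), mean(ys)
exy = mean([x * y for x, y in zip(xs, ys)])

# Addition theorem: E(X + Y) = E(X) + E(Y); both prints are near 4.0.
print(mean([x + y for x, y in zip(xs, ys)]), ex + ey)

# Independence: Cov(X, Y) = E(XY) - E(X)E(Y) is near 0.
print(exy - ex * ey)
```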
SOLVED PROBLEMS ON DISCRETE R.V.'S

Example 1
When a die is thrown, X denotes the number that turns up. Find E(X), E(X²) and Var(X).
Solution
Let X be the random variable denoting the number that turns up in a die. X takes the values 1, 2, 3, 4, 5, 6, each with probability 1/6.

X = x : 1     2     3     4     5     6
p(x)  : 1/6   1/6   1/6   1/6   1/6   1/6

Now
E(X) = Σ (i = 1 to 6) xi p(xi)
= x1 p(x1) + x2 p(x2) + x3 p(x3) + x4 p(x4) + x5 p(x5) + x6 p(x6)
= 1(1/6) + 2(1/6) + 3(1/6) + 4(1/6) + 5(1/6) + 6(1/6)
= 21/6 = 7/2. … (1)

E(X²) = Σ (i = 1 to 6) xi² p(xi)
= x1² p(x1) + x2² p(x2) + x3² p(x3) + x4² p(x4) + x5² p(x5) + x6² p(x6)
= 1(1/6) + 4(1/6) + 9(1/6) + 16(1/6) + 25(1/6) + 36(1/6)
= (1 + 4 + 9 + 16 + 25 + 36)/6 = 91/6. … (2)

Var(X) = E(X²) − [E(X)]²
= 91/6 − (7/2)² = 91/6 − 49/4 = 35/12.

Example 2
Find (i) the value of C and (ii) the mean of the following distribution:
f(x) = C(x − x²) for 0 < x < 1, and f(x) = 0 otherwise. … (1)
Solution
Since f(x) is a p.d.f., ∫ (−∞ to ∞) f(x) dx = 1. Using (1), and since f(x) vanishes outside 0 < x < 1,
∫ (0 to 1) C(x − x²) dx = 1
C [x²/2 − x³/3] (0 to 1) = 1
C (1/2 − 1/3) = 1
C [(3 − 2)/6] = 1
C/6 = 1, so C = 6. … (2)
Substituting (2) in (1), f(x) = 6(x − x²), 0 < x < 1. … (3)
Mean = E(X) = ∫ (−∞ to ∞) x f(x) dx
= ∫ (0 to 1) x · 6(x − x²) dx, from (3)
= ∫ (0 to 1) (6x² − 6x³) dx
= [6x³/3 − 6x⁴/4] (0 to 1) = 2 − 3/2 = 1/2.
∴ Mean = 1/2 and C = 6.

1.4 CONTINUOUS DISTRIBUTION FUNCTION
Def: If f(x) is the p.d.f. of a continuous random variable X, then the function
FX(x) = F(x) = P(X ≤ x) = ∫ (−∞ to x) f(t) dt, −∞ < x < ∞,
is called the distribution function or cumulative distribution function of the random variable.

PROPERTIES OF CDF OF A R.V. X
(i) 0 ≤ F(x) ≤ 1, −∞ < x < ∞
(ii) lim (x → −∞) F(x) = 0 and lim (x → ∞) F(x) = 1
(iii) P(a ≤ X ≤ b) = ∫ (a to b) f(x) dx = F(b) − F(a)
(iv) F′(x) = dF(x)/dx = f(x) ≥ 0
(v) P(X = xi) = F(xi) − F(x(i−1))

Example 1.4.1
Given that the p.d.f. of a continuous random variable X is
f(x) = 6x(1 − x), 0 < x < 1, and f(x) = 0 otherwise,
find the c.d.f. of X.
Solution
The c.d.f. is F(x) = ∫ (−∞ to x) f(t) dt, −∞ < x < ∞.
(i) When x < 0:
F(x) = ∫ (−∞ to x) 0 dt = 0. … (1)
(ii) When 0 ≤ x < 1:
F(x) = ∫ (−∞ to 0) f(t) dt + ∫ (0 to x) f(t) dt
= 0 + ∫ (0 to x) 6t(1 − t) dt = 6 [t²/2 − t³/3] (0 to x)
= 3x² − 2x³. … (2)
(iii) When x ≥ 1:
F(x) = ∫ (−∞ to 0) 0 dt + ∫ (0 to 1) 6t(1 − t) dt + ∫ (1 to x) 0 dt
= 6 ∫ (0 to 1) (t − t²) dt = 1. … (3)
Using (1), (2) and (3) we get
F(x) = 0 for x < 0; F(x) = 3x² − 2x³ for 0 ≤ x < 1; F(x) = 1 for x ≥ 1.

Example 1.4.2
(i) Is the function defined as follows a density function?
f(x) = e^(−x) for x ≥ 0, and f(x) = 0 for x < 0.
(ii) If so, determine the probability that the variate having this density will fall in the interval (1, 2).
Solution
(a) On (0, ∞), e^(−x) is positive, so f(x) ≥ 0 on (0, ∞).
(b) ∫ (−∞ to ∞) f(x) dx = ∫ (−∞ to 0) f(x) dx + ∫ (0 to ∞) f(x) dx
= ∫ (−∞ to 0) 0 dx + ∫ (0 to ∞) e^(−x) dx
= [−e^(−x)] (0 to ∞) = 0 − (−1) = 1.
Hence f(x) is a p.d.f.
(ii) We know that P(a ≤ X ≤ b) = ∫ (a to b) f(x) dx.
P(1 ≤ X ≤ 2) = ∫ (1 to 2) e^(−x) dx = [−e^(−x)] (1 to 2)
= −e^(−2) + e^(−1) = −0.135 + 0.368 = 0.233.
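The two worked c.d.f. results above can be sanity-checked numerically; the sketch below (not from the notes) reuses the same midpoint-rule helper as the earlier expectation sketch, restated here so the block runs on its own, and compares the integral of f against the closed forms F(x) = 3x² − 2x³ and P(1 ≤ X ≤ 2) = e^(−1) − e^(−2).

```python
import math

def integrate(g, a, b, n=100_000):
    # Midpoint rule, same as the earlier expectation sketch.
    h = (b - a) / n
    return sum(g(a + (k + 0.5) * h) for k in range(n)) * h

# Example 1.4.1: F(x) = 3x^2 - 2x^3 for f(x) = 6x(1 - x) on (0, 1).
for x in (0.25, 0.5, 0.9):
    numeric = integrate(lambda t: 6 * t * (1 - t), 0, x)
    closed = 3 * x**2 - 2 * x**3
    print(round(numeric, 6), round(closed, 6))   # the pairs agree

# Example 1.4.2: P(1 <= X <= 2) for f(x) = e^{-x}, x >= 0.
print(round(math.exp(-1) - math.exp(-2), 3))     # 0.233
```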
Example 1.4.3
A probability curve y = f(x) has range from 0 to ∞. If f(x) = e^(−x), find the mean, the variance and the third moment about the mean.
Solution
Mean = ∫ (0 to ∞) x f(x) dx = ∫ (0 to ∞) x e^(−x) dx
= [x(−e^(−x)) − e^(−x)] (0 to ∞) = 1.
∴ Mean = 1.

Variance: µ2 = ∫ (0 to ∞) (x − mean)² f(x) dx = ∫ (0 to ∞) (x − 1)² e^(−x) dx = 1.
∴ µ2 = 1.

Third moment about the mean:
µ3 = ∫ (0 to ∞) (x − mean)³ f(x) dx = ∫ (0 to ∞) (x − 1)³ e^(−x) dx
= [−e^(−x){(x − 1)³ + 3(x − 1)² + 6(x − 1) + 6}] (0 to ∞), by repeated integration by parts
= 0 − (−1)(−1 + 3 − 6 + 6) = 2.
∴ µ3 = 2.

1.5 MOMENT GENERATING FUNCTION
Def: The moment generating function (MGF) of a random variable X (about the origin) whose probability function is f(x) is given by
MX(t) = E[e^(tX)]
= ∫ (−∞ to ∞) e^(tx) f(x) dx for a continuous random variable,
= Σx e^(tx) p(x) for a discrete random variable,
where t is a real parameter and the integration or summation extends over the entire range of x.

Example 1.5.1
Prove that the r-th moment of the random variable X about the origin is the coefficient of t^r/r! in MX(t), i.e.
MX(t) = Σ (r = 0 to ∞) µr' t^r / r!.
Proof
We know that MX(t) = E(e^(tX))
= E[1 + tX/1! + (tX)²/2! + (tX)³/3! + … + (tX)^r/r! + …]
= 1 + t E(X) + (t²/2!) E(X²) + … + (t^r/r!) E(X^r) + …
∴ MX(t) = 1 + t µ1' + (t²/2!) µ2' + (t³/3!) µ3' + … + (t^r/r!) µr' + …, using µr' = E(X^r).
Thus the r-th moment µr' is the coefficient of t^r/r! in MX(t).
Note
1. The above result gives the MGF in terms of the moments.
2. Since MX(t) generates moments, it is known as the moment generating function.

Example 1.5.2
Find µ1' and µ2' from MX(t).
Proof
We know that MX(t) = Σ (r = 0 to ∞) µr' t^r / r!, i.e.,
MX(t) = µ0' + µ1' t/1! + µ2' t²/2! + … + µr' t^r/r! + … … (A)
Differentiating (A) with respect to t, we get
MX′(t) = µ1' + µ2' t + µ3' t²/2! + … … (B)
Putting t = 0 in (B), we get
MX′(0) = µ1' = Mean
∴ Mean = MX′(0), i.e., [d/dt MX(t)] at t = 0.
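The moment-from-MGF result just proved can be exercised symbolically; the sketch below (not from the notes, and assuming the sympy library is available) builds MX(t) for the density f(x) = e^(−x) of Example 1.4.3 and reads off the raw moments by differentiating at t = 0.

```python
import sympy as sp

t, x = sp.symbols("t x", real=True)

# M_X(t) = E[e^{tX}] = integral over (0, oo) of e^{tx} e^{-x} dx
# = 1/(1 - t) for t < 1; conds="none" drops the convergence condition.
M = sp.simplify(sp.integrate(sp.exp(t * x) * sp.exp(-x),
                             (x, 0, sp.oo), conds="none"))

# mu_r' is the r-th derivative of M_X(t) at t = 0 (Example 1.5.1);
# for this density mu_r' = r!.
moments = [sp.diff(M, t, r).subs(t, 0) for r in (1, 2, 3)]
print(moments)                         # [1, 2, 6]

mean = moments[0]
variance = moments[1] - mean**2
print(mean, variance)                  # 1 and 1, matching Example 1.4.3
```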
